Cloud computing refers to the delivery of computing services, such as servers, storage, databases, software, and analytics, over the internet to enable faster innovation, flexible resources, and economies of scale. Cloud computing is becoming increasingly popular due to its scalability, accessibility, and cost-efficiency. Whether you're a beginner or preparing for an advanced-level interview, our set of expert-curated interview questions will help you understand the concepts in detail. The topics covered include the different types of cloud, cloud services, cloud computing tools, Azure, databases, and more. Our set of cloud computing interview questions and answers will help you prepare for the interview confidently.
The National Institute of Standards and Technology (NIST) defines cloud computing as a model enabling on-demand, globally accessible network access to a shared pool of computing resources (e.g., networks, servers, storage, applications, and services) that can be provisioned through a self-service portal provided by the cloud service provider.
Cloud computing is a class of network-based computing services available over the internet. The model is similar to utility computing: a collection of integrated and networked hardware, software, and internet infrastructure (called a platform).
Cloud computing is a collection of layers formed together to deliver IP-based computing; virtualization is a layer/module inside the cloud computing architecture that enables providers to deliver IaaS (Infrastructure as a Service) on the fly.
Virtualization is software that creates multiple, separated images of the hardware and software on the same machine. This makes it possible to install multiple operating systems, multiple software packages, and multiple applications on the same physical machine.
The globalization of business, a difficult economic environment, and the on-demand consumption model for consumers have increased the pressure on organizations to be agile and cost-effective. Cloud computing helps organizations stay competitive and expand. The key drivers of cloud computing are cost, risk, and agility.
Cloud types depend on how the services are delivered as well as on the underlying ownership. Cloud deployment types describe the nature of the specialised services offered.
Public Clouds – The public cloud is the most common and popular cloud option adopted by users. IT infrastructure resources like compute, network, and storage are available in a secure manner at low cost in the public cloud environment. These IT infrastructures are shared amongst multiple clients, making them cheaper to use. All resources are accessed and managed through a web browser over the internet. Public cloud service offerings are Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). Some public cloud offerings are Office 365, Salesforce, etc.
Advantages of Public Cloud:
Disadvantages Of Public Cloud:
Private Clouds – A private cloud consists of computing resources used exclusively by one business or organisation. The private cloud can be physically located at the organisation's on-site data center, or it can be hosted by a third-party service provider. In a private cloud, the services and infrastructure are always maintained on a private network, and the hardware and software are dedicated solely to your organisation.
Advantages Of Private Cloud:
Disadvantages Of Private Cloud:
A common question in cloud computing interview questions for freshers, don't miss this one.
Hybrid Clouds – A hybrid cloud combines the benefits of public and private clouds to reduce cost and distribute workloads as per business demand. A hybrid cloud allows data to flow between private and public clouds in a secure manner. It gives enterprise organizations more flexibility and deployment options.
Advantages Of Hybrid Cloud:
Disadvantages Of Hybrid Cloud:
Currently, the list is as follows:
The basic characteristics of cloud computing are mentioned below:
Azure computing provides virtualized environments backed by the service provider's hardware (datacenters) to meet on-demand resource needs like compute, storage, web apps, etc. over the internet using a pay-as-you-go model. Cloud computing is the delivery of services like server storage, networking, WebApps, databases, analytics, and intelligence, and it provides innovation and resource flexibility.
Basically, we do not need to set up a data centre for each and every service, as cloud computing offers all of these services in virtualized environments that we can utilize and enable to meet the business requirements.
The best examples of Azure cloud computing are the Azure IaaS, PaaS, and SaaS services, and the Azure cloud platform, which provides services like big data, compute, analytics, reporting, databases, open source, etc., enabling faster solutions with wider geographical availability than traditional services in this competitive world.
Microsoft Azure is a flexible, open, and enterprise-grade cloud computing platform which is fast, secure, trusted, intelligent, and hybrid-enabled.
MS Azure is a virtualized environment where we can access all the services and deploy them without any hardware requirements or software licenses. It charges on a pay-as-you-go model: if I consume a resource for 1 hour, it will charge for 1 hour only.
Unsurprisingly, this one pops up often in cloud computing basic interview questions.
Azure Load Balancer works at layer 4 and distributes traffic across VMs. The load balancer is of two types: the internal load balancer, used for internal applications, and the external load balancer, used for external applications. Say you have a web application running on a set of VMs and you want to load balance it internally or externally; you can utilize the Azure Load Balancer. You can configure health probes and other rules for your web application, and you can also set up NAT rules if you want to apply them.
It helps protect your infrastructure and applications from DDoS attacks. It works with HTTPS load balancers to provide a defence for your infrastructure, and we can configure allow/deny rules for the same. Cloud Armor has a flexible rules language, which enables you to customize your defence and mitigate attacks. It also has predefined rules to defend against cross-site scripting (XSS) and SQL injection (SQLi) application-aware attacks. If you are running a web application, it will help protect it from SQL injection, DDoS attacks, and more, based on the allow and deny rules you have configured.
Expect to come across this popular question in basic cloud computing interview questions.
VPC provides connectivity from your on-premises environment and across all regions without exposure to the internet. It provides connectivity to Compute Engine virtual machine instances, Kubernetes Engine clusters, App Engine Flex instances, and other resources based on the projects. We can use multiple VPCs across various projects.
It is associated with firewall rules and routes for global resources, not for any individual region, and a VPC can be shared across multiple projects.
It is commonly used on the Google Cloud Platform and in hybrid scenarios.
Cloud Storage is used to store and retrieve data worldwide, and we can integrate it into apps with a single API. It is RESTful online storage for WebApps to store and access data on the Google Cloud Platform. It provides geo-redundancy with the highest levels of availability and performance, with low-latency, high-QPS content serving to users distributed across geographic regions.
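As an illustration, here is a minimal sketch of storing and retrieving an object with the google-cloud-storage Python client; the bucket name and file paths are placeholders, and credentials are assumed to come from the environment (Application Default Credentials):

```python
# Minimal sketch using the google-cloud-storage client library.
# Assumes "my-example-bucket" (a placeholder name) already exists.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-example-bucket")

# Upload a local file as an object.
blob = bucket.blob("reports/2023/summary.pdf")
blob.upload_from_filename("summary.pdf")

# Download it back later, from anywhere in the world.
blob.download_to_filename("summary_copy.pdf")
```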
Common Use Case:
Google App Engine is a fully managed web application platform that can be deployed on the Google Cloud Platform, providing good security and reliability. It supports Java, PHP, Node.js, Python, C#, .NET, and Ruby, and it scales automatically when traffic increases. It is highly available and automatically scales instances up and down as per usage. We can manage the resources using command-line tools, debug the source code, and run APIs easily using DevOps tools like Visual Studio, PowerShell, SDKs, and Cloud Source Repositories.
We can secure the application by using the App Engine firewall and by managing SSL/TLS certificates.
A common question in cloud computing questions and answers, don't miss this one.
Yes, you can replicate S3 bucket data across regions. Cross-region replication allows you to copy objects between buckets in different AWS regions.
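A hedged boto3 sketch of what enabling cross-region replication can look like; the bucket names and the IAM role ARN are placeholders, and versioning is a prerequisite on both buckets:

```python
# Sketch of enabling S3 cross-region replication with boto3.
import boto3

s3 = boto3.client("s3")

# Versioning must be enabled on both source and destination buckets.
for bucket in ("source-bucket", "dest-bucket"):
    s3.put_bucket_versioning(
        Bucket=bucket,
        VersioningConfiguration={"Status": "Enabled"},
    )

s3.put_bucket_replication(
    Bucket="source-bucket",
    ReplicationConfiguration={
        "Role": "arn:aws:iam::123456789012:role/replication-role",  # placeholder
        "Rules": [
            {
                "ID": "replicate-everything",
                "Status": "Enabled",
                "Prefix": "",  # empty prefix = replicate all objects
                "Destination": {"Bucket": "arn:aws:s3:::dest-bucket"},
            }
        ],
    },
)
```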
EC2 provides the compute capacity among AWS services: if you want to deploy VMs, you use EC2 instances, which can be deployed in any region. It is a highly available and scalable service in AWS, suitable for deploying heavy workloads. It also provides a key pair to secure the remote connection. EC2 instances are used to deploy applications, SQL databases, and any IaaS-based workload, and the cost of an EC2 instance is based on usage, billed per second.
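For illustration, a minimal boto3 sketch of launching an EC2 instance; the AMI ID, key pair name, and security group ID are placeholders:

```python
# Sketch of launching a single EC2 instance with boto3.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",        # placeholder AMI
    InstanceType="t3.micro",
    KeyName="my-keypair",                   # key pair for secure SSH access
    SecurityGroupIds=["sg-0123456789abcdef0"],
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])
```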
Below is the storage I have used in my various projects.
In terms of availability, it is the duration for which the provider ensures your services are available, regardless of your cloud type (public, private, community, or hybrid) and service type (SaaS, PaaS, or IaaS). Commonly this metric is expressed as a percentage of uptime, where uptime is the amount of time the respective service is available and operationally online in a specific time interval. So if the uptime is 99.99% in a year, the total duration you will be unable to access the service, widely known as downtime, is no more than about 52 minutes and 36 seconds in 12 months.
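The downtime budget follows directly from the uptime percentage; here is a quick arithmetic check of the figures above (using a 365-day year, so the result differs by a couple of seconds from the 52:36 quoted for a 365.25-day year):

```python
# Downtime budget implied by an uptime SLA, assuming a 365-day year.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def downtime_minutes(uptime_percent: float) -> float:
    """Minutes per year the service may be down under the SLA."""
    return MINUTES_PER_YEAR * (1 - uptime_percent / 100)

for sla in (99.9, 99.95, 99.99):
    print(f"{sla}% uptime -> {downtime_minutes(sla):.1f} minutes/year of downtime")
# 99.99% -> 52.6 minutes/year, matching the roughly 52.5 minutes quoted above
```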
Availability varies from one cloud provider to another. Therefore, you need to know your availability requirements in the first place, which include but are not limited to any business-, mission-, or time-critical systems you have and your expected uptime/acceptable downtime.
Be mindful that the definition and measurement of availability also differ from one provider to another.
That said, upon identifying your requirements, search for the vendor whose service availability meets or, even better, exceeds them. Ideally, you should choose a vendor that guarantees, not only publishes, their availability, meaning they will compensate your organization for missing the promised metrics and thresholds. Always read their Service Level Agreement (SLA) carefully and thoroughly, and comprehend the policies, terms, conditions, and provisions for compensation in case of an outage set out in its clauses.
If you are keen to understand things from a technical perspective, you might also want to know how the provider manages outages (unplanned downtime), their Business Continuity Plan (BCP) and Disaster Recovery Plan (DRP), and how they handle maintenance (planned downtime).
By default, when you store, use, share, or communicate your data in the cloud, your data is usually in a raw, unencrypted format known as 'plaintext', unless you have encrypted it before it is saved or transmitted.
If you leave your data unencrypted, you face the risk that anyone who gains access to your account can read, copy, or delete it, leaving your data exposed to unauthorized individuals and entities. Thus, end-to-end data encryption, at rest, in use, and in motion, including your emails if stored on cloud servers, is a must.
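A minimal sketch of client-side encryption before upload, using the third-party cryptography package (an assumption; any vetted encryption library would do):

```python
# Encrypt locally so only ciphertext ever reaches the cloud.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # keep this key outside the cloud provider
cipher = Fernet(key)

plaintext = b"sensitive customer record"
ciphertext = cipher.encrypt(plaintext)   # safe to store or transmit

# Later, after fetching the ciphertext back from cloud storage:
assert cipher.decrypt(ciphertext) == plaintext
```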
On the other hand, from the provider’s point-of-view, they will provide secure storage space and impose confidentiality obligations by limiting user access to those who are authorized to view, edit, add, delete the data based on your requests. What’s more, they will also protect the data from accidental or purposeful unauthorized access by internal or external actors.
Over and above that, you should gather the following information on data confidentiality policies, controls, practices, and technologies the provider has put in place:
Whether the vendor provides various ways to securely access your data and services based on a certain Access Control Matrix (ACM), consisting of the users, groups, permissions, privileges, and credentials they offer.
Whether the vendor provides log files to capture key activities occurring in your cloud environment, so you will be able to monitor and analyze them and follow up, in particular for the purpose of an audit trail.
Whether you, as the customer, maintain full control of your data and have the responsibility for managing it, not only the provider's services and resources. Ask for a guarantee that they do not access or use your data for any purpose without your consent, and moreover that they do not utilize your content or derive information from it for marketing or advertising.
Whether you can choose the region, country, or city in which your data is stored and the type of storage deployed. Ensure the provider does not move, modify, add, delete, or replicate your data without your prior consent.
Encryption provided by the vendor: the type (at rest, in transit, in-use), the algorithm (Symmetric such as Advanced Encryption Standard (AES) or Asymmetric with the likes of Rivest–Shamir–Adleman (RSA) and Elliptic Curve Cryptography (ECC)), the encryption keys and the Key Management.
Whether the provider has any exception policy for intentionally disclosing your data to other parties, usually due to a legal obligation, illegal conduct, or a binding order. If it happens, you need to know to whom your data is being unveiled and for what purpose, and the provider needs to notify you prior to the disclosure.
As you might already be aware, data integrity, one of the key aspects of information security, means the degree to which data is consistent, accurate, and complete over its entire lifecycle.
To maintain it, you and the cloud provider must, hand-in-hand, provide such assurance.
First, ensure the data cannot be modified by an unauthorized individual, entity, or program. This can be done by deploying access controls through an Access Control Matrix (ACM) or Access Control List (ACL) covering username, role, privilege, menu, function, and object. A forensic tool may also be needed to recover from accidental deletion by authorized users. In addition, implement another control, a checksum, to verify integrity (a checksum sketch follows this list).
Second, have data backups for occurrences like a power outage, database crash, or storage failure. If the data is corrupted, try to identify the root cause, then recover it immediately. If that doesn't go through, restore correct data from the backup. Regardless of the storage media utilized for the backup, always put it in a separate logical or, even better, physical premise and location: a secure, confidential, safe one, obviously. The security policy, both logical and physical, applied to primary and backup data must be the same.
Third, implement algorithms and protocols such as Message-Digest algorithm 5 (MD5), the Advanced Encryption Standard (AES), the Secure Hash Algorithm (SHA), and Rivest–Shamir–Adleman (RSA) to provide maximum levels of integrity management against tampering or unauthorized access, specifically for data stored in the public cloud.
Fourth, get data integrity verified through IT audit activities. These can be conducted by internal (from your side or the provider's end) or external entities (third-party/independent). As the third layer of defence inside an organization, the IT auditor will assess, validate, and test IT general controls and IT application controls as necessary to verify the consistency, accuracy, and completeness of certain static as well as dynamic data.
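As promised above, a short checksum sketch using Python's standard hashlib; SHA-256 is chosen here since MD5 is no longer considered collision-resistant:

```python
import hashlib

def sha256_of_file(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Record the digest before upload, recompute after download, and compare:
# any mismatch signals tampering or corruption in between.
before = sha256_of_file("report.csv")
# ... upload to the cloud, time passes, download again ...
after = sha256_of_file("report.csv")
assert before == after, "integrity check failed"
```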
In more general terms, data and information privacy is the set of rights you have to control how your data and information are managed across their entire lifecycle: when acquired, maintained, used, published, transferred, stored, archived, and disposed of.
Even though it looks quite similar to data and information security, and in most cases the two definitions overlap, there is a major difference to keep in mind.
Privacy's centre of attention is how data and information are used and governed through certain policies, laws, and regulations. By contrast, the primary focal point of security is how data and information are protected from countless threats and vulnerabilities. Consequently, security alone is not adequate to deal with privacy.
What you can do is collect as much information from the cloud provider as you can on:
How your data and information are processed in the cloud, including but not limited to where the provider is from, their head and in-country offices, storage media, storage/server locations, backup media, and their locations.
How they enable users to have proper controls over their data and information across its lifecycle.
What are the controls – administrative, technical and physical – the provider deploys such as policy, procedure, mechanism, standard related to data and information privacy?
How they assure your data and information are appropriately managed and the compensation they bring to the table if privacy is broken.
The entity that is responsible for ensuring compliance with a certain standard, applicable laws and regulations, and regulatory requirements.
Whether any subcontractor is involved in providing products and services to the cloud provider, and to what extent this vendor is involved in your data and information processing.
The standards and frameworks on data and information privacy the provider follows and complies with. You also need to know and understand their implications for your data and information privacy.
The laws and regulations on data and information privacy the provider complies with. Be aware of similar laws and regulations your country may have, and become acquainted with their implications and consequences.
The processes by which the provider deals with cross-border data transfer if you store and process your data in multiple sites across several geographical premises in a great number of countries.
Expect to come across this popular question in cloud computing scenario based questions.
Make sure you develop a business case in the first place that consists of a minimum of three options put into the spotlight and one recommendation. Those alternatives unveiled could possibly be, for instance, #1 public cloud, #2 private cloud, #3 hybrid cloud, #4 business as usual or do nothing – assuming what your organization needs is, for instance, Infrastructure as a Service (IaaS).
In detail, a business case is a written document typically containing material related to a new business or business improvement idea intended to convince the respected decision-makers to take any action. Into the bargain, it is aimed to justify the investment of resources and finances then obtain the stakeholder’s approval based on research, analysis and facts.
According to the commonly accepted industry practice, the business case shall constitute:
A high-level view explaining the problem the proposed alternative is intended to solve, major considerations, desired deliverable, as well as the predicted business and financial aspects the recommendation shall achieve.
What problems will be solved by implementing IaaS, or what opportunity will your organization benefit from with an IaaS deployment?
Four alternatives mentioned earlier are slated here.
A structured approach in which we can compare the solutions for effective decision-making speaks volumes. Example: SWOT analysis, Real Options Decision Tree.
An advanced level of analysis is needed to evaluate whether the solution being pursued is also financially viable, e.g. Cost-Benefit Analysis/Ratio (CBA or CBR), Net Present Value (NPV), Internal Rate of Return (IRR), or Return on Investment (ROI); a minimal NPV sketch appears after this list.
Identify risks of each option (public, private, hybrid, stay as is) and how to deal with them through mitigation activities.
Provides a realistic picture of how each proposed alternative will be rolled out, from a high-level point of view, including its approach, timeframe, benefits, costs, and quality.
Assumptions should be validated and confirmed, since the viability of an alternative depends on them. The same treatment goes for dependencies, because they become the prerequisites for a specific alternative/solution.
Disclose one of the four alternatives that have been assessed with Business Value Analysis and Cost-Benefit Analysis as the recommended option.
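As referenced in the financial-analysis item above, here is a minimal NPV sketch; the discount rate and cash flows are illustrative placeholders, not real figures for any option:

```python
def npv(rate: float, cashflows: list[float]) -> float:
    """Net Present Value; cashflows[0] is the year-0 outlay (usually negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical 3-year cash flows for alternatives in the business case.
options = {
    "public cloud":  [-50_000, 30_000, 30_000, 30_000],
    "private cloud": [-120_000, 45_000, 45_000, 45_000],
    "do nothing":    [0, -10_000, -10_000, -10_000],
}

for name, flows in options.items():
    print(f"{name}: NPV at 8% = {npv(0.08, flows):,.0f}")
```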
Numerous success factors that should be thought about are:
Start with the development and testing environments first. Leave the production system and its configuration out to minimize the probability and, more importantly, the impact of the concerns you may encounter.
Choose to move the least time- and mission-critical systems first, so that if there is any incident, your organization's business will not suffer.
Identify and migrate the systems with the simplest architecture first, as this action will be considered low risk.
Assess your systems entirely then decide which one you are keen to migrate. Analyze minimum specifications, configurations and actual usage in various situations (high, medium, low workload). Upon completion, identify and choose the appropriate cloud environment to match, whenever and wherever possible, all your organization’s requirements.
Be mindful of the licensing, including the model and cost, as well as the terms and conditions of the cloud service (IaaS, SaaS, PaaS) and model (public, private, hybrid) you will procure, since they differ from vendor to vendor and cloud service provider.
You should assess and evaluate the SLA thoroughly, including the compensation offered if the provider is unable to achieve the agreed metrics.
Ensure that you identify and discover application/system dependencies to avoid the unplanned outages and limited functionality that usually occur when the migration is completed.
Review each architecture (Application, Data, Infrastructure) comprehensively to achieve optimization of the cloud platform.
Inquire about the processes, activities, methods, and tools needed and/or offered by the provider to migrate from their cloud into another cloud.
Ask the provider about their network availability and bandwidth requirements, because you and other end users will be accessing cloud services and products anywhere, at any time.
In general, the cloud adoption activities should have a pre-adoption review, planning, execution, testing and post-adoption review to make sure things go well for you.
On condition that most of, or better, all the answers are yes, you could do this assignment alone. Otherwise, you could identify the adoption partner based on scores of measurements like the vendor’s capability, experience, resources, support, tools, portfolio and their client’s testimonials.
Both metrics and KPI will help you to understand the current state of your IT environment and determine whether your adoption is successfully completed.
Develop your cloud-adoption plan that accommodates the below factors:
In the big-bang (do it all at once) scenario, you drive a huge change in one go: you move all your computing components over and run tests to see if everything works as expected. If you take the short-sprint (do it a little bit at a time) option, you migrate one computing component over, validate it, and then continue these activities until all components are moved to the cloud.
You are urged to make sure everything is working by conducting tests. These can be manual or automated, based on plenty of scenarios, capitalizing on the previously agreed baseline metrics and KPIs as key success criteria.
It constitutes three main points: what went well (good things), what’s the room for improvement (bad things), and what’s the action plan (to improve the bad).
It covers a basic understanding of cloud computing from both business and technical points of view, migration from on-premises to the cloud, and the governance of cloud computing environments. It is issued by the Computing Technology Industry Association (CompTIA), a non-profit IT trade association. No prerequisites are required; nevertheless, examinees are recommended to have at least six months of working experience in an IT services environment.
Slightly different from Cloud Essentials, it validates your skills in maintaining and optimizing cloud infrastructure services. Consequently, it assesses your competence to perform data centre jobs effectively and efficiently, such as configuration, deployment, security, troubleshooting, maintenance, and management.
Offered by an independent exam and certification company EXIN, it covers cloud computing basic concepts and principles, tests the technical knowledge namely Security and Compliance as well as looks at general aspects inclusive of implementation, management, and evaluation.
For those who are relatively new to cloud computing, this credential is assessing your basic knowledge of Cloud Computing concepts. Developed by joint forces between EXIN and an international member-based organization Cloud Credential Council (CCC), it tests your understanding of the main concepts of Cloud Services Model, Virtualization, Cloud Technologies and Applications, Security, Risk, Compliance, Governance, Adoption, and Service Management.
This Amazon Web Services (AWS) Certified Solutions Architect accreditation is divided into two paths: Associate and Professional. The first is aimed at assessing an individual's knowledge of architecting and deploying secure, robust systems on AWS, and it is also a prerequisite for achieving the professional certification. It likewise validates your ability to define solutions based on customer/end-user requirements using architectural design principles and to provide implementation guidance to your organization based on best practices throughout the project life cycle.
What’s more, the professional path targets individuals with two or more years of hands-on experience in designing and deploying cloud architecture and architecting and implementing dynamically scalable, highly available, fault-tolerant, and reliable applications on AWS. It also validates the exam taker’s competence in migrating complex, multi-tier applications on the platform, designing and deploying enterprise-wide scalable operations and implementing cost-control strategies.
As a professional Cloud Architect, you are expected to have the necessary skills and knowledge to enable your organization to leverage Google Cloud technologies. By securing this testament, your ability to design, plan, develop, implement, manage and provision robust, secure, scalable, highly available and reliable cloud architecture using Google Cloud Platform (GCP) along with dynamic solutions to drive business objectives is recognized.
It's no surprise that this one pops up often in interview questions about cloud computing.
To date, many ISO standards have been applied to the cloud. Taking out the expired and withdrawn versions, here is the list:
Information Technology -- Cloud computing – Overview and vocabulary
Information Technology -- Cloud computing -- Reference architecture
Information Technology -- Cloud Data Management Interface (CDMI)
Information Technology -- Cloud computing -- Service level agreement (SLA) framework -- Part 1: Overview and concepts
Cloud computing -- Service level agreement (SLA) framework -- Part 2: Metric model
Information Technology -- Cloud computing -- Service level agreement (SLA) framework -- Part 3: Core conformance requirements
Cloud computing -- Service level agreement (SLA) framework -- Part 4: Components of security and of protection of PII (Personally Identifiable Information)
Information Technology -- Virtualization Management Specification
Cloud Infrastructure Management Interface (CIMI) Model and RESTful HTTP-based Protocol -- An Interface for Managing Cloud Infrastructure
Information Technology -- Cloud computing -- Interoperability and portability
Information Technology -- Cloud computing -- Cloud services and devices: Data flow, data categories and data use
Information Technology -- Cloud computing -- Guidance for policy development
Information Technology -- Cloud computing -- Framework of trust for processing of multi-sourced data
Information Technology -- Security techniques -- Code of practice for information security controls based on ISO/IEC 27002 for cloud services
Information Technology -- Security techniques -- Code of practice for protection of PII in public clouds acting as PII processors
Like any other ISO standards, conforming to them has many benefits for the provider’s businesses: building credibility at the international level, saving time and money by identifying and solving recurring problems, and improving and enhancing the system and process efficiency and effectiveness. On top of that, it is also living proof, publicly accessible, that the provider has properly managed their information security, including its risk, fulfilled their audit requirements and established trust both internally and externally that controls are properly placed and implemented in order to serve their customers better and hence increase their satisfaction level.
You, as the user, are urged to assess their ISO certification. Critical points to reflect on are: which product, service, or location does it actually cover? Is the certification for the entire organization or only for their head office exclusive of their branches? Who issues the certification and whether the issuer is one of the ISO-accredited bodies? For certain, you must see the original certificate and witness what information revealed there.
A hypervisor is software used to virtualize a physical server into logical servers to optimize resource utilization. Hypervisors are divided into two types.
Bare-metal hypervisors, deployed directly on the physical server, are classified as Type 1 hypervisors. Some examples of Type 1 hypervisors are Microsoft Hyper-V, VMware ESXi, and Citrix XenServer.
When a hypervisor runs on top of an operating system, it is a Type 2 hypervisor; examples are KVM and Oracle VirtualBox.
Multi-cloud is a cloud deployment model where IT infrastructure resources like compute, storage, and network bandwidth are consumed from multiple cloud services or an in-house data centre to complete business transactions. It pools resources from different cloud service providers, or combines IT resources from an in-house data centre with cloud services. This model is a good use case where a business function's resource needs cannot be met from one location.
A common question in basic interview questions on cloud computing, don't miss this one.
The cloud hosting drivers to consider when identifying workloads are the following:
Business applications that handle mission-critical, ERP, and data-sensitive information are not a fit for cloud hosting. Applications running on platforms other than Intel are also not fit for immediate migration to a cloud platform.
In that case, we need to create a storage account (V1 or V2, based on the requirements), create the file storage, and create the directory. We then click the Connect button and map the drive to the customer's servers.
It collects logs via Azure Monitor and stores them in a Log Analytics workspace for analysis and alerting. We can also query it to find specific alerts or logs if required. It is basically a monitoring tool that covers most Azure services and collects logs in various ways.
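A hedged sketch of querying a Log Analytics workspace from Python with the azure-monitor-query package; the workspace ID is a placeholder and the KQL query is only an example:

```python
# Requires: pip install azure-monitor-query azure-identity
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# KQL: error-level activity events from the last 24 hours.
response = client.query_workspace(
    workspace_id="<workspace-guid>",          # placeholder
    query="AzureActivity | where Level == 'Error' | take 10",
    timespan=timedelta(days=1),
)

for table in response.tables:
    for row in table.rows:
        print(row)
```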
I will click on New, go to the Marketplace, search for WebApps, provide the details, and create it.
Blob storage is used to store massive amounts of unstructured data, like JPEG files or archived files. It is a cloud-based solution that provides durability and high availability, and it is a secure, manageable solution for large data. We can access the storage account easily over HTTP/HTTPS, via the API, etc.
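For illustration, a minimal azure-storage-blob sketch of uploading an unstructured file; the connection string, container, and file names are placeholders:

```python
# Requires: pip install azure-storage-blob
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
blob = service.get_blob_client(container="archive", blob="photos/cat.jpeg")

with open("cat.jpeg", "rb") as data:
    blob.upload_blob(data, overwrite=True)

print(blob.url)  # the blob is then addressable over HTTPS
```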
A few of the scenarios in which you will use blob storage accounts:
Below is the list
Google Stackdriver provides in-depth diagnostics and monitors the health of App Engine; it monitors Google services and sends out alerts for them.
It collects logs, metrics, and events from Google Cloud infrastructure, applications, and other operations running on the Google platform. Based on the collected logs, it speeds up root-cause analysis (RCA) and reduces time to resolution. It also does not require any integration to provide support to developers.
Amazon RDS (Relational Database Service) is easy to set up and operate. It is a highly scalable relational database service in the AWS cloud and a cost-effective solution: we can resize the capacity of the database when it is not in use. It helps us reduce administration, patching, and backup tasks by automating those processes.
Amazon RDS is available on several database instance types, optimized for memory, performance, or I/O, and provides you with six familiar database engines: Amazon Aurora, PostgreSQL, MySQL, MariaDB, Oracle Database, and SQL Server. We can use the AWS Database Migration Service to easily migrate or replicate existing databases to Amazon RDS.
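A minimal boto3 sketch of provisioning an RDS MySQL instance; the identifier, credentials, and sizing are placeholders, and in practice the master password should come from a secrets manager:

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

rds.create_db_instance(
    DBInstanceIdentifier="demo-mysql",              # placeholder name
    DBInstanceClass="db.t3.micro",
    Engine="mysql",
    MasterUsername="admin",
    MasterUserPassword="<from-a-secrets-manager>",  # placeholder
    AllocatedStorage=20,                            # GiB
)

# RDS automates backups and patching; wait until the instance is available.
waiter = rds.get_waiter("db_instance_available")
waiter.wait(DBInstanceIdentifier="demo-mysql")
```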
Azure Recovery Services vaults are used to take backups of VMs and other services, and they provide a migration feature that can be utilized if we need to migrate on-premises VMs to Azure.
It is also used for on-premises migration (Hyper-V, VMware, and physical server migration to Azure). It supports backup for Azure VMs (Linux/Windows), Azure File storage, PaaS SQL, WebApps, SQL DB on Azure VMs, etc. We can configure a daily backup policy and schedule the backups; backups can be retained for up to 999 years. It provides fine-grained access management through RBAC, and we can configure Site Recovery using the Azure portal for backup and for migrating the on-premises environment to Azure.
This vendor-neutral accreditation issued by the Cloud Security Alliance certifies your understanding of security issues and best practices over a broad range of cloud computing domains, ranging from architecture, governance, compliance, and operations to encryption, among others.
Getting the certification provided by Cloud Security Alliance and International Information System Security Certification Consortium known as (ISC)² indicates you have advanced technical skills and knowledge to design, manage and secure data, applications and infrastructure in the cloud using best practices, policies and procedures.
Nevertheless, it is considered one of the more advanced credentials; consequently, those interested in pursuing it are required to have five years of full-time working experience in IT. Three of those years must correlate with information security, whereas one year must pertain to architectural concepts, design requirements, security in cloud data, cloud platform and infrastructure, cloud applications and operations, or legal and compliance.
If and only if you have earned CCSK, the previously mentioned one-year requirement will be waived. Supposing that you already hold Certified Information Systems Security Professional (CISSP) certification, it can replace the entire five-year requirement.
This EXIN certification focuses on the interconnection of three areas: Service Management, Cloud Computing, and IT Security. That being the case, you will be automatically granted it without any cost when you possess their three foundational certificates: EXIN Information Security, EXIN Cloud Computing Foundation, and (IT) Service Management.
Issued by Cloud Credential Council (CCC) and managed by EXIN, it recognizes skills and knowledge an individual possesses on security, risk and compliance cloud computing issues. In detail, the certificate primary focus is on the intersection between business and technical security challenges in an enterprise’s cloud computing environment. Five years of working experience in enterprise security with a deep understanding of cloud computing services and deployment models are the recommended prerequisites.
Speaking about applicable laws and regulations to certain data and information, there is a term called ‘data sovereignty’ or ‘information sovereignty’. In this case, since you’re asking about data, the first expression will be capitalized.
Basically, data is subject to the numerous laws and regulations of the country in which it is located or stored, used, and transmitted, both sent and received.
Since those early days, it has been one of the key challenges when an individual or organization wants to move into the cloud: the government or authority insists that the data should never leave its jurisdiction, which directly means you cannot place it in your desired services.
Thus far, there is no international policy, standard, or agreement which provides one set of data sovereignty’s requirements that all countries should be following.
Day in, day out, it gains more weight, and in response many countries have established and regulated compliance requirements by amending current laws or enacting new legislation that requires customer data to be kept within the country where it resides. Over the past few years, this kind of obligation has been enforced in Vietnam, Brunei, Iran, China, Brazil, India, Australia, South Korea, Nigeria, Russia, and Indonesia.
Additionally, the laws and regulations vary by country, and some are, in fact, stricter than others. Some of them mandate that their citizens' data be stored on physical servers within the country's physical borders. Australia, for example, requires the provider to reveal what information is being sent outside the country.
In addition, the European Union (EU) restricts the transfer of Personally Identifiable Information (PII) to countries outside its member states. PII itself is the type of data that could potentially identify a specific individual and refers to a relatively narrow range of data such as name, address, birth date, credit card number, or bank account.
What You Could Do
You'd better know the storage, server, and any other device where your data will reside, what's in the fine print, and whether the provider has already complied with data sovereignty laws in the country where your data is located. If your government requires you to store your data in the country where you are based, make sure of two things. First, the provider has its storage deployed there. Second, its obligations under applicable data sovereignty laws and regulations are already fulfilled.
Cloud providers offer what they call 'cloud management' tools, aimed at administrative control over public, private, and hybrid clouds. The software is intended to let users manage capabilities, availability, security, utilization, resource allocation, workflow, automation, workload balancing, capacity planning, monitoring, controlling, orchestration, provisioning, budgeting, cost and expense, performance, reporting, and even the migration of the cloud products and services you have subscribed to.
On top of that, there are two types of cloud management software: in-house software developed and offered by the public, private, or hybrid cloud provider, and mass-market products from third-party vendors that complement the aforementioned tools.
Over and above that, if you choose a public cloud, you will often be given the option to manage your services with third-party tools, simply because the servers, storage, networking, and other infrastructure operations are taken care of by the provider.
Besides, if you are a private cloud user, a tool is required to create virtualization and virtualized computing resources and to deal with resource allocation, security, monitoring, and tracking, as well as billing, through a self-service portal.
Cloud management is more complex when it comes to hybrid cloud, due to the obligation of having to deal with network, computing, and storage devices across multiple domains, including but not limited to the installation, configuration, and administration of instances, images, user accounts, and their access rights as part of Identity and Access Management.
Regardless of a native and third-party tool designed to provide rich functionality across one or multiple cloud providers, the platform must be able to provide the following features at the very least:
Since your company is yet to decide which vendor to pick, a few frameworks you could check out are:
If you find it challenging to deal with User Experience (UX) as the complexity and diversity of cloud systems emerges from time to time, this framework from National Institute of Standards and Technology (NIST) under U.S. Department of Commerce comes into the picture.
You could evaluate your cloud UX and the user expectations in a more structured way through five attributes and 21 elements provided.
Developed by NIST Cloud Computing Reference Architecture and Taxonomy Working Group, this paper is intended to portray a high-level conceptual model for defining requirements, structures along with the operations of cloud computing.
On top of that, it is divided into two parts. One is a complete overview of the actors, including their roles and the architectural components for managing and providing cloud services, with the likes of service deployment, service orchestration, cloud service management, security, and privacy. The other is the taxonomy, presented in its own section and appendices, consisting of terms, definitions, and examples of cloud services.
Without question, there is no such thing as a one-framework-fits-all. Each of them has its pros and cons. Due to this reason, it’s important to analyze the available frameworks and put the Cost and Benefit approach, for instance, or any other key metrics into consideration.
Otherwise, you could develop your own framework, or explore the opportunity to design a hybrid framework by combining yours with an existing framework out there, or, drawing this further, combine a handful of frameworks to help your organization meet its unique requirements as well as its business objectives.
If you need to assess the security risks of a cloud provider, this framework will bear fruit; it provides fundamental security concepts and principles in 16 domains and 133 controls for the vendor to follow, as the table below shows.
Shortly known as the CCM, from the vendor's perspective it will improve or enhance security control environments by emphasizing business information security control requirements and by identifying and mitigating security threats and vulnerabilities in the cloud. The matrix also offers a cloud taxonomy and terminology, security measurements, and standardized views of security risk, IT risk, and operational risk, notably when managing one or all of them.
Sr. No. | Cloud Control Matrix - Domains | No. of Controls for Each Domain (Cloud Security Alliance) |
---|---|---|
1. | AIS: Application & Interface Security | 4 |
2. | AAC: Audit Assurance & Compliance | 3 |
3. | BCR: Business Continuity Management & Operational Resilience | 11 |
4. | CCC: Change Control & Configuration Management | 5 |
5. | DSI: Data Security & Information Lifecycle Management | 7 |
6. | DCS: Datacenter Security | 9 |
7. | EKM: Encryption & Key Management | 4 |
8. | GRM: Governance and Risk Management | 11 |
9. | HRS: Human Resources | 11 |
10. | IAM: Identity & Access Management | 13 |
11. | IVS: Infrastructure & Virtualization Security | 13 |
12. | IPY: Interoperability & Portability | 5 |
13. | MOS: Mobile Security | 20 |
14. | SEF: Security Incident Management, E-Discovery & Cloud Forensics | 5 |
15. | STA: Supply Chain Management, Transparency and Accountability | 9 |
16. | TVM: Threat and Vulnerability Management | 3 |
If you consider adopting public cloud computing, then this 80-page document shall come to the light. It will give you a big picture of security and privacy challenges and crucial points to consider when you outsource your data, applications and infrastructure to a public cloud provider in which they own and operate the infrastructure and computational resources aside from the fact they deliver services to the public via a multi-tenant platform.
As this paper tells us, it does not recommend any specific cloud computing service, service arrangement, service agreement, service provider, or deployment model. As a consequence, each organization is encouraged to apply its very own guidelines when analyzing its requirements, inclusive of security and privacy, and to assess, select, engage, and oversee the public cloud services that can best fulfil those requirements.
Other than the two frameworks explained above, you could also bring another document, titled 'Security Guidance for Critical Areas of Focus in Cloud Computing v4.0' from the Cloud Security Alliance (CSA), into play. Developed from previous iterations of the security guidance, dedicated research, and public participation from CSA members, working groups, and industry experts within their community, it explains how to manage and mitigate security risks in adopting cloud computing technology while also pledging guidance and insights to support business goals.
The document issued by ISACA is intended for both actors, cloud users and cloud providers, so they can assess the design and operating effectiveness of cloud computing internal controls (administrative, physical, technical) and security, and identify internal control discrepancies and deficiencies within the end-user organization and its interface with the service provider. In essence, we can refer to this guide to provide the results of an audit assessment and to gauge our ability to rely upon our own IT department's and/or the cloud provider's attestations on internal controls.
As the title tells us, this white paper from the SANS Institute guides us on how to conduct a security audit of our cloud environment; it is also aimed at cloud providers auditing their own cloud environments.
It consists of an audit methodology, an audit checklist, and the standards, laws, and regulations we can put into service to surface security risks and, in the end, test the respective controls.
The areas to be audited are as follows:
As we might already know, ISACA develops the IT Assurance Framework (ITAF) as a guideline that provides information and direction for the practice of IT audit and assurance. It also offers tools, techniques, methodologies, and templates to direct the application of IT audit and assurance processes. Read up on ITAF sections 3400 – IT Management Processes and sections 3600 – IT Audit and Assurance Processes, and keep an eye on sections 3800 – IT Audit and Assurance Management.
Well, safely say the components such as audit objective, scope, risk, plan, methodology/approach, along with its procedures (processes and techniques), are much the same as other types of IT or IS Audit engagement.
The main thing is, in the cloud, with shared resourcing, multitenancy and geolocation, the boundaries are difficult to define and isolate meanwhile the end-user specific transactional information is difficult to obtain. As such, IT Assurance needs to become more real-time, continuous and process-oriented vs. transactional in focus, while the cloud providers need to provide greater transparency to their clients.
Objective
Organizations should strive to align their business objectives with the objectives of the audit. During the planning stage, the auditor shall identify the objectives and then have them agreed with the auditee. For their part, the auditors will use the objectives as a way of concluding on the evidence they obtain. Some of the notable objectives are:
Assess the design and operating effectiveness of the cloud computing service provider's internal controls.
The above controls also include IT application controls, not merely IT general controls, which are aimed at providing assurance of a specific application, its functionality, and its suitability.
To get an idea of the controls, including their objectives in the cloud environment, have a look at ISACA's Control Objectives for Information and Related Technologies (COBIT). Even though it was developed as a general control framework, some of its control objectives have applicability to the cloud.
Scope
When it comes to IT general controls, the auditor from the customer’s end shall do the review on:
If your IAM system is integrated with the cloud computing system
To interface with and manage cloud computing incidents
As an access point to the internet
If the cloud is part of your application infrastructure
It is also important to note that the controls that are maintained by a vendor are not included in the scope of a cloud computing audit.
It is a common practice an organization may use these two approaches to measure a cloud provider:
Inclusive of vendor risk assessment, vendor due diligence, vendor rating/tiering, vendor Scope of Work, vendor agreement, and vendor Service Level Agreement (SLA)
Third-party auditor whether provided by the cloud provider or the end-user.
Procedure
Whether it is rolled out by your internal function, the vendor's organizational unit, or a third party, the auditor will turn stacks of processes and techniques to account to obtain evidence: inquiry over data and documents, assessment, confirmation, recalculation, reperformance, observation, meetings, discussion, inspection, and analytics.
Cloud governance is basically a set of standardized policies and practices, involving people, process, and technology, related to the cloud computing environment and designed to ensure that organizational and, more importantly, business objectives are met without surpassing risk tolerance or compliance requirements.
Business goals and objectives vary from one entity to another; however, the most commonly found are performance, budget/cost optimization, customer satisfaction, employee attraction and retention, and resource productivity.
According to The Open Group, governance answers three huge questions. First, are we doing the right things? Second, are we doing things in the right way? Last, how do we know that we have done both?
The global consortium that enables the achievement of business objectives through IT standards considers cloud computing governance a view of IT governance focused on accountability, defining decision rights, and otherwise balancing benefit/value, risk, and resources in a cloud environment. At large, it is a subset of overall business governance, which includes IT governance and Enterprise Architecture (EA) governance.
You could put their Cloud Computing Governance Framework to use. As a pool of business-driven policies and principles that establish the appropriate degree of investments and control around the Cloud Computing lifecycle and its processes, your organization could make certain all expenses and costs associated are aligned with your company business objectives, foster data integrity organization-wide, stimulate innovation, and manage the risk of data loss and or non-compliance with regulations, they say.
As it will help your organization identify vulnerabilities before a compromise can take place, the process starts by identifying and assigning severity levels to security defects through manual and automated techniques over a certain period of time. Be mindful that, since this is cloud computing, there are two types of penetration test: first, the test the provider performs on its own platform, and second, the test you could run against their resources, specifically for your own systems. Importantly, not all cloud vendors allow penetration testing.
Ideally, the assessment shall target different layers of technology from Host, Network, Storage, Server, Virtualization, Operating System, Middleware, Runtime, Database, and Application by highly considering your cloud models (SaaS, PaaS, IaaS, etc.) and cloud deployment models.
National Institute of Standards and Technology(NIST) defined Cloud computing as a model facilitate on-demand globally accessible network to a shared pool of computing resources (e.g., networks, servers, storage, applications, and services) that can be provisioned by self service portal provided by cloud service provider.
Cloud Computing is a new class of network based computing services that is available over the Internet, This model is similar to Utility Computing a collection/group of integrated and networked hardware, software and Internet infrastructure (called a platform).
Cloud Computing is a collection of layers formed together to deliver a IP based computing, virtualization is a layer/module inside cloud computing architecture which will enable the providers to deliver the IaaS "Infrastructure as a Service" on the fly.
Virtualization is a Software which creates “separated” multiple images of the hardware and software on the same machine. This makes possible to install multiple OS, multiple software and multiple applications on the same physical machine.
The globalization of business, difficult economic environment and the on-demand consumption model for consumers have increased the pressure on organizations to be agile and cost effective. Cloud computing helps organization to competitive and expand. The key drivers of cloud computing are cost, risk and agility. The cloud computing drivers are depicted in below diagram:
Cloud types depends how to describe the services are delivered as well as underlying ownership. Cloud deployment types describe the nature of the specialised services that are offered.
Public Clouds – Public Clouds- Public cloud is the most common and popular cloud option adopted by users. The IT infrastructure resources like compute, network, storage in secured manner at low cost is available at public cloud environment. these IT infrastructures are shared amongst multiple clients therefore it is cheaper to use. All the resources are accessed and managed by web browser over internet.Public cloud services provided offering are Infrastructure as a service (IaaS), platform as a Service (PaaS) ans Software as a service (SaaS). Some of the public cloud offering are office 365, salesforce etc.
Advantage of Public Cloud:
Disadvantages Of Public Cloud:
Private Clouds – A private cloud consists of computing resources used exclusively by one business or organisation. The private cloud can be physically located at organisation’s on-site data center or it can be hosted by a third-party service provider. private cloud, the services and infrastructure are always maintained on a private network and the hardware and software are dedicated solely to your organisation.
Advantages Of Private Cloud:
Disadvantages Of Private Cloud:
A common question in cloud computing interview questions for freshers, don't miss this one.
Hybrid Clouds – Hybrid cloud combines the benefit of public and private clouds to reduce cost and distribute workload as per business demand. A hybrid cloud allows flow of data between private and public clouds in secure manner. It gives more flexibility and deployment option to the enterprise organizations.
Advantages Of Hybrid Cloud:
Disadvantages Of Hybrid Cloud:
Currently,lists as below:
The basic characteristics of Cloud computing mentioned below:
Azure computing is virtualized environments backed by services provider hardware (Datacenter) to meet the on-demand resources like cloud computing, storage, web apps etc. by internet using pay as you go model. Cloud Computing is the delivery of services like server storage, networking services, WebApps, databases, analytics and intelligence etc. & it provides the innovation, resources flexibility.
Basically, we no need to set up a data centre for each and every service as cloud computing offers all of these services in virtualized environments which we can utilize and enable the services to meet the business requirements.
Azure Cloud computing the best examples are Azure Iaas, Paas & SaaS services and Azure cloud platform which provides all services like, Big data, Compute, Analytics, reporting services, Databases, Open sources etc. which will enable the faster solution with geographical availability than traditional services in this competitive world.
Microsoft Azure is a flexible, open and enterprise-grade cloud computing platform which is more fast, secure, trusted, intelligent and enable to hybrid environments.
MS Azure is Virtualized environments where we will access all the services and deployed without any hardware requirements and software license. It charges to pay as you go model. If I consume for 1 hr. it will charge for 1 hr. only.
Unsurprisingly, this one pops up often in cloud computing basic interview questions.
Azure Load Balancer works at layer 4 and distributes traffic across VMs. The load balancer is of 2 types: the internal load balancer, which is used for internal applications, and the external load balancer, which is used for external applications. Let's say you have a web application running on a set of VMs and you want to load-balance it internally or externally; then you can utilize Azure Load Balancer. You can configure a health probe and other rules for your web application, and if you want to apply NAT rules you can set those up as well.
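To make the health probe idea concrete, here is a minimal Python sketch (standard library only) of the kind of endpoint each backend VM might expose; the /health path and port 8080 are assumptions that would have to match your probe configuration:

```python
# Minimal HTTP health endpoint (illustrative) that a load balancer
# health probe could target on each VM in the backend pool.
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":      # probe path configured on the LB
            self.send_response(200)     # 200 keeps the VM in rotation
            self.end_headers()
            self.wfile.write(b"OK")
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # Port 8080 is an assumption; it must match the probe's configured port.
    HTTPServer(("0.0.0.0", 8080), HealthHandler).serve_forever()
```

A probe that stops receiving 200 responses marks the VM unhealthy, and the load balancer stops routing new traffic to it.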
It helps protect your infrastructure and applications from DDoS attacks. It works with HTTP(S) load balancers to provide a defence for your infrastructure, and we can configure allow/deny rules for it. Cloud Armor has a flexible rules language which enables you to customize the defence and mitigate attacks. It even has predefined rules to defend against cross-site scripting (XSS) and SQL injection (SQLi) application-aware attacks. If you are running a web application, it will help protect it from SQL injection, DDoS attacks and more, based on the allow and deny rules you have configured.
Expect to come across this popular question in basic cloud computing interview questions.
VPC provides connectivity from your on-premises environment and across all regions without exposure to the internet. It provides connectivity to Compute Engine virtual machine instances, Kubernetes Engine clusters, App Engine Flex instances, and other resources based on the projects. We can use multiple VPCs in various projects.
It is associated with firewall rules and routes for global resources, not for any individual region. A VPC can even be shared across multiple projects.
It is commonly used for the Google Cloud Platform and in hybrid scenarios.
Cloud Storage is used to store and retrieve data worldwide, and we can integrate it into apps with a single API. It is RESTful online storage for web apps to store and access data on the Google Cloud Platform. It provides geo-redundancy with the highest levels of availability and performance, and offers low-latency, high-QPS content serving to users distributed across geographic regions.
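For instance, here is a minimal sketch using the google-cloud-storage Python client; the bucket and object names below are placeholders:

```python
# Sketch: storing and retrieving an object with the
# google-cloud-storage client library.
from google.cloud import storage

client = storage.Client()                        # uses default credentials
bucket = client.bucket("my-demo-bucket")         # placeholder bucket name
blob = bucket.blob("reports/2023/summary.txt")   # placeholder object path

blob.upload_from_string("hello from the cloud")  # write the object
print(blob.download_as_text())                   # read it back
```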
Common Use Case:
App Engine is a fully managed platform on which web applications can be deployed in the Google Cloud Platform. When we deploy to Google App Engine, it fully manages the environment automatically and provides better security and reliability. It supports Java, PHP, Node.js, Python, C#, .NET and Ruby, and it scales automatically when traffic increases. It is a highly available platform which automatically scales instances up and down as per usage. We can manage the resources using command-line tools, debug the source code and run the API easily using DevOps tools like Visual Studio, PowerShell, the SDK and Cloud Source Repositories.
We can secure the application by using the App Engine firewall and by managing SSL/TLS certificates.
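As a sketch of how little code a managed platform like App Engine needs, here is a minimal Flask app; Flask is an illustrative choice, and the App Engine Python standard environment conventionally serves a WSGI object named `app` from `main.py`:

```python
# main.py -- a minimal Flask app of the kind App Engine's Python
# standard environment can run.
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    return "Hello from App Engine"

if __name__ == "__main__":
    # Local testing only; on App Engine a production WSGI server is used.
    app.run(host="127.0.0.1", port=8080, debug=True)
```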
A common question in cloud computing questions and answers, don't miss this one.
Yes, you can replicate S3 bucket data across regions. The bucket replication feature allows you to copy objects across different AWS Regions.
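A hedged sketch of setting this up with boto3 follows; the bucket names and IAM role ARN are placeholders, and cross-region replication also requires versioning on both buckets:

```python
# Sketch: enabling cross-region replication between two S3 buckets.
import boto3

s3 = boto3.client("s3")

# Replication requires versioning on the source and destination buckets.
for name in ("src-bucket", "dst-bucket"):
    s3.put_bucket_versioning(
        Bucket=name,
        VersioningConfiguration={"Status": "Enabled"},
    )

s3.put_bucket_replication(
    Bucket="src-bucket",
    ReplicationConfiguration={
        "Role": "arn:aws:iam::123456789012:role/replication-role",  # placeholder
        "Rules": [{
            "ID": "replicate-everything",
            "Status": "Enabled",
            "Prefix": "",   # empty prefix = replicate all objects
            "Destination": {"Bucket": "arn:aws:s3:::dst-bucket"},
        }],
    },
)
```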
It provides compute in AWS: if you want to deploy VMs, you use EC2 instances, which can be deployed in any region. EC2 is a highly available and scalable service in AWS for deploying heavy workloads. It also provides a key pair to secure the remote connection. EC2 instances are used to deploy applications, SQL databases and any IaaS-based workload. The cost of an EC2 instance is based on VM usage per second. You can use this kind of solution for heavy workloads as well.
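For illustration, a minimal boto3 sketch of launching one instance; the AMI ID, key pair name and region are hypothetical:

```python
# Sketch: launching a single EC2 instance with boto3.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # region is an assumption

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical AMI ID
    InstanceType="t3.micro",
    KeyName="my-keypair",             # the paired key for SSH access
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])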
Below is the storage I have used in my various projects.
With regard to availability, it is the time duration for which the provider ensures your services are available, regardless of your cloud type (Public, Private, Community, or Hybrid) and service type (SaaS, PaaS, or IaaS). Commonly this metric is expressed with the percentage of uptime as its basis. Uptime is the amount of time the respective service is available and operationally online in a specific time interval. And so, if the uptime is 99.99% in a year, the total duration you will be unable to access the service, widely known as downtime, is no more than 52 minutes and 36 seconds in 12 months.
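To sanity-check these figures, here is a small Python sketch that converts an SLA percentage into the maximum yearly downtime (the 365.25-day year is an assumption to average out leap years):

```python
# How much downtime does a given SLA uptime percentage allow per year?
def max_downtime_minutes(uptime_percent: float, days: float = 365.25) -> float:
    return (1 - uptime_percent / 100) * days * 24 * 60

for sla in (99.0, 99.9, 99.99, 99.999):
    m = max_downtime_minutes(sla)
    print(f"{sla}% uptime -> {int(m)} min {round((m % 1) * 60)} s per year")

# 99.99% works out to about 52 min 36 s, matching the figure quoted above.
```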
Availability of the services varies from one cloud provider to another. Therefore, you need to know your availability requirements in the first place, which include, but are not limited to, the business, mission-critical and time-critical systems you have and your expected uptime/acceptable downtime.
Be mindful that the definition and measurement of availability are also different from one provider to another.
That said, upon identifying them, search for the vendor with service availability that meets or, even better, exceeds your requirements. Ideally, you should choose a vendor that guarantees, not only publishes, their availability, meaning that they will compensate your organization for missing the promised metrics and thresholds. Always read their Service Level Agreement (SLA) carefully and thoroughly, and comprehend their policies, terms, conditions and provision of compensation in case of an outage in its clauses.
If you are keen to understand the technical perspective, you might want to know how the provider manages outages (unplanned downtime), their Business Continuity Plan (BCP) and Disaster Recovery Plan (DRP), and how they handle maintenance (planned downtime) too.
By default, when you store, use, share or communicate your data in the cloud, usually, your data is in a raw, unencrypted format, known as ‘plaintext’, unless you have encrypted your data before being saved or transmitted.
If you leave your data unencrypted, you will face the risk that anyone who gains access to your account can read, copy or delete your data. This leaves your data leaked or exposed to unauthorized individuals and entities. Thus, end-to-end data encryption including your emails if stored in Cloud servers, at rest, in-use and in motion, is a must.
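As an illustration of client-side, end-to-end encryption, here is a minimal sketch using the Fernet symmetric scheme from the widely used `cryptography` package; the package choice and payload are assumptions for illustration:

```python
# Sketch: encrypting data before it ever leaves your machine, so the
# cloud provider only ever stores ciphertext.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # keep this key out of the cloud provider's reach
f = Fernet(key)

ciphertext = f.encrypt(b"sensitive payload")   # what you upload
plaintext = f.decrypt(ciphertext)              # what you recover locally
assert plaintext == b"sensitive payload"
```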
On the other hand, from the provider’s point-of-view, they will provide secure storage space and impose confidentiality obligations by limiting user access to those who are authorized to view, edit, add, delete the data based on your requests. What’s more, they will also protect the data from accidental or purposeful unauthorized access by internal or external actors.
Over and above that, you should gather the following information on data confidentiality policies, controls, practices, and technologies the provider has put in place:
Whether the vendor provides various ways to securely access our data and services based on a certain Access Control Matrix (ACM), consisting of the users, groups, permissions, privileges and credentials they offer.
Whether the vendor provides log files to capture key activities occurring in our cloud environment so we will be able to monitor, analyze them and do follow up, for the purpose of an audit trail in particular.
Whether you as the customer maintain full control of your data and have the responsibility for managing your data, not only the provider’s services and resources. Ask for the guarantee that they do not access or use your data for any purpose without your consent and, even more, that they don’t utilize your content or derive information from it for marketing or advertising.
Whether you could choose the region, country or city in which your data is stored and what type of storage is deployed. Ensure the provider doesn’t move, modify, add, delete or replicate your data without your prior consent.
Encryption provided by the vendor: the type (at rest, in transit, in-use), the algorithm (Symmetric such as Advanced Encryption Standard (AES) or Asymmetric with the likes of Rivest–Shamir–Adleman (RSA) and Elliptic Curve Cryptography (ECC)), the encryption keys and the Key Management.
Whether the provider has any exception policy for intentionally disclosing our data to other parties, usually due to a legal obligation, illegal conduct, and/or a binding order. If it happens, you need to know to whom your data is being unveiled and for what purpose, and the provider needs to notify you prior to the disclosure.
As you might already be aware, data integrity, one of the key aspects of Information Security, means the degree to which data is consistent, accurate and complete over its entire lifecycle.
To maintain it, you and the cloud provider must, hand-in-hand, provide such assurance.
First, assure the data cannot be modified by an unauthorized individual, entity, or program. This could be done by deploying Access Controls through an Access Control Matrix (ACM) or Access Control List (ACL) revealing username, role, privilege, menu, function and object. A forensic tool may also be needed to recover from accidental deletion by authorized users. In addition, also implement another control, checksums, to verify integrity.
Second, have data backups for occurrences like a power outage, database crash or storage failure. If the data is corrupted, try to identify the root cause, then recover it immediately. If that doesn’t go through, restore the correct data from the backup. Regardless of the storage media utilized for the backup, always put it in a separate logical or, even better, physical premise and location. A secure, confidential, safe one, obviously. The security policy, both logical and physical, applied to primary and backup data must be the same.
Third, implement algorithms and protocols, namely Message-Digest algorithm 5 (MD5), Advanced Encryption Standard (AES), Secure Hash Algorithm (SHA) and Rivest–Shamir–Adleman (RSA), to provide maximum levels of integrity management against any tampering or unauthorized access, specifically for data stored in the public cloud.
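As a minimal sketch of the checksum idea, using Python's standard hashlib with SHA-256 (generally preferred over MD5 today, though MD5 works the same way through the same module):

```python
# Sketch: verifying integrity with a checksum, as described above.
import hashlib

def sha256_of(path: str, chunk_size: int = 65536) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Record the digest before upload and recompute it after download;
# a mismatch means the data was corrupted or tampered with.
```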
Fourth, get data integrity verified through IT Audit activities. These could be conducted by internal (from your side or the provider’s end) or external entities (third-party/independent). As the 3rd layer of defence inside an organization, an IT Auditor will assess, validate and test IT General Controls and IT Application Controls as necessary to verify the consistency, accuracy and completeness of certain static as well as dynamic data.
In more general terms, data and information privacy is the set of rights you have to control how your data and information are managed across their entire lifecycle – when acquired, maintained, used, published, transferred, stored, archived and disposed of.
Even though it looks quite similar to data and information security, and in most cases the two definitions are found to overlap, they have a major difference to call to mind.
Privacy’s centre of attention is how data and information are used and governed through certain policies, laws and regulations. On the contrary, the primary focal point of security is how data and information are protected from countless threats and vulnerabilities. In consequence, the last-mentioned isn’t adequate to deal with privacy.
What you could possibly do is to collect as much information as you can from the cloud provider on:
How your data and information are processed in the cloud, including but not limited to where the provider is from, their head and in-country offices, storage media, storage/server location, backup media and its location.
How they enable users to have proper controls over their data and information across its lifecycle.
What are the controls – administrative, technical and physical – the provider deploys such as policy, procedure, mechanism, standard related to data and information privacy?
How they assure our data and information are appropriately managed, and the compensation they bring to the table if privacy is broken.
The entity that is responsible for ensuring compliance with a certain standard, applicable law and regulation, along with regulatory requirements.
Whether there is any subcontractor involved in providing products and services to the cloud provider, and to what extent this vendor is involved in your data and information processing.
The standards and frameworks on data and information privacy the provider follows and complies with. You also need to know and understand their implications on your data and information privacy.
The laws and regulations on data and information privacy the provider complies with. Be aware of similar laws and regulations your country may have, and also get acquainted with their implications and consequences.
Identify the processes for how the provider deals with cross-border data transfer if we store and process our data in multiple sites across several geographical premises in a great number of countries.
Expect to come across this popular question in cloud computing scenario based questions.
Make sure you develop a business case in the first place that consists of a minimum of three options put into the spotlight and one recommendation. Those alternatives unveiled could possibly be, for instance, #1 public cloud, #2 private cloud, #3 hybrid cloud, #4 business as usual or do nothing – assuming what your organization needs is, for instance, Infrastructure as a Service (IaaS).
In detail, a business case is a written document typically containing material related to a new business or business improvement idea intended to convince the respected decision-makers to take any action. Into the bargain, it is aimed to justify the investment of resources and finances then obtain the stakeholder’s approval based on research, analysis and facts.
According to the commonly accepted industry practice, the business case shall constitute:
A high-level view explaining the problem the proposed alternative is intended to solve, major considerations, desired deliverable, as well as the predicted business and financial aspects the recommendation shall achieve.
What problems will be solved by implementing IaaS, or what opportunity will your office benefit from with an IaaS deployment?
Four alternatives mentioned earlier are slated here.
A structured approach in which we can compare the solutions for effective decision-making speaks volumes. Example: SWOT analysis, Real Options Decision Tree.
An advanced level of analysis is needed to evaluate whether the solution being pursued is also financially viable e.g. Cost-Benefit Analysis/Ratio (CBA or CBR), Net Present Value (NPV), Internal Rate of Return (IRR), Return on Investment (ROI).
Identify risks of each option (public, private, hybrid, stay as is) and how to deal with them through mitigation activities.
Provides a realistic picture of how each proposed alternative will be rolled out in high-level point-of-view including its approach, timeframe, benefits, costs, and quality.
Assumptions should be validated and confirmed since the viability of alternative is dependent on them. The similar treatment goes to dependencies because they will become the prerequisites for a specific alternative/solution.
Disclose one of the four alternatives that have been assessed with Business Value Analysis and Cost-Benefit Analysis as the recommended option.
Numerous success factors that should be thought about are:
Start with development and testing environment first. Leave the production system and its configuration out to minimize the probability and more importantly impact of the concerns you may encounter.
Choose to move the least time-critical and mission-critical systems first, because if there is any incident, your organization's business will otherwise suffer.
Identify and migrate the systems with the simplest architecture as the action will be considered as low risk.
Assess your systems entirely then decide which one you are keen to migrate. Analyze minimum specifications, configurations and actual usage in various situations (high, medium, low workload). Upon completion, identify and choose the appropriate cloud environment to match, whenever and wherever possible, all your organization’s requirements.
Be mindful of the licensing, including the model and cost, as well as the Terms and Conditions of the cloud service (IaaS, SaaS, PaaS) and model (public, private, hybrid) you will procure, since they differ from vendor to vendor and cloud service provider.
You should assess and evaluate SLA thoroughly including the compensation if the provider is unable to achieve the agreed metrics.
Ensure that you identify and discover application/system dependencies to avoid unplanned outages and limited functionality that usually occurs when the migration is completed.
Review each architecture (Application, Data, Infrastructure) comprehensively to achieve optimization of the cloud platform.
Inquire about the processes, activities, methods, tools needed and or offered by the provider to migrate from their cloud into another cloud.
Ask the provider about their network availability and bandwidth requirements, because you and other end users will be accessing cloud services and products anywhere, any time.
In general, the cloud adoption activities should have a pre-adoption review, planning, execution, testing and post-adoption review to make sure things go well for you.
On condition that most of, or better, all the answers are yes, you could do this assignment alone. Otherwise, you could identify the adoption partner based on scores of measurements like the vendor’s capability, experience, resources, support, tools, portfolio and their client’s testimonials.
Both metrics and KPI will help you to understand the current state of your IT environment and determine whether your adoption is successfully completed.
Develop your cloud-adoption plan that accommodates the below factors:
In the big-bang (do it all at once) scenario, you drive a huge change in one go as you move your entire computing estate over and run tests to see if it works as expected. Presuming you take the short-sprint (do it a little bit at a time) option, you migrate one computing component over, validate it, then continue these activities until all components are moved to the cloud.
You are urged to make sure everything is working by conducting tests. They could be manual or automated, based on plenty of scenarios, capitalizing on the previously agreed baseline metrics and KPIs as key success criteria.
It constitutes three main points: what went well (good things), what’s the room for improvement (bad things), and what’s the action plan (to improve the bad).
It constitutes a basic understanding of cloud computing from both business and technical points of view, migration from on-premises to the cloud, and the governance of cloud computing environments. It is issued by a non-profit Information Technology trade association, the Computing Technology Industry Association (CompTIA). No prerequisite is required; nevertheless, the examinee is recommended to have at least six months of working experience in an IT services environment.
Slightly different from Cloud Essentials, it validates your skills in maintaining and optimizing cloud infrastructure services. Consequently, it will assess our competence to perform data centre jobs effectively and efficiently such as configuration, deployment, security, troubleshooting, maintenance and management.
Offered by an independent exam and certification company EXIN, it covers cloud computing basic concepts and principles, tests the technical knowledge namely Security and Compliance as well as looks at general aspects inclusive of implementation, management, and evaluation.
For those who are relatively new to cloud computing, this credential is assessing your basic knowledge of Cloud Computing concepts. Developed by joint forces between EXIN and an international member-based organization Cloud Credential Council (CCC), it tests your understanding of the main concepts of Cloud Services Model, Virtualization, Cloud Technologies and Applications, Security, Risk, Compliance, Governance, Adoption, and Service Management.
This Amazon Web Services (AWS) Certified Solutions Architect accreditation is divided into two paths: Associate and Professional. The first is aimed at assessing the individual's knowledge of architecting and deploying secure, robust systems on AWS, and it is also a prerequisite to achieving the professional certification. It likewise validates your ability to define solutions based on customer/end-user requirements using architectural design principles, and to provide implementation guidance to your organization based on best practices throughout the project life cycle.
What’s more, the professional path targets individuals with two or more years of hands-on experience in designing and deploying cloud architecture and architecting and implementing dynamically scalable, highly available, fault-tolerant, and reliable applications on AWS. It also validates the exam taker’s competence in migrating complex, multi-tier applications on the platform, designing and deploying enterprise-wide scalable operations and implementing cost-control strategies.
As a professional Cloud Architect, you are expected to have the necessary skills and knowledge to enable your organization to leverage Google Cloud technologies. By securing this testament, your ability to design, plan, develop, implement, manage and provision robust, secure, scalable, highly available and reliable cloud architecture using Google Cloud Platform (GCP) along with dynamic solutions to drive business objectives is recognized.
It's no surprise that this one pops up often in interview questions about cloud computing.
To date, many ISO standards have been applied to the cloud. Taking out the expired and withdrawn versions, here is the list:
Information Technology -- Cloud computing – Overview and vocabulary
Information Technology -- Cloud computing -- Reference architecture
Information Technology -- Cloud Data Management Interface (CDMI)
Information Technology -- Cloud computing -- Service level agreement (SLA) framework -- Part 1: Overview and concepts
Cloud computing -- Service level agreement (SLA) framework -- Part 2: Metric model
Information Technology -- Cloud computing -- Service level agreement (SLA) framework -- Part 3: Core conformance requirements
Cloud computing -- Service level agreement (SLA) framework -- Part 4: Components of security and of protection of PII (Personally Identifiable Information)
Information Technology -- Virtualization Management Specification
Cloud Infrastructure Management Interface (CIMI) Model and RESTful HTTP-based Protocol -- An Interface for Managing Cloud Infrastructure
Information Technology -- Cloud computing -- Interoperability and portability
Information Technology -- Cloud computing -- Cloud services and devices: Data flow, data categories and data use
Information Technology -- Cloud computing -- Guidance for policy development
Information Technology -- Cloud computing -- Framework of trust for processing of multi-sourced data
Information Technology -- Security techniques -- Code of practice for information security controls based on ISO/IEC 27002 for cloud services
Information Technology -- Security techniques -- Code of practice for protection of PII in public clouds acting as PII processors
Like any other ISO standards, conforming to them has many benefits for the provider’s businesses: building credibility at the international level, saving time and money by identifying and solving recurring problems, and improving and enhancing the system and process efficiency and effectiveness. On top of that, it is also living proof, publicly accessible, that the provider has properly managed their information security, including its risk, fulfilled their audit requirements and established trust both internally and externally that controls are properly placed and implemented in order to serve their customers better and hence increase their satisfaction level.
You, as the user, are urged to assess their ISO certification. Critical points to reflect on are: which product, service, or location does it actually cover? Is the certification for the entire organization or only for their head office exclusive of their branches? Who issues the certification and whether the issuer is one of the ISO-accredited bodies? For certain, you must see the original certificate and witness what information revealed there.
A hypervisor is software which is used to virtualise a physical server into logical servers to optimise resource utilization. Hypervisors are divided into two types.
Bare-metal hypervisors, deployed directly over the physical server, are classified as Type 1 hypervisors. Some examples of Type 1 hypervisors are the Microsoft Hyper-V hypervisor, VMware ESXi and Citrix XenServer.
When a hypervisor runs on top of an OS, it is a Type 2 hypervisor; examples are KVM and Oracle VirtualBox.
Multi-cloud is a cloud deployment model where IT infrastructure resources like compute, storage and network bandwidth are drawn from multiple cloud services or an in-house data centre to complete business transactions. It is a pooling of resources from different cloud service providers, or a combination of IT resources from an in-house data centre and cloud services. This model is a good use case where a business function's resource needs cannot be met from one location.
A common question in basic interview questions on cloud computing, don't miss this one.
The cloud hosting drivers to weigh when identifying workloads are the following:
Business applications which handle mission-critical, ERP and data-sensitive information are not fit for cloud hosting. Applications which run on platforms other than Intel are also not fit for immediate migration to a cloud platform.
In that case, we need to create a Storage account (V1 or V2, based on the requirements), create the file storage and create the directory. We then click on the connect button and map the drive to the customer's servers.
It collects logs via Azure Monitor and stores them in a Log Analytics workspace for analyzing and sending alerts. We can even query it to find specific alerts or logs if required. It is basically a monitoring tool which monitors most of the Azure services, collecting the logs in various ways.
I will click on New, go to the Marketplace, search for Web Apps, provide the details and create it.
Blob storage is used to store massive amounts of unstructured data like JPEG files or archived files. It is a cloud-based solution that provides durability and high availability, and it is a secure, manageable solution for larger data. We can access the storage account easily using HTTP/HTTPS, the API, etc.
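A minimal sketch of writing a blob with the v12 azure-storage-blob Python SDK; the connection string, container name and file names are placeholders:

```python
# Sketch: uploading a file to Azure Blob storage.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
container = service.get_container_client("media")  # placeholder container

with open("photo.jpeg", "rb") as data:
    container.upload_blob(name="archive/photo.jpeg", data=data)
```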
A few of the scenarios in which you will use blob storage accounts are listed below.
Google Stackdriver provides in-depth diagnostics and monitors the health of App Engine; it monitors Google Cloud services and sends out alerts for them.
It collects metrics, logs and events from Google Cloud infrastructure, applications and other operations running on the Google platform. Based on the collected logs, it speeds up root-cause analysis (RCA) and reduces time to resolution. It does not require any integration to provide support to developers.
Amazon RDS is easy to set up and operate. It is a highly scalable relational database service in the AWS cloud and a cost-effective solution: we can resize the capacity of the database when it is not in use. It helps us reduce administration, patching and backup tasks by automating those processes.
Amazon RDS is available on several database instance types - optimized for memory, performance or I/O - and provides you with six familiar database engines, including Amazon Aurora, PostgreSQL, MySQL, MariaDB, Oracle Database, and SQL Server. We can use the AWS Database Migration Service to easily migrate or replicate existing databases to Amazon RDS.
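As an illustrative sketch (the identifier, credentials and sizing below are placeholders, not recommendations), provisioning a small MySQL instance with boto3 might look like:

```python
# Sketch: provisioning a MySQL instance on Amazon RDS.
import boto3

rds = boto3.client("rds", region_name="us-east-1")  # region is an assumption

rds.create_db_instance(
    DBInstanceIdentifier="demo-mysql",
    Engine="mysql",
    DBInstanceClass="db.t3.micro",
    MasterUsername="admin",
    MasterUserPassword="change-me-please",  # never hard-code in real use
    AllocatedStorage=20,                    # size in GiB
)
```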
Azure Recovery Services vaults are used to take backups of VMs and other services, and they provide a migration feature which can be utilized if we need to migrate on-premises VMs to Azure.
They are also used for on-premises migration (Hyper-V, VMware and physical server migration to Azure). They do support backup for Azure VMs (Linux/Windows), Azure File storage, PaaS SQL, WebApps, SQL DB on Azure VMs, etc. We can configure a daily backup policy and schedule the backup; backups can be retained for up to 999 years. It provides fine-grained access management through RBAC. We can configure Site Recovery using the Azure Portal for backing up and migrating the on-premises environment to Azure.
This vendor-neutral accreditation issued by the Cloud Security Alliance certifies our understanding of security issues and best practices over a broad range of cloud computing domains, including architecture, governance, compliance, operations and encryption.
Getting the certification provided by Cloud Security Alliance and International Information System Security Certification Consortium known as (ISC)² indicates you have advanced technical skills and knowledge to design, manage and secure data, applications and infrastructure in the cloud using best practices, policies and procedures.
Nevertheless, it is considered one of the more advanced credentials; consequently, those who are interested in pursuing it are required to have five years of full-time working experience in IT fields. Three of those years shall correlate with information security, whereas one year must pertain to architectural concepts, design requirements, security in cloud data, cloud platform and infrastructure, cloud applications and operations, and legal and compliance.
If and only if you have earned CCSK, the previously mentioned one-year requirement will be waived. Supposing that you already hold Certified Information Systems Security Professional (CISSP) certification, it can replace the entire five-year requirement.
This EXIN certification focuses on the interconnection of three areas: Service Management, Cloud Computing, and IT Security. That being the case, you will be automatically granted it without any cost when you possess their three foundational certificates: EXIN Information Security, EXIN Cloud Computing Foundation, and (IT) Service Management.
Issued by Cloud Credential Council (CCC) and managed by EXIN, it recognizes skills and knowledge an individual possesses on security, risk and compliance cloud computing issues. In detail, the certificate primary focus is on the intersection between business and technical security challenges in an enterprise’s cloud computing environment. Five years of working experience in enterprise security with a deep understanding of cloud computing services and deployment models are the recommended prerequisites.
Speaking about applicable laws and regulations to certain data and information, there is a term called ‘data sovereignty’ or ‘information sovereignty’. In this case, since you’re asking about data, the first expression will be capitalized.
Basically, it is subject to numerous laws and regulations of the country in which the data is located or stored, used and transmitted – both sent and received.
Since those early days, it has been one of the key challenges when an individual/organization wants to move into the cloud: the government/authority insists that the data should never leave their jurisdiction, which directly means we couldn't place it in our desired services.
Thus far, there is no international policy, standard, or agreement which provides one set of data sovereignty’s requirements that all countries should be following.
Day in, day out, it gains more weight, and in response many countries have established and regulated compliance requirements by amending current laws or enacting new legislation that requires customer data to be kept within the country it resides in. Over the past few years, this kind of obligation has been enforced in Vietnam, Brunei, Iran, China, Brazil, India, Australia, South Korea, Nigeria, Russia and Indonesia.
Additionally, the laws and regulations vary by country whilst some are, in fact, stricter than the others. Some of them mandate their citizens’ data is stored on physical servers within the country’s physical borders. Australia, for example, commands the provider to reveal what information is being sent outside the country.
In addition to that, the European Union (EU) restricts the transfer of Personally Identifiable Information (PII) to countries outside their member countries. PII itself is the type of data that could potentially identify a specific individual and refers to a relatively narrow range of data such as name, address, birth date, credit card number or bank account.
What You Could Do
You'd better know the storage, server and any other device where your data will reside, what's in the fine print, and whether the provider has already complied with the data sovereignty laws of the country where your data is located. If your government body requires you to, in this context, store your data in the country where you are based, make sure of two things. First, the provider has its storage deployed there. Second, their obligations to applicable laws and regulations on data sovereignty are already fulfilled.
The cloud providers have what they call 'Cloud Management' tools, aimed at administrative control over public, private and hybrid clouds. The software is intended for the users to manage capabilities, availability, security, utilization, resource allocation, workflow, automation, workload balancing, capacity planning, monitoring, controlling, orchestration, provisioning, budgeting, cost and expense, performance, reporting and even the migration of the cloud products and services we have subscribed to.
On top of that, there are two types of cloud management software: in-house software developed and offered by the public, private or hybrid cloud provider, and mass-market products that come from third-party vendors to complement the aforementioned tools.
Over and above that, if you choose public cloud, then oftentimes you will be given the option to manage your services with third-party tools, simply because the servers, storage, networking, and other infrastructure operations are taken care of by the providers.
Besides, if you are a private cloud user, the tool is required to create the virtualization and virtualized computing resources, deal with resource allocation, security, monitoring, tracking as well as billing through a self-service portal.
Cloud management is more complex to handle when it comes to hybrid cloud due to obligation on having to deal with the network, computing, and storage devices across multiple domains including but not limited to installation, configuration, administration of instances, images, user accounts, and their access rights as part of Identity and Access Management.
Regardless of a native and third-party tool designed to provide rich functionality across one or multiple cloud providers, the platform must be able to provide the following features at the very least:
Since your company is yet to decide which vendor to pick, a few frameworks you could check out are:
If you find it challenging to deal with User Experience (UX) as the complexity and diversity of cloud systems emerges from time to time, this framework from National Institute of Standards and Technology (NIST) under U.S. Department of Commerce comes into the picture.
You could evaluate your cloud UX and the user expectations in a more structured way through five attributes and 21 elements provided.
Developed by NIST Cloud Computing Reference Architecture and Taxonomy Working Group, this paper is intended to portray a high-level conceptual model for defining requirements, structures along with the operations of cloud computing.
On top of that, it is divided into two parts. One is a complete overview of the actors, including their roles, and the architectural components for managing and providing cloud services, with the likes of service deployment, service orchestration, cloud service management, security and privacy. The other is the Taxonomy, presented in its own section and appendices, consisting of terms, definitions and examples of cloud services.
Without question, there is no such thing as a one-framework-fits-all. Each of them has its pros and cons. Due to this reason, it’s important to analyze the available frameworks and put the Cost and Benefit approach, for instance, or any other key metrics into consideration.
Else, you could develop your own framework or explore the opportunity to design a hybrid framework by combining yours with existing framework out there, or even more, drawing this further, combining a handful of frameworks to help your organizations meeting their unique requirements as well as business objectives.
If you need to assess the security risks of a cloud provider, this framework will bear fruit; it provides fundamental security concepts and principles in 13 domains and 133 controls for the vendor to follow.
Shortly known as CCM, from the vendor’s perspective, it will improve or enhance security control environments by emphasizing business information security control requirements, identifying and mitigating from security threats and vulnerabilities in the cloud. The matrix also offers cloud taxonomy and terminology, security measurements, standardized security risk, IT risk and operational risk when notably managing one or all of them.
Sr. No. | Cloud Control Matrix - Domains | No. of Controls for Each Domain (Cloud Security Alliance) |
---|---|---|
1. | AIS: Application & Interface Security | 4 |
2. | AAC: Audit Assurance & Compliance | 3 |
3. | BCR: Business Continuity Management & Operational Resilience | 11 |
4. | CCC: Change Control & Configuration Management | 5 |
5. | DSI: Data Security & Information Lifecycle Management | 7 |
6. | DCS: Datacenter Security | 9 |
7. | EKM: Encryption & Key Management | 4 |
8. | GRM: Governance and Risk Management | 11 |
9. | HRS: Human Resources | 11 |
10. | IAM: Identity & Access Management | 13 |
11. | IVS: Infrastructure & Virtualization Security | 13 |
12. | IPY: Interoperability & Portability | 5 |
13. | MOS: Mobile Security | 20 |
14. | SEF: Security Incident Management, E-Discovery & Cloud Forensics | 5 |
15. | STA: Supply Chain Management, Transparency and Accountability | 9 |
16. | TVM: Threat and Vulnerability Management | 3 |
If you consider adopting public cloud computing, then this 80-page document shall come to light. It will give you a big picture of security and privacy challenges and crucial points to consider when you outsource your data, applications and infrastructure to a public cloud provider in which they own and operate the infrastructure and computational resources, aside from the fact they deliver services to the public via a multi-tenant platform.
As this paper tells us, it does not recommend any specific cloud computing service, service arrangement, service agreement, service provider, or deployment model. The consequence is that each organization is encouraged to apply its very own guidelines when analyzing its requirements, inclusive of security and privacy, and to assess, select, engage, and oversee the public cloud services that can best fulfil those requirements.
Other than the two frameworks explained above, you could also bring another document, titled ‘Security Guidance for Critical Areas of Focus in Cloud Computing v4.0’ from the Cloud Security Alliance (CSA), into play. Developed from previous iterations of the security guidance, dedicated research, and public participation from their members, working groups, and industry experts within their community, it shows how to manage and mitigate security and risks in adopting cloud computing technology while also pledging guidance and insights to support business goals.
The document issued by ISACA is intended to both actors, cloud users and cloud providers, so they could assess the design and operating effectiveness of the cloud computing internal controls (administrative, physical, technical) and security, identify internal control discrepancies and deficiencies within the end-user organization and its interface with the service provider. In essence, we could refer to this guide to, after all, provide the results of an audit assessment and our ability to rely upon our own IT department and or the cloud provider’s attestations on internal controls.
As the title stands and tells us, this white paper from SANS Institute guides us on how to conduct a security audit on our cloud environment and also is aimed for the cloud provider to audit their cloud environment.
It consists of an audit methodology, audit checklists, standards, laws and regulations we could put into service to witness security risks and, in the end, test the respective controls.
The areas to be audited are as follows:
As we might already know, ISACA develops the IT Assurance Framework (ITAF) as a guideline that provides information and direction for the practice of IT audit and assurance. It also offers tools, techniques, methodologies, and templates to direct the application of IT audit and assurance processes. Read up on ITAF sections 3400 – IT Management Processes and sections 3600 – IT Audit and Assurance Processes, and keep an eye on sections 3800 – IT Audit and Assurance Management.
Well, safely say the components such as audit objective, scope, risk, plan, methodology/approach, along with its procedures (processes and techniques), are much the same as other types of IT or IS Audit engagement.
The main thing is, in the cloud, with shared resourcing, multitenancy and geolocation, the boundaries are difficult to define and isolate meanwhile the end-user specific transactional information is difficult to obtain. As such, IT Assurance needs to become more real-time, continuous and process-oriented vs. transactional in focus, while the cloud providers need to provide greater transparency to their clients.
Objective
Organizations should strive to align their business objectives with the objectives of the audit. During the planning stage, the auditor shall identify what the objectives are, then have them agreed with the auditee. From the auditor's end, they are going to use the objectives as a way of concluding on the evidence they obtain. Some of the notable objectives are:
To assess the cloud computing service provider’s internal controls.
The above controls also include IT application controls, not merely IT general controls, which are aimed at providing assurance of a specific application, its functionality and suitability.
To get an idea on the controls including their objectives on the cloud environment, have a look at ISACA Control Objectives for Information and Related Technologies (COBIT). Even though it is developed as a general control framework, some of the control objectives have some applicability to the cloud.
Scope
When it comes to IT general controls, the auditor from the customer’s end shall do the review on:
If your IAM system is integrated with the cloud computing system
To interface with and manage cloud computing incidents
As an access point to the internet
If the cloud is part of your application infrastructure
It is also important to note that the controls that are maintained by a vendor are not included in the scope of a cloud computing audit.
It is common practice for an organization to use these two approaches to measure a cloud provider:
Inclusive of vendor risk assessment, vendor due diligence, vendor rating/tiering, vendor Scope of Work, vendor agreement, and vendor Service Level Agreement (SLA)
A third-party auditor, whether provided by the cloud provider or the end-user.
Procedure
Whether it’s rolled out by your internal function, the vendor’s organizational unit, or a third party, the auditor will put a stack of processes and techniques to use to obtain evidence through inquiry of data and documents, assessment, confirmation, recalculation, reperformance, observation, meetings, discussion, inspection and analytics.
Cloud Governance is basically a set of standardized policies and practices involving people, process and technology related to cloud computing environment and designed to ensure the organization and more importantly business objectives are met without surpassing risk tolerance and compliance requirements.
Business goals and objectives are varied between one and another entity, however, the most commonly found is the performance, budget/cost optimization, customer satisfaction, employee attraction and retention and resource productivity.
According to The Open Group, governance answers three huge questions. First, are we doing the right things? Second, are we doing things in the right way? Last, how do we know that we have done both?
The global consortium that enables the achievement of business objectives through IT standards thinks that Cloud Computing Governance is a view of IT Governance focused on accountability, defining decision rights and any other way of balancing benefit/value, risk, and resources in a cloud environment. At large, it is a subset of overall business governance which includes IT Governance and Enterprise Architecture (EA) Governance.
You could put their Cloud Computing Governance Framework to use. As a pool of business-driven policies and principles that establish the appropriate degree of investments and control around the Cloud Computing lifecycle and its processes, your organization could make certain all expenses and costs associated are aligned with your company business objectives, foster data integrity organization-wide, stimulate innovation, and manage the risk of data loss and or non-compliance with regulations, they say.
As it will help our organization identify vulnerabilities before a compromise could take place, the process starts by identifying and assigning severity levels to security defects through manual and automated techniques over a certain period of time. Be mindful that since this is related to cloud computing, there are two types of PT: first, the tests the provider runs against its own platform, and second, the tests you could run against their resources, specifically for your own systems. Importantly, not all cloud vendors allow penetration testing.
Ideally, the assessment shall target different layers of technology from Host, Network, Storage, Server, Virtualization, Operating System, Middleware, Runtime, Database, and Application by highly considering your cloud models (SaaS, PaaS, IaaS, etc.) and cloud deployment models.
Cloud computing delivers computing services like servers, storage, databases, networking, software, analytics, intelligence and many more over the Internet (“the cloud”) to offer faster innovation, flexible resources and economies of scale. The organization which offers these services is called a Cloud provider.
Today, the scope of Cloud Computing is huge, as it is a fast-emerging business standard. Many organizations are experiencing the fruits of Cloud applications in a few different ways. Also, features like lower cost, faster speed, global scalability, more productivity and, most importantly, data protection from potential threats are responsible for a big shift from the traditional way of doing business to cloud computing services.
The rapid shift to the Cloud in an era of innovation has led many organizations to employ a cloud-first approach to product design, with some technology and business innovations available as cloud services. Microsoft is the global leading provider of Cloud computing services for businesses of all sizes. Many companies offer Cloud Computing services and are referred to as Cloud Computing providers. These top Cloud Computing companies include Microsoft, SAP, Oracle, Google, IBM, AT&T, Salesforce, etc.
According to the 2019 Cloud Computing report by Forrester, Dave Bartoletti, Vice President and Principal Analyst at Forrester, has pegged 2019 as the year of widespread enterprise adoption of cloud to power digital transformation efforts. Moreover, he also stated that "In 2019, cloud computing will be shorthand for the best way to turn disruptive ideas into amazing software."
“Individuals skilled in areas like AI, cloud computing, digital marketing and cyber security are predicted to be in high demand in 2019,” Katie Bardaro, lead economist and vice president of data analytics at PayScale, told FOX Business.
We have brought you hand-picked top Cloud computing interview questions after a lot of detailed research to help you in your interview. These Cloud computing interview questions and answers for experienced professionals and freshers alike will help you excel in the Cloud job interview and give you an edge over your competitors. Therefore, to succeed in the interview, you need to go through these questions and practice them as much as possible. You can look for many other courses in cloud computing to upskill your career more progressively.
If you want to make your career in Cloud, you need not worry, as the set of Cloud Computing interview questions designed by experts will guide you to get through the Cloud interviews. Stay in tune with the following interview questions and prepare beforehand to become familiar with the questions you may encounter while searching for a dream job. You can enroll in our cloud Architecting on AWS Certification Training to be better prepared for other cloud career roles.
Hope these Cloud Computing Interview Questions will help you to crack the interview. All the best!
Happy job hunting!