Cloud Controls Matrix (CCM), Governance, Risk and Compliance (GRC)
Citadel Cloud Management provides an in-depth curriculum centered around the Cloud Controls Matrix (CCM) created by the Cloud Security Alliance (CSA). This curriculum is aimed at helping enterprises develop and refine their cloud security strategies by leveraging the CCM framework to assess and manage risks associated with cloud computing providers.
Curriculum Content:
- Introduction to the Cloud Controls Matrix (CCM):
- Overview of the CCM framework and its purpose in cloud security.
- Explanation of how the CCM helps in defining security requirements and assessing cloud providers.
- Understanding the relationship of the CCM to other industry-accepted security standards and frameworks, such as ISO 27001/27002, NIST, PCI DSS, and more.
See references: https://cloudsecurityalliance.org/blog/2020/10/16/what-is-the-cloud-controls-matrix-ccm and https://cpl.thalesgroup.com/faq/data-security-cloud/what-cloud-security-alliance
Detailed Domain Coverage:
- Application & Interface Security: Principles governing application security, data integrity, and customer access requirements.
- Audit Assurance & Compliance: Processes for audit planning, independent audits, and mapping to regulations and standards.
- Business Continuity Management & Operational Resilience: Strategies for business continuity planning, testing, and maintenance.
- Change Control & Configuration Management: Handling changes, acquiring new applications or data, and managing development and quality testing.
- Data Security & Information Lifecycle Management: Best practices for managing data flow, inventory, and lifecycle.
- Data Center Security: Physical security controls, asset management, and access control for data centers.
- Encryption & Key Management: Policies for key management, encryption, and protecting sensitive data.
- Governance & Risk Management: Risk assessments, policy enforcement, and oversight in managing data-focused risks.
- Human Resources Security: Governance of employee-related security aspects, including termination, mobile device management, and training.
- Identity & Access Management: Credential management, segregation of duties, and access restrictions.
- Infrastructure & Virtualization Security: Intrusion detection, vulnerability management, and OS hardening.
- Interoperability & Portability: Use of APIs, data requests, and ensuring portability between services.
- Mobile Security: Management of mobile devices, anti-malware practices, and app store policies.
- Security Incident Management, Cloud Forensics & E-Discovery: Incident reporting, response management, and legal preparation.
- Supply Chain Management, Accountability & Transparency: Controls related to data quality, incident reporting, and supply chain metrics.
- Threat & Vulnerability Management: Managing antivirus, patch management, and addressing vulnerabilities.
Mapping to Standards and Frameworks:
- CCM v4 is mapped to various standards such as ISO/IEC 27001/27002/27017/27018, CIS Controls V8, and others.
- CCM v3.0.1 mappings include standards like NIST SP 800-53, PCI DSS, and ISACA COBIT.
- Understanding how fulfilling CCM controls can help meet requirements of multiple standards and regulations simultaneously.
Application and Implementation:
- Practical application of the CCM framework to develop a cloud security strategy.
- Using the CCM spreadsheet to align cloud security controls with multiple frameworks and simplify compliance.
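As a rough illustration of the last point, the sketch below filters a local export of the CCM spreadsheet to list the controls that carry a mapping to a chosen framework. The file name and column headers ("Control ID", "Control Title", "NIST SP 800-53") are assumptions for illustration; adjust them to the headers in the actual CSA download.
```python
# Minimal sketch: filter a local export of the CCM spreadsheet to see which
# controls map to a given framework. The file name and column names below are
# placeholders; align them with the headers in the real CCM download.
import pandas as pd

def controls_mapped_to(ccm_csv_path: str, framework_column: str) -> pd.DataFrame:
    """Return CCM rows that carry a mapping in the chosen framework column."""
    ccm = pd.read_csv(ccm_csv_path)
    # A non-empty cell in the framework column means the control is mapped.
    mapped = ccm[ccm[framework_column].notna() & (ccm[framework_column] != "")]
    return mapped[["Control ID", "Control Title", framework_column]]

if __name__ == "__main__":
    result = controls_mapped_to("ccm_v4.csv", "NIST SP 800-53")
    print(f"{len(result)} CCM controls map to NIST SP 800-53")
    print(result.head())
```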
- 1. Framework and Security Fundamentals (Principles of Cybersecurity)
The framework's standards, guidelines, and practices provide a mechanism for organizations to:
- Describe their current cybersecurity posture;
- Describe their target state for cybersecurity;
- Identify and prioritize opportunities for improvement within the context of a continuous and repeatable process;
- Assess progress toward the target state;
- Communicate among internal and external stakeholders about cybersecurity risk.
- 2. Systems Administration
- 3. Network and System Security
- 4. Defensive Security
- 5. Offensive Security
- 6. Data and Infrastructure Security
- 7. Application Security
- 8. Advanced Threat Protection
- 9. Risk Governance and Compliance
- 10. Mobile Security
- 11. Baseline Set (Domains) of Security Controls
The Cloud Controls Matrix (CCM) is a baseline set of security controls created by the Cloud Security Alliance to help enterprises assess the risk associated with a cloud computing provider.
- 12. Application and Interface Security
Applications and programming interfaces (APIs) shall be designed, developed, deployed, and tested in accordance with leading industry standards (e.g., OWASP for web applications) and adhere to applicable legal, statutory, or regulatory compliance obligations.
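As a hedged illustration of what "designed in accordance with leading industry standards (e.g., OWASP)" can look like in code, the sketch below shows two common OWASP-aligned practices for an API handler: allow-list input validation and parameterized SQL queries. The table, field names, and example values are invented for this example.
```python
# Minimal sketch of two OWASP-aligned practices for an API handler:
# (1) validate and constrain input before use, (2) use parameterized SQL
# instead of string concatenation. Table and field names are invented.
import re
import sqlite3

USERNAME_RE = re.compile(r"^[A-Za-z0-9_.-]{3,32}$")  # allow-list validation

def get_account(conn: sqlite3.Connection, username: str):
    if not USERNAME_RE.fullmatch(username):
        raise ValueError("invalid username")  # reject early, fail closed
    # Parameterized query: the driver handles escaping, preventing SQL injection.
    cur = conn.execute("SELECT id, username FROM accounts WHERE username = ?", (username,))
    return cur.fetchone()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, username TEXT)")
    conn.execute("INSERT INTO accounts (username) VALUES ('alice')")
    print(get_account(conn, "alice"))          # -> (1, 'alice')
    try:
        get_account(conn, "bob' OR '1'='1")    # injection attempt
    except ValueError as exc:
        print("rejected:", exc)
```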
- 13. Audit Assurance and Compliance
Audit assurance and compliance starts with audit planning and ends with understanding a control framework based on regulations and standards. This part of the matrix includes independent audits, audit planning, and information system regulatory mapping.
These controls map to categories in NIST Special Publication 800-53, Revision 4 and Revision 5.
AUDIT PLANNING: Audit plans shall be developed and maintained to address business process disruptions. Auditing plans shall focus on reviewing the effectiveness of the implementation of security operations. All audit activities must be agreed upon prior to executing any audits. Read: https://csf.tools/reference/cloud-controls-matrix/version-3-0-1/aac/aac-01/
INDEPENDENT AUDITS: Independent reviews and assessments shall be performed at least annually to ensure that the organization addresses nonconformities of established policies, standards, authorization, vulnerability monitoring and scanning, procedures, and compliance obligations. Read: https://csf.tools/reference/cloud-controls-matrix/version-3-0-1/aac/aac-02/
INFORMATION SYSTEM REGULATORY MAPPING: Organizations shall create and maintain a control framework which captures standards, regulatory, legal, and statutory requirements relevant for their business needs. The control framework shall be reviewed at least annually to ensure changes that could affect the business processes are reflected. Read NIST Publications: https://csf.tools/reference/cloud-controls-matrix/version-3-0-1/aac/aac-03/
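A minimal sketch of what such a control framework register might look like in code follows; it ties each internal control to the external requirements it is claimed to satisfy and flags entries whose annual review is overdue. The control ID and requirement references are illustrative examples, not an authoritative mapping.
```python
# Minimal sketch of a regulatory-mapping register: each control records the
# external requirements it satisfies and when it was last reviewed.
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class ControlMapping:
    control_id: str                                        # internal control identifier
    description: str
    requirements: list[str] = field(default_factory=list)  # external references satisfied
    last_reviewed: date = date(2000, 1, 1)                  # placeholder; set per mapping

    def review_overdue(self, max_age_days: int = 365) -> bool:
        """Flag mappings not reviewed within the expected annual cycle."""
        return date.today() - self.last_reviewed > timedelta(days=max_age_days)

# Illustrative entry only; not an official CCM-to-framework mapping.
register = [
    ControlMapping("AAC-03", "Information system regulatory mapping maintained",
                   ["ISO/IEC 27001 Annex A.18.1", "PCI DSS Requirement 12"],
                   last_reviewed=date(2023, 1, 15)),
]

for mapping in register:
    if mapping.review_overdue():
        print(f"{mapping.control_id}: annual review is overdue")
```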
Read more: https://linfordco.com/blog/audit-procedures-testing
The Five Types of Audit Tests: These include, listed in order of complexity from lowest to highest: Inquiry, Observation, Examination or Inspection of Evidence, Re-performance, and Computer Assisted Audit Technique (CAAT).
Inquiry: The auditor asks appropriate management and staff about the controls in place at the service organization to determine some relevant information. This method is often used in conjunction with other, more reliable methods. For example, an auditor may inquire of management if visitors to the data center are escorted at all times if the auditor is not able to observe this activity while on site. No control objective or criteria should ever be supported by controls only tested through inquiry procedures.
Observation: Activities and operations are tested using observation. This method is useful when there is no documentation of the operation of a control, such as observing that a security camera is in place or observing that a fire suppression system is installed.
Examination or Inspection of Evidence: This method is used to determine whether or not manual controls are being performed. For instance, are backups scheduled to run on a regular basis? Are forms being filled out appropriately? This method often includes reviewing written documentation and records such as employee manuals, visitor logs, and system databases.
Re-performance: Re-performance (sometimes called recalculation) is used when the three methods above, even in combination, fail to provide sufficient assurance that a control is operating effectively; it can also be used on its own to demonstrate that controls are operating effectively. This method of testing (along with a CAAT) is the strongest type of testing for showing the operating effectiveness of a control. Re-performance requires the auditor to manually execute the control, such as re-performing a calculation that a system performs automatically, to confirm that the system performs the control correctly.
CAAT: This method can be used to analyze large volumes of data, or to analyze every transaction rather than just a sample of all transactions. Software is generally used to perform a CAAT, which can range from a spreadsheet to specialized databases or software designed specifically for data analytics (e.g., ACL).
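To make the CAAT idea concrete, here is a minimal sketch that tests an entire population rather than a sample, assuming a hypothetical CSV export of change tickets with the columns used in the code; real exports will differ.
```python
# Minimal CAAT sketch: test every record in a population instead of a sample.
# Assumes a hypothetical "change_tickets.csv" with the columns used below.
import pandas as pd

def unapproved_changes(path: str) -> pd.DataFrame:
    tickets = pd.read_csv(path, parse_dates=["approved_at", "deployed_at"])
    # Exception rule: a change deployed with no approval, or approved after deployment.
    missing = tickets["approved_at"].isna()
    late = tickets["approved_at"] > tickets["deployed_at"]
    return tickets[missing | late]

if __name__ == "__main__":
    exceptions = unapproved_changes("change_tickets.csv")
    print(f"{len(exceptions)} records in the full population failed the approval test")
```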
- 14. Business Continuity Management and Operational Resilience
Operational resilience and business continuity are measures that organizations use to mitigate risk. They can also help avoid unintended consequences, missed opportunities, and operational failures.
- 15. Change Control and Configuration Management
Change and Configuration Management (CCM) is seen as a way to manage the cross life cycle, i.e., controlling the creation and maintenance of application systems across the whole life cycle. This requires integrating, onboarding, and implementing; managing changes, releases, and tasks; using reports, flashboards, and dashboards; and administering and developing within the IT organization.
- 16. Data Security and Information Lifecycle Management
- 17. Data Center Security
What is Data Center Security?
Data center security is the practice of applying security controls to the data center. Components of a data center design include routers, switches, firewalls, storage systems, servers, and application-delivery controllers. Data center security follows the workload across physical data centers and multicloud environments to protect applications, infrastructure, data, and users. The practice applies to traditional data centers based on physical servers, to more modern data centers based on virtualized servers, and to data centers in the public cloud.
Steps to secure an application with threat modeling:
- Identify the potential threats and threat actors to the application.
- Identify the functions of the application and its objectives.
- Analyze the assets that can be targeted by cybercriminals.
- Determine the types of attacks and attack vectors that can be used, and the types of vulnerabilities that can be exploited by these vectors.
- Set the security measures that can be put in place to mitigate the risks.
The goal is to protect the data center from threats that could compromise the confidentiality, integrity, and availability of business information assets or intellectual property. Critical needs are visibility, segmentation, and threat protection (a sketch of this kind of threat enumeration follows the reference below).
Read reference article: https://versprite.com/blog/security-operations/software-development-lifecycle-threat-modeling/
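One way to capture the output of these steps is as structured data. The sketch below uses STRIDE-style categories as a common labelling scheme (an assumption; the CCM does not mandate STRIDE), and the assets, attack vectors, and mitigations are invented examples.
```python
# Minimal sketch: record threat-modeling output as structured data.
# STRIDE labels are one common scheme; entries here are invented examples.
from dataclasses import dataclass

STRIDE = {"Spoofing", "Tampering", "Repudiation",
          "Information disclosure", "Denial of service", "Elevation of privilege"}

@dataclass
class Threat:
    asset: str          # what can be targeted
    category: str       # STRIDE label
    attack_vector: str  # how it could be exploited
    mitigation: str     # security measure that reduces the risk

    def __post_init__(self):
        if self.category not in STRIDE:
            raise ValueError(f"unknown STRIDE category: {self.category}")

model = [
    Threat("customer database", "Information disclosure",
           "SQL injection via public API", "parameterized queries, WAF rules"),
    Threat("management network", "Elevation of privilege",
           "stolen admin credentials", "MFA, segmented admin VLAN"),
]

for threat in model:
    print(f"[{threat.category}] {threat.asset}: mitigate with {threat.mitigation}")
```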
Advantages of Embedding Threat Modeling in all the Phases of the SDLC:
1. Risk management: It allows risks to be managed proactively from the early stages of SDLC.
2. Security requirements: Deriving the security requirements to mitigate potential risks of vulnerability exploits and targeted attacks against application components.
3. Secure design: Ability to identify security design flaws, their exposure to threat actors and attacks, and prioritize fixing them by issuing new design documentation prior to the next phase of the SDLC.
4. Security issue prioritization: Determining the risk exposure and impact of threats targeting issues identified during secure code reviews and prioritizing them for mitigation.
5. Security testing: Deriving security tests from use and abuse cases of the application for testing of the effectiveness of security measures in mitigating threats and attacks targeting the application.
6. Secure release of applications after development: Allowing the business to make informed risk decisions prior to releasing the application, based on the mitigation of high-risk vulnerabilities and assertion, through testing, that countermeasures mitigate specific threats.
7. Secure release of application after an incident: Determining and identifying additional countermeasures that can be deployed.
CONTROLS
DCS-01: Asset Management
Assets must be classified in terms of business criticality, service-level expectations, and operational continuity requirements. A complete inventory of business-critical assets located at all sites and/or geographical locations and their usage over time shall be maintained and updated regularly, and assigned ownership by defined roles and responsibilities. Read article: https://csf.tools/reference/cloud-controls-matrix/version-3-0-1/dcs/dcs-01/
DCS-02: Controlled Access Points
Physical security perimeters (e.g., fences, walls, barriers, guards, gates, electronic surveillance, physical authentication mechanisms, reception desks, and security patrols) shall be implemented to safeguard sensitive data and information systems. Read article: https://csf.tools/reference/cloud-controls-matrix/version-3-0-1/dcs/dcs-02/
DCS-03: Equipment Identification
Automated equipment identification shall be used as a method of connection authentication. Location-aware technologies may be used to validate connection authentication integrity based on known equipment location. Read article: https://csf.tools/reference/cloud-controls-matrix/version-3-0-1/dcs/dcs-03/
DCS-04: Off-Site Authorization
Authorization must be obtained prior to relocation or transfer of hardware, software, or data to an offsite premises. Read article: https://csf.tools/reference/cloud-controls-matrix/version-3-0-1/dcs/dcs-04/
DCS-05: Off-Site Equipment
Policies and procedures shall be established for the secure disposal of equipment (by asset type) used outside the organization’s premises. This shall include a wiping solution or destruction process that renders recovery of information impossible. Read article: https://csf.tools/reference/cloud-controls-matrix/version-3-0-1/dcs/dcs-05/
DCS-06: Policy
Policies and procedures shall be established, and supporting business processes implemented, for maintaining a safe and secure working environment in offices, rooms, facilities, and secure areas storing sensitive information. Read article: https://csf.tools/reference/cloud-controls-matrix/version-3-0-1/dcs/dcs-06/
DCS-07: Secure Area Authorization
Ingress and egress to secure areas shall be constrained and monitored by physical access control mechanisms to ensure that only authorized personnel are allowed access. Read article: https://csf.tools/reference/cloud-controls-matrix/version-3-0-1/dcs/dcs-07/
DCS-08: Unauthorized Persons Entry
Ingress and egress points such as service areas and other points where unauthorized personnel may enter the premises shall be monitored, controlled and, if possible, isolated from data storage and processing facilities to prevent unauthorized data corruption, compromise, and loss. Read article: https://csf.tools/reference/cloud-controls-matrix/version-3-0-1/dcs/dcs-08/
DCS-09: User Access
Physical access to information assets and functions by users and support personnel shall be restricted. Read article: https://csf.tools/reference/cloud-controls-matrix/version-3-0-1/dcs/dcs-09/
- 18. Encryption and Key Management
What is encryption key management?
Encryption key management is the administration of policies and procedures for protecting, storing, organizing, and distributing encryption keys. Encryption keys (also called cryptographic keys) are the strings of bits generated to encode and decode data and voice transmissions.
For a video on encryption: https://study.com/academy/lesson/types-of-encryption-keys.html
10 Best Practices for Encryption Key Management and Data Security
1. Encryption Key Algorithm and Size:
A number of factors come into play here, namely usage, lifespan, performance, and, most importantly, security. The sensitivity of the data should determine the length of the key, be it 128/256-bit key sizes for AES or 2048/4096 bits for RSA. At the same time, very long keys can result in performance issues. Agility is another very important attribute, since it allows algorithms and keys to be changed over time. Algorithms tend to weaken over time, so it is important to be able to change encryption keys periodically. Support for multiple algorithm standards can also be considered, as this may be required in the case of acquisitions or mergers, when other organizations use different encryption standards. Furthermore, the use of asymmetric keys for data in motion and symmetric keys for data at rest is advisable.
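For a concrete (and non-prescriptive) illustration of the key sizes mentioned above, the sketch below generates a 256-bit AES key for data at rest and a 2048-bit RSA key pair for data in motion using the third-party cryptography package; the library choice is an assumption, and any vetted cryptographic library would serve.
```python
# Minimal sketch: generate keys at the sizes discussed above using the
# third-party "cryptography" package (pip install cryptography).
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Symmetric key for data at rest: 256-bit AES, suitable for AES-GCM.
aes_key = AESGCM.generate_key(bit_length=256)

# Asymmetric key pair for data in motion: 2048-bit RSA (4096 trades speed for margin).
rsa_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
rsa_public_key = rsa_private_key.public_key()

print(f"AES key length: {len(aes_key) * 8} bits")
print(f"RSA modulus size: {rsa_private_key.key_size} bits")
```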
2. Centralization of Key Management System:
Organizations tend to use several hundred or even thousands of encryption keys. Proper and secure storage of these keys can become a massive problem, especially when immediate access to them is required. Hence the need for a centralized key management system.
The best practice for an organization would be to have an in-house key management service. However, oftentimes this may not be possible, and third-party services may be adopted for a more sophisticated approach. Such keys are usually stored away from the encrypted data. This serves as an added advantage in the case of a data breach, as the encryption keys are unlikely to be compromised.
The centralized process is also beneficial in terms of processing, as the encryption-decryption process happens locally, but the storage, rotation, generation, etc. happens away from the actual location of the data.
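A minimal sketch of that split between local encryption and central key handling follows. The CentralKeyService class is a stand-in for whatever KMS or HSM-backed service an organization actually uses; here it simply keeps keys in memory so the example stays self-contained.
```python
# Minimal sketch of local encryption with centrally managed keys.
# CentralKeyService is a stand-in for a real KMS/HSM-backed service.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

class CentralKeyService:
    """Pretend central key manager: generates, stores, and serves keys by ID."""
    def __init__(self):
        self._keys = {}

    def create_key(self, key_id: str) -> None:
        self._keys[key_id] = AESGCM.generate_key(bit_length=256)

    def get_key(self, key_id: str) -> bytes:
        return self._keys[key_id]   # a real KMS would authenticate and audit this call

def encrypt_locally(kms: CentralKeyService, key_id: str, plaintext: bytes) -> tuple[bytes, bytes]:
    nonce = os.urandom(12)          # unique nonce per encryption
    ciphertext = AESGCM(kms.get_key(key_id)).encrypt(nonce, plaintext, None)
    return nonce, ciphertext

def decrypt_locally(kms: CentralKeyService, key_id: str, nonce: bytes, ciphertext: bytes) -> bytes:
    return AESGCM(kms.get_key(key_id)).decrypt(nonce, ciphertext, None)

kms = CentralKeyService()
kms.create_key("orders-db")
nonce, blob = encrypt_locally(kms, "orders-db", b"cardholder record")
print(decrypt_locally(kms, "orders-db", nonce, blob))
```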
3. Secure Storage:
Considering that encryption keys are often a target for cybercriminals and attackers, it is a good option to have a hardware security module (HSM) in place for their storage. The use of an HSM assures the organization of strong physical as well as logical protection.
An organization must have a plan for physical security as well:
- Limiting physical access control to critical systems.
- Maintaining fire safety measures.
- Ensuring structural integrity, in the case of natural hazards.
- Protecting from utilities (such as heating or air-conditioning systems) that could cause malfunctions.
4. Using Automation:
Manual key management is not only time-consuming but also error-prone, especially given the scale of key usage at large organizations. A smart way to manage this is to make use of automation, for example to generate, rotate, and renew keys after set periods of time.
5. Access and Audit Logs:
Encryption keys must only be accessed by those who require them. This can be enforced in the centralized key management process so that only authorized users are granted access. It is also imperative not to give a single user sole access to a key, as this creates a problem if that user loses their credentials or the key material is somehow corrupted.
Audit logs are another vital part of encryption key management. Logs must detail the history of each key: its creation, deletion, and usage cycle. All operations pertaining to a key must be recorded, including the activity performed, what accessed the key, and when it was accessed. This practice serves two needs: compliance, and investigation if a key were to be compromised. Analyzing and reporting on these logs at regular intervals is also beneficial.
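The sketch below shows one way such an audit trail could be recorded as structured log entries: who acted, on which key, doing what, and when. The field names and logger setup are illustrative assumptions, not a prescribed format.
```python
# Minimal sketch of an audit trail for key operations: every access records
# who acted, which key, what operation, and when. Field names are illustrative.
import json
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO, format="%(message)s")
audit_log = logging.getLogger("key-audit")

def record_key_event(actor: str, key_id: str, operation: str) -> None:
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "actor": actor,            # user or service that accessed the key
        "key_id": key_id,
        "operation": operation,    # e.g. create, use, rotate, delete
    }
    audit_log.info(json.dumps(entry))

record_key_event("svc-billing", "orders-db", "use")
record_key_event("alice@example.com", "orders-db", "rotate")
```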
6. Backup Capabilities:
The loss of an encryption key essentially means that the data it protects is rendered unrecoverable, hence the need for a robust key backup facility. This ensures the availability of keys whenever they are required. Note also that backed-up keys should themselves be encrypted using proper encryption standards to ensure their protection.
7. Encryption Key Life Cycle Management:
Each encryption key has a life span. The working life cycle of the key has to be managed properly by following the steps mentioned below (a minimal sketch follows the list).
- Generation of key: The generated key should have a very high degree of randomness. Using a trusted, NIST-certified random number generator is always recommended.
- Rotation of key: A cumbersome issue for organizations arises at the expiration or change of encryption keys, when it becomes necessary to decrypt and then re-encrypt all the data. However, the use of a key profile for every encrypted file or data set can help. The key profile allows one to identify the encryption resources needed to decrypt the data. On expiration of a key, the key profile takes care of the encryption process using the new key; for existing data, it identifies the actual key.
- Retirement of key: A key should be permanently deleted when it is no longer in use. This reduces the accumulation of unused keys and protects the system.
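A minimal sketch of these life-cycle stages (generation, rotation, retirement) is shown below. It is a simplified illustration under the assumption of in-memory key material; a real key manager would also persist keys securely and re-encrypt affected data on rotation.
```python
# Minimal sketch of the key life cycle described above: generate, rotate, retire.
import secrets
from enum import Enum

class KeyState(Enum):
    ACTIVE = "active"
    RETIRED = "retired"

class ManagedKey:
    def __init__(self, key_id: str):
        self.key_id = key_id
        self.state = KeyState.ACTIVE
        self.material = secrets.token_bytes(32)   # CSPRNG-backed generation
        self.version = 1

    def rotate(self) -> None:
        """Replace the key material; affected data would be re-encrypted via a key profile."""
        self.material = secrets.token_bytes(32)
        self.version += 1

    def retire(self) -> None:
        """Permanently remove the material once the key is no longer in use."""
        self.material = b""
        self.state = KeyState.RETIRED

key = ManagedKey("orders-db")
key.rotate()
key.retire()
print(key.key_id, key.state.value, "version", key.version)
```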
8. Third-party Integration:
Organizations will inevitably use external devices, spread across the network to perform their functions. However, such devices tend not to interact directly with databases. So, to enable their functionality, the encryption methods chosen should be compatible with the third-party applications they interact with.
The biggest risks with third-party API integration are SQL injection, cross-site scripting, denial of service, spoofing, malicious code, and many more, so API security can be a big problem. In this situation, API management platforms can provide some relief. These platforms provide monitoring, analytics, alerting, and life-cycle management features for APIs to ensure the safety and security of your business. Some of the popular API management tools are Google Apigee, IBM API Connect, Amazon API Gateway, and Azure API Management.
9. The Principle of Least Privilege:
The principle of least privilege states that organizations must grant administrative rights solely on the basis of user roles. This limits the assignment of administrative rights to applications and, in the process, reduces exposure to internal and external threats. By limiting access and following a role-based access control approach, one can limit the potential damage. The principle of least privilege also applies to all connected software applications, systems, devices, and other non-human tools. To implement the principle effectively, a centralized control and management system is essential. A centralized privilege control system reduces "privilege creep" and ensures a minimum level of access for human and non-human entities.
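The sketch below illustrates least privilege through a tiny role-based access control check: each role is granted only the permissions it needs, and anything not explicitly granted is denied by default. The roles and permission strings are invented examples.
```python
# Minimal sketch of role-based access control with least privilege: each role
# grants only what it needs, and anything not granted is denied by default.
ROLE_PERMISSIONS = {
    "auditor":   {"keys:list", "logs:read"},
    "operator":  {"keys:list", "keys:use"},
    "key-admin": {"keys:list", "keys:create", "keys:rotate"},
}

def is_allowed(role: str, permission: str) -> bool:
    """Deny by default: unknown roles or ungranted permissions return False."""
    return permission in ROLE_PERMISSIONS.get(role, set())

assert is_allowed("auditor", "logs:read")
assert not is_allowed("auditor", "keys:rotate")   # least privilege: auditors cannot rotate keys
assert not is_allowed("intern", "keys:use")       # unknown role gets nothing
print("least-privilege checks passed")
```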
10. Termination of Keys:
The ability to revoke and terminate keys is essential for any organization. This is largely applicable when data is compromised: by revoking the key, unauthorized users are denied the ability to use it to access sensitive data. Read article: https://www.techopedia.com/2/30767/security/10-best-practices-for-encryption-key-management-and-data-securit
- 19. Governance and Risk Management
GRC is a system intended to correct the "silo mentality" that leads departments within an organization to hoard information and resources. GRC practices are integrated into every department for greater efficiency and to reduce risks, costs, and duplication of effort.
- 20. Human Resources
- 21. Identity and Access Management
- 22. Interoperability and Portability
- 23. Mobile Security
- 24. Security Incident Management, Cloud Forensics, and E-Discovery
- 25. Supply Chain Management, Accountability and Transparency
- 26. Threat and Vulnerability Management
- 27. Performing SOC 2 Compliance Audit on Microsoft Azure Cloud
- 28. Performing SOC 2 Compliance Audit on Google Cloud
- 29. SOC (System and Organization Controls) Compliance Overview
SOC (System and Organization Controls) compliance refers to a type of certification in which a service organization has completed a third-party audit demonstrating that it has certain controls in place. Generally, this refers to SOC 1, SOC 2, or SOC 3 compliance; however, SOC for Cybersecurity and SOC for Supply Chain certifications also exist.
- 30. Performing SOC 2 Compliance Audit on AWS Cloud
- 31. Title I: Public Company Accounting Oversight Board (PCAOB)
- 32. Title II: Auditor Independence
Title II of the Sarbanes-Oxley Act addresses auditor independence. It prohibits the registered external auditor of a public company from providing certain non-audit services to that public company audit client.
- 33. Title III: Corporate Responsibility
- 34. Title IV: Enhanced Financial Disclosures
- 35. Title V: Analyst Conflicts of Interest
- 36. Title VI: Commission Resources and Authority
- 37. Title VII: Studies and Reports
- 38. Title VIII: Corporate and Criminal Fraud Accountability
- 39. Title IX: White Collar Crime Penalty Enhancement
- 40. Title X: Corporate Tax Returns
- 41. Title XI: Corporate Fraud Accountability
- 42. The Sarbanes-Oxley Act
- 43. Inventory and Control of Hardware Assets
- 44. Inventory and Control of Software Assets
- 45. Data Protection
- 46. Secure Configuration of Enterprise Assets and Software
- 47. Account Management
- 48. Access Control Management
- 49. Continuous Vulnerability Management
- 50. Audit Log Management
- 51. Email and Web Browser Protections
- 52. Malware Defenses
- 53. Data Recovery Capabilities
- 54. Limitation and Control of Network Ports, Protocols, and Services
- 55. Network Monitoring and Defense
- 56. Implement a Security Awareness and Training Program
- 57. Wireless Access Control
- 58. Application Software Security
- 59. Incident Response and Management
- 60. Penetration Tests and Red Team Exercises