Premium Practice Questions
Question 1 of 19
1. Question
A US-based financial services firm regulated by the Securities and Exchange Commission (SEC) is migrating its internal financial reporting systems to a multi-cloud environment. To comply with the Sarbanes-Oxley Act (SOX), the firm must ensure that access to sensitive financial data is strictly controlled and fully auditable across all platforms. The IT security team is evaluating methods to manage user identities and permissions without creating administrative silos or increasing the risk of credential theft. Which strategy best addresses these requirements while maintaining a strong security posture?
Correct: Implementing a centralized Identity Provider (IdP) with SAML federation provides a single point of authentication and policy enforcement. This approach ensures that the principle of least privilege is applied consistently across the entire infrastructure. It also generates a unified audit trail, which is essential for demonstrating compliance with US federal regulations like the Sarbanes-Oxley Act (SOX) and SEC requirements by providing a single source of truth for access logs.
Incorrect: Establishing independent identity silos creates significant administrative overhead and increases the likelihood of inconsistent access policies or orphaned accounts. The strategy of replicating credentials to multiple native identity stores increases the attack surface by storing sensitive password hashes in multiple locations. Focusing only on shared administrative credentials directly violates the fundamental security principle of individual accountability and makes it impossible to perform accurate forensic analysis or meet regulatory auditing standards.
Takeaway: Centralized identity federation provides consistent policy enforcement and unified auditing necessary for meeting US regulatory compliance in multi-cloud architectures.
Question 2 of 19
2. Question
A United States financial institution is updating its access control architecture to better align with NIST SP 800-162 standards and ensure compliance with the Sarbanes-Oxley Act (SOX). The security team needs a model that can evaluate real-time variables, such as the user’s current risk score, the sensitivity of the financial record, and whether the request originates from a known corporate network. Which authorization model is most effective for meeting these dynamic, multi-factor requirements?
Correct: Attribute-Based Access Control (ABAC) is the most flexible model because it uses a policy-based approach that evaluates attributes of the subject, resource, and environment. In a US regulatory context, this allows organizations to enforce complex logic required by NIST frameworks, such as restricting access based on geographic location or time-of-day, which is essential for modern compliance and risk management.
Incorrect: Relying on Role-Based Access Control often leads to role explosion when trying to account for environmental variables like location or time, as roles are typically static and tied to job functions. The strategy of using Discretionary Access Control is generally insufficient for federal compliance because it delegates security decisions to individual resource owners, lacking the centralized oversight needed for SOX audits. Choosing Mandatory Access Control provides high security through sensitivity labels but is often too rigid for commercial financial environments, as it does not natively support the evaluation of environmental attributes like network origin or user risk scores.
Takeaway: ABAC enables dynamic, context-aware security decisions by evaluating subject, resource, and environmental attributes against centralized policies.
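Example (illustrative): the sketch below shows, in Python, how an ABAC decision point might evaluate subject, resource, and environmental attributes against a single centralized policy. The attribute names and threshold values are assumptions made for illustration, not drawn from NIST SP 800-162.

```python
# Minimal ABAC sketch: the decision is a predicate over subject, resource,
# and environment attributes gathered at request time.

def abac_decision(subject: dict, resource: dict, environment: dict) -> bool:
    """Permit only if every attribute condition in the policy holds."""
    return (
        subject.get("department") == "finance"                         # subject attribute
        and subject.get("risk_score", 100) < 40                        # dynamic user risk score
        and resource.get("sensitivity") in {"internal", "restricted"}  # resource attribute
        and environment.get("network") == "corporate"                  # request origin
        and 8 <= environment.get("hour", -1) <= 18                     # time-of-day condition
    )

request = {
    "subject": {"department": "finance", "risk_score": 22},
    "resource": {"sensitivity": "restricted"},
    "environment": {"network": "corporate", "hour": 10},
}
print("PERMIT" if abac_decision(**request) else "DENY")
```

Because the policy is centralized and attribute-driven, tightening a rule such as the risk-score threshold does not require creating or reassigning roles.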
Question 3 of 19
3. Question
A United States-based financial services firm is preparing for a tabletop exercise to evaluate its response to a potential ransomware attack. The Chief Information Security Officer (CISO) wants to ensure the exercise aligns with NIST SP 800-61 guidelines and meets SEC expectations for operational resilience. Which of the following should be the primary focus of this exercise to best improve the organization’s incident response capability?
Correct: Tabletop exercises are discussion-based sessions where team members meet in an informal, classroom-style setting to discuss their roles during an emergency and their responses to a particular situation. The primary goal is to validate the Incident Response Plan (IRP) by testing the communication flow and decision-making processes across different departments. In a United States regulatory environment, the SEC emphasizes the importance of coordinated management and timely disclosure of material incidents, making the interaction between technical, legal, and executive teams the most critical element to simulate.
Incorrect: Verifying EDR blocking capabilities is a technical control validation that is typically handled through automated testing or red teaming rather than a discussion-based tabletop. Testing physical security controls focuses on environmental security and does not address the procedural or communicative aspects of responding to a cyber incident. Measuring database restoration times is a specific metric for Disaster Recovery functional testing and does not evaluate the strategic decision-making framework required during an active security breach.
Takeaway: Tabletop exercises primarily validate the procedural coordination and communication strategies required to manage complex security incidents effectively.
Question 4 of 19
4. Question
A Chief Information Security Officer at a publicly traded financial institution in the United States is refining the internal governance framework to satisfy SEC cybersecurity disclosure requirements. The CISO has already established a high-level security policy that mandates the protection of sensitive financial data. The next step involves creating a document that specifies the mandatory, uniform technical requirements and specific technologies that all departments must use to implement the policy. Which document type is most appropriate for this purpose?
Correct: Standards are mandatory rules that define specific technologies, protocols, or configurations to ensure consistency across the enterprise. They serve as the bridge between high-level policy goals and the technical implementation, ensuring that all business units adhere to the same security requirements as expected by United States regulatory bodies.
Incorrect: Relying solely on guidelines is insufficient because they offer discretionary recommendations rather than mandatory requirements, which fails to ensure the necessary uniformity for regulatory compliance. The strategy of using procedures is also incorrect as they focus on chronological, step-by-step instructions for completing a task rather than defining the technical specifications themselves. Opting for security frameworks is incorrect because frameworks provide a broad structure or set of best practices for managing security, rather than the specific, mandatory technical requirements for individual systems.
Takeaway: Standards establish mandatory, uniform technical requirements to ensure consistent policy implementation across an organization.
Question 5 of 19
5. Question
A compliance officer at a US-based broker-dealer is reviewing the firm’s encryption policy after a NIST update regarding legacy cryptographic standards. The firm currently uses Triple DES (3DES) to protect sensitive transaction logs stored in a cloud environment. To maintain alignment with federal data protection standards and ensure long-term security against cryptanalytic advances, which symmetric algorithm should the firm transition to?
Correct: AES is the FIPS-validated symmetric algorithm recommended by NIST for protecting sensitive information. It offers significantly better performance and security than legacy ciphers like 3DES and is the standard for US financial regulatory compliance.
Incorrect: Using the original Data Encryption Standard is inappropriate because its 56-bit key is easily cracked by modern hardware. Selecting an asymmetric method like RSA for bulk data encryption is technically incorrect because asymmetric algorithms are too slow for large datasets. The strategy of using Blowfish is flawed because its 64-bit block size makes it susceptible to birthday-bound collision attacks, such as Sweet32, once large volumes of data are encrypted under a single key.
Takeaway: AES is the current NIST-approved symmetric standard for protecting sensitive data in US financial and federal sectors.
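Example (illustrative): a minimal sketch of encrypting a stored transaction-log record with AES-256-GCM using the Python cryptography package. It assumes keys would actually be generated and held in a KMS or HSM rather than in application memory, and the log line itself is invented.

```python
# pip install cryptography
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)        # 256-bit AES key (illustrative only)
aesgcm = AESGCM(key)

nonce = os.urandom(12)                           # unique 96-bit nonce per record
record = b"2024-05-01T14:03:22Z,ACCT-1001,BUY,500,XYZ"
aad = b"transaction-log-v1"                      # authenticated, but not encrypted

ciphertext = aesgcm.encrypt(nonce, record, aad)  # confidentiality plus integrity tag
assert aesgcm.decrypt(nonce, ciphertext, aad) == record
```

GCM mode also produces an authentication tag, so tampering with an archived log entry is detected at decryption time.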
Question 6 of 19
6. Question
A financial services corporation based in New York is updating its business continuity framework to comply with federal interagency guidelines on operational resilience. The project lead is currently overseeing the Business Impact Analysis (BIA) for the firm’s core clearing and settlement services. Which of the following actions is most essential during this phase to establish the foundation for subsequent recovery time requirements?
Correct: Determining the Maximum Tolerable Downtime (MTD) is the most essential step because it represents the total time a business function can be down before the organization faces significant, irreparable damage. This value is a business-level decision that provides the upper limit for the Recovery Time Objective (RTO), ensuring that recovery strategies are directly linked to the organization’s survival needs and regulatory obligations under US financial standards.
Incorrect: Simply performing a quantitative risk assessment to find the ALE focuses on the probability and cost of threats rather than the time-sensitive impact on business continuity. The strategy of establishing a secondary hot site is a recovery implementation step that should only occur after the BIA has defined the necessary recovery speeds. Focusing only on technical playbooks for restoration describes the creation of a Disaster Recovery Plan (DRP), which is a separate document that relies on the outputs of the BIA rather than being a part of the analysis itself.
Takeaway: The Business Impact Analysis must first establish the Maximum Tolerable Downtime to define the boundaries for all recovery objectives and strategies.
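Example (illustrative): because the MTD is the upper bound for the RTO, a simple consistency check can flag recovery targets the business cannot tolerate. The function names and hour values below are invented for illustration.

```python
# BIA sanity check: every proposed RTO must fit within the function's MTD.
functions = [
    {"name": "clearing",   "mtd_hours": 4,  "proposed_rto_hours": 2},
    {"name": "settlement", "mtd_hours": 8,  "proposed_rto_hours": 12},  # exceeds its MTD
    {"name": "reporting",  "mtd_hours": 72, "proposed_rto_hours": 24},
]

for f in functions:
    if f["proposed_rto_hours"] <= f["mtd_hours"]:
        verdict = "OK"
    else:
        verdict = "RTO exceeds MTD -- recovery strategy must be reworked"
    print(f'{f["name"]}: RTO {f["proposed_rto_hours"]}h vs MTD {f["mtd_hours"]}h -> {verdict}')
```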
Question 7 of 19
7. Question
A large healthcare organization based in the United States is preparing to share patient treatment data with a third-party research firm for longitudinal studies. To comply with the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule, the Chief Information Security Officer (CISO) must ensure that the data remains useful for research while minimizing the risk of re-identification. The research firm requires the clinical data but does not need to know the specific identities of the patients. Which approach provides the strongest technical protection for patient privacy in this scenario?
Correct: Tokenization provides a robust technical control by substituting sensitive identifiers with non-sensitive tokens. This process ensures that even if the research firm’s environment is compromised, the actual patient identities remain protected. Under HIPAA, this aligns with the principle of de-identification, which significantly reduces the regulatory burden and the risk of unauthorized disclosure of Protected Health Information (PHI).
Incorrect: Relying solely on a Business Associate Agreement is an administrative control that establishes legal liability but does not provide a technical barrier against data breaches. Simply encrypting the database during transfer only secures the data while it is in transit and does not protect the privacy of the records once they are decrypted for use by the researcher. The strategy of partial masking of only a few fields is often insufficient because the combination of remaining demographic data can still lead to successful re-identification through data linkage.
Takeaway: Technical de-identification through tokenization is superior to administrative agreements for protecting data privacy when sharing sensitive information with third parties.
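Example (illustrative): a minimal sketch of the de-identification step, assuming a simple in-memory token map. The field names are hypothetical, and in practice the mapping would live in a hardened token vault controlled by the covered entity, never shared with the research firm.

```python
import secrets

token_map: dict[str, str] = {}   # direct identifier -> token; retained only internally

def tokenize(identifier: str) -> str:
    """Replace a direct patient identifier with an opaque random token."""
    if identifier not in token_map:
        token_map[identifier] = "PAT-" + secrets.token_hex(8)
    return token_map[identifier]

record = {"mrn": "553-20-1177", "diagnosis": "E11.9", "hba1c": 7.2}
shared_with_researcher = {**record, "mrn": tokenize(record["mrn"])}
print(shared_with_researcher)   # clinical fields intact, identity replaced by a token
```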
Question 8 of 19
8. Question
A lead security engineer at a major U.S. financial services firm is reviewing the firewall rule base following a quarterly compliance audit. The audit identified several any-to-any rules and legacy configurations that no longer align with the organization’s risk appetite. The engineer must implement a process to ensure long-term policy integrity while maintaining operational availability for critical trading systems. Which approach provides the most effective balance of security and operational continuity when remediating these overly permissive firewall rules?
Correct: Implementing a formal change management process ensures that every modification is documented and vetted for risk. Shadowing analysis identifies rules that are never reached because of preceding rules. Impact assessments and rollback procedures protect the availability of critical U.S. financial infrastructure during the remediation process.
Incorrect: Immediately removing rules without testing risks significant operational downtime for critical business functions. Relying solely on automated scripts based on hit counts can inadvertently disable essential disaster recovery or annual processing paths. The strategy of reordering rules does not eliminate the inherent risk of overly permissive access; it merely changes the processing sequence.
Takeaway: Effective firewall management requires balancing the principle of least privilege with rigorous change control and impact analysis to maintain availability.
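Example (illustrative): shadowing analysis can be reduced to a containment check, where a rule is unreachable if an earlier rule already matches everything it would match. The rule base below is a deliberately simplified, hypothetical representation of a real configuration.

```python
import ipaddress

rules = [
    {"id": 10, "src": "*",           "dst": "10.0.5.0/24", "port": "*",   "action": "permit"},
    {"id": 20, "src": "10.1.0.7/32", "dst": "10.0.5.12",   "port": "443", "action": "permit"},
    {"id": 30, "src": "10.2.0.0/16", "dst": "10.9.9.9",    "port": "22",  "action": "deny"},
]

def covers(outer: str, inner: str) -> bool:
    """True if 'outer' matches everything 'inner' matches ('*' means any)."""
    if outer == "*":
        return True
    if inner == "*":
        return False
    try:
        return ipaddress.ip_network(inner, strict=False).subnet_of(
            ipaddress.ip_network(outer, strict=False))
    except ValueError:                        # non-address fields such as ports
        return outer == inner

for i, later in enumerate(rules):
    for earlier in rules[:i]:
        if all(covers(earlier[f], later[f]) for f in ("src", "dst", "port")):
            print(f'rule {later["id"]} is never reached: shadowed by rule {earlier["id"]}')
```

Candidates flagged this way still go through the change-management review before removal, since hit counts and containment alone do not prove a rule is obsolete.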
Question 9 of 19
9. Question
A United States-based financial institution is upgrading its internal Public Key Infrastructure (PKI) to comply with federal data protection standards and NIST guidelines. The Chief Information Security Officer (CISO) wants to ensure a clear separation of duties between the entity that validates user identities and the entity that issues digital certificates. In this architecture, which task is specifically assigned to the Registration Authority (RA)?
Correct: The Registration Authority (RA) serves as the administrative interface of a PKI. Its primary role is to verify the identity of the person or system requesting a certificate. This ensures that the Certificate Authority (CA) only issues certificates to legitimate, authenticated entities. This separation of duties is a fundamental security principle in NIST-aligned cryptographic frameworks used within the United States to prevent a single point of administrative failure.
Incorrect: The task of digitally signing and issuing certificates is the primary responsibility of the Certificate Authority, which maintains the root of trust. Managing the publication of revocation lists is a function of the Certificate Authority or a specialized directory service to ensure real-time status updates for relying parties. Centralized key escrow is a specific recovery mechanism that is distinct from the identity verification process performed during the registration phase and is often handled by a separate key management system.
Takeaway: The Registration Authority performs identity verification and offloads administrative burdens from the Certificate Authority to maintain a secure trust model.
Question 10 of 19
10. Question
A major US-based financial institution must provide credit card transaction records to an external data science firm for fraud pattern analysis. To comply with PCI DSS and SEC data protection expectations, the Chief Information Security Officer (CISO) requires a solution that removes sensitive data from the analytics environment. The solution must also allow the institution to link the analysis results back to specific customers internally. Which method should the CISO select to best meet these requirements?
Correct: Tokenization is the preferred method because it replaces sensitive data with non-sensitive placeholders that have no mathematical relationship to the original value. This approach significantly reduces the compliance scope under US standards like PCI DSS. By maintaining a secure, centralized vault, the institution retains the exclusive ability to map the tokens back to the original customer identities for internal use.
Incorrect: The strategy of using dynamic data masking to redact digits often fails to provide sufficient protection against sophisticated re-identification techniques or data reconstruction. Relying solely on cryptographic hashing is unsuitable because hashes are designed to be one-way, making it difficult for the institution to reverse the process efficiently. Choosing to share encryption keys with an external partner introduces significant key management risks and extends the security perimeter to an untrusted third-party environment.
Takeaway: Tokenization provides a non-mathematical substitute for sensitive data, effectively reducing regulatory scope while maintaining a secure path for authorized data reversal.
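Example (illustrative): the reversal path described above can be sketched as an internal token vault, where only the institution holds the token-to-PAN mapping. The class and field names are assumptions for illustration; a production vault would be a hardened, access-controlled service.

```python
import secrets

class TokenVault:
    """Internal-only mapping between primary account numbers (PANs) and tokens."""

    def __init__(self) -> None:
        self._pan_to_token: dict[str, str] = {}
        self._token_to_pan: dict[str, str] = {}

    def tokenize(self, pan: str) -> str:
        if pan not in self._pan_to_token:
            token = "TOK-" + secrets.token_hex(8)   # no mathematical link to the PAN
            self._pan_to_token[pan] = token
            self._token_to_pan[token] = pan
        return self._pan_to_token[pan]

    def detokenize(self, token: str) -> str:
        return self._token_to_pan[token]            # callable only inside the institution

vault = TokenVault()
outbound = [{"token": vault.tokenize("4111111111111111"), "amount": 125.40}]
# ...the external firm analyses 'outbound' and returns a flagged token...
flagged_token = outbound[0]["token"]
print("Internal follow-up for PAN:", vault.detokenize(flagged_token))
```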
Question 11 of 19
11. Question
During a periodic internal review at a large financial services firm regulated by the SEC, the Chief Audit Executive is refining the internal audit charter to align with Sarbanes-Oxley (SOX) compliance requirements. The firm aims to evaluate the effectiveness of its identity and access management (IAM) controls over the past fiscal year. To ensure the internal audit process remains robust and provides an unbiased assessment of the security posture, which organizational structure is most appropriate?
Correct: Reporting to the Board or an independent audit committee ensures that the audit function is not influenced by the management of the departments it is auditing. This independence is a cornerstone of professional auditing standards and is critical for SOX compliance in the United States.
Question 12 of 19
12. Question
A Chief Information Security Officer at a major US-based brokerage firm is enhancing the organization’s threat detection capabilities following a series of sophisticated phishing attempts. The firm must comply with SEC cybersecurity disclosure rules and seeks to improve its identification of lateral movement within the network. The security team proposes implementing a User and Entity Behavior Analytics (UEBA) system to monitor internal traffic. What is the most significant benefit of this approach for the firm’s security posture?
Correct: Behavioral analysis functions by establishing a baseline of normal behavior for users and entities. By monitoring for deviations from this baseline, the system can detect sophisticated threats, such as credential theft or insider abuse, that do not match known malware signatures. This proactive approach is essential for meeting the rigorous security standards and disclosure requirements expected by United States financial regulators like the SEC.
Incorrect: Relying on static rules is characteristic of traditional access control or basic monitoring rather than true behavioral analysis, which requires dynamic learning. The strategy of replacing all antivirus software with heuristics oversimplifies the defense-in-depth model and ignores the specific strengths of signature-based tools for known threats. Opting for a solution that focuses solely on encryption for Bank Secrecy Act compliance misidentifies the primary function of behavioral analytics, which is detection and monitoring rather than data-at-rest or data-in-transit protection.
Takeaway: Behavioral analysis detects unknown threats by identifying deviations from established normal patterns of user and system activity.
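Example (illustrative): the baselining idea can be shown with a toy anomaly check that learns each user's normal daily activity and flags sharp deviations. The single feature (file-share reads per day), the sample counts, and the z-score threshold are invented assumptions; commercial UEBA platforms model many signals per user and entity.

```python
import statistics

baseline = {          # hypothetical per-user daily file-share read counts (learning window)
    "alice": [12, 9, 14, 11, 10, 13, 12],
    "bob":   [3, 4, 2, 5, 3, 4, 3],
}
today = {"alice": 13, "bob": 47}      # bob's activity jumps far above his baseline

for user, history in baseline.items():
    mean = statistics.mean(history)
    spread = statistics.stdev(history) or 1.0
    z = (today[user] - mean) / spread
    if abs(z) > 3:                     # deviation threshold -- tuned per environment
        print(f"ALERT {user}: z={z:.1f}, deviates from baseline (possible credential misuse)")
    else:
        print(f"{user}: within normal range (z={z:.1f})")
```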
Question 13 of 19
13. Question
A Chief Information Security Officer at a major United States financial services firm is updating the organization’s disaster recovery strategy. To ensure compliance with federal regulations regarding operational resilience, the CISO needs to establish the maximum time a critical clearing system can be offline before causing significant harm. Which activity should the security team prioritize to determine these specific recovery parameters?
Correct: A Business Impact Analysis (BIA) is the formal process used to identify critical business functions and the impact of their disruption, which directly informs the creation of Recovery Time Objectives (RTO) and Recovery Point Objectives (RPO). By analyzing how downtime affects the organization quantitatively and qualitatively, the BIA provides the data necessary to set recovery priorities that align with regulatory expectations and business needs.
Incorrect: Executing a full-scale functional exercise is a testing method used to validate an existing plan rather than a tool for determining the initial recovery requirements. Reviewing Service Level Agreements ensures that vendors can meet established needs but does not help the organization define what those needs are based on business impact. The strategy of conducting threat modeling focuses on identifying security risks and vulnerabilities rather than establishing the time-based recovery goals required for business continuity.
Takeaway: The Business Impact Analysis is the primary tool for defining recovery objectives by assessing the impact of downtime on the organization.
Question 14 of 19
14. Question
The Chief Information Security Officer (CISO) of a publicly traded financial institution in the United States is restructuring the organization’s security program. To comply with Sarbanes-Oxley (SOX) Act requirements and recent SEC cybersecurity disclosure rules, the CISO needs to implement a formal security governance framework. Which of the following actions most effectively demonstrates the application of security governance at the institutional level?
Correct: Security governance is fundamentally about ensuring that security activities support business goals and are overseen by senior management. By establishing a cross-functional steering committee, the organization ensures that security is treated as a business risk rather than a technical problem, fulfilling the accountability requirements of United States regulations like SOX and SEC mandates.
Incorrect: Relying solely on technical solutions like SIEM deployment addresses operational capabilities but does not establish the strategic oversight or accountability required for governance. The strategy of using a framework only for technical hardening ignores the broader organizational and risk management components essential to a true governance structure. Opting for risk transfer through insurance is a valid risk treatment but does not constitute governance, as executive management remains legally and ethically responsible for the protection of sensitive data.
Takeaway: Security governance ensures that information security strategies are aligned with business objectives and overseen by accountable executive leadership.
Question 15 of 19
15. Question
A financial services firm in the United States is updating its supply chain risk management framework following a series of high-profile zero-day exploits in common open-source libraries. The Chief Information Security Officer (CISO) wants to ensure the security team can rapidly identify which internal systems are affected when new vulnerabilities are disclosed. Which approach to utilizing a Software Bill of Materials (SBOM) provides the most effective improvement to the organization’s incident response capabilities?
Correct: Machine-readable SBOMs, such as those in CycloneDX or SPDX formats, allow for automated and continuous monitoring of the software supply chain. By integrating these into vulnerability management tools, the organization can instantly map newly disclosed vulnerabilities to specific versions of nested components. This aligns with the transparency goals of U.S. Executive Order 14028 and NIST guidelines for enhancing software supply chain security.
Incorrect: Relying on static PDF documents creates a significant lag in visibility because these files are not easily searchable or updated as software versions change. Simply conducting manual keyword searches at the application level is insufficient because most modern vulnerabilities exist within deep, nested dependencies that are not visible in high-level application titles. The strategy of restricting development to a specific reference library is overly rigid and fails to address the risk of new vulnerabilities emerging in previously ‘safe’ components.
Takeaway: Machine-readable SBOMs enable automated, real-time identification of vulnerabilities within complex software supply chains and nested dependencies.
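Example (illustrative): the automation described above amounts to cross-referencing SBOM components against a vulnerability feed. The CycloneDX-style component list and the advisory record below are simplified, hand-written stand-ins for real pipeline output and feed data.

```python
import json

sbom = json.loads("""
{"components": [
  {"name": "log4j-core",       "version": "2.14.1",
   "purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.14.1"},
  {"name": "jackson-databind", "version": "2.15.2",
   "purl": "pkg:maven/com.fasterxml.jackson.core/jackson-databind@2.15.2"}
]}
""")

# Hypothetical advisory entry mapping a CVE to affected component versions.
advisories = [
    {"cve": "CVE-2021-44228", "component": "log4j-core", "affected": {"2.14.0", "2.14.1"}},
]

for adv in advisories:
    for comp in sbom["components"]:
        if comp["name"] == adv["component"] and comp["version"] in adv["affected"]:
            print(f'{adv["cve"]}: affected component in inventory -> {comp["purl"]}')
```

Because the SBOM is machine-readable, this check can run automatically each time a new advisory is published rather than waiting for a manual review.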
Question 16 of 19
16. Question
A lead investigator at a New York-based investment firm is responding to a security incident involving a compromised production server. The firm must comply with SEC record-keeping requirements and anticipates potential litigation in federal court. To maintain the integrity of the evidence while following the standard order of volatility, which of the following should the investigator perform first?
Correct: The order of volatility dictates that the most transient data should be collected first. Physical memory (RAM) is highly volatile and contains critical information such as running processes, network connections, and encryption keys that are lost once the system is powered down or if the data is overwritten by ongoing system activity. In the context of United States federal rules of evidence, capturing this data immediately ensures a more complete reconstruction of the event for legal proceedings.
Incorrect: Focusing on bit-by-bit imaging of storage volumes is a vital step but should follow the collection of more volatile data since disk data persists after power loss. The strategy of extracting logs directly from the live system risks altering file metadata and does not capture the transient state of the machine. Choosing to shut down the server immediately is generally avoided in modern forensics because it results in the total loss of volatile evidence and may activate hardware-level encryption or anti-forensic scripts that trigger upon power state changes.
Takeaway: Digital evidence collection must prioritize data based on its volatility to ensure the most transient information is preserved for legal proceedings under United States law or regulatory requirements.
Question 17 of 19
17. Question
A Chief Information Security Officer (CISO) at a mid-sized investment firm in New York is updating the organization’s risk management strategy to comply with SEC cybersecurity disclosure requirements. During a review of the internal software development lifecycle, the CISO identifies that the current threat modeling lacks a structured, risk-centric approach that aligns technical vulnerabilities with business impact. The CISO needs a methodology that integrates technical requirements with business objectives to provide a comprehensive view of the threat landscape. Which approach should the CISO implement to meet these requirements?
Correct: The Process for Attack Simulation and Threat Analysis (PASTA) is a seven-step, risk-centric threat modeling methodology. It is specifically designed to link technical vulnerabilities to business impacts and objectives. This alignment is critical for organizations that must report on material cybersecurity risks to regulatory bodies like the SEC, as it provides a strategic view of how technical threats affect the business’s bottom line.
Incorrect: Relying solely on the DREAD methodology is often discouraged because it focuses on subjective scoring and has been largely phased out by many organizations due to its lack of consistency and reproducibility. The strategy of using the STRIDE model is effective for identifying technical threat categories at the developer level but often fails to bridge the gap between technical findings and business-level risk management. Simply conducting a high-level qualitative risk assessment provides a broad overview of risk but lacks the granular, systematic technical analysis required for effective threat modeling of complex software systems.
Takeaway: PASTA provides a risk-centric threat modeling framework that aligns technical security findings with organizational business objectives and impact.
Question 18 of 19
18. Question
A Chief Information Security Officer at a United States financial institution is implementing the NIST Risk Management Framework to enhance compliance with SEC cybersecurity disclosure requirements. During the initial phases of the project, the team must establish the foundation for security control selection. Which activity specifically characterizes the Categorize step of the NIST RMF in this context?
Correct: The Categorize step in the NIST Risk Management Framework requires the organization to describe the information system and the information it processes, then determine the impact of a security breach. This impact analysis is based on federal standards such as FIPS 199, which evaluate the potential consequences for organizational operations, assets, and individuals to determine if the system is Low, Moderate, or High impact.
Incorrect: Identifying the specific set of controls is part of the Select step, which uses the categorization results to choose a baseline. Verifying control implementation through testing is the core objective of the Assess step. Establishing a plan for ongoing tracking and reporting is the primary function of the Monitor step. These activities occur later in the framework lifecycle and depend on the initial impact analysis.
Takeaway: Categorization in the NIST RMF establishes the impact-based foundation for all subsequent risk management and control selection activities.
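Example (illustrative): impact categorization is commonly applied as a high-water mark, where the system's overall level is the highest of its confidentiality, integrity, and availability impact levels. The function and the sample ratings below are a sketch for illustration only.

```python
LEVELS = {"low": 1, "moderate": 2, "high": 3}

def overall_impact(confidentiality: str, integrity: str, availability: str) -> str:
    """High-water mark: the overall category is the maximum of the three levels."""
    return max(confidentiality, integrity, availability, key=lambda lvl: LEVELS[lvl])

# Hypothetical clearing system: integrity loss would be the most damaging outcome.
print(overall_impact(confidentiality="moderate", integrity="high", availability="moderate"))
# -> "high", which drives the baseline chosen in the subsequent Select step
```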
Question 19 of 19
19. Question
A Chief Information Security Officer (CISO) at a publicly traded financial services firm in New York is preparing for an upcoming Sarbanes-Oxley Act (SOX) compliance audit. The internal audit team must verify that the Information Technology General Controls (ITGCs) effectively support the integrity of financial reporting. Which action should the CISO prioritize to ensure the audit provides a valid assessment of the organization’s compliance posture?
Correct: Mapping controls to a framework like COBIT is essential for SOX compliance because it aligns IT governance with the financial reporting requirements mandated by the Securities and Exchange Commission (SEC). This systematic approach ensures that the audit covers all necessary domains, such as change management and logical access, which directly impact the reliability of financial statements.