Wednesday, September 20, 2023

PCI DSS Vulnerability Scanning and Penetration Testing Hygiene

Image: DALL·E 3
The Payment Card Industry Data Security Standard (PCI DSS) is an essential benchmark for businesses that store, process, or transmit cardholder data. The introduction of PCI DSS v4.0 brings several clarifications and new layers of complexity. Today, we’ll take a look at internal and external vulnerability scans and penetration testing. 

Vulnerability Scans (11.3.1.3, 11.3.2.1)

One of the critical security controls that PCI DSS v4.0 emphasizes is the need for internal vulnerability scans. Companies must perform these scans after any 'significant change,' as defined by the standard. Significant changes include adding new hardware or software, or making considerable upgrades to existing infrastructure.

The scans aim to detect and resolve high-risk and critical vulnerabilities based on the entity’s vulnerability risk rankings. Following the scan, any detected vulnerabilities must be resolved, and rescans should be conducted as needed.

External vulnerability scans are equally important and follow the same triggering mechanism—significant changes in the environment. Here, the focus is on resolving vulnerabilities scored 4.0 or higher by the Common Vulnerability Scoring System (CVSS). As with internal scans, rescans are required as necessary to confirm that vulnerabilities have been adequately addressed.
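The CVSS threshold for external scans lends itself to simple triage logic. Here is a minimal, hypothetical sketch of filtering scan findings against the 4.0 cutoff; the finding names and the field layout (`name`, `cvss`) are illustrative and not taken from any real scanner's output format.

```python
# Hypothetical triage of external scan findings against the PCI DSS v4.0
# rule that vulnerabilities scored 4.0 or higher by CVSS must be resolved.
CVSS_THRESHOLD = 4.0

findings = [
    {"name": "TLS 1.0 enabled", "cvss": 6.5},      # illustrative data
    {"name": "Banner disclosure", "cvss": 2.6},
    {"name": "Outdated OpenSSL", "cvss": 7.8},
]

# Anything at or above the threshold must be resolved and rescanned.
must_resolve = [f for f in findings if f["cvss"] >= CVSS_THRESHOLD]

for f in sorted(must_resolve, key=lambda f: f["cvss"], reverse=True):
    print(f"{f['cvss']:>4}  {f['name']}")
```

In practice the entity's own risk rankings (Requirement 6.3.1) drive internal-scan prioritization, so a real triage script would layer those rankings on top of the raw CVSS score.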

Penetration Testing (11.4.2, 11.4.3)

Internal penetration testing is a more aggressive form of evaluation and should be conducted at least once every 12 months or after any significant change to the infrastructure or application. The testing can be carried out either by a qualified internal resource or a qualified external third party, provided that there is organizational independence between the tester and the entity being tested. Notably, the tester doesn't need to be a Qualified Security Assessor (QSA) or an Approved Scanning Vendor (ASV).

Much like its internal counterpart, external penetration testing is required annually or after any significant alterations to the system. The testing must also be conducted by qualified resources and should follow the entity’s defined methodology for testing.

What Constitutes a 'Significant Change'?

PCI DSS v4.0 is pretty broad in what it considers to be 'significant changes,' effectively encompassing any new hardware, software, or networking equipment added to the Cardholder Data Environment (CDE), as well as any replacement or major upgrades to existing hardware and software in the CDE. The list is intentionally broad and is aimed at ensuring that any changes, no matter how seemingly minor, are given adequate attention from a security perspective.

Summary of Requirements

The PCI DSS v4.0 requirements for vulnerability scans and penetration testing provide a structured approach for entities to keep their data environments secure. While these requirements might seem stringent, they offer a well-defined framework for securing cardholder data against the backdrop of ever-advancing cyber threats. Adhering to these requirements is not just about ticking compliance boxes; it’s about taking the necessary steps to protect your organization and its stakeholders.

  • Internal vulnerability scans:
    • Internal vulnerability scans are performed after any significant change as follows:
      • High-risk and critical vulnerabilities (per the entity’s vulnerability risk rankings defined at Requirement 6.3.1) are resolved.
      • Rescans are conducted as needed to confirm that vulnerabilities are resolved.
  • External vulnerability scans:
    • External vulnerability scans are performed after any significant change as follows:
      • Vulnerabilities that are scored 4.0 or higher by the CVSS are resolved.
      • Rescans are conducted as needed to confirm that vulnerabilities are resolved.
  • Internal penetration testing:
    • 11.4.2 Internal penetration testing is performed:
      • Per the entity’s defined methodology, at least once every 12 months
      • After any significant infrastructure or application upgrade or change
      • By a qualified internal resource or qualified external third party
      • Organizational independence of the tester exists (not required to be a QSA or ASV).
  • External penetration testing:
    • 11.4.3 External penetration testing is performed:
      • Per the entity’s defined methodology, at least once every 12 months
      • After any significant infrastructure or application upgrade or change
      • By a qualified internal resource or qualified external third party
      • Organizational independence of the tester exists (not required to be a QSA or ASV).
  • Significant changes are defined in PCI DSS to include (PCI-DSS-v4_0.pdf page 26):
    • New hardware, software, or networking equipment added to the CDE.
    • Any replacement or major upgrades of hardware and software in the CDE.
    • Any changes in the flow or storage of account data.
    • Any changes to the boundary of the CDE and/or to the scope of the PCI DSS assessment.
    • Any changes to the underlying supporting infrastructure of the CDE (including, but not limited to, changes to directory services, time servers, logging, and monitoring).
    • Any changes to third-party vendors/service providers (or services provided) that support the CDE or meet PCI DSS requirements on behalf of the entity.

Thursday, September 14, 2023

Using Maturity Levels and Qualitative Measurement for Visualizing Technology Implementations

Example Maturity Model

Check out this maturity model. What does it mean to measure the maturity of a technology implementation qualitatively? And how can maturity levels help visualize the current and future states to meet control requirements? 

Let's unpack these concepts and show how qualitative measures can enrich the maturity model process, particularly with the use of visualization techniques like bar or radar graphs.

Maturity models serve as diagnostic tools, usually consisting of a sequence of maturity levels that provide a path for improvements. These models are vital for benchmarking and identifying the best practices that need to be implemented for organizational success. In technology implementation, they can gauge how effectively an organization is meeting its control requirements—be it in data security, governance, or software development lifecycle.

The Qualitative Dimension

While numbers and metrics provide a certain level of clarity, they often lack context. Qualitative measurements step in here to provide nuanced insights into otherwise cold data. Through expert interviews, case studies, and scenario analyses, qualitative assessments can address 'how' and 'why' questions that numbers cannot.

One of the powerful ways to present the qualitative aspect of maturity models is through visualization. A bar or radar graph can be used to overlay the current and future states of an organization's maturity levels.

Current State

Imagine a bar graph where the X-axis represents different control requirements like "Data Encryption," "User Access Management," and "Compliance Monitoring," and the Y-axis represents maturity levels from 0 (Non-existent) to 5 (Optimized). The current state can be represented by blue bars reaching up to the current maturity level for each control requirement.

This visualization allows stakeholders to immediately grasp which areas are well-managed and which need improvement. It's not just about the height of the bar but the story behind each bar—which can be enriched by qualitative inputs like expert opinions, employee feedback, and process reviews.

Future State

In the same graph, future state scenarios can be represented by a different color—say, green bars—overlaying or adjacent to the current state bars. These future state bars are not arbitrary but are informed by qualitative measures like scenario planning, risk assessments, and strategic discussions.

The juxtaposition of current and future states in one graph offers a compelling narrative. It shows where the organization aims to be, providing a clear vision for everyone involved.
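The current-versus-future bar graph described above can be sketched in a few lines of matplotlib. The control names and maturity scores below are hypothetical placeholders; substitute your own assessment data.

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; swap for an interactive backend as needed
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical control requirements and qualitative ratings, 0 (non-existent)
# to 5 (optimized). Replace with your organization's assessed values.
controls = ["Data Encryption", "User Access Mgmt", "Compliance Monitoring"]
current = [2, 3, 1]
target = [4, 4, 3]

x = np.arange(len(controls))
width = 0.35

fig, ax = plt.subplots(figsize=(7, 4))
# Blue bars for the current state, green for the future state, side by side.
ax.bar(x - width / 2, current, width, label="Current state", color="steelblue")
ax.bar(x + width / 2, target, width, label="Future state", color="seagreen")
ax.set_xticks(x)
ax.set_xticklabels(controls)
ax.set_ylim(0, 5)
ax.set_ylabel("Maturity level (0-5)")
ax.legend()
fig.tight_layout()
fig.savefig("maturity.png")
```

The side-by-side layout makes the gap between current and target maturity immediately visible per control, which is exactly the narrative the qualitative inputs are meant to support.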

Utility of Qualitative Maturity Models

Maturity levels, when fleshed out with qualitative measurements, offer more than a snapshot of the present; they provide a roadmap for the future. Visual representations like bar or radar graphs give life to these qualitative insights, making them easy to understand and act upon.

So, the next time you consider assessing your organization’s technology maturity, think beyond numbers. Look at the stories those numbers can tell, and use qualitative measures to fill in the gaps. And don't just keep these insights in spreadsheets and reports—visualize them. 

Combine qualitative measures with visualization techniques and build a more meaningful, actionable, and comprehensive roadmap. Aim for a balanced, nuanced, and visually engaging approach to understand the current state and opportunity for improvement.

Example Output 

Here's a quick assessment of an organization's adherence to the NIST Privacy Framework. The beauty of this method, by the way, is that it's fast and easy to create this chart using qualitative measures. Search for the Privacy Framework spreadsheet under the downloads section if you want a copy of this.
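For an assessment like this, a radar graph works well because each axis can carry one framework function. Below is a minimal sketch using the five NIST Privacy Framework core functions; the scores are hypothetical placeholders, not the values from the assessment shown.

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt
import numpy as np

# NIST Privacy Framework core functions; scores here are hypothetical.
functions = ["Identify-P", "Govern-P", "Control-P", "Communicate-P", "Protect-P"]
scores = [3, 2, 2, 1, 3]

# One angle per function, then repeat the first point to close the polygon.
angles = np.linspace(0, 2 * np.pi, len(functions), endpoint=False).tolist()
angles += angles[:1]
values = scores + scores[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, values, color="steelblue")
ax.fill(angles, values, color="steelblue", alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(functions)
ax.set_ylim(0, 5)
fig.savefig("privacy_radar.png")
```

A second, differently colored polygon for the future state can be overlaid the same way, mirroring the current/future bar-graph technique from the previous post.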

Tuesday, September 5, 2023

NIST.SP.800-66r2.ipd Worksheet - HIPAA Indexed on NIST

HIPAA to NIST and NIST to HIPAA indexed worksheets in a single spreadsheet based on the Initial Public Draft (ipd) are posted on the downloads website. Look for the workbook 2022 HIPAA Crosswalk SP 800-66 ipd Table 12 on:

Friday, September 1, 2023

Numbers and Narratives: The Power of Qualitative and Quantitative Feedback

While technological prowess is crucial for cybersecurity, human factors are often the linchpin that determines an organization's susceptibility to cyber threats. As we navigate this ever-evolving landscape, the role of learning programs in enhancing cybersecurity awareness cannot be overstated. But how do we measure the effectiveness of these initiatives? The answer lies in a meticulous blend of quantitative and qualitative feedback.

The Quantitative Dimension

In the realm of cybersecurity learning programs, quantitative data acts as the backbone that offers empirical evidence of program effectiveness. This data, collected through various channels—from real-world cybersecurity incidents and metrics on employee reporting to targeted simulations and longitudinal studies—provides a measurable barometer of your organization's cybersecurity posture. It can also help tailor training materials to specific departments, evaluate ROI, and keep content up to date. This section will detail the key types of quantitative data that you should focus on, offering a robust framework for continuously enhancing your cybersecurity initiatives through actionable metrics.

  1. Cybersecurity Incident Data - Utilize real-world data on past incidents to simulate realistic scenarios in your training programs. For example, if there has been a rise in phishing attacks, including similar scenarios in your learning modules can help prepare the workforce better.
  2. Metrics on Incident Reporting - Review how many employees report potential cybersecurity events pre- and post-training. An increase in reports post-training could indicate higher awareness.
  3. Simulated Attack Responses - Phishing simulations can provide invaluable data. If 90% of your employees ignore a phishing email post-training compared to 50% pre-training, you know you’re on the right track.
  4. Longitudinal Data - Track the program's impact over time to identify trends. Maybe the initial spike in awareness drops after six months, indicating a need for refresher courses.
  5. Employee Testing Data - Compare employee cybersecurity test scores before, immediately after, and three months post-training to assess knowledge retention.
  6. Performance by Department - Do tech departments outperform sales in cybersecurity awareness? This could guide department-specific training.
  7. Training Attendance and Completion Rates - Low attendance or completion could indicate that the training is too cumbersome or not engaging enough.
  8. Quantitative Surveys and Costs - Use closed-ended surveys for quick, quantifiable feedback. Also, calculate the per-participant cost of developing and delivering the program for ROI assessment.
  9. Privacy and Technical Metrics - Track the frequency and type of privacy or cybersecurity events to identify the need for role-based training. Changes following technical training—like a reduction in accounts with privileged access—can also be invaluable metrics.
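The pre/post comparison in item 3 above reduces to simple arithmetic. A quick sketch, with hypothetical numbers matching the 50%-to-90% example:

```python
# Hypothetical phishing-simulation results before and after training.
pre_ignored, pre_total = 50, 100    # employees who ignored the phish pre-training
post_ignored, post_total = 90, 100  # and post-training

pre_rate = pre_ignored / pre_total
post_rate = post_ignored / post_total
# Relative improvement over the pre-training baseline.
improvement = (post_rate - pre_rate) / pre_rate

print(f"Ignore rate: {pre_rate:.0%} -> {post_rate:.0%} "
      f"({improvement:+.0%} relative change)")
# prints: Ignore rate: 50% -> 90% (+80% relative change)
```

Tracking the same calculation per department (item 6) or across repeated simulations over time (item 4) turns these one-off numbers into the longitudinal trend lines the program needs.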

The Qualitative Dimension

While quantitative metrics provide the hard facts, it's the qualitative data that enriches our understanding by adding context, nuance, and depth to these numbers. Qualitative feedback captures the human elements that are often overlooked in cybersecurity initiatives. From capturing employees' responses about the program's delivery and content to conducting focus groups for in-depth insights, qualitative data allows us to gauge the intangibles that make or break a learning program. In this section, we will delve into various types of qualitative feedback, including presenter evaluations, open-ended surveys, and even observations from the training sessions, to provide a more holistic assessment of your cybersecurity education efforts.

  1. Presenter and Program Feedback - Encourage employees to share feedback on trainers and program content to make real-time improvements.
  2. Open-Ended Surveys and Reports - Use these to gather nuanced opinions. Maybe the training material is excellent, but the pace is too fast?
  3. Focus Groups and Observations - Conduct these with a cross-section of employees to get richer insights into the learning experience, identifying areas for improvement.
  4. Suggestion Box - A suggestion box allows employees to provide candid feedback and innovative ideas for program improvement.

A Marriage of Metrics and Mindsets 

Combining quantitative data with qualitative insights will not only paint a comprehensive picture of your program's effectiveness but will also guide data-informed decisions for future improvements. For instance, if your quantitative data indicates high knowledge retention but qualitative feedback points to low engagement, you may need to inject more interactive elements into your program. Because when it comes to cybersecurity, an empowered workforce is your best line of defense. 

And if you haven't already, check out NIST Special Publication 800-50 and look for the upcoming Rev. 1. This is a comprehensive guideline that serves as an invaluable resource for information security education, training, and awareness. Thank you to NIST and the industry authors and contributors for your tireless work in advancing the field and providing a foundational resource for cybersecurity professionals everywhere.