Friday, August 28, 2020

The State of Compliance – and the Cloud – in the Financial Services Industry

Ghostwritten after an interview with me as the source. Heck of a job by Dennis McCafferty.

If there is one unifying and indisputable fact about the hackers of the world, it’s this: They go where the money goes.

This makes the financial services industry one of their favorite targets. Once they compromise an organization in the sector, they can make thousands – or tens of thousands, or much, much more – by stealing credit card account numbers, committing wire fraud, emptying savings accounts, and so on. Or they can profit from ill-gotten intelligence, selling or buying stock shares based on non-public insider information contained “within the vault.”

That’s why financial services organizations should view regulatory compliance not as an onerous, “check the boxes” effort, but as one that can ensure the continued integrity of their operations and reputation.

In seeking to reduce the risk of debit and credit card loss and limit identity theft, the Payment Card Industry Data Security Standard (PCI DSS) sets 12 security requirements for all companies (including banks and other financial institutions) that process, store, transmit or accept credit card information, regardless of size or number of transactions. The requirements cover firewall configuration, cardholder data encryption, secure systems/applications development, physical access to data and regular testing. Unfortunately, global PCI DSS compliance among firms has fallen for two straight years and now stands at 36.7 percent, down from 55.4 percent in 2016, according to the 2019 Payment Security Report from Verizon. Fines for PCI DSS violations range from $5,000 to $100,000 per month.

It’s worth noting that, in prior annual Payment Security reports, Verizon has revealed that no breached organization was found to be fully compliant at the time of its breach; breached organizations demonstrated lower compliance with ten of the 12 key PCI DSS requirements. In addition, only 29 percent of companies remain fully compliant with PCI DSS less than a year after being validated. This speaks to a governance problem: these firms are failing to commit to the level of continuous monitoring needed to effectively manage both risk and compliance over time.

It’s clear that an ever-elevating threat landscape is creating formidable compliance and risk-management barriers: Based upon its analysis of nearly 41,700 security incidents and more than 2,010 breaches, the 2019 Verizon Data Breach Investigations Report (DBIR) indicates that the finance industry accounted for 927 of those incidents (ranked #4 among all sectors) and 207 of the breaches (third overall, behind only the public sector and healthcare). These organizations also suffered the second-highest average cost of a data breach at $5.86 million – 49 percent greater than the $3.92 million global average for all industries, according to the 2019 Cost of a Data Breach Report from the Ponemon Institute and IBM.

What’s more, increasing activity in the cloud is adding to the complexities of data protection and compliance: Financial services organizations are allocating 41 percent of their IT budgets to the public cloud, up from 34 percent in 2018, according to research from Refinitiv. Sixty-seven percent of the industry’s firms are either already invested in Infrastructure as a Service (IaaS)/public cloud offerings or planning to invest, compared to 59 percent of companies in general, according to 451 Research. Fifty-five percent of these firms are either already invested in Platform as a Service (PaaS) or planning to invest, compared to 46 percent of companies in general.

However, despite the abundant activity in the cloud, there is much trepidation: Technologies supporting cloud migrations are considered the top contributor to cybersecurity risk for financial services organizations, as cited by three-fifths of the industry’s security practitioners, according to research from Ponemon. (Blockchain tools ranked #2, as cited by 52 percent of these professionals.)

At Caveonix, we have dedicated ourselves to helping financial services organizations readily comply, thus enabling them to better defend their data and systems – whether on-premises or in the cloud. Our Caveonix RiskForesight platform implements continuous compliance to ensure our customers are meeting regulatory standards across their infrastructure and applications. The solution tracks and reports adherence to compliance requirements and determines the impact of configuration and vulnerability changes on compliance. RiskForesight detects compliance drift and shows how to bring the environment back into compliance.
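
To make “compliance drift” concrete, here is a minimal sketch of the general idea – a hypothetical illustration, not RiskForesight’s implementation: record the control state at the last successful assessment, then continuously diff the current state against that validated baseline and flag deviations for remediation.

    # Hypothetical sketch of compliance-drift detection: diff the current
    # control state against a validated baseline and report deviations.
    # Control names here are illustrative, not a real PCI DSS mapping.

    validated_baseline = {  # state captured at the last successful assessment
        "firewall_default_deny": True,
        "cardholder_data_encrypted": True,
        "quarterly_scan_passed": True,
    }

    def detect_drift(current_state: dict) -> list:
        """Return the controls that have drifted from the validated baseline."""
        return [
            control
            for control, expected in validated_baseline.items()
            if current_state.get(control) != expected
        ]

    # Example: a configuration change silently disabled encryption at rest.
    current = dict(validated_baseline, cardholder_data_encrypted=False)
    for control in detect_drift(current):
        print(f"Compliance drift detected: {control}")  # route to remediation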

RiskForesight distinguishes itself by delivering an ongoing, continuous evaluation of where our customers stand, identifying and assessing compliance and risk wherever their data exists. The Caveonix team firmly believes that if you can see it, you can quantify it. And if you can quantify it, you can apply the governance required to incorporate effective controls. We understand that satisfying regulations is about more than “checking boxes” – it’s about successfully managing risk and protecting information in the interest of operational integrity and brand reputation. If you’d like to talk about how we can work together to help solve your compliance/cybersecurity problems, then please contact us.

Wednesday, July 22, 2020

CRITICALSTART CEO Rob Davis selected as EY Entrepreneur of the Year® 2020 Southwest Finalist

I'm so incredibly proud of Rob!! Congratulations!!! He's the best brother, mentor, and friend you could have. I still learn from him constantly. He's exacting and demanding of the best you can give, but selfless and giving to help you get there. 

https://www.criticalstart.com/rob-davis-selected-as-ey-entrepreneur-of-the-year-finalist/

"Congratulations to CRITICALSTART CEO, Rob Davis, who has been selected by an independent panel of judges as an EY Entrepreneur Of The Year® 2020 Southwest Finalist. Rob was selected as a finalist for his vision, leadership, and determination in our ever-changing world.

Widely considered one of the most prestigious business award programs in the U.S., this program recognizes entrepreneurs and leaders of high-growth companies who are excelling in areas such as innovation, financial performance, and personal commitment to their businesses and communities – while also transforming our world.

Regional award winners will be virtually announced on Friday, October 2, 2020."

Friday, June 26, 2020

AWS, Accenture, Palo Alto: State of the Cloud

3,000 people consulted by AWS, Accenture, and Palo Alto Networks across five industries:
  • Financial Services and Insurance,
  • Consumer and Industrial Products,
  • Energy and Resources,
  • Technology and Telecommunications, and
  • Life Sciences and Healthcare

Download here: 
https://www.paloaltonetworks.com/state-of-cloud-native-security

Selected Highlights 
Compute spread: VMs 30%, containers 24%, CaaS 21%, PaaS 22%

We are in a multicloud, multi-compute world
  • 94% of organizations use more than 1 cloud platform
  • 60% use between 2 and 5 platforms
  • AWS is the most popular public cloud service provider
The top three challenges for moving workloads to the cloud
  • Technical complexity (42%)
  • Maintaining comprehensive security (39%)
  • Ensuring compliance (32%)
More security tools doesn’t necessarily mean better security
  • Companies investing more than $100 million in cloud are trimming the number of tools they use. 53% of this high-spending group use just 5 or fewer cloud security tools
  • Acquiring more tools and vendors can create inefficiencies and make employee tool training more difficult
  • Companies start to see overlaps between tools and vendor offerings, so they consolidate and rationalize tools and tool providers
  • 71% of companies use third-party vendor tools, 65% use CSP-provided security tools and 62% use open source tools
Cloud security spend is rapidly catching up with cloud spend
  • Cloud security spend is highest for companies with an annual cloud budget of $100 million or more
  • 34% of these high spenders allocate 16% or more of their cloud budget to security
Greatest challenges to providing comprehensive cloud security
  1. Lack of visibility 15%
  2. Tool training 14%
  3. Safe practice training 11%
  4. Evaluating current state 11%
  5. Integration of security tools 10%
  6. Securing budget 10%
  7. Security reporting tools 8%
  8. Automation 6%
  9. Executive buy-in 4%
  10. Other 1%
More Numbers
  • 65% of surveyed organizations invested more than 10% of their 2019 cloud budget in securing their cloud estates.
  • 58% use 6 or more cloud security vendors.
  • 57% use 6 or more cloud security tools.
  • 77% of companies have cloud security teams bigger than 20 people.

Tuesday, June 23, 2020

Hot News! 25% of enterprises will be 100% all-cloud by Summer 2021.

All-in on All-cloud! 

Excellent summary by Joe McKendrick on ZDNet of a survey conducted by O'Reilly Media covering large organizations with more than 10,000 employees. Market share for this study was AWS at 67%, Azure at 48%, and GCP at 32%.

Here's the summarized takeaways:
  • 17% of large organizations have already moved 100% of their applications to the cloud.
  • 25% of large organizations plan to move 100% of their infrastructure to the cloud next year.
  • 54% of large organizations use multiple CSPs.
  • 67% of large organizations plan to move more than 50% of their infrastructure to the cloud.
  • 90% of large organizations expect to increase their usage of cloud infrastructure.
  • 34% of large organizations are using serverless computing.
  • 52% of large organizations use microservices, with 70% saying they have less than 3 years' experience using them.
  • 35% of large organizations have implemented Site Reliability Engineering (SRE).
  • 47% of large organizations expect to implement Site Reliability Engineering (SRE).
Great work Joe and O'Reilly Media!

Friday, April 24, 2020

How to Gain Control Over a Multi-Cloud Environment


Here's my contribution to an InformationWeek article that will publish soon.

What's the best way for an organization to gain control over a multi-cloud environment?
    The purpose of gaining control is to manage risk. Period. An effective risk management framework looks at the priority and impact of each element and makes a decision about how to handle the risk. Start with which requirements apply to which assets. Then implement uniform requirements and test security and compliance controls before authorizing production. Starting from a known good state, continuously validate the environment, looking for deviations from your acceptable risk.

    The six steps of the NIST Risk Management Framework (RMF) are usually shown in a circular pattern – wash, rinse, repeat (a minimal sketch in code follows the list):
    1. Categorize System
    2. Select Controls
    3. Implement Controls
    4. Assess Controls
    5. Authorize System
    6. Monitor System  
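
    To illustrate the circular pattern, here is a minimal sketch – hypothetical stand-in functions, not an official NIST artifact – of the six steps as a loop that starts over when monitoring detects unacceptable drift:

        # Hypothetical sketch of the RMF cycle; each step body is a
        # stand-in for your organization's real process.

        def categorize(system):        # 1. Categorize System (impact level)
            return "moderate"

        def select_controls(impact):   # 2. Select Controls (e.g., a baseline)
            return ["AC-2", "AU-6", "CM-6", "SI-4"]

        def implement(controls):       # 3. Implement Controls
            return {c: True for c in controls}   # pretend all are in place

        def assess(state):             # 4. Assess Controls: list failures
            return [c for c, ok in state.items() if not ok]

        def authorize(findings):       # 5. Authorize System (risk decision)
            return len(findings) == 0

        state = implement(select_controls(categorize("payments-app")))
        if authorize(assess(state)):
            # 6. Monitor System: in production this runs continuously, and
            # unacceptable drift kicks off the cycle again. Wash, rinse, repeat.
            state["CM-6"] = False                # simulate configuration drift
            print("Drift detected:", assess(state))
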
    Why does this technique work so well?

    An organization looking to gain control of its cloud footprint already feels the pain from its distribution of assets and differences in technology stacks and features. A solid risk management framework is required by every regulation, standard, and security best practice framework. ISO. PCI. HIPAA. GDPR. Name drop… Keep going. Why? Because getting everything set up properly the first time is difficult. But managing it over time? Much more difficult. Governance is the new gold rush.

    Is there anything else an organization can do to make a multi-cloud environment more manageable?

    Gartner performed a comprehensive study of risk management frameworks several years ago. Their conclusion? It doesn't matter which one you implement. You need to implement one. Complex environment? Cloud computing? Multiple geographies? Focus on risk management. Decide how to identify and measure risk, quantify it, and produce truly actionable results. A report with 15,000 entries doesn't help me. Show me where to focus and specifically tell me why.
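
    As one hypothetical illustration of "truly actionable" – my example, not Gartner's method – score each finding by likelihood times impact, rank the results, and surface only the top items along with the reason they matter:

        # Hypothetical prioritization sketch: likelihood x impact scoring
        # collapses a huge findings report into "fix these first, and why."

        findings = [
            {"id": "F-101", "desc": "Storage bucket publicly readable", "likelihood": 5, "impact": 5},
            {"id": "F-102", "desc": "TLS 1.0 on an internal load balancer", "likelihood": 2, "impact": 3},
            {"id": "F-103", "desc": "Root account without MFA", "likelihood": 3, "impact": 5},
        ]

        ranked = sorted(findings, key=lambda f: f["likelihood"] * f["impact"], reverse=True)
        for f in ranked[:2]:  # show me where to focus...
            score = f["likelihood"] * f["impact"]
            print(f'{f["id"]} (risk {score}/25): {f["desc"]}')  # ...and why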

    What's the biggest mistake enterprises make when trying to gain better control over a multi-cloud environment?
      The biggest mistakes enterprises make:
      1. Failing to prioritize and rank the impact of your findings. This is the purpose of a risk management framework.
      2. Failing to take any action as you look for perfect information, solutions, or decision points. Analysis paralysis. Take the problem, break it down into small chunks, and delegate.
      3. Ignoring blind spots because you don't have enough technical depth or visibility across the multi-cloud environment.
      Is there anything else you would like to add?

      Measure. Act. Measure again. Figure out how to incorporate risk management into your entire environment.

      Wednesday, March 4, 2020

      Cloud Security & Compliance Success = Repeating a Thousand Good Decisions.

      There is no way around it. You must maintain the security of your systems – and the compliance of your systems – over time.

      What does this mean? How do we get there? Here's the short answer.

      Assurance is the confidence in your ability to provide for the confidentiality, integrity, availability, and accounting of your systems (NIST). Getting your systems set up "in the cloud" is one thing. Doing it with the appropriate controls to enforce security and meet compliance requirements… That's another. Step up the maturity of your processes and maintain your security and compliance posture over time. Step it up further to provide actionable intelligence for proactive (automatic or manual) responses. Document everything. Details matter.

      This has been discussed before, and this is an illustration that I created a few years ago. It came up again this week while discussing governance with a peer. Maintaining and governing systems requires work. It. Just. Does.

      Automation is a great tool, but consider the details as well. What is your scope? Is it accurate? How much of it does automation cover? Who covers the rest? What about the business processes? Which ones support your endeavor? Which ones are antagonistic? What is an acceptable substantive change? According to what metrics? How do you measure your success?

      It's difficult to do. It requires revisiting previously executed routines multiple times and measuring your results. Success comes when you stop looking at this as a series of athletic sprints (not to be confused with Agile sprints) and instead set yourself up with the endurance to excel in ongoing operations.

      Wednesday, February 12, 2020

      15 Top-Paying IT Certifications for 2020

      Here's an interesting article from Global Knowledge. Notice that the top certifications are focused on cloud, security, and risk. Importantly, both here and among the pundits discussing this list, you'll see that in many cases multiple certifications matter, demonstrating a broad level of knowledge, and that the average level of experience may be higher than you think.

      Take a look at the details here:
      https://www.globalknowledge.com/us-en/resources/resource-library/articles/top-paying-certifications/.

      Top-paying certifications:

      1. Google Certified Professional Cloud Architect — $175,761
      2. AWS Certified Solutions Architect – Associate — $149,446
      3. CISM – Certified Information Security Manager — $148,622
      4. CRISC – Certified in Risk and Information Systems Control — $146,480
      5. PMP® – Project Management Professional — $143,493
      6. CISSP – Certified Information Systems Security Professional — $141,452
      7. CISA – Certified Information Systems Auditor — $132,278
      8. AWS Certified Cloud Practitioner — $131,465
      9. VCP6-DCV: VMware Certified Professional 6 – Data Center Virtualization — $130,226
      10. ITIL® Foundation — $129,402
      11. Microsoft Certified: Azure Fundamentals — $126,653
      12. Microsoft Certified: Azure Administrator Associate — $125,993
      13. CCA-N: Citrix Certified Associate – Networking — $125,264
      14. CCNP Routing and Switching — $119,178
      15. CCP-V: Citrix Certified Professional – Virtualization — $117,069

      Monday, February 10, 2020

      Scaled Agile Framework®: SAFe® for Lean Enterprises

      Image Credit: https://www.scaledagileframework.com

      I'm a supportive fan of what Dean Leffingwell, Creator of SAFe® and Chief Methodologist at Scaled Agile, Inc., has done with Agile development. He's created an excellent body of work. Please take a minute to view his website for more information and learn how to leverage his work and lower your project risk with your own teams.

      Why is this important? Let's say your strategy and technology decisions are correct. Tactical execution is still difficult. Presuming your strategic direction is on target, how do you get there? Effective execution can be learned, and here's one place to start. The unfortunate reality is that poorly executed projects often result in insecure systems. Security controls are poorly implemented or left out entirely. I've often said that, based on the paperwork alone, I get a good feel for how well the security controls are implemented. Stop taking shortcuts. Do it right the first time.

      Following is from his website:

      The latest version, SAFe 5.0, is built around the Seven Core Competencies of the Lean Enterprise that are critical to achieving and sustaining a competitive advantage in an increasingly digital age: 

      1. Lean-Agile Leadership – Advancing and applying Lean-Agile leadership skills that drive and sustain organizational change by empowering individuals and teams to reach their highest potential 
      2. Team and Technical Agility – Driving team Agile behaviors as well as sound technical practices including Built-in Quality, Behavior-Driven Development (BDD), Agile testing, Test-Driven Development (TDD), and more 
      3. Agile Product Delivery – Building high-performing teams-of-teams that use design thinking and customer-centricity to provide a continuous flow of valuable products using DevOps, the Continuous Delivery Pipeline, and Release on Demand 
      4. Enterprise Solution Delivery – Building and sustaining the world’s largest software applications, networks, and cyber-physical solutions 
      5. Lean Portfolio Management – Executing portfolio vision and strategy formulation, chartering portfolios, creating the Vision, Lean budgets and Guardrails, as well as portfolio prioritization, and roadmapping 
      6. Organizational Agility – Aligning strategy and execution by applying Lean and systems thinking approaches to strategy and investment funding, Agile portfolio operations, and governance 
      7. Continuous Learning Culture – Continually increasing knowledge, competence, and performance by becoming a learning organization committed to relentless improvement and innovation

      Tuesday, January 28, 2020

      Resources Serving Veterans

      More than 900 hours of free cybersecurity training for any government employee or veteran. Awesome.

      The barrier to learning has been lowered to a decision of whether you want to do it or not. Go do it. Make it happen. Make a decision to improve the rest of your life today.

      "The Federal Virtual Training Environment (FedVTE) provides free online cybersecurity training to federal, state, local, tribal, and territorial government employees, federal contractors, and US military veterans."
      Also, fellow veterans... check out:

      Friday, January 24, 2020

      Homage to the Legends. Reference Monitor - October 19

      Recently, I was asked to speak to a group of middle school students. One of the questions that I asked them was, "Why is it possible to build ever-increasingly complex software and systems?" The answer is simple. Many have gone before us to build the foundations and think through complex problems. We get to benefit from their knowledge and accelerate our learning.
      Occasionally, there are special luminaries that either by sheer luck or sheer genius create breakthroughs in thinking that everyone else gets to benefit from.

      James P. Anderson is responsible for pushing the understanding and usefulness of several powerful and simple constructs in computer security.

      I continue to hear much of his material regurgitated as original thought. It's fine, insofar as somebody really does have an original thought. It's interesting to me just how relevant his research and ideas are today, nearly half a century after his original content was published.
      NIST: http://csrc.nist.gov/publications/history/ande72a.pdf 

      Understanding the History
      (text below is from the Orange Book.)
      Trusted Computer System Evaluation Criteria
      https://fas.org/irp/nsa/rainbow/std001.htm
      DoD 5200.28-STD
      Library No. S225,711
      DEPARTMENT OF DEFENSE STANDARD

      Historical Perspective
      In October 1967, a task force was assembled under the auspices of the Defense Science Board to address computer security safeguards that would protect classified information in remote-access, resource-sharing computer systems. The Task Force report, "Security Controls for Computer Systems," published in February 1970, made a number of policy and technical recommendations on actions to be taken to reduce the threat of compromise of classified information processed on remote-access computer systems.[34]  Department of Defense Directive 5200.28 and its accompanying manual DoD 5200.28-M, published in 1972 and 1973 respectively, responded to one of these recommendations by establishing uniform DoD policy, security requirements, administrative controls, and technical measures to protect classified information processed by DoD computer systems.[8;9]  Research and development work undertaken by the Air Force, Advanced Research Projects Agency, and other defense agencies in the early and mid 70's developed and demonstrated solution approaches for the technical problems associated with controlling the flow of information in resource and information sharing computer systems.[1]  The DoD Computer Security Initiative was started in 1977 under the auspices of the Under Secretary of Defense for Research and Engineering to focus DoD efforts addressing computer security issues.[33]

      Concurrent with DoD efforts to address computer security issues, work was begun under the leadership of the National Bureau of Standards (NBS) to define problems and solutions for building, evaluating, and auditing secure computer systems.[17]  As part of this work NBS held two invitational workshops on the subject of audit and evaluation of computer security.[20;28]  The first was held in March 1977, and the second in November of 1978.  One of the products of the second workshop was a definitive paper on the problems related to providing criteria for the evaluation of technical computer security effectiveness.[20]  As an outgrowth of recommendations from this report, and in support of the DoD Computer Security Initiative, the MITRE Corporation began work on a set of computer security evaluation criteria that could be used to assess the degree of trust one could place in a computer system to protect classified data.[24;25;31]  The preliminary concepts for computer security evaluation were defined and expanded upon at invitational workshops and symposia whose participants represented computer security expertise drawn from industry and academia in addition to the government.  Their work has since been subjected to much peer review and constructive technical criticism from the DoD, industrial research and development organizations, universities, and computer manufacturers.

      The DoD Computer Security Center (the Center) was formed in January 1981 to staff and expand on the work started by the DoD Computer Security Initiative.[15]  A major goal of the Center as given in its DoD Charter is to encourage the widespread availability of trusted computer systems for use by those who process classified or other sensitive information.[10]  The criteria presented in this document have evolved from the earlier NBS and MITRE evaluation material.

      Here is one of the more powerful security constructs that I use on a regular basis as a simple model that can be explained to others. I first learned this in 1999 as part of my CISSP studies.

      6.1  THE REFERENCE MONITOR CONCEPT
      In October of 1972, the Computer Security Technology Planning Study, conducted by James P.  Anderson & Co., produced a report for the Electronic Systems Division (ESD) of the United States Air Force.[1]  In that report, the concept of "a reference monitor which enforces the authorized access relationships between subjects and objects of a system" was introduced.  The reference monitor concept was found to be an essential element of any system that would provide multilevel secure computing facilities and controls.

      The Anderson report went on to define the reference validation mechanism as "an implementation of the reference monitor concept .  .  .  that validates each reference to data or programs by any user (program) against a list of authorized types of reference for that user." It then listed the three design requirements that must be met by a reference validation mechanism:
           a. The reference validation mechanism must be tamper proof.
           b. The reference validation mechanism must always be invoked.
           c. The reference validation mechanism must be small enough to be subject to analysis and tests, the completeness of which can be assured."[1]

      Extensive peer review and continuing research and development activities have sustained the validity of the Anderson Committee's findings.  Early examples of the reference validation mechanism were known as security kernels.  The Anderson Report described the security kernel as "that combination of hardware and software which implements the reference monitor concept."[1]  In this vein, it will be noted that the security kernel must support the three reference monitor requirements listed above.
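
      To make the three requirements concrete, here is a minimal, hypothetical sketch of a reference validation mechanism, with made-up subjects and objects. It is small enough to analyze (requirement c) and sits on the only path to the object (requirement b); a real implementation must also be tamper proof (requirement a), enforced by hardware or a kernel rather than by application code.

          # Hypothetical sketch of the reference monitor concept: one small,
          # always-invoked mediation point between subjects and objects.

          AUTHORIZATIONS = {  # authorized types of reference per (subject, object)
              ("alice", "payroll.db"): {"read"},
              ("bob", "payroll.db"): {"read", "write"},
          }

          def reference_monitor(subject: str, obj: str, access: str) -> bool:
              """Validate each reference against the authorized access types."""
              return access in AUTHORIZATIONS.get((subject, obj), set())

          def read_object(subject: str, obj: str) -> str:
              # The monitor is always invoked; there is no path around it.
              if not reference_monitor(subject, obj, "read"):
                  raise PermissionError(f"{subject} may not read {obj}")
              return f"contents of {obj}"

          print(read_object("alice", "payroll.db"))  # permitted
          # read_object("eve", "payroll.db") would raise PermissionError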