“Tool Bloat” Slows Cloud Threat Resolution Time: Palo Alto Networks

According to a survey conducted by Palo Alto Networks, 39% of global organizations reported a surge in breaches over the past year. The security vendor polled over 2,500 respondents in the US, Australia, Germany, France, Japan, Singapore, and the UK:
- 90% said they are unable to detect, contain and resolve cyber-threats within an hour
- 42% reported an increase in mean time to remediate
- 30% reported a major increase in intrusion attempts and unplanned downtime
Part of the challenge appears to be the complexity of their cloud security environments – partly caused by tool bloat.
- 76% said that the number of cloud security tools they use creates blind spots
- 77% said they struggle to identify what tools are necessary to achieve their objectives
A previous Palo Alto study revealed that organizations rely on over 30 tools for security, including 6–10 cloud security products.
I have two expert comments on this. The first is from Dave Ratner, CEO at HYAS:
“The growing complexity of cloud environments, whether it is hybrid cloud, multi-cloud, or simply a growing infrastructure, means that it’s easy to lose the visibility of what’s actually going on inside the environment. Without the proper visibility, it’s increasingly difficult to ensure proper controls, which provides great opportunities for bad actors to hide without being seen, communicate with their command-and-control for instructions and data exfiltration without being detected, and otherwise perform nefarious actions at will.
“What’s required is the proper level of visibility and observability into the environments to detect, in real-time, any and all anomalous communications — only then can organizations actually enforce their controls, cut down on the mean-time to detect anomalous communications, and shine a light on the bad actors’ hiding spots.
“While this visibility may have been performed in the past through deep packet inspection or other mechanisms, the growth and complexity of the cloud environments makes that nearly impossible at scale; nevertheless, organizations which monitor and track their DNS traffic can actually address this problem in a light-weight, easy to deploy, easy to manage, and inexpensive to operate manner. This allows organizations to shift left, move into a true business resiliency and business continuity program, detecting and shutting down anomalies in the network before they become significant breaches and issues.”
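Ratner's point about DNS monitoring can be illustrated with a minimal sketch. One lightweight heuristic used in DNS-based detection is flagging query names whose labels look machine-generated, since high-entropy labels are a common trait of the domain-generation algorithms (DGAs) that malware uses to locate command-and-control servers. The threshold and domain names below are illustrative assumptions, not tuned production values:

```python
import math
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Shannon entropy of a string, in bits per character."""
    counts = Counter(s)
    total = len(s)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def flag_suspicious_domains(queried_domains, entropy_threshold=3.5):
    """Flag DNS names whose leftmost label looks machine-generated.

    A long, high-entropy label is one weak signal of DGA-style C2
    traffic; real deployments combine many such signals.
    """
    flagged = []
    for domain in queried_domains:
        label = domain.split(".")[0]  # leftmost label, e.g. "kq3x9vz7w1f4"
        if len(label) >= 8 and shannon_entropy(label) >= entropy_threshold:
            flagged.append(domain)
    return flagged

queries = ["mail.google.com", "kq3x9vz7w1f4.badcdn.net", "intranet.corp.local"]
print(flag_suspicious_domains(queries))  # only the high-entropy name is flagged
```

This is exactly the kind of check that is cheap to run over DNS logs at scale, in contrast to deep packet inspection of the underlying traffic.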
Bryson Bort, Founder and CEO at SCYTHE, follows up with this:
“A threat can only hack what they can touch: surface area is the technical range of this. The more code (software) with the more features accessible (beware default configurations!), the more opportunities you have provided a potential threat. A large percentage of software is installed with the default configurations (this is now part of the threat’s test matrix for their attacks) or sub-optimally configured (likely increasing risk).
“First step, which takes just a few minutes: map all of your tools by category of what they defend (assets, users, etc.) against the NIST CSF defensive phases: Identify (Configuration Management), Protect, Detect, Respond, and Recover. Now you know what’s generally covered and you’ve identified overlap where you are over-exposed. Now, make the tools work for you! Invest in validating your assumptions (does this block/see what I think it does?) and optimizing how they’re configured.
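Bort's mapping exercise can be sketched in a few lines. The tool names and their category assignments below are hypothetical examples, not a recommended inventory; the point is that once the map exists, gaps and overlaps against the five CSF functions fall out mechanically:

```python
# Hypothetical tool inventory mapped to NIST CSF functions.
CSF_FUNCTIONS = ["Identify", "Protect", "Detect", "Respond", "Recover"]

tool_map = {
    "Asset inventory scanner": ["Identify"],
    "EDR agent": ["Protect", "Detect", "Respond"],
    "Cloud firewall": ["Protect"],
    "SIEM": ["Detect", "Respond"],
    "Backup service": ["Recover"],
    "CSPM platform": ["Identify", "Detect"],
}

# Invert the map: which tools cover each CSF function?
coverage = {fn: [t for t, fns in tool_map.items() if fn in fns]
            for fn in CSF_FUNCTIONS}

for fn in CSF_FUNCTIONS:
    tools = coverage[fn]
    status = "GAP" if not tools else ("overlap" if len(tools) > 1 else "ok")
    print(f"{fn:9s} [{status:7s}] {', '.join(tools) or '-'}")
```

An empty list marks a coverage gap; a long list marks the overlap Bort warns about, where consolidation (or validation that each tool actually does what you assume) is worth the effort.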
“Security is defined by the threat, so a Continuous Threat and Exposure Management approach is the best practice by driving real threat behaviors safely in your environment and continuously so it’s helping you adapt to the rate of change of your business.”
The complexity of managing cloud environments has clearly become the next battleground between threat actors and those who defend against them. Hopefully those who are on the side of the good guys read reports like these and take action to prevent bad things from happening to them.
This entry was posted on March 8, 2023 at 3:31 pm and is filed under Commentary with tags Palo Alto. You can follow any responses to this entry through the RSS 2.0 feed.
You can leave a response, or trackback from your own site.