The Department of Health and Human Services released a concept paper outlining its healthcare cybersecurity strategy and establishing goals for improving the sector’s cybersecurity posture, including future updates to HIPAA and the establishment of voluntary performance goals.
According to the HHS Office for Civil Rights, large breaches reported to OCR increased 93% from 2018 to 2022, with a 278% increase in large breaches involving ransomware.
The healthcare cybersecurity strategy consists of four pillars and focuses on strengthening resilience for hospitals and patients impacted by cyberattacks:
- Publish voluntary healthcare and public health sector cybersecurity performance goals.
- Provide resources to incentivize and implement cybersecurity practices.
- Implement an HHS-wide strategy to support greater enforcement and accountability.
- Expand and mature the one-stop shop within HHS for healthcare sector cybersecurity.
“Taken together, HHS believes these goals, supports, and accountability measures can comprehensively and systematically advance the healthcare sector along the spectrum of cyber resiliency to better meet the growing threat of cyber incidents, especially for high-risk targets like hospitals. Acting on these priorities will protect the health and privacy of all Americans and enable safe access to health care,” reads the paper.
George McGregor, VP, Approov Mobile Security, had this to say:
“It’s a good thing that the initiative aims to provide financial and technical resources for healthcare providers in combination with enforcement.
“However, this announcement is light on specifics about exactly what the voluntary Cybersecurity Performance Goals may be. Further communication needs to detail these or tie them to guidelines that already exist.
“HHS also continues to push for sharing of PII and clinical data between providers as well as with third-party apps and services, and these developments present security risks to providers.
“This means that two critical areas should be addressed directly with enhanced security guidelines for healthcare service providers:
- the security of healthcare APIs such as FHIR interfaces
- the enforcement of protections for mobile apps that access PII, whether owned by the service providers themselves or by third parties”
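To make McGregor’s first point concrete, here is a minimal sketch of token-gating a FHIR endpoint before it serves any PII. It assumes Flask, and validate_token() is a hypothetical stand-in for verifying a SMART on FHIR / OAuth 2.0 access token against your authorization server; none of this comes from the HHS paper:

```python
# Minimal sketch: token-gating a FHIR Patient endpoint (illustrative only).
from flask import Flask, request, jsonify, abort

app = Flask(__name__)

def validate_token(token: str) -> bool:
    # Hypothetical check -- in practice, verify a SMART on FHIR / OAuth 2.0
    # access token's signature, expiry, and scopes against your auth server.
    return token == "expected-demo-token"

@app.route("/Patient/<patient_id>")
def get_patient(patient_id):
    auth = request.headers.get("Authorization", "")
    if not auth.startswith("Bearer ") or not validate_token(auth[len("Bearer "):]):
        abort(401)  # reject unauthenticated callers before touching any PII
    # Placeholder FHIR Patient resource; a real server would query its store.
    return jsonify({"resourceType": "Patient", "id": patient_id})

if __name__ == "__main__":
    app.run(port=8080)
```

The point of the sketch is placement: the token check happens before any patient data is touched, which is exactly the kind of baseline API protection that guidelines could mandate.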
Troy Batterberry, CEO and Founder, EchoMark, follows with this:
“Once again, these government policy papers fail to fully acknowledge the large and disproportionately growing threat of information breaches committed by insiders. Historically, leaks or theft by insiders have been among the most damaging types of information breaches.
“While conventional insider risk management tools, including logging and monitoring, are important and must be implemented as soon as possible, we know they do not go nearly far enough to prevent insider leaks and theft. Insider leaks continue to accelerate at well-run government and commercial organizations all over the world, even with sophisticated monitoring in place. The leaker simply feels they can hide in the anonymity of the group and never be caught. Sadly, today, many of them are right.
“An entirely new approach is required to change human behavior and prevent insider leaks. The best way to do that is to catch leakers, which deters would-be leakers in the future. Information watermarking is one such game-changing technology that can help keep private information private.”
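As an illustration of the general idea (and not of EchoMark’s actual product), here is a minimal sketch of per-recipient text watermarking: each copy of a document gets an invisible, recipient-specific mark, so a leaked copy can be traced back to whoever received it. The zero-width-character encoding and the 16-bit ID are assumptions for the demo:

```python
# Minimal sketch of per-recipient text watermarking (illustrative only).
# A recipient ID is encoded as zero-width characters appended to the text,
# making every distributed copy invisibly unique.
ZW = {"0": "\u200b", "1": "\u200c"}          # zero-width space / non-joiner
ZW_REV = {v: k for k, v in ZW.items()}

def embed(text: str, recipient_id: int) -> str:
    bits = format(recipient_id, "016b")       # 16-bit ID -> bit string
    mark = "".join(ZW[b] for b in bits)
    return text + mark                        # append the invisible watermark

def extract(text: str) -> int | None:
    bits = "".join(ZW_REV[c] for c in text if c in ZW_REV)
    return int(bits, 2) if len(bits) == 16 else None

copy_for_alice = embed("Quarterly results attached.", 42)
print(extract(copy_for_alice))  # -> 42: identifies which copy was leaked
```

Real deployments use far more robust embeddings (ones that survive screenshots, retyping, and reformatting), but the deterrence logic is the same: the leaker can no longer hide in the anonymity of the group.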
Stephen Gates, Principal Security SME, Horizon3.ai, follows with this:
“After reviewing the Healthcare Sector Cybersecurity: Introduction to the Strategy of the U.S. Department of Health and Human Services paper, one strategy it completely misses is continuous security self-assessments. Although the paper references the Health Industry Cybersecurity Practices publication, which mentions the terms assess or assessment 17 times, the strategy paper itself makes no mention of assessments. This should be a wake-up call for those responsible for cybersecurity in the healthcare industry to petition HHS to duly note the value of cybersecurity self-assessments, making them an industry-wide best practice.
“Today, organizations in every industry are beginning to take a preemptive approach to cybersecurity improvement. This preemptive approach is not about deploying more defensive security technologies. Instead of more defenses, it encourages organizations to assess themselves using the same tactics, techniques, and procedures (TTPs) that attackers use, so they can preemptively identify their truly exploitable weaknesses and fix them before falling prey to attackers.
“Across industries and geographies, a considerable movement is surfacing: a call to action for continuous self-assessments using manual and automated adversarial exercises (aka red team exercises). These exercises are not the once-per-year penetration tests or periodic vulnerability scans. Instead, organizations are beginning to adopt and deploy autonomous assessment solutions that can run continuously, so they can rapidly act on the weaknesses these solutions discover in their environments.
“If readers would like to learn more about what this preemptive approach is all about, this whitepaper can help.”
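For a flavor of what “continuous” means here, the sketch below runs a trivial exposure check on a schedule instead of once a year. It is illustrative only: real autonomous assessment platforms of the kind Gates describes chain full attacker TTPs, while this toy loop merely probes a few risky ports on hosts you own (the target list and port set are assumptions):

```python
# Minimal sketch of a continuous self-assessment loop (illustrative only).
import socket
import time

TARGETS = ["127.0.0.1"]                        # hosts you own and may scan
RISKY_PORTS = {23: "telnet", 445: "smb", 3389: "rdp"}

def check_host(host: str) -> list[str]:
    findings = []
    for port, svc in RISKY_PORTS.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(0.5)
            if s.connect_ex((host, port)) == 0:   # 0 means the port is open
                findings.append(f"{host}:{port} ({svc}) is exposed")
    return findings

while True:
    for host in TARGETS:
        for finding in check_host(host):
            print("FIX:", finding)              # feed into your ticketing system
    time.sleep(3600)                            # reassess every hour, not yearly
```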
Having a strategy is a good thing as long as it makes measurable progress toward an IT infrastructure that is resilient to cyberattacks. Let’s see how well this works.


2024 Technology Predictions From Hammerspace
Posted in Commentary with tags Hammerspace on December 9, 2023 by itnerd

Hammerspace has served up their 2024 Technology Predictions about important trends in data management, data storage, and AI. Molly Presley, SVP of Marketing, put these together and they’re very interesting to read.
Organizations will put distributed unstructured data sets to work to fortify their AI strategies and AI data pipelines while simultaneously achieving the performance and scale not found in traditional enterprise solutions. Putting that distributed data to work is one of the biggest challenges organizations face. It is critical that a data pipeline is designed to use all available compute power and can make data available to cloud models such as those found in Databricks and Snowflake. In 2024, high-performance local read/write access to data that is orchestrated globally in real time will become indispensable and ubiquitous.
Organizations will start moving away from “store and copy” to a world of data orchestration. Driven by AI advancements, robust tools now exist to analyze data and tease out actionable insights. However, file storage infrastructure has not kept pace with these advancements. Unlike solutions that try to manage storage silos and distributed environments by moving file copies from one place to another, data orchestration integrates data from different silos and locations into a single namespace and automates the placement of data when and where it is most valuable, making it easier to analyze and derive insights. IT organizations need the flexibility to use all of their data – structured, semi-structured, and unstructured – for iteration, and may need to move different data sets to different models. The data orchestration model allows organizations to realize the benefits of eliminating data copies to new files and repositories – including reducing the time to inference from weeks to hours in large data environments.
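Here is a minimal sketch of the placement idea behind data orchestration, assuming files carry metadata and a policy function decides which tier each file should live on. The tier names and thresholds are invented for illustration; products like Hammerspace make such decisions transparently inside a single global namespace:

```python
# Minimal sketch of metadata-driven data placement (illustrative only).
from dataclasses import dataclass

@dataclass
class FileRecord:
    path: str
    tags: set[str]
    days_since_access: int

def place(f: FileRecord) -> str:
    # Hot AI-training data goes to fast local NVMe; cold data to cloud object
    # storage -- no per-silo copies, just a placement decision per file.
    if "training-set" in f.tags and f.days_since_access < 7:
        return "nvme-tier"
    if f.days_since_access > 90:
        return "cloud-object-tier"
    return "general-tier"

files = [
    FileRecord("/data/scans/img001.dcm", {"training-set"}, 2),
    FileRecord("/data/archive/2019.tar", set(), 400),
]
for f in files:
    print(f.path, "->", place(f))
```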
In 2024, data teams will increasingly use rich, actionable metadata to derive value from data. With the continued growth and business value of unstructured data across all industries, IT organizations must cope with increasing operational complexity when they manage digital assets that span multiple storage types, locations, and clouds. Wrangling data services across silos in a hybrid environment can be an extremely manual and risk-prone process, made more difficult by incompatibilities between different storage types. Metadata has the power to enable customers to solve these problems. Machine-generated metadata and data orchestration are crucial to data insights.
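As a small illustration of machine-generated metadata, the sketch below harvests a few attributes from files into a searchable index. The field names are assumptions, but a content hash like this is the kind of attribute that enables deduplication and tracking across silos:

```python
# Minimal sketch of machine-generated metadata enrichment (illustrative only).
import hashlib
import os
import time

def enrich(path: str) -> dict:
    st = os.stat(path)
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return {
        "path": path,
        "bytes": st.st_size,
        "modified": time.ctime(st.st_mtime),
        "sha256": digest,   # content hash enables dedupe across silos
        "kind": os.path.splitext(path)[1].lstrip(".") or "unknown",
    }

index = [enrich(p) for p in ["notes.txt"] if os.path.exists(p)]
print(index)
```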
In 2024, organizations will increasingly adopt parallel global file systems to truly realize digital transformation. File systems are traditionally buried in a proprietary storage layer, which typically locks them, and an organization’s data, into a storage vendor’s platform. Moving the data from one vendor’s storage type to another, or to a different location or cloud, involves creating a new copy of both the file system metadata and the actual file essence. This proliferation of file copies, and the complexity of managing copies across silos, interrupts user access and is a key problem inhibiting IT modernization and consolidation. The traditional paradigm of the file system trapped in a vendor’s storage platform is inconvenient even within the silos of a single data center, and the increasing migration to the cloud has dramatically compounded the problem, since it is typically difficult for enterprises with large volumes of unstructured data to move all of their files entirely to the cloud. Unlike solutions that try to manage storage silos and distributed environments by shuffling file copies from one place to another, a high-performance parallel global file system that can span all storage types, from any vendor, across one or more locations and clouds is far more effective.