Archive for January 28, 2025

Today Is Data Privacy Day

Posted in Commentary on January 28, 2025 by itnerd

January 28th is Data Privacy Day, an annual international effort to raise awareness about the importance of data protection and privacy. Below, cybersecurity experts share their insights on this important day and on how crucial it is for people and organizations alike to protect their data.

Paul Bischoff, Consumer Privacy Advocate at Comparitech:

“Data privacy used to be about protecting your private information from hackers, criminals, and data brokers. Now we can add AI to that list. AI programs scrape as much data as they can from public sources to train their algorithms. As a result, personal info can be included in an AI’s response to a prompt, either intentionally or unintentionally. AI significantly lowers the barriers to finding and collecting information, making it easier for criminals to abuse personal data. I recommend disallowing search engines and other bots from scraping your social media accounts if it’s an option, and removing as much identifying personal information from your profiles as possible. Use a data removal service like Incogni or PrivacyBee to get your data out of the hands of data brokers.”

Chris Hauk, Consumer Privacy Champion at Pixel Privacy:

“It’s important for users to take control of their data privacy. I strongly recommend contacting data brokers to have your information removed from their servers. Data brokers are popular targets for hackers, putting all of the data (your data) on the brokers’ servers at risk. While it can be a time-consuming process, it is worth it in the long run. If you are pressed for time, subscribe to a service like Incogni, which will contact the data brokers on your behalf while keeping you informed of its progress.”

“AI is also a rising threat to data privacy. As the use of AI rises, so does the threat to customers’ data. Organizations must take steps to put sufficient security in place so that customer data is not inadvertently shared.”

“We continue to see misconfigured data buckets on cloud storage providers like AWS expose data to hackers. Several times, unprotected databases on AWS and other cloud providers have exposed customer and company data to the world, simply because the security protections are misconfigured. This has to stop.”
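The misconfigurations described above are often detectable with a simple automated audit. The sketch below is illustrative only: the ACL dictionary mirrors the shape returned by AWS's GetBucketAcl API, but `public_grants` and the sample `leaky_acl` are hypothetical, and no real AWS calls are made.

```python
# Minimal sketch: flag cloud-storage ACL grants that expose a bucket publicly.
# The dict structure mirrors AWS's GetBucketAcl response shape, but this is
# a standalone illustration -- no real AWS calls are made.

# URIs that mean "everyone" (or "every authenticated AWS user") in S3 ACLs.
PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def public_grants(acl: dict) -> list[str]:
    """Return the permissions this ACL hands out to the public."""
    exposed = []
    for grant in acl.get("Grants", []):
        grantee = grant.get("Grantee", {})
        if grantee.get("Type") == "Group" and grantee.get("URI") in PUBLIC_GRANTEES:
            exposed.append(grant.get("Permission", "UNKNOWN"))
    return exposed

# Example: a bucket that anyone on the internet can read from.
leaky_acl = {
    "Grants": [
        {"Grantee": {"Type": "CanonicalUser", "ID": "owner"},
         "Permission": "FULL_CONTROL"},
        {"Grantee": {"Type": "Group",
                     "URI": "http://acs.amazonaws.com/groups/global/AllUsers"},
         "Permission": "READ"},
    ]
}

if public_grants(leaky_acl):
    print("WARNING: bucket is publicly accessible:", public_grants(leaky_acl))
```

In practice, enabling a provider-level guardrail such as S3 Block Public Access, rather than auditing ACLs bucket by bucket, is the more robust fix for the class of exposure described above.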

Carlos Aguilar Melchor, Chief Scientist, Cybersecurity at SandboxAQ:

“Privacy Day highlights the importance of safeguarding personal information and advancing secure systems in an increasingly interconnected world. We are seeing organizations across the globe push toward a Zero Trust Architecture (ZTA) strategy, which underscores a shift to ‘never trust, always verify’ principles, enhancing data security and resilience against cyber threats. Simultaneously, the ongoing transition to Post-Quantum Cryptography (PQC) is crucial to future-proofing encryption against the potential risks posed by quantum computing, ensuring privacy and security in the digital age. We are proud to be contributing to these initiatives through cryptography modernization, which reflects a proactive approach to evolving privacy challenges.”

Jimmy Astle, Senior Director of Detection Enablement at Red Canary:

“The rise of generative AI has brought data privacy to the forefront of global conversations. These AI models, trained on vast amounts of internet-scraped data, have ignited concerns about consent and transparency. Questions are being asked about whether individuals and organizations should be informed if their data is being used in this way.

“It’s clear our current privacy laws are struggling to keep pace with the evolution of technology. However, while generative AI adds complexity, it doesn’t eclipse the existing data privacy concerns we’re already grappling with. In fact, the most pressing challenges still stem from widespread data breaches and apps that exploit personal data for profit.

“What GenAI has done, though, is introduce new dimensions to these existing challenges. For example, we’re seeing a rise in AI-driven SaaS tools that collect and process user data. Technology vendors are increasingly offering opt-out options for their AI features to safeguard user privacy, but this underscores a larger need for more clarity around how data is being used.

“The path forward demands a balance of adaptability, transparency, and regulation. Organizations must take proactive steps to safeguard privacy, including clear communication around data practices and investment in privacy-preserving technologies. Regulators must also work closely with the technology industry to craft policies that protect individuals without hindering progress.”

Guest Post: Empower individuals to control their biometric data: the new challenge across all sectors

Posted in Commentary on January 28, 2025 by itnerd

An opinion piece by Thomas Decker, VP Product Marketing Finance at Linxens

What if your face, fingerprint, or iris were your greatest vulnerability in a cyberattack? The parts of you that are most unique and private are now embedded in our devices, workplaces, and airports, promising seamless access and enhanced security. But there is a dark side to this convenience: uncertainty about where biometric data is stored and how it is used, and cybercriminals have seized on this. Attracted by these potential loopholes, they are probing the security and integrity of our data storage. Trust in biometrics is eroding as individuals worry that their sensitive information is being stored in cloud environments that are vulnerable to breaches and misuse. To address these concerns, the future of biometric access security needs to drive change on an economy-wide scale.

Why the cloud is a concern  

The rise of cloud-based systems has accelerated the adoption of biometric solutions. By storing large amounts of data remotely, cloud platforms allow for scalability and easier system updates. However, high-profile data breaches and unauthorized access to personal information have fueled public skepticism. Deloitte’s 2023 ‘Customer data privacy and security’ survey found that 67% of consumers fear their biometric data could be misused if stored in the cloud, and this concern is particularly acute in regions with strict privacy laws, such as the European Union under the General Data Protection Regulation (GDPR).  

Geopolitical tensions also increase the risks. Critical environments such as airports, military installations, and nuclear power plants cannot afford vulnerabilities in their access systems. In fact, these systems are a goldmine for hackers, who can intercept valuable biometric data and use it to commit serious crimes such as rigging elections, spying on hostile nations, stealing identities, or sabotaging sensitive systems and areas. These are irreversible actions with potentially dramatic consequences.

Moving to localized storage  

Biometric systems that prioritize edge computing offer a solution. Instead of sending data to the cloud, biometric information is processed and stored locally on secure devices or smart cards. These systems eliminate the need to transmit data over networks, dramatically reducing the risk of potential hacking.  

For example, smart cards embedded with biometric data allow users to authenticate their identity without needing to interact with the cloud. This decentralized approach enhances privacy as the data remains under the control of the user and is less likely to fall prey to cyber-attacks. It also complies with ethical and legal frameworks by giving users autonomy over their personal information.  

Strategically securing high priority environments  

Industries that handle sensitive materials or information – such as pharmaceuticals, energy, and defense – demand the highest levels of access security. Traditional access systems, such as swipe cards or PIN codes, are not enough to prevent unauthorized access. Biometrics offers a reliable alternative for these high-risk industries, but only if it is implemented without introducing new vulnerabilities.

Some organizations have already deployed on-premises biometric solutions that process data in a closed environment, ensuring that sensitive information never leaves the facility. For example, nuclear power plants are increasingly using locally stored multimodal biometric systems (e.g. combining fingerprint and iris scans) to strengthen access controls. Similarly, the military and financial institutions are adopting innovative technologies such as biometric smart cards: personal data is stored exclusively on the card itself, without recourse to the cloud or external servers. This not only reduces the risk of data leakage but also ensures strict compliance with the GDPR by guaranteeing secure, local management of personal data.

Challenges and the way forward

Despite its benefits, localized biometric security faces challenges, especially as local devices must be robust enough to prevent tampering and cyber intrusions.  

To overcome these hurdles, manufacturers are investing in advanced encryption techniques and tamper-resistant hardware. The use of biometric templates (mathematical representations of biometric data rather than raw images) also mitigates risks. These templates cannot be reverse-engineered into the original data, further protecting users’ privacy.
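One family of such templates, often called cancelable biometrics, applies a non-invertible transform to the raw feature vector before storage, and matching then happens entirely in the transformed space. The sketch below is a toy illustration, not a production scheme: the 8-dimensional "feature vectors" stand in for real fingerprint or iris features, and a fixed random projection to a lower dimension serves as the one-way transform, since it discards information and cannot be inverted back to the original vector.

```python
# Toy sketch of a cancelable biometric template: project the raw feature
# vector to a lower dimension with a user-specific random matrix. The
# projection discards information, so the stored template cannot be
# inverted back into the original biometric features.

import math
import random

def make_projection(seed: int, in_dim: int, out_dim: int) -> list[list[float]]:
    """User-specific random projection matrix (acts as a revocable 'key')."""
    rng = random.Random(seed)
    return [[rng.gauss(0, 1) for _ in range(in_dim)] for _ in range(out_dim)]

def to_template(features: list[float], matrix: list[list[float]]) -> list[float]:
    """Non-invertible transform: project features to a shorter template."""
    return [sum(w * f for w, f in zip(row, features)) for row in matrix]

def matches(t1: list[float], t2: list[float], threshold: float = 1.0) -> bool:
    """Compare templates by Euclidean distance in the transformed space."""
    return math.dist(t1, t2) < threshold

matrix = make_projection(seed=42, in_dim=8, out_dim=4)

enrolled = to_template([0.9, 0.1, 0.4, 0.7, 0.2, 0.8, 0.3, 0.6], matrix)
# Same finger, slightly noisy re-scan:
rescan   = to_template([0.88, 0.12, 0.41, 0.69, 0.2, 0.79, 0.31, 0.6], matrix)
# A different person's features entirely:
imposter = to_template([0.1, 0.9, 0.8, 0.1, 0.9, 0.1, 0.9, 0.1], matrix)

print(matches(enrolled, rescan))  # True: the noisy re-scan lands near the template
print(math.dist(enrolled, rescan) < math.dist(enrolled, imposter))  # True
```

If a template is ever leaked, a new projection matrix (a new seed) yields a fresh, unrelated template from the same biometric, which is what makes the scheme "cancelable"; production systems use considerably more sophisticated transforms and matching logic than this sketch.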

Looking ahead, biometric systems will need to balance convenience, security, and ethical responsibility. By moving away from cloud dependency, organizations can rebuild public trust while securing critical environments. 

Ultimately, to fully realize the potential of localized biometric systems, the industry must come together to establish standards and best practices. This is not just a technological shift but an ethical and strategic imperative to rebuild trust and safeguard critical environments.

The future of access security lies not in centralized technologies such as the cloud, but in empowering individuals to control their own data. The question is not whether industries can adapt to this ethical evolution, but how quickly they will embrace this shift.