Archive for Privacy

Three of the Top Photo ID Apps Are Leaking Users’ Data

Posted in Commentary with tags on February 12, 2026 by itnerd

Three of the most widely used photo ID mobile applications are reported to have exposed sensitive user data, stemming from misconfigured Firebase instances exacerbated by an absence of attestation – i.e., a backend infrastructure that trusted requests without properly enforcing authentication and authorization controls. Once the backend endpoint was accessible, data could be retrieved directly outside the legitimate app context.
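To see how little stands between an attacker and data behind a misconfigured Firebase instance, here's a minimal sketch. The project name and database path are hypothetical (Cybernews did not publish the affected endpoints), but the mechanics are generic: a Firebase Realtime Database whose rules allow public reads will answer plain REST requests from anyone, with no app attestation in the way.

```python
import json
import urllib.request

def export_url(project: str, path: str) -> str:
    """REST endpoint for a node in a Firebase Realtime Database.
    With rules like {"rules": {".read": true}}, it answers anyone."""
    return f"https://{project}-default-rtdb.firebaseio.com/{path}.json"

def fetch_exposed(project: str, path: str) -> dict:
    """A plain, unauthenticated GET: exactly the request a script or
    emulator outside the legitimate app can issue when attestation is
    absent and the rules are open."""
    with urllib.request.urlopen(export_url(project, path)) as resp:
        return json.load(resp)

# Hypothetical project ID for illustration only.
print(export_url("example-photo-id-app", "users"))
```

This is why Miracco's point about attestation matters: the request above carries nothing that proves it came from the genuine app, so backend rules are the only line of defense.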

TechRadar Pro quotes Cybernews research noting that the exposed data included personal information and backend tokens, and more than 150,000 users were impacted: Dog Breed Identifier Photo Cam has 500K downloads, with 66,182 users affected; Spider Identifier App by Photo has 500K downloads, with 40,779 users affected; and Insect identifier by Photo Cam has 1M downloads, with 45,005 users affected.

Mobile app security expert Ted Miracco, CEO of Approov, notes:

    “These incidents show how mobile backend misconfigurations become breaches when APIs trust requests without verifying the app itself. Runtime app attestation and client-bound credentials can be immediately invoked and will stop attackers from exploiting exposed endpoints, even when backend controls fail. When publishers and B2C brands don’t take active steps to prevent reverse-engineered apps, scripts, or emulators from querying their backend APIs, the result is all too often a wide-open door that’s simple for greedy, data-stealing imposters to walk through.”

This highlights the fact that apps on your phone have to be completely trustworthy. It would really be nice if apps had “nutrition labels” or something like that so that you know what you are getting into. In the absence of that, I’m glad that someone is looking at this.

Data Privacy Week: Warnings for Consumers & Organizations That Are Being Targeted 

Posted in Commentary with tags on January 26, 2026 by itnerd

It’s Data Privacy Week, the National Cybersecurity Alliance’s annual international initiative to empower people and businesses to respect privacy, safeguard data and enable trust.

NCA warns consumers: “Your online activity creates a treasure trove of data – from your interests and purchases to your online behaviors, and it is collected by websites, apps, devices, services, and companies all around the globe, and can even include information about your physical self, like health data.”

To mark the week, three data privacy, cybersecurity and AI experts share timely, helpful data privacy and litigation/risk advice and cautions for consumers, and for the retail, financial, healthcare, entertainment and personal services organizations that target them.

Consumer Advice: Are Your Security Apps Putting You At Risk?

Ifrah Arif, Product Manager at PureVPN, a leader in personal cybersecurity and data privacy protections, warns: “We rely on an array of data privacy and security apps: VPNs, password managers, ad blockers, dark web monitors and more. They can conflict with one another, failing the user just when they’re needed most.”

“Non-integrated security tools from different vendors can actually drive ‘alert storms’ that put sensitive info at risk.

“Notification storms typically arise when someone’s using incompatible, non-integrated password managers, VPNs, dark web monitors, trackers, ad blockers and other security tools from differing vendors. The storm arises when tools roll out uncoordinated alerts and notifications to get the user’s attention. One tool mistakes another tool’s attempt to do its job as a threat, and sends users alerts. The resulting ‘alert fatigue’ often drives users to close their VPN or password manager, opening their devices to threats and exposing themselves to data theft and fraud.”

The recent study, “The Cost of Fragmentation: Measuring Time, Spend and Risk in Personal Cybersecurity Tool Stacks,” found that 44% of users receive overlapping alerts, and 38% of those receiving overlapping alerts say they ignore them.

That’s why it’s important to use an integrated suite of security tools – a single unified platform. That way, instead of juggling multiple apps competing for your attention and overriding one another, you get a single, intelligent alert stream and a single place to act on it.

B2Cs, Be Aware:  That Popular Web Visitor Tracking Tech You’re Using? It May Be Illegal.

Ian Cohen, CEO and Founder at Lokker, said: “Data Privacy Week 2026 marks a watershed moment: plaintiffs’ attorneys and regulators are no longer asking whether organizations have compliant policies. They’re demanding proof of how data is processed in practice.”

The finalization of California’s Risk Assessment and Cybersecurity Audit regulations under the CCPA (with mandates and penalties in place as of January 1st) foreshadows regulatory trends to come.

Tracking Technologies and Data Privacy

“The popular tracking technologies companies use to personalize visitors’ experiences have emerged as the primary enforcement focal point. Their widespread deployment, reliance on third parties, and tendency to change without notice place them squarely within the definition of high-risk processing.”

Cohen notes that litigation and enforcement measures will put the spotlight on whether organizations can demonstrate visibility into and control of these tracking technologies.

Why this matters:  

  • 78% of sites deploy session replay tools that courts are treating as wiretap violations, and
  • 49.2% of S&P 500 companies include the Meta Pixel despite its status as a frequent litigation target.

Cohen notes: “Risk exists regardless of whether consent banners are present or policies are well-drafted. The convergence of private rights of action, operational regulatory mandates, and California’s expanding pen registry framework, through CIPA enforcement and class action activities, creates an environment in which technical privacy missteps can become costly litigated events overnight if neglected or mismanaged.

“To protect themselves and their customers, organizations need continuous visibility, defensible documentation, and clear remediation capabilities.

“Moving from static representations to operational proof isn’t optional anymore. It’s the foundation of modern privacy compliance.”

Michael Bell, CEO and co-Founder of AI implementation and cybersecurity firm Suzu Labs, confirms the problem.

“For businesses with websites (i.e. virtually every business), privacy compliance is moving from documentation theater to operational proof. The regulatory environment no longer accepts ‘we have a policy’ as sufficient. Regulators and plaintiffs now ask ‘can you prove what actually happens?’” Bell said.

The 92.7% Problem: “Nearly all websites load third-party trackers before user consent is given. That’s not a configuration problem at the margins. That’s an industry-wide failure of the consent model as implemented. The banner exists. The policy exists. The trackers fire anyway,” he warned.

“This is exactly the gap between stated controls and actual controls that creates legal exposure. When plaintiffs’ attorneys or regulators examine what’s technically happening versus what disclosures claim, they find daylight. That daylight becomes litigation. There’s no grace period – the CCPA came into effect January 1.”

UPDATE: I have a pair of additional comments:

Andrew Costis, Manager of the Adversary Research Team at AttackIQ:

“Data has never been more under fire than it is currently. With the introduction of AI into cybercriminal activity, the number of attack surfaces has increased dramatically, as well as the number of exploitable vulnerabilities. If organizations don’t know exactly where their sensitive data lives or how it could be accessed, with or without authorization, they’re flying blind with their security defenses.

“The emulation of adversarial attack tactics and techniques is paramount to the security of an organization’s data. Validating defenses against realistic attack paths protects data proactively by not only determining where the exploitable vulnerabilities lie, but also revealing which security controls actually prevent data exfiltration. Organizations need to take away the pathways to internal systems and data before attackers can find them and exploit them.

“That being said, it’s important not to overlook the basics of cybersecurity hygiene and the backbone they provide for security defenses. Maintaining up-to-date software and applying distributed patches is a key first layer of protection for both individuals and organizations. Additionally, the use of strong, unique passwords and implementation of multi-factor authentication adds multiple layers of defense, making it harder for attackers to steal data, even if a set of credentials is already exposed.”

Ross Filipek, CISO at Corsica Technologies:

“In today’s environment where data is constantly moving between clouds, partners, and internal systems, modern platforms are forced to handle increasingly complex data flows across EDI, ERP, and CRM connections. With this comes greater risk, as with more systems to secure comes more potential attack surfaces, as well as more opportunities for sensitive customer or organizational data to be exposed.

“Organizations need a platform that can offer visibility into data movement to maintain control and accountability over shared data. Prioritization of real-time monitoring and proactive issue resolution can help organizations detect anomalous behavior or unauthorized access before threat actors can fully infiltrate systems. These capabilities can transform a company’s infrastructure into a defensive layer that actively increases and supports data privacy, instead of standing by and watching as attackers march right to the core of a company’s network.”

UPDATE #2: Here’s another comment that just came in from Karl Bagci, Head of Information Security, Exclaimer:

  • “Email is a key target for cyber threats, which makes data privacy an everyday operational issue, not just a security concern. In regulated industries, email governance is one of the clearest signals of data protection maturity. All it takes is one unhinged email to expose risk, no matter how strong the underlying controls, audits, or certifications may be. Data Privacy Day is a reminder for organizations to embed governance into everyday communication, as this is what turns compliance from a best-effort activity into something enforceable, auditable, and sustainable.”
  • “Most data privacy failures don’t start with a breach or a sophisticated cyber-attack. They begin with everyday communication that isn’t governed, where information is shared quickly and repeatedly without consistent controls. If data protection policies don’t hold up in routine email, then those policies exist on paper rather than in practice. Data Privacy Day reminds us to adopt secure practices and protect sensitive information in every communication.”
  • “Data protection isn’t a policy document or a once-a-year compliance exercise. It’s an operational discipline that shows up in every external message an organization sends. The small details, the

Guest Post: Why Your Privacy Fears Keep Feeding the Data Machine

Posted in Commentary with tags on April 8, 2025 by itnerd

Supplied by International Drivers Association

Understanding Privacy Fears

In an era marked by the relentless surge of digital technologies, privacy fears have become a pervasive concern for individuals navigating the digital landscape. These fears are not unfounded; they are grounded in the reality that personal data is commodified and utilized by various online platforms without explicit user consent. The rising tide of privacy concerns stems from a perceived lack of control over personal information. A significant majority of Americans feel they have little to no control over the data collected about them by governments and corporations alike. This uncertainty is compounded by the complexity and opacity of data practices, leaving many in the dark about how their personal information is collected, used, and shared.

The link between privacy fears and trust is particularly noteworthy. Traditionally, privacy concerns are thought to negatively impact trust; however, research has revealed that this relationship is not always straightforward. While privacy fears can indeed erode trust, some studies suggest that the dynamics between the two can vary based on context, such as the technology being used or the novelty of the data-handling processes involved. As such, understanding the intricacies of these relationships is crucial for addressing privacy concerns effectively.

Additionally, the rapid advancement of technology and the resulting “data deluge” have exacerbated privacy fears, presenting risks that threaten to stifle innovation and trigger regulatory backlashes. The inability of consumers to grasp the full extent of data collection practices fuels these fears. For example, many users are unaware of the potential for re-identification of anonymized data, a factor that has profound implications for privacy and trust in digital systems.

Misconceptions further cloud the landscape of privacy fears. Contrary to some beliefs, consumers do care deeply about having control over their private data, as opposed to only fearing data breaches by hackers. This desire for control is often overshadowed by the complexities of modern data ecosystems and the challenge of navigating privacy settings and policies.

Understanding privacy fears requires acknowledging the legitimate concerns individuals have about data security, transparency, and control. As the digital age continues to evolve, addressing these fears with effective privacy measures and clearer communication of data practices becomes paramount. Only then can trust be rebuilt, and privacy fears mitigated, in a world increasingly driven by data.

The Data Machine in Motion

In the ever-evolving digital age, the “data machine” operates with relentless precision, continuously driven by the wealth of information generated every second. As individuals navigate the online world, their actions create data footprints that feed into a larger network of data collection and analysis. This vast ecosystem is sustained by a complex interplay of data mining, consumer profiling, and digital marketing strategies aimed at enhancing user experiences and business outcomes.

At the core of this machine is the concept of data collection, a methodological process critical to a business’s success. Organizations harness both primary and secondary data collection methods to gather insights, leveraging advanced technologies like artificial intelligence (AI) to optimize these processes. This approach not only boosts efficiency but also facilitates real-time decision-making and strategic planning. For instance, AI aids in categorizing survey responses and generating synthetic datasets, driving the speed and quality of data insights.

However, the data machine is not without its challenges. Privacy concerns arise as data mining techniques become more prevalent, creating a need for transparent data practices and user empowerment. Tech companies are increasingly prioritizing user control over personal data, ensuring transparency in data handling, and implementing privacy-by-design principles to build trust with users. This is crucial, given the persistent myths and misconceptions that cloud public understanding of data privacy and security.

Despite the regulatory frameworks in place, such as the Privacy Act of 1974 and HIPAA, which govern how data can be collected and used, the commodification of personal information persists. This underscores the importance of user consent and data minimization to mitigate privacy risks. Policymakers and businesses must balance innovation with privacy protection to prevent a regulatory backlash that could stifle the data economy.

Ultimately, the data machine continues to evolve, fueled by advancements in technology and the insatiable demand for consumer insights. As organizations strive to navigate this complex landscape, they must remain vigilant in protecting user data while simultaneously harnessing the power of information to drive growth and innovation.

The Privacy-Data Cycle

In the digital age, the interplay between privacy concerns and data utilization has created a complex ecosystem where user data powers a multitude of online services, often at the cost of personal privacy. This cyclical relationship, dubbed the Privacy-Data Cycle, highlights the ongoing struggle to balance convenience and control in an increasingly data-driven world.

Data as Currency

Today, many online platforms operate on a model where services are offered “for free,” but with a caveat—users must agree to share their personal data, which in turn fuels targeted advertising that funds these services. This transaction creates a situation where privacy concerns are intrinsically tied to the services that users depend on daily. Despite growing apprehension about data security, this model persists due to the perceived value of the services provided.

Empowering Users with Control

One of the critical components in breaking or at least mitigating the adverse effects of the Privacy-Data Cycle is user empowerment. Enabling individuals to have control over their data is essential for safeguarding online privacy. Through informed consent, users are made aware of how their data will be collected and used, allowing them to make conscious decisions about their online interactions. This control not only enhances privacy but also builds trust between users and service providers.

The Role of Data Privacy Laws

The global nature of the internet poses a challenge to data privacy laws, which vary significantly from country to country. In the United States, for instance, a complex web of federal and state regulations governs the handling of personal data, aiming to protect individuals’ privacy while allowing for data-driven innovation. These laws strive to set boundaries on how data can be collected, processed, and shared, serving as a regulatory framework that can disrupt the Privacy-Data Cycle by ensuring data is handled responsibly.

Myths and Misconceptions

Amidst these dynamics, myths about data privacy continue to circulate, often clouding public understanding. One such misconception is that people prioritize protection against hackers over control of their personal data. In reality, both elements are crucial, and misconceptions can hinder meaningful discussions on how to address privacy concerns effectively.

Towards a Sustainable Model

As privacy concerns persist, the challenge remains to develop a sustainable model that respects individual privacy while supporting the data economy. Efforts to redefine consent mechanisms, enhance data security practices, and strengthen legal frameworks are vital steps in creating a digital ecosystem where privacy fears do not feed the data machine but rather inspire innovations that uphold user autonomy. This transformation is essential for building a future where privacy and data utilization coexist harmoniously.

Implications of the Data Machine

In the modern digital ecosystem, the “data machine” is an omnipresent force, shaping industries and influencing personal lives in ways that are both transformative and, at times, unsettling. As consumers generate unprecedented volumes of data, businesses harness this information to enhance consumer engagement and craft personalized experiences. The insights gleaned from big data analysis enable companies to optimize the customer journey, tailoring offerings to meet individual preferences and behaviors. However, this expansive use of data is not without significant implications.

Balancing Innovation with Privacy

The tension between leveraging data for innovation and protecting individual privacy is a central theme in the data-driven economy. Organizations are tasked with navigating complex regulatory landscapes designed to safeguard consumer data. Legislation like the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) exemplifies efforts to address privacy concerns while maintaining the flow of data essential for business innovation. Companies that manage to achieve this balance can turn their privacy practices into a competitive advantage, differentiating themselves in a marketplace increasingly concerned with data ethics.

The Role of Consent and Ethical Considerations

As data privacy becomes a focal point, traditional models of consent are being scrutinized. Critics argue that simply opting into terms and conditions does not provide genuine protection in a complex data ecosystem. Ethical considerations come into play as businesses must ensure that data usage aligns with consumer expectations and regulatory standards. This involves not only complying with privacy laws but also fostering a culture of transparency and trust with consumers.

Challenges of Anonymization and Data Re-identification

The assumption that anonymized data can protect privacy is being challenged by advances in re-identification science. Studies have shown that even data stripped of personal identifiers can often be linked back to individuals, undermining privacy assurances and complicating compliance efforts. This revelation underscores the need for robust data governance frameworks capable of real-time monitoring and policy enforcement, ensuring that data remains secure and that privacy rights are respected.
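The re-identification problem described above comes down to a simple join: strip the names from a dataset, and the remaining quasi-identifiers (ZIP code, birth date, sex) can still be matched against a public record that does carry names. A toy sketch, with entirely fabricated names and values:

```python
# "Anonymized" records: names removed, quasi-identifiers retained.
anonymized = [
    {"zip": "02138", "birth": "1957-07-10", "sex": "F", "diagnosis": "flu"},
    {"zip": "02139", "birth": "1960-01-02", "sex": "M", "diagnosis": "asthma"},
]

# A public roster (e.g. a voter list) that pairs the same quasi-identifiers
# with real names.
public_roster = [
    {"name": "J. Doe", "zip": "02138", "birth": "1957-07-10", "sex": "F"},
]

QUASI_IDS = ("zip", "birth", "sex")

def reidentify(anon_rows, public_rows):
    """Link rows that share every quasi-identifier, restoring identity."""
    matches = []
    for a in anon_rows:
        for p in public_rows:
            if all(a[k] == p[k] for k in QUASI_IDS):
                matches.append((p["name"], a["diagnosis"]))
    return matches

print(reidentify(anonymized, public_roster))  # [('J. Doe', 'flu')]
```

With only three fields, the combination is often unique to a single person, which is why stripping direct identifiers alone does not make a dataset anonymous.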

Impacts on Individual Rights and Autonomy

The expansive collection and use of personal data affect more than just privacy—they influence fundamental individual rights. Without meaningful protections, there exists a significant power imbalance between individuals and the institutions that collect their data. This imbalance raises concerns about autonomy, as individuals may have limited control over how their personal information is used and shared in the digital realm. The implications of the data machine are multifaceted, requiring a nuanced approach to data management that considers ethical, legal, and societal dimensions. As businesses continue to harness the power of data, the challenge will be to do so in a manner that respects individual privacy and fosters consumer trust.

Breaking the Cycle

In the digital age, the cycle of privacy fears feeding the data machine seems relentless, but it doesn’t have to remain unbroken. Both individuals and enterprises can take strategic steps to regain control over personal data and mitigate the pervasive risks associated with data privacy concerns.

First and foremost, transparency is a cornerstone in rebuilding trust and breaking the cycle of data misuse. By clearly communicating how data is collected, used, and shared, organizations can enhance accountability and empower individuals to make informed choices regarding their personal information. This transparency not only promotes credibility but also fosters an environment where privacy concerns are acknowledged and addressed proactively.

For enterprises, implementing robust data governance frameworks is crucial. This involves documenting data usage meticulously to ensure accountability and transparency, while model cards and data cards track data provenance and context. Such measures are vital in aligning data practices with human-centered outcomes rather than mere compliance.

Furthermore, education plays a pivotal role in disrupting this cycle. By educating employees and the public about data privacy best practices, companies can help safeguard personal information from unauthorized access and breaches. An informed public is better equipped to navigate the complexities of privacy in the digital era, thereby reducing the likelihood of privacy fears escalating into breaches.

Additionally, adapting to evolving privacy regulations is essential. A comprehensive understanding of the patchwork of federal, state, and local privacy laws enables organizations to stay compliant and avoid penalties. This includes adhering to sector-specific privacy laws and acknowledging the implications of global legislative developments, such as the General Data Protection Regulation (GDPR), which is widely regarded as a gold standard in data privacy regulation.

Ultimately, breaking the cycle requires a collaborative effort from both consumers and businesses. As privacy continues to be a contentious issue worldwide, it is incumbent upon all stakeholders to challenge the status quo, innovate on data protection strategies, and prioritize the security and privacy of individual data. By taking these steps, we can begin to dismantle the data machine’s insidious hold on our privacy.

Data Privacy Week Starts On Monday

Posted in Commentary with tags on January 25, 2025 by itnerd

Whether you’re in IT, healthcare, government, or finance, every industry that handles sensitive data or critical systems benefits from protecting its data. We are reminded of this every time a new breach makes the news. Data Privacy Week, which starts next week, helps to further empower everyone to protect our privacy online.

I have a pair of comments on Data Privacy Week from industry experts:

Evan Dornbush, former NSA cybersecurity expert:

“This is a great time for developers and product leads to remember, ‘if you don’t collect it, it can’t find its way into a breach,’ and be mindful of how much information is captured and stored that may be a liability to the business rather than an asset. For end users, in the past few months, we’ve seen clear-text SMS messages and call data records, some dating back as far as seven years, disclosed in telecom hacks. Encrypted options for video, voice and text exist and are now being promoted by professionals and government groups alike.”

Jawahar Sivasankaran, President at Cyware:

“Data Privacy Week is a good opportunity to reflect on how security and privacy go hand-in-hand. Threat intelligence is a critical part of protecting sensitive data – it helps us identify and respond to risks before they turn into tangible threats. A strong security posture is essential for safeguarding privacy, and this week underscores the need to integrate both into your strategy. Protecting data is about more than compliance; it’s about being proactive in identifying and mitigating risks to keep both privacy and security intact.”

The website that I linked to above has a ton of great resources that you can use to take more control of your data. Feel free to check them out.

Ford Wants To Target You With Ads By Listening In On Your Conversations…. WTF?

Posted in Commentary with tags , on September 24, 2024 by itnerd

A few years ago, my wife and I said that we would drive our car into the ground because modern cars seem to want to invade your privacy in so many ways. And according to MalwareBytes Labs, Ford has taken this to the next level. Here’s how:

Car manufacturer Ford Motor Company has filed a patent application for an in-vehicle advertisement presentation system based on information derived from several trip and driver characteristics. Among those characteristics—human conversations. 

In the abstract of the patent application publication Ford writes:

“An example method includes determining vehicle information for a trip, the vehicle information including any one or more of a current vehicle location, a vehicle speed, a drive mode, and/or traffic information, the user information including any one or more of a route prediction, a speed prediction for the trip, and/or a destination, determining user preferences for advertisements from any one or more of audio signals within the vehicle and/or historical user data, selecting a number of the advertisements to present to the user during the trip, and providing the advertisements to the user during the trip through a human-machine interface (HMI) of the vehicle.”

Further on, it details that “the controller may monitor user dialogue to detect when individuals are in a conversation.”

Based on this info, the controller can decrease or increase the number of advertisements. And “the conversations can be parsed for keywords or phrases that may indicate where the occupants are travelling to.”
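The patent doesn't publish an implementation, but the keyword-parsing idea it describes is easy to picture. Here's a deliberately crude, entirely hypothetical sketch of matching in-cabin dialogue against destination cues to pick ad categories (all cue words and ad topics are invented for illustration):

```python
# Hypothetical destination cues mapped to ad categories.
DESTINATION_CUES = {
    "beach": ["sunscreen", "swimwear"],
    "airport": ["luggage", "travel pillows"],
    "mall": ["gift cards", "parking passes"],
}

def ads_from_dialogue(transcript: str) -> list[str]:
    """Scan a dialogue transcript for destination keywords and return
    the ad categories the cues map to."""
    words = transcript.lower().split()
    ads = []
    for cue, ad_topics in DESTINATION_CUES.items():
        if cue in words:
            ads.extend(ad_topics)
    return ads

print(ads_from_dialogue("Let's head to the beach after the airport run"))
```

Even this toy version makes the privacy problem concrete: to match a single keyword, the system has to be listening to, and processing, everything said in the cabin.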

If Ford wanted to incentivize me to not ever consider buying their cars, this would be a great way to do it because I don’t want a third party listening in on my conversations…. Ever. Now to be clear, there’s no evidence that this has been implemented in any car that they sell. But the fact that they came up with this and are filing a patent for it is downright scary.

That’s not the only patent that they’ve filed lately:

Another controversial Ford patent filed in July described technology that would enable vehicles to monitor the speed of nearby cars, photograph them and send the information to police.

So based on that sentence, your car will snitch on other cars to the 5-0 as gangster rappers would say. While I will call the police if I see an impaired driver, or a dangerous driver, I am not at all comfortable with my car doing that by default.

So what does Ford have to say about that?

In a statement to Fortune, the company clarified that filing a patent is a standard practice to explore new ideas and doesn’t necessarily indicate immediate plans to release such a system.

That’s likely true. But the fact that they are even thinking about stuff like this and trying to patent it is just creepy. And while I am picking on Ford in this story, it’s a safe bet that other car companies are doing something similar. So before you sign the lease or finance deal for your next car, perhaps you should read the car’s privacy policy in detail to make sure that the car isn’t doing something that you’re not comfortable with.

Security Researcher Finds That Microsoft Recall Is A Bigger Disaster Than We All Thought

Posted in Commentary with tags , on June 3, 2024 by itnerd

Along with the release of Windows laptops using the Snapdragon X Elite processor, Microsoft released a bunch of new AI features for Windows 11, including something called Microsoft Recall, which literally takes snapshots of everything that you do on the PC. At the time, I said this:

Here’s where things get sketchy. While Recall apparently encrypts everything that it is taking a picture of, Recall with the default settings is taking pictures of everything. So if you do online banking, enter your SIN number online, or do anything else that is sensitive, Recall will likely know about it. Think of the fun a threat actor could have if they somehow managed to pwn the PC and got access to that data. And don’t think that threat actors aren’t thinking about giving that a shot as they know that it’s a potential gold mine of information that they can sell on the dark web. Never mind use against you. Now at this point a threat actor would likely have to have physical access to the device as this info is stored locally. But the one thing that I have learned over the years is that threat actors are creative and crafty individuals. So if there’s another attack vector out there that will allow them to grab this data, they will find it. And exploit it. 

Well, it now seems that this might be worse than previously thought. The Verge has surfaced just how vulnerable Recall actually is:

Despite Microsoft’s promises of a secure and encrypted Recall experience, cybersecurity expert Kevin Beaumont has found that the AI-powered feature has some potential security flaws. Beaumont, who briefly worked at Microsoft in 2020, has been testing out Recall over the past week and discovered that the feature stores data in a database in plain text. That could make it trivial for an attacker to use malware to extract the database and its contents.

“Every few seconds, screenshots are taken. These are automatically OCR’d by Azure AI, running on your device, and written into an SQLite database in the user’s folder,” explains Beaumont in a detailed blog post. “This database file has a record of everything you’ve ever viewed on your PC in plain text.”

Beaumont shared an example of the plain text database on X, scolding Microsoft for telling media outlets that a hacker cannot exfiltrate Recall activity remotely. The database is stored locally on a PC, but it’s accessible from the AppData folder if you’re an admin on a PC. Two Microsoft engineers demonstrated this at Build recently, and Beaumont claims the database is accessible even if you’re not an admin.
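To make concrete how low the bar is once the data sits in an unencrypted SQLite file, here’s a minimal Python sketch. The file name, table, and column below are hypothetical stand-ins for the kind of plain-text records Beaumont describes, not Recall’s actual schema:

```python
import sqlite3

# Hypothetical sketch: the table and column names are illustrative
# stand-ins for the plain-text OCR records Beaumont describes; they are
# NOT Recall's real schema.
def dump_plaintext_records(db_path):
    """Return every stored text capture from an unencrypted SQLite file."""
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute("SELECT captured_text FROM window_capture").fetchall()
    return [text for (text,) in rows]

# Build a toy database to show how little an attacker with file access needs.
with sqlite3.connect("demo_recall.db") as conn:
    conn.execute("CREATE TABLE IF NOT EXISTS window_capture (captured_text TEXT)")
    conn.execute("INSERT INTO window_capture VALUES ('online banking: acct 1234-5678')")

print(dump_plaintext_records("demo_recall.db"))
```

The point being: no decryption, no exploit chain, just a standard library call against a file on disk.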

Well, that’s just incredibly horrible, because now that we know that pwnage is possible, threat actors around the globe will be figuring out how to pwn anyone who is running this feature, even if the technical details are being withheld.

But I am not done yet. It actually gets worse:

Beaumont has exfiltrated his own Recall database and created a website where you can upload a database and instantly search it. “I am deliberately holding back technical details until Microsoft ship the feature as I want to give them time to do something,” he says.

You would think that a company the size of Microsoft would have had a few security researchers try to find vulnerabilities in this feature before even announcing it. But apparently not. It truly sounds to me like Microsoft needs to do a recall of Recall, because it’s simply not something that users can trust to be secure. Thus it’s not ready for primetime.

Qantas Has An EPIC Privacy Breach On Their Hands

Posted in Commentary with tags , on May 1, 2024 by itnerd

This one is bad. Qantas, as in the Australian airline, has one hell of a privacy breach on its hands. The Guardian has the rather bad (if you’re Qantas) details:

Potentially thousands of Qantas customers have had their personal details made public via the airline’s app, with some frequent flyers able to view strangers’ account details and possibly make changes to other users’ bookings.

Qantas said late Wednesday its app had been fixed and was stable, after two separate periods that day “where some customers were shown the flight and booking details of other frequent flyers”.

The airline said this didn’t include displaying financial information, and that users were not able to transfer Qantas points from another account or board flights with their in-app boarding passes.

Clare Gemmell from Sydney said that she and four colleagues encountered the problem shortly after 8.30 on Wednesday morning.

“My colleague logged in and said ‘I think the Qantas app has been hacked because it’s not my account when I log in’.”

When Gemmell logged into the app, she was greeted with a message saying “Hi Ben”. The app told her Ben had more than 250,000 points and an upcoming international flight.

“Another colleague of mine said it looked like she was able to cancel somebody’s flight ticket,” she said.

“You could see boarding passes for other people, one of my colleagues could see a flight going to Melbourne and it looked like you could interact and actually affect the booking.”

Well, that’s one hell of a screw-up that Qantas has apparently now fixed. But it’s still bad. Ted Miracco, CEO of Approov, had this comment:

This incident with the Qantas mobile app is quite concerning from both a cybersecurity and privacy perspective. Many companies fail to implement adequate API security, which can lead to issues like the one potentially faced by Qantas. The security of APIs is critical as they often handle the logic, user authentication, session management, and data processing that apps rely on to function.

The problem described suggests a significant issue with how user sessions and data are being handled within the app. The Application Programming Interface (API) is incorrectly processing or validating session tokens, leading to unauthorized access to data. The exposure of such personal information, including booking details, frequent flyer numbers, and boarding passes, poses serious risks and liability. The data could be used for identity theft, phishing scams, or unauthorized access to further personal information. Such a breach should have significant legal and compliance implications, particularly under data protection regulations like the Australian Privacy Act (APA) or GDPR, if any EU citizens are affected, or other local privacy laws, depending on the nationality of the affected passengers.

The reliance solely on Google and Apple’s app store security measures for safeguarding mobile applications is indeed a common oversight that can lead to significant security challenges, as potentially evidenced by the Qantas incident. The security features provided by these platforms primarily focus on ensuring that apps are free from known malware at the time of upload and meet certain basic security criteria. However, these protections do not extend into the realms of runtime security, business logic, and specific data handling practices which are critical for ensuring application security.
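Miracco’s point about session handling comes down to object-level authorization: the backend must verify that the booking a session asks for actually belongs to that session’s user. Here’s a minimal sketch of that check; every name in it (`SESSIONS`, `BOOKINGS`, `get_booking`) is invented for illustration and is not Qantas’s actual backend:

```python
# Hypothetical sketch of the object-level authorization check that appears
# to have been missing. All names and data here are illustrative.
SESSIONS = {"token-abc": "user-clare"}  # session token -> authenticated user id
BOOKINGS = {
    "bk-1": {"owner": "user-ben", "flight": "QF400"},
    "bk-2": {"owner": "user-clare", "flight": "QF1"},
}

def get_booking(session_token, booking_id):
    """Return a booking only if it belongs to the caller's session."""
    user = SESSIONS.get(session_token)
    if user is None:
        raise PermissionError("invalid or expired session")
    booking = BOOKINGS.get(booking_id)
    if booking is None or booking["owner"] != user:
        # Deny by default: a logged-in caller must never see another
        # user's booking. This is the check whose absence produces a
        # "Hi Ben" mix-up like the one described above.
        raise PermissionError("booking not owned by caller")
    return booking

print(get_booking("token-abc", "bk-2"))  # caller's own booking: allowed
```

The failure mode in the Qantas story is exactly what happens when an API skips the ownership comparison and trusts whatever booking ID the session happens to resolve to.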

Stephen Gates, Security SME, Horizon3.ai adds this:

Most people who utilize mobile apps don’t realize that these apps use APIs to communicate between the app and the app provider’s backend. And APIs are often full of potential vulnerabilities and subsequent risks due to how they are implemented. 

This is the primary reason why the OWASP API Security Project was created resulting in the most recent version: 2023 OWASP API Security Top 10. Being a contributor of the Top 10 2019 version, and spending time with founding leaders of the Security Project, the API risks organizations and consumers face today are quite clear. 

Today’s software (app) developers must not only become familiar with the API Top 10, but also become experts in understanding the intricacies associated with APIs. The API Top 10 provides highly detailed example attack scenarios as well as excellent recommendations on how to prevent such risks from occurring.

Qantas has some explaining to do to a whole lot of people because of this screw-up. I hope they have detailed answers at the ready, because this is one of those situations where people are going to want those answers. And they won’t be satisfied with anything less.

American Privacy Rights Act Unveiled

Posted in Commentary with tags on April 9, 2024 by itnerd

The newly unveiled American Privacy Rights Act (APRA) represents a significant step toward establishing a federal data privacy standard in the U.S., offering a bipartisan solution to longstanding legislative challenges.  This legislative effort underscores a unified approach to enhance online privacy protections, aiming to reconcile differences over state preemptions and legal remedies for privacy breaches.

Antonio Sanchez, principal evangelist at cybersecurity company Fortra says:

“Today, about half of the states have some sort of legislation, but it’s varied. Ideally, this legislation would be a baseline of privacy at the federal level which provides consumers with more control over their personal data.  Each state would then decide on passing something more stringent than the baseline.

This would be a great win for consumers as this would be a big step towards reducing misinformation, disinformation, and AI generated content which are used to sway the public’s mindset on a particular issue.  For big tech this would represent a big hit to their bottom line since big tech monetizes personal data by mining, using, and selling it.  The ones that use it deliver content (real and AI generated) to targeted audiences to either position a product or gain support on a social issue.

I like the idea, but we will see if this continues to move forward or if it slowly fades away and nothing happens.”

This is a piece of legislation that is long overdue. If the people on Capitol Hill are smart, they will do everything possible to move this bill forward and get it passed into law. But given the tenor of politics in the US at the moment, one has to wonder if that will happen.

UPDATE: Madison Horn, Congressional Candidate (OK-5) and cybersecurity expert adds these comments regarding the American Privacy Rights Act:

The American Privacy Rights Act is a significant first-step towards setting up national consumer centric data privacy standards. While the American Privacy Rights Act aims to define the type of data that companies can collect, there is ambiguity and concern in a number of areas that will be left vague. In the typical process for introducing new regulation, there is either over or under calibration, or it is not specific enough. Regulators must define what data is considered necessary, determine how data collection needs should be managed across applications, determine whether data storage will be centralized or segmented, and establish clear limitations on the types of data companies can collect.

I have concerns that regulators will over-calibrate these new data privacy regulations and inadvertently introduce vulnerabilities in company systems, potentially making it easier for bad actors to exploit them. While giving consumers control over their data is a positive step, it’s crucial that identity and access-management are securely designed, otherwise bad-actors could easily steal personal data. Giving consumers the right to access, correct, delete, and export their personal data is a great step forward, but brings significant security concerns. There’s a technical challenge in setting up and managing identities to ensure that people can’t access or edit someone else’s data. Despite the good intentions, opening these doors will inadvertently increase security concerns. The real task lies in minimizing these incidents as much as possible. It’s all achievable, but requires careful planning and execution.

To get this crucial data privacy law right, it’s important that everyone involved – lawmakers, regulators, and the private sector – all meet at the table together. If lawmakers try to force this law through like dictators, there will be endless pushback from lobbyists – something entirely counterproductive to effective regulation – and will only hurt small businesses and innovation. With many of the few qualified individuals in Congress left retiring or being pushed out of office by partisan politics, it’s up to the American people to elect qualified leaders with experience that matches the problems of today. Leaders that understand the nuances and pitfalls of drafting, right sizing and passing acts that adequately protect Americans while not hindering innovation and economic growth. 

Woman Sues Sex Toy Company For Collecting Her Sex Toy Searches…. No I Am Not Making This Up

Posted in Commentary with tags on February 21, 2024 by itnerd

Following on the heels of this story, I have another story about the dark side of sex toys and the Internet. Which, to be clear, isn’t really about sex toys. It’s about your privacy.

404 Media is reporting on a lawsuit where a woman is suing Adam & Eve for collecting details of her searches for sex toys on their site. Brace yourself for the details:

A woman just brought a class action lawsuit against one of the biggest online retailers for sex toys, Adam and Eve, claiming that the site gave Google information about her searches for 8-inch dildos and strap-on harnesses. 

The plaintiff, who isn’t named in the complaint but goes by “Jane Doe,” claims that Adam and Eve uses Google Analytics, which has an anonymization feature that obscures IP addresses of users, but that the site didn’t have that feature enabled. She’s suing PHE, the owner of Adam and Eve, as well as Google, for allegedly disclosing her “sexual preferences, sexual orientation, sexual practices, sexual fetishes, sex toy preferences, lubricant preferences, and search terms” without her consent.

“By using the Google Analytics tool without anonymized IP feature, PHE is sharing with Google Plaintiff’s online activity, along with her IP addresses, even when consumers have not shared (nor have consented to share) such information,” the complaint claims.

Specifically, the plaintiff takes issue with PHE telling Google that she was browsing the site’s categories for “lesbian toys,” women’s sex toys, and realistic dildos. The complaint describes her online shopping trips in detail, claiming that Analytics captured her looking at listings for “Kingcock Strap-on Harness With 8-Inch Dildo” and showed that she added a “Pink Jelly Slim Dildo” to her cart. It also claims that “any information submitted by consumers through the search bar on the site’s homepage is shared with Google,” which in her case was a search for “strap-on dildo.” 

“The above information, combined with the consumer’s IP address, enables Google to identify the person who has interacted with PHE’s Website or has submitted information through the site,” the complaint claims. “Website consumers did not know that the communications between them and PHE would be shared with a third party, Google. PHE did not obtain consent or authorization of Website consumers to disclose communications about their Private and Protected Sexual Information. The surreptitious disclosure of Private and Protected Sexual Information is an outrageous invasion of privacy and would be offensive to a reasonable person.”

She’s suing PHE and Google for violations of the California Invasion of Privacy Act, which prohibits services from communicating information about users to third parties without their consent. Someone doesn’t have to have suffered “actual damages” to bring legal action under CIPA, and can sue for $5,000 per violation.
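For context, the IP anonymization feature the complaint says was left disabled works by truncating the address before it is stored or shared. A rough Python sketch of the idea; Google reportedly zeroes the last octet of IPv4 addresses (and a larger chunk of IPv6), but the exact truncation widths here are this sketch’s assumption, not pulled from Google’s implementation:

```python
import ipaddress

def anonymize_ip(addr):
    """Truncate an IP address before it is logged or sent to analytics.

    Zeroes the host portion so the stored value identifies a network,
    not a person. Widths chosen here (IPv4 /24, IPv6 /48) are this
    sketch's assumption.
    """
    ip = ipaddress.ip_address(addr)
    prefix = 24 if ip.version == 4 else 48
    net = ipaddress.ip_network(f"{addr}/{prefix}", strict=False)
    return str(net.network_address)

print(anonymize_ip("203.0.113.42"))  # -> 203.0.113.0
```

The complaint’s core claim is that because nothing like this was enabled, the full address travelled to Google alongside every search term.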

Now Google is saying that it doesn’t try to identify individuals and has policies to try and stop that from happening, and that it’s really up to the retailer to do the right thing. In other words, Google is using the Shaggy excuse. As in “it wasn’t me.” Adam & Eve didn’t have anything to say to 404 Media. But let’s take a step back and take the words “sex toys” out of this discussion. What this is really about is the fact that ANY retailer can take your shopping habits, collect them, and use or sell them however they see fit. If you’re on Amazon, you might not have an issue with that. But if you are shopping for something more “personal” you might have a problem with that. This really isn’t new, but it highlights the fact that your data is valuable and retailers will want to make money off of it, even if you don’t buy anything from them. That’s something that you might want to keep in mind when you shop online.

NSA Admits To Buying User Browsing Data

Posted in Commentary with tags , on January 29, 2024 by itnerd

The NSA has recently admitted to buying user browsing data. Here’s what Senator Ron Wyden had to say on this:

U.S. Senator Ron Wyden, D-Ore., released documents confirming the National Security Agency buys Americans’ internet records, which can reveal which websites they visit and what apps they use. In response to the revelation, today Wyden called on the administration to ensure intelligence agencies stop buying personal data from Americans that has been obtained illegally by data brokers. A recent FTC order held that data brokers must obtain Americans’ informed consent before selling their data. 

“The U.S. government should not be funding and legitimizing a shady industry whose flagrant violations of Americans’ privacy are not just unethical, but illegal,” Wyden wrote in a letter to Director of National Intelligence (DNI) Avril Haines today. “To that end, I request that you adopt a policy that, going forward, IC elements may only purchase data about Americans that meets the standard for legal data sales established by the FTC.”

John Gunn, CEO of Token, had this comment:

Senator Wyden’s efforts are misguided. Instead of working to hinder the critical work of law enforcement agencies that keep everyone safe, he should focus his efforts on the data aggregators. Data purchased by the NSA, marketers, and others is out there in regular commercial markets for anyone to purchase. Nothing is gained by excluding law enforcement from doing their jobs, and people’s privacy is not any more protected by excluding law enforcement from public markets for information. If some of the data being used is obtained illegally, then stop the illegal collection.

I can see a different view on this issue. I am all for law enforcement having access to the data that they need to fight crime. But there need to be clear limits on how they access that data. It cannot be a free-for-all where the NSA or any law enforcement agency can get anything they want with little or no oversight. I’m open to being convinced otherwise, as this is a complex issue.