Archive for Privacy

Qantas Has An EPIC Privacy Breach On Their Hands

Posted in Commentary on May 1, 2024 by itnerd

This one is bad. Qantas, as in the Australian airline, has one hell of a privacy breach on its hands. The Guardian has the rather bad (if you’re Qantas) details:

Potentially thousands of Qantas customers have had their personal details made public via the airline’s app, with some frequent flyers able to view strangers’ account details and possibly make changes to other users’ bookings.

Qantas said late Wednesday its app had been fixed and was stable, after two separate periods that day “where some customers were shown the flight and booking details of other frequent flyers”.

The airline said this didn’t include displaying financial information, and that users were not able to transfer Qantas points from another account or board flights with their in-app boarding passes.

Clare Gemmell from Sydney said that she and four colleagues encountered the problem shortly after 8.30 on Wednesday morning.

“My colleague logged in and said ‘I think the Qantas app has been hacked because it’s not my account when I log in’.”

When Gemmell logged into the app, she was greeted with a message saying “Hi Ben”. The app told her Ben had more than 250,000 points and an upcoming international flight.

“Another colleague of mine said it looked like she was able to cancel somebody’s flight ticket,” she said.

“You could see boarding passes for other people, one of my colleagues could see a flight going to Melbourne and it looked like you could interact and actually affect the booking.”

Well, that’s one hell of a screw up that Qantas has apparently now fixed. But it’s still bad. Ted Miracco, CEO, Approov had this comment:

This incident with the Qantas mobile app is quite concerning from both a cybersecurity and privacy perspective. Many companies fail to implement adequate API security, which can lead to issues like the one potentially faced by Qantas. The security of APIs is critical as they often handle the logic, user authentication, session management, and data processing that apps rely on to function.

The problem described suggests a significant issue with how user sessions and data are being handled within the app. The Application Programming Interface (API) is incorrectly processing or validating session tokens, leading to unauthorized access to data. The exposure of such personal information, including booking details, frequent flyer numbers, and boarding passes, poses serious risks and liability. The data could be used for identity theft, phishing scams, or unauthorized access to further personal information. Such a breach should have significant legal and compliance implications, particularly under data protection regulations like the Australian Privacy Act (APA) or GDPR, if any EU citizens are affected, or other local privacy laws, depending on the nationality of the affected passengers.

The reliance solely on Google and Apple’s app store security measures for safeguarding mobile applications is indeed a common oversight that can lead to significant security challenges, as potentially evidenced by the Qantas incident. The security features provided by these platforms primarily focus on ensuring that apps are free from known malware at the time of upload and meet certain basic security criteria. However, these protections do not extend into the realms of runtime security, business logic, and specific data handling practices which are critical for ensuring application security.

Stephen Gates, Security SME, Horizon3.ai adds this:

Most people who utilize mobile apps don’t realize that these apps use APIs to communicate between the app and the app provider’s backend. And APIs are often full of potential vulnerabilities and subsequent risks due to how they are implemented. 

This is the primary reason why the OWASP API Security Project was created resulting in the most recent version: 2023 OWASP API Security Top 10. Being a contributor of the Top 10 2019 version, and spending time with founding leaders of the Security Project, the API risks organizations and consumers face today are quite clear. 

Today’s software (app) developers must not only become familiar with the API Top 10, but also become experts in understanding the intricacies associated with APIs. The API Top 10 provides highly detailed example attack scenarios as well as excellent recommendations on how to prevent such risks from occurring.
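To make the API angle a bit more concrete, here’s what the failure mode Miracco and Gates are describing (OWASP calls it API1:2023, Broken Object Level Authorization) looks like when it’s handled correctly. To be clear, this is my own illustrative sketch in TypeScript and Express, not Qantas’s actual code; the route, tokens, and data are all made up for the example:

```typescript
import express from "express";

const app = express();

// Illustrative in-memory data; a real backend would hit a session store and a database.
const sessions: Record<string, string> = { "token-abc": "user-1" };
const bookings: Record<string, { ownerId: string; flight: string }> = {
  "QF123-XYZ": { ownerId: "user-1", flight: "SYD-MEL 09:40" },
  "QF456-PQR": { ownerId: "user-2", flight: "MEL-PER 14:05" },
};

app.get("/api/bookings/:bookingId", (req, res) => {
  // 1. Validate the session token on every request; never trust a cached identity.
  const token = req.header("Authorization")?.replace("Bearer ", "");
  const userId = token ? sessions[token] : undefined;
  if (!userId) return res.status(401).json({ error: "invalid or missing session" });

  const booking = bookings[req.params.bookingId];
  if (!booking) return res.status(404).json({ error: "not found" });

  // 2. Object-level authorization: the booking must belong to the caller.
  //    Skipping this check is the classic "Broken Object Level Authorization" mistake,
  //    and it is how one user ends up looking at another user's booking.
  if (booking.ownerId !== userId) return res.status(403).json({ error: "forbidden" });

  return res.json({ id: req.params.bookingId, flight: booking.flight });
});

app.listen(3000);
```

The point is that the check in step 2 has to happen on the server for every single object a user asks for. An app that only hides other people’s bookings in the user interface, or that trusts whatever account ID the client sends, will fail in exactly the way described above.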

Qantas has some explaining to do to a whole lot of people because of this screw up. I hope they have detailed answers at the ready because this is one of those situations where people are going to want those answers. And they won’t be satisfied with anything less.

American Privacy Rights Act Unveiled

Posted in Commentary on April 9, 2024 by itnerd

The newly unveiled American Privacy Rights Act (APRA) represents a significant step toward establishing a federal data privacy standard in the U.S., offering a bipartisan solution to longstanding legislative challenges.  This legislative effort underscores a unified approach to enhance online privacy protections, aiming to reconcile differences over state preemptions and legal remedies for privacy breaches.

Antonio Sanchez, principal evangelist at cybersecurity company Fortra says:

“Today, about half of the states have some sort of legislation, but it’s varied. Ideally, this legislation would be a baseline of privacy at the federal level which provides consumers with more control over their personal data.  Each state would then decide on passing something more stringent than the baseline.

This would be a great win for consumers as this would be a big step towards reducing misinformation, disinformation, and AI generated content which are used to sway the public’s mindset on a particular issue.  For big tech this would represent a big hit to their bottom line since big tech monetizes personal data by mining, using, and selling it.  The ones that use it deliver content (real and AI generated) to targeted audiences to either position a product or gain support on a social issue.

I like the idea, but we will see if this continues to move forward or if it slowly fades away and nothing happens.”

This is a piece of legislation that is long overdue. If the people on Capitol Hill are smart, they will do everything possible to move this bill forward and get it passed into law. But given the tenor of politics in the US at the moment, one has to wonder if that will happen.

UPDATE: Madison Horn, Congressional Candidate (OK-5) and cybersecurity expert adds these comments regarding the American Privacy Rights Act:

The American Privacy Rights Act is a significant first-step towards setting up national consumer centric data privacy standards. While the American Privacy Rights Act aims to define the type of data that companies can collect, there is ambiguity and concern in a number of areas that will be left vague. In the typical process for introducing new regulation, there is either over or under calibration, or it is not specific enough. Regulators must define what data is considered necessary, determine how data collection needs should be managed across applications, determine whether data storage will be centralized or segmented, and establish clear limitations on the types of data companies can collect.

I have concerns that regulators will over-calibrate these new data privacy regulations and inadvertently introduce vulnerabilities in company systems, potentially making it easier for bad actors to exploit them. While giving consumers control over their data is a positive step, it’s crucial that identity and access-management are securely designed, otherwise bad-actors could easily steal personal data. Giving consumers the right to access, correct, delete, and export their personal data is a great step forward, but brings significant security concerns. There’s a technical challenge in setting up and managing identities to ensure that people can’t access or edit someone else’s data. Despite the good intentions, opening these doors will inadvertently increase security concerns. The real task lies in minimizing these incidents as much as possible. It’s all achievable, but requires careful planning and execution.

To get this crucial data privacy law right, it’s important that everyone involved – lawmakers, regulators, and the private sector – all meet at the table together. If lawmakers try to force this law through like dictators, there will be endless pushback from lobbyists – something entirely counterproductive to effective regulation – and will only hurt small businesses and innovation. With many of the few qualified individuals in Congress left retiring or being pushed out of office by partisan politics, it’s up to the American people to elect qualified leaders with experience that matches the problems of today. Leaders that understand the nuances and pitfalls of drafting, right sizing and passing acts that adequately protect Americans while not hindering innovation and economic growth. 

Woman Sues Sex Toy Company For Collecting Her Sex Toy Searches…. No I Am Not Making This Up

Posted in Commentary on February 21, 2024 by itnerd

Following on the heels of this story, I have another story about the dark side of sex toys and the Internet. Which to be clear isn’t really about sex toys. But it is about your privacy.

404 Media is reporting on a lawsuit where a woman is suing Adam & Eve for collecting details of her searches for sex toys on their site. Brace yourself for the details:

A woman just brought a class action lawsuit against one of the biggest online retailers for sex toys, Adam and Eve, claiming that the site gave Google information about her searches for 8-inch dildos and strap-on harnesses. 

The plaintiff, who isn’t named in the complaint but goes by “Jane Doe,” claims that Adam and Eve uses Google Analytics, which has an anonymization feature that obscures IP addresses of users, but that the site didn’t have that feature enabled. She’s suing PHE, the owner of Adam and Eve, as well as Google, for allegedly disclosing her “sexual preferences, sexual orientation, sexual practices, sexual fetishes, sex toy preferences, lubricant preferences, and search terms” without her consent.

“By using the Google Analytics tool without anonymized IP feature, PHE is sharing with Google Plaintiff’s online activity, along with her IP addresses, even when consumers have not shared (nor have consented to share) such information,” the complaint claims.

Specifically, the plaintiff takes issue with PHE telling Google that she was browsing the site’s categories for “lesbian toys,” women’s sex toys, and realistic dildos. The complaint describes her online shopping trips in detail, claiming that Analytics captured her looking at listings for “Kingcock Strap-on Harness With 8-Inch Dildo” and showed that she added a “Pink Jelly Slim Dildo” to her cart. It also claims that “any information submitted by consumers through the search bar on the site’s homepage is shared with Google,” which in her case was a search for “strap-on dildo.” 

“The above information, combined with the consumer’s IP address, enables Google to identify the person who has interacted with PHE’s Website or has submitted information through the site,” the complaint claims. “Website consumers did not know that the communications between them and PHE would be shared with a third party, Google. PHE did not obtain consent or authorization of Website consumers to disclose communications about their Private and Protected Sexual Information. The surreptitious disclosure of Private and Protected Sexual Information is an outrageous invasion of privacy and would be offensive to a reasonable person.”

She’s suing PHE and Google for violations of the California Invasion of Privacy Act, which prohibits services from communicating information about users to third parties without their consent. Someone doesn’t have to have suffered “actual damages” to bring legal action under CIPA, and can sue for $5,000 per violation.
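As an aside, the “anonymization feature” the complaint refers to is, at least in the old Universal Analytics version of Google Analytics, a single setting in the site’s gtag.js configuration. Here’s a rough sketch of what turning it on looks like; the property ID is a placeholder and this is just my illustration, not anything from Adam & Eve’s actual site (sites that have moved to GA4 handle IP addresses differently):

```typescript
// Sketch of a Universal Analytics gtag.js config with IP anonymization switched on.
// "UA-XXXXXXX-1" is a placeholder property ID. The gtag function itself is provided
// by the Google Analytics loader snippet that the page includes separately.
declare function gtag(...args: unknown[]): void;

gtag("js", new Date());
gtag("config", "UA-XXXXXXX-1", {
  anonymize_ip: true, // truncates the visitor's IP address before Google stores or processes it
});
```

The complaint’s allegation, in effect, is that this flag was left at its default of false, so full IP addresses went to Google along with the browsing and search data.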

Now Google is saying that it doesn’t try to identify individuals and has policies to try and stop that from happening. And it’s really up to the retailer to do the right thing. In other words, Google is using the Shaggy excuse. As in “it wasn’t me.” Adam & Eve didn’t have anything to say to 404 Media. But let’s just take a step back and take the words “sex toys” out of this discussion. What this is really about is the fact that ANY retailer can take your shopping habits, collect that up, and use it or sell it however they see fit. If you’re on Amazon, you might not have an issue with that. But if you are shopping for something more “personal” you might have a problem with that. This really isn’t new. But it highlights the fact that your data is valuable and retailers will want to make money off of it, even if you don’t buy anything from them. That’s something that you might want to keep in mind if you shop online.

NSA Admits To Buying User Browsing Data

Posted in Commentary on January 29, 2024 by itnerd

The NSA has recently admitted to buying user browsing data. Here’s what Senator Ron Wyden had to say on this:

U.S. Senator Ron Wyden, D-Ore., released documents confirming the National Security Agency buys Americans’ internet records, which can reveal which websites they visit and what apps they use. In response to the revelation, today Wyden called on the administration to ensure intelligence agencies stop buying personal data from Americans that has been obtained illegally by data brokers. A recent FTC order held that data brokers must obtain Americans’ informed consent before selling their data. 

“The U.S. government should not be funding and legitimizing a shady industry whose flagrant violations of Americans’ privacy are not just unethical, but illegal,” Wyden wrote in a letter to Director of National Intelligence (DNI) Avril Haines today. “To that end, I request that you adopt a policy that, going forward, IC elements may only purchase data about Americans that meets the standard for legal data sales established by the FTC.”

 John Gunn, CEO, Token had this comment:

Senator Wyden’s efforts are misguided. Instead of working to hinder the critical work of law enforcement agencies that keep everyone safe, he should focus his efforts on the data aggregators. Data purchased by the NSA, marketers, and others is out there in regular commercial markets for anyone to purchase. Nothing is gained by excluding law enforcement from doing their jobs, and people’s privacy is not any more protected by excluding law enforcement from public markets for information. If some of the data being used is obtained illegally, then stop the illegal collection.

I can see a different view on this issue. I am all for law enforcement having access to the data that they need to fight crime. But there needs to be clear limits on how they access that data. It cannot be a free for all where the NSA or any law enforcement agency can get anything that they want with little or no oversight. I’m free to be convinced otherwise as this is a complex issue.

A Marketing Company Claims That It Can Listen In On Your Conversation Through Your Devices

Posted in Commentary on December 15, 2023 by itnerd

To be clear, I am not the least bit surprised that this could be possible. Though part of me is still a bit stunned at this story, as it is an insane privacy breach if this is true. And what I am talking about is a company called Cox Media Group, which claims that it can eavesdrop on your conversations through microphones in smartphones, TVs, and smart speakers. This comes via 404 Media:

A marketing team within media giant Cox Media Group (CMG) claims it has the capability to listen to ambient conversations of consumers through embedded microphones in smartphones, smart TVs, and other devices to gather data and use it to target ads, according to a review of CMG marketing materials by 404 Media and details from a pitch given to an outside marketing professional. Called “Active Listening,” CMG claims the capability can identify potential customers “based on casual conversations in real time.”

The news signals that what a huge swath of the public has believed for years—that smartphones are listening to people in order to deliver ads—may finally be a reality in certain situations. Until now, there was no evidence that such a capability actually existed, but its myth permeated due to how sophisticated other ad tracking methods have become.

It is not immediately clear if the capability CMG is advertising and claims works is being used on devices in the market today, but the company notes it is “a marketing technique fit for the future. Available today.” 404 Media also found a representative of the company on LinkedIn explicitly asking interested parties to contact them about the product. One marketing professional pitched by CMG on the tech said a CMG representative explained the prices of the service to them. 

“What would it mean for your business if you could target potential clients who are actively discussing their need for your services in their day-to-day conversations? No, it’s not a Black Mirror episode—it’s Voice Data, and CMG has the capabilities to use it to your business advantage,” CMG’s website reads.

And:

With Active Listening, CMG claims to be able to “target your advertising to the EXACT people you are looking for,” according to its website. The goal is to target potential clients or customers based on what they say in “their day to day conversations,” the website adds.

Reading this story sent chills down my spine. Now in my case, my household is part of Team Apple. Which means the four HomePod minis as well as the three iPhones along with the two Apple Watches that my wife and I collectively own are covered by this policy, which fully lays out what data Apple collects and why, along with what data they may keep and where that data goes. That made me feel a bit better. The only other device in my home that has any form of voice interface is this TCL TV which is powered by the Roku operating system. They have this privacy policy which states the following:

If you link your Roku Device to a non-Roku voice-enabled virtual assistant (e.g., Alexa and Google Assistant), you are choosing to have us disclose device data to the voice assistant provider, such as device type and name, device identifiers, its state (e.g., whether the device is powered on, whether the device is playing video), names of installed streaming service apps, and the names of your device HDMI ports. If you direct such virtual assistant to display content, we will also disclose the content to such voice provider to carry out your request.  For information about how these providers use this data, please review their privacy policies. 

I have the TV linked to Apple HomeKit. Which means that it is covered by Apple’s privacy policy. And I do have a Roku voice remote that requires me to press a button to do anything. So it’s not actively listening in on anything I am doing. So I am fine there as well. Now why am I telling you all of this? Well, depending on what smart devices you have in your home, you might be fine, or you might have an issue:

CMG lists a number of other companies as its partners and publishers. These include Amazon, Microsoft, and Google. None of those companies responded to a request for comment on whether they were aware of this capability or whether it was in active use.

I would assume that if you have any Google, Amazon or Microsoft devices, then you likely have a problem. I say that because the first two are exactly the type of companies that would do anything to gather as much data on you as possible and monetize it in any way they can. The jury is still out on Microsoft. But let us assume that they are in the Google or Amazon camp for now until they prove themselves to be different.

I will be interested to see how CMG and their clients respond to this story now that this is out there. Because I think it is safe to say that this story is going to get a lot of attention. Including from regulators which I am sure that CMG and their clients do not want. Thus if they don’t respond to this with some talking points to try and defuse this, they may have bigger problems on their hands.

Governments Spy On Users Using Push Notifications

Posted in Commentary on December 7, 2023 by itnerd

From the “I didn’t see this one coming” department comes the revelation that governments have been using push notifications to spy on people for some time. This came to light when Oregon Senator Ron Wyden wrote a letter to the Department of Justice on December 6 asking it to lift restrictions on informing the public of this practice:

Because Apple and Google deliver push notification data, they can be secretly compelled by governments to hand over this information

So why should you care? A government could force Apple or Google to hand over data related to push notifications to show how you interact with your phone and the apps on it, as well as give them access to a notification’s complete text and disclose some unencrypted content. All of which is bad of course.

Apple said in a statement published by Reuters the following:

Now that this method has become public, we are updating our transparency reporting to detail these kinds of requests.

True to its word, Apple has now updated its Legal Process Guidelines document to reflect this new reality. Google, for its part, said this:

Google said that it shared Wyden’s “commitment to keeping users informed about these requests.”

But beyond that, I haven’t seen Google update anything. And the thing is that beyond the US, which clearly has been using push notifications to spy on people, it isn’t clear who else is doing it. And it is likely that we won’t get a straight answer on that. Thus it might be wise for Apple and Google to rework how push notifications work so that this sort of spying isn’t a possibility.
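One way that rework could go, at least conceptually, is for apps to stop putting anything sensitive in the push payload at all and instead push only an encrypted blob (or a bare “wake up” ping) that the app decrypts locally. That way Apple and Google only ever relay ciphertext, and there is nothing readable for a government to compel out of them. Here’s a rough sketch of the idea in TypeScript using Node’s built-in crypto; sendPushViaProvider is a stand-in I made up for whatever APNs or FCM client an app would actually use:

```typescript
import { randomBytes, createCipheriv, createDecipheriv } from "crypto";

// Stand-in for a real APNs/FCM call; here it just shows what the push service
// (and anyone who compels it) would actually see: opaque ciphertext only.
function sendPushViaProvider(deviceToken: string, payload: object): void {
  console.log(`push to ${deviceToken}:`, JSON.stringify(payload));
}

// A per-device symmetric key agreed when the app registered (e.g. via a key exchange).
const deviceKey = randomBytes(32);

// Server side: encrypt the message before handing it to the push provider.
function sendEncryptedNotification(deviceToken: string, message: string): void {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", deviceKey, iv);
  const ciphertext = Buffer.concat([cipher.update(message, "utf8"), cipher.final()]);
  sendPushViaProvider(deviceToken, {
    ct: ciphertext.toString("base64"),
    iv: iv.toString("base64"),
    tag: cipher.getAuthTag().toString("base64"),
  });
}

// Device side: the app decrypts locally and renders the notification itself.
function decryptNotification(payload: { ct: string; iv: string; tag: string }): string {
  const decipher = createDecipheriv("aes-256-gcm", deviceKey, Buffer.from(payload.iv, "base64"));
  decipher.setAuthTag(Buffer.from(payload.tag, "base64"));
  return Buffer.concat([
    decipher.update(Buffer.from(payload.ct, "base64")),
    decipher.final(),
  ]).toString("utf8");
}

sendEncryptedNotification("device-token-123", "Your package has shipped");
```

Some messaging apps already work roughly this way. But it only helps if the app developer does it; the push services themselves can’t retrofit it for everyone.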

Google Is Now Tracking Your Every Move Online To Make Money

Posted in Commentary on September 10, 2023 by itnerd

Earlier this week, I posted this story about Google’s new Privacy Sandbox feature. But there’s a dark side to this announcement that Ars Technica is highlighting:

Don’t let Chrome’s big redesign distract you from the fact that Chrome’s invasive new ad platform, ridiculously branded the “Privacy Sandbox,” is also getting a widespread rollout in Chrome today. If you haven’t been following this, this feature will track the web pages you visit and generate a list of advertising topics that it will share with web pages whenever they ask, and it’s built directly into the Chrome browser. It’s been in the news previously as “FLoC” and then the “Topics API,” and despite widespread opposition from just about every non-advertiser in the world, Google owns Chrome and is one of the world’s biggest advertising companies, so this is being railroaded into the production builds.

Google seemingly knows this won’t be popular. Unlike the glitzy front-page Google blog post that the redesign got, the big ad platform launch announcement is tucked away on the privacysandbox.com page. The blog post says the ad platform is hitting “general availability” today, meaning it has rolled out to most Chrome users. This has been a long time coming, with the APIs rolling out about a month ago and a million incremental steps in the beta and dev builds, but now the deed is finally done.
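To give you a sense of what “share with web pages whenever they ask” means in practice, this is roughly what a page asking Chrome for your advertising topics looks like. This is a sketch based on my reading of the Topics API, not production ad code, and the exact shape of the returned objects may differ; the call only works in Chrome with the Privacy Sandbox enabled:

```typescript
// Sketch: a page asking Chrome's Topics API for the visitor's inferred ad topics.
// document.browsingTopics() is Chrome-only; the cast below just keeps TypeScript happy
// because the API isn't part of the standard DOM type definitions.
async function fetchAdTopics(): Promise<void> {
  const doc = document as Document & {
    browsingTopics?: () => Promise<Array<{ topic: number; taxonomyVersion: string }>>;
  };

  if (!doc.browsingTopics) {
    console.log("Topics API not available in this browser");
    return;
  }

  // Returns a handful of topic IDs inferred from your recent browsing history,
  // which is exactly the data the page can then pass along to its ad partners.
  const topics = await doc.browsingTopics();
  console.log("Topics this page can see:", topics);
}

fetchAdTopics();
```

The topics come from Chrome watching the sites you visit, which is the part that has everyone outside the ad industry up in arms.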

Well, I don’t use Google Chrome as my main web browser. But this is a few steps too far. And not only won’t I be using Chrome on any of my computers, but I will encourage others not to use Chrome as well. The other thing that this does is make my trust level with Google as a company drop to zero.

If you’re looking for alternatives, Firefox and Safari on the Mac would be my choices. Neither of those browsers has shown the blatant disregard for its user base that Google Chrome has.

Wyze Seems To Have A Privacy Issue Related To Their Cameras

Posted in Commentary on September 9, 2023 by itnerd

A reader tipped me off to this Reddit thread where Wyze has had some sort of issue that broadcast private camera streams randomly to others. That’s one hell of a privacy issue. But not the company’s first one. I wrote about another privacy issue with Wyze back in 2019. Thus I am not shocked by this. The Verge confirmed on Friday that this was happening, along with additional Reddit threads illustrating that this issue was widely seen by users, and they also report the following:

After we published this story, Wyze spokesperson Dave Crosby shared a statement explaining what happened. Although Crosby says the issue is resolved and that view.wyze.com is “back up and running,” the status page still says view.wyze.com is under maintenance as of Saturday morning. (Crosby says the company will update the status page “shortly.”)

Here is Crosby’s statement:

This was a web caching issue and is now resolved. For about 30 minutes this afternoon, a small number of users who used a web browser to log in to their camera on view.wyze.com may have seen cameras of other users who also may have logged in through view.wyze.com during that time frame. The issue DID NOT affect the Wyze app or users that did not log in to view.wyze.com during that time period.

Once we identified the issue we shut down view.wyze.com for about an hour to investigate and fix the issue.

This experience does not reflect our commitment to users or the investments we’ve made over the last few years to enhance security. We are continuing to investigate this issue and will make efforts to ensure it doesn’t happen again. We’re also working to identify affected users.
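For a bit of context on what a “web caching issue” like that usually comes down to: if a CDN or shared cache sitting in front of a site is allowed to cache responses that were generated for a logged-in user, it can end up replaying one person’s page to the next person who requests the same URL. The standard guard is to mark anything authenticated as uncacheable. Here’s a quick illustrative sketch in TypeScript and Express; this is obviously not Wyze’s actual stack, just the general idea:

```typescript
import express from "express";

const app = express();

// Middleware: anything served to a logged-in user must never land in a shared cache.
// If a CDN caches a personalised response, the next visitor to the same URL can be
// handed someone else's page, which is the classic shared-caching failure mode.
app.use((req, res, next) => {
  const isAuthenticated = Boolean(req.header("Cookie")); // simplistic check, for the sketch only
  if (isAuthenticated) {
    res.set("Cache-Control", "private, no-store");
  } else {
    res.set("Cache-Control", "public, max-age=60"); // anonymous pages can be cached briefly
  }
  next();
});

app.get("/view/cameras", (_req, res) => {
  // In a real app this would render the logged-in user's own camera list.
  res.json({ cameras: ["front-door", "garage"] });
});

app.listen(3000);
```

Whether that is exactly what went wrong at Wyze, only they know. But “a small number of users saw other users’ cameras for about 30 minutes” is very much what it looks like when per-user pages get cached and served to the wrong people.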

That’s nice. But again, I’ll point out that this is not the first time that Wyze has run into a privacy issue. Besides what I mentioned above, there was this:

In March 2022, Wyze revealed that it had been aware of a security vulnerability for three years that could have let bad actors access WyzeCam v1 cameras, but quietly discontinued the camera rather than telling customers about it.

My take home message is that nobody should buy Wyze cameras. They may be cheap on Amazon. But they’re clearly insecure and the company cannot be trusted.

Cars Are Rolling Privacy Nightmares Says Mozilla As They Collect All Your Data… Including Data About Your Sex Life

Posted in Commentary on September 7, 2023 by itnerd

Internet-connected cars are all the rage at the moment. And I for one will not be buying one, and I will be hanging on to my Internet-disconnected car for as long as I can do so. The reason is that, according to a study done by Mozilla, cars collect all sorts of data about you and send it back to the manufacturer. And the kind of data that is collected is shocking:

We reviewed 25 car brands in our research and we handed out 25 “dings” for how those companies collect and use data and personal information. That’s right: every car brand we looked at collects more personal data than necessary and uses that information for a reason other than to operate your vehicle and manage their relationship with you. For context, 63% of the mental health apps (another product category that stinks at privacy) we reviewed this year received this “ding.”

And car companies have so many more data-collecting opportunities than other products and apps we use — more than even smart devices in our homes or the cell phones we take wherever we go. They can collect personal information from how you interact with your car, the connected services you use in your car, the car’s app (which provides a gateway to information on your phone), and can gather even more information about you from third party sources like Sirius XM or Google Maps. It’s a mess. The ways that car companies collect and share your data are so vast and complicated that we wrote an entire piece on how that works. The gist is: they can collect super intimate information about you — from your medical information, your genetic information, to your “sex life” (seriously), to how fast you drive, where you drive, and what songs you play in your car — in huge quantities. They then use it to invent more data about you through “inferences” about things like your intelligence, abilities, and interests.

The car companies then sell this data, as it’s a revenue source for them. And opting out of this data collection isn’t an option for the most part. Consent is an illusion as simply stepping into a car with this sort of tech qualifies as consent. And finally, all car companies do this.

This to me is not cool, and I hope that consumers file complaints with the relevant government agencies (in Canada, that’s the Privacy Commissioner) so that all of these car companies are forced to explain why they do this, which may make them reconsider whether they should be doing this at all.

Teamsters Accuse CN Rail Of Secretly Tracking Their Employees’ Movements Via Company-Issued Tablets

Posted in Commentary on August 24, 2023 by itnerd

This is one of those topics that I always thought would come up more often. CTV News is reporting that the Teamsters union is accusing CN Rail of tracking employees’ movements, even after hours, via the tablets that CN Rail issues to its employees, and of not disclosing that it was doing so:

The Teamsters Canada Rail Conference, which is the union that represents 5,500 Canadian National railway employees, alleges CN has been monitoring the whereabouts of a train operator outside of work hours through a company-issued tablet.

“It’s spying, it’s wrong and it’s illegal in our view” according to Teamsters Canada’s director of public affairs Christopher Monette, who adds “on top of it being creepy, it’s downright dystopian. It’s something that shouldn’t be happening.” 

The union says they have reason to be concerned that a large number of CN Rail employees may have also had their location tracked by the company during their own personal time after work. Speaking to CTV National News, Monette says that CN “didn’t tell us this was going on and they didn’t seek consent from workers to use geolocation data” from their company issued devices and believes CN was trying to keep their tracking methods secret.

“We only found out about this by accident, through a disclosure process where the company was forced to disclose why they were disciplining a worker,” according to Monette.

Now CN Rail doesn’t want to comment on this. But frankly I am not surprised. Tablets and phones issued by companies are often what are called “managed” devices. Meaning that the devices are enrolled in a type of software called Mobile Device Management software, or MDM for short. This software allows a company to do a number of things: get the status of the device, push out software updates, remote control the device for troubleshooting purposes, and, most relevant to this story, track the device. Now a company may only decide to use this software to track a device if it is stolen. But I can see a scenario where a company may use this software to track a device at all times. If they disclose that up front, I guess that’s fine. But if they don’t, you get this situation.
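To make the tracking part concrete: from the admin side, an MDM console typically exposes device location (among other things) through a web dashboard or an API. The sketch below is purely hypothetical; the endpoint, fields, and token are invented for illustration, and real products such as Jamf, Intune or Workspace ONE each have their own APIs that look different:

```typescript
// Hypothetical sketch of an employer querying its MDM platform for a device's last
// known location. Everything here (URL, fields, token) is made up for illustration.
async function lastKnownLocation(deviceId: string): Promise<void> {
  const response = await fetch(`https://mdm.example.com/api/devices/${deviceId}/location`, {
    headers: { Authorization: "Bearer ADMIN_API_TOKEN" }, // placeholder credential
  });
  const data = (await response.json()) as {
    latitude: number;
    longitude: number;
    recordedAt: string;
  };

  // The point: the employer's console can ask for this whenever the device is online,
  // whether or not the employee happens to be on the clock.
  console.log(`Device ${deviceId} was at ${data.latitude},${data.longitude} at ${data.recordedAt}`);
}

lastKnownLocation("tablet-0042");
```

Which is why disclosure and consent matter so much here: the capability itself is standard, and it’s the silent use of it after hours that the union is objecting to.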

Now if you have a company issued device and are afraid of being tracked, there are very low tech solutions to this:

Cyber security analyst and lawyer Ritesh Kotak believes employees who have a work phone, tablet or laptop should try and purchase their own personal devices to use off work hours.

“These high-tech problems have really low-tech solutions,” Kotak says.

He also says that he uses a tab to cover the camera on his work computer when he’s not on a video call. Kotak adds that, if possible, employees should turn their work devices onto airplane mode off work hours.

“It’s important to understand that information (from your devices) is being collected on a continuous basis by the employer, it’s probably being stored and there maybe third parties who have access to it.”

One thing to consider is that if you go this route, your company may complain at some point because the device isn’t on all the time. Another thing to consider is if you “BYOD” or bring your own device, and the company puts their MDM software on it, you could be in the same situation. So you may want to keep that in mind as well.

The bottom line is that if you use company property, or simply have their software installed on your own smartphone or computer, you should have no expectation of privacy. Ever. Unfortunate, but true.