PSA: If You Are Using DeepSeek, Dump It ASAP

Last week I brought you a story about DeepSeek having a database that was publicly accessible for a brief period. That was on top of the fact that DeepSeek was under attack, and two reports of successful jailbreaks had popped up. Now there’s news that the iOS version of DeepSeek seriously fails at basic security:
A NowSecure mobile application security and privacy assessment has uncovered multiple security and privacy issues in the DeepSeek iOS mobile app that lead us to urge enterprises to prohibit its usage in their organizations.
And:
Key Risks Identified:
- Unencrypted Data Transmission: The app transmits sensitive data over the internet without encryption, making it vulnerable to interception and manipulation.
- Weak & Hardcoded Encryption Keys: Uses outdated Triple DES encryption, reuses initialization vectors, and hardcodes encryption keys, violating best security practices.
- Insecure Data Storage: Username, password, and encryption keys are stored insecurely, increasing the risk of credential theft.
- Extensive Data Collection & Fingerprinting: The app collects user and device data, which can be used for tracking and de-anonymization.
- Data Sent to China & Governed by PRC Laws: User data is transmitted to servers controlled by ByteDance, raising concerns over government access and compliance risks.
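The weak-crypto findings above (outdated cipher, hardcoded key, reused IV) are worth unpacking, because the danger of IV reuse isn't obvious. The sketch below is not DeepSeek's actual code; it uses a toy hash-counter keystream purely to illustrate the failure mode. When the key is hardcoded and the IV never changes, every message is encrypted with the same keystream, so XORing two intercepted ciphertexts cancels the keystream entirely and hands an eavesdropper the XOR of the two plaintexts:

```python
import hashlib

HARDCODED_KEY = b"supersecret"  # anti-pattern: key baked into the app binary
REUSED_IV = b"\x00" * 8         # anti-pattern: same IV for every message

def keystream(key: bytes, iv: bytes, n: int) -> bytes:
    """Toy hash-in-counter-mode keystream (illustration only, not a real cipher)."""
    out = b""
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + iv + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:n]

def encrypt(pt: bytes, key: bytes = HARDCODED_KEY, iv: bytes = REUSED_IV) -> bytes:
    # Stream-cipher style: ciphertext = plaintext XOR keystream.
    ks = keystream(key, iv, len(pt))
    return bytes(p ^ k for p, k in zip(pt, ks))

m1 = b"transfer $100 to alice"
m2 = b"transfer $900 to carol"
c1, c2 = encrypt(m1), encrypt(m2)

# Because the IV (and thus the keystream) is identical for both messages,
# the keystream cancels out: c1 XOR c2 == m1 XOR m2. An eavesdropper who
# never learns the key still learns exactly where the plaintexts differ.
xor_cts = bytes(a ^ b for a, b in zip(c1, c2))
xor_pts = bytes(a ^ b for a, b in zip(m1, m2))
assert xor_cts == xor_pts
```

This is why unique IVs per message are a hard requirement, and why a hardcoded key is unfixable by the user: anyone who extracts the app binary has it forever.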
Implications for Enterprises & Government Agencies:
- Exposure of sensitive data, including prompt data, intellectual property, strategic plans, and confidential communications.
- Increased risk of surveillance through fingerprinting and data aggregation.
- Regulatory & compliance risks, as data is stored and processed in China under its legal framework.
Recommended Actions:
NowSecure urges enterprises and agencies to:
- Continuously monitor all mobile applications to detect emerging risks.
- Immediately remove the DeepSeek iOS app from managed and BYOD environments.
- Explore alternative AI platforms that prioritize mobile app security and data protection.
This is pretty bad. In fact, it’s horrific. So I am going to say it plainly: if you have the DeepSeek app installed on any device, delete it ASAP. Based on what we see with the iOS version of the app, it’s clearly risky to have on your device. To be clear, there are risks in using any AI: data that you may not want in the public eye might be used for purposes like training the model, or it might be exposed to third parties, as in this example. But this case with DeepSeek is far worse. Hopefully DeepSeek gets investigated to see how deep the rabbit hole of its security issues goes.
This entry was posted on February 7, 2025 at 9:40 am and is filed under Commentary with tags DeepSeek. You can follow any responses to this entry through the RSS 2.0 feed.
You can leave a response, or trackback from your own site.