Privacy In Focus | January

January 31, 2025

by Olena Nechyporuk

We bring you a round-up of articles and updates in the data sphere.


Italy Bans DeepSeek

Two days after asking DeepSeek's Beijing headquarters for information on its data policies, the Italian data protection authority has issued a ban on DeepSeek.

The app is officially blocked in Italy, although some users may be able to access the AI model via various third-party sources.

Why could there be a substantial risk in using DeepSeek?

- The data is most likely stored on servers in China, which could give the Chinese government access to the personal data of many European users.

- DeepSeek is open source and free of charge, which raises the risk that it could be manipulated or censored to give biased information.

The Italian DPA is still waiting on detailed information from Beijing.

Read more

---

AI Myths, Debunked by the ICO

The ICO has published a list of the most pertinent AI Myths:

#1: People have no control over their personal data being used to train AI

People absolutely have a right to object to how their data is being used, and to file a complaint with their data protection authority.

#2: AI developers can use people’s data without being transparent about it

Any company aiming to use people's data to train their AI models MUST be transparent about this.

#3: Data protection doesn’t apply if AI developers did not intend to process people’s personal data

The law applies in all circumstances, and no AI model is exempt from the law - any personal data must be used lawfully.

#4: AI models themselves do not come with data protection risks

Some AI models do contain and store personal data; it has also been shown that some models make it possible to identify individuals from the personal data they hold.

#5: AI development should be exempt from the law

There is no reason for AI to have 'sweeping exemptions' - data protection applies in all instances.

#6: Existing regulation is not fit for cutting-edge tech like AI

Data protection laws enable AI models to be built correctly - with people's privacy incorporated from the beginning. Organisations need to make sure their product is safe and all possible risks are mitigated.

Read more

---

Italian DPA Probes into DeepSeek's Privacy Matters

The Italian Data Protection Authority - the Garante - has sent a request for information to Hangzhou DeepSeek Artificial Intelligence and Beijing DeepSeek Artificial Intelligence, the companies that provide the DeepSeek chatbot service.

The Authority has asked the two companies to confirm what personal data are collected, from which sources, and for which purposes, what the legal basis of processing is, and whether personal information is stored on servers located in China.

The Garante also asked the companies what type of information is used to train the artificial intelligence system and, if personal data is collected through web scraping activities, to clarify how users have been informed about the processing of their data.

DeepSeek must provide the Authority with the requested information within 20 days.

Read more

---

LinkedIn Lawsuit: Training AI Through Private Messages

The lawsuit was filed in a California federal court on behalf of a LinkedIn Premium user. It alleges that in August 2024 LinkedIn introduced a privacy setting that automatically opted users in to a programme allowing third parties to use their personal data to train AI, and that a month later the company concealed its actions by updating its privacy policy to say user information could be disclosed for AI training purposes.

The lawsuit seeks $1,000 (£812) per user for the alleged violations.

LinkedIn has denied the accusations.

Read more

---

ICO Outlines its Goals for 2025

In a letter to the Prime Minister, the Chancellor, and the Secretary of State, ICO Commissioner John Edwards has outlined several key areas of focus for 2025.

The ICO says it will focus on the following:

- Giving businesses regulatory certainty on AI

- Cutting costs for small and medium-sized enterprises (SMEs)

- Unlocking privacy-preserving online advertising

- Making it quicker and easier to transfer data internationally

We look forward to the developments in these areas. Let's watch this space.

Read more

---

Trump Revokes Biden's Executive AI Order

Biden's Executive Order required developers of certain high-risk AI systems to share the results of their safety tests with the U.S. government, under the Defense Production Act. Federal agencies were also directed to set standards for safety testing and to address related risks.

Trump revoked the order on his first day in office, January 20, 2025 - the Republican party position being that Biden's order hindered AI innovation. This is a significant step, and it makes the AI legislative landscape in the US even more fragmented, as different laws flourish in different states. It is not yet clear how AI compliance will be reconciled and standardised across the USA.

Read more

---

CNIL Clarifies 'Consent' in Mobile Applications

CNIL has published an article covering some of its recommendations on mobile apps and the data they collect. It highlights the need for app developers to differentiate between 'consent' and 'permissions'.

Under the GDPR, app developers are not required to ask for consent if the information is vital for providing the service: for example, a navigation app needs access to the user's location. In this case, requesting 'permission' from the mobile user is misleading.

CNIL also clarifies that some app developers need to provide more detail when requesting permissions: a torch app, for instance, does not need access to the mobile contact list. Some messenger apps need to offer the option of requesting access to pre-determined features (such as a single photo album, rather than the whole gallery), with a set time limit.

App developers need to be more specific when they request consent from users, and inform them how, when and why their data will be used.

Read more

---

The Connection Between Competition Law and GDPR: EDPB Weighs In

The EDPB has adopted a position paper on the connection between data protection law and competition law.

In many cases, data protection and competition authorities are required to work together to achieve effective and coordinated enforcement of data protection and competition law. It is therefore important to assess situations where the laws may intersect.

In this position paper, the EDPB explains how data protection and competition law interact. It suggests steps for:

- incorporating market and competition factors into data protection practices

- considering data protection rules in competition assessments

- improving cooperation between regulators.

Read more

---

EDPB Clarifies GDPR Pseudonymisation

The EDPB has entered 2025 by releasing guidelines on the use of pseudonymisation under the GDPR. In its statement, the EDPB clarifies the definition and the substantial advantages of pseudonymisation. Technical measures and safeguards to ensure confidentiality and prevent unauthorised identification of individuals are also discussed in depth.

A few things to remember:

- Pseudonymised data is still personal data and ought to be treated as such - with all appropriate safeguards in place.

- Pseudonymisation can reduce risks and make it easier to use legitimate interests as a legal basis, as long as all other GDPR requirements are met.
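To make the idea concrete, here is a minimal sketch of one common pseudonymisation technique - replacing a direct identifier with a keyed hash. This is an illustrative example, not taken from the EDPB guidelines: the key name, function name, and record are all hypothetical, and a real deployment would need proper key management.

```python
import hashlib
import hmac

# Assumption for this sketch: the secret key is stored separately from the
# pseudonymised dataset. Anyone holding both the key and the data could
# re-identify individuals, which is why the result remains personal data
# under the GDPR.
SECRET_KEY = b"keep-this-key-somewhere-separate"  # hypothetical key


def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a stable, non-reversible pseudonym."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()


record = {"email": "alice@example.com", "purchase": "rail ticket"}
record["email"] = pseudonymise(record["email"])  # same input always maps to the same pseudonym
```

Because the mapping is stable, records about the same person can still be linked for analysis without exposing the underlying identifier - which is what makes pseudonymisation a useful risk-reduction measure rather than full anonymisation.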

Read more

---

UK Launches £1.5 Billion App Store Lawsuit Against Apple

A lawsuit at London's Competition Appeal Tribunal accuses Apple of abusing its dominant market position by charging an 'unfair' 30% commission on App Store purchases. The lawsuit is brought on behalf of around 20 million iPhone and iPad users from the UK.

The claim argues that Apple's framework allows the company to impose restrictive terms on app developers and charge excessive commission, costs that are ultimately borne by consumers.

"Apple is not just dominant ... it holds a 100% monopoly position," lawyer Mark Hoskins said in court filings. Apple, however, maintains the case overlooks the benefits to consumers of the integrated iOS operating system, which prioritises security and privacy.

Read more

---

EU Commission Required to Pay Out Compensation due to Unlawful US Data Transfer

In a ruling released by the General Court of the European Union on January 8, 2025, the EU Commission was found to have unlawfully transferred a data subject's IP address to the US. The data subject tried to access the EU Commission's webpage via a 'Sign in with Facebook' link, which transferred his IP address to the US. He claimed 400 euros in compensation.

The incident occurred in 2022, when the US had neither 'adequate protection' status nor the 'adequate safeguards' required by the GDPR. The General Court notes that "the Commission did not, therefore, comply with the conditions set by EU law for the transfer by an EU institution, body, office or agency of personal data to a third country."

The court has ruled that the Commission did indeed breach the law and is required to pay out the 400 euros demanded by the data subject.

Read more

---

CJEU Issues an Opinion on Collecting Gender Information

The French association Mousse has filed a complaint with CNIL because the French railway requires customers to indicate their title ('Monsieur' or 'Madame' - 'Mr' or 'Ms') when purchasing transport tickets online. Mousse argues that a title is not necessary information for purchasing tickets and that collecting it violates the GDPR principle of data minimisation.

The case was referred to the CJEU, which replied as follows: the railway undertaking could choose to address customers with generic, inclusive expressions that have no correlation with their presumed gender identity. The court further advises that, in order to collect gender information:

- the data controller must state its legitimate interest at the point of collection

- the data collected must be limited to what is strictly necessary

- the risk of discrimination based on gender identity must be minimised.

Read more