Issue 01 — March 2026
The European magazine on private AI

Guide

GDPR and AI: how to use artificial intelligence while respecting privacy

Practical guide to GDPR and artificial intelligence for European businesses. How to use ChatGPT and other AI tools without violating privacy regulations.

GDPR and AI: a necessary coexistence

Generative artificial intelligence offers extraordinary opportunities for European businesses, but GDPR sets precise constraints on how personal data can be processed. It’s not about choosing between AI and privacy: it’s about using AI the right way.

The ChatGPT case: what happened in Italy

In March 2023, the Italian Data Protection Authority temporarily blocked ChatGPT in Italy, the first such measure worldwide, citing four issues:

  • No legal basis for data collection and processing
  • Lack of information to users about data processing
  • No age verification to protect minors
  • Data inaccuracy: the model generates false information about people

In 2024, OpenAI was fined 15 million euros by the Italian DPA. This set an important precedent for all European businesses using AI services.

Concrete risks for businesses

When an employee uses ChatGPT with company data:

Risk                                             Consequence
Personal data in prompts                         GDPR violation, possible fine
Health data                                      Special category data breach (Art. 9 GDPR)
Trade secrets                                    Loss of intellectual property
No data processing agreement (DPA) with OpenAI   Direct company liability
Data transferred to the USA                      Non-compliant extra-EU transfer

How to make AI GDPR-compliant

1. On-premise solutions

The safest solution: AI runs on company servers. No data leaves.

ORCA by HT-X is designed exactly for this: a complete AI platform that runs entirely within the company infrastructure.

2. Data minimisation

If using cloud services, apply the minimisation principle:

  • Anonymise data before sending
  • Don’t include names, tax codes, identifying references
  • Use pseudonymisation where possible
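As an illustration, the first two steps can be sketched as a simple regex-based redaction pass. This is a hypothetical helper, not a complete anonymiser: pattern matching alone misses personal names and free-text identifiers, so real deployments would pair it with NER-based anonymisation tools.

```python
import re

# Hypothetical pseudonymisation sketch: replaces e-mail addresses and
# Italian tax codes (codici fiscali) with neutral placeholders before
# a prompt is sent to a cloud AI service.
EMAIL_RE = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")
# Codice fiscale: 6 letters, 2 digits, 1 letter, 2 digits,
# 1 letter, 3 digits, 1 letter (16 characters total)
CF_RE = re.compile(r"\b[A-Z]{6}\d{2}[A-Z]\d{2}[A-Z]\d{3}[A-Z]\b")

def pseudonymise(prompt: str) -> str:
    """Replace direct identifiers with placeholders."""
    prompt = EMAIL_RE.sub("[EMAIL]", prompt)
    prompt = CF_RE.sub("[TAX_CODE]", prompt)
    return prompt

text = "Contact Mario Rossi at mario.rossi@example.com, CF RSSMRA80A01H501U."
print(pseudonymise(text))
```

Note that the name "Mario Rossi" would survive this pass unchanged, which is exactly why regex filters should be treated as one layer of minimisation, not the whole strategy.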

3. DPIA (Data Protection Impact Assessment)

Conduct a DPIA before implementing any AI system that processes personal data:

  • Describe the processing and its purposes
  • Assess necessity and proportionality
  • Identify risks to data subjects
  • Define mitigation measures

4. Records of processing

Update your records of processing (Art. 30 GDPR) to include:

  • AI systems used
  • Categories of data processed
  • Legal bases applied
  • Transfers to third countries (if any)
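For illustration, one entry in such a record could be kept as structured data alongside the register itself. The field names below are our own sketch, not a prescribed schema: Art. 30 GDPR specifies what must be documented, not how it is stored.

```python
# Hypothetical sketch of a single processing-record entry covering
# an AI system, mirroring the four items listed above.
record_entry = {
    "system": "ORCA (on-premise AI platform)",
    "purposes": ["internal document analysis", "decision support"],
    "data_categories": ["employee contact data", "contract documents"],
    "legal_basis": "Art. 6(1)(f) GDPR (legitimate interest)",
    "third_country_transfer": None,  # processing stays on-premise
    "retention_period": "24 months",
}

for field, value in record_entry.items():
    print(f"{field}: {value}")
```

Keeping the register machine-readable makes it easy to export on request and to diff when an AI system or its legal basis changes.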

GDPR and AI Act: the dual track

From 2026, businesses must comply with both GDPR and the AI Act. The two regulations complement each other:

  • GDPR: protects personal data
  • AI Act: regulates AI system operation

An on-premise platform like ORCA addresses both requirements in a single solution.

Focus on your business, not on red tape

GDPR, AI Act, DPIA, processing records, extra-EU transfers: for an SME, the risk is spending more time managing compliance than growing the business. But it doesn’t have to be that way.

Compliance should be a feature of the solution, not a burden on the business owner. If you choose an AI platform that keeps data on-premise, uses open-source models and guarantees traceability by architecture, most regulatory obligations are resolved at the root — no dedicated consultants, no extraordinary audits, no sleepless nights.

ORCA is exactly that: a solution that lets you use AI for what matters — improving processes, analysing documents, supporting decisions — with full GDPR and AI Act compliance built in. Not one more headache, but one less.

The competitive advantage of privacy

GDPR compliance is not just an obligation: it’s a competitive advantage. Companies that demonstrate they protect their customers’ and employees’ data build trust and differentiate themselves. Private AI is the future for responsible European businesses.

Frequently asked questions

Does using ChatGPT violate GDPR?

Public ChatGPT can violate GDPR if used with personal or sensitive data. Data is sent to OpenAI’s servers in the USA without adequate safeguards for extra-EU transfers. The Italian Data Protection Authority fined OpenAI 15 million euros. On-premise solutions like ORCA eliminate this risk.

How can a business use AI in a GDPR-compliant way?

The main options are: 1) use on-premise AI platforms like ORCA that keep data in-house; 2) choose cloud providers with EU data residency and a compliant data processing agreement; 3) anonymise data before sending it to cloud services; 4) never use personal data with public ChatGPT.

Is a DPIA required when using AI?

Yes, in most cases. GDPR requires a Data Protection Impact Assessment when processing may result in a high risk to data subjects’ rights. Using AI to process personal data almost always falls into this category.

What data should never be sent to ChatGPT?

You should never send to public ChatGPT: patient health data, client financial data, employee data (salaries, evaluations), proprietary source code, confidential legal documents, and any data that identifies natural persons.

Is ORCA GDPR compliant?

Yes. ORCA is GDPR compliant by design: data stays entirely on-premise within the company infrastructure, there is no transfer to third parties, it supports the right to erasure and data portability, and it provides a complete audit trail to demonstrate compliance.

Looking for a private ChatGPT for your business?

ORCA is the on-premise AI platform by HT-X (Human Technology eXcellence): your data stays yours, GDPR and AI Act compliant.

Discover ORCA