GDPR and AI in 2026
Schrems II, the AI Act, a DPIA template. For compliance officers and DPOs.
Marek runs a four-person law firm in Cracow. His specialty: healthcare and data protection. Clients include hospitals, dental clinics and telemedicine companies. His problem: every client asks "do you use AI to review contracts?" - and every cloud-based legal tool ships data to the United States. BezChmury changes that arithmetic.
Marek Nowak (38) opened his own firm in Cracow in 2019, after three years of practice in a large Warsaw medical-law office. The choice of niche was deliberate: between 2018 and 2020 the Polish legal market began to clearly stratify into healthcare-plus-data-protection specialists and everyone else. GDPR had landed a year earlier, COVID added telemedicine to the mix, and medical clients started looking for lawyers who understood both hospital governance and Article 32 GDPR.
Today Marek's team is four people: himself, two trainee legal advisers and a paralegal. Client portfolio: 12 public and private hospitals, 8 dental clinics across Lesser Poland and Silesia, and 3 telemedicine companies. Day-to-day work covers GDPR audits, legal opinions on new-technology rollouts (from patient chatbots to AI in medical imaging), DPIAs for the processing operations caught by Article 35 GDPR, and representing clients in proceedings before the Polish DPA (Urząd Ochrony Danych Osobowych).
The market in which Marek competes has concrete contours. As of the end of 2025 there were 56,415 legal advisers registered in Poland (43,546 actively practising) and 29,349 advocates (23,613 actively practising) - a combined 85,764 people across the two principal professional bodies, the Polish Bar Association of Legal Advisers (KIRP - Krajowa Izba Radców Prawnych) and the Polish Bar Council (NRA - Naczelna Rada Adwokacka) (radcaprawny.kirp.pl). Within the "healthcare + privacy" niche the 2025 Legal 500 Poland rankings repeatedly surface the same names: CMS, DLA Piper, Rymarz Zdort Maruta, Kieltyka Gladkowski KG Legal and Baker McKenzie (legal500.com). Marek does not compete head-to-head with the big Warsaw firms - he competes with smaller Cracow-based "healthcare boutique" practices, where the hourly rate is PLN 250-400 and the unique selling point is proximity to the client and deep specialisation.
AI adoption inside the legal profession is the other half of the backdrop. According to Future Ready Lawyer 2026 by Wolters Kluwer (a survey of 810 lawyers from the United States, China and 9 European countries including Poland), 92% of lawyers use at least one AI tool, and 62% save between 6 and 20% of their weekly working time thanks to AI (wolterskluwer.com). The question is not whether Marek should use AI. The question is which AI he can use without shipping the patient files of a Cracow university hospital to a data centre in Virginia.
From 2024 onwards Marek began to hear a new question from his medical clients. "Marek, do you use AI to review contracts?" Telemedicine companies asked first, then private clinics, and finally - in 2025 - public hospitals. The context was usually the same: competing law firms had started advertising "AI-powered legal research", and clients wanted to know whether their lawyer was keeping up with the market.
Marek surveyed the offering. By 2026 the Polish legal AI market has clear names of its own: LEX Expert AI (Wolters Kluwer), Libra by Wolters Kluwer as a European research-and-drafting platform, and Beck-Noxtua for the Legalis ecosystem (wolterskluwer.com, legalis.pl). On top of that come the global players: ChatGPT (GPT-4), Claude and Microsoft Copilot. Lawyers inside the professional bodies are running practical training on each of them.
They all share one common denominator: cloud. A query about a contract, a fragment of medical documentation or a piece of client correspondence travels to the vendor's servers - most often in the United States. That means every prompt containing patient data has to be assessed against Schrems II and the legal basis for transfers outside the European Economic Area, including realistic exposure to the US CLOUD Act and FISA Section 702.
"The protection afforded by that mechanism must, in practice, be actionable."
Quotation from paragraph 184 of the CJEU judgment in Case C-311/18 (Schrems II), 16 July 2020.
Marek pushed back on his clients: he refused to use cloud AI for patient data. The decision was right, but expensive. Two consequences followed. First - the team was losing time. Marek estimated that 4 people × ~10 hours per week (around 40 h/week in total) went into manual contract analysis, DPIA drafting and opinion writing. Second - the firm started to look "old-school". A client who came in for an audit of a chatbot rollout heard "we review contracts manually" while a competitor was advertising "AI-powered review". The argument "because cloud means transfer risk" was not enough for everyone.
Three concrete examples from his roster:
All three required Marek to combine AI fluency with GDPR fluency. All three also required Marek to operate faster. Without AI of his own he could barely process one such case a week, and the queue kept growing.
Marek follows the Polish DPA (UODO) almost daily. In his niche, public fines and decisions are a hard benchmark for client argumentation. From the 2024-2026 record, four concrete cases recur in virtually every opinion he drafts.
Add to that decision DKN.5131.3.2025 from 2025, in which the Polish DPA required the controller to demonstrate whether it had performed the risk analysis necessary to determine whether the incident triggered a notification obligation towards the authority and the data subjects (orzeczenia.uodo.gov.pl). That is the core lesson for Marek: the regulator is not asking about fashionable slogans; it is asking for a documented risk analysis and procedures.
The conclusion for Marek is straightforward. Every medical client now asks about a DPIA for AI, so it is no longer a "bonus" line of work but part of the core practice: Marek estimates it at about 30% of his workload in 2026. To do it credibly, he must use AI himself - but an AI that does not blow up his own compliance. That is the paradox BezChmury solves: you want to advise clients on AI? Use AI. Just do not let that AI leave your laptop.
Marek heard about BezChmury through networking. The compliance officer at one of his telemedicine clients pointed him to the project as "the Polish BezChmury 11B with a local GDPR-and-KSeF fact base". He signed up for a 30-day trial and started testing on two fronts: legal (are the answers accurate?) and technical (does the application really send nothing to the internet?).
Test 1: Schrems II, paragraph 184. Marek asked BezChmury for a verbatim quotation. The application returned the sentence from paragraph 184 together with the CURIA URL and a fact_id from the local SSoT base. The quote matched the source exactly.
Test 2: Decision DKN.5131.3.2025. Marek asked: "What is decision DKN.5131.3.2025?" BezChmury answered: "A 2025 decision of the Polish DPA (UODO), emphasising a risk analysis as the precondition for assessing whether an incident requires notification (Articles 32 and 35 GDPR). Source: uodo_facts:DKN_5131_3_2025." Consistent with the public text of the decision.
Test 3: technical audit. Marek invited the IT consultant of one of his hospital clients (a person holding CISA + CIPP/E certifications). The team spent half a day observing BezChmury inside a network sandbox. Wireshark showed that, after the first launch, the application opens no outbound connection - the model and the knowledge base ship inside the installation package. Second check: model licence. BezChmury 11B v3 is a SpeakLeash + ACK Cyfronet AGH project under the Apache-2.0 licence; the weights are publicly available on Hugging Face (huggingface.co/speakleash). That means the hospital client can commission an independent audit of the model itself - something that simply cannot be done with a closed ChatGPT.
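The "independent audit" point above can be made concrete with a minimal integrity check: before an audit, the hospital's IT team can verify that the locally installed weights match the publicly released ones. This is an illustrative sketch, not BezChmury tooling; the file name and the idea of a vendor-published SHA-256 checksum are assumptions.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk: int = 1 << 20) -> str:
    """Stream the file in 1 MiB chunks so multi-gigabyte weights fit in constant memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def verify_weights(path: Path, published_checksum: str) -> bool:
    """Compare the local weights file against the checksum published with the release."""
    return sha256_of(path) == published_checksum.lower()
```

Run against the installed model file (e.g. a GGUF download from Hugging Face) and the checksum from the release page; a mismatch means the local copy is not the audited artifact.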
Test 4: audit log. Every prompt and every response is written locally to a JSONL file, with a timestamp, a hash of the prompt and the fact_id from the SSoT base. The default retention is 5 years - enough for GDPR record-keeping and for a typical law firm's retention policy. Marek was able to demonstrate to the hospital client that, if a year later someone asks "where did that sentence in your March 2026 opinion come from?", there is a deterministic trail.
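The log format described above can be sketched in a few lines. This is an illustrative reconstruction, not BezChmury's actual implementation; the field names and file location are assumptions, but the ingredients (timestamp, prompt hash, fact_ids, append-only JSONL) follow the text.

```python
import hashlib
import json
import time
from pathlib import Path

AUDIT_LOG = Path("audit_log.jsonl")  # hypothetical local log location

def log_interaction(prompt: str, response: str, fact_ids: list) -> dict:
    """Append one audit record per prompt/response pair to a local JSONL file."""
    entry = {
        "ts": time.strftime("%Y-%m-%dT%H:%M:%S"),
        "prompt_sha256": hashlib.sha256(prompt.encode("utf-8")).hexdigest(),
        "response_sha256": hashlib.sha256(response.encode("utf-8")).hexdigest(),
        "fact_ids": fact_ids,  # e.g. ["uodo_facts:DKN_5131_3_2025"]
    }
    with AUDIT_LOG.open("a", encoding="utf-8") as f:
        f.write(json.dumps(entry, ensure_ascii=False) + "\n")
    return entry
```

Storing hashes rather than raw prompts keeps personal data out of the log itself while still giving the deterministic trail: the March 2026 opinion can be matched to a timestamped record and its fact_ids a year later.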
The decision came after three weeks. BezChmury Lite × 4 seats = PLN 596/month (the price list starts at PLN 149/month per seat, subject to confirmation on the current pricing page). The technical audit took one day, instead of the 5-7 days typical of a cloud AI vendor audit (where you have to verify subprocessors, data centre locations, ISO 27001 certificates and DPF status).
Week 1. Installation of BezChmury Lite on four laptops: three MacBook Pros (M2/M3, 16-32 GB unified memory) and one Dell XPS running Windows 11. BezChmury 11B v3 in the Q4_K_M quantisation (around 6 GB on disk) loaded locally; RAG over the 630-fact SSoT (KSeF, VAT, ZUS, GDPR) indexed. Installation time: ~30 minutes per laptop.
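The RAG indexing step can be illustrated with a toy retriever over a handful of facts. The real pipeline presumably uses embeddings over the full 630-fact SSoT; the two sample facts and the word-overlap scoring below are assumptions for illustration only.

```python
from collections import Counter

# Hypothetical miniature SSoT: fact_id -> fact text
FACTS = {
    "gdpr_art35_3b": "A DPIA is mandatory for large-scale processing of health data (Art. 9).",
    "uodo_DKN_5131_3_2025": "A documented risk analysis is the precondition for breach notification.",
}

def tokenize(text: str) -> list:
    return [t.strip(".,()?").lower() for t in text.split() if t]

def retrieve(query: str, k: int = 1) -> list:
    """Rank facts by how many query tokens they share (toy overlap score)."""
    q = set(tokenize(query))
    scores = Counter({fid: len(q & set(tokenize(text))) for fid, text in FACTS.items()})
    return scores.most_common(k)
```

The point of the architecture, whatever the actual scoring model, is that both the index and the query stay on the laptop: retrieval returns fact_ids that can later be cited in the opinion and in the audit log.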
Week 1 (in parallel). Usage policy. Marek wrote a one-page internal rulebook: who uses the tool, how, weekly audit-log review, mandatory DPIA for any case in which AI is part of the deliverable to the client. The team agreed on the principle that "AI is a research assistant, not a source of law" - every BezChmury answer goes through human review by a lawyer before it leaves the firm.
Week 2. Workflow integration:
Weeks 3-4. The first real case. One of the hospitals on Marek's roster wants to deploy an AI chatbot for patient registration (appointment booking + Q&A on procedures). Marek asks BezChmury: "Does an AI chatbot for patient registration require a DPIA?". BezChmury answers:
"Yes - depending on the data scope. If the chatbot processes data concerning health (Article 9 GDPR), a DPIA is mandatory by virtue of Article 35(3)(b) GDPR. In addition, Regulation (EU) 2024/1689 (the AI Act) classifies AI systems affecting access to healthcare under Annex III; the full obligations for high-risk Annex III systems apply from 2 August 2026."
With a verbatim citation in hand and a structured risk outline, Marek writes the opinion in about 4 hours, instead of the previous 12 (everything by hand plus research in a paid legal information system). The client receives an opinion with a bibliography that will withstand a Polish DPA audit. Marek archives the audit log inside the firm's local repository.
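The trigger logic in the chatbot opinion above can be sketched as a small checklist helper. The legal inputs (Article 35(3)(b) GDPR, Annex III of Regulation (EU) 2024/1689, the 2 August 2026 date) come from the text; the functions themselves are hypothetical and no substitute for the lawyer's case-by-case assessment.

```python
from datetime import date

# Full obligations for high-risk Annex III systems under the AI Act
AI_ACT_ANNEX3_FULL_OBLIGATIONS = date(2026, 8, 2)

def dpia_required(processes_health_data: bool, large_scale: bool) -> bool:
    """Art. 35(3)(b) GDPR: large-scale processing of special-category (Art. 9) data."""
    return processes_health_data and large_scale

def annex3_obligations_apply(affects_healthcare_access: bool, today: date) -> bool:
    """AI Act: high-risk obligations for Annex III systems apply from 2 August 2026."""
    return affects_healthcare_access and today >= AI_ACT_ANNEX3_FULL_OBLIGATIONS
```

For the hospital registration chatbot both checks fire: it handles health data at hospital scale, and it affects access to healthcare, so the opinion has to cover both regimes.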
Marek does not lay anyone off. The recovered time goes into a new niche: AI audits for medical clients. By the end of the first full quarter on BezChmury Marek plans to deliver 4-6 such audits per month at PLN 5,000-15,000 per audit - an additional PLN 20,000-90,000/month in revenue that simply did not exist before, because he had no time for it.
"BezChmury allows me to tell clients: we use on-premise AI ourselves, that is why we understand DPIA for your AI systems. That is value you cannot measure in hours."
Marek's case shows a pattern that will repeat across 2026-2027 for any law firm bound by professional secrecy: cloud AI is a poor fit for regulated professions, yet clients keep asking about AI. The answer is not to abandon AI - it is to choose an architecture in which client data never leaves the lawyer's device. BezChmury 11B v3 as a Polish open-weights model, a local SSoT knowledge base, an audit log with fact_ids and predictable per-seat licensing instead of a usage-metered cloud subscription is the sweet spot for a four-person firm.
For more legal context, see our companion guides: GDPR and on-premise AI - full compliance guide and What is private AI? The current package pricing is on the pricing page.
A short KSeF Private demo (15 min): we will show local execution, control questions, the source base, and how BezChmury reduces the risk of hallucinations.