NIS2: The Register of Essential and Important Entities Is Now Live. For Some Companies, Failure to Register Could Be a Costly Mistake

The launch of the register of essential and important entities – into which entities covered by the provisions of the Polish Act on the National Cybersecurity System will be entered – is not merely a technical add-on to the implementation of NIS2 in Poland. It marks the point at which, for many organizations, compliance becomes very concrete: it is necessary to determine whether an entity falls within the scope of the Act, whether it should be included in the register, when an application must be submitted, and how quickly new obligations must be implemented.

This is also where mistakes are most likely to occur. Some companies assume that if they have not received any official notification, the rules do not apply to them. That assumption can be risky.


The register is live – and it sets the pace for compliance

According to the Ministry of Digital Affairs, the register of essential and important entities was launched on 13 April 2026. Between 13 April and 6 May 2026, entries are being made ex officio. From 7 May to 3 October 2026, entities not automatically included are expected to self-register.

For new entities, access to the S46 system is scheduled for 12 June 2026. Meanwhile, entities that already met the criteria on the date the law entered into force have until 3 April 2027 to fully comply with the new requirements.

Importantly, inclusion in the register is neither discretionary nor constitutive. The law makes it clear that entries, updates, and removals are declaratory in nature, and that an entry is effective upon submission of the application via the ICT system.

In other words, the register does not create the obligation to comply. It merely formalizes a status that already arises from the law itself.


This is not just another register

The law explicitly defines three purposes of the register: identifying essential and important entities, enabling information exchange in the field of cybersecurity, and supporting supervisory activities.

In practice, this means the register is far more than a simple list of entities. It includes, among other things, contact details, sectors and types of activity, domain names, public IP address ranges, information on account administrators, and details on the use of managed security service providers (MSSPs).

Notably, these data are excluded from standard public access regimes. The provisions on access to public information and open data do not apply. Only aggregated data – such as the number of entities by sector or subsector – will be made public.

This is a clear signal: the register is designed as an operational and supervisory tool, not a public directory.


Who does this apply to? Not every company – but far more than before

The key practical challenge is that the answer to “does this apply to us?” rarely comes from a single provision.

The scope of the law is determined by a combination of factors: sector, type of activity, size thresholds, and specific exclusions. Entities listed in Annexes I and II may qualify as essential or important, often depending on whether they meet the threshold of a medium-sized enterprise. In some cases, special rules apply or entities are covered regardless of size.

The law also applies to entities operating in Poland, including through branches or cross-border activity. For certain digital service providers, additional rules apply regarding the main establishment and the appointment of an EU representative.


First classification, then registration – and immediately after, implementation

As a general rule, essential and important entities have six months from the moment they meet the criteria to apply for entry in the register. Any changes must be reported within 14 days.

Registration, however, is only the beginning.

Entities are required to implement an information security management system covering systems used in service delivery, establish internal cybersecurity structures or engage a managed service provider, and comply with incident reporting obligations.

These timelines are tight:

  • 24 hours for an early warning,
  • 72 hours for reporting a significant incident,
  • and, as a rule, one month for the final report.

Entities that already met the criteria when the law entered into force benefit from a transitional period: 12 months to implement the required measures, and for essential entities, 24 months to conduct the first audit.


Failure to register is not a minor formality

The most important practical takeaway is that failing to apply for entry in the register is explicitly linked to financial penalties.

The competent authority may impose fines on entities that fail to submit an application within the statutory deadline. For essential entities, penalties may reach up to EUR 10 million or 2% of annual turnover. For important entities, up to EUR 7 million or 1.4% of annual turnover.

In particularly serious cases – where a violation leads to a direct and significant cyber threat or risks substantial financial damage – penalties may reach up to PLN 100 million.

The law goes further. Managers themselves may also be fined, including for failing to ensure that the registration obligation is fulfilled. These fines may reach up to 300% of the individual’s remuneration.

If an entity fails to act, the authority may register it ex officio and require completion of missing information – under the threat of further sanctions.

At the same time, the law provides that administrative fines may only be imposed after two years from its entry into force. This does not mean, however, that the issue can be postponed. On the contrary, this period is intended for classification, registration, and implementation – not passive waiting.


Now is the time to assess your status – not to guess

In practice, the greatest risk today is not that an organization has failed to implement measures. The greater risk is that it has incorrectly assumed that the law does not apply.

Determining whether an entity qualifies often requires a multi-layered analysis: the actual business model, sector classification, size thresholds, relationships with affiliated entities, the scope of IT systems, and the role within the supply chain.

This is where legal support brings the most value.

We support clients in:

  • assessing whether an entity qualifies as essential or important,
  • determining whether and when registration is required,
  • preparing registration documentation and processes,
  • structuring internal compliance responsibilities,
  • translating statutory requirements into policies, procedures, and contractual arrangements with service providers.

If you are not certain whether your organization should be included in the register, now is the right moment to verify it. In many cases, the challenge is not a lack of diligence – but the fact that the answer is simply not obvious at first glance.

Workplace bullying and harassment: procedures are no longer optional. New obligations for employers are coming

Anti-bullying procedures are no longer just a “nice to have.” The proposed amendments to the Labour Code make it clear: employers will be expected to act actively and continuously – not only react to complaints. For many organizations, this means rethinking their current approach.

Work is currently underway in Parliament on a draft amendment to the Labour Code and the Code of Civil Procedure, which aims to comprehensively reshape regulations on workplace bullying, discrimination, violations of dignity and other personal rights of employees, as well as equal treatment in employment. The draft was adopted by the Council of Ministers on 17 February 2026, and its first reading took place in Parliament on 25 March 2026.

If adopted in its current form, the changes will require employers to take a much broader view of this area than before.


What exactly is changing?

The draft introduces several key changes:

  • a simplified definition of workplace bullying – removing the requirement of long duration and focusing instead on persistent harassment,
  • an obligation to actively and continuously prevent undesirable workplace behaviours,
  • the employer’s right of recourse against the perpetrator (i.e. seeking reimbursement of compensation or damages paid).

These changes may require a thorough review of existing HR policies and procedures.


A procedure is not enough – what matters is action

The changes will affect both employers who already have procedures in place and those who have not yet formally addressed this area.

In many organizations, current solutions focus mainly on:

  • reporting channels
  • appointing investigation committees
  • conducting internal investigations

However, the draft goes much further.

👉 What will matter is not just the document itself, but how it works in practice.

The new approach emphasizes:

  • prevention,
  • detection of irregularities,
  • appropriate response,
  • corrective actions,
  • real support for affected individuals.

In other words, not only the procedure itself may need to be reviewed, but the entire HR compliance model.


New obligations for employers (including those who already have procedures)

The draft assumes that employers with at least 9 employees will be required to:

➡️ define rules, procedures and the frequency of actions aimed at preventing:

  • violations of dignity and personal rights,
  • breaches of equal treatment,
  • discrimination,
  • workplace bullying

…and include them in a separate policy (unless already covered by a collective agreement or work regulations).

This leads to one conclusion: having a procedure will no longer be enough.


No procedure in place? This is the last call

The draft is even more significant for employers who:

  • have no procedure at all
  • or rely only on a general anti-bullying clause in internal regulations

These organizations should start preparing now by:

  • mapping risks,
  • setting up reporting channels,
  • defining investigation processes,
  • establishing documentation standards,
  • implementing protection for whistleblowers and witnesses,
  • planning training and internal communication.

👉 The draft clearly shows that practice will matter more than formal wording.


Smaller companies are not exempt

Importantly, the draft does not exempt smaller employers.

Companies employing fewer than 9 people are still required to:

  • prevent workplace bullying and other undesirable behaviours
  • communicate the adopted rules and procedures to employees

The difference lies mainly in the level of formalization, not in the obligation itself.


Burden of proof – a game changer

The draft also introduces an important procedural change.

In cases concerning a breach of equal treatment:

  • the employee will only need to make the violation plausible,
  • the employer will then need to prove that no violation occurred.

👉 In practice, this means one thing:
proper documentation of HR decisions and actions becomes critical.


Why this is a real risk (not just theory)

📊 As many as 93% of respondents declare they have experienced behaviours that may qualify as workplace bullying (Antal & Dobra Foundation research).

At the same time:

  • only 6% of complaints are found justified (data from the National Labour Inspectorate)

This does not mean the problem is rare — rather, it highlights how difficult it is to properly identify and prove such behaviours.

The draft addresses this, among other things, by requiring courts to assess not only bullying, but also potential violations of other personal rights of the employee.

This significantly broadens employers’ legal exposure.


This is not just a change in documentation

This is not just a “paper change.”

👉 It is a shift in how employment-related risk is managed.

Anti-bullying procedures are becoming one of the key organizational risk management tools.

For some, this will mean updating existing frameworks.
For others — the last moment to build them from scratch.

👉 The real question is no longer: “Do you have a procedure?”
But: “Does it actually work – and can you prove it?”


Do you feel that anti-bullying procedures in your organization technically “exist,” but you’re not entirely sure how they work in practice? This might be a good moment to take a closer look. If you’d like, we can go through it together and help you identify what’s worth improving.

JLSW advises on acquisition

Another exciting transaction completed

We had the pleasure of advising Karlik sp. z o.o. on the acquisition of the Renault and Dacia dealership business previously operated by Pieluszyńska sp. z o.o.

The transaction marks another step in the development of the Karlik Group and strengthens its position in the regional automotive market.

Our team provided comprehensive legal support throughout the process — from due diligence, through drafting and negotiating transaction documentation, to signing and closing. We also represented the client in proceedings before the President of the Polish Competition Authority (UOKiK) to obtain merger clearance.

The project was led by Tomasz Janaszczyk and Joanna Żemojtel (Partners), with support from Róża Dziewa (Legal advisor).

Congratulations to all parties involved and thank you for your trust 🤝

GDPR compliance in e-commerce – when marketing tools become a legal risk

E-commerce businesses rely heavily on digital tools to understand users, optimise conversion rates and target advertising. Retargeting platforms, analytics solutions, advertising networks and anti-bot mechanisms have become a standard element of modern online commerce.

From a technological perspective, implementing such tools often appears straightforward. A marketing team deploys a tag, script or API connection, and the tool begins collecting data about user behaviour.

From a legal perspective, however, the reality is often far more complex.

Many of these technologies involve sophisticated data-processing ecosystems that extend beyond the website where they are deployed. What appears to be a simple marketing integration may in practice trigger extensive data sharing with global technology platforms, often involving multiple controllers, cross-border transfers and behavioural profiling.

This growing complexity explains why regulators across Europe have begun to focus more closely on how marketing technologies process personal data.

For e-commerce companies, this area is becoming one of the most sensitive aspects of GDPR compliance.

The hidden complexity of marketing technologies

In many projects we analyse, the implementation process follows a familiar pattern. An online retailer decides to deploy a tool recommended by a marketing agency or technology partner. The integration is quick and technically simple.

But the legal implications may be much more complicated.

A single marketing or analytics tool may involve:

  • multiple independent data controllers,
  • joint controllership arrangements,
  • behavioural tracking and profiling,
  • cross-border data transfers,
  • and the reuse of collected data within global advertising ecosystems.

These elements are not always visible to the businesses deploying the technology.

For example, retargeting platforms may not rely solely on cookies. In some configurations they also process additional identifiers such as hashed email addresses, phone numbers or CRM-based customer identifiers, which are used to match individuals with advertising accounts across multiple platforms.

From a GDPR perspective, such operations may constitute a separate form of personal data processing and therefore require an independent legal basis and additional transparency obligations.

Legal risks behind audience-matching technologies

A particularly sensitive area concerns audience-matching technologies, such as Google Customer Match or Meta Custom Audiences.

These tools allow companies to upload identifiers from their customer databases – typically hashed email addresses or phone numbers – to advertising platforms. The platform then compares these identifiers with its own user accounts and creates targeted advertising audiences.
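The preparation step described above can be sketched in a few lines. Audience-matching platforms such as Google Customer Match document that uploaded email addresses must be normalized (trimmed, lowercased) and then hashed with SHA-256 before upload; the sketch below illustrates that preparation step (the function name is ours, not a platform API):

```python
import hashlib

def normalize_and_hash_email(email: str) -> str:
    """Trim and lowercase the address, then apply SHA-256 -
    the preparation step audience-matching platforms typically
    require before a customer list is uploaded."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# The same address always yields the same digest, so the platform
# can match it against hashes of its own users' addresses.
digest = normalize_and_hash_email(" User@Example.com ")
print(digest)
```

Because the hash is deterministic, the platform can compute the same digest for its own account holders and link the uploaded record back to a named user, which is precisely why regulators treat hashed identifiers as personal data rather than anonymized data.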

From a marketing perspective, the mechanism is extremely effective.

From a GDPR perspective, however, it raises significant concerns.

The central issue is the legal basis for such processing.

Companies sometimes attempt to rely on legitimate interest for this type of targeted advertising. However, regulatory practice increasingly suggests that this approach may not be sufficient.

European data protection authorities have pointed out that individuals who provide their contact details to a company – for example during a purchase or account registration – do not reasonably expect that these identifiers will later be used to target them across external advertising ecosystems.

As a result, tools such as Google Customer Match or Meta Custom Audiences may require separate, explicit user consent for the use of customer contact data in advertising audience matching.

Without such consent, companies risk engaging in unlawful disclosure of personal data to third-party advertising platforms.

Regulators are increasingly scrutinising ad-tech practices

Recent regulatory enforcement illustrates that these risks are not merely theoretical.

In 2023, the French data protection authority (CNIL) imposed a €40 million fine on the advertising platform Criteo. Among other issues, the authority concluded that the company had failed to demonstrate a valid legal basis for processing personal data used within its advertising ecosystem. The regulator also identified shortcomings in transparency and the handling of data subject rights.

Similarly, European regulators have questioned the use of advertising audience-matching tools such as Facebook Custom Audiences. German authorities concluded that uploading customer contact data – even in hashed form – may require explicit user consent.

These cases demonstrate a broader regulatory trend: ad-tech ecosystems are increasingly treated as high-risk environments for personal data processing.

When behavioural tracking becomes personal data processing

Another important issue concerns the identifiability of individuals in digital environments.

Many marketing technologies rely on identifiers that do not directly reveal a person’s name or email address. These may include cookie IDs, advertising identifiers, device fingerprints or behavioural profiles.

However, under the GDPR this does not remove them from the category of personal data.

Recital 30 of the GDPR explicitly recognises that individuals may be associated with online identifiers provided by devices, applications, tools and protocols, including cookie identifiers and other tracking technologies. These identifiers may leave traces which, when combined with other information, can be used to create profiles and identify individuals.

In practical terms, this means that if a user views a product – for example a pair of running shoes – on one website and later sees the same product advertised across multiple websites, that user has been tracked and recognised within an advertising ecosystem.

Even if the platform operator does not know the user’s name, the individual has been identified well enough to target advertising specifically to them.

This is precisely the type of processing that the GDPR was designed to regulate.
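The running-shoes scenario can be reduced to a minimal sketch. All data below is hypothetical; the point is that a bare cookie ID, combined with behavioural events, is enough to single out one browser for targeting even though no name or email is involved:

```python
from collections import defaultdict

# Hypothetical event log keyed by cookie ID - no name, no email.
events = [
    ("cookie-7f3a", "viewed", "running-shoes"),
    ("cookie-7f3a", "viewed", "running-socks"),
    ("cookie-91bc", "viewed", "coffee-maker"),
]

# Accumulate a behavioural profile per cookie ID.
profiles = defaultdict(list)
for cookie_id, action, product in events:
    profiles[cookie_id].append(product)

# Build the "audience" of browsers that should see the running-shoes ad.
retarget = [cid for cid, items in profiles.items() if "running-shoes" in items]
print(retarget)
```

The profile never contains a name, yet it identifies the individual well enough to follow them across websites – which is why Recital 30 treats such online identifiers as personal data.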

Compliance requires both legal and technical design

These examples illustrate a broader point: compliance with the GDPR in digital marketing environments cannot rely solely on contractual clauses or privacy policies.

It requires carefully designed technical and organisational processes.

Businesses deploying marketing technologies should ensure that:

  • users receive clear and detailed information about how their data is processed,
  • the purposes of processing are transparent,
  • the legal bases are correctly identified,
  • the roles of technology providers are properly assessed,
  • data retention periods are clearly communicated,
  • and international data transfers are appropriately disclosed.

In other words, compliance must be built into both the technical implementation and the documentation surrounding it.

The GDPR was created to protect the privacy of individuals – including their privacy in digital environments, where personal data is often generated not through traditional identifiers but through behavioural signals and online tracking technologies.

Supporting e-commerce companies in navigating ad-tech compliance

At JLSW Janaszczyk Lis & Wspólnicy, we regularly support e-commerce companies in analysing the legal implications of digital marketing and analytics tools used within their online ecosystems.

Our work often includes:

  • assessing the roles of technology providers (controller, processor or joint controller),
  • analysing contractual frameworks with global technology platforms,
  • evaluating consent mechanisms and transparency requirements,
  • reviewing international data-transfer structures,
  • and designing risk-mitigation strategies for complex ad-tech environments.

Our goal is not to discourage businesses from using modern marketing technologies. These tools are essential for e-commerce.

Instead, our focus is to ensure that companies can implement them in a way that is both technologically effective and legally sustainable.

Criteo, Cookies and Customer Data – What an Online Store Should Check Before Implementation

Online stores increasingly rely on advanced marketing and analytics tools. Retargeting, ad personalization, and automated offer matching have become standard in e-commerce. The challenge is that implementing these solutions is not only a technical or marketing decision. Very often, it also involves the processing of personal data.

And this is where an important legal question arises: what actually happens to user data when such a tool is implemented?


Before You Implement a Marketing Tool – Check How It Works

In practice, the situation often looks like this: an online store decides to implement a marketing solution recommended by an agency or a technology partner. The implementation usually involves adding a tag or script to the website.

From a marketing perspective, this is a quick and effective way to boost sales. From a data protection perspective, however, it is only the beginning of the analysis.

It is important to determine, among other things:

  • what data is collected by the tool,
  • who acts as the data controller,
  • whether the data is shared with other entities,
  • whether data is transferred outside the European Economic Area,
  • and what the appropriate legal basis for processing is.

Without this analysis, it is easy to assume that if a user has consented to cookies, everything is compliant. In practice, however, that is often only one element of a much more complex picture.


Example: How Criteo Retargeting Works

A good example is the popular retargeting tool Criteo.

In the basic model, part of the user data is collected through Criteo cookies, which allow the identification of users across the web and enable advertising to be tailored to their previous activity.

However, analysis of the documentation and the way the tool operates shows that in some configurations additional user data may also be transferred.

This may include, for example:

  • hashed email addresses,
  • hashed phone numbers,
  • user identifiers from the online store’s CRM system.

Such data can be shared with advertising systems in order to match users and enable even more precise ad targeting.

And this is where the real legal analysis begins.


Cookie Consent Is Not Always Enough

Many organizations assume that if a user has consented to marketing cookies in a cookie banner, all retargeting activities can rely on that consent.

However, cookie consent is derived from rules concerning storing and accessing information on a user’s device, regulated in Poland by the Electronic Communications Law. These rules implement Article 5(3) of the ePrivacy Directive (Directive 2002/58/EC).

In other words, this consent primarily concerns the use of cookies as a technology.

If, however, a marketing tool also involves additional transfers of personal data beyond what is collected via cookies, such as:

  • hashed email addresses,
  • phone numbers,
  • CRM identifiers,

this may constitute a separate personal data processing operation, which requires an independent legal basis under the GDPR.


What Do Data Protection Authorities Say?

In recent years, European data protection authorities have increasingly scrutinized user-matching mechanisms used in advertising systems.

Criteo itself provides a good example. In 2023, the French data protection authority (CNIL) imposed a €40 million fine on the company. According to the authority, Criteo was unable to demonstrate that it had a valid legal basis for processing user data used in its advertising system, including data collected through retargeting mechanisms. CNIL also identified issues related to the exercise of data subject rights and insufficient transparency regarding data processing. The case shows that this type of technology is already under close regulatory scrutiny, which means its implementation should be preceded by a thorough legal and technical assessment.

Another widely discussed case concerns Facebook Custom Audiences. The German data protection authority concluded that uploading customer lists containing email addresses or phone numbers to Facebook – even in hashed form – requires prior user consent.

Importantly, an administrative court upheld this position, noting that hashing does not eliminate the personal data nature of the information, because the platform can still match it to specific users.

The mechanism behind such tools is relatively straightforward: an advertiser uploads a list of customers (for example, email addresses or phone numbers), and the advertising platform matches them with its users to create a targeted advertising audience.

In practice, this means that personal data from an online store’s customer database is shared with an advertising system.
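The matching mechanism described above can be sketched as follows. All names and addresses are hypothetical, and the helper mirrors the SHA-256 normalization such platforms document; the sketch shows why hashing the list before upload does not prevent re-identification on the platform's side:

```python
import hashlib

def h(email: str) -> str:
    """Normalize and SHA-256-hash an email, as audience tools require."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# The advertiser uploads hashed customer emails (hypothetical data).
uploaded = {h("anna@example.com"), h("marek@example.com")}

# The platform hashes its own users' addresses the same way...
platform_users = {
    "anna@example.com": "user-1001",
    "ola@example.com": "user-2002",
}

# ...and intersects the two sets, recovering the link to named accounts.
matched = [uid for email, uid in platform_users.items() if h(email) in uploaded]
print(matched)
```

Because both sides apply the same deterministic hash, the platform can tie the uploaded record back to a specific account, which is the reasoning behind the German authorities' conclusion that hashing does not strip the data of its personal character.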


What Are the Risks for E-Commerce?

Failing to conduct a proper legal analysis before implementing a marketing tool can lead to several serious risks.

The most common ones include:

Data Transfers Without a Legal Basis

If a store shares, for example, hashed email addresses or phone numbers of its customers with an advertising system without an appropriate legal basis, this may be considered an unlawful disclosure of personal data under the GDPR.

Lack of Transparency for Users

Users should be informed not only about cookies but also about the possibility that their data may be used in advertising systems for profiling or targeted advertising.

Transfers of Data Outside the EEA

With global advertising platforms, there is often also the issue of data transfers to third countries.


What About GDPR Fines?

The GDPR provides for significant sanctions for violations of data protection rules.

For serious infringements, administrative fines can reach up to €20 million or 4% of a company’s annual global turnover, whichever is higher.

Supervisory authorities are increasingly focusing on digital marketing and advertising technologies, as these areas involve some of the most complex data flows.


Summary

Modern marketing tools can significantly increase the effectiveness of sales in e-commerce. At the same time, their implementation almost always involves the processing of personal data.

Instead of assuming that “the cookie banner solves the problem,” it is worth verifying:

  • what data is actually being processed,
  • whether identifiers from the store’s systems (such as email, phone number, or CRM ID) are being shared,
  • who is responsible for the processing,
  • whether data transfers outside the EEA take place,
  • and whether an additional legal basis for processing is required.

A proper analysis can help avoid many potential problems while also ensuring that documentation and user communication remain clear and compliant.

👉 If you are planning to implement a marketing tool or want to verify whether the solutions used in your online store comply with the GDPR, feel free to contact us. We will be happy to analyze how these tools operate and help you implement them safely.

🔐 NIS2 becomes reality in Poland

The President has signed the amendment to the National Cybersecurity System Act implementing the NIS2 Directive into Polish law. The new regulations significantly expand cybersecurity obligations for many organisations – as well as the responsibilities of management boards.

What does this mean in practice? Among other things, organisations may be required to:

• implement appropriate cybersecurity risk-management measures
• establish and maintain an information security management system
• organise incident handling and report serious incidents
• ensure adequate governance and oversight at the management level

The new regime also introduces significant administrative fines – up to EUR 10 million or 2% of global turnover, and in specific cases even up to PLN 100 million. The regulations also provide for the possibility of personal liability of management board members.

📅 Key dates:
2 April 2026 – the Act enters into force
2 May 2026 – publication of the list of essential and important entities
2 April 2027 – deadline to implement the statutory obligations
2 April 2028 – first audit for essential entities

Will your organisation fall within the scope of the new regulations?
What legal obligations will this create for your company and its management?

Contact us – We would be happy to help identify whether the new regulations apply to your organisation and clarify the legal obligations resulting from the new cybersecurity framework.

The Standard That Defines Our Practice

It’s not only about what we do.
It’s also about the standard we work to.
Based on client feedback, Legal 500 has recognised JLSW for Billing & Efficiency and NPS® – a measure of trust and willingness to recommend.
Behind this recognition is a philosophy we believe in: clarity, accountability and solutions grounded in real business needs.
We focus on long-term relationships and solutions that genuinely support our clients’ projects.
This standard is our starting point, not an exception.


reCAPTCHA or Risk? Free Protection, Real Accountability.

Google reCAPTCHA is one of the most commonly used tools for protecting online forms against spam and bots. It’s quick to deploy, technically efficient — and very often implemented by default, without much legal reflection.

For a long time, however, the free version of reCAPTCHA raised serious GDPR concerns: no data processing agreement, an unclear scope of data collection, and extensive behavioural analysis taking place in the background.

As of 2 April, the legal model is changing.

Does that mean the compliance issue disappears?
Not quite.


1. The previous model: a service paid for with data

Under the free model, website owners did not pay with money.
They paid with user data.

reCAPTCHA processes, among other things:

  • IP addresses,
  • browser and device identifiers,
  • behavioural interaction data,
  • cookies and related tracking information.

Until recently, the free version was not covered by the Google Cloud Data Processing Addendum (DPA). Google acted as an independent controller rather than a processor within the meaning of Article 28 GDPR.

In practice, this meant:

  • no formal data processing agreement,
  • data potentially processed for Google’s own purposes,
  • limited ability for the website owner to meaningfully control the scope of processing.

It was, in effect, a “free” service operating within a data-driven model.


2. From 2 April: Google as a processor

Google has announced that, from 2 April, the free version of reCAPTCHA will be covered by the Cloud Data Processing Addendum.

This is a significant development.

Under the updated framework, Google is expected to act as a processor on behalf of its customers. At a general level, the DPA contains the elements required under Article 28 GDPR.

From a formal perspective, this is clearly a step in the right direction:

  • the controller–processor relationship is contractually structured,

  • a data processing agreement is in place,

  • the legal framework becomes more predictable.

For European customers, the contracting entity will be Google Cloud EMEA Limited (Ireland), meaning the processor is an EU-based Google entity.

But a DPA alone does not automatically guarantee full compliance.


3. Transparency and data minimisation: still critical questions

The DPA defines the scope of data very broadly as:

“Data relating to individuals provided to Google via the Services, by (or at the direction of) Customer or by its End Users.”

It does not specify concrete categories of personal data.

Based on publicly available information, the processing appears to involve primarily technical and behavioural signals used to distinguish humans from bots, largely processed on a temporary basis.

However:

  • the categories of data are not exhaustively described,

  • the retention period may extend up to 180 days,

  • and each controller must verify how reCAPTCHA is actually implemented in their specific setup.

The core issue is not necessarily that the data is excessive.
The issue is whether the controller can demonstrate that it is proportionate and limited to what is strictly necessary.

Under the GDPR, accountability requires more than trust. It requires evidence.


4. Legal basis: legitimate interest or consent?

Preventing spam and abuse can, in principle, qualify as a legitimate interest under Article 6(1)(f) GDPR.

Following the introduction of the DPA, relying on legitimate interest may be more defensible than before. That said, controllers still need to:

  • carry out and document a proper balancing test,

  • assess proportionality,

  • verify the actual scope of data processed in practice.

There is also the ePrivacy dimension.

If reCAPTCHA relies on non-essential cookies or similar technologies, prior consent may be required under applicable ePrivacy and national cookie rules — unless the tool can genuinely be considered strictly necessary for a service explicitly requested by the user.

And here the tension becomes practical.

A user wants to submit a form.
They do not explicitly request that their behavioural data be analysed by a third party.

If consent is treated as the safest legal basis and reCAPTCHA is only loaded after opt-in:

  • no consent means no protection,

  • the form remains vulnerable,

  • and a bot is unlikely to click “Accept”.

This illustrates that the choice of legal basis is not merely a theoretical compliance debate. It directly affects how your website operates.
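In practice, consent-gated loading is usually implemented by injecting the reCAPTCHA script only after the user opts in. The sketch below is illustrative only: the `onConsentChange` hook is a hypothetical stand-in for whatever callback your consent-management platform provides, while the script URL with the `render` parameter follows Google's documented loading pattern for reCAPTCHA v3.

```javascript
// Minimal sketch of consent-gated reCAPTCHA loading.
// Assumptions: a hypothetical CMP callback `onConsentChange`,
// and reCAPTCHA v3 loaded via the standard `?render=SITE_KEY` URL.

const RECAPTCHA_SRC = "https://www.google.com/recaptcha/api.js";

// Pure helper: builds the script URL for a given site key.
function recaptchaScriptUrl(siteKey) {
  return `${RECAPTCHA_SRC}?render=${encodeURIComponent(siteKey)}`;
}

// Injects the reCAPTCHA script into the page, at most once.
// Returns true if the script tag was added, false if it was already present.
function loadRecaptcha(doc, siteKey) {
  if (doc.getElementById("recaptcha-loader")) return false;
  const s = doc.createElement("script");
  s.id = "recaptcha-loader";
  s.src = recaptchaScriptUrl(siteKey);
  s.async = true;
  doc.head.appendChild(s);
  return true;
}

// Hypothetical CMP hook: load only after the relevant consent category
// has been granted. Until then, the form works without bot protection.
// onConsentChange((categories) => {
//   if (categories.includes("functional")) {
//     loadRecaptcha(document, "YOUR_SITE_KEY");
//   }
// });
```

The trade-off described above is visible directly in this structure: as long as the CMP callback never fires with the required category, `loadRecaptcha` is never called and the form stays unprotected.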


5. New DPA. Familiar compliance questions.

As of 2 April, the formal legal position of the free version is clearly stronger than before.

From a contractual standpoint, this is an important improvement. The controller–processor relationship is now structured, and the framework aligns more closely with Article 28 GDPR standards.

But compliance is not achieved by contract alone.

Controllers must still:

  • determine the actual scope of personal data processed in their specific implementation,

  • properly define and document the chosen legal basis,

  • ensure consistency between privacy notices and real data flows,

  • update records of processing activities,

  • assess any international data transfers and applicable safeguards.

Google is undoubtedly moving closer to European data protection expectations.

However, the responsibility for demonstrating GDPR compliance remains with the controller.


What about your website?

Your privacy policy is not just a formality.

It is visible not only to users — but also to competitors, dissatisfied customers, business partners, and, if necessary, supervisory authorities.

A privacy notice should reflect what truly happens behind the scenes.

Are you confident that:

  • all tools used on your website are properly disclosed?

  • the roles of third parties are accurately defined?

  • your legal basis has been genuinely assessed rather than assumed?

  • your documentation would withstand regulatory scrutiny?

We help our clients ensure that what they declare publicly accurately reflects the data processing taking place internally.

If you would like to understand whether your reCAPTCHA setup is simply a security feature — or a potential compliance exposure — let’s talk.

🌍 Madrid wrap-up | Legal Netlink Alliance

Madrid, great conversations, and people who genuinely understand how cross-border cooperation works – that’s probably the most accurate short summary of this year’s Legal Netlink Alliance meeting.

A few intensive days were enough to exchange experiences (and doubts), reflect on where our profession is heading, and once again confirm that in cross-border work, trust and relationships still matter more than theory and polished slides.
This is exactly what we value about LNA: substance without pretence, strategic discussions without unnecessary buzzwords, and a community of professionals you actually want to work with.

In the background, the LNA Board’s concrete initiatives – including the NextWave project – show that the alliance is thinking well beyond the next date in the calendar and investing in its future.

Many thanks to everyone we had the pleasure of speaking with, and especially to Fourlaw Abogados for the excellent organisation and true Madrid spirit.

See you at the next LNA meeting.
