Five common mistakes lawyers make when interpreting eIDAS, MiCA, GDPR, and the AI Act

Updated: Sep 29

In today's digital age, the intersection of law and technology is more critical than ever. Regulations like the eIDAS Regulation, Markets in Crypto-Assets Regulation (MiCA), General Data Protection Regulation (GDPR), and the AI Act are complex legal frameworks that require a nuanced understanding of both legal and technical aspects. Lawyers without sufficient IT knowledge may inadvertently make mistakes in their legal reasoning. This blog post explores five common errors and provides explanations with references to the relevant legal texts.






Mistake 1: Misunderstanding the scope of electronic signatures under eIDAS

Some legal professionals assume that only qualified electronic signatures (QES) are legally valid and enforceable under the eIDAS Regulation, dismissing advanced electronic signatures (AdES) and simple electronic signatures (SES) as legally insufficient.


The eIDAS Regulation (EU) No 910/2014 establishes a comprehensive legal framework for electronic identification and trust services across the European Union. A common misconception arises when legal professionals believe that only QES carry legal weight equivalent to handwritten signatures, leading them to disregard the validity of AdES and SES.


Under eIDAS, electronic signatures are categorised into three levels: SES, AdES, and QES. A Simple Electronic Signature encompasses any electronic data attached to or logically associated with other electronic data and used by the signatory to sign. An AdES builds upon this by being uniquely linked to the signatory, capable of identifying them, and created using means under their sole control. A QES is an AdES that is created by a qualified electronic signature creation device and is based on a qualified certificate issued by a qualified trust service provider (QTSP).
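The progression between these levels can be sketched in Python. The hashing below is only a stand-in for the cryptographic signing that AdES and QES actually require; it illustrates how a signature can be logically associated with data and how subsequent changes to the signed data become detectable:

```python
import hashlib

def sign_simple(document: bytes, signatory: str) -> dict:
    """A 'simple electronic signature': electronic data logically
    associated with other data and used by the signatory to sign."""
    return {"signatory": signatory, "document": document}

def sign_with_integrity(document: bytes, signatory: str) -> dict:
    """Sketch of one AdES property: the signature is linked to the
    signed data so that any subsequent change is detectable."""
    digest = hashlib.sha256(document).hexdigest()
    return {"signatory": signatory, "sha256": digest}

def verify_integrity(document: bytes, signature: dict) -> bool:
    return hashlib.sha256(document).hexdigest() == signature["sha256"]

contract = b"Seller delivers 100 units at EUR 5 each."
sig = sign_with_integrity(contract, "Alice")

assert verify_integrity(contract, sig)                     # untouched document verifies
assert not verify_integrity(contract + b" Amended.", sig)  # tampering is detectable
```

A real AdES or QES additionally binds the signature to the signatory with asymmetric cryptography and, for QES, a qualified certificate; the sketch shows only the integrity-linking idea.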


Why it is a mistake:

The erroneous assumption neglects the flexibility embedded within the eIDAS Regulation regarding the legal effects of electronic signatures. Article 25(1) of eIDAS explicitly states that an electronic signature shall not be denied legal effect and admissibility as evidence in legal proceedings solely on the grounds that it is in electronic form or does not meet the requirements of a QES. This provision means that SES and AdES can indeed be legally valid and enforceable, depending on the context of their use and the evidential requirements of the transaction in question.


By insisting exclusively on QES, legal professionals may impose unnecessary burdens on their clients or organisations. QES often require more complex and costly procedures, including the use of specific hardware or face-to-face identity verification. In many commercial transactions, an AdES or even an SES may provide sufficient legal assurance, especially when supplemented by additional evidence or security measures.


Moreover, the misconception ignores the real-world uses and technological developments that contribute to the security and dependability of SES and AdES. Beyond the fundamental needs of SES, modern electronic signature platforms frequently include strong authentication techniques, audit trails, and tamper-evident technology to further improve the integrity and authenticity of electronic signatures.
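One such tamper-evidence technique can be sketched as a simple hash chain, in which each audit-trail entry commits to the one before it. This is an illustrative sketch, not any particular vendor's implementation:

```python
import hashlib
import json

def append_event(trail: list, event: dict) -> None:
    # Each entry records the previous entry's hash, chaining the trail.
    prev = trail[-1]["hash"] if trail else "0" * 64
    payload = json.dumps(event, sort_keys=True)
    digest = hashlib.sha256((prev + payload).encode()).hexdigest()
    trail.append({"event": event, "prev": prev, "hash": digest})

def verify_trail(trail: list) -> bool:
    # Recompute every link; any edit to history breaks the chain.
    prev = "0" * 64
    for entry in trail:
        payload = json.dumps(entry["event"], sort_keys=True)
        digest = hashlib.sha256((prev + payload).encode()).hexdigest()
        if entry["prev"] != prev or entry["hash"] != digest:
            return False
        prev = entry["hash"]
    return True

trail = []
append_event(trail, {"action": "document_viewed", "user": "alice"})
append_event(trail, {"action": "document_signed", "user": "alice"})
assert verify_trail(trail)

trail[0]["event"]["user"] = "mallory"  # tampering with history...
assert not verify_trail(trail)         # ...breaks every later link
```

This is why such audit trails strengthen the evidential weight of an SES or AdES: altering any recorded event after the fact is detectable.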


Legal interpretations:

The legal framework provided by eIDAS is designed to be technology-neutral and adaptable to various levels of risk and types of transactions. Recital 49 of the Regulation emphasises the need to ensure that electronic signatures can be used and recognised across the EU without imposing unnecessary barriers. The courts have discretion to assess the evidential value of an electronic signature based on the reliability of the methods used, rather than solely on its qualification status.


By misinterpreting the scope and validity of SES and AdES, legal professionals risk providing advice that is not only legally inaccurate but also commercially impractical. Understanding the legal provisions and technological capabilities is essential to leveraging electronic signatures effectively and compliantly.







Mistake 2: Believing that technical measures like pseudonymization and encryption fully exempt an organisation from GDPR obligations regarding personal data


The General Data Protection Regulation (EU) 2016/679 sets out comprehensive rules for the protection of personal data within the European Union. A common misapprehension among legal practitioners without IT expertise is the assumption that technical measures like pseudonymization and encryption render personal data anonymous, thereby exempting the organisation from GDPR obligations.


Article 4(5) of the GDPR defines pseudonymization as the processing of personal data so that, without the use of additional information, it can no longer be attributed to a particular data subject; this holds only if the additional information is kept separately and is subject to technical and organisational safeguards to ensure non-attribution. Encryption, by contrast, transforms data into ciphertext that cannot be read without the corresponding decryption key.


Why it is a mistake:

This misconception results from confusing anonymisation with pseudonymization. Although both improve data security, pseudonymized data is nonetheless subject to the GDPR as, in the event that the pseudonym is reversed or connected to further data, it still refers to identifiable individuals. According to GDPR Recital 26, any information pertaining to an identified or identifiable natural person is deemed personal data.
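The distinction can be made concrete with a keyed-hash pseudonymization sketch in Python. The key is the "additional information": while the controller holds it, the data remains attributable to individuals and is therefore still personal data:

```python
import hashlib
import hmac

# The 'additional information' of Article 4(5): held by the controller,
# stored separately from the pseudonymised records.
SECRET_KEY = b"kept-separately-under-access-controls"

def pseudonymise(name: str) -> str:
    # Keyed hash: without SECRET_KEY, the pseudonym cannot be linked
    # back to the person; with it, attribution remains possible.
    return hmac.new(SECRET_KEY, name.encode(), hashlib.sha256).hexdigest()[:16]

records = {pseudonymise(n): {"visits": v} for n, v in [("Alice", 3), ("Bob", 7)]}

# Anyone holding the key can re-identify the data subject -- which is
# precisely why pseudonymised data remains within the scope of the GDPR.
assert records[pseudonymise("Alice")]["visits"] == 3
```

Only if the key and any other linking information were irreversibly destroyed, and no other means of identification remained reasonably likely, could the data approach true anonymisation.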


Encryption, similarly, does not remove data from the ambit of the GDPR if the organisation retains the means to decrypt the information. If the data controller or processor can access the data in its original form, the obligations under the GDPR remain fully applicable.


By assuming that these technical measures provide full exemption, lawyers may advise organisations improperly, leading to potential non-compliance with GDPR provisions such as data subject rights, lawful processing requirements, and accountability obligations.


Legal interpretations:

As part of a data protection by design and by default strategy, the GDPR encourages the adoption of technical measures such as encryption and pseudonymization (Article 25). These measures do not remove the need to comply with other GDPR obligations; rather, they are intended to improve security and lower risks to data subjects.


For example, in the event of a data breach, Article 34 outlines that if the compromised data is encrypted and therefore unintelligible to unauthorised persons, the obligation to notify data subjects may not apply. However, this is an exception rather than a general rule of exemption from GDPR obligations.


Pseudonymization and encryption are useful tools for data protection, but a correct understanding of the law requires acknowledging that they do not release organisations from their broader obligations under the GDPR.







Mistake 3: Lawyers misinterpreting MiCA's applicability to all crypto assets

The Markets in Crypto-Assets Regulation (MiCA) aims to create a harmonised regulatory framework for crypto assets across the European Union. Some legal practitioners without sufficient technical understanding might mistakenly believe that MiCA encompasses all crypto assets, including those that qualify as financial instruments under existing legislation such as the Markets in Financial Instruments Directive II (MiFID II).


MiCA is designed to regulate crypto assets that are not already covered by existing EU financial services legislation. It introduces rules for issuers of crypto assets and service providers, intending to protect consumers and ensure market integrity.


Why it is a mistake:

This misinterpretation ignores the exclusions set out in MiCA itself. Under Article 2 of MiCA, crypto assets that qualify as financial instruments under MiFID II, as deposits under the Deposit Guarantee Schemes Directive, or as securitisation positions, among others, fall outside the Regulation. By incorrectly applying MiCA to crypto assets already governed by MiFID II or other financial services legislation, lawyers may counsel clients to comply with the wrong legal framework. This can result in needless compliance efforts where existing legislation already applies or, conversely, in a failure to comply with the relevant regime, exposing clients to financial and legal consequences.
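The triage logic can be sketched as a hypothetical Python function. The field names and flags below are illustrative assumptions only; real classification requires case-by-case legal analysis of the asset's features:

```python
def applicable_framework(asset: dict) -> str:
    """Hypothetical triage sketch of MiCA-style exclusions: crypto assets
    already covered by existing EU financial law fall outside MiCA."""
    if asset.get("is_financial_instrument"):  # e.g. a transferable security
        return "MiFID II"
    if asset.get("is_deposit"):
        return "Deposit Guarantee Schemes Directive"
    if asset.get("is_securitisation_position"):
        return "Securitisation Regulation"
    return "MiCA"  # residual category: not covered by existing legislation

assert applicable_framework({"is_financial_instrument": True}) == "MiFID II"
assert applicable_framework({"name": "utility-token"}) == "MiCA"
```

The point of the sketch is the order of analysis: the exclusions are checked first, and MiCA applies only to what remains.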


Legal interpretations:

Determining the regulatory categorisation of a crypto asset requires a detailed examination of its specific features. For example, if a crypto asset exhibits the characteristics of a transferable security, it may be covered by MiFID II instead of MiCA. The European Securities and Markets Authority (ESMA) and the European Banking Authority (EBA) have provided guidance on the classification of crypto assets, emphasising the need for case-by-case assessment.


Accurately determining the regulatory status of a crypto asset requires an understanding of both its economic function and its technological characteristics. Misclassification can result in non-compliance, enforcement actions, and reputational damage.







Mistake 4: Ignoring AI Act requirements for high-risk AI systems

Regulation (EU) 2024/1689 of 13 June 2024 laying down harmonised rules on artificial intelligence (the Artificial Intelligence Act) regulates AI systems according to a risk-based approach, imposing stricter requirements on high-risk AI systems. Some professionals may mistakenly assume that adherence to the GDPR's data protection principles is sufficient for the deployment of AI systems, including those classified as high-risk under the AI Act.


The AI Act defines high-risk AI systems in Annex III, covering applications in areas such as critical infrastructure, education, employment, essential services, law enforcement, and migration. These systems are subject to specific obligations to ensure they do not pose unacceptable risks to fundamental rights, safety, or other public interests.


Why it is a mistake:

While GDPR compliance is important, it does not encompass all the requirements set forth by the AI Act for high-risk AI systems. The AI Act introduces additional obligations that address the unique challenges posed by AI technologies.

For high-risk AI systems, the AI Act mandates the implementation of a comprehensive risk management system (Article 9), which involves continuous iterative processes to identify, analyse, estimate, and evaluate risks associated with the AI system. Furthermore, Article 10 requires rigorous data governance practices, ensuring that training, validation, and testing datasets are relevant, representative, free of errors, and complete to the extent possible.
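The iterative identify-analyse-evaluate cycle that Article 9 describes can be sketched as a minimal, hypothetical risk register. The scoring scales and mitigation threshold below are illustrative assumptions, not requirements of the Act:

```python
from dataclasses import dataclass, field

@dataclass
class Risk:
    description: str
    likelihood: int  # 1 (rare) .. 5 (frequent) -- illustrative scale
    severity: int    # 1 (minor) .. 5 (critical) -- illustrative scale

    @property
    def level(self) -> int:
        return self.likelihood * self.severity

@dataclass
class RiskManagementSystem:
    """Hypothetical sketch of an Article 9-style iterative process:
    identify risks, evaluate them, then re-assess after mitigation."""
    risks: list = field(default_factory=list)

    def identify(self, risk: Risk) -> None:
        self.risks.append(risk)

    def requiring_mitigation(self, threshold: int = 8) -> list:
        # Evaluation step: flag risks above an (assumed) tolerance level.
        return [r for r in self.risks if r.level >= threshold]

rms = RiskManagementSystem()
rms.identify(Risk("biased shortlisting of job applicants", likelihood=3, severity=4))
rms.identify(Risk("minor UI latency", likelihood=2, severity=1))

flagged = rms.requiring_mitigation()
assert [r.description for r in flagged] == ["biased shortlisting of job applicants"]
```

In practice the cycle is continuous: after a mitigation measure is applied, the residual risk is re-scored and re-evaluated, and the register is maintained throughout the system's lifecycle.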


Additionally, providers of high-risk AI systems must draw up comprehensive technical documentation (Article 11) demonstrating compliance with the AI Act. This covers the system's architecture, its development process, and its intended purpose, alongside evaluations of its performance and safety.


Legal interpretations:

The AI Act represents a distinct regulatory framework that complements but does not duplicate the GDPR. It addresses broader concerns related to AI's impact on society, including ethical considerations, transparency, accountability, and human oversight.


Legal professionals must recognise that high-risk AI systems require a multidisciplinary approach to compliance, integrating legal, technical, and ethical perspectives. Failure to appreciate the specific obligations under the AI Act can result in inadequate risk management and non-compliance with EU law.






Mistake 5: Assuming that data stored on blockchain cannot comply with GDPR

There is a prevailing belief that blockchain technology fundamentally clashes with the GDPR, particularly with Article 17's right to erasure, commonly referred to as the "right to be forgotten". Because blockchains are decentralised and append-only, data recorded on them is, by design, permanent and unalterable. Certain legal experts therefore argue that blockchain-based solutions cannot satisfy Article 17, which grants individuals the right to request deletion of their personal data in specific scenarios, such as when the data is no longer necessary for its intended purpose or when the individual withdraws consent.


Why it is a mistake:

This conclusion fails to consider the technological solutions and legal interpretations that reconcile blockchain's features with GDPR requirements. While blockchain's immutability poses challenges, it does not render compliance impossible.


Technically, one approach is to store personal data off-chain while recording references or hashes of the data on-chain. This means that the personal data itself is not stored on the blockchain, and if a data subject exercises their right to erasure, the off-chain data can be deleted, rendering the on-chain reference meaningless.
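This off-chain pattern can be sketched in a few lines of Python: only a hash is written "on-chain", and deleting the off-chain record leaves the immutable reference pointing at nothing:

```python
import hashlib
from typing import Optional

# Personal data lives in a mutable off-chain store...
off_chain_store = {"subject-42": b"name=Alice;email=alice@example.com"}

# ...while only a hash of the record is written to the append-only chain.
chain = [hashlib.sha256(off_chain_store["subject-42"]).hexdigest()]

def resolve(reference: str) -> Optional[bytes]:
    # An on-chain reference is meaningful only while the off-chain
    # record it points to still exists.
    for data in off_chain_store.values():
        if hashlib.sha256(data).hexdigest() == reference:
            return data
    return None

assert resolve(chain[0]) is not None  # retrievable while the record exists

# Right to erasure: delete the off-chain record. The immutable on-chain
# hash remains, but it no longer resolves to any personal data.
del off_chain_store["subject-42"]
assert resolve(chain[0]) is None
```

One caveat: whether a bare hash of personal data is itself personal data is debated, so real designs often add a salt or keyed hash to prevent linkage by brute force.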


Another solution could be encrypting personal data before adding it to the blockchain. If erasure is requested, the encryption keys can be destroyed (a process known as crypto-shredding), making the data inaccessible and effectively erased from a practical standpoint.
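Crypto-shredding can be sketched as follows. The toy XOR stream cipher below is for illustration only; a real system would use a vetted authenticated cipher such as AES-GCM:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy XOR stream cipher (illustration only, NOT production crypto):
    # derive a keystream from the key and XOR it with the data.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

key = secrets.token_bytes(32)
ciphertext = keystream_xor(key, b"name=Alice")  # only ciphertext is recorded on-chain

# While the key exists, the controller can still read the data...
assert keystream_xor(key, ciphertext) == b"name=Alice"

# 'Crypto-shredding': destroy the key. The immutable on-chain ciphertext
# can no longer be decrypted, effecting erasure in practical terms.
key = None
```

Whether key destruction amounts to erasure in the legal sense remains debated, which is one reason EDPB guidance on blockchain is awaited.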


Legal interpretations:

Recital 26 of the GDPR states that "The principles of data protection should therefore not apply to anonymous information, namely information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable". This suggests that if data on a blockchain can be truly anonymised, it may fall outside the scope of the GDPR.


The tension between blockchain technology and GDPR compliance has been a subject of significant research and debate. A systematic literature review published in the Journal of Information Security and Applications highlights the ongoing discussions among researchers, practitioners, policymakers, and blockchain users about the legal compliance of blockchain systems.


The European Data Protection Board (EDPB) has recognised the need for guidance on this issue. In its 2023/2024 Work Programme, the EDPB included plans to develop guidelines on blockchain, indicating the importance of addressing GDPR compliance in this context.


A study commissioned by the European Parliament suggests that adherence to data protection standards can be used by data controllers to demonstrate compliance with their obligations under Article 24 of the GDPR. This implies that a nuanced approach, focusing on implementing appropriate technical and organisational measures, may be key to achieving GDPR compliance in blockchain systems.


In the current regulatory environment, understanding the technical subtleties of digital technologies is essential to sound legal reasoning. To ensure compliance with intricate frameworks such as eIDAS, GDPR, MiCA, and the AI Act, lawyers must work in tandem with IT specialists. By avoiding these common mistakes, organisations can protect themselves from legal exposure and promote a more successful integration of law and technology.




