As the use of facial recognition expands, so does the need to reflect on the ethical and legal boundaries that must guide this technology. When the face becomes a password, how can the right to privacy be ensured?
In the iGaming sector, the player’s face has become the key to the vault. A fast biometric verification means a new depositor, while a slow or insecure process can result in lost customers and operational risks.
Using this technology without proper governance not only drives away the best players but may also jeopardize the company’s license itself.
Facial biometrics and the LGPD: what the law says about sensitive data
Under Brazil’s General Data Protection Law (LGPD), biometrics are classified as sensitive personal data (Article 11), requiring extra care regarding collection, processing, and storage.
This type of information, if misused or leaked, can compromise an individual’s integrity and dignity.
What the LGPD requires for biometric data processing:
* Valid legal basis (such as fraud prevention and data subject security)
* Informed and specific consent
* Transparency about purposes and risks
* Adequate technical protection measures
However, beyond legal compliance, the real challenge lies in ensuring that individuals are genuinely aware of the value and risks involved in providing such data.
User awareness of biometric data
This awareness involves an educational and cultural process: understanding that biometric data are not mere digital credentials but extensions of human identity itself.
In times of artificial intelligence and algorithmic surveillance, transparency in the use of these technologies is a concrete expression of respect for personal autonomy.
AI governance and corporate accountability in the use of biometrics
In corporate environments, responsibility goes beyond legal compliance. Ethics must be embedded in data and AI governance, aligned with the accountability principles set forth by the Brazilian Institute of Corporate Governance (IBGC).
Thus, accountability means that the company must not only comply with the law but also demonstrate—through documented and auditable evidence—that it takes effective measures to:
* Prevent harm to data subjects
* Avoid algorithmic bias
* Combat discrimination in AI systems
* Ensure auditability and traceability
Facial recognition: power, trust, and social control
Ultimately, the debate around facial recognition is a reflection on trust and power.
Whoever holds the ability to identify a person in milliseconds also holds a significant degree of social and economic control.
For that reason, it is essential that companies and governments advance not only in technological efficiency but also in:
* Independent oversight mechanisms
* Transparency about use and purposes
* Documented accountability
* Clear limits on application
The future of digital identity: security vs. freedom
The future of digital identity will depend on how we manage to balance the security–freedom equation.
Two possible scenarios emerge:
* Well-governed biometrics: strengthen digital trust and reduce fraud ethically and sustainably.
* Unrestricted biometrics: risk turning the human body into exploitable data.
At this crossroads, AI governance, the LGPD, and corporate accountability principles converge—to ensure that technology serves humanity, not the other way around.
Compliance practices in facial recognition: from theory to implementation
Implementing facial recognition systems requires a control architecture that goes beyond formal legal compliance.
In practice, this means building successive layers of protection to address specific risks at each stage of the biometric data lifecycle.
Essential technical controls:
* Continuous algorithmic audits: systematic review of models to identify bias, false positives/negatives, and performance degradation over time
* Data Protection Impact Assessments (DPIAs): prior risk mapping before implementing new features or expanding use cases
* Ongoing model monitoring: real-time observability of algorithm behavior in production
* Data minimization: collection strictly limited to the declared purpose, avoiding unnecessary data accumulation
* Advanced encryption: protection of biometric data at rest and in transit, with strict key management
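The first control above, continuous algorithmic audits, can be made concrete with a small check on error-rate disparity across demographic groups. The sketch below is illustrative, not any vendor's actual audit tooling: the function name, the 10-percentage-point gap threshold, and the group labels are assumptions chosen for the example.

```python
from typing import Dict, List, Tuple

def audit_error_rates(
    results: Dict[str, List[Tuple[bool, bool]]],
    max_gap: float = 0.10,  # assumed tolerance; a real audit sets this per policy
):
    """Given (predicted_match, actual_match) pairs per demographic group,
    compute false-positive and false-negative rates for each group and
    flag any metric whose spread across groups exceeds max_gap."""
    metrics = {}
    for group, pairs in results.items():
        fp = sum(1 for pred, actual in pairs if pred and not actual)
        fn = sum(1 for pred, actual in pairs if actual and not pred)
        neg = sum(1 for _, actual in pairs if not actual)
        pos = sum(1 for _, actual in pairs if actual)
        metrics[group] = {
            "fpr": fp / neg if neg else 0.0,
            "fnr": fn / pos if pos else 0.0,
        }
    flags = []
    for rate in ("fpr", "fnr"):
        values = [m[rate] for m in metrics.values()]
        if max(values) - min(values) > max_gap:
            flags.append(rate)  # disparity beyond tolerance: escalate for review
    return metrics, flags

# Synthetic example: group_b is rejected far more often than group_a.
results = {
    "group_a": [(True, True)] * 9 + [(False, True)] + [(False, False)] * 10,
    "group_b": [(True, True)] * 6 + [(False, True)] * 4 + [(False, False)] * 10,
}
metrics, flags = audit_error_rates(results)
print(flags)        # the false-negative gap (0.4 vs 0.1) exceeds the tolerance
```

Run periodically against production match logs, a check like this turns the abstract duty to "avoid algorithmic bias" into a documented, auditable measurement.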
Third-party governance and proportionality:
* Vendor due diligence: verifying that technology partners uphold the same ethical and technical standards
* Specificity of purpose: each use of biometrics must have a documented technical and legal justification
* Defined retention cycles: biometric data cannot be stored indefinitely; retention periods must be proportionate to purpose
* Periodic proportionality reviews: continuous evaluation to confirm necessity and explore less invasive alternatives
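Two of the points above, specificity of purpose and defined retention cycles, can be enforced together: a record with no documented purpose has no legal basis to be kept at all. The sketch below is a minimal illustration; the purposes, retention periods, and names are hypothetical, and real values must come from the company's DPIA and legal review.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import List, Optional

# Hypothetical retention periods per declared purpose (illustrative only).
RETENTION = {
    "kyc_onboarding": timedelta(days=365 * 5),
    "fraud_prevention": timedelta(days=365),
}

@dataclass
class BiometricRecord:
    subject_id: str
    purpose: str
    collected_at: datetime

def is_expired(record: BiometricRecord, now: Optional[datetime] = None) -> bool:
    now = now or datetime.now(timezone.utc)
    period = RETENTION.get(record.purpose)
    if period is None:
        # Undocumented purpose: no basis to retain, so treat as expired.
        return True
    return now - record.collected_at > period

def purge_expired(records: List[BiometricRecord], now: Optional[datetime] = None):
    """Partition records into (kept, purged); purged IDs should be logged
    to the audit trail before secure deletion."""
    kept, purged = [], []
    for r in records:
        (purged if is_expired(r, now) else kept).append(r)
    return kept, purged

now = datetime(2025, 1, 1, tzinfo=timezone.utc)
records = [
    BiometricRecord("u1", "fraud_prevention", datetime(2023, 1, 1, tzinfo=timezone.utc)),
    BiometricRecord("u2", "fraud_prevention", datetime(2024, 12, 1, tzinfo=timezone.utc)),
    BiometricRecord("u3", "marketing", datetime(2024, 12, 1, tzinfo=timezone.utc)),
]
kept, purged = purge_expired(records, now)
print([r.subject_id for r in purged])  # u1 is past its period; u3 has no documented purpose
```

Keeping the purpose-to-period mapping in code (or configuration) makes the retention policy itself reviewable, which is exactly what the periodic proportionality reviews above require.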
In the end, effective biometric governance is not a fixed state but a continuous process of adjustment between technical capability, regulatory demands, and respect for fundamental rights.
Thomas Hannickel
DPO at Legitimuz and finalist for Compliance On Top 2025 in the Privacy and Data Protection category
Legitimuz is a Brazilian company specializing in identity verification, KYC (Know Your Customer) solutions, Advanced Geolocation, and AML (Anti–Money Laundering) for the regulated market. All technologies mentioned in this text—including document verification via OCR, facial recognition with liveness detection, transaction monitoring, and geolocation—are part of the company’s portfolio, always focused on regulatory compliance and operational efficiency.