სამართლის კულტურა

LEGAL CULTURE

ISSN 3088-4365
E ISSN 3088-4357
2026, N1(2)

FACIAL RECOGNITION TECHNOLOGIES IN GEORGIAN CRIMINAL PROCEDURE - ARTIFICIAL INTELLIGENCE, THE "BLACK BOX" AND THE RIGHT TO A FAIR TRIAL

Authors: Eka Khutsishvili, Nino Gvenetadze

DOI: https://doi.org/10.65454/lc/2026/2/51-76


Pages: 51-76

ABSTRACT

This article provides a legal analysis of the human rights risks associated with the use of Facial Recognition Technologies (FRT) in Georgian criminal procedure. It examines the legal consequences of evidence obtained through FRT, specifically in the context of systemic, large-scale monitoring of public (crowded) places during criminal proceedings. The study identifies gaps in the Law of Georgia "On Personal Data Protection", particularly concerning the mandatory requirements for the Data Protection Impact Assessment (DPIA) document. These gaps create risks of violating the rights to private life and to a fair trial. The research analyses the admissibility of evidence obtained by this method under Georgian criminal procedure as well as under the standards of Article 6 (Right to a Fair Trial) and Article 8 (Right to Respect for Private and Family Life) of the European Convention on Human Rights (ECHR). The research relies on the latest standards established by the Council of Europe and the European Union; the principles developed by the European Court of Human Rights (ECtHR); doctrinal approaches and standards established by US court practice and academic debate, such as the Frye/Daubert standards and the "Glass Box" versus "Black Box" doctrines concerning Artificial Intelligence, including FRT; and Georgia's current legislative framework. Although Georgia's personal data protection standards have improved under the updated legislation (the Law of Georgia "On Personal Data Protection" enacted in March 2024 brought them closer to the standards of the EU's General Data Protection Regulation (GDPR)), gaps remain. These flaws need to be addressed to ensure full protection of the rights to private life and to a fair trial.
Specifically, refining the requirements of the Data Protection Impact Assessment (DPIA) document and subsequently utilizing this document in criminal proceedings will play a crucial role in the comprehensive protection of the aforementioned rights.

Keywords: Facial Recognition Technology (FRT), Algorithmic Traceability, Data Protection Impact Assessment (DPIA), Admissibility of Evidence, Fair Trial


REFERENCES

LEGISLATION OF GEORGIA

Constitution of Georgia, 29/06/2020.

Law of Georgia "On Personal Data Protection" (12/11/2025).

Criminal Procedure Code of Georgia (Latest updated edition), 16/10/2025.

Order №21 of the Head of the Personal Data Protection Service, 28 February 2024.

LEGISLATION OF THE EUROPEAN UNION

Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 on harmonised rules on artificial intelligence (Artificial Intelligence Act) [2024] OJ L 168/1. https://eur-lex.europa.eu/eli/reg/2024/1689/oj accessed 7 December 2025.

INTERNATIONAL LEGAL INSTRUMENTS

Consultative Committee of the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (Convention 108), ‘Guidelines on facial recognition’ (Council of Europe, 2021). https://rm.coe.int accessed 7 December 2025.

Council of Europe, Committee of Ministers, Recommendation CM/Rec(2020)1 on the human rights impacts of algorithmic systems (adopted 8 April 2020). https://search.coe.int accessed 7 December 2025.

European Court of Human Rights, Guide on Article 6 of the European Convention on Human Rights (Criminal limb) (2024). https://ks.echr.coe.int accessed 7 December 2025.

OECD, ‘OECD Framework for the Classification of AI systems’ (OECD Digital Economy Papers №323, 2022). https://www.oecd.org accessed 7 December 2025.

OECD, Recommendation of the Council on Artificial Intelligence, C(2019)34/FINAL (adopted 22 May 2019). https://legalinstruments.oecd.org accessed 7 December 2025.

UN General Assembly, AI in judicial systems: promises and pitfalls, Report of the Special Rapporteur on the independence of judges and lawyers, Margaret Satterthwaite, A/80/169 (16 July 2025). https://docs.un.org accessed 7 December 2025.

United Nations Human Rights Council, The right to privacy in the digital age, A/HRC/54/21 (11 September 2023). https://undocs.org accessed 7 December 2025.

SCHOLARLY ARTICLES

‘Admitting Doubt: A New Standard for Scientific Evidence’ (2010) 123 Harv L Rev 2021. https://harvardlawreview.org accessed 7 December 2025.

Bernstein, D. E., Jackson, J. D. (2004). ‘The Daubert Trilogy in the States’. 44 Jurimetrics J 351. https://www.researchgate.net accessed 7 December 2025.

Burrell, J. (2016). ‘How the machine ‘thinks’: Understanding opacity in machine learning algorithms’. 3 Big Data & Society 1. https://www.researchgate.net accessed 7 December 2025.

Faigman, D. L., Slobogin, Ch., Monahan, J. (2016). ‘Gatekeeping Science: Using the Structure of Scientific Inference to Draw the Line Between Admissibility and Weight in Expert Testimony’. 110 Nw UL Rev 859. https://scholarlycommons.law.northwestern.edu accessed 7 December 2025.

Feigenson, N., Carney, B. (2025). ‘Generative AI as Courtroom Evidence: A Practical Guide’. 52 Mitchell Hamline L Rev 1. https://open.mitchellhamline.edu accessed 7 December 2025.

Kaminski, M. E., Urban, J. M. (2021). ‘The Right to Contest AI’. 121 Colum L Rev 1957. https://www.columbialawreview.org accessed 7 December 2025.

Limantė, A. (2024). ‘Bias in Facial Recognition Technologies Used by Law Enforcement: Understanding the Causes and Searching for a Way Out’. 42 (2) Nordic Journal of Human Rights 115. https://www.tandfonline.com accessed 7 December 2025.

Tracol, X. (2025). ‘The Use of Facial Recognition Technologies by Law Enforcement Authorities in the US and the EU: Towards a Convergence on Regulation?’ TechReg 289.

BOOKS, REPORTS, OTHER PUBLICATIONS

Advisory Committee on Evidence Rules, Agenda Book for Committee Meeting, April 19, 2024 (2024). https://www.uscourts.gov accessed 7 December 2025.

European Union Agency for Fundamental Rights, Facial Recognition Technology: Fundamental Rights Considerations in the Context of Law Enforcement (FRA, November 2019). https://fra.europa.eu accessed 7 December 2025.

Garrett, B. L., Rudin, C. (2023). ‘Right to a Glass Box: Explainability and Transparency in Criminal Justice Algorithms’ (SSRN Working Paper №4361462). https://ssrn.com accessed 7 December 2025.

Institute for Development of Freedom of Information (IDFI), Massive Surveillance of Protesters and the Inadequate Response of the Personal Data Protection Service. https://idfi.ge accessed 7 December 2025.

Scirica, A. J. (2020). ‘Preface: The Judges' Book’ in The Judges’ Book: Creating a Fairer, More Effective and More Responsive Judiciary, 1. https://repository.uclawsf.edu accessed 7 December 2025.

US Government Accountability Office, Biometric Identification Technologies: Considerations to Address Information Gaps and Other Stakeholder Concerns, GAO-24-106293 (April 2024). https://www.gao.gov accessed 5 December 2025.

JUDICIAL PRACTICE

JUDGMENTS OF THE EUROPEAN COURT OF HUMAN RIGHTS (ECtHR)

Al-Khawaja and Tahery v United Kingdom (2011) 54 EHRR 23.

Glukhin v Russia, App no 11519/20 (ECtHR, 4 July 2023).

DECISIONS OF THE COURTS OF THE UNITED STATES

Frye v United States 293 F 1013 (DC Cir 1923).

Daubert v Merrell Dow Pharmaceuticals 509 US 579 (1993).

General Electric Co v Joiner 522 US 136 (1997).

Kumho Tire Co v Carmichael 526 US 137 (1999).