AI Law - International Review of Artificial Intelligence Law
CC BY-NC-SA Licence | ISSN 3035-5451
G. Giappichelli Editore

28/04/2026 - AI Forensics and Indian Law: Bridging the ‘Black Box’ with Bharatiya Sakshya Adhiniyam (India)

Topic: News - Legal Technology

Source: LiveLaw

AI-driven tools now perform pattern recognition across forensic domains: untangling complex DNA mixtures, detecting microscopic ballistics striations, automating handwriting comparison, recovering deleted data, identifying malware and tracing blockchain transactions. Their outputs are increasingly presented to courts as machine-generated proof. Under Indian law, the admissibility of such outputs falls within the framework that has shifted from the Indian Evidence Act (IEA) to the Bharatiya Sakshya Adhiniyam (BSA), in force since July 2024; yet continued reliance on Section 65B(4) IEA certification (now Section 63(4) BSA) and legacy precedents such as Anvar P.V. and Arjun Panditrao Khotkar renders the existing regime a "straitjacket" for dynamic AI outputs. The principal legal difficulty is the "black-box" problem: opaque algorithmic logic creates an analytical gap between raw inputs and AI conclusions, prompting courts to require judicial re-verification or independent human expert validation before admitting such evidence.

Comparative approaches show a range of filters: US judges apply gatekeeping standards focused on testability, peer review and error rates; the EU's 2024 AI Act treats forensic systems as high-risk, requiring human oversight and transparency; China deploys numerous AI-assist systems for judges; and Colombia issued UNESCO-backed guidance in December 2024. Legal scholars recommend statutory reform during the BSA transition to define machine-generated proof; to mandate disclosure of system architecture, training data and biases; and to amend provisions (a proposed reference to S.45 IEA/S.39 BSA) so that validated algorithmic systems are recognised as expert sources where reliability standards are met. Complementary measures urged include improving technical literacy among judges and lawyers, ensuring equality of arms through defence access to technical experts, guarding against automation bias, and creating independent oversight bodies or national registries to audit and validate forensic tools, shifting from black-box certification to a glass-box model that reveals design, datasets and error rates.