Introduction
Artificial intelligence has confronted the Indian judiciary with a new dimension of an old problem: the sanctity of evidence now stands at a crossroads with the very technology that is supposed to make humans more effective. In a series of observations beginning in early 2024 and continuing into 2026, the Supreme Court of India expressed serious concern over a growing trend in which AI-generated and fabricated evidence is increasingly deployed, even in divorce and maintenance proceedings, to secure favourable outcomes in the courtroom.[1] Now that artificial intelligence has entered the home, the line between digital reality and synthetic fabrication has blurred, endangering the truth-seeking function of the family court system.
The classical matrimonial dispute has changed. No longer confined to oral testimony and physical documents, it has entered the realm of synthetic reality, where the complete digital destruction of a spouse's reputation, rather than justice, has become the object of desire.[2]
The Dimension of Synthetic Cruelty
Matrimonial conflicts are inherently emotional and personal. With the arrival of Generative AI (Gen-AI), however, these disputes have been transformed into digital battlefields.[3] Litigants are now using AI to manufacture synthetic histories. Unlike classical forgery, AI-generated media (commonly known as deepfakes) can sound, look, and write like a spouse with dreadful precision.
Under the Hindu Marriage Act, 1955, and the Special Marriage Act, 1954, cruelty is one of the principal grounds for divorce. Litigants are now fabricating audio recordings in which a spouse supposedly admits to adultery or uses abusive language, thereby meeting the high evidentiary burden of proving "mental cruelty" through fraudulently created audio evidence.[4]
The Gendered Nature of Digital Harassment
One crucial aspect largely ignored in the conversation surrounding deepfakes is that the technology is disproportionately weaponised against women in contested matrimonial cases. A new form of coercion has emerged in divorce and separation proceedings: AI-generated "non-consensual intimate imagery" (NCII), often produced through so-called undressing apps (Farnham 2021).
The psychological impact is immediate when a spouse produces such images in a sealed cover before a Family Court judge. Although forensic analysis may later establish that the evidence is fake, the stigma of the digital footprint is often used to coerce the targeted spouse into an unfavourable settlement on alimony or child custody.[5] This is prompting courts to reconsider how interim evidence is evaluated in the age of AI.
The Digital Forensics Gap in the BSA, 2023
The Bharatiya Sakshya Adhiniyam, 2023 (BSA) has broadened the definition of "document" to treat electronic records as an integral part of that concept. Section 63 of the BSA governs the admissibility of electronic records and is concerned with the integrity of the medium.[6]
Nevertheless, a major legal gap remains. The BSA relies heavily on certificates to demonstrate the authenticity of electronic evidence. This produces a verification paradox:
- Using AI, a litigant generates a fake audio file.
- The litigant then prepares a valid Section 63 certificate stating that the file was stored on his or her phone.
- The court accepts the certificate as proof that the file came from that source, yet the law provides no mechanism to verify the authenticity of the content contained in that source.[7]
The existing safeguard of verifying hash values is largely ineffective against AI: a deepfake file carries a perfectly valid hash, because the problem is not that the file was modified after it was created but that it was created by AI in the first place.
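The point can be illustrated in a few lines of Python. A hash digest fixes a file's bytes as they were at the moment of hashing, so a file that was synthetic from its creation passes the integrity check exactly as a genuine one does. This is a minimal sketch; the byte strings below are placeholders, not real audio.

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Digest used to certify that a file is unchanged since it was hashed."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical byte contents of two audio files (placeholders, not real audio).
genuine_audio = b"waveform bytes captured by a real phone recording"
deepfake_audio = b"waveform bytes synthesised by a voice-cloning model"

# Both files produce perfectly valid, reproducible digests.
genuine_hash = sha256_of(genuine_audio)
fake_hash = sha256_of(deepfake_audio)

# The digest detects only post-creation tampering: re-hashing the unmodified
# deepfake matches its original digest, so the check reveals nothing about origin.
assert sha256_of(deepfake_audio) == fake_hash
```

In other words, a hash answers "is this the same file that was certified?", never "is this file an authentic recording?", which is precisely the gap the certificate regime leaves open.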
The Constitutional Issue: Right to Privacy vs. Right to Fair Trial
Article 21 of the Constitution of India is directly engaged by the use of AI-generated evidence. There can be no fair trial when the evidence used to determine a person's life, liberty, or marital status has been synthesised by neural networks.[8] On the other hand, if courts were to require a forensic audit of every WhatsApp chat presented in a divorce case, the result would be an unprecedented invasion of privacy and an intolerable additional burden on an already overstretched court system.
In K.S. Puttaswamy v. Union of India, the right to define one's digital identity was recognised as a facet of privacy. Submitting a deepfake is not merely perjury; it is identity theft in the courtroom.[9]
Suggestions for Future Legal Research
To ensure that justice is not subverted, the following solutions are proposed for implementation and further study:
- Mandatory AI-Forensic Preliminary Hearings: Just as courts hold in-camera hearings on sensitive matters, "Digital Authenticity Hearings" should be introduced at the interim stage, at all levels of the judiciary, to sift out fake media.[10]
- Reverse Onus for Synthetic Evidence: Once a party makes out a prima facie case that an electronic record was generated by AI, the burden should shift to the submitter to demonstrate, through metadata transparency in the native file format, that the record is not AI-generated.[11]
- Strict Liability under the BNS: Sections 227-230 of the Bharatiya Nyaya Sanhita (BNS), which deal with false evidence, should be construed to cover "synthetic perjury". The punishment for tendering AI-generated evidence in court should be considerably harsher than that for traditional forgery, given its potential for mass dissemination and harm to personality.[12]
- The Watermark Mandate: Legal researchers should explore a "Judicial Mandate" under which all digital evidence presented in matrimonial courts is screened by standard AI-detection software, much as anti-plagiarism software is used in academia.[13]
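As a minimal sketch of what the intake step of such a screening procedure might record, the snippet below logs a tamper-evident fingerprint and the filesystem metadata available for each tendered exhibit. All names are illustrative assumptions; the BSA prescribes no such procedure, and, as the comments note, none of these fields can by themselves reveal synthetic origin.

```python
import hashlib
import os
import time

def intake_exhibit(path: str) -> dict:
    """Log the digest and filesystem metadata of a tendered electronic exhibit
    (a hypothetical first-pass record for a "Digital Authenticity Hearing")."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    stat = os.stat(path)
    return {
        "sha256": digest,                       # fixes the file as tendered
        "size_bytes": stat.st_size,
        "modified": time.ctime(stat.st_mtime),  # when the file was last written
        # Note: none of these fields show whether the content was AI-generated;
        # that question requires separate forensic or AI-detection analysis.
    }
```

Such a record would at least fix the exhibit in time and make later substitution detectable, while leaving the question of synthetic origin to the forensic stage.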
Conclusion
The emergence of deepfakes in matrimonial litigation is an eye-opener for the Indian legal fraternity. Technology is efficient, but in the wrong hands it becomes a tool of intimidation for a disgruntled litigant. The Supreme Court's caution that litigation is not about teaching the opponent a lesson is more timely than ever.[14] As researchers, we must ensure that the Bharatiya Sakshya Adhiniyam is not merely a cyber version of an ancient law but a robust defence against the hallucinations of artificial intelligence. The scales of justice cannot rest on the pixels of a deepfake.
Author(s) Name: Siddharth Singh Dangi (Symbiosis Law School, Pune)
References:
[1] Neha Lal v. 2024 INSC (Supreme Court of India addressing the tendency toward malicious litigation in matrimonial disputes).
[2] "Digital Characterization of Performances: AI and Below Family Law", Journal of Indian Legal Theory, Vol. 4, Iss. 2 (2025).
[3] Gupta, Rao, and Zacharaj (2018) (reviewing existing research on AI and ICT and their applications in this field).
[4] V. Bhagat v. D. Bhagat, (1994) 1 SCC 337 (elaborating the standard of mental cruelty, now applied to the digital sphere).
[5] "Facilitating Gendered Technologies: The Gendered Effect of AI-Generated Media in Divorce", International Journal of Law and Society, 2024.
[6] Bharatiya Sakshya Adhiniyam, 2023 (BSA), Section 63.
[7] Dr. Vivek Dubey, "Admissibility of Electronic Evidence under the Bharatiya Sakshya Adhiniyam", Indian Law Journal, 2024.
[8] Anvar P.V. v. P.K. Basheer, (2014) 10 SCC 473 (the forerunner of electronic evidence standards in India).
[9] Justice K.S. Puttaswamy (Retd.) v. Union of India, (2017) 10 SCC 1.
[10] Law Commission of India, "Digital Evidence Procedural Reforms" (Working Paper), 2025.
[11] Bharatiya Sakshya Adhiniyam, 2023, Section 63(4) (requirement of certificate).
[12] Sections 227-230, Bharatiya Nyaya Sanhita, 2023 (relating to giving false evidence).
[13] Sections 227-230, Bharatiya Nyaya Sanhita, 2023 (relating to giving false evidence).
[14] Supreme Court of India, oral observation dated 16 January 2024, cautioning against the use of false allegations to "teach a lesson" in matrimonial cases.

