Martin Felsky

Senior Counsel

April 28, 2026

 

The long-standing view that electronic discovery and disclosure can be treated as purely procedural or technical exercises, distinct from the law of evidence, is no longer tenable. That position was already difficult to sustain in an era of emails and databases; it cannot survive the age of synthetic media.

 

E-discovery (and here I use the term broadly to include disclosure in criminal and regulatory procedure) is about shaping the evidentiary record. What is preserved, collected, filtered, and produced defines what may ultimately be tendered in court. A small but critical subset of that material will be relied upon as evidence. It must therefore withstand scrutiny under the rules of admissibility, including authentication, reliability, and integrity.

 

The emergence of deepfakes, whether wholly fabricated audiovisual content or subtly manipulated authentic recordings, introduces a qualitative shift in this relationship. It is no longer sufficient for litigators to assume that audiovisual files are what they appear to be. Nor is it sufficient to defer concerns about authenticity to the eve of trial. The risk must be addressed upstream, within the e-discovery process itself.

 

The Expanding Risk Profile of Audiovisual Evidence

Deepfakes are not limited to sophisticated, costly fabrications. High-quality synthetic audio and video can now be generated or altered using widely available tools. The risks manifest in at least three distinct ways:

 

  • Fabricated content: Recordings are created entirely by generative AI prompts, with no real-world basis.
  • Deliberately altered authentic content: Genuine recordings are modified with the intent to mislead.
  • Indirect AI influence: Recordings are real but have been processed, enhanced, translated, or summarized using AI tools in ways that affect their integrity.

Though these are not necessarily equivalent risks, they share a common implication: multimedia data can no longer be presumed to be inherently reliable. This aligns with a broader judicial caution regarding AI. Canadian courts have been emphasizing the need for awareness, verification, and careful handling of AI-related risks, particularly where sensitive or consequential information is involved.

Implications for E-Discovery Practice

The deepfake phenomenon requires litigators to rethink several foundational aspects of e-discovery. The objective is not to transform counsel into forensic experts, but to ensure that reasonable, defensible steps are taken to preserve evidentiary integrity within a documented chain of custody. This could include the following processes:

 

1. Early Identification of Risk

At the outset of a matter, counsel should explicitly consider whether audiovisual evidence is likely to be relevant and, if so, whether its authenticity may be contested. This assessment should inform:

 

  • The scope and methods of preservation
  • The level of scrutiny applied during collection and review
  • The need for expert involvement

In appropriate cases, authenticity should be treated as a live issue from the beginning.

 

2. Preservation of Original Files and Metadata

The integrity of audiovisual evidence is closely tied to its provenance. Counsel should take steps to ensure that:

 

  • Original files are preserved in native format wherever possible
  • Metadata (creation date, device information, encoding details) is retained and not altered through processing
  • Any transformations (compression, conversion, transcription) are documented and reproducible

The routine conversion of video files into standardized formats for convenience may, in some cases, compromise the ability to assess authenticity. Even where more convenient formats are produced, the original files should remain securely preserved and available for inspection.
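One practical way to make "retained and not altered" verifiable is to record a cryptographic digest of each original at collection time, so any later copy or conversion can be checked against it. The sketch below is illustrative only (the function names and manifest structure are my own, not a standard protocol):

```python
import hashlib
from pathlib import Path


def file_sha256(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def build_manifest(paths):
    """Record a digest for each preserved original at collection time."""
    return {str(p): file_sha256(p) for p in paths}


def still_intact(path: Path, manifest: dict) -> bool:
    """True if the file on disk still matches the digest recorded at collection."""
    return manifest.get(str(path)) == file_sha256(path)
```

A manifest built this way, produced alongside the native files, lets any party (or the court) confirm that an original has not changed since collection without relying on anyone's say-so.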

 

3. Documenting Chain of Custody

Chain of custody, too often treated as a formality in civil litigation, takes on renewed importance. For audiovisual evidence, counsel should be able to answer:

 

  • Where did the file originate?
  • Who had access to it, and when?
  • What systems or tools processed it?
  • Were any edits, enhancements, or transformations applied?

A clear and documented chain of custody strengthens both admissibility and credibility.

4. Managing AI-Processed Content

Increasingly, audiovisual evidence is not presented in its raw form. It may be:

 

  • Transcribed using speech-to-text tools
  • Translated across languages
  • Enhanced for clarity (e.g., noise reduction, image sharpening, colour grading)
  • Summarized or excerpted

Each of these processes may involve AI. Counsel should treat such outputs as derivative evidence and ensure that:

 

  • The underlying original is preserved and available, or its unavailability is documented
  • The tools used are identified and understood at a basic level
  • Any limitations or risks associated with the processing are disclosed

Uploading sensitive recordings to third-party AI tools also raises confidentiality and security concerns, which must be carefully managed.
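Treating AI outputs as derivative evidence is easier to defend if each processing step is logged in a structured, reproducible form that ties the derivative back to the preserved original. A minimal sketch of such a log entry follows; every field name here is illustrative, not an established schema:

```python
import json
from datetime import datetime, timezone


def record_processing_step(original_sha256: str, derivative_sha256: str,
                           tool: str, parameters: dict, operator: str) -> dict:
    """Return one processing-log entry tying a derivative file (a transcript,
    an enhanced copy, a translation) back to its preserved original."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "original_sha256": original_sha256,      # digest of the untouched source file
        "derivative_sha256": derivative_sha256,  # digest of the processed output
        "tool": tool,                            # tool name and version used
        "parameters": parameters,                # settings needed to reproduce the step
        "operator": operator,                    # who ran the processing
    }


# Entries serialize cleanly, so the log can accompany a production set:
entry = record_processing_step("a" * 64, "b" * 64,
                               "speech-to-text demo v1",
                               {"language": "en"}, "reviewer-01")
log_line = json.dumps(entry)
```

The point is not the particular format but the discipline: for every derivative, counsel can identify the source, the tool, the settings, and the person responsible.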

5. Proactive Use of Forensic Expertise

Where authenticity is likely to be contested, counsel should consider engaging forensic experts earlier in the process. This may include:

 

  • Assessing whether a recording shows signs of manipulation
  • Validating metadata and file structure
  • Advising on preservation and collection protocols

Waiting until trial to address these issues may be both inefficient and strategically disadvantageous.

6. Review Protocols That Reflect Authenticity Concerns

Traditional document review focuses on relevance and privilege. For audiovisual material, review protocols may need to expand to include:

 

  • Indicators of manipulation (visual artifacts, inconsistencies in audio or lighting)
  • Discrepancies between metadata and apparent content
  • Contextual anomalies (e.g., timing, location, or participants)

Review teams should be sensitized to these issues, even if final determinations are left to experts.

7. Transparency and Disclosure

Finally, counsel should be prepared to address authenticity transparently. This may include:

 

  • Disclosing known issues or uncertainties
  • Identifying any AI tools used in processing evidence
  • Providing access to original files and supporting metadata

In an environment of increasing skepticism, credibility may depend as much on candour as on technical proof.

A Shift in Professional Responsibility

Deepfakes do not simply introduce a new category of evidence: they challenge underlying assumptions about the nature of evidence itself.

 

For e-discovery counsel, this reinforces a principle that has always been present but not always fully appreciated: the discovery process is not neutral. It is an integral part of the evidentiary pipeline. Decisions made at this stage can either preserve or undermine the ability of a court to determine the truth.

 

Litigators must adopt practices that are proportionate, defensible, and aligned with the realities of modern technology. This includes recognizing that audiovisual evidence may now require careful scrutiny and a disciplined approach.

 

 
