Heuristica Discovery Counsel


February 6, 2023


Q: Is it really possible to accurately review close to a million emails over a weekend?

A: It depends.


In connection with an ongoing controversy about alleged inappropriate emails sent from the Premier’s office to the provincial prosecutorial service (ACPS), the Alberta government recently confirmed that it had identified emails from more than 900 email accounts and caused almost one million emails to be searched by provincial government employees in an effort to establish that the alleged emails had not been sent. All of this was said to have occurred within three or four days of the Premier’s direction that the review be conducted. No evidence was found that the alleged emails were sent. Immediately following the review, the Premier called on the Canadian Broadcasting Corporation to retract a story alleging that the emails had been sent and to issue an apology.


The Alberta government has provided little, if any, meaningful detail on the mechanics of its review, and it is unknown whether a document review platform with sophisticated search tools was employed. The very short time it took to plan, set up and complete the review gives rise to several tentative observations (albeit rebuttable in the face of solid detail as to how Alberta’s review was completed). For example, given the objective, the limited search parameters and the exceptional time constraints, the email accounts may not have been collected and/or reviewed in a defensible manner. If one were to guess from the limited information available, the emails of the 900+ persons may simply have been searched using Office 365, based only on a fixed list of email addresses and date ranges. The emails so identified would likely then have been searched using only “keywords”, without employing the other powerful analytics functions found in state-of-the-art document review platforms. Those analytics include machine learning, email threading, conceptual searching, concept clustering, keyword expansion and sentiment analysis, among others.


The government’s approach to this review thus raises a number of questions, including these:


  • Was a dedicated document review platform used in the review?
  • If so, precisely how much time was needed to collect the contents of 900+ email accounts, process those results and create a search index before the results could be reviewed?
  • If so, what review platform analytics were applied, if any?
  • Was any form of “machine learning” used in the email review?
  • What was the precise methodology used to search for the emails?
  • How many people were assigned to the review? What document review experience or qualifications, if any, do they have?
  • Was an external document review firm involved in connection with the review and its planning/execution?
  • Were emails from non-government/ACPS domains searched?
  • Were any non-email documents created within the ACPS respecting the receipt of the allegedly sent emails that might confirm them as having been sent?
  • Have the emails that were subject to the search and review now been preserved?
  • Did the email retention practices of the Premier’s office and the ACPS comply with provincial government requirements?
  • Were the emails of ACPS members or senior management other than the “relevant prosecutors” searched, i.e., those who might be expected to make a decision on behalf of the ACPS in connection with a possible response as to the propriety of the allegedly sent emails?

In the absence of a precise explanation as to how the Government’s email review was conducted, there can only be pointed questions and conjecture. So, rather than speculate, let’s look at how evidence management professionals, using a purpose-built review platform such as RelativityOne, might approach an assignment that seeks to determine whether any improper email communication occurred between the Premier’s office and members of the ACPS.


As a first step, thorough identification, preservation, and collection of potentially relevant emails is critical. For example, including ACPS senior management, other than the “relevant prosecutors”, as email custodians would clearly be appropriate in the circumstances. Any routine document destruction practices should be immediately suspended, and a “legal hold” should be implemented to ensure relevant documents are not inadvertently deleted. Further, best practice would be to collect the identified custodians’ entire mailboxes (e.g., as PST files), limiting the collection by date range but by no other criteria. Any further searching (beyond date range) should be done within the review platform once all the documents have been properly indexed.
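To make the date-range-only limit concrete, here is a minimal sketch in Python, with invented message records standing in for the output of a real collection tool (an actual collection would export full mailboxes, such as PST files). The only culling applied is the sent-date window; everything else is left to the indexed review platform:

```python
from datetime import date

# Hypothetical records standing in for a collection tool's output.
messages = [
    {"id": 1, "sent": date(2022, 9, 15), "subject": "Briefing note"},
    {"id": 2, "sent": date(2023, 1, 20), "subject": "Status update"},
]

def within_range(msg, start, end):
    """True if the message falls inside the collection window."""
    return start <= msg["sent"] <= end

# Limit only by date range; every in-window message goes on for indexing.
window = [m for m in messages
          if within_range(m, date(2022, 10, 1), date(2023, 1, 31))]
```

The point of the design is defensibility: because no subject, sender, or keyword filter is applied at collection time, later searches can always be re-run or broadened against the full indexed set.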


RelativityOne has robust search capabilities to efficiently and defensibly identify potentially relevant documents. A simple search of senders and recipients would likely not be sufficient to identify potentially relevant documents. For example, if the original emails between the Premier’s Office and the ACPS were deleted, but those original emails were forwarded internally within the ACPS (as was likely the case had the emails been sent), the email forward evidencing the original communication would not be identified by a search only for emails between the two offices (as the email sender and recipient would both be ACPS employees). Similarly, if personal email addresses were used, searching for emails between the two offices would also not identify the emails in question.
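The gap described above can be shown with a small sketch (Python, with entirely invented addresses and domains). A search for mail exchanged *between* the two domains finds only the original message, while a content search of message bodies also surfaces the internal forward whose quoted text evidences the original communication:

```python
# Hypothetical messages: an original cross-office email, and an
# internal ACPS forward that quotes it.
messages = [
    {"frm": "aide@premier.example.ca", "to": "prosecutor@acps.example.ca",
     "body": "Original message"},
    {"frm": "prosecutor@acps.example.ca", "to": "colleague@acps.example.ca",
     "body": "FW: Original message\n> From: aide@premier.example.ca ..."},
]

def between_domains(msg, d1, d2):
    """True only if sender and recipient sit in different named domains."""
    return ((msg["frm"].endswith(d1) and msg["to"].endswith(d2)) or
            (msg["frm"].endswith(d2) and msg["to"].endswith(d1)))

# Sender/recipient search: finds only the original cross-office email.
cross = [m for m in messages
         if between_domains(m, "premier.example.ca", "acps.example.ca")]

# Content search: the internal forward is caught because its quoted
# body still mentions the other office's domain.
mentions = [m for m in messages if "premier.example.ca" in m["body"]]
```

If the original were deleted, the sender/recipient search would return nothing at all, while the content search would still recover the forward.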


Rather than running a simple search of senders and recipients, additional searches focused on the content of the emails would need to be conducted. Such additional searches could include full text searches for relevant keywords such as “Coutts”, “COVID”, the names of the accused, the relevant court file numbers, etc., as well as the relevant email domains.


In addition to basic keyword searching, RelativityOne’s “keyword expansion” feature could be used to identify synonyms, nicknames or codewords that may have been used to describe similar concepts (e.g., “protestor” or “demonstrator”). Moreover, conceptual searches can be employed to identify potentially relevant documents without requiring a precisely worded keyword query. While keyword searching is literal and only returns exact matches, RelativityOne’s concept searching feature attempts to understand the meaning and context of terms to identify conceptual matches. This is another way to identify potentially relevant documents that may contain typos, synonyms, or other keywords that were not initially identified.
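The mechanics of an expanded keyword search can be sketched as follows (Python). Here the synonym list is hand-built purely for illustration; in a review platform, a keyword-expansion feature would generate candidate terms from the indexed documents themselves:

```python
import re

# Hypothetical expansion of a seed term into related terms.
expansions = {
    "protestor": ["protestor", "protester", "demonstrator", "blockader"],
}

docs = [
    "The demonstrators at the border were charged.",
    "Routine scheduling email, nothing relevant.",
]

def hits(doc, terms):
    """True if any expanded term (or a simple inflection of it) appears."""
    return any(re.search(rf"\b{t}\w*\b", doc, re.IGNORECASE) for t in terms)

matched = [d for d in docs if hits(d, expansions["protestor"])]
```

A literal search for “protestor” alone would miss the first document; the expanded set catches “demonstrators” via the related term and the trailing wildcard.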


If a large number of documents were returned by the above searches, the resulting set could be prioritized for review using a new analytic tool called Sentiment Analysis. This application can assist with finding the “needle in the haystack” more quickly by using artificial intelligence to identify documents that likely contain negativity, anger, desire, or other highly charged interactions between email participants.
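As a rough illustration of prioritization by emotional charge, here is a toy lexicon-based scorer in Python. It stands in for the platform’s AI-driven sentiment model, which is far more sophisticated; the word list and sample texts are invented:

```python
# Toy lexicon of "charged" words; a real sentiment model learns these signals.
CHARGED = {"angry", "outrageous", "demand", "unacceptable", "immediately"}

def charge_score(text):
    """Count charged words, ignoring case and trailing punctuation."""
    return sum(1 for w in text.lower().split()
               if w.strip(".,!?") in CHARGED)

queue = [
    "Please find attached the quarterly schedule.",
    "This is unacceptable and I demand it be fixed immediately!",
]

# Most highly charged documents rise to the front of the review queue.
prioritized = sorted(queue, key=charge_score, reverse=True)
```

The effect is the same as described above: reviewers see the documents most likely to contain heated exchanges first, instead of working through the set in arbitrary order.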


The discussion above is general in nature and not exhaustive. As is evident, there are a number of important unanswered questions as to the scope and methodology of the review undertaken by the Alberta government. To properly assess its accuracy and comprehensiveness, further details would need to be provided.