Computer-Assisted Coding: Notes from U.S. Magistrate Judge Andrew Peck

Friday, October 7, 2011 by Thought Leadership Team

On October 1, 2011, Law Technology News published an article from United States Magistrate Judge Andrew Peck titled “Search, Forward: Time for Computer-Assisted Coding.” In the article, Judge Peck explored the use of computer-assisted coding in document review while discussing the perceived “judicial endorsement” of keyword searching. A common theme throughout his article, echoing comments made during the Carmel Valley eDiscovery Retreat, is the question of why lawyers seem insistent on receiving a judicial blessing of this technology before using it. Indeed, this was a prevalent question and theme throughout the “What is it and how is it being leveraged?” session at the national Masters Conference that took place in Washington, D.C. this week. Some attendees noted that they were uncomfortable using this technology without agreement from opposing counsel, and that they might not use it at all until a case certified its use.

To address the issue of judicial endorsement, Judge Peck provided a thorough analysis of the problems inherent in keyword searching, citing several of the well-known opinions on the issue, including United States v. O’Keefe, Equity Analytics, LLC v. Lundin, Victor Stanley, Inc. v. Creative Pipe, Inc. and his own opinion, William A. Gross Construction Associates, Inc. v. American Manufacturers Mutual Insurance Co. Judge Peck also cited the 1985 Blair and Maron study, in which lawyers searched a database of 40,000 documents. The lawyers believed their manual search had retrieved 75% of the relevant documents, when in fact only 20% had been retrieved.[1]

With that analysis behind him, Judge Peck turned to the newest hot-button issue, computer-assisted document review, which he defined as “tools… that use sophisticated algorithms to enable the computer to determine relevance, based on interaction with (i.e., training by) a human reviewer.” After discussing how computer-assisted review tools work, Judge Peck noted that, to his knowledge, no reported federal or state case has ruled on the use of this technology, further stating that it “will be a long wait” for lawyers hoping for a court to conclude: “It is the opinion of this court that the use of predictive coding is a proper and acceptable means of conducting searches under the Federal Rules of Civil Procedure, and furthermore that the software provided for this purpose by [insert name of your favorite vendor] is the software of choice in this court.”
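
For readers unfamiliar with the mechanics, Judge Peck’s definition maps onto an ordinary supervised-learning loop: a human reviewer codes a seed set of documents, a model learns from those calls, and the model then scores the remaining documents for relevance. Below is a minimal sketch of that loop in Python, assuming scikit-learn and invented example documents; it illustrates the general technique Judge Peck describes, not any particular vendor’s implementation.

```python
# Minimal sketch of "computer-assisted coding" as Judge Peck defines it:
# a human reviewer labels a seed set, a model learns relevance from those
# labels, and the model then scores the unreviewed documents.
# Illustrative only; documents and labels below are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: documents a human reviewer has already coded.
seed_docs = ["quarterly pipe pricing memo", "office holiday party invite"]
seed_labels = [1, 0]  # 1 = responsive, 0 = non-responsive

unreviewed_docs = ["draft pricing agreement with distributor"]

vectorizer = TfidfVectorizer()
X_seed = vectorizer.fit_transform(seed_docs)

model = LogisticRegression()
model.fit(X_seed, seed_labels)

# Probability that each unreviewed document is responsive.
scores = model.predict_proba(vectorizer.transform(unreviewed_docs))[:, 1]
print(scores)
```

In a real matter the reviewer would keep coding the documents the model is least certain about, retraining as they go, which is the “interaction with… a human reviewer” in Judge Peck’s definition.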

In addition, Judge Peck noted that if the use of computer-assisted review technology were presented or challenged in a case before him, he would “want to know what was done and why that produced defensible results,” perhaps being less interested in the “science behind the ‘black box’…than whether it produced responsive documents with reasonably high recall and high precision.” Further, proof of quality control would be important to defending use of the technology.
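
Recall and precision have standard definitions in information retrieval: recall is the fraction of all truly responsive documents that the process actually produced, and precision is the fraction of produced documents that are truly responsive. A short worked example, using invented counts:

```python
# Recall and precision, the two measures Judge Peck says he would look at.
# All counts below are hypothetical, for illustration only.
relevant_in_collection = 1000   # truly responsive documents that exist
produced = 500                  # documents the process produced
produced_and_relevant = 400     # produced documents that are truly responsive

recall = produced_and_relevant / relevant_in_collection  # 400/1000 = 40%
precision = produced_and_relevant / produced             # 400/500  = 80%

print(f"recall={recall:.0%} precision={precision:.0%}")
# By this measure, the Blair and Maron lawyers achieved roughly 20% recall
# while believing they had achieved 75%.
```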

Judge Peck concluded his article with: “Until there is a judicial opinion approving (or even critiquing) the use of predictive coding, counsel will just have to rely on this article as a sign of judicial approval. In my opinion, computer-assisted coding should be used in those cases where it will help ‘secure the just, speedy, and inexpensive’ (Fed. R. Civ. P. 1) determination of cases in our ediscovery world.”

This blog post is intended merely to highlight the main themes of this informative article, which we recommend as a “must read” for practitioners debating whether computer-assisted review technology is right for them. As further evidence of the technology’s value, here is a recent case study demonstrating the power behind the Intelligent Review Technology (IRT) offered by Kroll Ontrack in its award-winning document review platform, Ontrack® Inview™.

Embroiled in complex patent litigation, a national law firm representing a major health care provider trusted Kroll Ontrack to ensure the document review process was conducted efficiently enough to meet case-critical deadlines. Already faced with 334 gigabytes of data, totaling more than 750,000 documents, at the outset of the case, the law firm was surprised to discover an additional 325,000 documents midway through the review period. Moving the review deadline was not an option, but reviewing the new batch of documents with traditional linear review methods would have meant a significant budget overrun and no guarantee that the deadline could be met.

Following initial filtering work and the use of Ontrack® Advanceview™, the leading early data assessment solution, counsel utilized all three aspects of IRT: Automated Workflow, Intelligent Prioritization and Intelligent Categorization. Intelligent Prioritization (iP) ran in the background from the project’s outset, elevating documents that were more likely to be responsive, as sketched below. iP significantly aided the review process, allowing the team to review and produce 50,000 documents in only two weeks.
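
Kroll Ontrack has not published iP’s internals, but the behavior described, continuously surfacing the documents most likely to be responsive, amounts to re-ranking the review queue by a model-assigned responsiveness score. A hypothetical sketch, with invented document IDs and scores:

```python
# One plausible reading of "Intelligent Prioritization": keep re-ranking
# the unreviewed queue so likely-responsive documents surface first.
# Document IDs and scores below are invented.
def prioritize(queue, score):
    """Return the review queue sorted by descending responsiveness score."""
    return sorted(queue, key=score, reverse=True)

queue = [
    {"id": "DOC-001", "score": 0.12},
    {"id": "DOC-002", "score": 0.91},
    {"id": "DOC-003", "score": 0.55},
]

for doc in prioritize(queue, lambda d: d["score"]):
    print(doc["id"], doc["score"])  # DOC-002 first, DOC-001 last
```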

Once the approximately 325,000 new documents were discovered and added to the review queue, counsel needed a way to review them while still meeting the strict deadlines, but did not have the budget to hire additional contract attorneys. By leveraging Intelligent Categorization (iC), however, counsel took the challenge in stride. After intensive sampling and analysis of the resulting reports, counsel instructed Kroll Ontrack Document Review Service Professionals to remove documents that the iC technology had determined, with a 90 percent confidence rating, to be non-responsive, instantly eliminating nearly half of the documents from the new data set. Any technology is only valuable if it is defensible, so when a rigorous quality control process revealed a staggering 94 percent agreement rate between a human review “dream team” and the iC determinations across the data set, counsel was able to confidently report to the senior partners that the deadline would be met, within budget and without risk.
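
The two defensibility steps described above, culling at a confidence threshold and then measuring human/machine agreement on a quality control sample, can be sketched as follows. Everything in this example is simulated (the scores, the sample size and the human QC calls); it shows the shape of the workflow, not Kroll Ontrack’s actual process.

```python
# Sketch of the two steps described above, with simulated data:
# (1) cull documents the classifier marks non-responsive at >= 90%
#     confidence, and
# (2) measure agreement between human QC calls and machine determinations
#     on a random sample of the culled set.
import random

random.seed(0)  # reproducible simulation
docs = [{"id": i, "p_nonresponsive": random.random()} for i in range(1000)]

# Step 1: cull at the 90 percent confidence threshold.
culled = [d for d in docs if d["p_nonresponsive"] >= 0.90]
kept = [d for d in docs if d["p_nonresponsive"] < 0.90]

# Step 2: QC sample. Human calls are simulated here; in practice a
# review team codes the sample and agreement is tallied from their calls.
sample = random.sample(culled, min(50, len(culled)))

def human_calls_nonresponsive(doc):
    return random.random() < 0.94  # simulated reviewer agreement

agreement = sum(human_calls_nonresponsive(d) for d in sample) / len(sample)
print(f"culled {len(culled)}, kept {len(kept)}; QC agreement {agreement:.0%}")
```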

The end result? Counsel confidently removed more than 125,000 documents using iC alone, saving almost $200,000 in review costs – with $65,676 attributable to iC.


[1] David C. Blair & M.E. Maron, An Evaluation of Retrieval Effectiveness for a Full-Text Document-Retrieval System, Communications of the ACM, Vol. 28, No. 3 (March 1985), pp. 289-299.