
Computer-Assisted Coding: Notes from U.S. Magistrate Judge Andrew Peck


On October 1, 2011, Law Technology News published an article from United States Magistrate Judge Andrew Peck titled, “Search, Forward: Time for Computer-Assisted Coding.” In this article, Judge Peck explored the use of computer-assisted coding within document review while discussing the perceived “judicial endorsement” of keyword searching. A common theme throughout his article – which echoes comments made during the Carmel Valley eDiscovery Retreat – is questioning why lawyers seem insistent on receiving a judicial blessing of this technology before using it. Indeed, this was a prevalent question and theme throughout the “What is it and how is it being leveraged?” session at the national Masters Conference that took place in Washington, D.C. this week. Some attendees noted that they were uncomfortable using this technology without agreement from opposing counsel, and might not use it at all until a case certified its use.

To address the issue of judicial endorsement, Judge Peck provided a thorough analysis regarding the problems inherent in keyword searching, citing several of the well-known opinions on this issue including United States v. O’Keefe, Equity Analytics, LLC v. Lundin, Victor Stanley, Inc. v. Creative Pipe, Inc. and his own opinion, William A. Gross Construction Associates, Inc. v. American Manufacturers Mutual Insurance Co. Judge Peck also cited the Blair and Maron study conducted in 1985, in which a database of 40,000 documents was searched by lawyers. The lawyers believed their manual search retrieved 75% of relevant documents, when only 20% were retrieved.[1]
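The gap Blair and Maron measured is the gap between recall – the share of all relevant documents a search actually finds – and what reviewers believe they found. As a minimal sketch with illustrative counts (not figures from the study), the two standard retrieval metrics are simple ratios:

```python
# Recall and precision from a hypothetical keyword search.
# The counts below are illustrative, not data from Blair & Maron.

def recall(retrieved_relevant: int, total_relevant: int) -> float:
    """Fraction of all relevant documents the search actually found."""
    return retrieved_relevant / total_relevant

def precision(retrieved_relevant: int, total_retrieved: int) -> float:
    """Fraction of retrieved documents that are actually relevant."""
    return retrieved_relevant / total_retrieved

# Suppose 1,000 of 40,000 documents are truly relevant, and a keyword
# search returns 500 documents, of which 200 are relevant.
r = recall(200, 1000)    # 0.20 -- the search found only 20% of relevant docs
p = precision(200, 500)  # 0.40 -- 40% of what it returned was relevant
print(f"recall={r:.0%}, precision={p:.0%}")
```

The lawyers in the study, in effect, badly overestimated their recall – exactly the blind spot these metrics are designed to expose.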

With that analysis in hand, Judge Peck turned to the newest hot-button issue – computer-assisted document review, which he defined as “tools… that use sophisticated algorithms to enable the computer to determine relevance, based on interaction with (i.e., training by) a human reviewer.” After discussing how computer-assisted review tools work, Judge Peck noted that there is no federal or state reported case to his knowledge that has ruled on the use of this technology, further stating that it “will be a long wait” for lawyers waiting for a court to conclude: “It is the opinion of this court that the use of predictive coding is a proper and acceptable means of conducting searches under the Federal Rules of Civil Procedure, and furthermore that the software provided for this purpose by [insert name of your favorite vendor] is the software of choice in this court.”

In addition, Judge Peck noted that if the use of computer-assisted review technology were presented or challenged in a case before him, he would “want to know what was done and why that produced defensible results,” perhaps being less interested in the “science behind the ‘black box’…than whether it produced responsive documents with reasonably high recall and high precision.” Further, proof of quality control would be important to defending use of the technology.

Judge Peck concluded his article with: “Until there is a judicial opinion approving (or even critiquing) the use of predictive coding, counsel will just have to rely on this article as a sign of judicial approval. In my opinion, computer-assisted coding should be used in those cases where it will help ‘secure the just, speedy, and inexpensive’ (Fed. R. Civ. P. 1) determination of cases in our ediscovery world.”

This blog post is intended to merely highlight the main themes throughout this informative article. We certainly recommend it as a “must read” for those practitioners debating whether the use of computer-assisted technology is right for them. To provide further proof, here is a recent case study demonstrating the power behind the Intelligent Review Technology (IRT) offered by Kroll Ontrack in its award-winning document review platform, Ontrack® Inview™.

Embroiled in complex patent litigation, a national law firm representing a major health care provider trusted Kroll Ontrack to ensure the document review process was conducted efficiently enough to meet case-critical deadlines. Already faced with 334 gigabytes of data totaling over 750,000 documents at the outset of the case, the law firm was surprised to discover an additional 325,000 documents midway through the review period. Moving the review deadline was not an option, but reviewing the new batch of documents using traditional linear review methods would have blown the budget, with no guarantee that the deadline could be met.

Following initial filtering work and use of Ontrack® Advanceview™, the leading early data assessment solution, counsel utilized all three aspects of IRT: Automated Workflow, Intelligent Prioritization and Intelligent Categorization. Intelligent Prioritization (iP) ran in the background from the project outset, elevating documents that were more likely to be responsive. iP significantly aided the review process, allowing the team to review and produce 50,000 documents after only two weeks.

Once the approximately 325,000 new documents were discovered and added to the review queue, counsel needed a way to review the documents while still meeting the strict deadlines, but did not have the budget available to hire additional contract attorneys. By leveraging Intelligent Categorization (iC), however, counsel took the challenge in stride. After intensive sampling and analysis of reporting, counsel instructed Kroll Ontrack Document Review Service Professionals to remove documents the iC technology had determined to be non-responsive with a 90 percent confidence rating – instantly eliminating nearly half of the documents from the new data set. Any technology is only valuable if it is defensible, so when a rigorous quality control process revealed a staggering 94 percent agreement rate between a human review “dream team” and the iC determinations across the data set, counsel was able to confidently report to the senior partners that the deadline would be met on time, within budget and without risk.
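The culling step described above amounts to a simple probability threshold: documents the model scores as non-responsive with at least 90 percent confidence are set aside from human review. The document IDs and scores below are hypothetical, and this is an illustration of the general technique only, not the proprietary iC scoring model:

```python
# Threshold-based culling: split documents by the model's estimated
# probability of non-responsiveness. Scores here are hypothetical.

def cull(docs, threshold=0.90):
    """Return (removed, to_review) based on P(non-responsive)."""
    removed = [d for d in docs if d["p_nonresponsive"] >= threshold]
    to_review = [d for d in docs if d["p_nonresponsive"] < threshold]
    return removed, to_review

docs = [
    {"id": 1, "p_nonresponsive": 0.97},
    {"id": 2, "p_nonresponsive": 0.55},
    {"id": 3, "p_nonresponsive": 0.91},
    {"id": 4, "p_nonresponsive": 0.10},
]
removed, to_review = cull(docs)
print([d["id"] for d in removed])    # [1, 3] culled from the queue
print([d["id"] for d in to_review])  # [2, 4] routed to human reviewers
```

The quality-control sampling that follows such a cull is what makes the threshold choice defensible.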

The end result? Counsel confidently removed more than 125,000 documents using iC alone, saving almost $200,000 in review costs – with $65,676 attributable to iC.

[1] David C. Blair & M.E. Maron, An Evaluation of Retrieval Effectiveness for a Full-Text Document-Retrieval System, Communications of the ACM, Vol. 28, Issue 3 (March 1985).

Case Law: CBT Flint Partners, LLC v. Return Path, Inc.


Federal Circuit Court of Appeals Vacates Taxation of Costs Decision

CBT Flint Partners, LLC v. Return Path, Inc., 2011 WL 3487023 (C.A.Fed. (Ga.)). Previously in this patent infringement litigation, the Northern District of Georgia court granted summary judgment of invalidity regarding the patent dispute. In addition, the district court determined $268,311.22 in costs related to ediscovery were properly taxable. See CBT Flint Partners, LLC v. Return Path, Inc., 2009 WL 5159761 (N.D. Ga. Dec. 30, 2009). On appeal, the Court of Appeals reversed on the underlying patent dispute, vacated the ruling regarding costs because the defendant was no longer the prevailing party and remanded to the district court for further proceedings.


Although the 2009 opinion regarding taxation of costs was vacated, it is important to remember that the court’s ruling was a result of the Court of Appeals determining the District Court erred in its analysis of the underlying patent dispute – not the Court of Appeals determining that costs were not properly taxable. Indeed, the entire discussion regarding the cost order was brief:

In light of our disposition, Cisco was not a prevailing party and we therefore vacate the district court’s rulings on costs and we deny the cross-appeal. We remand to the district court for further proceedings consistent with this opinion.

Despite this ruling and others on the topic, parties still face the difficult issue of containing costs while navigating ediscovery effectively. How can this seemingly impossible task be achieved? The best advice is for parties to cooperate early on in pretrial conferences. Further, parties must navigate the discovery process with an eye toward efficiency. Courts that are addressing the issue of costs are largely expressing frustration not only with the lack of cooperation, but also with the failure to limit discovery so as to keep costs reasonable. The discoverability standard remains extremely broad, and the costs of discovery will vary widely depending upon the facts of the case, but litigants should always do their best to ensure that discovery is at least reasonably scoped to avoid unnecessary expense. Parties should also make use of solutions such as Early Data Assessment and Intelligent Review Technology to conduct a proper, thorough and fast analysis and review of the data potentially at issue.

Case Law: Star Direct Telecom, Inc. v. Global Crossing Bandwidth, Inc.


Court Grants Motion to Compel Citing Failure to Identify Information Not Reasonably Accessible

Star Direct Telecom, Inc. v. Global Crossing Bandwidth, Inc., 2011 WL 1125493 (W.D.N.Y. Mar. 21, 2011). In this business litigation, the plaintiff sought disclosure of internal e-mails relating to its breach of contract claim. Opposing the motion, the defendant argued the request was untimely and that the information sought was not relevant, responsive or readily accessible. Noting the duty to supplement production continues even after the discovery period closes, the court found the requested e-mails were relevant and responsive to the plaintiff’s initial document request. Despite the defendant’s argument that producing the e-mails would require searching Exchange databases housed on an external 4 terabyte storage array at a cost of $13,000, the court asserted that the defendant had a duty to identify sources of information that were not reasonably accessible in its discovery response and rejected its belated arguments regarding burden. Accordingly, the court determined the defendant’s initial production was incomplete and granted the motion to compel.


This case demonstrates the importance of being prepared for the Rule 26(f) meet and confer conference in order to address important ediscovery issues fully and accurately. It is important to engage in these early discussions as well-informed, prepared counsel may find itself in an elevated bargaining position capable of dictating advantageous terms. In the conference, counsel should make efforts to understand the opposing party’s technical landscape, clarify the scope of document requests, resolve any production format disagreements and pre-empt the negative impact of inadvertent production of privileged documents by entering into a clawback agreement. Other ediscovery topics that counsel should discuss include the preservation of evidence, testifying experts, cost allocation and other anticipated evidentiary disputes.

In order to be prepared for this conference, counsel should also understand their client’s electronic information, which can be achieved by collaborating with IT personnel and in-house counsel (or the organization’s ediscovery team). A data map, which is essentially an outline of a company’s information systems and processes, is also an incredibly helpful resource in identifying data sources and making determinations regarding accessibility of data – something that was amiss in the current case. A data map can also strengthen arguments that certain sources are not reasonably accessible by providing credible evidence of undue burden and cost. Other technology, such as early data assessment (EDA), can also be helpful in meet and confer preparations. EDA will help you determine appropriate search terms, allowing parties to collaborate on this important discussion point. The reporting functions within this technology can also help document the process used to determine search terms and validate the quality of those terms for both hits and non-hits.

Finally, it is important for counsel to discuss the use of advanced technology at this conference and reach a documented agreement regarding whether it is acceptable to use such things as intelligent review technology. Although not yet discussed in case law, reaching an agreement regarding the use of this technology at the Rule 26(f) conference will greatly bolster defensibility. Be prepared to explain to opposing counsel what the technology is and how you plan to use it – a process that may be aided by enlisting the help of an expert.

Two Truths and a Fib About Intelligent Categorization


Time is money, and linear document review is almost prohibitively expensive due to the surge in electronic data volume over the past several years and the corresponding increase in resources required to review the data. Besides time and costs, having a multitude of attorneys reviewing and categorizing documents for (potentially) months on end can yield inconsistent results. Innovative technological advances have arrived on the document review scene, but concerns about overall effectiveness persist as the legal industry remains hesitant to explore new technology.

Enter Intelligent Categorization (iC). Intelligent Categorization is the third component of Intelligent Review Technology (IRT); it analyzes and learns from category decisions made by human reviewers, then identifies and elevates documents most likely to be relevant and suggests categories for documents not yet reviewed. Along with Automated Workflow and Intelligent Prioritization, the other two legs of IRT, reliance on Intelligent Categorization technology is on its way to becoming a well-established practice in 2011. Differing ideas and opinions associated with this technology have been tossed around, giving rise to certain misconceptions about what iC is, what iC is not and what iC can do for electronic discovery. Today we will dispel the confusion and set the record straight by exploring two important truths and a common fib associated with Intelligent Categorization.

Defensible? True.

First and foremost, Intelligent Categorization is defensible. One of the early qualms about iC was that until the technology became court-tested, it was too risky to use. That simply is not the case. In fact, such fears have preceded the acceptance of all new technology, including features such as advanced searching and sampling, which are now embraced by jurists and litigants alike.[1] Case law supports the use of a systematic, well-documented and repeatable process, and Intelligent Categorization is specifically designed to increase accuracy and effectiveness while decreasing review time. Indeed, when using all three components of Intelligent Review Technology, it is possible to save 50 percent on review costs.

Intelligent Categorization also supports the notion of proportionality and the goal, set forth in Rule 1 of the Federal Rules of Civil Procedure, that proceedings be “just, speedy and inexpensive.”[2] As an integrated component of IRT, iC is fully transparent, with real-time metrics and analytics available throughout the review process. In addition, experts can explain the technology to judges, opponents, clients and staff if necessary.

Further, the Sedona Conference® has endorsed the use of automated methods (although it has not endorsed any particular technology). The Sedona Conference Best Practices Commentary on the Use of Search and Information Retrieval Methods in Ediscovery, Practice Point 1, states:

[R]eliance solely on a manual search process for the purpose of finding documents may be infeasible or unwarranted. In such cases, the use of automated search methods should be viewed as reasonable, valuable, and even necessary.[3]

In addition, The Sedona Conference Commentary on Achieving Quality in the EDiscovery Process advises practitioners to utilize technology that can “reasonably and appropriately enable a party to safely and substantially reduce the amount of ESI that must be reviewed by humans.”[4] These commentaries stress the use of technology to achieve the important goal of proportionality in an electronic discovery process that has, unfortunately, spiraled out of control in recent years.

Effective? True.

Closely linked to defensibility is effectiveness. Before investing in new technology, legal teams must be confident that the new feature will work and is worth the change. With sufficient training data, supervised learning can target documents most likely to be relevant. Using supervised learning to identify and pull responsive documents into categories early reduces the time spent organizing documents responsive to particular requests, and helps reviewers and the legal team better understand the case early on. Also, related documents can be dealt with more efficiently as a group and can even be assigned to a reviewer with expertise in a particular category.
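To make the supervised-learning idea concrete, here is a minimal Naive Bayes text classifier that learns “responsive”/“non-responsive” labels from a handful of human-coded examples. This is a sketch of the general technique only – the training documents are invented, and Kroll Ontrack’s actual iC algorithm is proprietary:

```python
import math
from collections import Counter

def train(labeled_docs):
    """labeled_docs: list of (text, label). Returns per-class word counts
    and per-class document counts learned from human review decisions."""
    counts = {}             # label -> Counter of word occurrences
    doc_totals = Counter()  # label -> number of training documents
    for text, label in labeled_docs:
        counts.setdefault(label, Counter()).update(text.lower().split())
        doc_totals[label] += 1
    return counts, doc_totals

def classify(text, counts, doc_totals):
    """Return the label with the highest log-posterior probability,
    using add-one (Laplace) smoothing for unseen words."""
    vocab = set(w for c in counts.values() for w in c)
    total_docs = sum(doc_totals.values())
    best_label, best_score = None, float("-inf")
    for label, words in counts.items():
        score = math.log(doc_totals[label] / total_docs)  # log prior
        denom = sum(words.values()) + len(vocab)
        for w in text.lower().split():
            score += math.log((words[w] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Hypothetical human-coded training decisions.
training = [
    ("patent claim infringement damages", "responsive"),
    ("licensing dispute patent claim", "responsive"),
    ("lunch menu holiday party", "non-responsive"),
    ("parking pass office supplies", "non-responsive"),
]
model = train(training)
print(classify("patent licensing damages", *model))  # responsive
print(classify("office party lunch", *model))        # non-responsive
```

The essential point survives the simplification: the model has no legal knowledge of its own; it generalizes from whatever patterns the human reviewers’ decisions contain.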

The effectiveness of this technology may also be tested through sampling. Sampling is the key to measuring, monitoring, controlling and correcting potential errors in categorization, and is useful in any review to validate results. The technology can systematically and iteratively test the data to evaluate the accuracy of iC (in addition to Intelligent Prioritization). Where parties have failed to sample, some courts have concluded they did not take reasonable steps to prevent disclosure.[5] With the flexibility to conduct as much or as little sampling as desired, iC not only reduces the time needed to complete a review, it improves the consistency of and confidence in category determinations.[6]
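As a sketch of the statistics behind such sampling (with hypothetical figures, not data from any actual matter), a simple normal-approximation confidence interval shows how much certainty a random quality-control sample provides:

```python
import math

# Validate machine category suggestions with a random QC sample.
# Sample size and agreement count below are hypothetical.

def agreement_interval(agree: int, n: int, z: float = 1.96):
    """Point estimate and ~95% normal-approximation confidence interval
    for the human/machine agreement rate observed in a random sample."""
    p = agree / n
    margin = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - margin), min(1.0, p + margin)

# Suppose reviewers re-checked a random sample of 1,000 machine-coded
# documents and agreed with the suggested category on 940 of them.
p, lo, hi = agreement_interval(940, 1000)
print(f"observed agreement {p:.1%}, 95% CI roughly {lo:.1%} to {hi:.1%}")
```

A documented calculation like this is exactly the kind of quality-control evidence that makes sampling results defensible rather than anecdotal.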

Independent studies are also proving that the use of Intelligent Review Technology (including Intelligent Categorization) is more effective than traditional, manual review processes. The Ediscovery Institute released a survey that showed using the technology equivalent of Intelligent Categorization resulted in reduced review costs by 45 percent or more.[7] In addition, the TREC Legal Track study from 2008 demonstrated that a “Boolean keyword search found only 24% of the total number of responsive documents in the target data set” while automated searching methods found 76 percent of the responsive documents.[8]

Devoid of Human Control? False.

Intelligent Categorization is not a process devoid of critical human insight and control. In some instances, this new technology has been pitched as a purely hands-off, eyes-off solution. In reality, Intelligent Review Technology as a whole does not replace human reviewers, nor should it. iC works by “learning” from human decisions and applying human logic when suggesting document categories. Human input is required so the technology has data sets with applied classifications from which to learn, and the system learns from both responsive and non-responsive decisions of human reviewers. As more documents are received and sorted, legal teams can rely on technology to continually improve the model while human reviewers can focus their efforts on the content and substance of the documents. In addition, because the tool was designed to increase consistency and accuracy, it affords the flexibility and scalability to give the ediscovery team more control over the review and to leverage as much or as little human input and oversight as is appropriate for the project. Thus, iC is not a substitute for skilled lawyers; rather, it enhances and complements the work they do.

The question of whether it is reasonable to omit review of some documents altogether is an as-yet undetermined legal question. From a technical standpoint, however, IRT systems can support a range of approaches to selective review, such as excluding documents with a sufficiently low probability of responsiveness from review, guiding reviewers to just the most important portions of long documents or focusing extra review on documents likely to belong to sensitive categories.

In short, Intelligent Categorization is a defensible, effective, cost-saving measure that leverages the work of talented attorneys to decrease the time required to complete document review. It is designed to meet flexibility and repeatability needs of the client, and is proving to be the key differentiator in the ability to respond to electronic discovery demands quickly and proportionately.

Note: The above post appeared in the April 2011 issue of the free, monthly e-newsletter, Case Law Update & Trends published by Kroll Ontrack. This newsletter is designed to help busy legal professionals keep pace with case law and information pertaining to electronic evidence. Subscribe and gain valuable and timely information on new ESI court decisions, as well as informative articles and tips for both the corporate and law firm audience.

[1] See, e.g., William A. Gross Constr. Assocs., Inc. v. Am. Mfrs. Mut. Ins. Co., 2009 WL 724954 (S.D.N.Y. Mar. 19, 2009).

[3] The Sedona Conference® Best Practices Commentary on the Use of Search and Information Retrieval Methods in Ediscovery (published August 2007).

[4] The Sedona Conference® Commentary on Achieving Quality in the Ediscovery Process (published May 2009).

[5] Mt. Hawley Ins. Co. v. Felman Prods. Inc., 2010 WL 1990555 (S.D.W.Va. May 18, 2010). 

[6] The Sedona Conference® Commentary on Achieving Quality in the Ediscovery Process, Principle 2, states: “In the ediscovery context, statistical sampling can serve as a check on the effectiveness of…automated tools in identifying responsive information and on the reviewers’ ability to correctly code documents.”

[7] See “Ediscovery Institute Survey on Predictive Coding.”

[8] Complete results are available from the TREC Legal Track.

The User Experience – Reinvented


Version 7.0 of the Ontrack® Inview™ review tool features a new design to dramatically enhance usability and reduce the cost of document review.

Building on 10 years of award-winning document review tool innovations, version 7.0 is now complete with a modern, fresh design and offers clients a brand new user experience. Developed by users for users, insight for the new user interface was obtained from a global team of experienced reviewers located in Kroll Ontrack document review facilities around the world. This interface, together with the unique Intelligent Review Technology (IRT), maximizes the speed of document review, reiterating the company’s commitment to helping clients achieve a 50-percent-plus cost savings on review.

The new, modern Ontrack Inview interface maximizes the speed of your review, saving costs by:

  • Increasing efficiency. With logical task groupings, a new intuitive ribbon bar and more right-click options, clients can gain faster access to commonly used review features.
  • Improving ease of use. Dual monitor support allows for additional screen real estate for easier viewing of potentially relevant documents.
  • Customizing your experience. With drag-and-drop, dockable viewing panes, the screen layout is now customizable for all review needs.

“Maximizing usability in the industry-leading Ontrack Inview review tool is of the utmost importance to Kroll Ontrack because it directly impacts efficiency, satisfaction and cost savings for our clients,” said Michele Lange, director of discovery, Kroll Ontrack. “When clients are more efficient, it takes less time to accomplish a particular review task, resulting in increased productivity, cost savings and a higher level of customer satisfaction.”

To ensure that Kroll Ontrack was delivering as promised on this new version, Kroll Ontrack conducted a usability study with a user profile of individuals with little to no experience using the Ontrack Inview tool to test the efficiency of the new review tool interface. Kroll Ontrack timed users as they conducted 10 review tasks in previous versions of the tool and in the new Ontrack Inview 7.0 interface. The usability study revealed it was easier to schedule reports, locate documents, highlight key words and translate text. Specifically, these review tasks were conducted 10 percent faster in the Ontrack Inview 7.0 review tool. The new user interface and its enhanced usability and customizability features are also available in the Ontrack® Advanceview™ early data assessment tool, which now features its own distinct look and feel customized for optimum early data assessment in litigation and regulatory matters.

“Kroll Ontrack strives to continually update its technology and services offerings to improve the efficiency and effectiveness of the legal discovery process,” said George May, vice president of product strategy, Kroll Ontrack. “Innovations such as Intelligent Review Technology, which integrates expert human logic with smart technology by evaluating and learning from document review decisions, and the new Ontrack Inview 7.0 review tool reinforce our relentless commitment to helping our clients drive down the total cost of litigation. As the only end-to-end discovery services and technology provider, we are uniquely positioned to offer the most comprehensive, meaningful and cost-effective improvements to our clients to meet their varying needs. Clients can expect and should look forward to continued innovation from us in technology, services and bundled offerings in 2011 and beyond.”

The Ontrack Inview 7.0 review tool and Ontrack Advanceview 7.0 early data assessment tool are available worldwide.

Technology Empowers Legal Review Teams and Increases Document Review Efficiency


In his recent New York Times article, John Markoff highlights how “advances in artificial intelligence” (ediscovery software in particular) have created a shift in the way law firms and corporations conduct legal discovery. Mr. Markoff agrees that technology provides greater efficiencies and cost savings in document review, but claims that a heavier reliance on technology will ultimately result in a decreased demand for lawyers. While we agree that litigants are relying more heavily on technology during discovery, the suggestion that this will have negative repercussions for the practice of law is misguided.

Document review is, by far, the most expensive part of the legal discovery process. Today, for every $1 spent on processing data, $3 or more is spent on document review. This disparity makes traditional document review economically impractical and counter to notions of proportionality and reasonableness. Furthermore, recent court decisions and learned commentary from reputable organizations such as The Sedona Conference® have opened the door for technology to bring efficiencies to the costly practice of document review.

Kroll Ontrack has a long history of developing groundbreaking technology solutions calculated to make the discovery process less expensive and more efficient. These innovations include advanced search technologies, multilingual character recognition and early data assessment. Most recently, Intelligent Review Technology (IRT), which integrates smart technology with legal expertise, has been developed and implemented in the award-winning Ontrack® Inview™ document review tool. IRT expedites document review decisions by prioritizing and suggesting categories for yet-to-be-reviewed documents. IRT learns from lawyers while they work, strengthens the defensibility of document production decisions and ultimately empowers attorneys to focus on what they do best – devising case strategy and making tactical decisions. Most importantly, studies show that this powerful technology can reduce the expense of document review by more than 50 percent.

“Litigants and their counsel are relentlessly pursuing more efficient, cost-effective legal discovery processes,” said George May, vice president of product strategy, Kroll Ontrack. “It’s inevitable that reliance on technology will continue to increase as clients strive to reduce the total cost of litigation.”

Case Law: In re Fontainebleau Las Vegas Contract Litigation


Document Dump of Servers Leads to Privilege Waiver

In re Fontainebleau Las Vegas Contract Litig., 2011 WL 65760 (S.D. Fla. Jan. 7, 2011). In this bankruptcy litigation, the requesting party claimed the third party waived privilege by producing three servers in response to a subpoena and court orders without conducting a review for either privilege or responsiveness. Seeking to use the information but avoid any adverse consequences, the requesting party offered to “eat” the cost of searching the massive document dump of approximately 800 GB and 600,000 documents for relevant materials in exchange for the right to review and use the data free of the obligation to apprise the third party of or return any privileged documents. Reviewing the third party’s conduct, the court found that its failure to conduct any meaningful privilege review prior to production constituted voluntary disclosure and resulted in a complete waiver of applicable privileges. Noting that more than two months after production the third party had not flagged even one document as privileged, the court rejected its “belatedly and casually proffered” objections as “too little, too late.” Accordingly, the court granted the requesting party full use of these documents during pre-trial preparations of the case, but ordered it to timely advise the third party of any facially privileged information it encountered upon review.


This case demonstrates the importance of conducting a proper review and production process. The third party in this case simply “dumped” a significant amount of information onto the requesting party without engaging in a document review; instead, the third party complained that the review and production process would be unduly burdensome and delayed production numerous times. Finally, the third party produced three servers without conducting a privilege review, but did belatedly produce a privilege log for one of the servers.

If the third party had conducted a document review, it would not have incurred the costs of numerous motions and would not have faced the penalty of the court declaring that privilege was waived (with the exception of the server that had an accompanying privilege log). The document review process may seem like a daunting, costly and burdensome task, but it is often a necessary step in ediscovery. Thankfully, Intelligent Review Technology exists and can save you 50 percent on review costs while improving the quality and defensibility of document review. Through use of Automated Workflow, Intelligent Prioritization and Intelligent Categorization, counsel can avoid the strictly manual process of document review that inherently leads to inaccuracies and inconsistencies. IRT also provides transparent reports and real-time metrics, giving counsel peace of mind as to the defensibility of the technology and process. Had the third party in this case invested in technology to conduct the document review, its costs would arguably have been far less than what it spent on numerous motions and on opposing counsel’s access to and use of privileged materials in pre-trial preparation.

Intelligent Review Technology Results in Proven Cost-Savings

The ediscovery landscape is proving to be more treacherous given the rise of data proliferation and the increase in sanctions for mismanaging ESI. As the courts grow increasingly intolerant of discovery failures, litigants are faced with two choices: work harder by investing more resources to ensure thorough review, or work smarter by leveraging fewer resources with cutting-edge technology to achieve superior results. Dynamic companies know the latter is always the best option, and for them, the next generation of ediscovery technology has arrived.

Intelligent Review Technology (IRT) combines the best of both worlds by delivering the discerning analytics of a human review team in an automated platform capable of increased processing speed, consistency and accuracy. Although IRT encompasses many different technologies that can work independently or in combination, workflow automation, supervised learning and statistical quality control are the cornerstone features of an effective IRT system. Together, they allow review to be conducted faster, more efficiently and more accurately than even the best human review teams equipped with current discovery technology.

IRT learns while you work, and empowers the defensibility of your arguments with transparent reports and real-time metrics. By analyzing decisions made by lawyers, the system applies human logic to identify likely responsive documents and make categorization suggestions. IRT integrates human input with smart technology, reduces costs by 50%, and improves the quality and defensibility of document review.

The technology inside, Ontrack® Inview™, has three components:

  • Workflow: Get the right documents to the right people. Workflow is an automated means to distribute and check in documents.
  • Prioritization: See the most important documents first. Prioritization evaluates reviewer decisions to identify and elevate documents that are most likely responsive, enabling reviewers to view the most relevant documents first.
  • Categorization: Harness the power of technology to learn from human decisions. Categorization analyzes reviewer decisions and applies logic to suggest categories for the documents not yet reviewed.
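To make the Categorization idea concrete, here is a minimal sketch of how a system can learn from reviewer decisions and suggest categories for unreviewed documents. This is an illustrative naive Bayes classifier, not Kroll Ontrack’s actual algorithm; the function names and training data are hypothetical.

```python
from collections import Counter, defaultdict
import math

def train(labeled_docs):
    """Count word frequencies per category from reviewer decisions.

    labeled_docs: list of (text, category) pairs decided by human reviewers.
    Returns per-category word counts and per-category document counts.
    """
    word_counts = defaultdict(Counter)
    doc_counts = Counter()
    for text, category in labeled_docs:
        doc_counts[category] += 1
        word_counts[category].update(text.lower().split())
    return word_counts, doc_counts

def suggest_category(text, word_counts, doc_counts):
    """Suggest the most likely category for an unreviewed document,
    scoring each category with naive Bayes and add-one smoothing."""
    total_docs = sum(doc_counts.values())
    best, best_score = None, float("-inf")
    for category in doc_counts:
        counts = word_counts[category]
        total_words = sum(counts.values())
        vocab = len(counts) + 1
        # log prior for the category, plus log likelihood of each word
        score = math.log(doc_counts[category] / total_docs)
        for word in text.lower().split():
            score += math.log((counts[word] + 1) / (total_words + vocab))
        if score > best_score:
            best, best_score = category, score
    return best

# Hypothetical reviewer decisions used as training data
training = [
    ("merger pricing agreement draft", "responsive"),
    ("quarterly revenue forecast merger", "responsive"),
    ("office picnic signup sheet", "non-responsive"),
    ("cafeteria menu for friday", "non-responsive"),
]
wc, dc = train(training)
suggestion = suggest_category("draft merger agreement", wc, dc)
```

A production system would, of course, validate suggestions like these with the quality-control measures described above rather than accepting them automatically.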

Kroll Ontrack Launches Final Component of Intelligent Review Technology

Innovative Categorization Features, along with Kroll Ontrack Professional Services Experts, Drive Document Review Efficiencies and Decrease Costs

Kroll Ontrack, the leading provider of information management, data recovery and legal technology products and services, today announced the launch of Ontrack® Inview™ version 6.5, the award-winning document review tool, which now includes Categorization functionality. Completing the Intelligent Review Technology (IRT) suite within the Ontrack Inview 6.5 tool, Workflow, Prioritization and Categorization integrate expert human logic with smart technology by evaluating and learning from document review decisions made by lawyers. Committed to building innovative discovery technology, Kroll Ontrack IRT expedites document review schedules, improves the quality of review decisions and reduces overall discovery costs, all while empowering attorneys to focus on case strategy.

The newest Ontrack Inview feature, Categorization, builds upon the Workflow and Prioritization features, which were released in June and August 2010, respectively. By analyzing human decisions and then making categorization recommendations for documents not yet reviewed, Categorization addresses a common challenge posed by traditional document review – category determinations are often made inconsistently and inefficiently. In fact, studies show that a second, repeat review, even by the same staff, often leads to different category decisions. With Categorization technology in the Ontrack Inview review tool, learning begins immediately when category decisions are made by lawyers designated as “trainers.” The system learns with each keystroke and then applies intelligent categorizations to the rest of the review set, which can then be evaluated by the review team and validated using quality control measures.

With Workflow, Prioritization and Categorization, clients can achieve greater than 50 percent cost savings on review. Kroll Ontrack IRT works in four steps:

  1. Train: Lawyers designated as “trainers” review documents and determine whether they are responsive, non-responsive, privileged or some other pre-defined category.
  2. Learn: IRT analyzes reviewer category decisions made by trainers to identify and elevate documents that are most likely relevant and suggest categories for documents not yet reviewed.
  3. Evaluate: Reviewers make categorization decisions, leveraging intelligent suggestions.
  4. Validate: IRT is fully transparent with real-time reports and metrics available to optimize the technology and experience cost savings.
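The “Learn” step above — elevating documents that are most likely relevant based on trainer decisions — can be sketched in a few lines. This is a deliberately simple stand-in for the statistical models a real prioritization engine would build; the term-overlap scoring and all example data are assumptions for illustration only.

```python
def prioritize(training_decisions, unreviewed_docs):
    """Rank unreviewed documents so likely-responsive ones surface first.

    training_decisions: list of (text, is_responsive) pairs from trainers.
    unreviewed_docs: list of document texts awaiting review.
    """
    responsive_terms, nonresponsive_terms = set(), set()
    for text, is_responsive in training_decisions:
        terms = set(text.lower().split())
        (responsive_terms if is_responsive else nonresponsive_terms).update(terms)
    # Terms that appear only in responsive training docs act as the signal
    signal = responsive_terms - nonresponsive_terms

    def score(doc):
        terms = set(doc.lower().split())
        return len(terms & signal) / len(terms) if terms else 0.0

    return sorted(unreviewed_docs, key=score, reverse=True)

# Hypothetical trainer decisions and unreviewed pool
decisions = [("merger agreement pricing", True), ("lunch menu", False)]
pool = ["weekly lunch plans", "merger pricing memo"]
ranked = prioritize(decisions, pool)
```

Here `ranked` places the merger-related document first, so reviewers see the most likely responsive material before the rest of the pool.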

“Traditional linear review is no longer tenable. The manual aspect of document review inherently produces inaccuracies and inconsistencies in categorization decisions, which hinders the implementation of a repeatable, defensible process,” said Michele Lange, director of discovery, Kroll Ontrack. “IRT in the Ontrack Inview tool is unique when compared to other tools because it automatically selects and presents sample documents for your review team versus a review team conducting manual, iterative, time-intensive searches to find key documents from which to learn. Furthermore, Kroll Ontrack IRT learns while you work and empowers the defensibility of your arguments with transparent reports and real-time metrics. Kroll Ontrack is truly leading a revolution that is going to change the face of legal discovery.”

In addition, through its Professional Services team, Kroll Ontrack is prepared to help clients leverage this powerful technology when conducting their own reviews in order to achieve consistency, speed and cost savings. Clients also have the option of using Kroll Ontrack Document Review Services, facilities and team of highly qualified document review attorneys to fully leverage the Ontrack Inview tool and its advanced capabilities.

Using Review Technology to Boost Quality & Efficiency

When facing a mountain of ESI and a looming discovery deadline, “something must be done” to expedite the review process in a cost-efficient, yet accurate manner.1

In a recent case from the Northern District of California, producing parties intent on saving costs refused to hire a third party, instead relying on five attorneys to conduct the entire review of “every bit of that giant mass of information” with no search terms to narrow the data. Seeing “no end in sight,” the court noted the need for a new method and ordered the parties to split the cost, as offered by the requesting party, of retaining a third-party service provider to assist with review and production.2

This case highlights the formidable challenges of discovery: managing costs, meeting deadlines, utilizing available means to produce all responsive documents, and resolving disputes in the spirit of cooperation and good faith. Importantly, these challenges can be met with the capabilities of intelligent review technology (IRT), which enables document review teams to conduct a repeatable, defensible and efficient process. IRT augments the human-intensive aspects of the document review process, frees attorneys to work on case strategy, improves the quality of review, and results in faster and cheaper discovery.

The foundation of IRT, workflow, provides the technical framework upon which the other aspects of IRT function. Prioritization is then used to analyze reviewer categorization decisions, elevating documents most likely to be relevant to the case to allow the most relevant documents to be reviewed first. Categorization advances the document review project by analyzing human categorization decisions to recommend categories for documents not yet reviewed by a person. IRT ultimately integrates the irreplaceable input from a human team with smart technology for maximum accuracy and efficiency.

Key features attorneys should look for in intelligent review technologies are:

  • Workflow automation, which minimizes human work and inconsistencies in the staging, distribution, routing, assessment and quality control of the review process.
  • Supervised learning, which automatically produces statistical models to prioritize potentially responsive data by learning from manually reviewed documents.
  • Statistical quality control, especially sampling, which monitors the progress and effectiveness of prioritization and review, and supports defensible decisions to cease review.
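The statistical quality control described above often takes the form of an “elusion” test: draw a random sample from the documents review will not touch, count how many responsive documents the sample turns up, and compute a confidence interval on the rate of responsive documents slipping through. The sketch below uses a normal-approximation interval; the function names, sample size and counts are illustrative assumptions, not a prescribed protocol.

```python
import math
import random

def sample_for_qc(discard_pile, sample_size, seed=0):
    """Draw a simple random sample from the set review will not examine."""
    rng = random.Random(seed)
    return rng.sample(discard_pile, min(sample_size, len(discard_pile)))

def elusion_interval(found_responsive, sample_size, z=1.96):
    """95% normal-approximation confidence interval for the rate of
    responsive documents eluding review, from a QC sample."""
    p = found_responsive / sample_size
    margin = z * math.sqrt(p * (1 - p) / sample_size)
    return max(0.0, p - margin), min(1.0, p + margin)

# Hypothetical QC result: 3 responsive documents found in a sample of 400
low, high = elusion_interval(3, 400)
```

If the upper bound of the interval falls below a threshold the parties deem acceptable, the team has a statistically grounded, defensible basis to cease review.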

1Multiven, Inc. v. Cisco Systems, Inc., 2010 WL 2813618 (N.D. Cal. July 9, 2010).