All posts tagged intelligent prioritization

It’s Your Turn! Vote Today for the National Law Journal’s ‘Best of 2016’

The polls are open, and it’s awards season. Whatever your persuasion, it’s your turn to vote. The National Law Journal recently announced the finalists for its ‘Best of 2016,’ and Kroll Ontrack is proud to be nominated in NINE categories!

  • Best end-to-end litigation consulting firm
  • Best end-to-end ediscovery provider
  • Best technology assisted review ediscovery solution
  • Best data and technology management ediscovery provider
  • Best data recovery solution provider
  • Best managed document review services
  • Best managed ediscovery and litigation support services provider
  • Best online review platform
  • Best case management software

From now until Friday, February 5, you can vote in the annual readers’ choice survey. This is your chance to rate the products and services you’ve been using in litigation. And it’s an opportunity for those of us in the industry to receive valuable feedback.

While you don’t have to answer every question, we greatly appreciate your support and your feedback – thank you for taking the time!

It’s time to vote.

Two Truths and a Fib About Intelligent Categorization

Time is money, and linear document review is almost prohibitively expensive due to the surge in electronic data volume over the past several years and the corresponding increase in resources required to review the data. Besides time and costs, having a multitude of attorneys reviewing and categorizing documents for (potentially) months on end can yield inconsistent results. Innovative technological advances have arrived on the document review scene, but concerns about overall effectiveness persist as the legal industry remains hesitant to explore new technology. 

Enter Intelligent Categorization (iC). Intelligent Categorization is the third component of Intelligent Review Technology (IRT): it analyzes and learns from category decisions made by human reviewers, then identifies and elevates the documents most likely to be relevant and suggests categories for documents not yet reviewed. Together with Automated Workflow and Intelligent Prioritization, the other two legs of IRT, Intelligent Categorization is on its way to becoming a well-established practice in 2011. Differing opinions about this technology have been tossed around, giving rise to misconceptions about what iC is, what it is not and what it can do for electronic discovery. Today we will dispel the confusion and set the record straight by exploring two important truths and a common fib associated with Intelligent Categorization.
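
To make the mechanics concrete, here is a minimal, generic sketch of supervised categorization in Python. It is not Kroll Ontrack’s implementation; the documents, labels and model choice are hypothetical. It simply shows a model trained on reviewer decisions suggesting categories, with confidence scores, for documents no one has reviewed yet.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    # Hypothetical reviewer decisions used as training data.
    reviewed_docs = [
        "invoice for consulting services on the disputed project",
        "board minutes discussing the disputed project budget",
        "office holiday party planning thread",
    ]
    reviewed_labels = ["responsive", "responsive", "non-responsive"]
    unreviewed_docs = [
        "amended consulting services agreement",
        "cafeteria menu for next week",
    ]

    # Learn from the human category decisions.
    vectorizer = TfidfVectorizer()
    model = LogisticRegression(max_iter=1000)
    model.fit(vectorizer.fit_transform(reviewed_docs), reviewed_labels)

    # Suggest a category and a confidence score for each unreviewed document.
    features = vectorizer.transform(unreviewed_docs)
    for doc, label, probs in zip(unreviewed_docs, model.predict(features),
                                 model.predict_proba(features)):
        print(f"{label} ({probs.max():.0%}): {doc}")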

Defensible? True.

First and foremost, Intelligent Categorization is defensible. One of the early qualms about iC was that until the technology became court-tested, it was too risky to use. That simply is not the case. In fact, such fears have preceded the acceptance of all new technology, including features such as advanced searching and sampling, which are now embraced by jurists and litigants alike.[1] Case law supports the use of a systematic, well-documented and repeatable process, and Intelligent Categorization is specifically designed to increase accuracy and effectiveness while decreasing review time. Indeed, when using all three components of Intelligent Review Technology, it is possible to save 50 percent on review costs. 

Intelligent Categorization also supports proportionality and the mandate of Rule 1 of the Federal Rules of Civil Procedure that proceedings be “just, speedy and inexpensive.”[2] As an integrated component of IRT, iC is fully transparent, with real-time metrics and analytics available throughout the review process. In addition, experts can explain the technology to judges, opponents, clients and staff if necessary.

Further, the Sedona Conference® has endorsed the use of automated methods (although it has not endorsed particular technologies). The Sedona Conference Best Practices Commentary on the Use of Search and Information Retrieval Methods in Ediscovery, Practice Point 1, states:

[R]eliance solely on a manual search process for the purpose of finding documents may be infeasible or unwarranted. In such cases, the use of automated search methods should be viewed as reasonable, valuable, and even necessary.[3]

In addition, The Sedona Conference Commentary on Achieving Quality in the Ediscovery Process advises practitioners to utilize technologies that “reasonably and appropriately enable a party to safely and substantially reduce the amount of ESI that must be reviewed by humans.”[4] These commentaries stress the use of technology to achieve proportionality in an electronic discovery process whose burdens have, unfortunately, spiraled out of control in recent years.

Effective? True.

Closely linked to defensibility is effectiveness. Before investing in new technology, legal teams must be confident that it will work and that the change is worthwhile. With sufficient training data, supervised learning can target the documents most likely to be relevant. Using supervised learning to identify and pull responsive documents into categories early reduces the time spent organizing documents responsive to particular requests, and it helps reviewers and the legal team better understand the case early on. Related documents can also be handled more efficiently as a group and can even be assigned to a reviewer with expertise in a particular category.

The effectiveness of this technology may also be tested through sampling. Sampling is the key to measuring, monitoring, controlling and correcting potential errors in categorization, and it is useful in any review to validate results. The technology can systematically and iteratively test the data to evaluate the accuracy of iC (in addition to Intelligent Prioritization). Where sampling was not used, courts have concluded that a party did not take reasonable steps to prevent disclosure.[5] With the flexibility to conduct as much or as little sampling as desired, iC not only reduces the time needed to complete a review, it improves the consistency of and confidence in category determinations.[6]
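
As a hedged illustration of how such sampling might work (the sample size and error count below are hypothetical, and this is not a description of any particular product’s statistics), a team could draw a random sample of machine-categorized documents, have a reviewer re-check them, and report the observed error rate with a simple confidence interval.

    import math
    import random

    def draw_qc_sample(doc_ids, sample_size, seed=42):
        # Simple random sample of already-categorized documents for re-review.
        return random.Random(seed).sample(doc_ids, sample_size)

    def error_rate_interval(errors_found, sample_size, z=1.96):
        # Point estimate plus a rough 95% normal-approximation interval.
        p = errors_found / sample_size
        margin = z * math.sqrt(p * (1 - p) / sample_size)
        return p, max(0.0, p - margin), min(1.0, p + margin)

    doc_ids = [f"DOC-{i:05d}" for i in range(25_000)]
    qc_sample = draw_qc_sample(doc_ids, sample_size=400)
    # A human re-reviews the sample; suppose 12 documents turn out to be miscategorized.
    rate, low, high = error_rate_interval(errors_found=12, sample_size=len(qc_sample))
    print(f"Observed error rate {rate:.1%} (95% CI about {low:.1%} to {high:.1%})")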

Independent studies also show that Intelligent Review Technology (including Intelligent Categorization) is more effective than a traditional, manual review process. The Ediscovery Institute released a survey showing that use of the technological equivalent of Intelligent Categorization reduced review costs by 45 percent or more.[7] In addition, the 2008 TREC Legal Track study demonstrated that a “Boolean keyword search found only 24% of the total number of responsive documents in the target data set” while automated searching methods found 76 percent of the responsive documents.[8]

Devoid of Human Control? False.

Intelligent Categorization is not a process devoid of critical human insight and control. In some instances, this technology has been pitched as a purely hands-off, eyes-off solution. In reality, Intelligent Review Technology as a whole does not replace human reviewers, nor should it. iC works by “learning” from human decisions and applying that logic when suggesting document categories. Human input is required so the technology has data sets with applied classifications from which to learn, and the system learns from both the responsive and the non-responsive decisions of human reviewers. As more documents are received and sorted, legal teams can rely on the technology to continually improve the model while human reviewers focus their efforts on the content and substance of the documents. Because the tool was designed to increase consistency and accuracy, it also affords the flexibility and scalability to give the ediscovery team more control over the review and to leverage as much or as little human input and oversight as is appropriate for the project. Thus, iC is not a substitute for skilled lawyers; rather, it enhances and complements the work they do.

Whether it is reasonable to omit review of some documents altogether is an as-yet unsettled legal question. From a technical standpoint, however, IRT systems can support a range of selective review approaches, such as excluding documents with a sufficiently low probability of responsiveness from review, guiding reviewers to just the most important portions of long documents or focusing extra review on documents likely to belong to sensitive categories.
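
For illustration only, and assuming a model has already produced responsiveness and sensitivity scores (the scores and cutoffs below are invented), one of those selective-review approaches reduces to simple thresholding:

    # Hypothetical per-document model scores; cutoffs would be negotiated and validated.
    docs = [
        {"id": "DOC-001", "p_responsive": 0.92, "p_privileged": 0.05},
        {"id": "DOC-002", "p_responsive": 0.03, "p_privileged": 0.01},
        {"id": "DOC-003", "p_responsive": 0.41, "p_privileged": 0.70},
    ]

    LOW_RESPONSIVENESS_CUTOFF = 0.10   # candidates to set aside from linear review
    SENSITIVE_CUTOFF = 0.50            # candidates for extra (e.g., privilege) review

    set_aside = [d["id"] for d in docs if d["p_responsive"] < LOW_RESPONSIVENESS_CUTOFF]
    extra_review = [d["id"] for d in docs if d["p_privileged"] >= SENSITIVE_CUTOFF]

    print("Set aside from review:", set_aside)        # ['DOC-002']
    print("Routed for extra review:", extra_review)   # ['DOC-003']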

In short, Intelligent Categorization is a defensible, effective, cost-saving measure that leverages the work of talented attorneys to decrease the time required to complete document review. It is designed to meet the flexibility and repeatability needs of the client, and it is proving to be a key differentiator in the ability to respond to electronic discovery demands quickly and proportionately.

Note: The above post appeared in the April 2011 issue of the free, monthly e-newsletter, Case Law Update & Trends published by Kroll Ontrack. This newsletter is designed to help busy legal professionals keep pace with case law and information pertaining to electronic evidence. Subscribe and gain valuable and timely information on new ESI court decisions, as well as informative articles and tips for both the corporate and law firm audience.


[1] See, e.g., William A. Gross Constr. Assocs., Inc. v. Am. Mfrs. Mut. Ins. Co., 2009 WL 724954 (S.D.N.Y. Mar. 19, 2009).

[2] Fed. R. Civ. P. 1.

[3] The Sedona Conference® Best Practices Commentary on the Use of Search and Information Retrieval Methods in Ediscovery. (Published August 2007).

[4] The Sedona Conference® Commentary on Achieving Quality in the Ediscovery Process, available for download at http://www.thesedonaconference.org/dltForm?did=Achieving_Quality.pdf. (Published May 2009).

[5] Mt. Hawley Ins. Co. v. Felman Prods. Inc., 2010 WL 1990555 (S.D.W.Va. May 18, 2010). 

[6] The Sedona Conference® Commentary on Achieving Quality in the Ediscovery Process, Principle 2, states: “In the ediscovery context, statistical sampling can serve as a check on the effectiveness of…automated tools in identifying responsive information and on the reviewers’ ability to correctly code documents.”

[7] See “Ediscovery Institute Survey on Predictive Coding,” available at http://www.lawinstitute.org/.

[8] For complete results from TREC Legal Track, visit http://trec-legal.umiacs.umd.edu

Case Law: In re Fontainebleau Las Vegas Contract Litigation

Case Law

Document Dump of Servers Leads to Privilege Waiver

In re Fontainebleau Las Vegas Contract Litig., 2011 WL 65760 (S.D. Fla. Jan. 7, 2011). In this bankruptcy litigation, the requesting party claimed the third party waived privilege by producing three servers in response to a subpoena and court orders without conducting a review for either privilege or responsiveness. Seeking to use the information but avoid any adverse consequences, the requesting party offered to “eat” the cost of searching the massive document dump of approximately 800 GB and 600,000 documents for relevant materials in exchange for the right to review and use the data free of the obligation to apprise the third party of, or return, any privileged documents. Reviewing the third party’s conduct, the court found that its failure to conduct any meaningful privilege review prior to production constituted voluntary disclosure and resulted in a complete waiver of applicable privileges. Noting that more than two months after production the third party had not flagged even one document as privileged, the court rejected its “belatedly and casually proffered” objections as “too little, too late.” Accordingly, the court granted the requesting party full use of these documents during pre-trial preparations of the case, but ordered it to timely advise the third party of any facially privileged information it encountered upon review.

Commentary

This case demonstrates the importance of conducting a proper review and production process. The third party in this case simply “dumped” a significant amount of information onto the requesting party without engaging in a document review; instead, it complained that the review and production process would be unduly burdensome and delayed production numerous times. Ultimately, the third party produced three servers without conducting a privilege review, though it did belatedly produce a privilege log for one of the servers.

Had the third party conducted a document review, it would not have incurred the cost of numerous motions, nor would it have faced the court’s declaration that privilege was waived (with the exception of the server that had an accompanying privilege log). The document review process may seem like a daunting, costly and burdensome task, but it is often a necessary step in ediscovery. Thankfully, Intelligent Review Technology exists and can save you 50 percent on review costs while improving the quality and defensibility of document review. Through the use of Automated Workflow, Intelligent Prioritization and Intelligent Categorization, counsel can avoid a strictly manual document review process that inherently leads to inaccuracies and inconsistencies. IRT also provides transparent reports and real-time metrics, giving counsel peace of mind as to the defensibility of the technology and process. If the third party in this case had invested in technology to conduct the document review, its costs would arguably have been far less than what was spent on numerous motions and the cost of opposing counsel’s access to and use of privileged materials in pre-trial preparation.

Whitepaper: Intelligent Review Technology

The uncontrolled proliferation of electronic communication and record keeping has transformed the practice of legal discovery. Today, legal professionals conduct discovery in a whirlwind environment, where legal and IT teams identify, preserve, review and produce responsive, non-privileged data to opposing parties, government agencies and investigatory bodies. Before the Federal Rules of Civil Procedure were amended in 2006 to specifically provide for the discovery of Electronically Stored Information (ESI), lawyers and judges could resist the onslaught of technology in discovery, favoring the traditional exchange of paper documents. Since 2006, federal courts have required parties to comply fully with requests for production of ESI and to leverage technological tools that can reduce the burden and expense of discovery. State and local courts are following closely behind. Thankfully, the legal community has discovered that technology has the power to improve the discovery process – and thus reduce the expense of litigation overall – by getting to the merits of a case early on, encouraging collaboration and promoting principles of proportionality.

Despite these advancements, document review continues to be the most costly, complicated and time-consuming part of discovery. The vast volume of data encountered in discovery makes exhaustive linear review economically untenable, even when typical filtering by custodian, date, and keywords is first applied. Fortunately, a new generation of Intelligent Review Technology (IRT) promises to further reduce the expense and unpredictability of document review by identifying, prioritizing, routing, and categorizing documents that are most likely responsive.

Download the newest whitepaper published by Kroll Ontrack and learn all about:

  • What Intelligent Review Technology Really Is
  • The Technology and Benefits Behind Automated Workflow, Intelligent Prioritization and Intelligent Categorization
  • Why Supervised Learning Is an Essential Component of IRT
  • Applications of This Technology Using Real Case Data
  • The Importance of Sampling to Validate Results

Intelligent Review Technology Results in Proven Cost-Savings

The ediscovery landscape is growing more treacherous given the proliferation of data and the increase in sanctions for mismanaging ESI. As the courts grow increasingly intolerant of discovery failures, litigants are faced with two choices: work harder by investing more resources to ensure thorough review, or work smarter by leveraging fewer resources with cutting-edge technology to achieve superior results. Dynamic companies know the latter is always the best option, and for them, the next generation of ediscovery technology has arrived.

Intelligent Review Technology (IRT) combines the best of both worlds by delivering the discerning analytics of a human review team in an automated platform capable of increased processing speed, consistency and accuracy. Although IRT encompasses many different technologies that can work independently or in combination to varying degrees, workflow automation, supervised learning and statistical quality control are the cornerstone features of an effective IRT system. Together, these features allow review to be conducted faster, more efficiently and more accurately than even the best human review teams equipped with current discovery technology.

IRT learns while you work and empowers the defensibility of your arguments with transparent reports and real-time metrics. By analyzing decisions made by lawyers, the system applies human logic to identify likely responsive documents and make categorization suggestions. IRT integrates human input with smart technology, reduces costs by 50 percent, and improves the quality and defensibility of document review.

The technology inside Ontrack® Inview™ has three components:

  • Workflow: Get the right documents to the right people. Workflow is an automated means to distribute and check in documents (a rough sketch follows this list).
  • Prioritization: See the most important documents first. Prioritization evaluates reviewer decisions to identify and elevate documents that are most likely responsive, enabling reviewers to view the most relevant documents first.
  • Categorization: Harness the power of technology to learn from human decisions. Categorization analyzes reviewer decisions and applies logic to suggest categories for the documents not yet reviewed.
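
The Workflow bullet above lends itself to a small illustration. The following is a rough, hypothetical sketch, assuming “workflow” here means rules-based batching and automatic distribution of documents to reviewer queues; it is not a description of how the Ontrack Inview tool is implemented.

    from collections import defaultdict
    from itertools import cycle

    def assign_batches(doc_ids, reviewers, batch_size=50):
        # Round-robin distribution of fixed-size batches to reviewer queues.
        assignments = defaultdict(list)
        reviewer_cycle = cycle(reviewers)
        for start in range(0, len(doc_ids), batch_size):
            assignments[next(reviewer_cycle)].append(doc_ids[start:start + batch_size])
        return assignments

    queues = assign_batches([f"DOC-{i:05d}" for i in range(500)],
                            ["reviewer_a", "reviewer_b", "reviewer_c"])
    for reviewer, batches in queues.items():
        print(reviewer, "has", len(batches), "batches")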

Kroll Ontrack Launches Final Component of Intelligent Review Technology

Innovative Categorization Features, along with Kroll Ontrack Professional Services Experts, Drive Document Review Efficiencies and Decrease Costs

Kroll Ontrack, the leading provider of information management, data recovery and legal technology products and services, today announced the launch of Ontrack® Inview™ version 6.5, the award-winning document review tool, which now includes Categorization functionality. Completing the Intelligent Review Technology (IRT) suite within the Ontrack Inview 6.5 tool, Workflow, Prioritization and Categorization integrate expert human logic with smart technology by evaluating and learning from document review decisions made by lawyers. Committed to building innovative discovery technology, Kroll Ontrack designed IRT to expedite document review schedules, improve the quality of review decisions and reduce overall discovery costs, all while empowering attorneys to focus on case strategy.

The newest Ontrack Inview feature, Categorization, builds upon the Workflow and Prioritization features, which were released in June and August 2010, respectively. By analyzing human decisions and then making categorization recommendations for documents not yet reviewed, Categorization addresses a common challenge posed by traditional document review – category determinations are often made inconsistently and inefficiently. In fact, studies show that a second, repeat review, even by the same staff, often leads to different category decisions. With Categorization technology in the Ontrack Inview review tool, learning begins immediately when category decisions are made by lawyers designated as “trainers.” The system learns with each keystroke and then applies intelligent categorizations to the rest of the review set, which can then be evaluated by the review team and validated using quality control measures.

With Workflow, Prioritization and Categorization, clients can achieve greater than 50 percent cost savings on review. Kroll Ontrack IRT works in four steps (a simplified, generic sketch follows the list):

  1. Train: Lawyers designated as “trainers” review documents and determine whether they are responsive, non-responsive, privileged or some other pre-defined category.
  2. Learn: IRT analyzes the category decisions made by trainers to identify and elevate documents that are most likely relevant and to suggest categories for documents not yet reviewed.
  3. Evaluate: Reviewers make categorization decisions, leveraging intelligent suggestions.
  4. Validate: IRT is fully transparent, with real-time reports and metrics available to validate results, optimize the technology and realize cost savings.
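
The loop below is a deliberately simplified, generic sketch of that four-step cycle. It uses a toy stand-in for the learning step and simulated human decisions, and it is meant only to show the shape of the iteration, not the actual Ontrack Inview algorithms.

    import random

    def train(labeled):
        # Toy stand-in for learning: remember words seen in responsive documents.
        vocab = set()
        for text, label in labeled:
            if label == "responsive":
                vocab.update(text.lower().split())
        return vocab

    def score(model, text):
        words = text.lower().split()
        return sum(w in model for w in words) / max(len(words), 1)

    labeled = [("merger agreement draft", "responsive"),
               ("lunch menu", "non-responsive")]          # 1. Train
    unreviewed = ["agreement amendment", "parking memo",
                  "merger timeline", "menu update"]

    for round_no in (1, 2):
        model = train(labeled)                            # 2. Learn
        ranked = sorted(unreviewed, key=lambda t: score(model, t), reverse=True)
        batch, unreviewed = ranked[:2], ranked[2:]
        for doc in batch:                                 # 3. Evaluate (human decisions, simulated here)
            labeled.append((doc, "responsive" if score(model, doc) > 0 else "non-responsive"))
        qc = random.sample(labeled, 2)                    # 4. Validate via a spot-check sample
        print(f"Round {round_no}: reviewed {batch}, spot-checked {len(qc)} documents")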

“Traditional linear review is no longer tenable. The manual aspect of document review inherently produces inaccuracies and inconsistencies in categorization decisions, which hinders the implementation of a repeatable, defensible process,” said Michele Lange, director of discovery, Kroll Ontrack. “IRT in the Ontrack Inview tool is unique when compared to other tools because it automatically selects and presents sample documents for your review team versus a review team conducting manual, iterative, time-intensive searches to find key documents from which to learn. Furthermore, Kroll Ontrack IRT learns while you work and empowers the defensibility of your arguments with transparent reports and real-time metrics. Kroll Ontrack is truly leading a revolution that is going to change the face of legal discovery.”

In addition, through its Professional Services team, Kroll Ontrack is prepared to help clients leverage this powerful technology when conducting their own reviews in order to achieve consistency, speed and cost savings. Clients also have the option of using Kroll Ontrack Document Review Services, facilities and team of highly qualified document review attorneys to fully leverage the Ontrack Inview tool and its advanced capabilities.

Intelligent Prioritization Leads to Proven Reduction in Document Review Costs and Time

Kroll Ontrack Intelligent Review Technology provides noteworthy savings in both time and money spent on document review using Intelligent Prioritization.

Case Background

A Fortune 100 corporation involved in litigation related to claims for insurance coverage enlisted the services of Kroll Ontrack – the global leader in information management, data recovery and legal technology products and services – to identify, collect, process, review and produce information related to the case. The document review in this matter required analysis of over 100,000 documents using three responsiveness categories, several privilege categories and additional confidential and redaction categories. Throughout the process, Kroll Ontrack Document Review Services Professionals worked with outside counsel to create and implement the most efficient process possible and meet tight production deadlines. Ultimately, first-pass review was completed within 29 days, and a total of just over 21,000 documents were produced in a timely fashion.

Reenactment Using Intelligent Prioritization

Kroll Ontrack Document Review Services Experts reenacted the project described above using Intelligent Prioritization (iP) technology, available only in the award-winning document review tool, Ontrack® Inview™. iP learns from early review decisions and applies that logic to documents not yet reviewed, identifying and routing case-critical documents to reviewers earlier in the process.
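
In outline, and with made-up scores rather than anything produced by the Inview tool, prioritization amounts to ranking unreviewed documents by a model’s predicted probability of responsiveness and serving the highest-scoring ones to reviewers first:

    def next_priority_batch(doc_scores, batch_size=100):
        """doc_scores: (document id, predicted probability of responsiveness) pairs."""
        ranked = sorted(doc_scores, key=lambda pair: pair[1], reverse=True)
        return [doc_id for doc_id, _ in ranked[:batch_size]]

    # Hypothetical scores from a model trained on early reviewer decisions.
    scores = [("DOC-101", 0.91), ("DOC-102", 0.12), ("DOC-103", 0.77), ("DOC-104", 0.35)]
    print(next_priority_batch(scores, batch_size=2))   # ['DOC-101', 'DOC-103']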

Reenactment Metrics

This reenacted project utilized a review staff one-third the size of the original team. By day 25 of the review, the team had coded over 21,000 documents as responsive, using only 28 percent of the labor hours required to reach that total on the original project. Ultimately, less than half (an estimated 46 percent) of the original review hours would have been required to complete the entire document review using iP. As illustrated in the chart below (titled “Total Reviewer Hours”), the use of iP dramatically reduced the total number of reviewer hours incurred, resulting in a significant cost decrease.

Intelligent Prioritization technology also increased average review rates compared to the original project. iP enabled review rates to spike early in the project and maintain a consistently higher rate at the end of the review (see chart titled “Review Rate Comparison”).

In addition to higher review rates, the number of responsive documents coded per reviewer each day significantly improved over the original review. Once iP was fully engaged on day 7 of the review, the number of likely responsive documents routed to reviewers increased dramatically (as demonstrated in the chart titled “Responsive Docs Per Reviewer Per Day”) and then was sustained at a consistently higher level than in the original review.

Resolution

As illustrated in these diagrams, the use of iP dramatically reduced the total number of reviewer hours incurred, increased average review rates and increased the number of responsive documents coded per reviewer each day. These factors resulted in a significant cost decrease.

This reenacted document review study demonstrates that Intelligent Prioritization technology has the power to significantly reduce the cost of document review and:

  • Expedite the review process
  • Increase reviewer efficiency
  • Identify and route critical documents to reviewers earlier, avoiding the discovery of critical case elements late in a document review

Using Review Technology to Boost Quality & Efficiency

When facing a mountain of ESI and a looming discovery deadline, “something must be done” to expedite the review process in a cost-efficient, yet accurate manner.[1]

In a recent case from the Northern District of California, producing parties intent on saving costs refused to hire a third party, instead relying on five attorneys to conduct the entire review of “every bit of that giant mass of information” with no search terms to narrow the data. Seeing “no end in sight,” the court noted the need for a new method and ordered the parties to split the cost, as offered by the requesting party, of retaining a third-party service provider to assist with review and production.[2]

This case highlights the formidable challenges of discovery: managing costs, meeting deadlines, utilizing available means to produce all responsive documents, and resolving disputes in the spirit of cooperation and good faith. Importantly, these challenges can be met with the capabilities of intelligent review technology (IRT), which enables document review teams to conduct a repeatable, defensible and efficient process. IRT augments the human-intensive aspects of the document review process, frees attorneys to work on case strategy, improves the quality of review, and results in faster and cheaper discovery.

The foundation of IRT, workflow, provides the technical framework upon which the other aspects of IRT function. Prioritization then analyzes reviewer categorization decisions and elevates the documents most likely to be relevant so that they are reviewed first. Categorization advances the document review project by analyzing human categorization decisions to recommend categories for documents not yet reviewed by a person. IRT ultimately integrates the irreplaceable input of a human team with smart technology for maximum accuracy and efficiency.

Key features attorneys should look for in intelligent review technologies are:

  • Workflow automation, which minimizes human work and inconsistencies in the staging, distribution, routing, assessment and quality control of the review process.
  • Supervised learning, which automatically produces statistical models to prioritize potentially responsive data by learning from manually reviewed documents.
  • Statistical quality control, especially sampling, which monitors the progress and effectiveness of prioritization and review, and supports defensible decisions to cease review (see the sketch after this list).
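
One common way to support that last point, offered here as a hedged sketch rather than a prescribed method, is an elusion-style sample: randomly sample the documents that will not be reviewed, have a person spot-check them, and estimate how many responsive documents would be left behind. The population, sample size and stand-in “human check” below are all hypothetical.

    import math
    import random

    def elusion_estimate(unreviewed_ids, human_check, sample_size=300, z=1.96, seed=7):
        sample = random.Random(seed).sample(unreviewed_ids, min(sample_size, len(unreviewed_ids)))
        found = sum(1 for doc_id in sample if human_check(doc_id))
        p = found / len(sample)
        margin = z * math.sqrt(p * (1 - p) / len(sample))
        return p, min(1.0, p + margin)   # point estimate and rough upper bound

    # The lambda is a stand-in for a human spot-check; in practice a reviewer reads each sampled document.
    ids = [f"DOC-{i}" for i in range(10_000)]
    rate, upper = elusion_estimate(ids, human_check=lambda d: int(d.split("-")[1]) % 97 == 0)
    print(f"Estimated elusion {rate:.2%}, upper bound about {upper:.2%}")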

[1] Multiven, Inc. v. Cisco Systems, Inc., 2010 WL 2813618 (N.D. Cal. July 9, 2010).

 