All posts tagged Intelligent Review Technology

It’s Your Turn! Vote Today for the National Law Journal’s ‘Best of 2016’

The polls are open, and it’s awards season. Whatever your persuasion, it’s your turn to vote. The National Law Journal recently announced the finalists for its ‘Best of 2016,’ and Kroll Ontrack is proud to be nominated in NINE categories!

  • Best end-to-end litigation consulting firm
  • Best end-to-end ediscovery provider
  • Best technology assisted review ediscovery solution
  • Best data and technology management ediscovery provider
  • Best data recovery solution provider
  • Best managed document review services
  • Best managed ediscovery and litigation support services provider
  • Best online review platform
  • Best case management software

From now until Friday, February 5, you can vote in the annual reader’s choice survey. This is your chance to rate the products and services you’ve been using in litigation. And it’s an opportunity for those of us in the industry to receive valuable feedback.

While you don’t have to answer every question, we greatly appreciate your support and your feedback – thank you for taking the time!

It’s time to vote.

Podcast: Computer-Assisted Coding Implementation, Judicial Commentary & Cloud Computing

EDiscovery Podcast: Get the Latest Details on All the Current Ediscovery Trends!

On The ESI Report, host Kelly Kubacki, Attorney in the Thought Leadership & Industry Relations division at Kroll Ontrack, welcomes Jessica Jones, Litigation Support Senior Analyst at Ropes & Gray, and Allison Berres, Legal Consultant for Kroll Ontrack, to discuss the hot issue of computer-assisted coding, including its defensibility, what judges are saying and how to actually implement this technology. In the Bits & Bytes Legal Analysis segment, Kroll Ontrack Legal Correspondent Elliot Westman explores the recent case of Suzlon Energy Limited v. Microsoft Corporation.

Download the Latest Edition of the ESI Report!

Computer-Assisted Coding: Notes from U.S. Magistrate Judge Andrew Peck


On October 1, 2011, Law Technology News published an article from United States Magistrate Judge Andrew Peck titled “Search, Forward: Time for Computer-Assisted Coding.” In this article, Judge Peck explored the use of computer-assisted coding within document review while discussing the perceived “judicial endorsement” of keyword searching. A common theme throughout his article – which echoes comments made during the Carmel Valley eDiscovery Retreat – is questioning why lawyers seem insistent on receiving a judicial blessing of this technology before using it. Indeed, this was a prevalent question and theme throughout the “What is it and how is it being leveraged?” session at the national Masters Conference that took place in Washington, D.C. this week. Some attendees noted that they were uncomfortable using this technology without agreement from opposing counsel, and might not use it at all until a case certified its use.

To address the issue of judicial endorsement, Judge Peck provided a thorough analysis regarding the problems inherent in keyword searching, citing several of the well-known opinions on this issue including United States v. O’Keefe, Equity Analytics, LLC v. Lundin, Victor Stanley, Inc. v. Creative Pipe, Inc. and his own opinion, William A. Gross Construction Associates, Inc. v. American Manufacturers Mutual Insurance Co. Judge Peck also cited the Blair and Maron study conducted in 1985, in which a database of 40,000 documents was searched by lawyers. The lawyers believed their manual search retrieved 75% of relevant documents, when only 20% were retrieved.[1]

With this analysis aside, Judge Peck then turned to the newest hot button issue – computer-assisted document review, which he defined as “tools… that use sophisticated algorithms to enable the computer to determine relevance, based on interaction with (i.e., training by) a human reviewer.” After discussing how computer-assisted review tools work, Judge Peck noted that there is no federal or state reported case to his knowledge that has ruled on the use of this technology, further stating that it “will be a long wait” for lawyers waiting for a court to conclude: “It is the opinion of this court that the use of predictive coding is a proper and acceptable means of conducting searches under the Federal Rules of Civil Procedure, and furthermore that the software provided for this purpose by [insert name of your favorite vendor] is the software of choice in this court.”
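
To make the “training by a human reviewer” idea concrete, here is a minimal, hypothetical sketch of the kind of supervised text-classification approach such tools are generally built on. It is not Kroll Ontrack’s (or any vendor’s) actual implementation; the documents, labels and library choice (the open-source scikit-learn toolkit) are assumptions made purely for illustration.

```python
# Hypothetical sketch of computer-assisted (predictive) coding:
# a human reviewer codes a small seed set, and a model scores the rest.
# Illustrative only -- not any vendor's actual algorithm or data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Seed set coded by a human reviewer (1 = responsive, 0 = not responsive)
seed_docs = [
    "license agreement for the patented routing technology",
    "lunch menu for the cafeteria",
    "email discussing infringement of the asserted claims",
    "holiday party invitation",
]
seed_labels = [1, 0, 1, 0]

# Unreviewed documents the computer will score
unreviewed = [
    "draft response on the asserted patent claims",
    "parking ramp closure notice",
]

vectorizer = TfidfVectorizer()
X_train = vectorizer.fit_transform(seed_docs)
model = LogisticRegression().fit(X_train, seed_labels)

# Probability of responsiveness for each unreviewed document
scores = model.predict_proba(vectorizer.transform(unreviewed))[:, 1]
for doc, score in zip(unreviewed, scores):
    print(f"{score:.2f}  {doc}")
```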

In addition, Judge Peck noted that if the use of computer-assisted review technology were presented or challenged in a case before him, he would “want to know what was done and why that produced defensible results,” perhaps being less interested in the “science behind the ‘black box’…than whether it produced responsive documents with reasonably high recall and high precision.” Further, proof of quality control would be important to defending use of the technology.
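
Because Judge Peck frames defensibility in terms of “reasonably high recall and high precision,” a short worked illustration of how those two standard information-retrieval measures are computed may help. The counts below are invented for demonstration only.

```python
# Recall and precision, the two measures Judge Peck references.
# Hypothetical counts from a review -- illustrative only.
truly_responsive_in_collection = 10_000   # responsive docs that actually exist
retrieved = 8_000                         # docs the search/review produced
retrieved_and_responsive = 6_000          # produced docs that are responsive

recall = retrieved_and_responsive / truly_responsive_in_collection
precision = retrieved_and_responsive / retrieved

print(f"Recall:    {recall:.0%}")    # share of responsive docs actually found
print(f"Precision: {precision:.0%}") # share of produced docs that are responsive
```

The Blair and Maron finding discussed above is a recall gap of exactly this kind: the lawyers believed their searches had achieved roughly 75 percent recall when the measured figure was about 20 percent.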

Judge Peck concluded his article with: “Until there is a judicial opinion approving (or even critiquing) the use of predictive coding, counsel will just have to rely on this article as a sign of judicial approval. In my opinion, computer-assisted coding should be used in those cases where it will help ‘secure the just, speedy, and inexpensive’ (Fed. R. Civ. P. 1) determination of cases in our ediscovery world.”

This blog post is intended merely to highlight the main themes of this informative article. We certainly recommend it as a “must read” for practitioners debating whether the use of computer-assisted technology is right for them. To illustrate the technology in practice, here is a recent case study demonstrating the power behind the Intelligent Review Technology (IRT) offered by Kroll Ontrack in its award-winning document review platform, Ontrack® Inview™.

Embroiled in complex patent litigation, a national law firm representing a major health care provider trusted Kroll Ontrack to ensure the document review process was conducted efficiently to meet case-critical deadlines. Already faced with 334 gigabytes of data, totaling over 750,000 documents, at the outset of the case, the law firm was surprised to discover an additional 325,000 documents midway through the review period. Moving the review deadline was not an option, but reviewing the new batch of documents using traditional linear review methods would have required a significant budget overrun, with no guarantee that the deadline could be met.

Following initial filtering work and use of Ontrack® Advanceview™, the leading early data assessment solution, counsel utilized all three aspects of IRT: Automated Workflow, Intelligent Prioritization and Intelligent Categorization. Intelligent Prioritization (iP) ran in the background from the project outset, elevating documents that were more likely to be responsive. iP significantly aided the review process, allowing the team to review and produce 50,000 documents after only two weeks.
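
For a rough sense of what “elevating documents that were more likely to be responsive” can mean in practice, here is a hypothetical sketch that simply assigns the highest-scoring unreviewed documents to the next review batch. The document IDs, scores and batch size are invented; the actual iP workflow is not published here.

```python
# Hypothetical prioritization sketch: serve reviewers the unreviewed
# documents with the highest predicted responsiveness scores first.
# IDs, scores, and batch size are invented for illustration.
unreviewed = [
    ("DOC-0001", 0.12), ("DOC-0002", 0.91), ("DOC-0003", 0.57),
    ("DOC-0004", 0.88), ("DOC-0005", 0.05), ("DOC-0006", 0.73),
]
BATCH_SIZE = 3

next_batch = sorted(unreviewed, key=lambda d: d[1], reverse=True)[:BATCH_SIZE]
print("Next review batch:", [doc_id for doc_id, _ in next_batch])
```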

Once the approximately 325,000 new documents were discovered and added to the review queue, counsel needed a way to review the documents while still meeting the strict deadlines, but did not have the budget available to hire additional contract attorneys. However, by leveraging Intelligent Categorization (iC), counsel met the challenge in stride. After intensive sampling and analysis of the reporting, counsel instructed Kroll Ontrack Document Review Service Professionals to remove documents determined by the iC technology to have a 90 percent confidence rating of non-responsiveness – instantly eliminating nearly half of the documents from the new data set. Any technology is only valuable if it is defensible, so when a rigorous quality control process revealed a staggering 94 percent agreement rate between a human review “dream team” and the iC determinations across the data set, counsel was able to confidently report to the senior partners that the deadline would be met on time, within budget and without risk.
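
The two steps described above – setting aside documents the technology scores as highly likely non-responsive, then checking the technology’s calls against a human quality-control sample – can be sketched as follows. All documents, scores and sample results below are invented for illustration and do not reflect the actual matter.

```python
# Hypothetical sketch of confidence-based culling plus a QC agreement check.
# All data are invented -- illustrative only.
docs = [
    {"id": "A", "prob_nonresponsive": 0.97},
    {"id": "B", "prob_nonresponsive": 0.95},
    {"id": "C", "prob_nonresponsive": 0.40},
    {"id": "D", "prob_nonresponsive": 0.92},
]

CONFIDENCE_CUTOFF = 0.90
removed = [d for d in docs if d["prob_nonresponsive"] >= CONFIDENCE_CUTOFF]
print(f"Removed from linear review: {len(removed)} of {len(docs)}")

# QC: compare the technology's suggestions against human calls on a sample
qc_sample = [("A", "NR", "NR"), ("C", "R", "R"), ("D", "NR", "R")]  # (id, machine, human)
agreement = sum(1 for _, machine, human in qc_sample if machine == human) / len(qc_sample)
print(f"Agreement rate on QC sample: {agreement:.0%}")
```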

The end result? Counsel confidently removed more than 125,000 documents using iC alone, saving almost $200,000 in review costs – with $65,676 attributable to iC.


[1] David C. Blair & M.E. Maron, An Evaluation of Retrieval Effectiveness for a Full-Text Document-Retrieval System, Communications of the ACM, Vol. 28, Issue 3 (March 1985).

Case Law: CBT Flint Partners, LLC v. Return Path, Inc.

Case Law

Federal Circuit Court of Appeals Vacates Taxation of Costs Decision

CBT Flint Partners, LLC v. Return Path, Inc., 2011 WL 3487023 (C.A.Fed. (Ga.)). Previously in this patent infringement litigation, the Northern District of Georgia court granted summary judgment of invalidity regarding the patent dispute. In addition, the district court determined $268,311.22 in costs related to ediscovery were properly taxable. See CBT Flint Partners, LLC v. Return Path, Inc., 2009 WL 5159761 (N.D. Ga. Dec. 30, 2009). On appeal, the Court of Appeals reversed the district court’s decision on the underlying patent dispute, vacated the ruling on costs because the defendant was no longer the prevailing party, and remanded to the district court for further proceedings.

Commentary

Although the 2009 opinion regarding taxation of costs was vacated, it is important to remember that the court’s ruling was a result of the Court of Appeals determining the District Court erred in its analysis of the underlying patent dispute – not the Court of Appeals determining that costs were not properly taxable. Indeed, the entire discussion regarding the cost order was brief:

In light of our disposition, Cisco was not a prevailing party and we therefore vacate the district court’s rulings on costs and we deny the cross-appeal. We remand to the district court for further proceedings consistent with this opinion.

Despite this ruling and others on the topic, parties still face the difficult task of containing costs while navigating ediscovery effectively. How can this seemingly impossible task be achieved? The best advice is for parties to cooperate early on in pretrial conferences. Further, parties must navigate the discovery process with an eye toward efficiency. Courts addressing the issue of costs are largely expressing frustration not only with the lack of cooperation, but also with the failure to limit discovery so as to keep costs reasonable. The discoverability standard remains extremely broad, and the costs of discovery will vary widely depending upon the facts of the case, but litigants should always do their best to ensure that discovery is at least reasonably scoped to avoid unnecessary expense. Parties should also make use of solutions such as Early Data Assessment and Intelligent Review Technology to conduct a proper, thorough and fast analysis and review of the data potentially at issue.

Podcast: Automated Review Technology, Defensibility & Self-Collection Dangers

EDiscovery Podcast: Get the Latest Details on All the Current Ediscovery Trends!

Bored during the commute? Need a break from work? Interested in learning more about cutting-edge ediscovery issues and technologies? Download the newest episode of the monthly podcast, “The ESI Report” hosted by Kroll Ontrack.

Struggling to control the mounting costs of document review? On this April edition of The ESI Report, host Kelly Kubacki, Staff Attorney in the Legal Technologies division at Kroll Ontrack, welcomes Gary Feldon and Maureen Japha, Associates with Covington & Burling LLP, and Beth Koehler, Legal Consultant with Kroll Ontrack, to discuss how automated review technology can significantly improve the speed, consistency and defensibility of the entire review process while cutting costs. In the Bits & Bytes Legal Analysis segment, Laura Tushaus, Kroll Ontrack Legal Correspondent, discusses the recent case of Green v. Blitz U.S.A., Inc.

Download the Latest Edition of the ESI Report!

Case Law: Star Direct Telecom, Inc. v. Global Crossing Bandwidth, Inc.

Case Law

Court Grants Motion to Compel Citing Failure to Identify Information Not Reasonably Accessible

Star Direct Telecom, Inc. v. Global Crossing Bandwidth, Inc., 2011 WL 1125493 (W.D.N.Y. Mar. 21, 2011). In this business litigation, the plaintiff sought disclosure of internal e-mails relating to its breach of contract claim. Opposing the motion, the defendant argued the request was untimely and that the information sought was not relevant, responsive or readily accessible. Noting the duty to supplement production continues even after the discovery period closes, the court found the requested e-mails were relevant and responsive to the plaintiff’s initial document request. Despite the defendant’s argument that producing the e-mails would require searching Exchange databases housed on an external 4 terabyte storage array at a cost of $13,000, the court asserted that the defendant had a duty to identify sources of information that were not reasonably accessible in its discovery response and rejected its belated arguments regarding burden. Accordingly, the court determined the defendant’s initial production was incomplete and granted the motion to compel.

Commentary

This case demonstrates the importance of being prepared for the Rule 26(f) meet and confer conference in order to address important ediscovery issues fully and accurately. It is important to engage in these early discussions as well-informed, prepared counsel may find itself in an elevated bargaining position capable of dictating advantageous terms. In the conference, counsel should make efforts to understand the opposing party’s technical landscape, clarify the scope of document requests, resolve any production format disagreements and pre-empt the negative impact of inadvertent production of privileged documents by entering into a clawback agreement. Other ediscovery topics that counsel should discuss include the preservation of evidence, testifying experts, cost allocation and other anticipated evidentiary disputes.

In order to be prepared for this conference, counsel should also understand their client’s electronic information, which can be achieved by collaborating with IT personnel and in-house counsel (or the organization’s ediscovery team). A data map, which is essentially an outline of a company’s information systems and processes, is also an incredibly helpful resource for identifying data sources and making determinations regarding the accessibility of data – something that was amiss in the current case. Use the data map to strengthen arguments that certain sources are not reasonably accessible by providing credible evidence of undue burden and cost. Other technology, such as early data assessment (EDA), can also be helpful in meet and confer preparations. EDA will help you determine appropriate search terms, allowing parties to collaborate on this important discussion point. The reporting functions within this technology can also help document the process used to determine search terms and validate the quality of those terms for both hits and non-hits.
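
The search-term reporting described above amounts to counting hits per proposed term so the parties can negotiate with evidence in hand. Here is a minimal, hypothetical sketch of that idea; the documents and terms are invented and this is not the reporting output of any particular EDA product.

```python
# Hypothetical sketch of an early-data-assessment style search-term hit report.
# Documents and proposed terms are invented -- illustrative only.
documents = {
    "msg-001": "re: bandwidth invoice dispute and contract renewal",
    "msg-002": "fantasy football league standings",
    "msg-003": "draft amendment to the bandwidth services contract",
}
proposed_terms = ["bandwidth", "contract", "invoice", "football"]

for term in proposed_terms:
    hits = [doc_id for doc_id, text in documents.items() if term in text.lower()]
    print(f"{term:10s} hits: {len(hits):3d}  ({', '.join(hits) or 'none'})")
```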

Finally, it is important for counsel to discuss the use of advanced technology at this conference and reach a documented agreement regarding whether it is acceptable to use such tools as intelligent review technology. Although not yet discussed in case law, reaching an agreement regarding the use of this technology at the Rule 26(f) conference will greatly bolster defensibility. Be prepared to explain to opposing counsel what the technology is and how you plan to use it – a process that may be aided by enlisting the help of an expert.

Two Truths and a Fib About Intelligent Categorization


Time is money, and linear document review is almost prohibitively expensive due to the surge in electronic data volume over the past several years and the corresponding increase in resources required to review the data. Besides time and costs, having a multitude of attorneys reviewing and categorizing documents for (potentially) months on end can yield inconsistent results. Innovative technological advances have arrived on the document review scene, but concerns about overall effectiveness persist as the legal industry remains hesitant to explore new technology. 

Enter Intelligent Categorization (iC). Intelligent Categorization is the third component of Intelligent Review Technology (IRT): it analyzes and learns from category decisions made by human reviewers, then identifies and elevates documents most likely to be relevant and suggests categories for documents not yet reviewed. Along with Automated Workflow and Intelligent Prioritization, the other two legs of IRT, Intelligent Categorization is on its way to becoming a well-established practice in 2011. Differing ideas and opinions associated with this technology have been tossed around, giving rise to misconceptions about what iC is, what iC is not and what iC can do for electronic discovery. Today we will dispel the confusion and set the record straight by exploring two important truths and a common fib associated with Intelligent Categorization.

Defensible? True.

First and foremost, Intelligent Categorization is defensible. One of the early qualms about iC was that until the technology became court-tested, it was too risky to use. That simply is not the case. In fact, such fears have preceded the acceptance of all new technology, including features such as advanced searching and sampling, which are now embraced by jurists and litigants alike.[1] Case law supports the use of a systematic, well-documented and repeatable process, and Intelligent Categorization is specifically designed to increase accuracy and effectiveness while decreasing review time. Indeed, when using all three components of Intelligent Review Technology, it is possible to save 50 percent on review costs. 

Intelligent Categorization also supports the notion of proportionality reflected in Rule 1 of the Federal Rules of Civil Procedure, which calls for the “just, speedy and inexpensive” determination of every action.[2] As an integrated component of IRT, iC is fully transparent with real-time metrics and analytics available throughout the review process. In addition, experts can explain the technology to judges, opponents, clients and staff if necessary.

Further, the Sedona Conference® has endorsed the use of automated methods (although it has not endorsed particular technologies). The Sedona Conference Best Practices Commentary on the Use of Search and Information Retrieval Methods in Ediscovery, Practice Point 1, states:

[R]eliance solely on a manual search process for the purpose of finding documents may be infeasible or unwarranted. In such cases, the use of automated search methods should be viewed as reasonable, valuable, and even necessary.[3]

In addition, The Sedona Conference Commentary on Achieving Quality in the EDiscovery Process advises practitioners to utilize technology that will “reasonably and appropriately enable a party to safely and substantially reduce the amount of ESI that must be reviewed by humans.”[4] These commentaries stress the use of technology to achieve proportionality in an electronic discovery process whose costs have unfortunately spiraled out of control in recent years.

Effective? True.

Closely linked to defensibility is effectiveness. Before investing in new technology, legal teams must be confident that the new feature will work and is worth the change. With sufficient training data, supervised learning can target documents most likely to be relevant. Using supervised learning to identify and pull responsive documents into categories early reduces the time spent organizing documents responsive to particular requests, and helps reviewers and the legal team better understand the case early on. Also, related documents can be dealt with more efficiently as a group and can even be assigned to a reviewer with expertise in a particular category.

The effectiveness of this technology may also be tested through sampling. Sampling is the key to measuring, monitoring, controlling and correcting potential errors in categorization, and is useful in any review to validate results. The technology can systematically and iteratively test the data to evaluate the accuracy of iC (in addition to Intelligent Prioritization). Where sampling was not used, some courts have concluded that a party did not take reasonable steps to prevent disclosure.[5] With the flexibility to conduct as much or as little sampling as desired, iC not only reduces the time needed to complete a review, it improves the consistency of and confidence in category determinations.[6]
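
As a rough illustration of validation sampling, here is a hypothetical sketch: draw a random sample of machine-categorized documents, have human reviewers re-code them, and estimate the agreement rate with a simple confidence interval. The sample size and agreement figure are invented, and the normal-approximation interval is just one common statistical choice, not a statement of any product’s methodology.

```python
# Hypothetical validation-sampling sketch for machine categorization.
# Sample size and results are invented -- illustrative only.
import math
import random

random.seed(7)
categorized_ids = list(range(100_000))            # docs the technology has coded
sample_ids = random.sample(categorized_ids, 400)  # sample sent to human reviewers

# Suppose the human reviewers agreed with the machine on 376 of the 400
agreements = 376
p = agreements / len(sample_ids)
margin = 1.96 * math.sqrt(p * (1 - p) / len(sample_ids))  # ~95% confidence interval

print(f"Estimated agreement: {p:.1%} +/- {margin:.1%}")
```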

Independent studies are also showing that the use of Intelligent Review Technology (including Intelligent Categorization) is more effective than traditional, manual review processes. The Ediscovery Institute released a survey showing that use of the technology equivalent of Intelligent Categorization reduced review costs by 45 percent or more.[7] In addition, the TREC Legal Track study from 2008 demonstrated that a “Boolean keyword search found only 24% of the total number of responsive documents in the target data set” while automated searching methods found 76 percent of the responsive documents.[8]

Devoid of Human Control? False.

Intelligent Categorization is not a process devoid of critical human insight and control. In some instances, this new technology has been pitched as a purely hands-off, eyes-off solution. In reality, Intelligent Review Technology as a whole does not replace human reviewers, nor should it. iC works by “learning” from human decisions and applying human logic when suggesting document categories. Human input is required so the technology has data sets with applied classifications from which to learn, and the system learns from both responsive and non-responsive decisions of human reviewers. As more documents are received and sorted, legal teams can rely on technology to continually improve the model while human reviewers can focus their efforts on the content and substance of the documents. In addition, because the tool was designed to increase consistency and accuracy, it affords the flexibility and scalability to give the ediscovery team more control over the review and to leverage as much or as little human input and oversight as is appropriate for the project. Thus, iC is not a substitute for skilled lawyers; rather, it enhances and complements the work they do.
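
To make the “learning from human decisions as review proceeds” idea concrete, here is a hypothetical sketch in which each new batch of reviewer calls is fed back into a model so its category suggestions keep improving. It uses scikit-learn’s incremental-learning interface purely for illustration; the documents, labels and model choice are assumptions, not any vendor’s actual implementation.

```python
# Hypothetical sketch of incremental learning from reviewer decisions.
# Illustrative only -- not any vendor's actual implementation or data.
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

vectorizer = HashingVectorizer(n_features=2**16)
model = SGDClassifier()  # simple linear classifier trained batch by batch
classes = ["responsive", "non-responsive"]

# Each batch: documents plus the human reviewers' coding decisions
review_batches = [
    (["patent license draft", "cafeteria hours"], ["responsive", "non-responsive"]),
    (["claim chart discussion", "parking notice"], ["responsive", "non-responsive"]),
]

for texts, human_calls in review_batches:
    X = vectorizer.transform(texts)
    model.partial_fit(X, human_calls, classes=classes)  # learn from this batch

# Suggest a category for an unreviewed document
suggestion = model.predict(vectorizer.transform(["follow-up on the license terms"]))
print(suggestion[0])
```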

The question of whether it is reasonable to omit review of some documents altogether is an as-yet undetermined legal question. From a technical standpoint, however, IRT systems can support a range of approaches to selective review, such as extracting documents with a sufficiently low probability of responsiveness from review, guiding a review to read just the most important portions of long documents or focusing extra review on documents likely to belong to sensitive categories.

In short, Intelligent Categorization is a defensible, effective, cost-saving measure that leverages the work of talented attorneys to decrease the time required to complete document review. It is designed to meet flexibility and repeatability needs of the client, and is proving to be the key differentiator in the ability to respond to electronic discovery demands quickly and proportionately.

Note: The above post appeared in the April 2011 issue of the free, monthly e-newsletter, Case Law Update & Trends published by Kroll Ontrack. This newsletter is designed to help busy legal professionals keep pace with case law and information pertaining to electronic evidence. Subscribe and gain valuable and timely information on new ESI court decisions, as well as informative articles and tips for both the corporate and law firm audience.


[1] See, e.g., William A. Gross Constr. Assocs., Inc. v. Am. Mfrs. Mut. Ins. Co., 2009 WL 724954 (S.D.N.Y. Mar. 19, 2009).

[2] Fed. R. Civ. P. 1.

[3] The Sedona Conference® Best Practices Commentary on the Use of Search and Information Retrieval Methods in Ediscovery. (Published August 2007).

[4] The Sedona Conference® Commentary on Achieving Quality in the Ediscovery Process, available for download at http://www.thesedonaconference.org/dltForm?did=Achieving_Quality.pdf. (Published May 2009).

[5] Mt. Hawley Ins. Co. v. Felman Prods. Inc., 2010 WL 1990555 (S.D.W.Va. May 18, 2010). 

[6] The Sedona Conference® Commentary on Achieving Quality in the Ediscovery Process, Principle 2, states: “In the ediscovery context, statistical sampling can serve as a check on the effectiveness of…automated tools in identifying responsive information and on the reviewers’ ability to correctly code documents.”

[7] See “Ediscovery Institute Survey on Predictive Coding,” available at http://www.lawinstitute.org/.

[8] For complete results from TREC Legal Track, visit http://trec-legal.umiacs.umd.edu

The User Experience – Reinvented


Version 7.0 of the Ontrack® Inview™ review tool features a new design to dramatically enhance usability and reduce the cost of document review.

Building on 10 years of award-winning document review tool innovations, version 7.0 is now complete with a modern, fresh design and offers clients a brand new user experience. Developed by users for users, insight for the new user interface was gathered from a global team of experienced reviewers located in Kroll Ontrack document review facilities around the world. This interface, together with the unique Intelligent Review Technology (IRT), maximizes the speed of document review, reiterating the company’s commitment to helping clients achieve cost savings of 50 percent or more on review.

The new, modern Ontrack Inview interface maximizes the speed of your review, saving costs by:

  • Increasing efficiency. With logical task groupings, a new intuitive ribbon bar and more right-click options, clients can gain faster access to commonly used review features.
  • Improving ease of use. Dual monitor support allows for additional screen real estate for easier viewing of potentially relevant documents.
  • Customizing your experience. With drag-and-drop, dockable viewing panes, the screen layout is now customizable for all review needs.

“Maximizing usability in the industry-leading Ontrack Inview review tool is of the utmost importance to Kroll Ontrack because it directly impacts efficiency, satisfaction and cost savings for our clients,” said Michele Lange, director of discovery, Kroll Ontrack. “When clients are more efficient, it takes less time to accomplish a particular review task, resulting in increased productivity, cost savings and a higher level of customer satisfaction.”

To ensure it was delivering as promised on this new version, Kroll Ontrack conducted a usability study with participants who had little to no experience using the Ontrack Inview tool, testing the efficiency of the new review interface. Kroll Ontrack timed users as they conducted 10 review tasks in previous versions of the tool and in the new Ontrack Inview 7.0 interface. The usability study revealed it was easier to schedule reports, locate documents, highlight keywords and translate text. Specifically, these review tasks were conducted 10 percent faster in the Ontrack Inview 7.0 review tool. The new user interface and its enhanced usability and customizability features are also available in the Ontrack® Advanceview™ early data assessment tool, which now features its own distinct look and feel customized for optimum early data assessment in litigation and regulatory matters.

“Kroll Ontrack strives to continually update its technology and services offerings to improve the efficiency and effectiveness of the legal discovery process,” said George May, vice president of product strategy, Kroll Ontrack. “Innovations such as Intelligent Review Technology, which integrates expert human logic with smart technology by evaluating and learning from document review decisions, and the new Ontrack Inview 7.0 review tool reinforce our relentless commitment to helping our clients drive down the total cost of litigation. As the only end-to-end discovery services and technology provider, we are uniquely positioned to offer the most comprehensive, meaningful and cost-effective improvements to our clients to meet their varying needs. Clients can expect and should look forward to continued innovation from us in technology, services and bundled offerings in 2011 and beyond.”

The Ontrack Inview 7.0 review tool and Ontrack Advanceview 7.0 early data assessment tool are available worldwide. For more information about these tools, visit www.krollontrack.com/ontrack-inview or www.krollontrack.com/ontrack-advanceview

Technology Empowers Legal Review Teams and Increases Document Review Efficiency


In his recent New York Times article, John Markoff highlights how “advances in artificial intelligence” (ediscovery software in particular) have created a shift in the way law firms and corporations conduct legal discovery. While Mr. Markoff acknowledges that technology provides greater efficiency and cost savings in document review, he claims that a heavier reliance on technology will ultimately result in decreased demand for lawyers. While we agree that litigants are relying more heavily on technology during discovery, the suggestion that this will have negative repercussions for the practice of law is misguided.

Document review is, by far, the most expensive part of the legal discovery process. Today, for every $1 spent on processing data, $3 or more is spent on document review. This disparity makes traditional document review economically impractical and counter to notions of proportionality and reasonableness. Furthermore, recent court decisions and learned commentary from reputable organizations such as The Sedona Conference® have opened the door for technology to bring efficiencies to the costly practice of document review.

Kroll Ontrack has a long history of developing groundbreaking technology solutions calculated to make the discovery process less expensive and more efficient. These innovations include advanced search technologies, multilingual character recognition and early data assessment. Most recently, Intelligent Review Technology (IRT), which integrates smart technology with legal expertise, has been developed and implemented in the award-winning Ontrack® Inview™ document review tool. IRT expedites document review decisions by prioritizing and suggesting categories for yet-to-be-reviewed documents. IRT learns from lawyers while they work, strengthens the defensibility of document production decisions and ultimately empowers attorneys to focus on what they do best – devising case strategy and making tactical decisions. Most importantly, studies show that this powerful technology can reduce the expense of document review by more than 50 percent.

“Litigants and their counsel are relentlessly pursuing more efficient, cost-effective legal discovery processes,” said George May, vice president of product strategy, Kroll Ontrack. “It’s inevitable that reliance on technology will continue to increase as clients strive to reduce the total cost of litigation.”

Case Law: In re Fontainebleau Las Vegas Contract Litigation

Case Law

Document Dump of Servers Leads to Privilege Waiver

In re Fontainebleau Las Vegas Contract Litig., 2011 WL 65760 (S.D. Fla. Jan. 7, 2011). In this bankruptcy litigation, the requesting party claimed the third party waived privilege by producing three servers in response to a subpoena and court orders without conducting a review for either privilege or responsiveness. Seeking to use the information but avoid any adverse consequences, the requesting party offered to “eat” the cost of searching the massive document dump of approximately 800 GB and 600,000 documents for relevant materials in exchange for the right to review and use the data free of the obligation to apprise the producing party of, or return, any privileged documents. Reviewing the third party’s conduct, the court found that its failure to conduct any meaningful privilege review prior to production constituted voluntary disclosure and resulted in a complete waiver of applicable privileges. Noting that more than two months after production the third party had not flagged even one document as privileged, the court rejected its “belatedly and casually proffered” objections as “too little, too late.” Accordingly, the court granted the requesting party full use of these documents during pre-trial preparations of the case, but ordered it to timely advise the third party of any facially privileged information it encountered upon review.

Commentary

This case demonstrates the importance of conducting a proper review and production process. The third party in this case simply “dumped” a significant amount of information onto the requesting party without engaging in a document review; instead, the third party complained that the review and production process would be unduly burdensome and delayed production numerous times. Finally, the third party produced three servers without conducting a privilege review, but did belatedly produce a privilege log for one of the servers.

Had the third party conducted a document review, it would not have had to spend money on numerous motions and would not have faced a ruling that privilege was waived (with the exception of the server that had an accompanying privilege log). The document review process may seem like a daunting, costly and burdensome task, but it is often a necessary step in ediscovery. Thankfully, Intelligent Review Technology exists and can save you 50 percent on review costs, while improving the quality and defensibility of document review. Through use of Automated Workflow, Intelligent Prioritization and Intelligent Categorization, counsel can avoid a strictly manual document review process that inherently leads to inaccuracies and inconsistencies. IRT also provides transparent reports and real-time metrics, giving counsel peace of mind as to the defensibility of the technology and process. If the third party in this case had invested in technology to conduct the document review, its costs would arguably have been far lower than what it spent on numerous motions and on opposing counsel’s access to and use of privileged materials during pre-trial preparation.

 