Continuous Case Assessment (CCA)

Barry Murphy posted an article on his eDiscovery Journal blog on September 1, 2010, titled "Earlier Early Case Assessment (ECA)?" in which he outlines the maturation of the Early Case Assessment (ECA) model and how the industry has been contending that ECA needs to happen even earlier in the process to have the biggest impact on the overall cost of eDiscovery. Since I have been an advocate of this "the earlier the better" approach for the past 24 months, I obviously agree, and I appreciate the fact that Barry is continuing to write about this important topic.

However, having come from a Six Sigma and Agile development (iterative) background, and therefore having seen the dramatic impact that continuous and iterative process improvement can have on any operating or process model, I believe that there is room for dramatic improvement in the current approach to the eDiscovery lifecycle.

As a first step, I believe that we need to think about the process of eDiscovery as a circular, interactive and continuously changing model instead of the linear model that the EDRM indicates. Once we make this shift in ideology, we will be able to visualize and apply the concepts of continuous analysis and improvement, and possibly end up with a new approach to eDiscovery called Continuous Case Assessment (CCA).

Historically, given the disconnected nature of how data moves through the EDRM, this may have been an impossible task to achieve. However, with the recent advent of end-to-end solutions such as Autonomy, which enable an integrated flow of data around a central repository, and workflow platforms such as Fusion from Exterro, which bring together disparate processing platforms into a common work environment, Continuous Case Assessment (CCA) is now a plausible approach.

Realistically, the disconnected nature of data storage, the geographical and timing challenges of collection and the generally disorganized nature of Information Technology (IT) will continue to pose a barrier to true CCA. However, wouldn't it be nice to be able to utilize a common assessment methodology and technology platform with a full arsenal of de-dupe, search, clustering and other related technologies throughout the entire lifecycle?

I posed this question to a well-known litigator at the International Legal Technology Association (ILTA) conference in Las Vegas last month over a drink and was surprised by his response. He contended that technology was already too invasive and that the last thing he wanted was the ability for someone to easily analyze, and therefore question, every piece of electronic evidence at every step in the process. Or, even worse, to be able to analyze the accuracy of a manual review process (i.e., analyze documents that have already been reviewed by a lawyer or paralegal). He went on to say that litigation is an imperfect process that lends itself to forming conclusions not necessarily based on the detailed facts but on the general preponderance of the evidence, and that over the years our legal system has figured out how to make it work. His summary comment was that adding all of this technology is just going to make litigation more expensive by encouraging more analysis that isn't necessarily going to affect the outcome of most cases.

I am not a lawyer, and I have never played one on TV. However, where I come from, more "analysis" or better "analysis" (all within reason) is a good thing, and methodologies like continuous process improvement have proven very valuable to the accuracy of business decision making and the subsequent success of the enterprise.

I guess that I just don't get it. Is this one of those horseshoes and hand grenades analogies, where being close is good enough in most cases? What does everyone else in the industry think? I am sure that the technology vendors that provide analytics would be in favor of more analytics. But I wonder what the enterprise user that is paying the bills for litigation and Governance, Risk and Compliance (GRC) analytics would think about this. I will ask a few and let you know what they have to say.

The full text of Mr. Murphy’s blog post is as follows:

One of the things I hear in vendor briefings more and more is early case assessment (ECA) happening even earlier.  This can be referred to as “very early case assessment” or “ECA in the wild” or “in-place ECA.”  At the end of the day, it’s all about moving ECA forward to happen in-line with identification and collection so that organizations can save money, make decisions earlier, and simplify the eDiscovery process.

The first “ECA solutions” focused on taking collected data sets, processing the information, and then making it available for review.  As such, these solutions focused on the process, analysis, and review modules of the EDRM, as seen in Figure 1 below.
Figure 1 – First-Generation ECA

Solutions that deliver just the processing, analysis, and review functionality of first-generation ECA can still deliver real value.  The clear trend, however, is for organizations to take more and more activity on the left side of the EDRM (information management, identification, preservation, and collection) in-house.  As such, it is better to have a more integrated “ECA solution,” one that can identify potentially responsive information across multiple data sources, collect it, preserve it (either in-place or in a dedicated preservation repository), process it, run analytics, and present the information for review in a user interface that legal professionals can quickly grasp.

By adding functionality to identify, collect, and preserve, solution providers can deliver even more cost savings and risk mitigation. A more integrated solution does not necessarily mean single-sourcing the ECA solution. As eDiscovery is an immature market, there are not many vendors that offer end-to-end solutions. And those that do offer "full" solutions often have modules that are weak links. The reality is that a full ECA solution may come from a number of vendors that partner to offer full functionality. It's important to test that partnerships are real – that integrations are pre-built and proven as opposed to two vendors that have swapped logos for marketing purposes.

The next-generation of ECA solutions adds not only identification, collection, and preservation features, but also adds production, as seen in Figure 2 below.  While not a necessity, production features allow organizations to quickly produce data to other parties as needed without having to send data to a law firm or out to a service provider for production services.

Figure 2 – Next-Generation ECA

We will explore ECA further in a formal report on the topic, including the benefits of an integrated ECA solution, as well as some of the concerns of moving ECA to the left on the EDRM.  If your organization is doing early ECA, we’d love to hear more about it!

About Charles Skamser
Charles Skamser is an internationally recognized technology sales, marketing and product management leader with over 25 years of experience in Information Governance, eDiscovery, Machine Learning, Computer Assisted Analytics, Cloud Computing, Big Data Analytics, IT Automation and ITOA. Charles is the founder and Senior Analyst for eDiscovery Solutions Group, a global provider of information management consulting, market intelligence and advisory services specializing in information governance, eDiscovery, Big Data analytics and cloud computing solutions. Previously, Charles served in various executive roles with disruptive technology start-ups and well-known industry technology providers. Charles is a prolific author and a regular speaker on the technology that the Global 2000 require to manage the accelerating increase in Electronically Stored Information (ESI). Charles holds a BA in Political Science and Economics from Macalester College.