Failing Expectations: Where’s the Promised ROI of Technology?

July 14, 2017

Email. Blogs. SaaS. CMS. E-Billing. Push Marketing. Law firms seem to be overflowing with new technology. Why, then, does the 2017 Altman Weil Law Firms in Transition Survey say that our profession is faced with an urgent “… need for greater efficiency” in the face of “… the inexorable force of technology innovation” and that legal technology has “… not delivered on its promise of greater efficiencies?”

We want to discuss this failed expectation. Is it really technology’s fault or, rather, the way we implement technology? If technology is just a tool, are we using this tool correctly? We want to hear your stories – the failures as well as the successes. This first webinar will be the start of a process designed to uncover the core issues and to collaborate on answers that can make a real difference.

The beginning of this discussion will be a webinar on July 26th at 11:00 A.M. CDT. Chuck Cole, Lucid IQ Director of Client Engagement and expert in knowledge management and business processes, will be joined by Tom O’Connor, attorney, consultant, and independent expert in managing the litigation process, to discuss their combined 60 years of experience helping law firms and corporations take a fresh look at making technology work for you.

You can register for the webinar here or leave your comments below.

Speakers

Chuck Cole, Director of Client Engagement at LucidIQ, has extensive experience in business process management (BPM), project management, strategic knowledge management, and other roles that optimize the use of information and results. Chuck helps our clients implement innovative practices and strategies that optimize effectiveness and value. Chuck holds an M.B.A. from Georgetown University.

Tom O’Connor is the Director of the Gulf Coast Legal Technology Center, a legal think tank based in New Orleans. He is a well-known consultant and speaker and is also a prolific writer in the area of computerized litigation support systems. Tom’s consulting experience is primarily in complex litigation matters, where he has worked on numerous major cases, most recently the BP litigation.


Why Is TAR Like Ice Cream?

July 10, 2017

From my Advanced Discovery blog of July 7:

Short answer: because both have so many flavors.

You wouldn’t ask me to go to the store to “get some ice cream” without telling me what flavor you want.  But everyone these days is talking about TAR (Technology Assisted Review) like it’s a flavor all its own.  By everyone, I mean columnists, bloggers, consultants and sometimes even judges.

Remember the cases Aurora Cooperative Elevator Company v. Aventine Renewable Energy or Independent Living Center of Southern California v. City of Los Angeles, where courts ordered the use of predictive coding after extensive discovery squabbles? Or more recently, Judge Peck declining to order the parties to use TAR in Hyles v. New York City, by which, in that case, he meant predictive coding.

Which illustrates my point: what do we mean when we say TAR?

When it comes to TAR, pretty much everyone agrees with this framing statement made by Maura Grossman and Gordon Cormack in their seminal article, Technology-Assisted Review in E-Discovery Can Be More Effective and More Efficient Than Exhaustive Manual Review, XVII RICH. J.L. & TECH. 11 (2011):

Overall, the myth that exhaustive manual review is the most effective—and therefore, the most defensible—approach to document review is strongly refuted. Technology-assisted review can (and does) yield more accurate results than exhaustive manual review, with much lower effort.

But then things go south. Why?

First, because it is always unclear whether TAR is a synonym for predictive coding, as the cases above illustrate. And as a further example, in a recent post on Ralph Losey’s blog, E-Discovery Team, a subtitle states, “New First Class Added to the TAR Course,” with the first sentence then stating, “We also added a new class on the historical background of the development of predictive coding.”

Second, because any discussion of TAR involves selecting documents using algorithms. Algorithms. Math. Warning. Warning. Danger Will Robinson. Attorneys react to math the way astronaut David Bowman reacted to HAL in 2001: A Space Odyssey – like it’s trying to kill them.

Want a good example? Take a look at Ralph’s most recent blog about his TAR course.  Great course, extremely comprehensive, tremendous insight into TAR.  But here’s a paragraph from the site:

Ralph misspoke in the video at point 8:39 when he said 95% confidence interval, he meant to say 95% confidence level. The random sample of 1,535 documents created a probability of 95% +/- 2.5%, meaning a 95% confidence level subject to an error range or interval of plus or minus 2.5%.
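The arithmetic behind that correction is actually simple. A minimal sketch (an illustration only, using the standard normal approximation with the worst-case prevalence assumption p = 0.5, not anyone's vendor tool):

```python
import math

def margin_of_error(n, z=1.96, p=0.5):
    """Half-width of the confidence interval for a sample of size n.
    z = 1.96 corresponds to a 95% confidence level; p = 0.5 is the
    worst-case (maximum-variance) prevalence assumption."""
    return z * math.sqrt(p * (1 - p) / n)

# A random sample of 1,535 documents gives roughly a 2.5% interval
# at the 95% confidence level:
print(round(margin_of_error(1535) * 100, 1))  # 2.5
```

In other words, the confidence level (95%) says how often the sampling procedure captures the true value, while the interval (±2.5%) says how wide the resulting range is.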

Gain curves, x-axis vs. y-axis, Horvitz-Thompson estimators, recall rates, prevalence ranges, and my personal favorite, “word-based tf-idf tokenization strategy.” All this geek talk makes me yearn for the days of Trover and Replevin.
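To take the scariest-sounding one: tf-idf, stripped of jargon, is a few lines of arithmetic that scores a word by how often it appears in a document, discounted by how common it is across the whole collection. A toy illustration (hypothetical three-document corpus, not any vendor's implementation):

```python
import math
from collections import Counter

docs = [
    "contract breach damages",
    "contract negotiation terms",
    "patent infringement damages",
]

def tf_idf(term, doc, corpus):
    """Term frequency times inverse document frequency."""
    words = doc.split()
    tf = Counter(words)[term] / len(words)           # how often in this doc
    df = sum(1 for d in corpus if term in d.split()) # how many docs contain it
    idf = math.log(len(corpus) / df)                 # rarer across corpus = higher
    return tf * idf

# "damages" appears in 2 of 3 documents, so it scores lower in the
# third document than the rarer word "patent" does.
print(tf_idf("patent", docs[2], docs) > tf_idf("damages", docs[2], docs))  # True
```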

Third, because we’re talking about a process, not a product. The Wall Street Journal made exactly that point in a 2012 article entitled “Why Hire a Lawyer? Computers are Cheaper.”  Ralph calls it the multi-modal approach: a combination of people and computers to get the best result.

Everyone agrees that manual review is inefficient (the lawyer part), but nobody can agree on what software the lawyers should use and how: the geek part.  And when geeks start disagreeing over technology, that’s when things get uncertain.

So where does this leave us? 

The idea behind predictive coding – that technology can help reduce the cost of eDiscovery – is a great one. But figuring out what pieces of technology to apply at what point in the workflow is not so easy, especially when the experts disagree as to the best methodology.

Remember when Judge Facciola said in the O’Keefe case that areas of technical expertise are where even angels fear to tread? Believe me — the angels are taking a BIG detour around this subject.

My advice?

Before you think about using more advanced technology, use basic tools early on: dedupe, denist, cull by dates, sample by custodians, and start with basic search terms agreed upon by both sides. Then get an expert with legal experience to perform more advanced analytics and explain the procedure to you in simple English.
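Those basic steps are nothing exotic. A minimal sketch of two of them, deduplication by content hash plus a date cull (hypothetical file records, for illustration only):

```python
import hashlib
from datetime import date

# Hypothetical document records: (filename, raw content, last-modified date)
documents = [
    ("memo.docx", b"draft merger memo", date(2015, 3, 1)),
    ("memo_copy.docx", b"draft merger memo", date(2015, 3, 2)),  # duplicate content
    ("old_note.txt", b"unrelated 2009 note", date(2009, 6, 15)),
]

def cull(docs, start, end):
    """Keep one copy of each unique file (dedupe by content hash)
    and drop anything outside the agreed date range."""
    seen, kept = set(), []
    for name, content, modified in docs:
        digest = hashlib.md5(content).hexdigest()
        if digest in seen:
            continue  # exact duplicate content, already kept once
        if not (start <= modified <= end):
            continue  # outside the relevant date range
        seen.add(digest)
        kept.append(name)
    return kept

print(cull(documents, date(2014, 1, 1), date(2016, 12, 31)))  # ['memo.docx']
```

Real processing tools do this (and denisting against known-system-file hash lists) at scale, but the logic is exactly this simple.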

TAR isn’t a piece of software. It’s a process that can include many different steps, several pieces of software, and many decisions by the litigation team. If you and/or your expert can’t explain that process to the Court as quickly and concisely as you can order a waffle cone with one scoop of rocky road, one scoop of chocolate pecan fudge, jimmies, and some pineapple sauce, then you may find yourself on the receiving end of a document exchange protocol order drafted by the party who could.

To see all the Advanced Discovery blogs, go to http://www.advanceddiscovery.com/blogs/


Still Time to Register for Georgetown Law eDiscovery Training Academy

April 21, 2017

Georgetown Law’s eDiscovery Training Academy is guaranteed to provide you with a unique learning experience. The Academy’s full-week curriculum will give you a total immersion in the subject of eDiscovery, featuring a highly personalized and interactive instructional approach designed to foster an intense connection between all students and a renowned faculty.

The Academy has been designed by experts to be a challenging experience leading to a comprehensive understanding of the discipline. It is demanding, but it will be one of your most exciting and successful learning experiences if you are determined to invest the time and effort.

This year’s Academy will be held from June 4-9. To register for the Academy, please visit this page.


Bill Hamilton of U of Florida Levin College of Law & Tom O’Connor of Advanced Discovery talk computer basics for lawyers on a free webinar

March 1, 2017

Join Bill and Tom on Wed March 8 at 12PM Eastern as they discuss “Computer Basics for Lawyers: Building the Foundation of E-Discovery Competence”

This one-hour program will present for lawyers an overview of computer and network operation basics and illustrate why understanding basic computer operations and architecture is critical for a successful e-discovery practice. The program will build from a discussion of computer logic gates to understanding the structure of computer files as collections of on and off bits. It will also explain basic computer programming and how a computer performs such tasks as adding and how it “remembers.”
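The gates-to-arithmetic idea the program describes can be sketched in a few lines. A toy illustration (not the presenters' materials): XOR and AND gates form a one-bit "half adder," and repeating that with carries adds whole numbers the way hardware does.

```python
def half_adder(a, b):
    """One-bit addition from two logic gates:
    XOR gives the sum bit, AND gives the carry bit."""
    return a ^ b, a & b  # (sum, carry)

def add_bits(x, y):
    """Add two non-negative integers using only gate-level operations:
    combine sum bits, shift the carry bits left, repeat until no carry."""
    while y:
        x, y = x ^ y, (x & y) << 1
    return x

print(half_adder(1, 1))  # (0, 1): one plus one is binary 10
print(add_bits(5, 3))    # 8
```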

From this foundation, the program will explore the operation and implications of computer peripherals, the range of computer devices, computer networks, social media, cloud computing, and the emerging “Internet of Things.” The program will emphasize how the various operational and architectural features of computers and networks impact and trigger the decisions lawyers must make when navigating the preservation, collection, processing, review, and production phases of electronic discovery.

Free registration is available at: https://zoom.us/webinar/register/e9926315a616f9be34538d7d4481ef37


WHAT EXACTLY IS E-DISCOVERY ANALYTICS?

December 11, 2016

A funny thing happened on the way to the webinar. On Wed Dec. 14th, Advanced Discovery is presenting a live webinar entitled “New Developments in Analytics” (link and full information at the end of this article). But while preparing the slide deck and speaking with several of our internal experts on the AD Consulting Team (special thanks to Susan Stone, Julia Byerson and Todd Mansbridge for all their feedback) as well as several clients about the topic, I found that we had a surprising lack of agreement on some of the key terms.

I had always viewed TAR (Technology Assisted Review) as the granddaddy of this discussion because many years ago I felt that TAR essentially meant keyword searching. I also felt it then evolved into Predictive Coding, and later in the game the phrase “analytics” was grabbed from big data people to refer to some data analytics tools.

So my world view of TAR looked something like this:

Structured Analytics
• Email threading
• Near duplicate detection
• Language detection

Conceptual Analytics
• Keyword expansion
• Conceptual clustering
• Categorization
• Predictive Coding

And a recent article by another vendor expressed the view that there are three classes of analytics – structured, conceptual and predictive, with predictive including TAR.

Finally, this graphic from Relativity shows that their world view of RAR (Relativity Assisted Review) appears to be one all-encompassing definition.

But other people were looking at these terms from a different perspective. One of our Solutions Consultants elaborated that:

Conceptual indexing is an internal (non-client facing) analytics tool.
Predictive coding is a class of workflows that can sit on top of different internal analytics tools.
RAR is a product (or a feature of a product) that combines both the internal analytics tool of conceptual indexing with a repeatable, defined predictive coding workflow.

Another of our experts expressed it much more simply:

Predictive coding is a process not a product or service.

And of course, I had to add to the confusion by asking where CAL (Continuous Active Learning) fit into this hierarchy. One of our senior analytics gurus responded to that query with:

In my opinion CAL is a workflow. It selects seed documents based on categorization which is live rather than passive which requires you to submit after you have reviewed a set number.

Finally, when I ran all this by Matthew Verga, the Advanced Discovery VP of Marketing Content, he fell clearly in the tools vs workflow camp, saying:

TAR is a process that uses analytic tools to amplify human decision-making. Relativity Assisted Review is a form of TAR [and] is powered by categorization, an analytic tool. Neither Analytics nor TAR contains the other as a subset; they are different categories of things.

And indeed it is this workflow paradigm that is much more prevalent today, as we will discuss in the webinar. But what I’m interested in is hearing what YOU think. Analytics, TAR, Predictive Coding …. how do YOU define these terms and how do you use them in your work with ESI?

Drop me a note at tom.oconnor@advanceddiscovery.com and join us on Wednesday at 1pm EST for an hour-long nuts and bolts discussion about how to use technology in your eDiscovery practice. I’ll be joined by Anne Bentley McCray from McGuireWoods LLP, an attorney experienced in working with ESI, to discuss these concepts and others.

The registration page can be found at: https://attendee.gotowebinar.com/register/1528164311149570051


ANOTHER GREAT ILTACON

September 3, 2016

It’s Friday morning and ILTACON16 is in the books. This is my favorite conference of the year, for several reasons. First, ILTA is a user group for IT folks at law firms, which means they have a very high degree of technical understanding. Second, they are interested in solutions that work well in their IT structure, so they have a wider view of technical specifications. And third, they talk with each other about vendors and solutions, so they are well versed in the overall market tensions and variations.

The show was well attended as always, with approximately 1500 people on site. This made for good interaction in the conference venue, the National Harbor Gaylord. But the size of the venue also meant that the exhibit hall had plenty of elbow room. With 1500 attendees and close to 200 vendors in the hall, this was a welcome change from other shows with small uncomfortable venues.

Another feature of ILTACON is the close placement of social events, so that vendor parties and receptions for special groups were easy to find and therefore attend. My favorite this year was the Bryan U reception at the Public House, which drew a great turnout of ESI luminaries including Michael Arkfeld, Casey Flaherty, Scott Cohen, Craig Ball, Ian Campbell and Kim Taylor. Host Bill Hamilton showed us a short video trailer discussing the school’s latest project, an online ESI competency course. It looks very promising, and more information will be forthcoming on the school’s web site.

The educational sessions were, as always, tremendous: three and a half days of multiple sessions on a host of technical issues. Security was a topic of high interest, as was info governance with overtones of ediscovery. And panels on project management and metrics with Mike Quartararo of Stroock & Stroock & Lavan and Scott Cohen of Winston & Strawn were big draws.

Mike was also at the Author’s Corner showing off his new book, Project Management in Electronic Discovery. It is literally the first book on the subject and was a big hit; kCura even bought a number of copies to give away in their booth. Great job, Mike!

And I was personally heartened to see the attendance at two lit-support-specific panels I spoke on Thursday afternoon, the last educational slot of the conference. On the first, I moderated A Road Map To Gathering and Analyzing Client Discovery Data Across Matters, which featured AD’s own Kate Head as a panelist, with ILTA stalwart Chad Papenfuss, Litigation Support Services Manager at Kirkland & Ellis, roving the audience with a microphone and prompting discussion. Great session!

The second was the lit support group’s conference-ending annual Gather ‘Round for a Litigation Support Roundtable. A great turnout of over 40 people, including Kate, Chad, Julie Brown of Vorys, Sater, Seymour and Pease, and Craig Ball, made for a lively discussion on new technology and trends that left us heading home on a high note.

Finally, I must note that this ILTACON was the swan song for Executive Director Randi Mayes, who has announced her retirement. I’ve known Randi for more years than either one of us cares to admit, and she has always been a great leader and an even better person. We’ll all miss her.

All in all, a great conference with excellent content and attendee discussions. I highly recommend it and hope to see more of you next year at ILTACON17 at Mandalay Bay in Las Vegas.


Tech Assessment and You

August 22, 2016

From this week’s Advanced Discovery blog:

I’ve been following the blog series on Insourcing vs. Outsourcing, by my Advanced Discovery colleague Matthew Verga, and found this week’s chapter especially interesting. The series is basically a more detailed deep dive into the topic that Matthew and I addressed in a webinar a while back (you can see a replay of the presentation here: https://attendee.gotowebinar.com/register/2825368125753789186).

The most recent installment is called Organizational Self-Assessment: Technology Factors and can be found on the Advanced Discovery blog page at http://www.advanceddiscovery.com/blog/2016/08/organizational-self-assessment-technology-factors/. The topic is Technology Factors, which, as Matthew defines the term in this context, refers to an organization’s overall technology resources, sophistication, and comfort level. In reading that, I came to realize that when we did our webinar, we didn’t mention the wonderful tech audit tool for attorneys.

The tool was first developed by Casey Flaherty ( www.linkedin.com/in/dcaseyflaherty ) when he was Corporate Counsel at Kia Motors. The short version is that while at Kia, Casey decided to test the tech skill level of the company’s outside counsel. A full background of that story can be found in an ABA Journal article at http://www.abajournal.com/legalrebels/article/could_you_pass_this_in-house_counsels_tech_test.

The more recent development is that, after leaving Kia, Casey started doing consulting work and teamed up with Professor Andrew Perlman, Dean of Suffolk University Law School (http://www.suffolk.edu/law/faculty/26811.php), to create a new tech test he calls the tech audit. Suffolk has a legal technology think tank called the Institute on Law Practice Technology & Innovation (http://legaltech.suffolk.edu/), and together they have created a Legal Tech Assessment project (http://www.techassessment.legal/). This project uses the tech audit not only to show how much you know (or do not know), but also how much time you waste on basic tech tasks when you don’t know enough.

It’s a fascinating story with equally fascinating results.  Take a look at some of the links to read about the results. You’ll be surprised.

Or not.

And don’t forget to follow the rest of Matthew’s series. Next week he’ll be writing about financial factors, the third of his four categories of key factors for organizational self-assessment. And later in the month, he’ll begin the fourth category, human resources factors, before finally turning to key takeaways from the entire series. Don’t miss it.