
Balancing the Equation Between Technology and Effective Legal Project Management

Much of the hype around new technology promises a magic pill that can solve our business problems at speeds never before imaginable. Those of us who have lived through more than one technology wave concede that the truth involves integrating proven tech with solid work methodology to achieve reliable results. Being first to the party without that discipline can prove both awkward and disastrous.

Add to the mix a highly time-compressed business threat requiring all hands on deck and stratospheric resource demands, and who's ready to bet the farm? In this new article series, we explore where the balance point lies between technology selection/use and an effective legal project management approach in solving critical business use-case challenges.

Let’s Buy a Company!

Business combinations can be powerful, particularly with slowing organic growth in many industries. The strategic decision to acquire a company can quickly expand market share and geographic reach, add new product/service capabilities and open access to an entirely new client/customer base. It can also trigger regulatory scrutiny that can thwart completion of the deal while destroying equity value, market capitalization and goodwill in a single blow. That potential failure can even weaken a company to the point of its own undoing.

Under U.S. antitrust law, a second request is a discovery procedure by which the Antitrust Division of the U.S. Department of Justice and the Federal Trade Commission investigate mergers and acquisitions that may have anticompetitive consequences. Unlike many other business challenges, a second request is a highly time-compressed process: the target is typically given a 90-day window in which to respond, combined with an enormous volume of data it must review for responsiveness and privilege and ultimately produce.

When Cost Is No Object But
Space Travel Remains Elusive

Elon Musk sending his sports car into space just because he could is one thing (albeit still a curiosity), but companies willing to write blank checks when a deal is contested don't always succeed. Most large-enterprise second request legal budgets are proportional to the risk the deal presents, inclusive of the downside risk and cost of a failure to launch and the potential loss of the revenue growth and market access the business combination would otherwise deliver (frequently in the billions of U.S. dollars).

Even with a cost-be-damned strategy to get a second request response approved, scaling to meet those demands is rarely successful without outside counsel and a capable alternative legal service provider (specializing in e-discovery technology best practices and global review as a core competency) partnering on some level with the company.
This remains true even when the company selects any one of an array of the most reputable outside counsel firms that have integrated some level of captive document review services.

All Documents Are Not
Created Equal

One of the initial technologies adopted by many legal departments, and ultimately by the courts, was the sample-based learning found within technology-assisted review (TAR 1.0). TAR uses samples, or "seed sets," to train the algorithm, which then applies coding to the larger review universe. Statistical analysis is then run on the entire review set to confirm the accuracy of the algorithm. If the desired accuracy isn't met, you must train the system again, in an iterative fashion, until you reach your accuracy target.
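For the technically inclined, that iterative loop can be sketched in a few lines of Python (scikit-learn is used here purely for illustration; the documents list, the review_code stand-in for attorney coding, and the seed-set, sample-size and recall numbers are assumptions, not features of any particular review platform):

    # Illustrative TAR 1.0 (sample-based learning) loop, not a product implementation.
    import random
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import recall_score

    TARGET_RECALL = 0.90   # accuracy level agreed for the production (assumption)
    SEED_SIZE = 500        # initial seed set coded by senior reviewers (assumption)
    SAMPLE_SIZE = 300      # validation sample coded each round (assumption)

    vectorizer = TfidfVectorizer(max_features=50_000)
    X = vectorizer.fit_transform(documents)   # `documents`: extracted text (assumed input)

    # 1. Reviewers code an initial seed set (it must contain both responsive
    #    and nonresponsive examples for the model to learn anything).
    training_idx = random.sample(range(len(documents)), SEED_SIZE)
    training_labels = {i: review_code(i) for i in training_idx}   # `review_code`: attorney decision, 1/0

    while True:
        # 2. Train on everything coded so far and apply coding to the full universe.
        model = LogisticRegression(max_iter=1000)
        model.fit(X[list(training_labels)], list(training_labels.values()))
        predicted = model.predict(X)

        # 3. Validate against a fresh random sample coded by reviewers.
        sample_idx = random.sample(range(len(documents)), SAMPLE_SIZE)
        sample_truth = [review_code(i) for i in sample_idx]
        if recall_score(sample_truth, predicted[sample_idx]) >= TARGET_RECALL:
            break   # desired accuracy met; the predictions stand

        # 4. Otherwise fold the sample into the training set and retrain -- the
        #    recalibration step that must be repeated whenever the search terms
        #    or the data set materially change.
        training_labels.update(dict(zip(sample_idx, sample_truth)))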
The issue is that this process must be recalibrated each time you change the search terms for the issues sought (a frequent need in most cases) or whenever the data set changes. That is not ideal, and it can be fatal in a second request scenario, where the 90-day, time-compressed period for production is not flexible. It can also back you into a corner by requiring an army of attorneys and reviewers to boil the ocean on every issue in this make-or-break endeavor. Keep in mind that the end game is to review and produce relevant, nonprivileged documents, so the priority is getting eyes on the documents most critical in terms of relevance and privilege at the earliest possible point (otherwise known as speed to legal intelligence). Recalibrating TAR 1.0 is not something we have the bandwidth to accomplish if we need to go back to the well each time there are material changes to the search terms or data set.

Game Changer

Technology-assisted review
comes in a variety of flavors, with
protocols that include simple
passive learning (SPL), simple
active learning (SAL) and a newer
approach, continuous active
learning (CAL). The “continuous”
aspect of CAL refers to the
ongoing process of ranking and
re-ranking documents for manual
review based on a constant stream
of incoming coding throughout
the review’s lifecycle. In other
words, CAL takes into account
not only an initial set of training
assessments to rank and prioritize
documents, but continuously
updates those rankings based on
the most recent assessments. This removes any need to go back to the well, retrain on new seed sets and revalidate results until a target level of accuracy is achieved.
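As a rough sketch of what "continuous" means in practice, a CAL loop might look like the following (again Python/scikit-learn for illustration only; documents, review_code and starting_examples are assumed stand-ins, not platform features). Every batch of coding decisions immediately feeds the next ranking, so there is no separate seed-set and validation cycle to restart:

    # Illustrative continuous active learning (CAL) loop: every batch of
    # reviewer decisions updates the model, and the remaining documents are
    # immediately re-ranked for the next batch.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    BATCH_SIZE = 50   # documents served to reviewers per pass (assumption)

    vectorizer = TfidfVectorizer(max_features=50_000)
    X = vectorizer.fit_transform(documents)   # `documents`: extracted text (assumed input)

    unreviewed = set(range(len(documents)))
    coded_idx, coded_labels = [], []

    # Bootstrap with a handful of documents counsel has already coded, e.g. hot
    # documents from custodian interviews (must include both responsive and not).
    for idx in starting_examples:
        coded_idx.append(idx)
        coded_labels.append(review_code(idx))   # attorney decision, 1/0 (assumption)
        unreviewed.discard(idx)

    while unreviewed:
        # Retrain on every coding decision made so far.
        model = LogisticRegression(max_iter=1000)
        model.fit(X[coded_idx], coded_labels)

        # Re-rank everything still unreviewed based on the latest coding.
        remaining = sorted(unreviewed)
        scores = model.predict_proba(X[remaining])[:, 1]
        ranked = [idx for _, idx in sorted(zip(scores, remaining), reverse=True)]

        # Serve the most-likely-responsive documents to the review team next.
        for idx in ranked[:BATCH_SIZE]:
            coded_idx.append(idx)
            coded_labels.append(review_code(idx))
            unreviewed.discard(idx)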
While we have had great success in negotiations with the government (helping the government understand and become comfortable with the process and benefits of using CAL), CAL can also be helpful even when it isn't used to make coding decisions.

Some of the benefits of utilizing
CAL:

  • The ability to run CAL in the background and use it to prioritize documents for review.
  • More effective use of finite time windows to review responsive rather than nonresponsive documents.
  • Identification of documents for secondary workflow review (privilege, redaction, logging) earlier in the review process.
  • The ability to quickly rank and prioritize new data as it is added.
  • No additional cost to use CAL in most (if not all) cases, and most frequently a measurable spend reduction from doing so.

CAL vs. TAR 1.0 (Flying
Versus Walking)

Continuous active learning (CAL, or TAR 2.0) uses support vector machine (SVM) learning to draw a dividing line between likely responsive and nonresponsive documents and to rank documents from 1 to 100, with 1 being least likely to be relevant and 100 being most likely. Documents are ranked from the outset of review and are continuously re-ranked during the course of review. There are two different workflows we can use in active learning (a minimal sketch of both follows the list):

  • Priority review: Unreviewed, highly ranked documents are given to reviewers first, feeding them the most likely responsive documents to review.
  • Coverage review: Documents the model is struggling to categorize (around rank 50) are prioritized, allowing the system to learn most quickly and "stabilize," the point at which further training no longer improves the accuracy of the model.
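Here is a minimal sketch of the difference between the two queues, assuming each unreviewed document already carries a 1-100 relevance rank from the model (the ranks dictionary, document IDs and batch size below are illustrative, not tied to any specific platform):

    # Illustrative queue selection from model ranks; `ranks` maps
    # document ID -> current 1-100 relevance rank (assumed input).

    def priority_batch(ranks: dict[int, int], batch_size: int = 50) -> list[int]:
        # Priority review: serve the highest-ranked (most likely
        # responsive) unreviewed documents first.
        return sorted(ranks, key=ranks.get, reverse=True)[:batch_size]

    def coverage_batch(ranks: dict[int, int], batch_size: int = 50) -> list[int]:
        # Coverage review: serve the documents the model is least sure
        # about (ranks closest to 50) so it learns and stabilizes fastest.
        return sorted(ranks, key=lambda doc_id: abs(ranks[doc_id] - 50))[:batch_size]

    # Example: with ranks {101: 97, 102: 12, 103: 55, 104: 49}, priority_batch
    # serves document 101 first, while coverage_batch serves 104 and 103 first.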

With CAL workflows being an
ongoing process, the general
assumption is that all responsive
documents will be manually
reviewed and that all coded documents
will be incorporated into
the continuously growing training
set. Put in the context of a second request, this means getting "eyes on" the relevant documents, and on those that may be privileged, is mission critical.

Unlike TAR 1.0 approaches, the end goal is not to automatically classify documents as either responsive or nonresponsive. CAL is instead optimized to route likely responsive documents to the manual review queue while curtailing the inclusion of nonresponsive documents. In this context, ongoing document prioritization is the driver rather than automated one-time classification, making the notion of a "seed set" largely irrelevant.

The benefits of CAL over TAR
1.0 are:

  • Faster review of priority documents.
  • Less time spent on administration.
  • Far greater flexibility and versatility.
  • Easy handling of rolling, continuously added data.
  • Validation with confidence.
  • Resistance to incorrect or contradictory coding decisions.

Next Steps

The balance between technology selection/use and effective legal project management strategy/workflow is a dance that can be mastered for each business use case. Through a proactive and highly collaborative team approach, critical business tasks can be solved in both a cost- and process-effective manner. Our next article will dig deeper into the skills each team member needs for us to get there.

H. Bruce Gordon is the manager of e-discovery in the Office of the General Counsel for The Vanguard Group. Gordon's career spans more than 20 years of ESI response management and service as an IT manager/liaison to legal departments, including at Teva Pharmaceuticals, AmerisourceBergen Corp. and the Rohm and Haas Co.

This story was originally published on Corporate Counsel.
