Litigation is a complex and costly process. Motion practice, discovery and trial form an intricate dance, yet the discovery phase frequently ends up being the most complicated and expensive part of the process. It often steals the show. Discovery—and e-discovery in particular—has a number of moving parts, but from a 10,000-foot view it should be fairly straightforward: discovery is fact-gathering in the form of interrogatories, document exchange and depositions.
Document exchange, however, ends up being much more than gathering and sorting through documents to identify those that are responsive and not privileged. In fact, it becomes unnecessarily complicated. Oddly, the mechanics of document exchange are not the most complicated part of the process. Managing time, money and expectations throughout document exchange is the real trick.
What if we were able to better manage time, money and expectations? Once those concerns are eliminated (particularly cost), the discovery phase of litigation will no longer steal the show, and we can make litigation a much more predictable process.
Discovery (and document exchange in particular) is most intensive in the first six months of litigation. During this time decisions about custodians, data sets and review are being formed. Much of this time is spent figuring out how to minimize the number of custodians, reduce the overall data sets and minimize the number of documents that need to be reviewed. Why? Because historically, the cost of review was directly proportional to the size of the review . . . and the size of review was a result of the number of documents . . . and the number of documents was a result of the number of custodians. With earlier technology, controlling document sizes and custodian counts saved money. But trying to engineer the most effective size of review consumes a great deal of time.
The less we try to engineer the size of the review, and the sooner we simply begin the review at hand, the more time we will save. And a number of new technologies (discussed below) now greatly reduce the time spent on the review process and allow for more effective discovery.
Document review has long been a costly process. In the past, teams of attorneys (hundreds at times) would pore over documents to determine relevancy and privilege. In that context, it was obviously important to reduce the number of documents to be reviewed in order to reduce attorney costs. As a result, a tremendous amount of technology has been used to reduce data sizes and increase the speed of review. But in the past, all of this data-reduction technology came at a cost, which ended up repeating the negative cycle of increased costs.
Modern e-discovery software no longer needs to repeat that cycle—and here’s why. There are many service providers that will help take all of your custodians’ data, across disparate sources, and immediately reduce data sets by removing explicitly nonresponsive data. This can be accomplished through deduplication and noise-reduction technology that can remove all nonrelevant data at the time of ingestion.
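The deduplication step described here can be sketched in a few lines. This is a minimal illustration, not any vendor's actual pipeline; real platforms also deduplicate on normalized email metadata and near-duplicate families, but an exact content hash shows the core idea (the `deduplicate` function and sample data are invented for this sketch):

```python
import hashlib

def deduplicate(documents):
    """Keep only the first copy of each exact-duplicate document.

    `documents` is a list of (name, bytes) pairs. A SHA-256 digest of
    the raw content identifies byte-for-byte duplicates.
    """
    seen = set()
    unique = []
    for name, content in documents:
        digest = hashlib.sha256(content).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append((name, content))
    return unique

docs = [
    ("a.txt", b"quarterly report"),
    ("b.txt", b"quarterly report"),   # exact duplicate of a.txt
    ("c.txt", b"meeting notes"),
]
print([name for name, _ in deduplicate(docs)])  # ['a.txt', 'c.txt']
```

Because duplicates are dropped at ingestion, every copy of the same email attachment across dozens of custodians is reviewed (and billed) only once.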
And the cost? With the proper negotiation, this process can be accomplished at a cost less than the fees associated with paying counsel to discuss the best way to minimize custodians and overall data size. In other words, reducing the set of data from a veritable mountain down to a pile of potentially relevant documents now can be accomplished at a wash.
That pile of information still needs to be reviewed, however, and even with substantial reduction in size there still potentially will be hundreds of thousands of documents to review. This is where further technology comes into the picture. Technology-assisted review (TAR, aka “predictive coding”) can enable a small team of attorneys to review the remaining pile of documents in a matter of weeks, not months. And while it used to cost a substantial fee to use the technology, in today’s e-discovery landscape TAR can be applied at minimal-to-no additional cost beyond the cost of hosting the data. The result is a streamlined document review process that enables litigators to collect more data while expending less time and money to review the data at hand. All you need to do is find technology providers that will accept the challenge and provide the technology at a lower cost.
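At its core, TAR works by training a model on a small attorney-coded seed set and then ranking the unreviewed documents so the likeliest-relevant ones surface first. The toy term-weight scorer below is a deliberately simplified sketch of that workflow; real TAR tools use statistical classifiers and iterative validation rounds, and all function names and sample data here are invented for illustration:

```python
from collections import Counter

def train_term_weights(seed):
    """Learn crude term weights from an attorney-coded seed set.

    `seed` is a list of (text, is_relevant) pairs. A term's weight is
    how much more often it appears in relevant docs than nonrelevant.
    """
    rel, nonrel = Counter(), Counter()
    for text, is_relevant in seed:
        (rel if is_relevant else nonrel).update(text.lower().split())
    return {term: rel[term] - nonrel.get(term, 0) for term in rel}

def rank_for_review(weights, unreviewed):
    """Order unreviewed documents by predicted relevance, best first."""
    def score(text):
        return sum(weights.get(t, 0) for t in text.lower().split())
    return sorted(unreviewed, key=score, reverse=True)

seed = [
    ("merger agreement draft terms", True),
    ("merger price negotiation", True),
    ("office lunch schedule", False),
]
queue = ["lunch menu update", "revised merger terms attached"]
ranked = rank_for_review(train_term_weights(seed), queue)
print(ranked[0])  # revised merger terms attached
```

The payoff is the review order: a small team works from the top of the ranked queue, and the review can stop once the remaining documents score below a validated cutoff, rather than after every document has been read.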
This new discovery workflow will free up a tremendous amount of time spent managing expectations. No longer will it be necessary to determine priority custodians and the most relevant data sources. In the past, much energy was expended managing those two aspects of discovery alone. Today, you simply need to focus on the merits of the litigation and use the technology to work through the data at play in the matter.
With the introduction of such technologies, however, there are new expectations to manage. For example, it is important to make sure that all data relevant to a matter is being collected and that the right TAR process is being followed. But concerns about time and cost will no longer lead the conversation.
Discovery, with its extreme costs and extended timelines, used to steal the show in litigation. Using modern technologies, however, we can move the focus from managing data sizes and document counts to making sure we are concerned with the merits of the case, efficacy of motion practice and the strategy of the overall review.
Let the technology do the heavy lifting. Technology will not resolve every issue that we have in litigation—and new issues may arise—but overall there are enormous economies of scale to gain by applying the right technology. Thankfully, the industry is maturing at a rapid rate, and service providers are rising to meet the challenge to help litigators focus less on time, money and expectations—and more on the case itself.
The LawTech Silicon Valley conference, presented by ALM Media publications The Recorder and Law Technology News, was a day-long event in Palo Alto, Calif., on October 7, 2014. The event featured eight panels that addressed some of the challenges law firms face when adopting legal technology. Here are ten takeaways from the conference:
1. More technology than ever before is available to support the diverse law practice of small, midsize and large firms. Selecting appropriate technology is a challenge facing many practitioners.
2. Technology has enabled the creation of different legal service delivery models, such as virtual law firms, national and global collaboration, and consumer legal services delivery. State bar regulations designed to prevent these new models are eroding.
3. A growing number of attorneys are tech savvy. Curiously, many panelists observed that law firms have been slow to adopt technologies that could improve their practice. This view was echoed in the recent LTN article "Surveys: Law Firm Tech Adoption Sluggish" (October 2, 2014).
4. Among the transformative technologies available to law firms are cloud services, which help relieve firms (particularly solo practitioners and small firms) from the burden of managing IT infrastructure. Cloud services may provide more security than the firm’s data infrastructure, as cloud service providers can invest more in security than many law firms.
5. Effective legal technology should respond to the users’ needs; it has to solve the users’ problems to be useful and adopted. Effective tools are intuitive, simple to use and well designed.
6. Data analytics technology has, and will continue to have, a significant impact on data management. "Predictive coding" technology has revolutionized litigation-related document review, and related technologies can be broadly applied to records management.
7. Many law firms are employing new technologies and associated processes to deal with their clients’ technology such as cloud services and bring your own device (BYOD) environments.
8. Security, security, security. Recent data breaches highlight the need for firms to have strong security procedures in place to protect their data and their clients’ data. A growing number of clients are requiring law firms to provide them with information regarding the firm’s security practices and protocols. Law firms may look to insurance products to help them manage disclosure risks.
9. Security again. Lawyers have the added ethical responsibility to protect their clients' confidential information. Attorneys should ensure that they understand how clients' data are protected when data are in their custody.
10. As diverse technology adoption permeates law firms, clients and attorneys should have a good "IT IQ." This conclusion is supported by Comment 8 to the ABA's Model Rule 1.1 regarding attorney competence, which admonishes attorneys to "keep abreast of changes in the law and its practice, including the benefits and risks associated with relevant technology."
We may never know whether it was the quality of the lecture or the prospect of two mandatory continuing legal education credit-hours that kept close to two hundred attendees glued to their seats for two hours without air-conditioning in Boalt Hall's Booth Auditorium on a very hot October 6. It's a fair bet that when the Berkeley Center for Law & Technology's 7th Annual Privacy Lecture ended, its attendees took away some compelling insights into government surveillance cases they may be asked to argue.
The lecture was moderated by Paul Schwartz, Jefferson E. Peyser professor of law at the University of California, Berkeley School of Law, and featured Cambridge University's professor of security engineering, Ross Anderson, who presented highlights from his paper, "Privacy vs. Government Surveillance: Where Network Effects Meet Public Choice." Three commentators with practices spanning intelligence risk management, innovation economics and international law were also on board to offer alternate perspectives on Anderson's work.
Anderson has garnered a fair amount of press lately for offering less-traveled opinions about the fallout from Snowden-based leaks. Representing more than just a chilling narrative of international spy intrigue, Anderson argues that the revelations also offer a rare view about the economics of surveillance, a dynamic he says is a critical driver shaping the future of government surveillance, at both the domestic and international levels. The legal community should take particular note, he adds, because this dynamic will eventually find its way into U.S. courtrooms where the ‘separation of powers’ may be tested.
Those who understand the playbook that vaulted such companies as Google Inc., Microsoft Corp. and Facebook to the pantheon of tech dominance will see very similar tactics at play in the evolving landscape of government surveillance networks. Anderson says that the same forces that create monopolies in the private sector—network effects, technical lock-in and low margins—are also at work in the public sector:
When faced with the choice of aligning themselves with a small “spy network” like Russia or a much larger one, like the U.S., smaller nations will likely opt for the larger one, where the network effect of critical mass comes into play.
Cisco’s domination of the router market created the technical lock-in that forced China, which didn’t want to use U.S. products, to take the more expensive path of using Huawei Technologies Co., a Chinese vendor, to build out the country’s communications infrastructure.
PRISM, the National Security Agency’s mass electronic surveillance data mining project, gets cheap access (low margins) to customer data from such avid collectors as Google and Facebook.
And it is similar economic forces that will likely create "one of the thorniest problems for courts and legislators in the short-to-medium term," Anderson predicts. Snowden's disclosures revealed just how intermingled the workings of U.S. law enforcement and intelligence agencies are. So much so that PRISM, says Anderson, is essentially an "NSA code word for a data feed managed by the FBI." And it may be only a matter of time, driven by the pressures of economics, before these agencies follow the pattern of information networks and merge.
Carl Shapiro, Transamerica professor of business strategy at UC-Berkeley, suggested that many barriers still exist to a scenario in which U.S. law enforcement and intelligence agencies could successfully merge, citing a long tradition of not sharing information. Law enforcement faces other challenges too, said James Aquilina, executive managing director of the intelligence and risk management firm Stroz Friedberg. "They still need resources and training to deal with digital evidence."
It’s time to think about establishing a “global due process”, advocated Anupam Chander, director of the California International Law Center, University of California, Davis. “Soon even the countries that now try to sequester user data will be sharing it with everyone else.” It’s a fertile area for future treaties. But whether nations will find and exert the collective will needed to make it happen, Chander says, is anyone’s guess.
With the ghosts, goblins and mini chocolate bars, another holiday of sorts is being celebrated this month: National Cyber Security Awareness Month. October is when its government and industry sponsors promote public engagement and education about cybersecurity.
This marks the event's 11th year; it is sponsored by the Department of Homeland Security, the National Cyber Security Alliance, and the Multi-State Information Sharing and Analysis Center. "Cybersecurity is a shared responsibility. Every one of us must practice basic cybersecurity because an intrusion into one computer can affect an entire network," said Secretary of Homeland Security Jeh Johnson in a prepared statement.
To celebrate, event organizers suggest a different activity for each business day in October, such as adding an auto signature highlighting the event to your email, writing a blog post about it or posting a message on social media. Each week of the month also has its own focus:
Week one promotes online safety.
Week two is for the secure development of information technology products.
Week three covers critical infrastructure and the Internet of Things.
Week four is about protecting small and medium businesses.
And the last week (there are five in October) deals with cybercrime and law enforcement. Boo!
Although good business sense should lead law firms toward legal technology innovation and investment, firms are largely static in their adoption of technology, according to recent blog posts from Boston-based Blue Hill Research.
The three-part blog series called “Why Do Law Firms Struggle With Strategic IT?” was penned by David Houlihan, principal analyst at Blue Hill Research. Houlihan examines the last several years of the annual purchasing surveys from the International Legal Technology Association and legal tech blog InsideLegal. Legal tech spending is up from 2013 to 2014, according to the most recent survey. However, there has not been a significant change in IT spending or strategy since 2010, Houlihan said.
Houlihan identified several factors in firms that resisted technology change:
“Attorney Luddite-ism” and cultural resistance can “hamstring” technology deployments.
Time is valuable, and the billable hour frowns upon efficient technology.
More emphasis is placed on the time needed for software training than comprehending its potential benefits.
The partnership model generally focuses on the “disincentive to invest in operation.”
To embrace new technology, Houlihan suggests providing IT stakeholders with information regarding how IT can support growth and profit margins, as well as with education opportunities.
Communication between IT and attorney stakeholders is essential, Houlihan said, and may be accomplished via “cross-functional technology steering committees that combine IT, managing partners, practice leaders, associates and legal support and firm administration staff.”
An uproar recently erupted in the legal community that raises the question of where the market should expect to rely on the government—and where it should look to private industry.
The debate was sparked when decades' worth of public records from five federal courts—the U.S. Court of Appeals for the Second, Seventh and Eleventh circuits and the Federal Circuit, as well as the U.S. Bankruptcy Court for the Central District of California—were summarily taken offline with less than a paragraph of notice to the public.
Notably, some of the removed dockets were from recent cases in the Federal Circuit, where many pivotal patent issues are decided.
In announcing upgrades to the federal government's official warehouse of court dockets and documents—the Case Management/Electronic Case File Public Access to Court Electronic Records system (aka PACER)—the Administrative Office of the U.S. Courts noted, "As a result of these architectural changes, the locally developed legacy case management systems in the five courts … are now incompatible with PACER; therefore, the judiciary is no longer able to provide electronic access to the closed cases on those systems."
Law firms, academics and journalists were apoplectic. Should they be?
A historical perspective is useful. PACER, as a database, was created in 1988, while the ability for the public to file court records electronically was adopted by most courts by 2005.
The intent of the PACER system, similar to that of the Electronic Data Gathering, Analysis and Retrieval database (aka EDGAR), was to relieve the burden of processing and storing paper documents to make court operations more efficient. Public access was not the central requirement. Neither were the needs of practitioners, even though their acceptance was critical to the system's success and adoption.
It was an incredible feat to get the entire legal community to comply with new electronic filing methods, but it has been quite successful. At a time when there are still entire state court systems without a digital database, all federal courts have adopted PACER. Some 2.1 million cases are filed into CM/ECF and PACER every year, generating 60 million docket entries, according to a report by consultant J. Michael Greenwood, one of the system's creators.
Today, only a small number of cases are exempted from e-filing, creating a vast treasure trove of data that makes court proceedings more transparent—and opens up avenues of research and awareness that were at one point too time consuming or cost prohibitive to perform.
Before PACER started gathering and archiving data, accessing a single document meant sending an employee to a courthouse with a bag of quarters in hand to physically copy dockets and documents. Now there is a database that attorneys can use for various tasks, such as comparing briefs in multiple cases or getting a sense of how hard an adversary may fight back on an issue.
Essentially, the government did the heavy lifting by creating a system for filing and retrieving documents and mandating compliance to use it. So with the recent uproar in mind, why does the legal community continue to expect it to do more?
This is reminiscent of EDGAR, which took important company filings out of the Redweld folders and cardboard boxes at the Securities and Exchange Commission and made them accessible. Of course, accessible meant just what the name of the system implied—gathering and retrieval. Tools to search and analyze the data were left to private industry and third-party vendors.
Today, there are a plethora of tools on the market that make the retrieval process on EDGAR seem like the Dewey Decimal system.
Private vendors have delivered tools that can drill down to the deal and document type, evolving the data to become a negotiation and drafting tool giving practitioners instant knowledge as to "what's market" in deal terms and clauses. Some of the more advanced platforms can even deliver results as targeted as individual clauses within EDGAR exhibits.
PATENT & TRADEMARK
The U.S. Patent and Trademark Office is another interesting example of the interplay between government and industry. With a priority on public access, all of its data—including patents and patent applications—is now hosted on a privately owned search engine run by a third-party vendor, Google Inc., whose much stronger technology platform helps avoid service disruptions.
Further, after the White House recently issued a call to improve the strength of the U.S. patent system, a private analytics vendor, Lex Machina, using USPTO and PACER data, is now providing its expertise to the market—as well as back to the USPTO. The company is supplying data to help recipients of demand letters that claim patent infringement to determine the best course of action based on the prior activity of the non-practicing entity issuing the demand.
Court dockets are no different. Once the government forced litigants and judges to file electronically through PACER CM/ECF, it became the responsibility of private industry to take the data further. They have the means, the knowledge and the incentive to develop what the market needs.
In fact, private industry has for years been enhancing the value attorneys derive from PACER. For example, there are a number of products on the market that archive data from court dockets, including LexisNexis Courtlink, Westlaw CourtExpress and my company's Bloomberg Law.
These products offer tools that PACER does not, such as more advanced search options, the ability to receive alerts on new cases and to track the progress of cases in which the attorney is not appearing. (These products still contain many of the cases that were recently removed from PACER.)
There is plenty of knowledge and talent in the market to build on what the government has already provided. If PACER had not created those records in the first place, no one would be upset over their removal; none of the docket vendors would have any record of them.
Now that there is a long history of digitized case dockets, law firms have figured out that the data are more useful than the case management function for which they were traditionally used.
What does the future of innovation look like with regard to PACER?
» Private industry continues to seek ways to decipher the trends, patterns and relationships contained within millions of records, helping attorneys make better strategic decisions in their cases and identify the next big issue for their practice.
» Lawyers are looking for judicial scorecards that show how often a judge grants or denies a particular type of motion, or how long he or she generally takes to reach a decision. A practitioner coming into a new matter may need to quickly understand the entirety of a docket that could grow to thousands of entries, and needs a way to filter it down to the important aspects.
» Law firms are looking to discard dusty old-form files for a system that delivers recent and relevant exemplar documents—and perhaps goes even further by delivering clauses in briefs, similar to breakthroughs with EDGAR data.
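The judicial scorecards mentioned above are, at bottom, a grouped tally over rulings extracted from docket data. The sketch below assumes the rulings have already been parsed into (judge, motion type, outcome) tuples, which is the genuinely hard part in practice; the `motion_scorecard` function and sample data are invented for illustration:

```python
from collections import defaultdict

def motion_scorecard(rulings):
    """Compute grant rates per (judge, motion type) pair.

    `rulings` is an iterable of (judge, motion_type, outcome) tuples
    extracted from docket entries; outcome is 'granted' or 'denied'.
    """
    tally = defaultdict(lambda: [0, 0])  # key -> [granted, total]
    for judge, motion_type, outcome in rulings:
        key = (judge, motion_type)
        tally[key][1] += 1
        if outcome == "granted":
            tally[key][0] += 1
    return {key: granted / total for key, (granted, total) in tally.items()}

rulings = [
    ("Smith", "summary judgment", "granted"),
    ("Smith", "summary judgment", "denied"),
    ("Smith", "summary judgment", "denied"),
    ("Jones", "summary judgment", "granted"),
]
rates = motion_scorecard(rulings)
print(f"{rates[('Smith', 'summary judgment')]:.0%}")  # 33%
```

The analytics products in this space layer exactly this kind of aggregation, at scale and with cleaned data, on top of the raw records that PACER makes available.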
Who is going to deliver these tools? The answer is certainly not government. We should thank the folks at PACER for bringing the federal courts into the modern age, but then we should turn to industry and ask what they plan to do to continue improving PACER's value.