Once, people entering the legal profession could safely assume that the word "metrics" had meaning only to statisticians, engineers and those in similar occupations. But today, with the growing sophistication of electronic data discovery, anyone running legal EDD projects simply must have a working knowledge of metrics. Just look at the renewed focus on constraining costs expressed in the proposed amendments to the Federal Rules of Civil Procedure. The bottom line: you cannot predict and control EDD costs if you cannot measure them.
In general, a metric is simply a measurement of some quantifiable element, so the first criterion for a metric is that it be measurable. In EDD, commonly measured items include data volume, time and expenses.
However, an element can be measurable without being relevant. Therefore, the second criterion is that the metric be meaningful in the context of the matter and to the specific client. Measurable and meaningful metrics can vary by phase. For example, the cost of printing or copying may be relevant during the presentation, review, processing or production phase, but have little or no importance during the analysis, collection or identification phases. Metrics that typically span all phases include labor costs, broadband usage, certain overhead expenses and electronic storage costs.
The term e-discovery covers all aspects of mining electronically stored information for relevant material in a regulatory, criminal or civil matter, including identifying, collecting, storing and searching. Using metrics is essential to accurately forecast the cost of EDD, budget for discovery, control costs as each matter progresses and create efficiencies over time. This is, of course, different from merely receiving a quote from an EDD vendor and then seeing whether it is met. For example, knowing how much it costs Corp. X to collect data for one custodian from, say, two standard data sources over a one-year date range might be essential to a claim of undue burden made to opposing counsel to restrict the scope of EDD. Yet many clients do not have access to this kind of sophisticated metric.
Beyond budgeting, metrics enable efficiencies in the EDD process. If you can standardize the overall process, you can apply metrics, identify improvements and repeat. This is the application of basic Kaizen continuous improvement principles. Armed with metrics, you can insist that, over time, counsel or vendors identify efficiencies at every step to perform EDD more quickly, more economically and more accurately. For example, that can mean less project management time billed per gigabyte of data processed, or more irrelevant data culled before review begins.
A key first step to unlocking the power of metrics is to adopt the special EDD billing codes designed for in-house legal departments, vendors and outside counsel. The UTBMS L600 Code Series was created by the LEDES (Legal Electronic Data Exchange Standard) Oversight Committee (http://www.ledes.org) expressly for EDD use, based on the model proposed by the Electronic Discovery Reference Model. Although no universal EDD metrics currently exist, and metrics necessarily vary by client and matter, the LEDES model divides the process into stages or phases, providing a first step toward measuring costs along the key dimensions of EDD. From those baselines, you can work with vendors and outside counsel to seek efficiencies going forward.
Industry standard metrics can be helpful at first, but because each corporate client has a different data universe, the goal is to develop metrics for a specific client based on actual EDD experience over time. For example, one client may find its employees generally have fewer gigabytes of email stored per year than the industry average. With a good vendor cooperating with IT and legal, baseline metrics and costs can be developed and continually fine-tuned. For example, after measuring perhaps a dozen matters, knowing that it costs $800 on average to collect one gigabyte of email and move it into a document review tool would be a key metric for forecasting cost.
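The baseline arithmetic described above is simple enough to sketch. The following is a minimal illustration, using entirely hypothetical per-matter figures, of how a client might average collection cost per gigabyte across past matters and use the result to forecast a new collection:

```python
# Hypothetical per-matter records: (gigabytes of email collected, total collection cost in dollars).
# Real figures would come from a client's own closed matters, not these placeholders.
past_matters = [
    (12.0, 9_900),
    (30.5, 23_800),
    (8.2, 6_700),
    (45.0, 36_500),
]

def baseline_rate(matters):
    """Average cost per gigabyte across prior matters (total cost / total volume)."""
    total_gb = sum(gb for gb, _ in matters)
    total_cost = sum(cost for _, cost in matters)
    return total_cost / total_gb

rate = baseline_rate(past_matters)   # dollars per gigabyte, roughly $800 with these sample figures
forecast = rate * 25                 # rough forecast for a new 25-gigabyte collection
print(f"Baseline: ${rate:,.0f}/GB; 25 GB forecast: ${forecast:,.0f}")
```

In practice the inputs would come from vendor invoices coded with the UTBMS L600 series, and the rate would be recomputed as each matter closes so the baseline stays current.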
The EDRM Metrics Model is also a valuable framework to help users think about EDD metrics. It provides a framework for planning, preparation, execution, and follow-up of EDD matters and projects by showing the relationship between the EDD process and how information, activities and outcomes may be measured.
By combining metrics, such as the average cost to do X, with the relevant pricing model, it is possible to create a budget calculator designed specifically for EDD. Most of these calculators come with a number of assumptions built in, but they normally let users make modifications. Typically, these programs base budgets on the cost per gigabyte of data, sometimes referred to as the cost per volume. Users may need to do some tweaking, because documents vary widely in how many pages fit in a gigabyte; for example, one gigabyte might hold a 6,000-page document or a 60,000-page file.
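A cost-per-gigabyte budget calculator of the kind described can be sketched in a few lines. All of the rates and the pages-per-gigabyte default below are hypothetical placeholders, not industry figures:

```python
# Hypothetical per-gigabyte rates by EDD phase; real rates come from vendor
# quotes and a client's own historical metrics.
RATES_PER_GB = {
    "collection": 200.0,
    "processing": 350.0,
    "review": 1_200.0,
}

# A default assumption of the kind baked into many calculators: pages per gigabyte.
# Documents vary widely (one gigabyte may hold 6,000 or 60,000 pages), so the
# calculator lets the user override it.
DEFAULT_PAGES_PER_GB = 10_000

def edd_budget(gigabytes, rates=RATES_PER_GB, pages_per_gb=DEFAULT_PAGES_PER_GB):
    """Return a per-phase budget plus an estimated page count for a data volume."""
    budget = {phase: rate * gigabytes for phase, rate in rates.items()}
    budget["total"] = sum(budget.values())
    budget["estimated_pages"] = int(gigabytes * pages_per_gb)
    return budget

standard = edd_budget(40)                        # 40 GB with default assumptions
dense = edd_budget(40, pages_per_gb=60_000)      # same volume, much denser documents
```

The `pages_per_gb` override is the "tweaking" the text mentions: the same 40 gigabytes can represent vastly different page counts, which matters whenever review is billed per document or per page rather than per gigabyte.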
However, it can be nearly impossible to develop meaningful metrics or budget calculators over time if the client uses different vendors and/or counsel, or does not hold them to the same standards. A good EDD vendor working with the client’s legal department and IT team can help put a metrics and cost containment program in place.
Are the big bucks being shelled out on big data not having the big impact anticipated? Michael Schrage wrote in the Harvard Business Review that the answer comes down less to the analytics themselves than to the organizational culture that uses them.
Schrage’s research suggests it’s how the companies use their analytics that really matters. The ones that have moderate outcomes are employing big data for decision support, he said, whereas the most successful return on analytics is when “firms use them to effect and support behavior change.” It seems analytics are the most effective when “they’re used to invent and encourage different kinds of conversations and interactions,” he said.
However, this isn’t as easy as it may seem. “People may need to share and collaborate more; functions may need to set up different or complementary business processes; managers and executives may need to make sure existing incentives don’t undermine analytic-enabled opportunities for growth and efficiencies,” suggested Schrage. He used the example of a medical supply company that employed their results not to support existing sales programs, but to implement entirely new ones. “[T]he most productive conversations centered on how analytics changed behaviors rather than solved problems,” he said. Ask not what analytics can do for you, but what you can do for analytics.
Authenticating social media for litigation is a relatively new phenomenon, but it has not demanded the creation of any new legal theories. In today's litigation, evidence may come from Facebook, Twitter, YouTube, blogs or other social forums. Fortunately, the Federal Rules of Evidence (or applicable rules in your jurisdiction) are all that lawyers need to authenticate social media, blogs, websites and other online content. Basic steps have not changed, but social media adds a few new wrinkles.
Courts recognize that there are special challenges in authenticating social media. As noted in Griffin v. State of Maryland, No. 74 (Maryland; Apr. 28, 2011), there is always a “potential for abuse and manipulation of a social networking site by someone other than its purported creator and/or user.” In that case, the court concluded that a printout from a social media site “requires a greater degree of authentication than merely identifying the date of birth of the creator and her visage in a photograph on the site” to determine whether the person in question actually published a post.
As Griffin found, problems arise from a lack of detailed information. Lawyers often try to enter social media evidence into the record in the form of a website printout. But when social media is out of context, or identifying metadata and links are stripped, authentication is nearly impossible. However, when properly preserved, social media and website content can be vetted using the existing rules of evidence.
U.S. District Judge Paul Grimm of the District of Maryland has addressed how courts can resolve digital media and social media disputes. In Lorraine v. Markel American Insurance Co., 241 F.R.D. 534 (D. Md. 2007), Grimm identified the issues a lawyer must consider when determining admissibility of digital evidence. They include the same standards applied to other types of evidence: relevance, authenticity, hearsay, the original writing rule and probative value as compared with possible unfair prejudice.
The rule that applies most directly to authenticating social media is Rule 901 of the Federal Rules of Evidence. However, Grimm notes that lawyers and judges have made mistakes admitting or denying social media when they forget to first consider Rule 104(a) and (b). Under those rules, a court must consider if a jury could reasonably find that the evidence is authentic—even if there is reason to question the evidence, a judge should not throw it out, but allow the jury to consider the issue. For example, in Parker v. State of Delaware (Del. Sup. Ct. Feb. 5, 2014), the defendant argued that social media evidence had to be authenticated by “testimony of the creator, documentation of the Internet history or hard drive of the purported creator's computer, or information obtained directly from the social networking site.” That standard had been applied in the Griffin case. However, the Parker court, following precedent from a Texas court case, said only that “the jury ultimately must decide the authenticity of social media evidence.”
That means a party objecting to digital evidence has a high burden. They have to show that the evidence is in fact a fake. “A trial judge should admit the evidence,” Grimm wrote, “if there is plausible evidence of authenticity produced by the proponent of the evidence and only speculation or conjecture—not facts—by the opponent of the evidence about how, or by whom, it ‘might’ have been created.” He continued, “Too many courts that considered admissibility of social media evidence completely overlooked this important distinction and, in doing so, made questionable rulings excluding evidence that should be admitted.”
Given that analysis, social media and website evidence is actually difficult to exclude from most matters. However, to convince a jury that any tweet, Facebook post or email is ultimately authentic, a lawyer will need solid forensic analysis. In Lorraine v. Markel American Insurance Co., Grimm noted that in applying the authentication standard to website evidence, “there are three questions that must be answered, explicitly or implicitly: 1) What was actually on the website? 2) Does the exhibit or testimony accurately reflect it? 3) If so, is it attributable to the owner of the site?”
Getting to those facts sometimes takes research and analysis. In State of Connecticut v. Eleck, AC 31581 (Conn. Ct. App. Aug. 9, 2011), the court found “an electronic document may continue to be authenticated by traditional means such as the direct testimony of the purported author or circumstantial evidence of ‘distinctive characteristics’ in the document that identify the author.” Once the proponent produces sufficient evidence to convince a reasonable juror that the social media evidence is authentic, the burden of production shifts to the party objecting to demonstrate the item is a fraud. Lorraine outlined some sensible steps to find out if a social media posting is likely authentic:
1. Ask the purported creator if he or she indeed created the profile and also added the posting in question.
2. Search the computer of the person who allegedly created the profile and posting, and examine the computer’s Internet history and hard drive to determine whether that computer was used to originate the social networking profile and posting in question.
LinkedIn endorsements may be good for the ego, but what they’re not good for is professional ethics. In a recent webcast on LexBlog, Megan McKeon, senior marketing manager at Katten Muchin Rosenman, explains why the legal industry needs to be wary of endorsing attorneys on the social media site.
“With the ABA model rules regulating what we as attorneys do for advertising and marketing it’s important for us to know that endorsements pretty much do run afoul of those model rules,” explains McKeon. According to Rachel Zahorsky in an ABA Journal story, ABA Model Rule 7.1 prohibits a lawyer from making a false or misleading claim about his or her services. The problem with LinkedIn is that people will sometimes “endorse” the skill of a professional without any first-hand experience of that person’s work. This is where lawyers can potentially run into problems, though the issue is not as black and white as it may seem. Zahorsky reports that at least one professional ethics expert says an endorsement is not necessarily false just because the endorser doesn’t know the lawyer directly.
However, McKeon takes the cautious approach. “Most of our attorneys, in pretty much every situation, should not be engaging with endorsements,” she told LexBlog.
While smartphones and tablets grab the headlines, innumerable lawyers still rely on laptop computers to handle an array of demanding legal and business applications that smaller mobile devices, with their limited display, input, processing, memory and storage capabilities, often struggle with.
Like all mobile devices, laptops have improved significantly over the past few years in terms of power, storage and connectivity. Today's top models easily outperform most of their desktop counterparts and even entry-level laptops are now more than capable of supporting routine legal and business tasks. At all price points, major advancements have also been made in battery life, portability (size and weight), durability and style.
The challenge facing a lawyer shopping for a new laptop is finding the model that most closely matches his or her work and lifestyle needs and preferences. To narrow down the number of choices, we've selected the top systems in five key categories. Here they are.
SMALL AND LIGHTWEIGHT LAPTOP
For many lawyers, having an easy-to-carry laptop is far preferable to dragging around a larger system that's weighed down by a big screen and other nonessential components. Still, when shopping for a small and lightweight laptop, it's important not to sacrifice essential features, such as a decently sized display and a comfortable keyboard, for carrying comfort. Go too far, and you'll find yourself with a system that's actually less useful than a tablet.
HP Spectre 13t-3000
The HP Spectre 13t-3000 strikes an acceptable balance between portability and performance. The system offers a highly readable, 13.3-inch display, a great keyboard, an extra-wide touch pad, a touch-screen, and a lightning-fast, solid-state drive.
Processor: 1.6GHz Intel dual-core i5
OS: Microsoft Windows 8
Screen/Display: 13.3-inch, 1920 x 1080 dots, touch-screen
Memory/Storage: 4GB RAM/128GB solid-state drive
Size/Weight: 12.75 x 8.66 x 0.59 inches (width x depth x height); 3.3 pounds
MSRP: $999.99
MacBook Air 11-inch
Mac users looking for highly compact laptops are pretty much limited to the MacBook Air 11-inch. The system's small screen and weak processor are balanced by a solid-state drive and a very lightweight and compact form factor. Apple, incidentally, doesn't offer touch-screens on its laptops. The late Steve Jobs believed that vertical touch-screens are awkward and, over extended periods of time, painful to use.
Processor: 1.3GHz Intel dual-core i5
OS: Mac OS X Mavericks
Screen/Display: 11.6-inch, 1366 x 768 dots
Memory/Storage: 4GB RAM/128GB solid-state drive
Size/Weight: 11.8 x 7.56 x 0.68 inches (w x d x h); 2.38 pounds
MSRP: $999.00
DESKTOP REPLACEMENT LAPTOP
Using a single computer both inside and outside the office is a concept that appeals to lawyers who don't want to waste time synchronizing, configuring and maintaining two systems. The key to this approach is finding a laptop that's powerful enough to effortlessly handle everyday legal and business tasks, yet sufficiently portable to be a reliable traveling companion.
Lenovo ThinkPad X240
Don't let the compact size fool you; the Lenovo ThinkPad X240 is a little dynamo that's more than capable of serving as a desktop replacement. The system's perky processor is complemented by 8 gigabytes of RAM and a relatively spacious 256-gigabyte solid-state disk drive. Whether functioning as a desktop or laptop, the ThinkPad X240 has more than enough power to support virtually any legal or business task.
Processor: 2.10GHz Intel dual-core i5
OS: Microsoft Windows 8
Screen/Display: 12.5-inch, 1366 x 768 dots, touch-screen
Memory/Storage: 8GB RAM/256GB solid-state storage
Size/Weight: 12.03 x 8.19 x 0.79 inches (w x d x h); 3 pounds
MSRP: $1,879.00
Apple MacBook Pro 15-Inch With Retina Display
The stylish top-of-the-line Apple MacBook Pro 15-inch with Retina Display looks good in any office while outperforming most ordinary desktop PCs. The system offers a powerful processor, a boatload of memory (16 gigabytes) and a half-terabyte of storage.