Tuesday, February 24, 2015

Our deepest fear is not that we are inadequate. Our deepest fear is that we are powerful beyond measure. It is our light, not our darkness that most frightens us. We ask ourselves, Who am I to be brilliant, gorgeous, talented, fabulous? Actually, who are you not to be? You are a child of God. Your playing small does not serve the world. There is nothing enlightened about shrinking so that other people won't feel insecure around you. We are all meant to shine, as children do. We were born to make manifest the glory of God that is within us. It's not just in some of us; it's in everyone. And as we let our own light shine, we unconsciously give other people permission to do the same. As we are liberated from our own fear, our presence automatically liberates others.

Wednesday, February 11, 2015

Jason Atchley : Information Governance : Why IT Shouldn't Settle for Limited Visibility into VM Storage

Jason Atchley

Why Shouldn’t IT Settle for Limited Visibility Into Their VM Storage?

The modern data center has invested heavily in hosting, developing and managing virtual workloads, making data-aware virtual machine storage more necessary than ever. In organizations where data is distributed across virtualized infrastructure and thousands of VMs, building proactive, real-time security and compliance processes can be complex. To combat external and internal threats, as well as human errors that put sensitive data at risk, IT teams must expand their visibility into the storage layer of VM environments.
Why shouldn’t IT settle for poor VM storage visibility? Because as the data center shifts and expands, understanding what’s lurking in the shadows of stored files becomes essential to protecting an organization, its employees and its customers. To overcome some of the data management problems typically associated with virtualized environments, IT needs 360-degree, file-level visibility into virtualized data sets.
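To make "file-level visibility" concrete, here is a minimal, hypothetical sketch (not DataGravity's implementation; the mount point and patterns are assumptions) of scanning files under a mounted VM datastore for sensitive-looking data such as Social Security and credit card numbers:

    # Hypothetical sketch: walk a mounted VM datastore and flag files that
    # contain sensitive-looking patterns. Paths and patterns are illustrative.
    import os
    import re

    SENSITIVE_PATTERNS = {
        "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
        "credit_card": re.compile(r"\b(?:\d[ -]*?){13,16}\b"),
    }

    def scan_datastore(root="/mnt/vm-datastore"):  # assumed mount point
        findings = []
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    with open(path, "r", errors="ignore") as f:
                        text = f.read()
                except OSError:
                    continue  # skip unreadable files
                for label, pattern in SENSITIVE_PATTERNS.items():
                    if pattern.search(text):
                        findings.append((path, label))
        return findings

    if __name__ == "__main__":
        for path, label in scan_datastore():
            print(f"{label}: {path}")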
The shift to data-aware VM environments at the storage level provides organizations with the ability to:
  • Troubleshoot and remediate privacy, security and compliance issues;
  • Search, protect and govern stored data;
  • Gain operational, security and business insights;
  • Evaluate data at the VM or file level based on several key factors; and
  • Avoid common dark data pitfalls.
Click through our newest SlideShare, “Bring Data Awareness to Your VMware Environment: Why IT shouldn’t settle for incomplete VM storage visibility,” to learn how data-aware storage helps control and optimize VM environments.

About Jeff Boehm

Jeff Boehm is the vice president of marketing at DataGravity. He brings to the company more than 20 years of experience and a rare combination of marketing skills, organizational leadership and technical background. Having shaped the BI and search markets while working for industry pioneers and disrupters, Jeff is excited to be redefining the storage market.

Wednesday, November 5, 2014

Jason Atchley : Information Technology : Five Software Approaches to Harness the Internet of Things

jason atchley

Five Software Approaches To Harness The Internet Of Things

Research is evidence-based. But how much information is too much? The life sciences industry is facing an onslaught of digital data, from the proliferation of mobile devices and genetic sequencing to new biomarkers and diagnostics. For the life sciences industry to make use of all this input – to really succeed in the new “Internet of Things” – technology companies are searching for a holistic approach to harnessing this vast pool of data.
As the technology industry evolves to meet the needs of life science, software-as-a-service (SaaS) companies – companies that host their software in a cloud infrastructure – add a variety of tools to support research. Some tools are purchased and others built in-house, and this can create a dilemma. How do you get diverse products to behave as a single, logical unit without tearing everything down and starting again?
The manufacturing industry is a far cry from life sciences: it deals with “things,” not people, and its mechanisms, though complicated, are nothing compared to human bodies. Still, there is a lesson to be learned by looking at how Volkswagen solved the problem of supplying parts to a large variety of models quickly, reliably and in massive volumes.
Volkswagen is one of the biggest automobile manufacturers in the world and owns companies like Audi, Seat and Skoda, each with many different models: cars of all shapes and sizes, from super-minis to hulking people carriers. But what do they have in common? They are all based on the VW Golf. No matter what size, shape and purpose the cars have, they all share the VW Golf’s MQB platform.
The system is a shared modular platform for transverse engine, front-wheel-drive cars. Using the Golf MQB platform, VW can produce a wide variety of vehicles that sell quickly and reliably on a scale that satisfies its market but does not lead to waste via over-production in anticipation of demand.
Drawing a comparison, how can a software platform be developed that delivers the same benefits as VW’s standardized component platform? Starting from a common platform is obvious, but creating one from scratch would be too expensive. VW’s route to a platform was evolutionary, with gradual adoption over time. That is the approach Medidata is adopting with Service-oriented Architecture (SOA).
At Medidata, our products are being designed to behave in concert with one another, as a platform, while still retaining each product’s unique characteristics and intrinsic technology stack (e.g., Java, .Net, Ruby). This can be achieved by adding services into the existing product mix so they can handle the activities that individual products share, such as authentication, feature authorization, workflow management, and alerts and notifications.
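As a hedged illustration of that idea (the service name, endpoints and response shape below are hypothetical, not Medidata's actual API), a product written in any stack can delegate authentication to a shared service over HTTP instead of implementing it internally:

    # Hypothetical sketch of a thin client for a shared authentication service.
    # The base URL, endpoints and response fields are assumptions.
    import requests

    AUTH_SERVICE_URL = "https://auth.example.internal"  # hypothetical endpoint

    class AuthClient:
        """Lets any product delegate login and token checks to the shared service."""

        def login(self, username, password):
            resp = requests.post(f"{AUTH_SERVICE_URL}/sessions",
                                 json={"username": username, "password": password},
                                 timeout=5)
            resp.raise_for_status()
            return resp.json()["token"]  # assumed response shape

        def verify(self, token):
            resp = requests.get(f"{AUTH_SERVICE_URL}/sessions/{token}", timeout=5)
            return resp.status_code == 200

    # Each product (Java, .Net, Ruby, ...) would call the same endpoints,
    # so authentication logic lives in one place instead of in every product.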
By separating out common product functionality and delegating it to services, we’ve reduced complexity, standardized interchangeability and increased product utility, just as VW did. However, developing services, or standard components, to take on the tasks already handled by innate product functionality only gets you so far.
For the whole to be greater than the sum of its parts, each new service has to outperform its original in-product role. This is achieved by adopting new cloud methodologies and approaches:
The 12-factor app: Each service must be compliant with the twelve-factor methodology for building software-as-a-service. One of the tenets of the methodology covers a topic called “process formation”: when a developer creates a service, it should be built so each constituent process can be duplicated onto another cloud server to take on more work should the load increase. This allows the application to scale horizontally and increase performance to match the level of processing required by the users. In the pre-cloud days, greater performance meant buying a bigger server, which was both expensive and slow to spin up.
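A minimal sketch of what that looks like in practice, under assumed names (the queue URL and environment variables are invented for illustration): the worker keeps no local state and reads its configuration from the environment, another twelve-factor tenet, so identical copies can simply be started on additional servers as load grows.

    # Hypothetical sketch of a stateless, horizontally scalable worker process.
    # Configuration comes from environment variables, not local files, so any
    # number of identical copies can run on different cloud servers.
    import os
    import time

    QUEUE_URL = os.environ.get("QUEUE_URL", "https://queue.example.internal/jobs")
    WORKER_ID = os.environ.get("HOSTNAME", "worker-1")

    def fetch_job():
        # Placeholder for pulling one job from a shared queue; a real service
        # would call the queue's API here. We simulate a job for illustration.
        return {"id": int(time.time()), "payload": "example"}

    def process(job):
        # All state needed to handle the job travels with the job itself,
        # so it does not matter which worker copy picks it up.
        print(f"{WORKER_ID} processed job {job['id']} from {QUEUE_URL}")

    if __name__ == "__main__":
        while True:
            process(fetch_job())
            time.sleep(1)  # simulate steady work; not needed in a real worker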
Auto-scaling and flexibility: It is one thing to spawn another process to carry an ever-increasing user load, but it’s quite another to control it. Auto-scaling allows your service to scale up automatically when certain pre-defined conditions are met and, more importantly, to scale down and stop paying for the extra nodes when the user load drops off.
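The decision rule behind auto-scaling can be sketched in a few lines; the thresholds and node limits here are hypothetical, and in practice a cloud provider's auto-scaling service applies equivalent rules for you.

    # Hypothetical sketch of the scale-up / scale-down rule an auto-scaler applies.
    # Thresholds, node limits and the load metric are illustrative assumptions.

    SCALE_UP_THRESHOLD = 0.75    # average load above which we add a node
    SCALE_DOWN_THRESHOLD = 0.25  # below which we remove a node and stop paying for it
    MIN_NODES, MAX_NODES = 2, 20

    def desired_node_count(current_nodes, average_load):
        """Return how many nodes the service should run given current load."""
        if average_load > SCALE_UP_THRESHOLD and current_nodes < MAX_NODES:
            return current_nodes + 1      # scale up under sustained load
        if average_load < SCALE_DOWN_THRESHOLD and current_nodes > MIN_NODES:
            return current_nodes - 1      # scale down when load drops off
        return current_nodes              # otherwise hold steady

    # Example: 4 nodes at 80% load -> 5 nodes; 4 nodes at 10% load -> 3 nodes.
    print(desired_node_count(4, 0.80), desired_node_count(4, 0.10))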
12-factor compliance: Make services more reliable by improving the way they are built, tested and deployed into production, keeping each one as self-contained as possible. That way, when a bug does arise, it can be easily traced and fixed by a group of developers working in unison.
Hypermedia service adoption: Application programming interfaces (APIs) should behave like web pages, letting a developer navigate to and visualize data more easily. This should allow them to write better-quality integrations.
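As a hedged example of the idea (the URLs and link names below are invented), a hypermedia response carries named links that a client follows, much like clicking a link on a web page, instead of hard-coding every URL:

    # Hypothetical sketch of navigating a hypermedia (HATEOAS-style) API.
    # The base URL and link relation names are assumptions for illustration.
    import requests

    def follow(resource, rel):
        """Follow a named link in a hypermedia response, like clicking a web link."""
        href = resource["_links"][rel]["href"]
        resp = requests.get(href, timeout=5)
        resp.raise_for_status()
        return resp.json()

    study = requests.get("https://api.example.internal/studies/123", timeout=5).json()
    # Instead of hard-coding "/studies/123/sites", the client follows the link
    # the server advertises, so the server can evolve its URLs independently.
    sites = follow(study, "sites")
    print(sites)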
Unified reporting strategy: Using SOA, we can combine the audit trails from all component modules into a stateless, homogenized, service-driven data feed that can be used for comprehensive, platform-wide reporting and analytics.
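A minimal sketch of that homogenization step, with invented module names and field layouts, might look like this: each module's audit records are mapped onto one common event shape before being fed to reporting.

    # Hypothetical sketch: normalize audit records from different product modules
    # into one homogenized event format for platform-wide reporting.
    # Module names and field names are invented for illustration.
    from datetime import datetime, timezone

    def normalize(module, record):
        """Map a module-specific audit record onto a common event shape."""
        if module == "edc":  # e.g., a data-capture module
            return {"module": module, "user": record["userId"],
                    "action": record["event"], "at": record["timestamp"]}
        if module == "randomization":
            return {"module": module, "user": record["actor"],
                    "action": record["operation"], "at": record["occurred_at"]}
        raise ValueError(f"unknown module: {module}")

    now = datetime.now(timezone.utc).isoformat()
    feed = [
        normalize("edc", {"userId": "u42", "event": "form.signed", "timestamp": now}),
        normalize("randomization", {"actor": "u7", "operation": "subject.assigned",
                                    "occurred_at": now}),
    ]
    for event in feed:
        print(event)  # a single, uniform stream ready for reporting and analytics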
In short, to meet the ever-increasing demands for data from regulatory authorities, software-as-a-service providers have to make relatively diverse products work together closely. At Medidata, adopting a common component approach, as VW has done, gives us a flexible platform that is reliable and robust. And using a cloud-based, service-oriented architecture allows us to scale up and down. That helps our customers run complex clinical trials cost-effectively.
[A version of this article appeared as “A Platform In The Cloud” in Pharma Technology Focus.]


Thursday, October 30, 2014

Jason Atchley : Legal Technology : New Discovery Rules to Rein in Litigation Expenses

jason atchley

New Discovery Rules to Rein in Litigation Expenses

Creighton Magid, Corporate Counsel

Ask U.S. businesses about the country’s legal system, and the primary complaint almost always involves cost. Not far behind is the lament that the high cost of litigation forces companies to offer generous settlement payments, even when the merits of the case suggest that the case should be taken to trial or settled for a much smaller amount. Although lawsuits are likely to remain expensive, the federal judiciary has approved new rules that represent an important assault on runaway costs.
Most of the cost involved in a typical business lawsuit is incurred in pretrial discovery. Although discovery has long been expensive, costs exploded with the introduction—and now near-ubiquity—of electronic data, including emails and databases. In larger cases, the cost of document discovery can easily reach into the millions of dollars per lawsuit. Perhaps more important, because the court rules place little restraint on the tendency of lawyers to search through as much data as possible in the hope of finding something useful, the process is not only expensive but inefficient. One survey of large lawsuits found that for every 1,000 pages of documentation produced in discovery, only one page became an exhibit at trial.
This treasure hunt for documents imposes costs on companies in another respect: The current litigation system requires companies to incur significant costs to ensure that documents and data—primarily in electronic form—are preserved for potential litigation. The reason is that, with increasing frequency, the outcome of lawsuits is determined not so much by whether a contract was broken or whether a fraud took place as by how well the litigants preserved electronic data. Earlier this year, for example, a federal judge in Louisiana instructed a jury that because a pharmaceutical company had failed to preserve certain records—even though the records may not have been useful in the lawsuit—the jury was “free to infer those documents and files would have been helpful to the plaintiffs or detrimental to” the pharmaceutical company. The jury went on to slap the pharmaceutical company with a $6 billion punitive damages verdict.
The risk of such an instruction drives many companies to spend huge sums on document preservation and storage that would otherwise be unnecessary for their business operations. A report issued earlier this year by professor William Hubbard of the University of Chicago Law School pegged the “fixed” cost of implementing hardware and software systems to preserve electronic data to be $2.5 million per year for large companies, and the additional, lawsuit-specific costs of preserving data to range from $12,000 per year for small companies to nearly $39 million per year for large companies.
On September 16, the Judicial Conference of the United States—a group of 26 federal judges that serves as the policy-making body for the federal courts—announced several proposed amendments to the rules that govern discovery and trials in the federal courts, amendments that address the problem of discovery cost in several significant ways. First, the new rules emphasize that discovery is to be “proportional to the needs of the case, considering the importance of the issues at stake in the action, the amount in controversy, the parties’ relative access to relevant information, the parties’ resources, the importance of the discovery in resolving the issues, and whether the burden or expense of the proposed discovery outweighs its likely benefit.” The explanatory notes accompanying the proposed rules make it plain that the intent of the rules is to encourage active judicial oversight of the discovery process to ensure that discovery is not excessive, redundant or more expensive than necessary.
Second, the proposed rules give judges more explicit authority to ask the party requesting documents and information to share in the costs of locating and producing such documents and data. Whereas the cost of locating, processing, reviewing and producing information—including electronically stored information—today is almost always borne by the producing party, the new rules portend greater judicial willingness to condition approval of expensive document and data requests on the requesting party’s willingness to pick up the tab, particularly when the importance of the requested data is not obvious or when the likelihood of finding useful information is low relative to the effort and cost of searching for it.
Third, the new rules place explicit limitations on the circumstances under which courts may mete out the most severe sanctions for failure to preserve electronically stored information. At present, some federal courts take the position that only intentional loss or destruction of documents may result in an “adverse inference instruction”—that is, the judge telling the jury that a party has failed to preserve data, and, as a result, the jury may (or even must) presume that the unavailable data or documents are unfavorable to that party. Those courts similarly require willfulness before imposing even more draconian sanctions such as dismissal. Other federal courts are of the view that even inadvertent failures to preserve data (typically electronic data) may give rise to such sanctions.
The new rules make it plain that judges cannot impose any sanctions without first determining that the loss of data or documents prejudiced the opposing party, and that any sanctions may be “no greater than necessary to cure the prejudice.” Moreover, the proposed new rules are explicit that the most severe sanctions, such as adverse inference instructions or dismissal, may be imposed “only upon finding that the party [that failed to preserve data] acted with the intent to deprive another party of the information’s use in the litigation.”
The new rules will help bring clarity and uniformity to the imposition of sanctions for spoliation and should limit the imposition of the most severe sanctions (as well as the expensive motion practice concerning such sanctions). Further, the new rules will likely reduce the cost of corporate document preservation, as companies will no longer have to over-preserve electronically stored data and information as protection against the possibility that merely inadvertent loss of such data and information could be grounds for the most serious sanctions (and billion-dollar punitive damages verdicts).
The new rules will not be a panacea, however. Whether spoliation occurred “with the intent to deprive another party of the information’s use in the litigation” still will be decided by district court judges. In the Louisiana pharmaceutical case, for example, the judge determined that the destruction of documents before the pharmaceutical company anticipated litigation over the particular dispute at issue constituted intentional spoliation, because a litigation hold involving different and previously resolved claims had never been withdrawn (imposing, in the judge’s view, an ongoing preservation obligation). Still, the new rules’ unmistakable directive should give federal judges pause before imposing the more severe types of spoliation sanctions.
The proposed new rules are now before the U.S. Supreme Court, which is likely to approve them. Absent (unlikely) action by Congress to reject or modify the rules, the new rules will go into effect on December 1, 2015. Although that is a year away, the clear statements of the Judicial Conference concerning the need to reduce litigation costs and to cabin the circumstances under which the most severe sanctions may be imposed for failures to preserve (primarily electronic) data and information can be expected to begin influencing the decisions of judges and lawyers much sooner.
Creighton Magid is head of Dorsey & Whitney’s Washington, D.C., office. He’s a member of the electronic discovery practice group and co-chair of the firm-wide products liability practice. He focuses on technology, commercial and products liability litigation.


Read more: http://www.corpcounsel.com/id=1202674945208/New-Discovery-Rules-to-Rein-in-Litigation-Expenses#ixzz3HdUe4RtQ



Wednesday, October 29, 2014

Jason Atchley : Data Security : Hackers Want Your Healthcare Data

jason atchley

Hackers Want Your Healthcare Data

Medical data is more valuable to hackers than credit cards, says Baker Hostetler.
Marlisse Silver Sweeney, Law Technology News

Social Security number or credit card information? According to partner Lynn Sessions and associate Nita Garg of Baker Hostetler, medical information is the most valuable to hackers.
“Hackers have increased their focused attacks on the U.S. healthcare industry,” they said, relying on information from the Ponemon Institute citing a 20 percent increase in healthcare organizations reporting cyberattacks between 2009 and 2013. According to security experts, this increase stems from weak institutional security coupled with the profitability of health records. “Unlike credit cards, which may be quickly canceled once fraudulent activity is detected, it often takes months or years before patients or providers discover the theft of medical information,” said Sessions and Garg.
We reported recently that the majority of health care breaches stem from the health care providers themselves. Between 2011 and 2012, protected health information was the leading target of breaches in the health care industry, whether through theft, loss, unauthorized access or hacking incidents. The authors noted in their post that many healthcare companies rely on old computer systems. This, along with the switch from paper medical records to electronic ones, adds electronic fuel to the online fire. On the black market, health information is 10 to 20 times more valuable than a credit card number, they said, and it includes names, birthdays, policy numbers, diagnosis codes and billing information.
Attorney Marlisse Silver Sweeney is a freelance writer based in Vancouver. Twitter: @MarlisseSS.


Read more: http://www.lawtechnologynews.com/id=1202674717300/Hackers-Want-Your-Healthcare-Data-#ixzz3HXpXbPl4