Lawyers are a conservative group when it comes to adopting new technology. This continues to hold true for the ever-popular cloud technologies. Concerns about privacy and security related to data breaches are holding some firms back from transitioning to cloud storage and services. A 2015 Cloud Security Survey released by Netwrix reveals that lawyers' concerns around cloud adoption include security and privacy of data (26 percent), migration costs (22 percent) and loss of physical control (17 percent). The top perceived security risks are unauthorized access (32 percent), insider misuse (18 percent) and account hijacking (18 percent).

Alex Vovk, CEO and co-founder of Netwrix, told Legaltech News, “Legal departments will be reluctant to entrust their valuable data and customers’ sensitive information, until they are absolutely sure that cloud providers can offer better security than the company can ensure on-premises.” Although data security is a concern for all industries, legal departments are especially unlikely to adopt technologies that cannot guarantee full protection of their data.

Law firms may be cautious, but that doesn’t mean they are uninterested in cloud technologies. According to the survey, 44 percent of respondents indicated that their firms were in a stage of evaluation and discovery concerning cloud services. “This indicates that [law firms] are potentially ready to invest more in additional cloud security and consider various cloud options,” Vovk said. In fact, when it comes to hybrid cloud models, legal entities have the same interest in making the transition as private companies. In addition, 37 percent of those surveyed favor a private cloud model.

Vovk summed up by stating that “… as soon as cloud providers are ready to provide additional security measures and to some extent ease the compliance burden … lawyers would become less skeptic[al] about cloud adoption.”

Article via Legaltech News, 3 December 2015

Photo: Cloud Solutions via NEC Corporation of America [Creative Commons Attribution-NonCommercial-NoDerivs]

It is common knowledge these days that you exist all over the internet.  Each site you view, app you use, and company you deal with tracks you in different ways, building databases of information which help them develop more effective (and profitable) services.  While this data is often protected by privacy policies, these policies generally allow data to be shared with anyone, provided certain steps are taken to anonymize it.  However, as we mentioned in a previous post, Harvard researcher Latanya Sweeney has shown that data can never truly be anonymized: records can be re-identified by piecing them together with other publicly available information to “fill in the blanks”.
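To make the linkage attack concrete, here is a minimal sketch of how an “anonymized” dataset can be re-identified by joining it with public records on quasi-identifiers. All of the records, names, and field names below are hypothetical, invented purely for illustration.

```python
# Hypothetical "anonymized" medical records: names stripped, but
# quasi-identifiers (ZIP code, birth date, sex) left intact.
anonymized_records = [
    {"zip": "02138", "birth_date": "1945-07-21", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "02139", "birth_date": "1982-03-04", "sex": "M", "diagnosis": "asthma"},
]

# Hypothetical public dataset (e.g. a voter roll) containing names
# alongside the same quasi-identifiers.
public_voter_roll = [
    {"name": "J. Doe", "zip": "02138", "birth_date": "1945-07-21", "sex": "F"},
]

QUASI_IDENTIFIERS = ("zip", "birth_date", "sex")

def link(records, roll):
    """Match records whose quasi-identifiers agree exactly."""
    matches = []
    for rec in records:
        for person in roll:
            if all(rec[k] == person[k] for k in QUASI_IDENTIFIERS):
                matches.append({"name": person["name"], "diagnosis": rec["diagnosis"]})
    return matches

# A unique match re-attaches a name to the "anonymous" diagnosis.
print(link(anonymized_records, public_voter_roll))
```

The point of the sketch is that no field in the medical dataset is identifying on its own; it is the combination of ordinary attributes, joined against a second dataset, that unmasks the individual.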

This is unfortunate since mining big data can be incredibly useful, not just for maximizing profits, but for measuring larger social trends and analyzing regional health concerns.  So how can we analyze data without sacrificing privacy, when traditional anonymization does not cut it? One solution is differential privacy.  With differential privacy, whenever data is transferred between parties, it is randomly altered in ways which do not change how the database behaves statistically, but which place a mathematical limit on the probability of identifying any one entry.  That limit is the database’s privacy score (the parameter usually written as epsilon and often called the privacy budget).
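As a concrete sketch of the idea, the classic way to achieve this guarantee for counting queries is the Laplace mechanism: compute the true answer, then add random noise calibrated to how much any one person can change it and to the privacy parameter epsilon. The data and function names below are illustrative, not taken from any particular library.

```python
import math
import random

def laplace_noise(scale):
    """Draw one sample from a Laplace(0, scale) distribution (inverse-CDF method)."""
    u = random.random() - 0.5          # uniform on (-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def private_count(rows, predicate, epsilon):
    """Answer a counting query with the Laplace mechanism.

    A count has sensitivity 1 (adding or removing one person changes it
    by at most 1), so noise with scale 1/epsilon yields
    epsilon-differential privacy for this query.
    """
    true_count = sum(1 for row in rows if predicate(row))
    return true_count + laplace_noise(1.0 / epsilon)

# Smaller epsilon means more noise: stronger privacy, less accuracy.
ages = [34, 29, 61, 45, 52, 38, 70, 27]
print(private_count(ages, lambda a: a >= 40, epsilon=0.5))
```

Note the trade-off the paragraph describes: the noisy count is statistically close to the true count, but an observer cannot tell from the answer whether any single individual’s record is in the database.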

Currently, differential privacy faces a number of mathematical hurdles, including developing more efficient algorithms which require less computing time and ensuring that the random alterations to the database cannot be sniffed out.  Given the fractured state of American privacy law, even once these technical hurdles are surmounted, it will be difficult for differential privacy to become the norm.  Were it to succeed, this tool would be invaluable to wide-scale social research, with very promising implications in fields such as medicine, sociology, and economics.

One of the many fronts in technology’s war against personal privacy comes from the refinement of facial recognition software.  Like the eye scanners in the movie “Minority Report”, these new programs are able to scan and identify people’s faces on the fly, and can be easily integrated into common security systems or personal devices.  While this tech has tons of useful and beneficial applications (such as blurring out the faces of the people captured in Google Maps’ “street view”), the privacy implications are terrifying.

What is worse is the efficiency with which these new programs can identify you.  Some of the more sophisticated of them can identify a face in profile, from many angles, even when partially obscured.  A trench coat and sunglasses will no longer be enough to keep you out of the spotlight, and in response a new movement has sprung up trying to develop ways of foiling facial recognition technology.  One website attempts to meld camouflage and fashion by developing provocative hair and makeup styles to confuse computer (and human) onlookers.  The Japanese National Institute of Informatics disregards style entirely with its new anti-recognition goggles.  The goggles are covered in LEDs which blast infrared light to wash out computer images without blinding the people around them.  However, it is likely that further advances in the technology will be able to see through these disguises, leading to an arms race of sorts between the hiders and seekers.  As this technology becomes more prevalent, it is likely that we will need a legal solution to this surveillance problem.

As Americans are becoming more privacy conscious about what they voluntarily make available on the internet, a new and exciting product from Google may pose a significant risk in the form of traditional snooping. Google Glass is essentially a futuristic pair of glasses which provides a heads-up display, allowing the wearer to view a wealth of information hands-free. Privacy concerns arise from the integrated photo/video camera, which can record both video and audio at any time, without giving any sort of external indication that it is doing so.

Google has stated that it is conscious of the privacy concerns and is attempting to build in ways to prevent unauthorized snooping. However, just as cell phones can be jailbroken, tech enthusiasts will likely be able to modify their devices to circumvent any protections Google builds into the device. Essentially, this means that anyone could be under surveillance by private individuals at any time, and be totally unaware of it.

While such snooping would in many circumstances still be illegal, were Google Glass to become common, it may become difficult or impossible to properly police this surveillance, and it would be easy fodder for abuse. Clearly, there needs to be a greater dialogue on the issue before the technology is widely distributed.

The proliferation of GPS devices represents a prime example of technology outpacing the law, with profound effects on individual privacy. As of yet there is no unified law dictating when GPS tracking is acceptable.  Although there have been some cases on the issue, it is far from clear when businesses are allowed to track employees, when the government can track suspects (or individuals in general), when cellphone companies can track their users, or even how that data should be handled once collected.

The proposed Geolocation Privacy and Surveillance Act (H.R. 1312/S. 639, or simply the GPS Act) is an attempt by lawmakers to give “government agencies, commercial entities, and private citizens clear guidelines for when and how geolocation information can be accessed and used”.  Information on the bill and other proposed legislation can be found at

The National Institute of Standards and Technology (NIST) has recently released new security guidelines for protecting digitally stored information from intrusions.  NIST security guidelines represent a collection of best company practices, and in the past have served as industry standards for digital information security.  When so much of the onus of keeping individuals’ personal data private and secure falls on the companies themselves, these guidelines become an incredibly important gauge of the trustworthiness of the companies holding your data.  Avoid dealing with businesses that do not conform to NIST’s recommendations.