FTC told to disclose the data security standards it uses for breach enforcement (Computerworld, 2 May 2014) – The Federal Trade Commission (FTC) can be compelled to disclose details of the data security standards it uses to pursue enforcement action against companies that suffer data breaches, the agency’s chief administrative law judge, D. Michael Chappell, ruled Thursday. The decision came in response to a motion filed by LabMD, a now-defunct medical laboratory that the FTC has charged with unfair trade practices for exposing sensitive information belonging to 10,000 patients in 2010. LabMD has accused the FTC of holding it to data security standards that do not officially exist at the federal level, and has maintained that the agency must publicly disclose the data security standards it uses to determine whether a company has reasonable security measures in place. The judge held that while LabMD may not inquire about the FTC’s legal standards or rationale, it has every right to know what data security standards the commission uses when pursuing enforcement action. The FTC’s Bureau of Consumer Protection “shall provide deposition testimony as to what data security standards, if any, have been published by the FTC or the Bureau upon which [it] intends to rely on at trial,” Chappell ruled. [Polley: Steptoe writes: “LabMD is surely hoping that having the FTC acknowledge on the record that it does not actually have ‘data security standards’ will underscore – for the ALJ, for courts, for Congress, and the public – LabMD’s contention that the FTC is acting as a lawless bully.”]

Provided by MIRLN.


Apple releases guidelines for law enforcement data requests (CNET, 7 May 2014) – Apple has published a new set of guidelines regarding how law enforcement agencies and other government entities may request information from the company about user data. The new rules, which were posted to Apple’s website late Wednesday, reflect Apple’s move toward notifying its customers when it receives law enforcement requests for user data. “Apple will notify its customers when their personal information is being sought in response to legal process except where providing notice is prohibited by the legal process itself, by a court order Apple receives (e.g., an order under 18 U.S.C. §2705(b)), or by applicable law or where Apple, in its sole discretion, believes that providing notice could create a risk of injury or death to an identifiable individual or group of individuals or in situations where the case relates to child endangerment,” the guidelines state. Apple says it can extract active user-generated data from native apps on passcode-locked iOS devices, such as SMS messages, photos, videos, contacts, audio recordings, and call history. However, it cannot provide email, calendar entries, or any third-party app data. It can also perform data extraction only on devices running iOS 4 or later that are “in good working order,” and only at its Cupertino headquarters. Apple also said that upon receipt of a valid wiretap order, it can intercept users’ email communications but not their iMessage or FaceTime communications, because those communications are encrypted.


FCC decides that it will no longer enforce the Zapple doctrine – killing the last remnant of the Fairness Doctrine (Broadcast Law Blog, 8 May 2014) – The Zapple Doctrine was an outgrowth of the FCC’s Fairness Doctrine. It required that broadcast stations that give air time to the supporters of one candidate in an election give time to the supporters of competing candidates as well. Even though the Fairness Doctrine has been defunct for years – various manifestations of the Doctrine having been declared unconstitutional either by the courts or by the FCC – Zapple apparently lived on, or at least a death certificate had never been issued (see, for instance, our articles mentioning the continued life support of the Doctrine, here and here ). Thus, stations had to be concerned about giving air time to supporters of political candidates for fear of having to provide a similar amount of time to those supporting competing candidates. That uncertainty has now been resolved: in two just-released cases, the FCC’s Media Bureau has declared that Zapple, like the rest of the Fairness Doctrine, is dead. The cases just decided (available here and here ) both involved the recall election of Wisconsin Governor Scott Walker, where complaints were filed against the renewals of two radio stations, alleging that those stations did not provide equal opportunities to supporters of Walker’s recall opponent even though station hosts provided on-air support for Walker. The FCC rejected those complaints, declaring: “Given the fact that the Zapple Doctrine was based on an interpretation of the fairness doctrine, which has no current legal effect, we conclude that the Zapple Doctrine similarly has no current legal effect.”


Legal Loop: ABA on lawyers mining social media for evidence (NY Daily Record, 4 May 2014) – Social media has been around for more than a decade now, and its impact on our society is indisputable. But it is only in recent years that lawyers have begun to fully realize what a treasure trove of useful information can be obtained from social media throughout the litigation process. Of course, mining social media for evidence has both drawbacks and benefits. Lawyers who seek to use social media evidence in their cases must tread carefully and ensure that they fully comply with their ethical obligations when doing so. Fortunately, there is a good amount of guidance available, since a number of jurisdictions have addressed the ethics of mining social media for evidence. For the most part, the ethics boards have concluded that lawyers may not engage in deception when attempting to obtain information on social media, regardless of whether the party from whom information is sought is represented by counsel. See, for example: Oregon State Bar Ethics Committee Op. 2013-189 (lawyer may access an unrepresented individual’s publicly available social media information, but “friending” a known represented party is impermissible absent express permission from the party’s counsel); New York State Bar Opinion No. 843 [9/10/10] (attorney or agent can look at a party’s protected profile as long as no deception was used to gain access to it); New York City Bar Association Formal Opinion 2010-2 (attorney or agent can ethically “friend” an unrepresented party without disclosing true purpose, but even so it is better not to engage in “trickery” and instead be truthful or use formal discovery); Philadelphia Bar Association Opinion 2009-02 (attorney or agent cannot “friend” an unrepresented party absent disclosure that it relates to a pending lawsuit); San Diego County Bar Association Opinion 2011-2 (attorney or agent can never “friend” a represented party even if the reason for doing so is disclosed); and New York County Lawyers Association Formal Opinion No. 743 (attorney or agent can monitor jurors’ use of social media, but only if there are no passive notifications of the monitoring; the attorney must tell the court if s/he discovers improprieties and can’t use the discovery of improprieties to gain a tactical advantage). The American Bar Association’s Standing Committee on Ethics and Professional Responsibility weighed in just last month. In Formal Opinion 466, the committee considered “whether a lawyer who represents a client in a matter that will be tried to a jury may review the jurors’ or potential jurors’ presence on the Internet leading up to and during trial, and, if so, what ethical obligations the lawyer might have regarding information discovered during the review.”


Google halts student Gmail advertisement scans (BBC, 30 April 2014) – Google has stopped scanning millions of Gmail accounts linked to an educational scheme – a process it uses to target adverts. The decision covers email accounts associated with Google Apps for Education (GAE). This initiative provides teachers and students with access to free apps and storage, as well as customised @schoolname.edu email addresses. The move follows reports that the scans might have breached a US privacy law. Google highlighted its use of such scans when it updated its terms and conditions last month. “Our automated systems analyse your content (including emails) to provide you personally-relevant product features, such as customised search results, tailored advertising, and spam and malware detection. This analysis occurs as the content is sent, received, and when it is stored,” the terms read. However, the Education Week website said this data-mining activity might place the firm in breach of the US Family Educational Rights and Privacy Act. “We’ve permanently removed all ads scanning in Gmail for Apps for Education, which means Google cannot collect or use student data in Apps for Education services for advertising purposes,” wrote Google for Education director Bram Bout on a company blog. The change is also promised for users who signed up to Gmail as part of the service while at school or university, but have since moved on.


Phones are giving away your location, regardless of your privacy settings (Quartz, 28 April 2014) – Sensors in your phone that collect seemingly harmless data could leave you vulnerable to cyber attack, according to new research. And saying no to apps that ask for your location is not enough to prevent the tracking of your device. A new study has found evidence that accelerometers – which sense motion in your smartphone and are used for applications from pedometers to gaming – leave “unique, trackable fingerprints” that can be used to identify you and monitor your phone. Here’s how it works, according to University of Illinois electrical and computer engineering professor Romit Roy Choudhury and his team: tiny imperfections during the manufacturing process make a unique fingerprint on your accelerometer data. The researchers compared it to cutting out sugar cookies with a cookie cutter – they may look the same, but each one is slightly, imperceptibly different. When that data is sent to the cloud for processing, your phone’s particular signal can be used to identify you. In other words, the same data that helps you control Flappy Bird can be used to pinpoint your location. Choudhury’s team was able to identify individual phones with 96% accuracy. “Even if you erase the app in the phone, or even erase and reinstall all software,” Choudhury said in a press release, “the fingerprint still stays inherent. That’s a serious threat.” Moreover, Choudhury suggested that other sensors might be just as vulnerable: cameras, microphones, and gyroscopes could be leaving their smudgy prints all over the cloud as well, making it even easier for crooks to identify a phone.
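The idea behind the research can be illustrated with a minimal Python sketch. This is an illustrative toy, not the researchers’ actual pipeline: the feature choice (per-axis bias and noise level), the simulated devices, and the names here are invented for the example. The point it shows is that a chip’s stable, tiny offset survives across fresh readings, so a new trace can be matched to a previously seen device without any app identifier or location data being shared.

```python
import math

def fingerprint(samples):
    """Reduce raw one-axis accelerometer readings (in g) to a simple
    feature pair: mean offset (bias) and standard deviation (noise).
    Manufacturing imperfections shift both slightly for each chip."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / n
    return (mean, math.sqrt(var))

def match(candidate, known):
    """Return the label of the known fingerprint nearest (Euclidean
    distance in feature space) to the candidate fingerprint."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return min(known, key=lambda label: dist(candidate, known[label]))

# Two hypothetical phones reading the same stationary 1 g gravity vector:
# each chip adds its own small but stable bias and noise pattern.
phone_a = [1.0 + 0.012 + 0.001 * (i % 3 - 1) for i in range(100)]
phone_b = [1.0 - 0.008 + 0.004 * (i % 3 - 1) for i in range(100)]
known = {"phone_a": fingerprint(phone_a), "phone_b": fingerprint(phone_b)}

# A fresh trace from phone A carries the same bias, so it still matches
# phone A even though the individual samples differ.
fresh = [1.0 + 0.012 + 0.001 * ((i * 2) % 3 - 1) for i in range(100)]
print(match(fingerprint(fresh), known))  # prints "phone_a"
```

The real study used richer features and achieved 96% accuracy across many handsets; the sketch only shows why erasing apps or reinstalling software would not help – the bias lives in the hardware, not in any stored data.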
