Google’s algorithm can manipulate elections

According to a study by Robert Epstein and Ronald E. Robertson, changes to Google's search algorithm can shift the voting preferences of undecided voters by 20 percent or more. Published in the Proceedings of the National Academy of Sciences (PNAS), the study examined the Search Engine Manipulation Effect (SEME) in two countries with over 4,500 participants.

The investigators conducted an experiment in which participants were randomly assigned to one of three groups whose search rankings favored Candidate A, Candidate B, or neither. Before spending 15 minutes researching on a mock search engine called Kadoodle, participants were given a short description of both candidates and asked whom they would vote for. The 30 search results were identical for everyone but ordered differently depending on the group. The biased rankings increased the number of participants favoring a given candidate by between 37 and 60 percent.

Google adjusts its search algorithm about 600 times a year. In response to the SEME findings, Google commented: “Providing relevant answers has been the cornerstone of Google’s approach to search from the very beginning. It would undermine the people’s trust in our results and company if we were to change course.”


Article via Politico Magazine, August 19, 2015; provided by MIRLN

Photo: Campaigning with a Smile via Jack [Creative Commons Attribution-NonCommercial-NoDerivs]


Data encryption thwarts criminal investigations

According to Manhattan’s District Attorney, smartphone data encryption hinders criminal investigations in state courts. Cyrus R. Vance, Jr. testified before the Senate Judiciary Committee on July 8, 2015, advocating legislation that would allow law enforcement officials to access private phone data with judicial authorization.

Vance notes that 71% of the phone evidence in his office comes from Apple or Android devices. As a result, Apple’s and Google’s moves to fully integrate data encryption into their next devices will significantly affect prosecutions in state courts.

State courts adjudicate over 90% of all criminal cases annually, which means over 100,000 cases for Vance’s office alone.

“To investigate these 100,000 cases without smartphone data is to fight crime with one hand tied behind our backs,” he asserts.

Vance does not support bulk data collection or surveillance without authorization. Civil liberties and privacy advocates remain wary, however, and generally endorse data encryption. That stance partly aligns with statements from Deputy Attorney General Sally Yates and FBI Director James Comey. They say the Obama administration has no current plans to mandate that companies provide federal agents with encryption keys for their products, but they also maintain that companies should not make their devices “warrant-free zones” that impede law enforcement’s authorized access to criminal evidence.

Article via Legaltech News, August 10, 2015

Photo: iPhone via Jorge Quinteros [Creative Commons Attribution-NonCommercial-NoDerivs]


Non-lethal drones can be used in North Dakota by police

Legislation passed last spring allows police in North Dakota to use drones not only for surveillance but also as non-lethal weapons. The bill, originally introduced by Representative Rick Becker, initially barred weapons of any kind, but lobbyists pushed for an amendment permitting non-lethal weapons in order to win the support of law enforcement. The law does restrict the scenarios in which police may deploy drones, however. For example, a drone may be used for surveillance only if the data will be used to investigate a felony, and law enforcement must obtain a warrant that specifies in detail how, when, and where the drone will be used. There are also limits on how personal information uncovered by a drone may be handled. Even with these restrictions in the legislation, some say the drones give law enforcement too much power.

Jay Stanley of the American Civil Liberties Union argues that even non-lethal weapons can have lethal results: Tasers, though classified as non-lethal, still lead to approximately fifty deaths a year. Operating a drone may also create a sense of detachment between the operator and the suspect on the other side, which could lead to regrettable choices. Jim McGregor disagrees, explaining that officers in the field may have a harder time making the right call than a drone operator working offsite with fewer distractions. He likens drones to SWAT teams and snipers to show that they are not so different from well-established options for handling dangerous situations.

Whether non-lethal drones prove a positive or negative development in law enforcement technology, Representative Becker intends to propose a new law that would ban both non-lethal and lethal weaponry on drones.

Article via TechNewsWorld, August 28, 2015

Photo: Drone via ninfaj [Creative Commons Attribution-NonCommercial-NoDerivs]


Artificial intelligence raises new legal and civil rights questions

Artificial intelligence features heavily in science fiction, from 2001: A Space Odyssey to Star Wars, but AI systems may soon become commonplace in day-to-day activities. Ten years from now, one third of standard jobs are expected to be performed by robots. While this raises many concerns, perhaps the most unexpected issue is which legal and civil rights will be granted to artificially intelligent beings.

Recently, an artificially intelligent system in Switzerland that was programmed to shop online purchased several illegal items and was subsequently arrested. However, neither the system nor its creators were prosecuted. Cases like this raise the question of who should be held accountable for crimes committed by AI systems. Can artificially intelligent beings be held responsible for their actions? If a system acts independently and is charged with a crime much as a human being would be, should it be considered to have its own individual identity? And if AI systems have their own identities, should they also be granted the same rights as human beings? After all, a robot has been created that could pass a test signifying that it is self-aware, a test that previously only humans could pass.

While science fiction appears to have spent years preparing the world for the introduction of artificial intelligence into everyday activities, the ethical and legal problems raised by the new technology are still surprising law enforcers and creators alike.

Article via TechCrunch, August 22, 2015

Photo: Electric Neuron via Ronny R [Creative Commons Attribution-NonCommercial-NoDerivs]


Antitrust complaint made against Google over banned privacy app

A recent complaint against Google claims that the company violated antitrust regulations when it banned a privacy app. The app, Disconnect, blocks malicious advertising and covert user tracking. The app’s creator filed a complaint with the EU on Tuesday. According to Disconnect, Google’s decision to remove the app from the Google Play store was an abuse of its dominant position in mobile technology. Google says the app violates its policies and that the complaint is groundless.

“We don’t oppose advertising and understand ad revenue is critically important to many Internet companies, publishers and developers,” Disconnect Co-founder and CEO Casey Oppenheim said. “But users have the right to protect themselves from invisible tracking and malware, both of which put sensitive personal information at risk. Advertising doesn’t have to violate user privacy and security.”

The complaint is just one part of the increased scrutiny Google faces as it seeks to expand the reach of its Android mobile operating system. The ongoing European Commission investigation into Android highlights regulators’ concerns about how Google exercises its power.

Disconnect argues that Google is using Android in monopolistic ways, bundling inadequate security and privacy features into its dominant products, thereby harming consumers and giving itself an unfair advantage.

Google, on the other hand, maintains that Disconnect violated clause 4.4 of its Developer Distribution Agreement, which forbids apps from interfering with other apps.

Article via CNET, June 2, 2015

Photo: Big Google brother? via Alain Bachellier [Creative Commons Attribution-NonCommercial-NoDerivs]


Lawyers Ethically Required To Understand E-Discovery?

Botched e-discovery can be an ethics violation, proposed opinion says (ABA Journal, 14 April 2014) – A proposed ethics opinion says California’s duty of competence requires lawyers to have a basic understanding of e-discovery issues and could require greater technical knowledge in certain cases. The proposed opinion (PDF) by the California State Bar’s Standing Committee on Professional Responsibility and Conduct says lawyers without the necessary competence have three options. They can acquire sufficient skill, they can seek out technical consultants or competent counsel, or they can decline the representation. The committee is accepting comments on the proposed opinion through June 24.

The proposed ethics opinion is based on a hypothetical situation in which a lawyer agrees to opposing counsel’s search terms for a search of his client’s database. The lawyer instructs his client to allow the opposing counsel’s database search, wrongly assuming a clawback agreement would allow for recovery of anything inadvertently produced. After the search results are turned over to the opposing counsel without the lawyer’s review, the lawyer learns the search produced privileged information and showed that his client had deleted some potentially relevant documents as part of a regular document retention policy.

The lawyer in the hypothetical not only breached his duty of competence, he also breached a duty to maintain client confidences and to protect privileged information, the proposed opinion says. In addition, the proposed opinion says, the lawyer should have assisted the client in placing a litigation hold on potentially relevant documents as part of the ethical duty not to suppress evidence.

Provided by MIRLN.

Image courtesy of FreeDigitalPhotos.net/digitalart.