February 28, 2014
Operation Optic Nerve Strikes A Nerve With Yahoo User Privacy
Lee Rannals for redOrbit.com - Your Universe Online
Documents published by the Guardian show how GCHQ, with assistance from the US National Security Agency, was able to intercept and store webcam images from millions of Internet users not suspected of any wrongdoing.
Agency files dating from 2008 to 2010 state that a surveillance program known as Optic Nerve collected still images of Yahoo webcam chats in bulk and saved them to a database. During one six-month period, the agencies collected webcam images from more than 1.8 million Yahoo user accounts around the world, according to Guardian reporters Spencer Ackerman and James Ball.
“Yahoo reacted furiously to the webcam interception when approached by the Guardian,” the British newspaper wrote. “The company denied any prior knowledge of the program, accusing the agencies of ‘a whole new level of violation of our users' privacy’.”
Currently, there are no restrictions under UK law to prevent Americans’ images from being accessed by British analysts without a warrant. The report said the documents show how the agencies had a hard time keeping sexually explicit content away from their staff.
Optic Nerve was designed as an experiment in automated facial recognition, both to monitor existing GCHQ targets and to discover new targets of interest. The program could potentially be used to find terror suspects or criminals, but it also swept up images of people who were not suspected of any wrongdoing.
"Face detection has the potential to aid selection of useful images for 'mugshots' or even for face recognition by assessing the angle of the face," one document reads, according to the Guardian. "The best images are ones where the person is facing the camera with their face upright."
Another document says analysts were allowed to display images associated with Yahoo identifiers similar to their target's, meaning the program pulled in a large number of innocent users.
“Rather than collecting webcam chats in their entirety, the program saved one image every five minutes from the users' feeds, partly to comply with human rights legislation, and also to avoid overloading GCHQ's servers,” the Guardian wrote. “The documents describe these users as ‘unselected’ – intelligence agency parlance for bulk rather than targeted collection.”
According to the report, GCHQ made no specific attempt to prevent the collection or storage of explicit images that could have been associated with innocent people. However, the agency did try to keep some of the lewd images from being seen by analysts by excluding images in which no face was detected.
One Optic Nerve document warns staff that they may encounter undesirable images while handling and displaying the material.
“There is no perfect ability to censor material which may be offensive. Users who may feel uncomfortable about such material are advised not to open them. You are reminded that under GCHQ’s offensive material policy, the dissemination of offensive material is a disciplinary offense,” the document reads.
A spokeswoman for Yahoo strongly condemned the agencies’ actions, adding that it was not aware of the reported activity.
"This report, if true, represents a whole new level of violation of our users' privacy that is completely unacceptable, and we strongly call on the world's governments to reform surveillance law consistent with the principles we outlined in December,” the spokeswoman told the Guardian. "We are committed to preserving our users' trust and security and continue our efforts to expand encryption across all of our services."
A GCHQ spokesman said in a statement that the agency has a longstanding policy of not commenting on intelligence matters.
"Furthermore, all of GCHQ's work is carried out in accordance with a strict legal and policy framework which ensures that our activities are authorized, necessary and proportionate, and that there is rigorous oversight, including from the secretary of state, the interception and intelligence services commissioners and the Parliamentary Intelligence and Security Committee,” the spokesman told the Guardian.