Government Pre-Crime, Data Mining Operations Raising Privacy Concerns

October 10, 2011

The Department of Homeland Security (DHS) is allowing members of the public to voluntarily participate in a new program created to try to predict when an individual will commit a crime, or possibly engage in terrorist activities, Declan McCullagh of CNET reported Friday.

McCullagh said that an internal document originating from Homeland Security, obtained by the Electronic Privacy Information Center (EPIC) under the Freedom of Information Act, reveals “efforts to ‘collect, process, or retain information on’ members of ‘the public’” through a “pre-crime” system known as Future Attribute Screening Technology (FAST).

The document was dated June 2010, McCullagh said, and also contained comments from FAST program manager Robert Middleton Jr. referring to a “limited” trial that would use Homeland Security employees as subjects. He quotes Middleton as stating that FAST “sensors will non-intrusively collect video images, audio recordings, and psychophysiological measurements from the employees.”

In a statement, DHS Deputy Press Secretary Peter Boogaard told CNET, “The department’s Science and Technology Directorate has conducted preliminary research in operational settings to determine the feasibility of using non-invasive physiological and behavioral sensor technology and observational techniques to detect signs of stress, which are often associated with intent to do harm.”

“The FAST program is only in the preliminary stages of research and there are no plans for acquiring or deploying this type of technology at this time,” Boogaard added.

On May 27, 2011, Sharon Weinberger of Nature.com wrote that the program, which has drawn comparisons to the pre-crime program featured in the motion picture “Minority Report,” had already completed a first round of field tests “at an undisclosed location in the northeast.”

“Like a lie detector, FAST measures a variety of physiological indicators, ranging from heart rate to the steadiness of a person’s gaze, to judge a subject’s state of mind,” Weinberger wrote. “But… FAST relies on non-contact sensors, so it can measure indicators as someone walks through a corridor at an airport, and it does not depend on active questioning of the subject.”

McCullagh reports that FAST was designed to “track and monitor… body movements, voice pitch changes, prosody changes (alterations in the rhythm and intonation of speech), eye movements, body heat changes, and breathing patterns. Occupation and age are also considered. A government source told CNET that blink rate and pupil variation are measured too.”
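For a concrete sense of what a screening pipeline built on these indicators might look like, consider the minimal sketch below. It is purely illustrative: the field names, weights, and flagging threshold are assumptions made for this example, and DHS has not published the actual FAST scoring method.

    # Illustrative only: a toy screening record built from the indicator names
    # reported above. Weights and threshold are invented for this sketch and do
    # not reflect the (unpublished) FAST algorithm.
    from dataclasses import dataclass

    @dataclass
    class ScreeningRecord:
        heart_rate_bpm: float      # from non-contact physiological sensors
        gaze_steadiness: float     # 0.0 (erratic) to 1.0 (steady)
        voice_pitch_change: float  # relative change in vocal pitch
        breathing_rate: float      # breaths per minute
        blink_rate: float          # blinks per minute
        body_heat_delta: float     # change in skin temperature, degrees C

    def stress_score(r: ScreeningRecord) -> float:
        """Combine the indicators into one score; the weights are arbitrary."""
        return (
            0.3 * max(0.0, (r.heart_rate_bpm - 80) / 40)
            + 0.2 * (1.0 - r.gaze_steadiness)
            + 0.15 * abs(r.voice_pitch_change)
            + 0.15 * max(0.0, (r.breathing_rate - 16) / 10)
            + 0.1 * max(0.0, (r.blink_rate - 20) / 20)
            + 0.1 * max(0.0, r.body_heat_delta)
        )

    if __name__ == "__main__":
        sample = ScreeningRecord(95, 0.6, 0.1, 20, 28, 0.4)
        flagged = stress_score(sample) > 0.5  # arbitrary flagging threshold
        print(f"score={stress_score(sample):.2f}, flagged={flagged}")

Even in this toy form, the privacy question the article raises is visible: each record bundles physiological measurements tied to one person walking through a checkpoint.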

On October 7, a Homeland Security official told CNET.com, “The FAST program is entirely voluntary and does not store any personally-identifiable information (PII) from participants once the experiment is completed. The system is not designed to capture or store PII. Any information that is gathered is stored under an anonymous identifier and is only available to DHS as aggregated performance data. It is only used for laboratory protocol as we are doing research and development. It is gathered when people sign up as volunteers, not by the FAST system. If it were ever to be deployed, there would be no PII captured from people going through the system.”
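The data-handling approach the official describes, measurements keyed to an anonymous identifier with only aggregated performance data retained, can be sketched roughly as follows. This is an assumption-laden illustration, not DHS code: the identifier scheme and the statistics reported are invented for the example.

    # Illustrative sketch of the described data handling: measurements are stored
    # under a random anonymous identifier and only aggregate statistics are ever
    # reported. This is an assumed design, not the actual FAST implementation.
    import statistics
    import uuid

    class AnonymousStore:
        def __init__(self):
            self._records = {}  # anonymous id -> list of measurements

        def new_participant(self) -> str:
            """Issue a random identifier with no link to personal information."""
            anon_id = uuid.uuid4().hex
            self._records[anon_id] = []
            return anon_id

        def record(self, anon_id: str, measurement: float) -> None:
            self._records[anon_id].append(measurement)

        def aggregate_report(self) -> dict:
            """Expose only aggregated performance data, never per-person records."""
            values = [v for vals in self._records.values() for v in vals]
            return {
                "participants": len(self._records),
                "mean": statistics.mean(values),
                "stdev": statistics.pstdev(values),
            }

    if __name__ == "__main__":
        store = AnonymousStore()
        for readings in ([0.31, 0.42], [0.55], [0.29, 0.33, 0.61]):
            pid = store.new_participant()
            for value in readings:
                store.record(pid, value)
        print(store.aggregate_report())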

In related news, late last week, the Government Accountability Office (GAO) released a report on the Department of Homeland Security’s use of data mining, which is identified as a technique used for “extracting useful information from large volumes of data… to help detect and prevent terrorist threats.”

In its analysis, the GAO said, “While data-mining systems offer a number of promising benefits, their use also raises privacy concerns,” recommending that executives at Homeland Security “address gaps in agency evaluation policies and that component agency officials address shortfalls in their system evaluations.”

According to Grant Gross of IDG News, the report found that DHS policies did not require the agency to evaluate the effectiveness of its data-mining programs, and that two of the six programs reviewed by the GAO had not completed privacy impact assessments.

Furthermore, the GAO found that DHS violated its own privacy rules “by sharing information from the Immigration and Customs Enforcement Pattern Analysis and Information Collection (ICEPIC) program with state and local law enforcement agencies,” and that ICEPIC had “rolled out its law enforcement sharing component before it was approved by the DHS privacy office.”

In a press release quoted by Gross in an October 7 report, North Carolina Representative Brad Miller said that federal data mining programs “should have tough-minded oversight if we’re going to keep Americans safe from terrorism, avoid wasting tax dollars on one boondoggle technology after another, and protect the privacy of innocent Americans… The intelligence community has to stop using the legitimate need for some secrecy in counter-terrorism to hide from oversight.”

Likewise, Miller’s colleague, Donna Edwards of Maryland, said that it was “alarming” that Homeland Security officials “needed GAO to point out that the agency’s data mining program has been violating its own privacy protocols for more than three years by sharing sensitive personal information with local, state, and federal officials.”

Source: RedOrbit Staff & Wire Reports


