Report On Big Data: Guide To Transforming The Government
Michael Harper for redOrbit.com — Your Universe Online
TechAmerica Foundation's Federal Big Data Commission released its report on Big Data yesterday, suggesting what value Big Data will have, how it should be supported, where it should take us, and attempting once and for all to draw a more specific definition of just what Big Data is.
In “Demystifying Big Data: A Practical Guide To Transforming The Business of Government,” TechAmerica defines Big Data as “a phenomenon defined by the rapid acceleration in the expanding volume of high velocity, complex, and diverse types of data.” Additionally, TechAmerica claims Big Data can be defined in terms of volume, velocity and variety. Practically speaking, Big Data for the government could include the mass of medical data collected from Medicare patients or the huge files of video footage gathered by the military as it conducts surveillance operations.
One prime point the commission makes in its report is that the government already has most of the Big Data supply in house. The next step, according to the report, is training workers to handle, analyze and store all this data.
Big Data has indeed started to skyrocket in terms of volume and availability. One reason for this, according to commission co-chairman and SAP's global executive vice president Steve Lucas, “is the dramatic reduction in the cost of computing and of storage.”
Steve A. Mills, another co-chairman and senior vice president at IBM, suggests that the government now needs to make an effort to keep its workforce growing along with the amount of data it is collecting and storing. Without analysis, this data simply takes up space and becomes useless.
To kick-start this analysis, the report eyes the slew of graduating college seniors looking for internship opportunities and ways to break into the field. TechAmerica suggests the government create internship programs focused on analyzing big chunks of this data, as well as a “leadership academy.” In addition, the report recommends the Office of Science and Technology Policy (OSTP) put together a nationwide research and development strategy for Big Data. Under this plan, each agency would follow the example of the FCC and name a Chief Data Officer.
“We also recommend a broader coalition between Big Data Commission members, academic centers, and professional societies to articulate and maintain professional and competency standards for the field of Big Data,” reads the report.
“Such standards will guide colleges and universities across the country to develop relevant programs, thus increasing the pool of qualified students for Big Data related roles.”
The good news for the government, according to Mills, is that Big Data could be much cheaper to set up, analyze and maintain.
““¦government doesn´t have to plow money into pure research in order to achieve results, because the commercial, off-the-shelf technology has become so robust,” he said.
The government and private sectors can work together to ensure a strong and resilient approach to Big Data, suggests the report, which confidently states that we are now in a position to tackle this challenge head on.
“Ten years from now, we may have forgotten the term, but its principles will underpin society.”