Algorithm Helps Make Cloud Computing More Efficient
Lee Rannals for redOrbit.com – Your Universe Online
A new software system could help reduce cloud computing hardware requirements by 95 percent while also improving performance.
MIT researchers are developing a new system called DBSeer that uses machine-learning techniques to build accurate models of performance and resource demands of database-driven applications.
Barzan Mozafari, lead author on a paper presented at the recent Biennial Conference on Innovative Data Systems Research, said that with virtual machines, server resources must be allocated according to an application’s peak demand. He also noted that increased demand means a database server will store more of its frequently used data in high-speed memory, so even a slight rise in demand can slow the system down as requests pile up.
DBSeer monitors fluctuations both in the number and type of user requests and in system performance, and uses machine-learning techniques to correlate the two. This approach can predict the consequences of fluctuations that do not fall too far outside the range of the training data.
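As a rough illustration of that idea (not DBSeer's actual model), workload features such as request counts by type can be regressed against a performance metric; the fitted model then predicts performance for workload mixes near the training range. All numbers below are invented for the sketch.

```python
import numpy as np

# Hypothetical training data: each row counts requests of three types
# (reads, writes, joins) observed in a one-minute monitoring window.
workloads = np.array([
    [100.0, 20.0, 5.0],
    [150.0, 30.0, 8.0],
    [200.0, 45.0, 10.0],
    [120.0, 25.0, 6.0],
    [180.0, 40.0, 9.0],
])

# Observed average latency (ms) for each window -- made-up values.
latency = np.array([12.0, 18.1, 24.9, 14.4, 22.3])

# Fit a least-squares linear model: latency ~ w . workload + b
X = np.hstack([workloads, np.ones((len(workloads), 1))])  # add intercept
coef, *_ = np.linalg.lstsq(X, latency, rcond=None)

def predict_latency(reads, writes, joins):
    """Predict latency for a workload mix near the training range."""
    return float(np.array([reads, writes, joins, 1.0]) @ coef)

# Interpolating within the observed range gives a sensible estimate;
# extrapolating far beyond it is where a purely learned model breaks down.
print(round(predict_latency(160, 35, 8), 1))
```

The last comment is the key limitation the article goes on to address: a purely statistical fit is only trustworthy near the conditions it was trained on.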
Sometimes, however, database managers are interested in the consequences of much larger increases in demand. For those cases, the software uses what is referred to as a “gray box” model, which takes into account the idiosyncrasies of particular database systems, making fourfold or even tenfold increases in demand easier to predict.
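The paper's gray-box model itself isn't reproduced here, but the general idea, an analytical model of the system corrected by a data-fitted term, can be sketched as follows. The M/M/1 queueing formula, the service rate, and the correction factor are all illustrative assumptions, not DBSeer's actual equations.

```python
# Gray-box sketch: a white-box analytical core plus a black-box
# correction fitted to measurements of one particular database.

SERVICE_RATE = 500.0  # requests/sec the server can process (assumed)

def analytical_latency(arrival_rate):
    """White-box part: M/M/1 mean response time, in seconds."""
    if arrival_rate >= SERVICE_RATE:
        raise ValueError("system is saturated")
    return 1.0 / (SERVICE_RATE - arrival_rate)

# Black-box part: a correction fitted from monitoring data of this
# specific system (here a fixed multiplier standing in for a model
# learned from measurements).
correction = 1.3

def gray_box_latency(arrival_rate):
    return correction * analytical_latency(arrival_rate)

# Because the analytical core encodes how latency blows up near
# saturation, the model extrapolates to 4x or 10x demand more
# credibly than a purely data-driven fit would.
print(round(gray_box_latency(100) * 1000, 2), "ms")  # moderate load
print(round(gray_box_latency(450) * 1000, 2), "ms")  # near saturation
```

The design point is the division of labor: the analytical part supplies the shape of the system's behavior under loads never seen in training, while the learned part absorbs the idiosyncrasies of the specific database.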
“We’re really fascinated and thrilled that someone is doing this work,” says Doug Brown, a database software architect at Teradata, in a statement from MIT. “We’ve already taken the code and are prototyping right now.”
He says Teradata will use the team’s prediction algorithm to determine customers’ resource requirements.
“The really big question for our customers is, ‘How are we going to scale?’” Brown says.
He hopes the algorithm will help allocate server resources on the fly. If servers can assess the demands imposed by individual requests and budget correctly, then they will be able to ensure transaction times stay within the bounds set by customers’ service agreements.
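In its simplest form, that kind of on-the-fly budgeting amounts to admission control: accept a request only if the predicted transaction time still fits the service agreement. The SLA threshold and the linear cost model below are hypothetical stand-ins for a real prediction algorithm.

```python
# Minimal admission-control sketch. Numbers are hypothetical.

SLA_LIMIT_MS = 50.0
MS_PER_ACTIVE_REQUEST = 2.0  # assumed marginal cost per in-flight request

active_requests = 0

def predicted_time_ms(extra=1):
    """Predicted transaction time if `extra` more requests are admitted."""
    return (active_requests + extra) * MS_PER_ACTIVE_REQUEST

def try_admit():
    """Admit the request only if the SLA would still be met."""
    global active_requests
    if predicted_time_ms() <= SLA_LIMIT_MS:
        active_requests += 1
        return True
    return False

# Offer 40 requests; only as many are admitted as the SLA allows.
admitted = sum(try_admit() for _ in range(40))
print(admitted)  # 25: a 26th request would push predicted time past 50 ms
```

A production system would replace the fixed per-request cost with a prediction model like DBSeer's, but the budgeting logic follows the same pattern.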
According to a report released by IDC last year, cloud services will see as much as 41 percent growth over the next four years. The report said worldwide spending on IT cloud services will reach $100 billion by 2016, a compound annual growth rate of 26.4 percent from 2012 to 2016.
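As a quick sanity check on those figures, a 26.4 percent compound annual rate sustained over the four years from 2012 to 2016 implies a 2012 base of roughly $39 billion for a $100 billion 2016 total:

```python
# Work back from the 2016 target at the stated compound annual growth rate.
target_2016 = 100.0  # billions of dollars
cagr = 0.264

base_2012 = target_2016 / (1 + cagr) ** 4
print(round(base_2012, 1))  # 39.2, i.e. roughly $39 billion in 2012
```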
“The IT industry is in the midst of an important transformative period as companies invest in the technologies that will drive growth and innovation over the next two to three decades,” said Frank Gens, senior vice president and chief analyst at IDC, in a statement on the report. “By the end of the decade, IDC expects at least 80 percent of the industry’s growth, and enterprises’ highest-value leverage of IT, will be driven by cloud services and other 3rd Platform technologies.”
With the growth of the cloud computing industry, it is important for scientists like the MIT team to continue their research into making the service work more efficiently, both for the consumer and the service provider.