# New Computing Tool

July 14, 2010

In this simulated mixing of two fluids, blue and red spheres represent critical points, and the lines between them represent the branching of pockets of fluid. This kind of topology-based analysis is applicable to scalar fields in many scientific disciplines.
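The critical points mentioned above are the local minima, maxima, and saddles of a scalar field. As a rough illustration only (not the authors' code), the simplest of these can be found on a regular grid by comparing each point with its neighbors; the toy field below is a hypothetical difference of two Gaussians, chosen to produce one "pocket" and one peak:

```python
import numpy as np

def local_extrema(field):
    """Classify interior grid points of a 2-D scalar field as local
    minima or maxima by comparing each point with its 8 neighbors.
    (Saddle detection and boundary handling are omitted for brevity.)"""
    minima, maxima = [], []
    rows, cols = field.shape
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            patch = field[i - 1:i + 2, j - 1:j + 2]
            center = field[i, j]
            neighbors = np.delete(patch.ravel(), 4)  # drop the center value
            if center < neighbors.min():
                minima.append((i, j))
            elif center > neighbors.max():
                maxima.append((i, j))
    return minima, maxima

# Toy scalar field: a difference of two Gaussians with one minimum
# (a "pocket") at x = -1 and one maximum at x = +1.
x, y = np.meshgrid(np.linspace(-2, 2, 21), np.linspace(-2, 2, 21))
field = np.exp(-((x - 1)**2 + y**2)) - np.exp(-((x + 1)**2 + y**2))
mins, maxs = local_extrema(field)
```

The Morse-Smale complex goes much further than this sketch, connecting such critical points along gradient lines to partition the whole domain, but the extrema themselves are the starting point.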

## More about this Image

This identification and tracking of pockets of fluid in a simulated mixing of two fluids was performed using a powerful computing tool developed by scientists at the University of California, Davis (UC-Davis), and the Lawrence Livermore National Laboratory. The tool, a set of problem-solving calculations called an algorithm, allows scientists to extract features and patterns from enormously large and complex sets of raw data. It is also compact enough to run on a computer with as little as two gigabytes of memory.

Researchers have been using computers to simulate real-world phenomena for some time; however, as these data sets have grown, they have outstripped the capacity of the computers used to analyze them, making analysis of the data increasingly difficult.

The team--led by Attila Gyulassy of UC-Davis--used the algorithm to divide data sets into parcels of cells, with each parcel analyzed separately using the Morse-Smale complex, a mathematical tool that partitions sets by similarity of features and encodes them into mathematical terms. Results of these computations were then merged together, and as new parcels were created from the merged parcels, they were analyzed and merged yet again. To reduce the computing power required to run these complicated calculations, data that does not need to be stored in the computer's memory is discarded.
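The divide-analyze-merge pattern described above can be sketched in a few lines. This is not the published algorithm: the minimal Python below substitutes a trivial min/max/count summary for the actual Morse-Smale computation, purely to show how analyzing parcels independently and merging compact summaries lets the raw data be discarded early:

```python
import numpy as np

def summarize(parcel):
    """Per-parcel analysis stand-in: keep only extreme values and the
    sample count, so the raw samples can be discarded afterwards."""
    return {"min": parcel.min(), "max": parcel.max(), "count": parcel.size}

def merge(a, b):
    """Combine two parcel summaries into one larger summary."""
    return {"min": min(a["min"], b["min"]),
            "max": max(a["max"], b["max"]),
            "count": a["count"] + b["count"]}

rng = np.random.default_rng(0)
data = rng.normal(size=1_000_000)  # stand-in for the simulation grid

# Analyze fixed-size parcels independently, keeping only summaries...
summaries = [summarize(chunk) for chunk in np.array_split(data, 100)]

# ...then merge the summaries pairwise until one result remains.
while len(summaries) > 1:
    summaries = [merge(summaries[i], summaries[i + 1])
                 if i + 1 < len(summaries) else summaries[i]
                 for i in range(0, len(summaries), 2)]

result = summaries[0]
```

The key property is that memory use is bounded by one parcel plus the summaries, never the full data set, which is what makes a billion-point analysis feasible on a small machine.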

Gyulassy used the algorithm in a test to analyze and track the formation and movement of pockets of fluid in a simulated mixing of two fluids: one dense and one light. The data set was very complex, consisting of more than one billion data points on a three-dimensional grid--so complex, in fact, that Gyulassy says it would challenge a supercomputer. But the streamlined features of the new algorithm allowed it to perform the analysis on a laptop computer with only two gigabytes of memory. It took 24 hours for the laptop to complete the calculations, but in the end, Gyulassy could pull up images in mere seconds to illustrate phenomena he was interested in, such as the branching of fluid pockets in the mixture. Gyulassy is currently developing software that will allow others to use the algorithm as well.

A paper describing the new algorithm was published in the November-December issue of IEEE Transactions on Visualization and Computer Graphics. The other authors of the paper are Valerio Pascucci of UC-Davis (at the time also a computer scientist and project leader at Lawrence Livermore National Laboratory, and now at the University of Utah) and Peer-Timo Bremer of Lawrence Livermore National Laboratory. The research was supported by National Science Foundation grant CCF 07-02817 and by the Lawrence Scholar Program. (Date of Image: March 2008)


**Topics:** Technology/Internet, Theoretical computer science, Mathematics, Computing, Supercomputer, Algorithm, Computer, Computer graphics, Lawrence Livermore National Laboratory, University of California, Attila Gyulassy
