Last updated on April 18, 2014 at 17:24 EDT

Live Demonstration of 21st Century National-scale Team Science

August 15, 2005

There are four wings to the Earth Science building of the Goddard Space Flight Center (GSFC) in Greenbelt, MD. But on August 8, “we added a virtual fifth wing,” says NASA Emeritus Scientist Milton Halem. That new wing used experimental OptIPuter software to create a ‘high-performance collaboratory’ with the University of California, San Diego’s Scripps Institution of Oceanography, which allowed scientists to establish high-definition telepresence while also collaborating in real time on visualizing massive amounts of remote land and weather data.

Working closely with researchers at the California Institute for Telecommunications and Information Technology (Calit2) at UCSD and the Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago (UIC), GSFC networking and visualization staff conducted the first successful system test of a new coast-to-coast, 10-Gigabit-per-second (Gbps) Ethernet cyber backplane, or ‘lambda’, linking the NASA research center to UCSD some 3,000 miles away.

The demonstration was dedicated to outgoing NASA Associate Administrator of Science Alphonso Diaz, who funded the project in March 2004 when he was still director of GSFC. “The demo exceeded every expectation I had when I initiated the program for information technology at Goddard,” says Diaz, who attended the event only days before his retirement from NASA. “At that time I hoped that this project would serve as a demonstration of the value of IT investments in the conduct of NASA-sponsored science, particularly Earth science. Not only did it do that, but I hope the demonstration served to promote further investment.”

With this high-performance optical pathway established, OptIPuter technologies were shown to perform complex scientific analysis and visualization of voluminous data sets located across the continent as though the data resided on a local server. Other demonstrations showed interactive visualization of, or ‘roaming’ over, image data at resolutions roughly 1,000 times greater than is feasible using the World Wide Web over the shared Internet. The OptIPuter research project is funded by the National Science Foundation (NSF) and led by Calit2 and EVL; Earth science research at Scripps is one of the project’s two driving applications.

“Information technology is changing the way that teams of scientists collaborate on large-scale problems that involve huge amounts of data,” says NASA’s Halem. “We are prototyping a new information infrastructure to provide NASA scientists along with their observational data holdings and models interoperable links with scientists at other research institutions engaged in collaborative investigations and studies. These successful demonstrations showed significant progress in achieving the Holy Grail of collaborative computational Earth and space science for the early decades of the 21st century.”

The Aug. 8 demonstration gave Goddard officials a glimpse of a future in which a scientist at the research center will be able to interactively visualize large remote data sets generated either by satellites or by supercomputers connected to the National LambdaRail (NLR). NLR is a nationwide fiber optic network infrastructure designed to give academic scientists the predictable, large bandwidth they need to do advanced research.

At GSFC, researchers led by Christa Peters-Lidard were able to call up large Land Information System (LIS) data sets of more than 20 GB residing at Scripps and rendered on the OptIPuter visualization cluster in La Jolla. The graphic images then traveled over the NLR to Goddard, where they were displayed on a large, tiled display called a HyperWall. Simultaneously, on the same screen, the Goddard group was viewing a high-definition (HD) video stream of Calit2 director Larry Smarr and Scripps scientists V. Ramanathan and Jean-Bernard Minster sitting in front of the VizCluster. Observes Smarr: “It really is like the two research centers are just next door to each other!”

The software framework driving the HyperWall was the Scalable Adaptive Graphics Environment (SAGE), developed at EVL. It enabled the simultaneous remote visualization of NASA’s large-scale datasets and HD video streams from Scripps. “From the user’s perspective it acts like a desktop window manager allowing scientists to easily place different applications on different areas of the display,” says Nicholas Schwarz, an EVL graduate student who spent the summer at UCSD and worked closely with the NASA team for many weeks before the demonstration to help them visualize their data with EVL’s software. “SAGE decouples the rendering of high-resolution graphics from its display so that sources can be streamed from remote locations to a tiled display over high-bandwidth networks.” Two applications — also developed at EVL — operated using SAGE for the NASA demonstrations: TeraVision, a system for streaming HD video from any source; and JuxtaView, a system for displaying high-resolution image data.
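SAGE’s core idea, as described above, is decoupling rendering from display: remote sources stream already-rendered pixels into windows on a tiled wall, much as a desktop window manager places applications on a screen. The sketch below illustrates that separation in miniature; all class and method names are illustrative inventions, not SAGE’s actual API.

```python
# Minimal sketch of SAGE-style decoupled rendering: sources render frames
# independently, and a display-manager layer maps each stream onto a
# rectangular region of a tiled wall. Names are hypothetical, not SAGE's API.

class PixelStream:
    """A remote source that renders frames with no knowledge of the display."""
    def __init__(self, name, width, height):
        self.name, self.width, self.height = name, width, height

    def render_frame(self, frame_no):
        # Stand-in for real rendering; returns metadata plus an RGB buffer.
        meta = {"source": self.name, "frame": frame_no}
        return meta, bytearray(self.width * self.height * 3)

class TiledDisplay:
    """Window-manager-like layer: places streams on regions of the wall."""
    def __init__(self, wall_width, wall_height):
        self.wall = (wall_width, wall_height)
        self.windows = {}  # stream name -> (x, y) placement

    def place(self, stream, x, y):
        if x + stream.width > self.wall[0] or y + stream.height > self.wall[1]:
            raise ValueError("window does not fit on the wall")
        self.windows[stream.name] = (x, y)

    def present(self, stream, frame_no):
        # In SAGE the pixels would arrive over a high-bandwidth network;
        # here the call is local, but the display logic is the same.
        meta, pixels = stream.render_frame(frame_no)
        x, y = self.windows[stream.name]
        return f"{meta['source']} frame {meta['frame']} at ({x},{y}), {len(pixels)} bytes"

# Two independent sources sharing one wall, loosely mirroring the demo:
# an HD video stream and a high-resolution data image side by side.
wall = TiledDisplay(8192, 3072)
hd_video = PixelStream("TeraVision-HD", 1920, 1080)
lis_data = PixelStream("JuxtaView-LIS", 4096, 2048)
wall.place(hd_video, 0, 0)
wall.place(lis_data, 2048, 0)
print(wall.present(hd_video, 1))
```

Because sources only push pixels and the display only places windows, either side can change (a new data set, a bigger wall) without the other knowing, which is the property that let HD video and LIS visualizations share one HyperWall.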

“This capability is going to become increasingly important as the explosion in sensor networks or grids continues, real-time data streams grow in size, and models assimilate these data to extend predictions into future hours and days,” notes Scripps’ John Orcutt, Director of UCSD’s Center for Earth Observations and Applications (CEOA). “Next month UCSD, Scripps, the University of Washington and Woods Hole will use CAVEwave to bring real-time HD video from the seafloor to the iGrid meeting at Calit2 and HD back to the ship above the remotely operated vehicle (ROV) on the seafloor. The NSF Laboratory for Ocean Observatory Knowledge and Information Grid (LOOKING) project is funding this effort. We’re all getting tangible samples of the collaborative science of the future.”

The collaboration began serendipitously in March 2003, when Smarr was invited to give the annual Distinguished Information Science and Technology colloquium seminar at GSFC. He described the then newly NSF-funded OptIPuter project and the large-scale science research it could enable at NASA Goddard. Diaz, then director of the research center, seized on the potential that extreme bandwidth could hold for NASA researchers. He asked his former Goddard Chief Information Officer, Milt Halem, to accept an assignment to work with GSFC information scientists and Smarr’s team to explore the use of the OptIPuter paradigm for meeting the next-generation needs of NASA’s Earth Observing System Data Information System. One year later, Halem and his information pathfinding colleagues submitted and won an internal GSFC Internal Research and Development (IRAD) award to help prototype a 10-Gbps lambda network to create a virtual presence for Scripps at GSFC.

Even though GSFC’s participation in establishing the network was partially funded through an internal award, its academic partners are freely contributing their time and resources in hopes of extending the value of their own research and infrastructure investments. “Volunteerism is a huge thread in this collaboration,” says Calit2’s Smarr, who is also the principal investigator on the OptIPuter project and previous chair of NASA’s Earth Systems Science Advisory Committee. “NASA did not fund Calit2 or SIO to do this, and NSF didn’t have NASA in mind when it funded the OptIPuter project. But it became clear to Al Diaz and our team that NASA could become an important testing ground for these cyberinfrastructure technologies that NSF-funded researchers are developing to enable a new generation of scientific and engineering facilities.”

The OptIPuter uses the 10-Gbps CAVEwave lambda (funded by EVL and dedicated to the OptIPuter and allied national and international projects), which travels over the National LambdaRail from San Diego to Seattle and then to the StarLight facility in Chicago. From Chicago, the path continues over the NLR to its hub in McLean, VA. The final leg from McLean to GSFC’s facility in Greenbelt, MD, was provided by the mid-Atlantic DRAGON (Dynamic Resource Allocation over GMPLS Optical Networks) consortium.

“The technical challenges often involved troubleshooting the convergence of many leading-edge, pre-commercial hardware/software components and individual systems into a seamless integration,” notes GSFC Lambda Network Project Manager Pat Gary. “To isolate, diagnose, and resolve problems perceived to be network-related, we utilized 10-GE connected workstations hosting the GSFC-developed software-based nuttcp network performance measurement tool deployed at UCSD, StarLight/Chicago, McLean, and GSFC. We wanted and achieved 10-Gbps wire-speed performance end-to-end across the network; but we had to tune not only network hardware-based features but also end-user computer device drivers, TCP stacks, Linux operating system parameters, and user application software.”
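One reason the end-host tuning Gary describes matters at 10 Gbps is the bandwidth-delay product: TCP must be able to keep a full round trip’s worth of data in flight, so socket buffers must scale with both link speed and distance. A back-of-the-envelope calculation (the ~70 ms coast-to-coast round-trip time is an assumed illustrative figure, not from the article):

```python
# Bandwidth-delay product (BDP) for a 10-Gbps coast-to-coast path.
# The 70 ms round-trip time is an illustrative assumption for a
# San Diego <-> Greenbelt route, not a measured value from the demo.
link_gbps = 10
rtt_s = 0.070

bdp_bits = link_gbps * 1e9 * rtt_s   # bits in flight to fill the pipe
bdp_mb = bdp_bits / 8 / 1e6          # same quantity in megabytes
print(f"BDP: {bdp_mb:.1f} MB of in-flight data")
```

Default TCP buffers of a few hundred kilobytes would cap throughput far below wire speed on such a path, which is why the team had to tune TCP stacks, device drivers, and Linux parameters, and why a measurement tool like nuttcp was needed to verify each segment.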

“Establishing this optical ‘clear channel’ between two of the nation’s premier centers for earth system science has been a project Milt and I have driven for the past two years,” explains Calit2’s Smarr. Adds Aaron Chin, the lead OptIPuter network engineer at UCSD: “With the new 10-Gbps link, we have extended the Calit2 OptIPuter ‘living laboratory’ for earth sciences on the UCSD campus to our colleagues at Goddard.”

Participants uniformly agreed that it took an extraordinary group effort to pull off the demo, with researchers and technicians in California and Maryland working long hours and weekends. One problem to be overcome was the last mile of the NLR from Goddard in Greenbelt, MD to McLean, VA, where the NLR hub was located. This final link was completed with the collaboration of another NSF-funded project, DRAGON, at the University of Maryland, College Park. Compounding efforts to test the system, three days before the event a 24-hour network outage occurred, the result of a backhoe accidentally cutting the Seattle-to-Sunnyvale segment of the network. “The backhoe accident had us temporarily out of action and could have been a demo killer except for the rapid response of our friends at NLR,” recalls NASA Goddard network engineer Bill Fink. “In the end, the demo was a great success owing to the splendid collaboration of literally scores of people who made significant contributions to the effort.”

“We learned again just how complex these ‘eruptive’ networking events can become,” adds Calit2 and EVL researcher Tom DeFanti, a co-PI on the OptIPuter project. “This was a big deal and the culmination of a year’s effort, but it was all worth it. I’m sure we will have other opportunities to work with our friends at NASA Goddard based on the success of this demo.”

Officials hope to extend the network infrastructure further within NASA, initially to its Ames Research Center in Silicon Valley. For the Aug. 8 demo, Ames researcher Chris Henze demonstrated real-time forecasting visualizations for Hurricane Irene at every time-step of a model predicting Irene’s movements. But since Ames is not yet hooked up to the NLR, the animations were assembled and compressed into MPEG movies and delivered over Internet2 instead.

“Everyone could sense how much more value there would be if those real-time images could be accessed without any compression at all, and that’s what a 10-Gbps link would allow,” says NASA’s Halem. “The hurricane demo indicated the vast potential that the NLR network offers for simulating and analyzing multi-decade climate simulations.”
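Halem’s point about uncompressed access can be made concrete with a quick bandwidth estimate: a single uncompressed HD video stream already consumes a sizeable fraction of an ordinary gigabit link, so several such streams plus data visualizations need a 10-Gbps-class path. The frame format below (8-bit RGB at 30 frames per second) is an illustrative assumption:

```python
# Back-of-the-envelope bandwidth for one uncompressed HD video stream.
# 1080p at 8-bit RGB, 30 frames per second -- illustrative figures.
width, height = 1920, 1080
bits_per_pixel = 24        # 8 bits each for R, G, B
frames_per_second = 30

gbps = width * height * bits_per_pixel * frames_per_second / 1e9
print(f"Uncompressed HD: {gbps:.2f} Gbps per stream")
```

At roughly 1.5 Gbps per uncompressed stream, a shared-Internet connection forces compression (as with the MPEG movies from Ames), while a dedicated 10-Gbps lambda can carry multiple raw streams alongside visualization traffic.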

The three applications in the demonstration involved Earth science research, where collaboration is made difficult because the data sets are so large. Scripps researchers Ramanathan and Minster are eager to use the newly established link to greatly increase their access to NASA Goddard’s 2-petabyte repository of satellite data on earth systems science and to the Project Columbia supercomputer at NASA Ames. “Having been involved in satellite instruments and global modeling, I know that less than one percent of the information has been looked at, primarily because of the lack of access to remote users,” said Ramanathan. “This system seems to have the capability to open this new door to the research community.” Ramanathan now hopes to use the infrastructure to peer 100 to 150 years into the future for an assessment of the impact of aerosols on world climate change.

“I believe that this project will greatly enhance the ability of scientists to fuse very large data sets currently stored at NASA’s Distributed Active Archive Centers and Federation of Earth Science Information Partners,” adds Minster. “They will be able to do so without having to endure the tedious task of first subsetting all these data sets in a consistent way, and collecting all the subsets in a single location.”

The OptIPuter project played a critical role in ensuring that the San Diego end of the network could support the extreme bandwidth required for transmitting data to and from Scripps. Much of the integration of the tiled display used at Scripps occurred at the UCSD-based National Center for Microscopy and Imaging Research (NCMIR), an NIH/NCRR-supported biotechnology resource. Scripps and NCMIR drive the OptIPuter’s two application areas: Earth sciences and biomedical imaging, respectively. NCMIR’s David Lee was on hand for the demo at Scripps (see Collaborators below).

Participating in the demo as one of his last official functions before retiring from NASA, Al Diaz was impressed and touched at the effort. “To have the demo turned into a tribute was an emotional highlight to a major transition in my career and my life,” says Diaz. “I am thankful to Milt Halem and Larry Smarr for turning one of my dreams into reality.”

GSFC officials hope to do a more advanced demonstration in September. There are also plans to extend the network to NASA Ames in Silicon Valley, and subsequently to the Jet Propulsion Laboratory at Caltech in Pasadena, CA. “It shows what power bottom-up teams have to demonstrate the capability of new technologies to transform scientific research,” concludes Smarr. “It’s one of those quantum leaps that come along only once in a long while. We will now work to obtain the funding necessary to enable researchers who want to use this new infrastructure that we have prototyped.”

On the World Wide Web:

University of California – San Diego