New, inexpensive, and reliable sensing devices, such as weather cameras and floats, are allowing us to monitor a variety of phenomena in real time on an unprecedented scale. This influx of data requires rapid processing turnaround, frequently on tight deadlines, driven by decision making and infrastructure adaptation. By providing on-demand resources, cloud computing makes it possible to construct observatories that turn incoming observation data into insight on the fly. By combining real-time data with real-time processing, such observatories stand to become a new, transformative tool with the potential to revitalize many areas of science.
One such observatory is being constructed by the Ocean Observatories Initiative (OOI) project, which gathers and processes data to provide real-time insight into processes taking place in the ocean. Kate Keahey, a computer scientist in Argonne's Mathematics and Computer Science Division and lead designer for OOI's Common Execution Infrastructure, is exploring how such observatories can transform the way science is done.
The construction of an observatory raises many challenges, from providing collaborative facilities to executing analysis code in a highly available and scalable fashion while maintaining satisfactory response times in the face of failures and peaks in demand. Similar challenges and requirement patterns arise in observatories devoted to other areas of science, such as Argonne's Forest project, which focuses on environmental data.
Keahey discussed these challenges in a keynote address at the ACM Cloud and Autonomic Computing Conference (CAC 2013) in Miami, Florida, in early August. She described the Nimbus Platform services her team is developing, which integrate streams generated by sensor devices, scale to varying loads, and support the dynamic addition and removal of processing elements.
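To illustrate the idea of dynamically adding and removing processing elements over a sensor stream, the toy sketch below uses a thread-based worker pool. All names (ProcessingPool, add_worker, remove_worker) are illustrative assumptions for this example, not the actual Nimbus Platform API, which provisions cloud resources rather than local threads.

```python
import queue
import threading

class ProcessingPool:
    """Toy stream-processing pool whose workers can be added and
    removed at runtime, mimicking elastic scaling of processing
    elements over an incoming observation stream."""

    def __init__(self, handler):
        self.handler = handler          # function applied to each observation
        self.stream = queue.Queue()     # incoming sensor observations
        self.results = queue.Queue()    # processed output
        self.workers = []               # (thread, stop-flag) pairs

    def _work(self, stop):
        # Consume observations until this worker is told to stop.
        while not stop.is_set():
            try:
                obs = self.stream.get(timeout=0.1)
            except queue.Empty:
                continue
            self.results.put(self.handler(obs))
            self.stream.task_done()

    def add_worker(self):
        # Scale out: start one more processing element.
        stop = threading.Event()
        t = threading.Thread(target=self._work, args=(stop,), daemon=True)
        t.start()
        self.workers.append((t, stop))

    def remove_worker(self):
        # Scale in: retire the most recently added processing element.
        t, stop = self.workers.pop()
        stop.set()
        t.join()

# Usage: scale out for a burst of observations, then scale back in.
pool = ProcessingPool(lambda reading: reading * 2)
pool.add_worker()
pool.add_worker()                        # add capacity for a demand peak
for reading in range(10):
    pool.stream.put(reading)
pool.stream.join()                       # wait until the burst is processed
pool.remove_worker()                     # drop back to one worker
processed = sorted(pool.results.get() for _ in range(10))
```

In a real observatory the workers would be virtual machines or containers provisioned on demand rather than threads, but the control pattern, monitoring a stream's backlog and adjusting the number of consumers, is the same.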
For further information about the conference and the keynote address, see the CAC 2013 website.