Prof. Dr. Peter Baumann / Computer Science, Jacobs University, Bremen
Friday, January 19, 2018, 10.15 am, INF 348, Room 013, Heidelberg University, Institute for Geography
Datacubes form an enabling paradigm for serving massive spatio-temporal Earth data in an analysis-ready way by combining individual files into single, homogenized objects for easy access, extraction, analysis, and fusion – “one cube says more than a million images”. In common terms, the goal is to allow users to “ask any question, any time, on any size”, thereby enabling them to “build their own product on the go”.
Today, large-scale datacubes are becoming a reality: for server-side evaluation of datacube requests, a bundle of enabling techniques is known which can massively speed up response times, including adaptive partitioning, parallel and distributed processing, dynamic orchestration of mixed hardware, and even federations of data centers. Known datacube services exceed 600 TB, and datacube analytics queries have been split across 1,000+ cloud nodes. Intercontinental datacube fusion has been accomplished between ECMWF/UK and NCI Australia, as well as between ESA and NASA.
From a standards perspective, as per ISO and OGC, datacubes belong to the family of coverages, i.e., “spatio-temporally varying objects”. The coverage data model is represented by the OGC Coverage Implementation Schema (CIS) standard; the service model by the OGC Web Coverage Service (WCS) together with the OGC Web Coverage Processing Service (WCPS), OGC’s geo datacube query language. Additionally, ISO is finalizing application-independent query support for massive multi-dimensional arrays in SQL.
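To give a flavor of WCPS, the query below extracts a one-year temperature time series at a single location and returns it as CSV; the coverage name (AvgLandTemp) and axis names (Lat, Long, ansi) are illustrative placeholders, not a specific deployed service:

```
for $c in (AvgLandTemp)
return encode(
    $c[ Lat(53.08), Long(8.80), ansi("2014-01" : "2014-12") ],
    "csv"
)
```

Queries of this kind are evaluated entirely on the server, so only the (typically small) result travels to the client – the mechanism behind “ask any question, any time, on any size”.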
In our talk we present the concept of datacubes, the standards that play a role, as well as interoperability successes and open issues, based on our work on the OGC Reference Implementation, rasdaman.
We are looking forward to a large attendance!