HotCDP'12: Hot Topics in Cloud Data Processing (Bern, Switzerland)

Processing large data sets in the cloud has been dominated by the map/reduce programming model, originally proposed by Google and widely adopted through the Apache Hadoop implementation. Over the years, developers have identified weaknesses of batch processing and have proposed alternatives. One such alternative is continuous processing of data streams. Stream processing is particularly suitable for applications such as online analytics, monitoring, financial data processing, and fraud detection, which require timely results and for which the delay introduced by batch processing is highly undesirable. This paradigm has led to the development of systems such as Yahoo! S4 and Twitter Storm.

The cloud ecosystem, however, encompasses a rich variety of application requirements. While map/reduce and stream processing together cover a large fraction of the development space, many applications are still better served by different models and systems. The main goal of the Workshop on Hot Topics in Cloud Data Processing (HotCDP) is to foster research in large-scale data processing and to gather both researchers and practitioners working on data processing techniques. The venue will favor work on radical alternatives to existing models and on new designs for processing data in the cloud. Our goal is to attract research that will underpin the next generation of scalable, interactive data processing applications in the cloud.

Important Dates
Paper submission deadline: January 27, 2012
Notification of acceptance: February 27, 2012
Camera-ready deadline: March 5, 2012
Workshop: April 10, 2012

Please email any questions to: