In conjunction with:
7th IEEE/ACM International Conference on Utility and Cloud Computing (UCC 2014)
http://www.cloudbus.org/bdc2014
Rapid advances in digital sensors, networks, storage, and computation, along with their availability at low cost, are leading to the creation of huge collections of data, dubbed Big Data. This data has the potential to enable new insights that can change the way businesses, science, and governments deliver services to their consumers, and can impact society as a whole. This has led to the emergence of the Big Data Computing paradigm, which focuses on the sensing, collection, storage, management, and analysis of data from a variety of sources to enable new value and insights.
To realize the full potential of Big Data Computing, we need to address several challenges and develop suitable conceptual and technological solutions for dealing with them. These include life-cycle management of data, large-scale storage, flexible processing infrastructure, data modelling, scalable machine learning and data analysis algorithms, techniques for sampling and for trading off data processing time against accuracy, and the privacy and ethical issues involved in data sensing, storage, processing, and actions.
The International Symposium on Big Data Computing (BDC) 2014, held in conjunction with the 7th IEEE/ACM International Conference on Utility and Cloud Computing (UCC 2014), December 8-9, 2014, London, UK, aims to bring together international researchers, developers, policy makers, and users, and to provide an international forum for presenting leading research activities, technical solutions, and results on a broad range of topics related to Big Data Computing paradigms, platforms, and their applications. The conference features keynotes, technical presentations, posters, workshops, and tutorials, as well as competitions featuring live demonstrations.
Topics of interest include, but are not limited to:
I. Big Data Science
• Analytics
• Algorithms for Big Data
• Energy-efficient Algorithms
• Big Data Search
• Big Data Acquisition, Integration, Cleaning, and Best Practices
• Visualization of Big Data
II. Big Data Infrastructures and Platforms
• Programming Systems
• Cyber-Infrastructure
• Performance Evaluation
• Fault Tolerance and Reliability
• I/O and Data Management
• Storage Systems (including file systems, NoSQL, and RDBMS)
• Resource Management
• Many-Task Computing
• Many-core Computing and Accelerators
III. Big Data Security and Policy
• Management Policies
• Data Privacy
• Data Security
• Big Data Archival and Preservation
• Big Data Provenance
IV. Big Data Applications
• Scientific Application Case Studies on Cloud Infrastructure
• Big Data Applications at Scale
• Experience Papers with Big Data Application Deployments
• Data Streaming Applications
• Big Data in Social Networks
• Healthcare Applications
• Enterprise Applications
Important Dates:
• Papers Due: September 22nd, 2014 (midnight, Anywhere on Earth)
• Notification of Acceptance: October 15th, 2014
• Camera-Ready Papers Due: October 31st, 2014
Authors are invited to submit papers electronically. Submitted manuscripts should be structured as technical papers and may not exceed 10 letter-size (8.5 x 11 inch) pages, including figures, tables, and references, using the CPS format for conference proceedings: a print area 6-1/2 inches (16.51 cm) wide by 8-7/8 inches (22.51 cm) high, two-column format with columns 3-1/16 inches (7.85 cm) wide and a 3/8 inch (0.81 cm) space between them, and single-spaced, fully justified 10-point Times text. Submissions not conforming to these guidelines may be returned without review. Authors should submit the manuscript in PDF format and make sure that the file will print on a printer that uses letter-size (8.5 x 11 inch) paper. The official language of the meeting is English.
All manuscripts will be reviewed and judged on correctness, originality, technical strength, significance, quality of presentation, and interest and relevance to the conference attendees. Papers conforming to the above guidelines can be submitted through the BDC 2014 paper submission system (https://www.easychair.org/conferences/?conf=bdc2014).
Submitted papers must represent original unpublished research that is not currently under review for any other conference or journal. Papers not following these guidelines will be rejected without review, and further action may be taken, including (but not limited to) notifications sent to the heads of the authors' institutions and the sponsors of the conference. Submissions received after the due date, exceeding the length limit, or not appropriately structured may also not be considered. Authors may contact the conference PC Chair for more information.
Selected papers from BDC 2014 will be invited to be extended and submitted to the Special Issue on Many-Task Computing in the Cloud in the IEEE Transactions on Cloud Computing (http://datasys.cs.iit.edu/events/TCC-MTC15/CFP_TCC-MTC15.pdf).
General Co-Chairs:
• Rajkumar Buyya, University of Melbourne, Australia
• Divyakant Agrawal, University of California at Santa Barbara, USA
Program Co-Chairs:
• Ioan Raicu, Illinois Institute of Technology and Argonne National Laboratory, USA
• Manish Parashar, Rutgers, The State University of New Jersey, USA
Track Co-Chairs:
• Big Data Science
Cyber Chair:
• Amir Vahid, University of Melbourne, Australia
Publicity Chairs:
• Carlos Westphall, Federal University of Santa Catarina, Brazil
• Ching-Hsien Hsu, Chung Hua University, Taiwan & Tianjin University of Technology, China
• Rong Ge, Marquette University, USA
• Giuliano Casale, Imperial College London, UK
Organising Chair:
• Ashiq Anjum, University of Derby, UK
Program Committee:
Keynote: Adrian Cockcroft, Battery Ventures
For traditional datacenter applications, capacity is a fixed upfront cost, so there is little incentive to stop using it once it has been allocated, and it has to be over-provisioned most of the time so that there is enough capacity for peak loads. When traditional application and operating practices are used in cloud deployments, there are immediate benefits in speed of deployment, automation, and transparency of costs. The next step is a re-architecture of the application to be cloud native, and the resulting significant reductions in operating cost can help justify the development work. Cloud-native applications are dynamic and use ephemeral resources that are charged for only while they are in use. This talk will discuss best practices for cloud-native development, test, and production deployment architectures that turn off unused resources and take full advantage of optimizations such as reserved instances and consolidated billing.
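The "turn off unused resources" idea is easy to make concrete. Below is a minimal sketch, assuming an AWS EC2 deployment managed with the boto3 Python SDK, of the kind of automation the talk advocates; the "environment: dev" tag convention and the region are illustrative assumptions, not details from the talk.

```python
# Hypothetical sketch: stop tagged, non-production EC2 instances outside
# business hours so they stop accruing compute charges. Assumes boto3 is
# installed and AWS credentials are configured; the tag name/value is an
# example convention, not anything prescribed by the talk.
import boto3

def stop_idle_dev_instances(region="eu-west-1"):
    ec2 = boto3.client("ec2", region_name=region)
    # Find running instances explicitly tagged as ephemeral dev resources.
    resp = ec2.describe_instances(
        Filters=[
            {"Name": "tag:environment", "Values": ["dev"]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )
    ids = [
        inst["InstanceId"]
        for reservation in resp["Reservations"]
        for inst in reservation["Instances"]
    ]
    if ids:
        # Stopped instances incur no compute charges until restarted.
        ec2.stop_instances(InstanceIds=ids)
    return ids

if __name__ == "__main__":
    print("Stopped:", stop_idle_dev_instances())
```

Run on a schedule (e.g., a nightly cron job), a script like this captures the core cloud-native cost behaviour the abstract describes: capacity is released, and billing stops, when the resources are not in use.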
Adrian Cockcroft has had a long career working at the leading edge of technology. He’s always been fascinated by what comes next, and he writes and speaks extensively on a range of subjects. At Battery, he advises the firm and its portfolio companies about technology issues and also assists with deal sourcing and due diligence.
Before joining Battery, Adrian helped lead Netflix’s migration to a large scale, highly available public-cloud architecture and the open sourcing of the cloud-native NetflixOSS platform. Prior to that at Netflix he managed a team working on personalization algorithms and service-oriented refactoring.
Adrian was a founding member of eBay Research Labs, developing advanced mobile applications and even building his own homebrew phone, years before iPhone and Android launched. As a distinguished engineer at Sun Microsystems he wrote the best-selling “Sun Performance and Tuning” book and was chief architect for High Performance Technical Computing.
Joint keynote with UCC and the Cloud Control Workshop.
Keynote: Yike Guo, Imperial College London
Modern medicine is becoming a data-driven science. Improving patient care and developing personalized therapies depend increasingly on an organization's ability to rapidly and intelligently leverage complex molecular and clinical data from a variety of internal, partner, and public sources. At Imperial, we are leading the development of the eTRIKS platform, which aims to provide an open-source architecture for the analysis of this massive amount of molecular data in the context of personalized medicine research. The project mandates the development of a scalable, performant, fault-tolerant, and flexible solution for medical big data on the cloud. This presentation gives an overview of the eTRIKS project. We will then focus on the cloud-based analytical engine built upon the ICBIG data management system at the newly established Data Science Institute at Imperial College London. The base of this platform is a stack containing Hadoop, Spark, and many other frameworks from the Apache Hadoop ecosystem, which together provide big data management support. On top of that, we are building an analytical engine to provide high-performance medical bioinformatics services for large-scale clinical research in personalized medicine.
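As a flavour of the analytics such a Hadoop/Spark stack supports, here is a minimal PySpark sketch that summarises gene expression values across a patient cohort; the file path, column names, and schema are illustrative assumptions, not details of the actual eTRIKS or ICBIG platform.

```python
# Minimal PySpark sketch of the kind of job a Hadoop/Spark medical-data
# stack runs: aggregate per-gene expression statistics across patients.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("expression-summary").getOrCreate()

# Hypothetical expression matrix in long form:
# one (patient_id, gene, expression) row per measurement.
df = spark.read.csv("hdfs:///data/expression.csv", header=True, inferSchema=True)

# Per-gene mean and variance across the cohort -- a basic building block
# for stratifying patients in downstream analyses.
summary = df.groupBy("gene").agg(
    F.mean("expression").alias("mean_expr"),
    F.variance("expression").alias("var_expr"),
)

# Genes with the highest variance are often the most informative.
summary.orderBy(F.desc("var_expr")).show(10)
spark.stop()
```

Because the data frame operations are distributed by Spark over the underlying Hadoop storage, the same few lines scale from a laptop test file to a cohort-sized dataset, which is the property the abstract emphasises.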
Professor Yike Guo is the founding director of the multi-disciplinary Data Science Institute at Imperial College London. He has carried out research and development in the field of data-intensive analytical computing at Imperial since 1995, when he was the technical director of Imperial's Parallel Computing Centre. Professor Guo focuses on applying data mining technology to scientific data analysis, in the fields of life science, healthcare, environmental science, and economics. He has published nearly 200 research papers.
Joint keynote with UCC
Keynote: Andreas Koop (Roche Diagnostics) and Oliver Vettel (inhive group)
Understand how Roche Diagnostics is taking clinical intelligence to the next level: the talk covers experiences in tackling ever-changing data models in a validated environment, the inclusion of genomics or proteomics data to stratify patients in a clinical trial, and how to take the next step towards real-life data science.
Dr. Andreas Koop is IT program manager within the solution center for research & development at Roche Diagnostics and globally responsible for the clinical intelligence & integration program at Roche. With a background in medical informatics, Andreas has many years of experience in the healthcare industry.
Oliver Vettel is chief technology & information officer at the inhive group, located in Lorsch, Basel and Dubai. As a technology advisor and solution architect to Roche, he leads the execution of the clinical intelligence & integration strategy from a technical perspective.
Headquartered in Basel, Switzerland, Roche is a leader in research-focused healthcare with combined strengths in pharmaceuticals and diagnostics. Roche is the world’s largest biotech company, with truly differentiated medicines in oncology, immunology, infectious diseases, ophthalmology and neuroscience. Roche is also the world leader in in vitro diagnostics and tissue-based cancer diagnostics, and a frontrunner in diabetes management. Roche’s personalised healthcare strategy aims at providing medicines and diagnostics that enable tangible improvements in the health, quality of life and survival of patients. Founded in 1896, Roche has been making important contributions to global health for more than a century. Twenty-four medicines developed by Roche are included in the World Health Organisation Model Lists of Essential Medicines, among them life-saving antibiotics, antimalarials and chemotherapy.
Joint keynote with UCC