Big Compute and Big Data are combining in the cloud, enabling new use cases and insights across industries. The availability of web-scale, low-cost storage and data analytics, coupled with pay-by-the-hour access to many thousands of CPU cores on demand, has led to faster, higher-quality product engineering, more accurate and timely financial analysis, research into new drugs, and more rapid scientific discovery.
This session covers the emerging area of data-intensive, cloud-based HPC, with topics that include cloud-based cluster and job management, automated deployment methods for HPC and data management, and the use of the cloud for remote simulation, visualization, and secure design collaboration. Real-world public and private sector examples are presented.
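To make the "cluster and job management" topic concrete, here is a minimal sketch, assuming a Slurm-managed cluster provisioned on demand in the cloud; neither the scheduler nor the commands are specified in the session description, and the script name, resource requests, and solver binary are illustrative placeholders.

```python
# A minimal sketch (not taken from the session) of submitting a simulation
# job to a Slurm-managed cloud cluster. All names and sizes are placeholders.
import subprocess
from pathlib import Path

job_script = """#!/bin/bash
#SBATCH --job-name=cfd-case-01
#SBATCH --nodes=16
#SBATCH --ntasks-per-node=36
#SBATCH --time=02:00:00

srun ./solver --input case01.cfg
"""

# Write the batch script, then hand it to the scheduler.
Path("case01.sbatch").write_text(job_script)
result = subprocess.run(
    ["sbatch", "case01.sbatch"], capture_output=True, text=True, check=True
)
print(result.stdout.strip())  # e.g. "Submitted batch job 12345"
```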
Solver software infrastructure for exascale applications
David KEYES, Director, Extreme Computing Research Center, King Abdullah University of Science and Technology
At the heart of the G-8’s International Exascale Software Project (launched in 2009) is the belief that a vast array of applications share a core of tasks that can be abstracted and layered in such a way as to be served by a common software infrastructure, more efficiently than if each community develops its own independently. This philosophy is embedded in the “DNA” of the speaker’s institution, KAUST (also launched in 2009), especially with respect to solver software, represented by the speaker’s Extreme Computing Research Center (ECRC). What is general enough to be leveraged over many applications, however, needs to be tuned to specific hardware. Without co-design of software and hardware, the performance of future applications may never exceed today’s few Petaflop/s. The algorithmic adaptations required to migrate today’s successful “bulk synchronous” open source parallel scientific software base to the exascale environment include:
We illustrate with examples from ongoing ECRC research.
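As a concrete illustration of the “bulk synchronous” pattern the abstract refers to, the following is a minimal sketch, not drawn from the talk: every process advances in lockstep, with a synchronous halo exchange and a blocking global reduction in each iteration. The 1-D Jacobi relaxation, mpi4py usage, and problem sizes are assumptions made for the example only.

```python
# Minimal sketch of a bulk synchronous parallel solver iteration (illustrative).
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_local = 1000                      # unknowns owned by this rank (assumed)
u = np.zeros(n_local + 2)           # local values plus two ghost cells
u[1:-1] = np.random.rand(n_local)   # arbitrary initial guess
left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for it in range(100):
    # Synchronous halo exchange: no rank proceeds until its neighbours do.
    comm.Sendrecv(u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[0:1], source=left)

    # Local compute phase (simple Jacobi relaxation).
    u_new = u.copy()
    u_new[1:-1] = 0.5 * (u[:-2] + u[2:])
    local_change = float(np.abs(u_new[1:-1] - u[1:-1]).max())
    u = u_new

    # Global synchronization point: a blocking all-reduce every iteration.
    # Lockstep phases and collectives like these are the kind of structure
    # the abstract says must be relaxed or overlapped at exascale.
    global_change = comm.allreduce(local_change, op=MPI.MAX)
    if global_change < 1e-8:
        break
```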