Keynote Speakers

Per Stenstrom

Title: Efficient Computing in the Post-Moore Era

Monday, 25 June 2018 | 9:15 – 10:00

Abstract: Moore’s Law has enabled a smooth ride towards higher computational speed through an exponential growth of computational resources on a compute chip. Unfortunately, this ride is about to end. We need innovative ways of using fixed compute-chip resources more efficiently, thus keeping the software community productive. My vision, driving the agenda of this talk, is a paradigm in which compute resources are used efficiently through collaborative measures across the compute stack. I will provide two examples currently being explored in my research. In the first approach, collaborative measures are taken across the parallel programming model, the run-time system and the chip multiprocessor architecture to manage cache hierarchies more efficiently. The second approach centers on controlled approximations: by having the user provide intuitive measures of quality, especially in computer graphics applications, resource utilization can be improved. These measures can propagate from the programming-model level, through the compiler, to the GPU architecture, to use on-chip memory resources more efficiently in an automatic fashion. Both are examples of the opportunities opened by this vision, currently being explored in my research group.

Short bio

Per Stenstrom is a professor at Chalmers University of Technology. His research interests are in parallel computer architecture. He has contributed four textbooks, more than 150 publications and 10 patents in this area. He is a leader in computer architecture, having chaired the technical programs of premier conferences such as IEEE/ACM ISCA, IEEE HPCA and IEEE IPDPS, and the editorial boards of premier journals such as ACM TACO and the Journal of Parallel and Distributed Computing. He has also been involved in several startup companies, currently as Chief Scientist of ZeroPoint Technologies. He is a Fellow of the ACM and the IEEE and a member of Academia Europaea, the Royal Swedish Academy of Engineering Sciences and the Royal Spanish Academy of Engineering Science.


Henning Müller

Title: Research infrastructures and medical image analysis

Monday, 25 June 2018 | 14:00 – 14:45

Abstract: Medical institutions produce massive amounts of data, and imaging data accounts for one of the largest volumes. Quantitative analysis of imaging data is still relatively difficult and often a semi-automatic process, but automatic pipelines, in approaches such as radiomics, aim to extract quantitative information from medical imaging data more effectively.
Research infrastructures play a major role in image analysis, both in handling the large amounts of data and complex algorithms and simply in getting access to the data, which often cannot easily leave institutions in large quantities. For this reason, Evaluation-as-a-Service approaches have been developed that move the algorithms to the data. This talk will explore the link between research infrastructures and medical visual decision support.

Short bio

Henning Müller studied medical informatics at the University of Heidelberg, Germany, then worked at Daimler-Benz research in Portland, OR, USA. From 1998 to 2002 he worked on his PhD degree at the University of Geneva, Switzerland, with a research stay at Monash University, Melbourne, Australia in 2001. Since 2002 Henning has been working in medical informatics at the University Hospitals of Geneva, where he habilitated in 2008 and was named titular professor in 2014. Since 2007 he has been a professor in business informatics at the HES-SO Valais in Sierre, and since 2011 he has been responsible for the eHealth unit in Sierre. Henning was coordinator of the Khresmoi project, scientific coordinator of the VISCERAL project, and initiator of the ImageCLEF benchmark. He has authored over 400 scientific papers, serves on the editorial boards of several journals, and reviews for many journals and funding agencies around the world.

In 2015–2016 Henning was a visiting professor at the Martinos Center in Boston, MA, USA, part of Harvard Medical School and the Massachusetts General Hospital (MGH), working on collaborative projects in medical imaging and system evaluation, among others in the context of the Quantitative Imaging Network of the National Cancer Institute.


Franck Cappello

Title: Keeping up with the flood of scientific data

Tuesday, 26 June 2018 | 9:15 – 10:00

Abstract: Extreme-scale scientific simulations and experiments on scientific instruments are already generating more data than can be communicated, stored and analyzed. The data flood will get even worse with future exascale systems and scientific instruments updated with higher-definition sensors. Scientific data reduction is a necessity to drastically accelerate I/O and reduce the data footprint on storage, but also to significantly speed up computation, as demonstrated by the 2017 Gordon Bell award winner. But reduction should be performed wisely, to keep the information that matters to the scientists. The current approach of data decimation (dropping x% of the produced data) severely degrades the data analytics performed on the reduced data. Instead of dropping data, we can try to develop application-specific lossy data reduction techniques or to compress the dataset with advanced generic lossless compression algorithms. Unfortunately, these two approaches are either impractical for most applications or do not provide enough data reduction for scientific datasets. Lossy compression seems the only practical and effective direction to significantly reduce scientific datasets for a broad spectrum of applications.
Lossy compression of scientific data is a very young research domain with many opportunities to explore and discover new techniques. In this talk, we will present challenges and opportunities in terms of compression algorithms and the application of lossy compression to scientific data. We will detail not only the best-in-class compression algorithms but also the tools to comprehensively assess the error introduced by lossy compression. Although impressive compression factors can be reached on scientific datasets, many questions remain open: What are the right metrics to qualify compression quality? How can we control the compression to match the increasing list of user quality requirements? How do errors introduced during compression affect the subsequent steps of the application?
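As a concrete illustration of the kind of error assessment the abstract refers to (not taken from the talk itself), the following minimal Python sketch computes common pointwise metrics used to judge lossy compression quality: maximum absolute error, range-normalized RMSE and PSNR. The synthetic field and the lossy_roundtrip stand-in for a real error-bounded compressor (such as SZ or ZFP) are hypothetical placeholders.

import numpy as np

def compression_error_metrics(original, decompressed):
    """Common pointwise metrics used to judge lossy compression quality."""
    diff = decompressed - original
    value_range = original.max() - original.min()      # data range used for normalization
    max_abs_err = np.abs(diff).max()                    # worst-case pointwise error
    rmse = np.sqrt(np.mean(diff ** 2))                  # root-mean-square error
    return {
        "max_abs_err": max_abs_err,
        "nrmse": rmse / value_range,                    # range-normalized RMSE
        "psnr_db": 20 * np.log10(value_range / rmse),   # peak signal-to-noise ratio (dB)
    }

# Hypothetical stand-in for a real lossy compressor round-trip (e.g. SZ or ZFP):
# uniform quantization with a fixed absolute error bound mimics error-bounded lossy behavior.
def lossy_roundtrip(data, abs_error_bound=1e-3):
    step = 2 * abs_error_bound
    return np.round(data / step) * step

field = np.random.default_rng(0).random((128, 128))     # synthetic stand-in for a scientific field
print(compression_error_metrics(field, lossy_roundtrip(field)))

In practice such metrics are computed by dedicated assessment tools over the output of real compressors; the sketch only spells out the quantities behind the open questions listed above.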

Short bio

Franck is a senior computer scientist at Argonne National Laboratory and an adjunct associate professor in the Department of Computer Science at the University of Illinois at Urbana-Champaign. He is the director of the Joint Laboratory on Extreme Scale Computing, which gathers six of the leading high-performance computing institutions in the world: Argonne National Laboratory (ANL), the National Center for Supercomputing Applications (NCSA), Inria, the Barcelona Supercomputing Center (BSC), the Jülich Supercomputing Centre (JSC) and RIKEN AICS. Franck is an expert in parallel/distributed computing and high-performance computing. Recently he started investigating lossy compression for scientific datasets, responding to the pressing need for significant data reduction from scientists performing large-scale simulations and experiments. Franck is a member of the editorial board of IEEE Transactions on Parallel and Distributed Systems and of the IEEE CCGRID steering committee. He is a fellow of the IEEE.


Costas Bekas

Title: HPC Frontiers in Cognitive Computing

Tuesday, 26 June 2018 | 14:00 – 14:45

Abstract: We are experiencing an unprecedented increase in the volume of data. Next to structured data, which originates from sensors, experiments and simulations, unstructured data in the form of text and images poses great challenges for computing systems. Cognitive computing aims at extracting knowledge from all kinds of data sources and applies powerful big data and analytics algorithms to help decision making and to create value. In this context, large-scale machine learning and pattern recognition play a central role. Advances in algorithms as well as in computing architectures are much needed in order to achieve the full potential of cognitive computing. We will discuss changing computing paradigms and algorithmic frontiers, and we will provide practical examples from recent state-of-the-art cognitive solutions in key areas such as novel materials design.

Short bio

Costas Bekas manages the Foundations of Cognitive Computing group at IBM Research-Zurich. He received B.Eng., MSc and PhD degrees, all from the Computer Engineering & Informatics Department, University of Patras, Greece, in 1998, 2001 and 2003, respectively. From 2003 to 2005, he worked as a postdoctoral associate with Prof. Yousef Saad at the Computer Science & Engineering Department, University of Minnesota, USA. He has been with IBM since September 2005. Costas’s main research interests span HPC, massive-scale analytics, cognitive computing, and energy-aware algorithms and architectures.
Costas is a recipient of the PRACE 2012 award and the ACM Gordon Bell 2013 and 2015 prizes.


Bilel Jamoussi

Title: Machine Learning for 5G and Future Networks

Wednesday, 27 June 2018 | 9:15 – 10:00

Abstract: Artificial Intelligence, including Machine Learning, is now possible thanks to the increasing availability of datasets and to high-performance computing. This keynote will provide an update on the latest developments in the international standards efforts by the ITU, including Artificial Intelligence for good, Data Processing and Management in support of IoT and Smart Cities, and Machine Learning for 5G and future networks. We will present the results of the first and second ITU Summits on AI for Good, and highlight the work in progress on the use cases for Machine Learning for 5G and future networks, the data sets, and the network architecture.

Short bio

Since 2010, Dr. Bilel Jamoussi has been Chief of the Study Groups Department of the ITU Standardization Bureau in Geneva, Switzerland. Prior to 2010, Jamoussi worked for 15 years for a telecommunication equipment and solutions provider, in Canada and then in the United States, where he held several leadership positions and was granted 22 US patents in diverse areas including packet, optical, wireless, and quality of service technologies. He holds BSc, MSc and PhD degrees in Computer Engineering from the Pennsylvania State University, USA. He is fluent in Arabic, French, and English and speaks some Spanish and German.


Wolfgang E. Nagel

Title: Digitization and Data Analytics: Architectures, Methods, and Consequences

Wednesday, 27 June 2018 | 14:00 – 14:30

Abstract: These days, digitization as a driver for development is discussed in all kinds of media, and it significantly changes life, social interaction, most forms of business and economy, and also science. It is driven by technology and by the fact that digital objects can be stored and analyzed much more easily than any analog representation, can be copied without loss, and can be transferred quickly to any place in the world at very limited or no cost. With this, digital information becomes an unlimited global resource. The efficient and intelligent handling of large, often distributed and heterogeneous, data sets containing these digital objects increasingly determines scientific and economic competitiveness in most application areas. Mobile applications, social networks, multimedia collections, sensor networks, data-intensive scientific experiments and complex simulations generate a data deluge. Innovative methods to process and analyze these data sets open up various new opportunities for their exploitation and for new insights. Nowadays, sophisticated and specialized hardware options are available in the HPC area to provide architectures adjusted to the needs of different data analytics workloads. The talk will present methods to provide tailored storage and computing frameworks for big data analytics, and their use in scientific application fields. The talk will also outline the early and potential long-term consequences of digitization and data analytics in scientific research environments.

Short bio

Wolfgang E. Nagel holds the Chair for Computer Architecture at TU Dresden and is the founding director of the Center for Information Services and High Performance Computing (ZIH). His research covers programming concepts and software tools to support the development of scalable applications, analysis of computer architectures, and development of efficient parallel algorithms and methods. In recent years, he has extended his focus to data-intensive computing (covering data acquisition, handling, and exploitation of large data sets for a broad spectrum of users), research data infrastructures, and data life cycle management. Prof. Nagel is the scientific coordinator of the Big Data competence center “ScaDS – Competence Center for Scalable Data Services and Solutions Dresden/Leipzig”.


Sergio Maffioletti

Title: EnhanceR: Science becomes FAIR and open

Wednesday, 27 June 2018 | 14:30 – 15:00

Abstract: Science and research are changing. IT, which was once used in only certain sectors, is now ubiquitous. Data sets grow day by day, and benefiting from them presents many new challenges. At the same time, there is a recognition that innovative research often crosses lines between fields, institutions and countries. This has led to a new focus on open science, to bring new benefits and to increase trust in the scientific establishment. At the same time, the FAIR data principles (Findable, Accessible, Interoperable, Reusable) have arisen as a set of functional guidelines for how we deal with data and conduct research: a checklist for data scientists. These changes demand a lot from researchers, and they must be provided with the support needed to follow this new wave in research.

EnhanceR – www.enhancer.ch – is a national network that provides researchers and research organisations with services that enable open science. Through consulting, training and access to technical services, EnhanceR will support researchers moving to open science and FAIR principles.

EnhanceR federates 8 organisations with specialist skills and units in IT for Research, and is supported by swissuniversities, which is working to move Switzerland toward national services in support of research.

Short bio

Sergio Maffioletti serves as Director of the EnhanceR project, a national initiative to provide specialised research IT support to the Swiss academic research sector (https://www.enhancer.ch/).
After his doctorate in computer science at the University of Fribourg, Sergio worked at the Swiss National Supercomputing Center, where he developed competences in high-performance and grid computing. He joined the University of Zurich in 2009 as project director of the Grid Computing Competence Center. He has managed several national initiatives for establishing academic e-infrastructure services.
He is now an infrastructure and application specialist at S3IT, the Service and Support for ScienceIT unit at the University of Zurich. There he consults for and supports research groups in migrating complex data analysis use cases onto large-scale computational infrastructure; he helps institutes and research groups understand their research infrastructure needs and develop a strategy for infrastructure services.