Keynote Speakers

  • Erik Verlinde

    Professor of Theoretical Physics, University of Amsterdam, the Netherlands

    “We use concepts like time and space, but we don’t really understand what this means microscopically. That might change… I think there’s something we haven’t found yet, and this will help us discover the origins of our universe.”
    Erik Verlinde, in UvA in the Spotlight

    Erik Verlinde is a Dutch theoretical physicist, internationally recognized for his contributions to string theory and quantum field theory. His PhD work on conformal field theories led to the Verlinde formula, which is widely used by string theorists. He is currently best known for his work on emergent gravity, which proposes that both space-time and the gravitational force are not fundamental, but emergent properties arising from the entanglement of quantum information. Incorporating the expansion of the Universe into this theory even allowed him to predict, under specific circumstances, the excess gravity currently attributed to dark matter. Erik studied physics at Utrecht University and conducted his PhD research under the supervision of Bernard de Wit and Nobel Prize winner Gerard ’t Hooft. At the end of his PhD, he moved to Princeton as a postdoctoral fellow. In 1993 Erik accepted a tenured position at CERN, and in 1996 Utrecht University appointed him as professor of Physics. In 1999 he was also awarded a professorship at Princeton University. Since 2003 Erik Verlinde has been a professor of Physics at the Institute for Theoretical Physics of the University of Amsterdam. In 2011 he was awarded the Spinoza Prize, the most prestigious award for scientists in the Netherlands.


    String Theory, Emergent Gravity and the Dark Universe

    Insights from string theory and black hole physics indicate that gravity is emergent and can be derived from the microscopic quantum description of spacetime. String theory gives a microscopic explanation for the entropy of black holes. In doing so it revealed that in the presence of horizons the microscopic description of spacetime is in an entropic phase that differs from the empty vacuum. In this talk I will argue that the same conclusion holds for cosmological horizons, and that as a result the emergent laws of gravity deviate from those in the empty vacuum. I will explain the physical principles behind the emergent laws of gravity and discuss the implications for the gravitational phenomena associated with dark energy and dark matter.
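
    The emergent-gravity logic can be made concrete with the entropic-force argument of Verlinde’s 2010 paper; the following is a minimal sketch in standard notation, not a summary of the talk itself. A mass m displaced by Δx relative to a holographic screen shifts the entropy, and combining this with the Unruh temperature recovers inertia:

        \Delta S = 2\pi k_B \frac{m c}{\hbar}\,\Delta x, \qquad
        k_B T = \frac{\hbar a}{2\pi c}, \qquad
        F\,\Delta x = T\,\Delta S \;\Rightarrow\; F = m a.

    For a spherical screen of area A = 4\pi R^2 carrying N = A c^3 / (G\hbar) bits and satisfying equipartition, E = \tfrac{1}{2} N k_B T = M c^2, the same identity yields Newton's law F = G M m / R^2.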

  • Thanu Padmanabhan

    Distinguished Professor, Inter-University Centre for Astronomy and Astrophysics, Pune, India

    “Instead of the conventional view that gravity is a fundamental interaction, it could be that a microstructure - the 'atoms of space time' - give rise to it. … The connection between gravity and thermodynamics which started out as an analogy … has now become a physical reality.”
    Thanu Padmanabhan, in Scientific American India

    Theoretical physicist and cosmologist Thanu Padmanabhan is currently a Distinguished Professor at the Inter-University Centre for Astronomy and Astrophysics (IUCAA) in Pune, India. He was born in Trivandrum in 1957, and obtained his BSc and MSc degrees in Physics from Kerala University, winning the Gold Medal on both occasions. During his PhD research at the Tata Institute of Fundamental Research (TIFR) he was offered a faculty position, even before he had finished. He is internationally renowned for his major contributions to a wide range of topics, including structure formation in the universe, the nature of dark energy and the interpretation of gravity as an emergent phenomenon. The latter endeavour has led to a description wherein gravity emerges from the collective behaviour of "the atoms of space-time", just as the theory of fluid dynamics arises from a macroscopic description of regular atoms. His ideas on this topic have received awards from the Gravity Research Foundation, USA, on eight occasions, including the first prize in 2008. He was a Sackler Distinguished Astronomer at Cambridge, the President of the Cosmology Commission of the International Astronomical Union (2009-2012) and the Chairman of the Astrophysics Commission of the International Union of Pure and Applied Physics (2011-2014). In addition to nearly 300 technical papers, he has authored two popular science books and ten graduate-level textbooks in astrophysics, general relativity and quantum field theory. In recognition of his achievements, he received the Padma Shri, one of India's highest civilian honours, from the President of India in 2007.


    Spacetime microstructure, Gravity and the Cosmos

    Spacetime behaves like a fluid, made of atoms of spacetime, and its dynamics can be obtained from the kinetic theory for these underlying microscopic degrees of freedom. This allows the interpretation of the field equations - as well as several geometrical variables - in a purely thermodynamic language. This is true for a large class of theories including, but not limited to, general relativity. The cosmological constant, which arises as an integration constant, can then be related to the cosmic information accessible to an eternal observer. This relation, in turn, allows the determination of its numerical value from the microstructure of the spacetime. In fact, this approach provides a refreshingly new perspective on cosmology and offers the possibility of testing quantum gravity in the sky.
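
    The thermodynamic language can be anchored in two textbook horizon relations (a sketch of the standard ingredients, not of the speaker's specific derivation). A horizon of area A carries an entropy

        S = \frac{k_B A}{4 L_P^2}, \qquad L_P = \sqrt{\frac{G \hbar}{c^3}},

    and distributing the equipartition energy E = \tfrac{1}{2} N k_B T over N = A / L_P^2 surface degrees of freedom lets the gravitational field equations be read as a thermodynamic identity of the form T\,dS = dE + P\,dV.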

  • Alan Heavens

    Director of the Imperial Centre for Inference and Cosmology, Imperial College London, England

    “Advances in astrophysics and cosmology are driven by large and complex data sets, which can only be analyzed, interpreted and understood thanks to refined statistical methods.”
    Alan Heavens' ICIC Mission statement

    Alan Heavens is the Director of the Imperial Centre for Inference and Cosmology (ICIC) at Imperial College London, which aims to address the challenges posed by the statistical analysis of massive and complex data streams in astrophysics, astroparticle physics and cosmology. After earning his MA and PhD at the University of Cambridge, he moved to the University of Edinburgh, where he became a professor of Theoretical Astrophysics. He is currently a member of the Planck and Euclid consortia, and a Fellow of the Royal Astronomical Society and the Royal Society of Edinburgh. His main field of expertise is the statistical analysis of cosmological data, especially from large surveys of galaxies or of the cosmic microwave background, such as those obtained by the Planck and Euclid missions. His techniques allow him to gain insights into many astrophysical areas, such as the evolution of large-scale structure, the nature of dark energy and dark matter, gravitational lensing, galaxy spectra, and the search for rare astrophysical objects. In addition, his expertise in data analysis branches out to other applications, as illustrated by his founding of Blackford Analysis. This company uses massive data compression techniques to rapidly analyse large datasets, such as those obtained from medical scanners.


    Bending the Universe

    Einstein’s gravity bends the light from distant objects, so the picture we see is not a true one. The distortion pattern tells us about the mysterious ingredients of the Universe - dark matter and dark energy - and about the law of gravity itself. But how should we extract that information? The answer is to simultaneously determine both the distortion map and the properties of the Universe, a million-dimensional problem that needs sophisticated statistical and numerical analysis techniques to solve. In doing so, we learn where the dark matter is, and we can undo the distortion to reveal a truer picture of the Universe.
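
    As a toy illustration of this kind of inference (not the speaker's actual pipeline; every name and number below is invented for the example), the following Python sketch reconstructs a noisy Gaussian "map" with a Wiener filter, which is the posterior mean when both signal and noise are Gaussian:

        import numpy as np

        rng = np.random.default_rng(42)

        # Toy one-dimensional "distortion map": a Gaussian signal with known
        # prior variance, observed with additive Gaussian noise.
        n_pix = 1000
        signal_var = 4.0   # assumed prior variance of the true map
        noise_var = 1.0    # assumed variance of the measurement noise

        true_map = rng.normal(0.0, np.sqrt(signal_var), n_pix)
        data = true_map + rng.normal(0.0, np.sqrt(noise_var), n_pix)

        # Wiener filter: shrink each pixel toward zero by the
        # signal-to-(signal + noise) variance ratio, i.e. the posterior mean.
        posterior_mean = signal_var / (signal_var + noise_var) * data

        err_raw = np.sqrt(np.mean((data - true_map) ** 2))
        err_post = np.sqrt(np.mean((posterior_mean - true_map) ** 2))
        print("rms error, raw data:", err_raw)
        print("rms error, posterior mean:", err_post)

    In a real lensing analysis the same shrinkage operates in a million-dimensional pixel space, with correlated priors set by cosmological parameters rather than a single variance.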

  • Karlheinz Meier

    Professor of Experimental Physics, University of Heidelberg, Germany

    “What makes the brain so specific and so attractive in terms of computing? I mean: It runs on very little food, it’s 'banana power' that safely keeps you running; it's very compact, obviously much smaller than a supercomputer; and maybe the most important thing, it doesn't need any software updates.”
    Karlheinz Meier, at the SAI conference 2015

    Karlheinz Meier acquired his PhD in Physics at the University of Hamburg in Germany, and is currently Professor of Experimental Physics at the University of Heidelberg. Although he played a leading role in experimental particle physics as a scientific staff member at both DESY (Hamburg) and CERN (Geneva), he has since expanded his interests to 'neuromorphic computing': the field of designing electronic systems inspired by the human brain. After leading the construction of the data pre-processing system for the ATLAS experiment at the Large Hadron Collider, he co-founded the Kirchhoff Institute for Physics in Heidelberg. Here he oversees research ranging from testing particle physics models in accelerators and studying complex quantum systems, to developing electronic system models for understanding information processing in the brain. On the latter subject he has initiated numerous large-scale projects, such as the FACETS and BrainScaleS projects, with the goal of developing 'silicon neural circuits'. He is currently co-director of the Human Brain Project, which strives to link neuromorphic systems to large-scale computer simulations of the brain in order to advance the fields of neuroscience, computing and brain-related medicine. His excellent presentation skills are exemplified by his numerous short films, in which he explains a wide range of experiments to a general audience.


    Information in the brain - From understanding to non-Turing computing

    The brain is a complex assembly of cells connected by a dynamic network of axons, dendrites and synapses. It is not a closed system, but rather develops through continuous interaction with the environment and other brains. As a consequence it has many features that we would like to see in computers. The brain does not require engineered algorithms: it learns. The brain does not need reliable components: it is fault tolerant. The brain is very energy efficient, and it solves problems by performing stochastic inference. Theoretical neuroscience is making good progress in uncovering the principles behind these features. At the same time, several research groups in academia and industry are working on transferring these principles to novel computer architectures, often referred to as "neuromorphic". The lecture will deliver an overview of this emerging field and discuss the development towards a more biologically derived AI.
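
    As a minimal sketch of the dynamics such hardware implements physically, here is a textbook leaky integrate-and-fire neuron in Python; the parameter values are illustrative, not those of any particular chip:

        # Leaky integrate-and-fire neuron: the membrane potential leaks toward
        # a resting value, is driven by an input current, and emits a spike
        # (then resets) whenever it crosses the firing threshold.
        dt, t_end = 0.1, 100.0            # time step and duration (ms)
        tau_m, v_rest = 10.0, -65.0       # membrane time constant (ms), rest (mV)
        v_thresh, v_reset = -50.0, -70.0  # threshold and reset potentials (mV)
        r_m, i_in = 10.0, 1.8             # membrane resistance (MOhm), input (nA)

        v, spike_times = v_rest, []
        for step in range(int(t_end / dt)):
            v += dt * (-(v - v_rest) + r_m * i_in) / tau_m
            if v >= v_thresh:
                spike_times.append(step * dt)  # record the spike time (ms)
                v = v_reset

        print(len(spike_times), "spikes in", t_end, "ms")

    Neuromorphic chips realize equations like these directly in analog or digital silicon, rather than simulating them instruction by instruction.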

  • Peter Sloot

    Professor of Computational Science, University of Amsterdam, the Netherlands

    “I believe that if we can gain a better understanding of what complex systems actually amount to and develop the computational models to mimic them, we will get very close to predicting life, the universe and everything, and the best coffee for that matter.”
    Peter Sloot, in UvA in the Spotlight

    Peter Sloot uses computational modelling and simulation to study the way in which “nature processes information”. His work has many applications, such as mapping the relations between criminals, modelling the immune system, predicting the risk of drug addiction, understanding the growth of tumours, and studying the way in which infectious diseases like HIV behave and spread. After finishing his MSc in both Physics and Chemistry at the University of Amsterdam (UvA), he performed his PhD research in Computational Science at the Antoni van Leeuwenhoek Cancer Institute. He is currently a Professor of Computational Science at the UvA, where he is also Scientific Director of the “Foundations of Complex Systems” programme at the Institute for Advanced Study. In addition, he works as Professor of Complex Systems in Singapore, and as Professor of Advanced Computing in St. Petersburg. His passion for education and sharing knowledge is demonstrated by his numerous video appearances, including Dutch news broadcasts, scientific TV programs, a scientific documentary and a TED talk. He is also Editor-in-Chief of two Elsevier journals: Future Generation Computer Systems, which focuses on grid computing and eScience, and the Journal of Computational Science, an international platform for exchanging the latest results in simulation-based science across scientific disciplines.


    Hidden Order in the Informational Structure of Life

    Greek mythology tells of Prometheus, who stole fire from heaven to animate his clay men. My central conjecture is that what Prometheus stole was not fire but information, in the form of Gibbs free energy. This resulted in our complex world, with networks ranging from the zillions of molecules in the living cell to the billions of human individuals and countless other living organisms that constitute our planet, all interacting in nonlinear and often unpredictable ways. In this talk I’ll explore a new way to connect the dots, from the life-bringing free energy of the sun to the information stored and processed by living creatures.
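
    The bridge between free energy and information invoked here can be pinned down with two textbook relations (a sketch of standard thermodynamics, not of the speaker's own formalism): the Gibbs free energy

        \Delta G = \Delta H - T\,\Delta S

    measures the useful work a process can still deliver, while Landauer's principle prices information in the same currency: erasing one bit dissipates at least k_B T \ln 2, roughly 3 \times 10^{-21} J at room temperature.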

  • Lieven Vandersypen

    Professor of Quantum Nanoscience, Delft University of Technology, the Netherlands

    “Quantum technology, technology making use of quantum effects, is about quantum computers faster than any computer we can imagine today, about ways of communicating with each other that are just impossible today, and about sensors and imaging techniques with unprecedented accuracy. But perhaps the most important applications we can’t even imagine quite yet.”
    Lieven Vandersypen, at TEDxBreda

    Originally from Belgium, Lieven Vandersypen started his undergraduate studies in Mechanical Engineering at the University of Leuven, but obtained his MSc and PhD in Electrical Engineering at Stanford University in California. At IBM Research - Almaden he developed his expertise in quantum information and computing. He used the spins of nuclei as quantum bits to perform unique quantum calculations, such as the factorization of 15 into the prime numbers 5 and 3, which earned him a publication in Nature. He was subsequently hired as a postdoc at Delft University of Technology, where he became an Antoni van Leeuwenhoek professor at the department of Quantum Nanoscience in 2006. Here he leads the Vandersypen Lab, which researches quantum mechanical phenomena in nanoscale devices with the goal of eventually building a large-scale quantum computer. At temperatures of only a few millikelvin, his team manages to trap individual electrons in “quantum dots” - nanostructures the size of a virus - and to control and observe their spin. His latest research concentrates on the use of materials with negligible nuclear spin, like graphene, as the basis for quantum computers. Lieven’s dedication to interdisciplinary and socially motivated scientific discourse is exemplified by his work at the Royal Netherlands Academy of Arts and Sciences’ (KNAW) Young Academy, and his many online videos, including a TED talk.


    From quantum surprises to quantum devices

    Quantum computation has captivated the minds of many for almost two decades. For much of that time, it was seen mostly as an extremely interesting scientific problem aimed at exploring the limits of computation allowed by nature. In the last few years, we have entered a new phase, as the belief has grown that a large-scale quantum computer can actually be built. In this talk, I will outline how the most fundamental aspects of quantum physics translate to very real laboratory advances with an increasing potential for delivering a breakthrough technology.
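
    The factoring milestone mentioned in the biography above has a simple classical skeleton: once the period r of a^x mod N is known, the factors follow from a gcd. The Python sketch below brute-forces the period as a stand-in for the quantum step (N = 15 with base a = 7 is the canonical textbook instance):

        from math import gcd

        def find_period(a, n):
            # Smallest r > 0 with a**r % n == 1; this order-finding step is
            # what a quantum processor accelerates in Shor's algorithm.
            r, x = 1, a % n
            while x != 1:
                x = (x * a) % n
                r += 1
            return r

        n, a = 15, 7
        r = find_period(a, n)          # r = 4, conveniently even
        p = gcd(a ** (r // 2) - 1, n)  # gcd(48, 15) = 3
        q = gcd(a ** (r // 2) + 1, n)  # gcd(50, 15) = 5
        print(n, "=", p, "x", q)       # prints: 15 = 3 x 5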

  • Ashish Mahabal

    Computational Data Scientist, California Institute of Technology (Caltech), USA

    “When one sees that the same kinds of techniques can be applied, that's fantastic. Because once you take a dataset and abstract it enough, the tools that you are using don't care where the data came from. It's highly rewarding to be able to work on these completely different scales: from the Universe level to the cell level.”
    Ashish Mahabal, in an interview on data-driven astronomy

    Ashish Mahabal is a Computational Data Scientist working at the Center for Data Driven Discovery of the California Institute of Technology (Caltech). In 1998 he obtained his PhD in astronomy at the Inter-University Centre for Astronomy and Astrophysics (IUCAA) in Pune, India. After a one-year postdoc position at the Physical Research Laboratory in Ahmedabad, he moved to Caltech. His main areas of expertise are machine learning and data mining, mainly for the classification of transient astronomical sources. He has participated in several large-area sky surveys, such as the Large Synoptic Survey Telescope (LSST), the Catalina Real-Time Transient Survey (CRTS), and the Palomar Transient Factory (PTF). Currently he leads the machine learning effort for the Zwicky Transient Facility (ZTF). His interest lies in combining diverse datasets to maximize scientific output, using statistical and mathematical techniques, machine learning and (where possible) citizen science. He is applying these techniques to a variety of different fields, ranging from earth science with the EarthCube community to medicine with the consortium for Molecular and Cellular Characterization of Lesions (MCL) and the Early Detection Research Network for cancer (EDRN). More generally he is interested in informatics, virtual worlds and educational outreach.


    Making complex data speak

    The big data problem is not so much a problem of volume or scale as it is one of complexity, arising from factors like diversity and sparsity. At times, representing the data to fit existing solutions can work wonders. This requires being able to abstract the problem, and sometimes to clothe it in a different garb. We provide examples of how large astronomical surveys looking for time-varying phenomena benefit from such approaches, e.g. to find transients or to classify variables. With good provenance and modeling, the same process of abstraction can be applied in a much wider realm, through methodology transfer. We demonstrate this through parallels between transient astronomy - the study of outlying and rapidly varying phenomena - and personalized medicine - characterizing individuals through their specifics and not just averages.
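
    A minimal Python sketch of the "abstract the data, then reuse generic tools" idea (synthetic light curves and ad hoc features invented for this example; real surveys use far richer feature sets):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)

        def light_curve(is_variable):
            # Synthetic toy light curve: a noisy sinusoid for variable
            # sources, pure noise for constant ones.
            t = np.sort(rng.uniform(0, 100, 60))
            flux = rng.normal(0.0, 0.3, 60)
            if is_variable:
                flux += np.sin(2 * np.pi * t / rng.uniform(5, 20))
            return flux

        def featurize(flux):
            # Abstract each curve into a fixed-length vector; from here on
            # the classifier no longer cares that a telescope produced it.
            return [flux.std(), np.ptp(flux), np.abs(np.diff(flux)).mean()]

        labels = rng.integers(0, 2, 400)
        X = np.array([featurize(light_curve(v)) for v in labels])

        X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
        clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
        print("held-out accuracy:", clf.score(X_te, y_te))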