Keynote Speakers

  • Erik Verlinde

    Professor of Theoretical Physics, University of Amsterdam, Netherlands

    Erik Verlinde is a Dutch theoretical physicist, internationally recognized for his contributions to string theory and quantum field theory. His PhD work on conformal field theories led to the Verlinde formula, which is widely used by string theorists. He is also well known for his work on entropic gravity, which proposes that gravity is not a fundamental force but rather an emergent phenomenon. Erik studied physics at the University of Utrecht and conducted his PhD research there under the supervision of Bernard de Wit and Nobel Prize winner Gerard 't Hooft. At the end of his PhD, he moved to Princeton as a postdoctoral fellow. In 1993, Erik accepted a tenured position at CERN, and in 1996 Utrecht University appointed him as professor of Physics. In 1999, he was also awarded a professorship at Princeton University. Since 2003, Erik Verlinde has been a professor of Physics at the Institute for Theoretical Physics at the University of Amsterdam. In 2011, he was awarded the Spinoza Prize, the most prestigious award for scientists in the Netherlands, by the Netherlands Organization for Scientific Research.


    Emergent gravity from quantum information: Explaining the dark universe

    At present we are witnessing a revolution in theoretical physics that is leading to a completely new view of space-time and gravity. Studies in string theory and black hole physics have revealed a deep connection between the structure of space-time and gravity on the one hand, and key concepts of quantum information theory on the other. A particularly important role is played by quantum entanglement and its associated entropy. Einstein gravity appears as a consequence of a quantum analogue of the first law of thermodynamics. This new view of gravity and space-time has particularly important implications for cosmology, where it leads to a natural explanation of the observed phenomena associated with dark energy and dark matter.

  • Alex Szalay

    Alumni Centennial Professor of Astronomy, The Johns Hopkins University, USA

    Alexander Szalay is the Alumni Centennial Professor of Astronomy at The Johns Hopkins University, where he is also a professor in the Department of Computer Science. His primary research interest is in cosmology, where he has made significant contributions to statistical measures of the spatial distribution of galaxies and galaxy formation. He was born and educated in Hungary, and spent his postdoctoral period at UC Berkeley and the University of Chicago before accepting a faculty position at Johns Hopkins. Alex is the architect of the Science Archive of the Sloan Digital Sky Survey and the project director of the US National Virtual Observatory. He has written more than 340 papers that have appeared in various scientific journals, covering areas such as theoretical cosmology, observational astronomy, spatial statistics, and computer science. He was elected a corresponding member of the Hungarian Academy of Sciences in 1990, and since 2003 he has been a fellow of the American Academy of Arts and Sciences. He received an Alexander von Humboldt Prize in Physical Sciences in 2004 and a Microsoft Award for Technical Computing in 2007. The following year he also became Doctor Honoris Causa of the Eötvös University in Budapest.


    How to Collect Less Data…

    Our instruments are becoming increasingly capable of collecting phenomenal amounts of data. Our supercomputers will soon reach the exascale, where each time-step of a simulation will have a multi-petabyte memory footprint. Over the "Internet of Things", billions of inexpensive sensors are already flooding our networks with information about their surroundings. In this age of extreme data we cannot simply continue storing it all and hope that one day we will be able to analyze it. The sheer amount of data will soon bring an end to the Fourth Paradigm. We need a new stage in our approach, in which we figure out how to reduce the volume of the incoming data while preserving most of its information content. Natural phenomena are inherently sparse, controlled by a small number of relevant physical parameters. Transforming the data into such sparse representations, close to the source, may be an efficient way to reduce data volume (a brief illustrative sketch of this idea follows below). Various smart sampling and reconstruction strategies must be considered. Our numerical simulations will need to be planned and run very differently. The design of future experiments must come from multidimensional optimizations based on principles of active learning. The talk describes various ideas about these forthcoming changes and challenges, all resulting in yet another paradigm shift at the information frontier.
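
    As a concrete illustration of the sparse-representation idea in the abstract above, the short Python sketch below keeps only the largest Fourier coefficients of a noisy signal and reconstructs the signal from them. The synthetic signal, the choice of transform, and the number of coefficients kept are illustrative assumptions, not material from the talk.

        # Minimal sketch (illustrative only): reduce data volume by storing a
        # sparse transform-domain representation instead of the raw samples.
        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic "sensor" signal: a few dominant modes plus noise, so it is
        # naturally sparse in the Fourier domain (an assumed toy example).
        n = 4096
        t = np.linspace(0.0, 1.0, n, endpoint=False)
        signal = (np.sin(2 * np.pi * 7 * t)
                  + 0.5 * np.sin(2 * np.pi * 31 * t)
                  + 0.05 * rng.standard_normal(n))

        # Transform close to the source, then keep only the k largest coefficients.
        coeffs = np.fft.rfft(signal)
        k = 16                                  # store 16 complex numbers instead of 4096 samples
        keep = np.argsort(np.abs(coeffs))[-k:]
        sparse = np.zeros_like(coeffs)
        sparse[keep] = coeffs[keep]

        # Reconstruct and check how much of the information content survives.
        reconstructed = np.fft.irfft(sparse, n)
        rel_error = np.linalg.norm(signal - reconstructed) / np.linalg.norm(signal)
        print(f"kept {k} of {coeffs.size} coefficients, relative error {rel_error:.3f}")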

  • Gerard 't Hooft

    Professor of Theoretical Physics, University of Utrecht, Netherlands

    Gerard 't Hooft is an internationally renowned Dutch theoretical physicist who shared the 1999 Nobel Prize in Physics with Martinus J.G. Veltman for “their development of a mathematical model that enabled scientists to predict the properties of both the subatomic particles that constitute the universe and the fundamental forces through which they interact”. Their work in the early 1970s provided a much-needed mathematical foundation for the electroweak theory, which helped physicists discover a new subatomic particle, the top quark, directly observed in 1995. Gerard 't Hooft earned his doctorate in physics in 1972 at the University of Utrecht. Soon after, he returned there as a professor and has remained ever since. His exceptional scientific contributions to theoretical high-energy physics and quantum field theory have been recognized with numerous awards. He is a recipient of the Dannie Heineman Prize from the American Physical Society (1979), the Wolf Prize from the State of Israel (1982), the Lorentz Medal from the Royal Netherlands Academy of Arts and Sciences (KNAW) (1986), the Dutch Physicaprijs (1995), the Benjamin Franklin Medal (Philadelphia), and the Spinoza Prize (1995), the highest honor bestowed on Dutch scientists.


    Cogwheels, the cellular automaton interpretation, and conformal gravity

    The ultimate laws of physics may be characterised by discreteness, as if information, in the form of discrete bits and bytes, is being transmitted and processed everywhere in the universe. The suspicion that the real dynamical variables are classical bits rather than qubits leads to the cellular automaton (CA) interpretation of quantum mechanics. At first sight, this may seem to have little to do with the gravitational force. Upon closer inspection, however, we have the information problem for black holes, indicating discreteness of the quantum states there. Black holes can also be understood better if the fundamental action for all of physics reflects a theory with spontaneously broken (rather than explicitly broken) local conformal symmetry. This symmetry requires an indefinite metric, caused by sign switches in the Lagrangian, and we can now explain how an indefinite metric can also be understood as a strong indication favouring discreteness.

  • Charley Lineweaver

    Professor of Astronomy and Astrophysics, Australian National University, Australia

    Dr Charles H. Lineweaver is the convener of the Australian National University's Planetary Science Institute and holds a joint appointment as an associate professor in the Research School of Astronomy and Astrophysics and the Research School of Earth Sciences. He obtained an undergraduate degree in physics from the Ludwig-Maximilians-Universität, Munich, Germany, and a Ph.D. in astrophysics from the University of California at Berkeley in 1994. He was a member of the COBE satellite team that discovered the temperature fluctuations in the cosmic microwave background. Before his appointment at ANU, he held post-doctoral positions at Strasbourg Observatory and the University of New South Wales, where he taught one of the most popular general studies courses, "Are We Alone?". His research areas include cosmology, exoplanetology, astrobiology, and evolutionary biology. He has about a dozen projects for students at all levels, dealing with exoplanet statistics, the recession of the Moon, cosmic entropy production, major transitions in cosmic and biological evolution, and phylogenetic trees.


    Infogenesis: the Low Entropy Origins and Evolution of Information: Selfish Genes and Selfish Memes

    The fundamental role of information can be found in its origin: infogenesis. Inflation and baryogenesis produced the low gravitational entropy of almost uniformly distributed matter in the early universe. Gravitational collapse then became a source of free energy that produced far-from-equilibrium dissipative structures. Among these, aqueous thermal gradients on the Earth’s surface produced chemical redox potentials that drove auto-catalytic cycles and polymerized organic monomers. Environmental selection on enclosed cycles and polymers separated phenotype from genotype: code-based information. Thus, the coded information in the DNA of all life forms came (via selection) from the structural information in the environment, which in turn came from the low gravitational entropy of the early universe. I will point out the parallels between genetic and memetic evolution and show how both can be understood as the evolution of information.

  • Lude Franke

    Professor of Systems Genetics, University Medical Center Groningen, Netherlands

    Lude Franke is associate professor in systems genetics (Department of Genetics, University Medical Center Groningen). In the last few years thousands of genetic risk factors have been found. However, how most of these risk factors work remains unclear: it is unknown which cell types and tissues are affected, and very often it also remains unclear which pathways are involved. Lude leads a research group that works on gaining this insight using computational, statistical, and mathematical methods.

    Lude Franke studied biomedical sciences at Utrecht University, where he obtained his PhD in 2008 while developing computational and statistical methods for conducting genome-wide association studies and for gene network reconstruction using gene expression data. He subsequently developed computational methods to increase the statistical power to identify such effects on gene expression. Using these methods, he was able to demonstrate that the genetic risk variants for many diseases also affect gene-expression levels, often only in specific cell types and tissues. He recently showed, in a reanalysis of 80,000 gene expression profiles, that somatic copy number mutations in cancer can be detected well when correcting such data for 'transcriptional components'.


    Exploiting the biological information universe: Understanding disease by reusing data

    In the last seven years over 10,000 genetic risk factors have been found for over 200 different diseases. However, although we now know many of the genetic risk factors for these diseases, we do not fully understand how these DNA variants trigger them. To gain more insight, we have recently started to take advantage of the massive amounts of information that have already been generated by thousands of laboratories around the world. Although each of these laboratories generated data to answer their own, often highly specific, research questions, we reasoned that generic patterns would show up when jointly reanalysing all these data, and that these patterns could be informative for addressing the aforementioned problems. We therefore developed several computational and statistical methods to exploit gene expression data, which permitted us to gain insight into the function of genes, to identify the molecular ‘downstream’ functions of genetic risk factors, and to identify environmental risk factors that cause disease.

    In this presentation, I will discuss the exciting opportunities that now exist in this biological information universe. I will show a few examples of how our digital 'cell observatory' enables us to understand disease and how this knowledge might eventually help to develop new drugs.