Abstracts

  • Information Retrieval From Black Holes

    Sumanta Chakraborty - Indian Association for the Cultivation of Science, India

    It is generally believed that, when matter collapses to form a black hole, the complete information about the initial state of the matter cannot be retrieved by future asymptotic observers through local measurements. This is contrary to the expectation of unitary evolution in quantum theory and leads to (a version of) the black hole information paradox. Classically, nothing apart from mass, charge and angular momentum is expected to be revealed to such asymptotic observers after the formation of a black hole. Semi-classically, black holes evaporate after their formation through Hawking radiation. The dominant part of the radiation is expected to be thermal, and hence one cannot learn anything about the initial data from the resultant radiation. However, there can be sources of distortions which make the radiation non-thermal. Although these distortions are not strong enough to make the evolution unitary, they carry part of the information regarding the in-state. In this work, we show how one can decipher information about the in-state of the field from these distortions. We show that distortions of a particular kind, which we call non-vacuum distortions, can be used to fully reconstruct the initial data.

  • Information scan of quantum states based on entropy-power uncertainty relations

    Petr Jizba - Faculty of Nuclear Sciences and Physical Engineering, The Czech Republic

    In this talk I will use the concept of entropy power to derive a new one-parameter class of information-theoretic uncertainty relations for pairs of observables in an infinite-dimensional Hilbert space. This class constitutes an infinite tower of higher-order cumulant uncertainty relations, which in principle allows one to reconstruct the underlying distribution in a process analogous to quantum state tomography. I will illustrate the power of the new class by studying unbalanced cat states and Cauchy-type heavy-tailed wave functions that are of practical interest in quantum metrology. I will also briefly discuss the connection with generalized Cramér-Rao inequalities and de Bruijn's identity. Finally, I will try to cast some fresh light on the black hole information paradox.
    Related works:
    [1] P. Jizba, J.A. Dunningham and J. Joo, On the Role of Information Theoretic Uncertainty Relations in Quantum Theory, Annals of Physics 355 (2015) 87
    [2] P. Jizba, J.A. Dunningham, A. Hayes and Y. Ma, One-parameter class of uncertainty relations based on entropy power, Phys. Rev. E 93 (2016) 060104(R)
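
    As a reference point (standard definitions consistent with, but not quoted from, the works above), the Shannon entropy power of an observable and the corresponding position-momentum uncertainty relation read:

        N(X) \;\equiv\; \frac{1}{2\pi e}\, e^{2h(X)}, \qquad
        h(X) \;=\; -\!\int \rho(x)\,\ln\rho(x)\,\mathrm{d}x, \qquad
        N(X)\,N(P) \;\ge\; \frac{\hbar^{2}}{4}.

    Broadly, the one-parameter family of Ref. [2] generalizes this Shannon-entropy form by replacing h with Rényi entropies of conjugate orders (1/p + 1/q = 2), which is where the tower of higher-order cumulant relations comes from.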

  • Hints towards the Emergent Nature of Gravity

    Manus Visser - University of Amsterdam, Netherlands

    A possible way out of the conundrum of quantum gravity is the proposal that general relativity (GR) is not a fundamental theory but emerges from an underlying microscopic description. Despite recent interest in the emergent gravity program within both the physics and the philosophy community, an assessment of the theoretical evidence for this idea is currently lacking. We intend to fill this gap in the literature by discussing the main arguments in favour of the hypothesis that the metric field and its dynamics are emergent. First, we distinguish between microstructure inspired by GR, such as through quantization or discretization, and microstructure that is not directly motivated by GR, such as strings, quantum bits or condensed matter fields. The emergent gravity approach can then be defined as the view that the metric field and its dynamics are derivable from the latter type of microstructure. Subsequently, we assess to what extent the following properties of (semi-classical) GR are suggestive of underlying microstructure: (1) the metric's universal coupling to matter fields, (2) perturbative non-renormalizability, (3) black hole thermodynamics, and (4) the holographic principle. In the conclusion we formalize the general structure of the plausibility arguments put forward.

  • Testing Verlinde's Emergent Gravity using Gravitational Lensing

    Margot Brouwer - Universities of Groningen and Amsterdam, The Netherlands

    With his theory of emergent gravity, Verlinde not only claims to explain the mechanism behind gravity, but also the origin of the mysterious "extra gravity" found in and around galaxies, which is currently attributed to dark matter. We performed the first test of Verlinde's theory by measuring the space-time curvature around more than 33,000 galaxies using weak gravitational lensing. To achieve this, we employed the 180-square-degree overlap of the Kilo-Degree Survey (KiDS) with the Galaxy And Mass Assembly (GAMA) survey. We compared the observed gravitational lensing effect around galaxies to Verlinde's prediction, which is based only on the normal mass of the galaxies, and found that they are in good agreement without requiring any free parameters.
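
    As a rough numerical illustration of the kind of prediction being tested (a minimal sketch, not the analysis described above; it assumes the simplified point-mass form of Verlinde's "apparent dark matter", M_D(r)^2 ≈ c H_0 r^2 M_B / (6G)):

        # Minimal sketch: Verlinde's apparent dark matter around a point-like
        # baryonic mass, using the assumed simplified relation
        #   M_D(r)^2 = c * H0 * r^2 * M_B / (6 * G).
        # The actual test in the talk uses the full weak-lensing formalism.
        import numpy as np

        G    = 6.674e-11              # m^3 kg^-1 s^-2
        c    = 2.998e8                # m s^-1
        H0   = 70 * 1e3 / 3.086e22    # ~70 km/s/Mpc expressed in s^-1
        Msun = 1.989e30               # kg
        kpc  = 3.086e19               # m

        def apparent_dm_mass(M_B, r):
            """Apparent dark-matter mass M_D(r) enclosed within radius r (SI units)."""
            return np.sqrt(c * H0 * r**2 * M_B / (6.0 * G))

        M_B = 1e11 * Msun                        # a typical galaxy's baryonic mass
        r   = np.array([10, 50, 100]) * kpc
        print(apparent_dm_mass(M_B, r) / Msun)   # M_D in solar masses at each radius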

  • Emergent Gravity as a Relic of Quantum Spacetime

    Dawood Kothawala - Indian Institute of Technology Madras, India

    Most approaches to the Emergent Gravity paradigm effectively have two key inputs: (i) an observer-dependent notion of causal horizons with thermal attributes, and (ii) the information content, characterized by entropy, of these horizons. I show how these attributes can arise naturally from the geometric structure of the gravitational Lagrangian of a quantum spacetime with a natural notion of zero-point length built into it. This suggests that the (semi-)classical limit of quantum gravity is better described by thermodynamic functionals than by the conventional formalism based on the Einstein-Hilbert action principle.

  • On the origin of physical information

    Stefano Gottardi - Simbeyond B.V. Eindhoven, the Netherlands

    Information may be the key concept for achieving a physical description of the Universe that goes beyond our current understanding of the laws of Nature. However, for such an important concept, we are still struggling to provide a simple, unique and comprehensive definition of information in physics. Is information physical? If so, where is it stored? Can we identify information with a conserved entity of physical reality? In an attempt to answer these questions, I will present a possible definition of physical information (p-information). The final goal is to construct a theory where p-information is the ontological description of physical states, the information ontological states (io-states), which encode the complete reality of what we now call particles and fields. I will discuss a description of reality in which p-information is conserved and exchanged in quanta during interactions among io-states, and some of the implications of this description for our current understanding of quantum mechanics and quantum field theory.

  • How the universe computes

    Stephen Anastasi - The Scots PGC College, Australia

    Empirically founded methods (Bekenstein, Hawking, Lloyd) calculate that the information content of the universe, including dark matter and energy, is bounded by a finite number in the region of 10^122 bits. The presenter develops an a priori investigation of a general principle of equivalence, which produces a mathematical model of a necessarily evolving universe, for which the number of bits contained is calculated as ≈ 6.5×10^121, correlating to (t_univ/t_P)^2 + t_univ/t_P. Under the action of the initiating condition, structure, time and space are emergent, as is time's arrow. The form of evolution implies the second law of thermodynamics, and that entropy is epiphenomenal with respect to the arrow of time, as is dark energy with respect to universal acceleration. A core informational sub-structure of the model correlates to the Planck mass (m_P). If this correlation is faithful, the Planck mass equation should be modified to m_P = (t/t_now)·√(ℏc/G). Accommodating experimental data relating to baryon number implies that the mass of baryons (at least) has increased over time, and that the very early universe was something like a Bose-Einstein condensate. It is conjectured that the implied low-mass condition and rapid expansion of space that would accompany a spherical cosmological surface increasing as (t_univ/t_P)^2 + t_univ/t_P would carry any emerging particles in a way that naturally overcomes gravity in the very early universe, essentially independent of its underlying cause. If the model is faithful, it would explain several cosmic conundrums, for example initial low entropy conditions.
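
    The quoted bit count can be reproduced with a few lines of arithmetic (an illustrative check only, using round values for the age of the Universe and the Planck time):

        # Quick arithmetic check of the ~6.5e121-bit figure quoted above.
        t_univ = 4.35e17      # s, roughly 13.8 Gyr
        t_P    = 5.39e-44     # s, Planck time

        N = t_univ / t_P      # ~8.1e60
        bits = N**2 + N       # (t_univ/t_P)^2 + t_univ/t_P
        print(f"{bits:.2e}")  # ~6.5e121, consistent with the abstract's estimate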

  • How do Inflationary Perturbations become Classical?

    Shreya Banerjee - Tata Institute Of Fundamental Research Mumbai, India

    The observed classicality of primordial perturbations, despite their quantum origin during inflation, calls for a mechanism for the quantum-to-classical transition of these initial fluctuations. Since the literature suggests a number of plausible mechanisms that try to address this issue, it is important to seek concrete observational signatures of the various approaches in order to better understand early-universe dynamics. Among these approaches, the spontaneous collapse dynamics of quantum mechanics is the most viable for leaving distinct observational signatures, as the collapse mechanism inherently changes the generic quantum dynamics. We observe in this study that the observables from the scalar sector, i.e. the scalar tilt n_s, the running of the scalar tilt α_s and the running of the running of the scalar tilt β_s, cannot distinguish a collapse-modified inflationary dynamics within canonical scalar field and k-inflationary scenarios. The only distinguishable imprint of the collapse mechanism lies in the observables of the tensor sector, in the form of a modified consistency relation and a blue-tilted tensor spectrum, and only when the collapse parameter δ is positive.
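
    For reference, the scalar observables mentioned above are defined, as usual, as coefficients of the expansion of the primordial scalar power spectrum about a pivot scale k_* (standard definitions, not specific to the collapse models discussed in the talk):

        \ln \mathcal{P}_s(k) \;=\; \ln \mathcal{P}_s(k_*) \;+\; (n_s - 1)\ln\!\frac{k}{k_*}
        \;+\; \frac{\alpha_s}{2}\ln^{2}\!\frac{k}{k_*} \;+\; \frac{\beta_s}{6}\ln^{3}\!\frac{k}{k_*},
        \qquad \alpha_s \equiv \frac{\mathrm{d}n_s}{\mathrm{d}\ln k}, \quad
        \beta_s \equiv \frac{\mathrm{d}\alpha_s}{\mathrm{d}\ln k}.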

  • The complexity and information content of (simulated) cosmic structures

    Franco Vazza - Dipartimento di Astronomia, Università di Bologna

    The emergence of cosmic structure is commonly considered one of the most complex phenomena in Nature. However, the level of complexity associated with the cosmic web has never been defined or measured in a quantitative and objective way. In my contribution I will present recent results (Vazza 2017, MNRAS) from an attempt to measure the information content of simulated cosmic structure, based on Information Theory. In particular, I will show how the statistical symbolic analysis of the data stream of state-of-the-art cosmological simulations can efficiently measure the bits of information necessary to predict the evolution of gas energy fields in a statistical way, also offering a simple way to identify the most complex epochs and mechanisms behind the emergence of observed cosmic structure on ~Megaparsec scales. In the near future, Cosmic Information Theory will likely represent an innovative framework to design and analyse complex simulations of the Universe in a simple yet powerful way. I will also discuss how a quantitative framework for measuring information and complexity in simulations has already allowed us a first (explorative yet quantitative) comparison between the cosmic web and other complex systems, such as the neuronal network of the human brain.
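
    A minimal sketch of the kind of symbolic information measure involved (my own illustration of the general technique, not the actual pipeline of Vazza 2017): quantize a simulation data stream into symbols and estimate the number of bits needed to predict the next symbol.

        # Symbolic block-entropy of a data stream: the average number of bits
        # needed to predict the next symbol of a coarse-grained field history.
        import numpy as np
        from collections import Counter

        def symbolize(series, n_symbols=4):
            """Coarse-grain a 1D data stream into integer symbols by quantile binning."""
            edges = np.quantile(series, np.linspace(0, 1, n_symbols + 1)[1:-1])
            return np.digitize(series, edges)

        def block_entropy(symbols, block=3):
            """Shannon entropy (bits) of length-`block` words in the symbol stream."""
            words = [tuple(symbols[i:i + block]) for i in range(len(symbols) - block + 1)]
            counts = np.array(list(Counter(words).values()), dtype=float)
            p = counts / counts.sum()
            return -(p * np.log2(p)).sum()

        # Toy usage: entropy-rate estimate h ~ H(n) - H(n-1) for a random-walk stand-in
        rng = np.random.default_rng(0)
        stream = np.cumsum(rng.normal(size=10_000))        # stand-in for a simulation datastream
        s = symbolize(stream)
        print(block_entropy(s, 4) - block_entropy(s, 3))   # bits per step needed for prediction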

  • Big data analysis of public datasets to improve the genetic diagnostic process

    Patrick Deelen, University Medical Center Groningen, the Netherlands

    DNA analysis of patients with a rare disease often yields dozens of candidate genetic variants that could potentially cause the disease. The final manual interpretation of these variants is a complex and time-consuming process. In order to streamline this process, we downloaded data from public repositories containing vast amounts of molecular measurements of gene activity across 30,000 samples. Large-scale analysis of this gene activity information allowed us to predict which genes are more likely to be involved in specific diseases or phenotypes. We show how we can use this information to improve the diagnostic process for patients with rare diseases.
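
    A minimal sketch of the general "guilt-by-association" idea behind such predictions (an illustration only, not the authors' method; gene names and data are made up):

        # Prioritize candidate genes by their co-expression with genes already
        # known for a phenotype, across many public expression samples.
        import numpy as np

        def prioritize(expr, genes, known_genes, candidates):
            """expr: (n_genes, n_samples) matrix; returns candidates sorted by
            mean correlation with the known disease genes."""
            idx = {g: i for i, g in enumerate(genes)}
            corr = np.corrcoef(expr)                      # gene-gene co-expression
            known = [idx[g] for g in known_genes]
            scores = {c: corr[idx[c], known].mean() for c in candidates}
            return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

        # Toy usage with random data standing in for the 30,000-sample compendium
        rng = np.random.default_rng(1)
        genes = [f"G{i}" for i in range(50)]
        expr = rng.normal(size=(50, 200))
        print(prioritize(expr, genes, known_genes=["G0", "G1"], candidates=["G2", "G3", "G4"]))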

  • GAVIN: Gene-Aware Variant INterpretation for medical sequencing

    Joeri van der Velde, University Medical Center Groningen, the Netherlands

    Around 1 in 17 people is affected by one of the 8,000 known rare diseases. Most of these patients do not receive a diagnosis, meaning they remain in uncertainty without a prognosis, are unable to join specific patient support groups, and do not receive the most appropriate treatment. Next-generation sequencing of DNA allows us to establish a molecular diagnosis and help these patients. However, we need to investigate an increasing number of known disease genes for an increasing number of patients. This requires genome diagnostics to rely more and more on computational estimation of how likely a mutation is to be disease-causing. Here, we present Gene-Aware Variant INterpretation (GAVIN), a new method that accurately classifies variants for clinical diagnostic purposes. Classifications are based on gene-specific calibrations of allele frequencies, protein impact, and evolutionary stability. In a benchmark we achieve a sensitivity of 91.4% and a specificity of 76.9%, unmatched by other tools. We are now incorporating machine-learning methods to surpass this again, so we can diagnose more patients.
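
    A minimal sketch of what gene-aware, calibration-based classification looks like in practice (hypothetical genes and thresholds for illustration only; the real GAVIN calibrations and decision logic differ):

        # Gene-aware variant classification from allele frequency and a
        # protein-impact score, with per-gene calibrated cut-offs.
        from dataclasses import dataclass

        @dataclass
        class GeneCalibration:
            max_af: float        # population allele-frequency ceiling for pathogenic variants
            min_impact: float    # minimum in-silico impact score (CADD-like scale)

        # Illustrative per-gene calibrations (made-up numbers)
        CALIBRATIONS = {
            "BRCA2": GeneCalibration(max_af=1e-4, min_impact=20.0),
            "TTN":   GeneCalibration(max_af=1e-3, min_impact=25.0),
        }

        def classify(gene: str, allele_freq: float, impact: float) -> str:
            cal = CALIBRATIONS.get(gene)
            if cal is None:
                return "VUS"                 # no calibration: variant of uncertain significance
            if allele_freq > cal.max_af:
                return "benign"              # too common to cause a rare disease
            return "pathogenic" if impact >= cal.min_impact else "VUS"

        print(classify("BRCA2", allele_freq=2e-5, impact=28.3))   # -> "pathogenic"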

  • Learning What Questions to Ask About Dark Energy

    Andrew Arrasmith - UC Davis Physics/Los Alamos National Labs, US

    The nature of the dominant component of the cosmological energy density, dark energy, is one of the most intriguing puzzles of modern physics. While many models for dark energy have been proposed, such models are difficult to constrain with modern astrophysical data sets, leaving a great deal of ambiguity. Given these difficulties, we propose a different approach. Rather than attempting to constrain the vast space of proposed models, we present a method that uses machine learning to ask which possible features of dark energy can be constrained. Specifically, we begin with a fairly general (and thus high-dimensional) model for the dark energy equation of state and then learn a lower-dimensional approximation that preserves the most detectable features of the general model for a choice of datasets. These "most detectable" features are selected based on a combination of the prior we choose on the general model and the structure of the experimental noise in the included datasets. After demonstrating our method, we present the results of a Markov Chain Monte Carlo analysis of the reduced model we arrive at.
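
    A minimal sketch of the underlying idea, ranking directions in a binned equation of state w(z) by detectability given a prior and experimental noise (my paraphrase with toy numbers, not the authors' algorithm):

        # Rank w(z)-bin directions by signal-to-noise using a prior covariance
        # and a noise-weighted Fisher-like matrix, then keep the leading modes.
        import numpy as np

        n_bins = 20                                   # "fairly general" binned w(z) model
        rng = np.random.default_rng(2)

        # Assumed ingredients: a smooth prior covariance on the w bins and a
        # response matrix mapping w-bin perturbations to noisy observables.
        prior_cov = np.exp(-np.abs(np.subtract.outer(np.arange(n_bins), np.arange(n_bins))) / 5.0)
        response  = rng.normal(size=(100, n_bins))    # d(observable)/d(w_i), toy numbers
        noise_var = np.full(100, 0.1**2)

        fisher = response.T @ np.diag(1.0 / noise_var) @ response
        # Detectable features: eigenmodes of the prior-whitened Fisher matrix
        L = np.linalg.cholesky(prior_cov)
        evals, evecs = np.linalg.eigh(L.T @ fisher @ L)
        modes = (L @ evecs)[:, ::-1]                  # best-constrained w(z) modes first
        print("signal-to-noise per mode:", np.sqrt(evals[::-1][:5]))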

  • Detection of Ultra Diffuse Galaxies in the Fornax Deep Survey

    Reynier Peletier, Kapteyn Institute, RUG, NL

    We present preliminary results of a study of ultra-diffuse galaxies in the new, ultra-deep FDS galaxy survey of the Fornax Cluster, as part of the SUNDIAL EU International Training Network. Ultra-diffuse galaxies (UDGs) are large galaxies with surface brightnesses much lower than those of ordinary galaxies. We discuss a number of automatic detection methods and compare these with visual detection. Based on this we draw conclusions on the properties of UDGs and on what their origin could be.
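
    A minimal sketch of one possible automatic detection strategy (a generic illustration with rough, assumed UDG-like criteria, not the SUNDIAL pipelines):

        # Detect large, low-surface-brightness sources by smoothing, thresholding
        # and simple size/brightness cuts.
        import numpy as np
        from scipy import ndimage

        def detect_udg_candidates(image, pix_scale, sb_zeropoint,
                                  sb_limit=26.0, min_reff_arcsec=1.5):
            """Return labels of connected regions detectable above `sb_limit`
            (mag/arcsec^2) yet large and diffuse (very rough UDG-like criteria)."""
            smooth = ndimage.gaussian_filter(image, sigma=3)
            sb = sb_zeropoint - 2.5 * np.log10(np.clip(smooth, 1e-12, None) / pix_scale**2)
            mask = sb < sb_limit                  # pixels brighter than the survey SB limit
            labels, n = ndimage.label(mask)
            candidates = []
            for i in range(1, n + 1):
                npix = (labels == i).sum()
                reff = np.sqrt(npix / np.pi) * pix_scale
                mean_sb = sb[labels == i].mean()
                if reff > min_reff_arcsec and mean_sb > 24.0:   # large but still diffuse
                    candidates.append(i)
            return labels, candidates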

  • Machine-Learning Strategies for Variable Source Classification

    Nina Hernitschek - Caltech, US

    In the era of large-scale surveys, methods to investigate these data, and especially to classify sources, become more and more important. Using the example of the Pan-STARRS1 3pi survey, a panoptic high-latitude survey in the time domain, we show how we explored the capabilities of this survey for carrying out time-domain science in a variety of applications. We use structure function fitting, period fitting and subsequent machine-learning classification to search for and classify high-latitude as well as low-latitude variable sources, in particular RR Lyrae, Cepheids and QSOs. As the survey strategy of Pan-STARRS1 3pi led to sparse, non-simultaneous light curves, this is also a testbed for exploring how to investigate the sparse data that will be obtained during the first months or years of upcoming large-scale surveys, when only a fraction of the total observing has been done.
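
    A minimal sketch of one of the variability statistics mentioned, the first-order structure function of a sparse light curve (a generic illustration, not the Pan-STARRS1 pipeline):

        # RMS magnitude difference as a function of time lag for all epoch pairs
        # of an irregularly sampled light curve.
        import numpy as np

        def structure_function(t, mag, bins):
            i, j = np.triu_indices(len(t), k=1)
            dt   = np.abs(t[i] - t[j])
            dmag = (mag[i] - mag[j])**2
            sf = np.full(len(bins) - 1, np.nan)
            for b in range(len(bins) - 1):
                sel = (dt >= bins[b]) & (dt < bins[b + 1])
                if sel.any():
                    sf[b] = np.sqrt(dmag[sel].mean())
            return sf

        # Toy usage: ~70 random epochs over 3 years, a QSO-like random-walk signal
        rng = np.random.default_rng(3)
        t = np.sort(rng.uniform(0, 1100, 70))
        mag = np.cumsum(rng.normal(0, 0.02, 70))
        print(structure_function(t, mag, bins=np.array([1, 10, 100, 1000])))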

  • Facts and Fakes

    Edwin Valentijn

    I will discuss the WISE technology, which delivers advanced Big Data information systems by including (classical) entanglement of all data items, such as metadata and files. This so-called extreme data lineage is a tool for connecting worldwide distributed communities, both for data processing and for analysis. It also serves to facilitate deep quality control and reproducibility of data products, back to the source of the data. I will discuss applications of Astro-WISE such as the astronomical KiDS survey, which obtains dark matter maps, and the further development of this approach for the data handling and analysis of the Euclid satellite. Such complex information systems carry the threat of unrecognized systematic errors and also involve data items, obtained with machine learning, that are difficult to qualify. This will push data validation to rather extreme limits. Data validation will become a most critical aspect of Big Data sets. We see this already happening in internet-based media, and I will discuss parallels with the ongoing facts-and-fakes discussions and how these can be approached with WISE-type technologies: going back to the source.
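
    A minimal sketch of the "extreme data lineage" idea (my own illustration, not the Astro-WISE implementation): every data product keeps references to the items it was derived from, so any result can be traced back to its sources.

        # Each data product records its dependencies; tracing walks back to the raw inputs.
        from dataclasses import dataclass, field

        @dataclass
        class DataItem:
            name: str
            quality_ok: bool = True
            depends_on: list = field(default_factory=list)

            def trace_sources(self):
                """Walk the full dependency tree back to the raw inputs."""
                if not self.depends_on:
                    return [self]
                return [src for dep in self.depends_on for src in dep.trace_sources()]

        raw    = DataItem("raw_exposure_0421")
        flat   = DataItem("flatfield_night_12")
        image  = DataItem("calibrated_image", depends_on=[raw, flat])
        dm_map = DataItem("dark_matter_map", depends_on=[image])

        print([s.name for s in dm_map.trace_sources()])   # back to the source of the data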

  • Design, Drag & Drop and Deploy your own deep learning application in minutes!

    Bas Beukers

    In just 20 minutes we will give you an overview of our generic Horus platform for building smart applications, instead of long-lasting and complex development projects! For your information: this is also the average time you need to build your own deep learning application. With the Horus platform, which includes an extensive library packed with third-party SDKs, the powerful Horus components, and a very user-friendly WYSIWYG application builder, anyone can build their own multi-platform applications in an instant. We will showcase three different projects already using this platform, including how to catch rhino poachers in South Africa and how we bring hospitalized kids back to school with the help of our VR livestreaming platform. These examples will give you a great insight into the variety, scalability and flexibility of this platform.

  • Machine learning as a service or custom built solutions?

    Jean-Paul van Oosten

    As a company specialised in helping customers extract the full business value from their data assets, Target Holding has several years of experience with Machine Learning and other AI propositions for medium and large enterprises. Frequently, we are asked how our services compare to Microsoft's Azure platform or Google's online machine learning service. This talk provides some insights from 9 years of experience with these topics.

  • The Hunt for the first Black Hole image

    Remo Tilanus

    The mere possibility in our Universe of objects like black holes has had profound implications for information theory. These range from the black hole information paradox, i.e. the controversial suggestion that classical black holes destroy information, through Hawking radiation and the holographic principle, which states that the information in a volume is mapped onto its boundary, to the more recent ideas of emergent gravity. Even if many of these results have since been generalized, they arose from, or were strongly influenced by, considerations about the implications of event horizons, which are the defining property of black holes. Yet, while few doubt the existence of black holes, to date they have never been observed directly. This is about to change: the Event Horizon Telescope Consortium is currently analyzing observations, obtained in 2017 with an Earth-spanning network of radio telescopes, of the centers of our Galaxy and of the nearby galaxy M87 at event-horizon-scale resolution, potentially providing us with a first direct image of a supermassive black hole. In my presentation I will discuss the ongoing experiment as well as possible future developments.

  • Organizing the Universe with Dark Matter: Maximizing Complexity in Cosmic Structures

    Sinziana Paduroiu - Quantum Gravity Research, US

    From the early stages of the Universe to the small-scale high complexity of structures, dark matter plays a crucial role in the cosmic organizational chart. As numerical simulations indicate, structure forms and evolves differently in different dark matter models. Astrophysical observations combined with limits from recent detection experiments suggest a heavy fermion particle in the keV range as a prominent dark matter candidate. From structure formation and large-scale structure to the small galactic scales, I will illustrate with numerical simulations how different dark matter models imprint their signature, with a focus on a keV dark matter candidate. Non-local aspects related to the free-streaming of such particles are discussed in this dark matter scenario in connection to information transport. At the small scales, phase-space density constraints and quantum pressure are related to the internal structure of galaxies. The properties of these hypothesized particles, however, are strongly dependent on the production mechanism. Reviewing several theoretical considerations of proposed models, the production mechanism seems to require a high number of degrees of freedom, which can provide us with clues about the geometrical initial conditions. From the entropic perspective, connections between the organization of structures in the Universe and other systems are explored.

  • The signature of accelerated detectors in cavities

    Richard Lopp - Institute for Quantum Computing, University of Waterloo, Canada

    We examine whether the usual approximations of quantum optics are valid for studying the Unruh effect in a cavity, as these approximations have been applied to such scenarios in past literature. To that end, we consider the behaviour of an accelerated Unruh-DeWitt particle detector interacting with a quantum scalar field while travelling through a cavity in 3+1D. We thereby model a simplified version of the light-matter interaction between an atom and the electric field. We characterize the relativistic and non-relativistic regimes, and show that in general the energy is not localized in just a small number of field modes, rendering it impossible to employ the single-mode approximation. Furthermore, we find that neither the massless nor the massive scalar field of a 1+1D theory is a satisfying approximation to the effective 1+1D limit. The ultimate objective will be to study whether the bombardment of the cavity with a stream of accelerated atoms results in the accumulation of a significant signature in the field characteristic of the Unruh effect, thereby avoiding the need for thermalisation of the atoms.
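
    For reference, the Unruh-DeWitt detector mentioned above couples a two-level system with energy gap Ω to the field along its trajectory x(τ) through the standard interaction (a common simplified model of the light-matter coupling; χ is the switching function and λ the coupling strength):

        \hat{H}_{\mathrm{int}}(\tau) \;=\; \lambda\,\chi(\tau)\,\hat{\mu}(\tau)\,\hat{\phi}\big(\mathsf{x}(\tau)\big),
        \qquad \hat{\mu}(\tau) \;=\; \hat{\sigma}^{+} e^{\,i\Omega\tau} + \hat{\sigma}^{-} e^{-i\Omega\tau}.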

  • Euclid - a data driven space mission

    René Laureijs - ESA/ESTEC, NL

    Successive astronomy missions of the European Space Agency with the objective of surveying large fractions of the astronomical sky exhibit a steady increase in output data rate, in step with contemporary technical capabilities. Euclid (to be launched in 2022) will become ESA's champion in generating astronomical data, surpassing earlier missions such as Planck and Gaia. The Euclid space telescope has been designed to determine the properties of dark matter and dark energy in our universe with unprecedented precision. The experiment has been set up such that all imaging data are collected and sent to Earth for further analysis. Consequently, the Euclid database will contain far more information than required by the experiment, enabling other scientific investigations that can cover nearly all areas of astronomy. We describe the Euclid experiment: the science drivers, design, and implementation of the mission. We concentrate on the particular spacecraft design drivers for the data collection and processing of Euclid. We will also put the Euclid development in the context of other scientific space missions, and we will address the areas of technology that either limit or support the increasing streams of data.

  • Geometry of the Local Universe

    Johan Hidding - Netherlands eScience Center / Kapteyn Institute, NL

    Matter in the Universe is clustered; seen on the largest scales, galaxies group into walls, filaments and clusters. This Cosmic Web of structures shapes the environment in which galaxies form and merge. From the observed distribution of galaxies (including their redshifts) it is very hard to distil a three-dimensional impression of the nature of their environment. We used realisations (Hess et al. 2013) of cosmic initial conditions that were constrained to reproduce the observations of the 2MASS redshift survey (Huchra et al. 2012). To these initial conditions we applied the geometric adhesion model, giving a weighted Voronoi representation of the structures in the Local Universe (z < 0.03). We present visualisations of several specific known structures in the Local Universe, such as the Local Void, the Local Sheet, the Perseus-Pisces supercluster and the Fornax filament, showing their current geometry and past dynamical history.
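
    For context, the adhesion model referred to above evolves the gravitationally induced velocity field with Burgers' equation in the limit of vanishing viscosity (with the linear growth factor b playing the role of time, in Lagrangian coordinates q); its geometric solution is what yields the weighted Voronoi description used here:

        \frac{\partial \mathbf{u}}{\partial b} \;+\; (\mathbf{u}\cdot\nabla_{\mathbf{q}})\,\mathbf{u}
        \;=\; \nu\,\nabla_{\mathbf{q}}^{2}\mathbf{u}, \qquad \nu \to 0^{+},
        \qquad \mathbf{u} \;=\; \nabla_{\mathbf{q}}\Phi(\mathbf{q}).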

  • The crumpling of primordial information in the cosmic spider web

    Mark Neyrinck - University of the Basque Country, Bilbao, SP

    The initial Universe was imprinted with a pattern of primordial density fluctuations, a pattern whose information is easily quantified. Gravity then used this pattern as a blueprint, folding and stretching it into the cosmic web of galaxies, filaments, walls and voids. We recently showed that the cosmic web has the same geometric properties as both architectural "spiderwebs" (networks of nodes and strands between them that can be pulled entirely into tension) and origami tessellations. These ideas provide a convenient framework to quantify the loss of primordial information, or the growth of a kind of entropy, in the Universe.

  • Past, Present and Future in cosmology

    Daan Meerburg - Cambridge /RUG

    Over a period of 40 years, observational cosmology has transformed from backyard science led by a small number of people into huge experimental efforts, sometimes counting thousands of members. Initially, big steps were made in our fundamental understanding of the Universe, which has made cosmology the most important tool for understanding the origin and fate of our Universe. However, observationally we will eventually be limited by our current horizon and by our ability to build experiments that can explore the past light cone of the Universe. I will discuss the big discoveries in cosmology, with a special emphasis on the Cosmic Microwave Background (CMB). I will summarise current experimental efforts (Planck) and discuss the latest results derived from observations of the CMB. The future of cosmology will revolve around very large collaborative efforts, which will unite a large fraction, if not all, of the scientists working in a particular field. Again, as an example I will discuss the efforts in the CMB community, which aim to build an instrument with one million detectors in the next 10 years. In parallel, I will show that these huge efforts yield fewer guaranteed results, precisely because we are reaching the limits of the information contained within the observable Universe.
