Abstracts

  • Cosmic Web on Fulldome

    Rien van de Weygaert, Kapteyn Astronomical Institute, University of Groningen

    The large-scale distribution of matter and galaxies features a complex network of interconnected filamentary galaxy associations. Galaxies and mass exist in a wispy weblike spatial arrangement consisting of dense compact clusters, elongated filaments, and sheetlike walls, amidst large near-empty void regions. This network has become known as the Cosmic Web. An important additional aspect of this mass distribution is that it is marked by substructure over a wide range of scales and densities, containing structures from a few Megaparsecs up to tens and even hundreds of Megaparsecs in size.
    The vast Megaparsec cosmic web is one of the most striking examples of complex geometric patterns found in nature, and certainly the largest in terms of sheer size. The overwhelming complexity of the individual structures as well as of their connectivity, the lack of structural symmetries, the intrinsic multiscale nature and the wide range of densities found in the cosmic matter distribution pose a major and rewarding challenge to our understanding of the formation of structure in the Universe.
    In this presentation we take the viewer, by means of an impressive full-dome vista, through the efforts of the past decades to map the structure of the Universe on scales of hundreds of Megaparsecs. We subsequently address the formation of structure in the Universe, starting at the dawn of time, when the Universe was nearly uniform, and proceeding to the present-day Universe. The experience of being entirely surrounded by a full-dome rendering of the structure of the Universe 379,000 years after the Big Bang, as seen by the WMAP and Planck satellites, is followed by witnessing the 13.8 billion years of gravitational contraction of these minute primordial fluctuations into the majestic weblike patterns we observe today. We will explore the theory of Zeldovich, travel through the Millennium simulation and watch the formation of galaxies in the Illustris simulation, to finally return to our own cosmic neighbourhood. The Local Void at our cosmic doorstep and the neighbouring ridge of the Pisces-Perseus supercluster will be studied, along with a zoom-in towards the small cosmic corner in which we see the formation of our Galaxy and the nearby Andromeda Galaxy.
    Finally, accompanied by Mozart's music we return to the real Universe mapped by the Sloan Digital Sky Survey, and travel along its structures, as if on a spaceship's journey.

  • Information-theoretic approaches to network complexity and genetic reprogramming

    Hector Zenil, Karolinska Institute/University of Paris/Oxford University

    Most real-world networks are complex because connections between units can represent non-linear and convoluted actions among the elements (e.g. genes). I will show that the adjacency matrix, as a full (labelled) description of a network, accounts for this complexity in the sense of (algorithmic) information theory. I will also show that Shannon's entropy, Kolmogorov-Chaitin complexity and Solomonoff-Levin algorithmic probability can quantify different properties of static and evolving graphs. I propose formal definitions of complexity for both labelled and unlabelled graphs and introduce tools at the intersection of information theory, network biology and dynamical systems, with a surprising application to reprogramming and steering biological cells so that they perform different tasks. This potentially provides a better understanding of how nature itself programmed them in the first place, and may even ultimately help reverse the development of some complex diseases in the future.
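
    A minimal sketch of the kind of quantity involved, assuming an undirected graph given as a 0/1 adjacency matrix (illustrative code, not the speaker's actual tools, which rely on more refined algorithmic-probability estimators): Shannon entropy of the degree distribution, plus a crude lossless-compression proxy for the algorithmic information content of the labelled adjacency matrix.

        # Shannon entropy of a graph's degree distribution, plus a compression-based
        # stand-in for the algorithmic information content of its adjacency matrix.
        import zlib
        from collections import Counter
        from math import log2

        def degree_entropy(adj):
            """Shannon entropy (bits) of the degree distribution of an undirected graph."""
            degrees = [sum(row) for row in adj]
            counts = Counter(degrees)
            n = len(degrees)
            return -sum((c / n) * log2(c / n) for c in counts.values())

        def compression_complexity(adj):
            """Length in bytes of the zlib-compressed adjacency matrix: a rough upper
            bound on, and very coarse proxy for, its algorithmic information content."""
            bits = ''.join(str(x) for row in adj for x in row)
            return len(zlib.compress(bits.encode()))

        # Example: a 4-node path graph 0-1-2-3
        adj = [[0, 1, 0, 0],
               [1, 0, 1, 0],
               [0, 1, 0, 1],
               [0, 0, 1, 0]]
        print(degree_entropy(adj), compression_complexity(adj))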

  • Temperature from quantum entanglement

    Santhosh Kumar S, Indian Institute of Science Education and Research, India

    It is still unclear how thermal states dynamically emerge from a microscopic quantum description. A complete understanding of the long-time evolution of closed quantum systems may resolve the tension between a microscopic description and the one offered by equilibrium statistical mechanics. As a step in this direction, we consider a simple bipartite system (a quantum scalar field propagating in a black-hole background) and study the evolution of the entanglement entropy, obtained by tracing over the degrees of freedom inside the event horizon, at different times. We define an entanglement temperature, similar to the one used in the microcanonical ensemble picture in statistical mechanics, and show that (i) this temperature is a finite quantity while the entanglement entropy diverges and (ii) it matches the Hawking temperature for several black-hole space-times. We also discuss the implications of our result for the laws of black-hole mechanics and eigenstate thermalization.
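
    For reference, the standard construction behind these quantities (the talk's precise definitions may differ) obtains the entanglement entropy from the reduced density matrix of the exterior modes and defines the entanglement temperature microcanonically:

        \[
          \rho_{\mathrm{out}} = \operatorname{Tr}_{\mathrm{in}} |\Psi\rangle\langle\Psi| ,
          \qquad
          S_{\mathrm{ent}} = -\operatorname{Tr}\bigl(\rho_{\mathrm{out}} \ln \rho_{\mathrm{out}}\bigr) ,
          \qquad
          \frac{1}{T_{\mathrm{ent}}} = \frac{\partial S_{\mathrm{ent}}}{\partial E} .
        \]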

  • Bayesian Big Bang

    Fred Daum, Raytheon, USA

    We derive a flow of particles corresponding to Bayes' rule that describes the big bang with late-time cosmic acceleration. In our theory the physical cause of late-time cosmic acceleration is the cooling of the cosmos, which makes the measurements in Bayes' rule much more accurate. The early cosmos was extremely hot, making measurements essentially worthless. We derive a PDE for this flow using a log-homotopy from the prior probability density to the posterior probability density. We compute the normalized cosmic deceleration vs. redshift curve, which agrees with data to within experimental error. Einstein's theory of general relativity also agrees with the cosmological data, and more accurate data are required to falsify such theories. Our theory has the advantage that it actually explains the big bang, and it avoids postulating the existence of dark energy and dark matter, neither of which has ever been detected in credible terrestrial experiments. We do not use entropy or the Unruh effect or any form of consciousness, but rather we only use Bayes' rule to model the measurements that are continuously occurring everywhere throughout the cosmos with photons and other particles. The likelihood depends on the thermal history of the cosmos, for which we assume Planck's black-body law with a radiation temperature that is roughly proportional to cosmic redshift. Despite much talk of information, measurements and observers in physics, until now there has been no use of Bayes' rule to model such measurements.
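
    The log-homotopy referred to here, in the standard particle-flow formulation (generic symbols: prior g, likelihood h, homotopy parameter \lambda; the talk's exact construction may differ), deforms the prior smoothly into the posterior,

        \[
          \log p(x,\lambda) = \log g(x) + \lambda \log h(x) - \log K(\lambda) ,
          \qquad \lambda \in [0,1] ,
        \]

    so that \lambda = 0 recovers the prior and \lambda = 1 the normalized posterior of Bayes' rule; the particle flow is the differential equation that transports samples as \lambda increases.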

  • Apparent Cosmic Acceleration via Back Reaction from the Causal Propagation of Inhomogeneity

    Brett Bochner, Hofstra University, USA

    In the 'causal back reaction' paradigm, an apparent cosmic acceleration is generated from the causal flow of inhomogeneity information coming in from all structure-forming regions in the observable universe, and depends crucially upon the time dependence of that information flow. Using a phenomenological approach to model the impact of these effects upon the evolution of the cosmic expansion, we find viable cosmological models in which causal back reaction successfully serves as a replacement for Dark Energy in generating the apparent acceleration, while achieving an alternative cosmic concordance for a matter-only universe with hierarchical structure formation.

  • Information-theoretical properties of the d-dimensional blackbody radiation

    Jesus Dehesa, University of Granada, Spain

    The (three-dimensional) Cosmic Microwave Background (CMB) radiation observed today is the most perfect black-body radiation ever observed in nature, with a temperature of about 2.725 K. In this talk I will present an information-theoretical analysis of the spectral energy density of a d-dimensional black body at temperature T by means of various entropy-like quantities (disequilibrium, Shannon entropy, Fisher information) as well as two (dimensionless) complexity measures (Cramer-Rao and Fisher-Shannon). We discuss these quantities in terms of temperature and dimensionality. It is shown that these complexity measures have a universal character in the sense that they only depend on the dimensionality d of the universe. In addition, three entropy-based characteristic frequencies are introduced and shown to obey Wien-like laws, similar to the well-known Wien displacement law followed by the predominant frequency at which the spectral density is maximum. Applications and some predictions about the CMB are discussed on physical grounds.
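
    For orientation (textbook form, not the talk's derivation), the spectral energy density of a black body in d spatial dimensions and the Wien-like law for its peak frequency read

        \[
          \rho_d(\nu,T) \propto \frac{\nu^{d}}{e^{h\nu/k_B T} - 1} ,
          \qquad
          \frac{h\nu_{\max}}{k_B T} = x_d , \quad \text{where } x_d \text{ solves } x = d\,(1 - e^{-x}) ,
        \]

    so the peak frequency grows linearly with T with a constant that depends only on d (for d = 3, x_3 \approx 2.82). The entropy-like measures discussed in the talk are then functionals of the normalized density \tilde{\rho} = \rho_d / \int \rho_d \, d\nu, e.g. the Shannon entropy S = -\int \tilde{\rho} \ln \tilde{\rho} \, d\nu.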

  • Free Will: A Notion of Information?

    Kim Wende, Universite Catholique de Louvain, Belgium

    A standard model of cognition is introduced, suggesting that the functioning semantic brain network enables freely willed acts (causal determination by informing). Semantic networks resolve some limitations of graph theory and explain the human mind/brain, from a general and external perspective, as a (micro-)state (internal cosmos, mental universe; Einstein, 1916). In the whole brain/mind, causality structures both memory and conscious experience by providing a linkage (horizontal reference) through time. Gestalt perception of physical causation from collision observation (the 'launch effect'; Michotte, 1946) shows the universality of causal impressions at the level of the visual and sensory systems in humans. Furthermore, evidence on brain and language development and on symptomatic alterations of cognition and perception in schizophrenia suggests general aspects of semantics that are inherently universal to the brains of modern humans (Crow, 2008), particularly the production of new (meaningful) abstract information (complex concepts and potentially directed - then causal - associations between them). This structured functionality of the brain overall allows for critical and creative (meta-)cognitive processes (reasoning, or reflection aimed at new insight: ideas and notions; Kant, 1784); accordingly, the economic concept of interest is regarded as an additional motivational criterion at the psychological and neural level (Deleuze, 1983; Tse, 2013). If realized (themselves experienced as events and attributed with meaning), such 'instances' of agency in and over one whole mind/brain's own(ed) conceptual network, or references to, i.e. 'usages' of and inscriptions into, its own knowledge (in-formings), then, in sum over time, constitute reason(-ability) as a general property of the human mind/brain as a state/system.

  • Space-time topology: black holes induce dark energy

    Marco Spaans, Kapteyn Astronomical Institute, Netherlands

    The topology of quantum space-time is explored and expressed through wormholes and three-tori. The global role of information in such a multiply connected space-time is presented. It is further shown that dark energy is driven by the number of black holes in the observable universe.

  • Algorithmic Information Theory: A Model for Science?

    Tom Sterkenburg, CWI/University of Groningen, Netherlands

    Algorithmic information theory offers an idealized notion of a data object's compressibility that is commonly interpreted as an objective measure of information content. In this guise the theory has been brought to bear on issues in epistemology and even metaphysics, and it might strike one as particularly suited to supplement current work in understanding the "information universe." However, because of the technical character of the subject, it is hard for non-experts to assess the merits of these associations. In this talk, I will offer a critical review of some of the main interpretative claims surrounding algorithmic information theory. In particular, I will look at the connections that the theory forges between compressibility, computability, and simplicity, in order to evaluate the theory's supposed ramifications for the role of simplicity in scientific theorizing. A key explanatory device I will rely on is the formal correspondence between code lengths and probabilities. Code lengths provide a nice operational characterization of the concept of probability; conversely, this correspondence also allows us to express the main algorithmic concepts of information theory in more familiar probabilistic terms. Indeed, this perspective clearly reveals the multiple levels at which the relevant notion of simplicity/information content is still a relative or subjective notion.
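
    The correspondence invoked here can be stated compactly (standard facts, quoted for orientation): by the Kraft inequality, the lengths \ell(x) of any prefix-free code satisfy \sum_x 2^{-\ell(x)} \le 1, so

        \[
          P(x) := 2^{-\ell(x)}
        \]

    defines a (sub)probability distribution, and conversely any distribution P admits a prefix code with lengths \ell(x) = \lceil -\log_2 P(x) \rceil. At the algorithmic level the same correspondence appears as Levin's coding theorem, K(x) = -\log_2 \mathbf{m}(x) + O(1), relating prefix Kolmogorov complexity K to the universal a priori (Solomonoff-Levin) semimeasure \mathbf{m}.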

  • A historical look at the information universe

    Frans van Lunteren, Leiden Observatory, Netherlands

    The current emphasis on information as a key concept for the understanding of the universe nicely fits into a long-term historical pattern. Since the rise of modern science in the 17th century, four technical artefacts have been used as powerful resources for the understanding of both inorganic and organic nature. First of all, they provided a general framework for such understanding. More specifically, their metaphorical use as models of nature facilitated the construction and refinement of some key concepts that would play a prominent role in the analysis of natural phenomena. The four machines that came to model nature in the modern era were respectively the mechanical clock, the balance, the steam engine and the computer. More than any others, these machines came to play a highly visible role in Western societies, both socially and economically. The concepts they helped to emphasize and analyze were 'matter in motion', 'force', 'energy' and 'information'. Each of these concepts was, at some time, held to represent the ultimate building block of the universe. The ties between these four machines and nature were strengthened by the fact that the machines would eventually make their physical entry into scientific research itself. If this pattern makes any sense, what does it teach us about the probable fate of information based theories of the universe? All in all, there is both good news and bad news.

  • Seemingly Simple but Devastatingly Hard Question: What is Information?

    Ahmet Koltuksuz, Yasar University, Turkey

    From information flow in terms of molecular movements in cancer treatment, to black-hole information bounds and to the very fabric of spacetime, the definition and nature of information have been perplexing indeed. When information is evaluated on a metric scale, it is both discrete and continuous; however, via the Laplace-Beltrami operator and the Gauss-Bonnet theorem, the two descriptions can be mapped onto one another. On the Planck scale, defining information as a discrete entity on the surface area of an n-manifold, as 1s or 0s on triangles of Planck area, provides a unique way of matching it with Bekenstein's approach. By triangulation, the surface area of any given n-dimensional manifold is an effectively computable function. The union and addition of n-manifolds can be computed by discrete operators. Our computer-simulation results on the digital nature of information, based on n-manifolds, are: (i) information is a discrete physical entity and is directly measurable; (ii) by Regge calculus and causal dynamical triangulation, it can be approached from both discrete and continuous perspectives depending on the scale at which the observations are made; (iii) the metrics and attributes (e.g. timeliness, integrity) of information are re-defined, since changing the curvature of a manifold does not affect its net surface area, even and especially when spacetime itself is considered as a 2-manifold on the Planck scale; (iv) from the mathematical and physical characteristics of an n-manifold, like a sample taken from spacetime itself, the entropy of information is re-defined in this paper.
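
    For reference, the Bekenstein-Hawking formula against which such a Planck-triangle counting is matched (standard result, quoted for orientation) assigns to a horizon of area A the entropy

        \[
          S_{\mathrm{BH}} = \frac{k_B\, A}{4\,\ell_P^{2}} ,
          \qquad
          \ell_P^{2} = \frac{G\hbar}{c^{3}} ,
        \]

    i.e. roughly one bit of information per 4\,\ln 2 Planck areas of horizon surface.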

  • Role of Information-Theoretic Uncertainty Relations in Quantum Theory

    Petr Jizba, Czech Technical University, Czech Republic

    In my talk I will briefly review uncertainty relations based on information theory. I will then extend these results to account for (differential) Renyi entropy and its related entropy power. This will allow me to identify a new class of information-theoretic uncertainty relations (ITURs). The potency of such uncertainty relations in quantum mechanics will be illustrated with a simple two-energy-level model, where they outperform both the usual Robertson-Schrodinger uncertainty relation and the Shannon-entropy-based uncertainty relation of Deutsch. In the continuous case the ensuing entropy-power uncertainty relations are discussed in the context of heavy-tailed wave functions and Schrodinger cat states. Again, improvement over both the Robertson-Schrodinger uncertainty principle and the Shannon ITUR will be demonstrated in these cases.
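
    The basic objects being extended are (standard definitions, quoted for orientation) the differential Renyi entropy and the Shannon entropy power,

        \[
          H_\alpha(f) = \frac{1}{1-\alpha} \ln \int f(x)^{\alpha}\, dx ,
          \qquad
          N(X) = \frac{1}{2\pi e}\, e^{2 H(X)} ,
        \]

    with H = H_1 the Shannon limit; a familiar Shannon-based ITUR for the position and momentum densities of a pure state is the Bialynicki-Birula-Mycielski inequality H(x) + H(p) \ge \ln(e\pi\hbar), which the Renyi-based relations of the talk aim to improve upon.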

  • A Shock-in-Time: Producing the Entropy of the Universe

    Jonathan Braden, University College London, UK

    An early phase of inflation within our Hubble patch was driven by the potential energy of a long-wavelength effective scalar field condensate - the inflaton. Additionally, there were small, nearly Gaussian vacuum fluctuations on sub-Hubble scales. To connect with the standard Hot Big Bang, the low-entropy coherent energy stored in the inflaton must be converted into the high-entropy incoherent energy of a hot plasma with temperature above the GeV scale. Since (Shannon) entropy is the expectation value of a field's information content, this transition is one of the most dramatic information-processing events in our Universe's history. Field theory models of this transition often contain exponential instabilities in the sub-Hubble modes. This leads to strong mode-mode coupling and highly non-Gaussian fundamental field statistics. Since the dynamics is unitary, entropy production must occur as a result of an effective coarse-graining. Remarkably, we can identify a collective variable (the logarithmic energy density phonons) with nearly Gaussian low-point statistics. We therefore define an entropy based on measurements of the one- and two-point statistics of these phonons. Using maximum entropy techniques, I will demonstrate that there is a short burst of entropy production around the onset of mode-mode coupling, followed by a slower, nearly adiabatic cascade of spatial modes to shorter length scales. We have dubbed the regime of rapid entropy production the shock-in-time, in analogy with the behaviour of fluid shocks as randomising fronts. Finally, I will discuss the production of adiabatic density perturbations through a spatial modulation of the shock-in-time.
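
    The abstract does not spell out the entropy functional; the standard maximum-entropy value consistent with measured one- and two-point statistics is the Gaussian one, which for N phonon modes with connected covariance matrix C reads

        \[
          S_{\max} = \tfrac{1}{2} \ln\!\bigl[ (2\pi e)^{N} \det C \bigr] ,
        \]

    so entropy production appears as growth of \det C for the coarse-grained collective variable.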

  • On Semantic Information of Nature

    Wolfgang Johannsen, Frankfurt School of Finance & Management, Germany

    Information's place in nature is not yet well understood. C.E. Shannon's information theory excluded all semantic aspects in favor of quantitative aspects of information transmitted via a communication channel. In our contribution we present a new comprehensive information model (the Evolutionary Energetic Information Model). Our model builds on information theory as well as on Bateson's definition of information and Burgin's ontology of information, to name a few. The model defines semantic information in nature within a framework. At its core we propose that semantic information is energy; however, not all energy is information. Our basic assumption is that semantics is bound to life. Without life, 'meaning' effectively is without a basis. Information is part of life, which in turn is the result of evolution on earth. Different branches of evolution possess various types of semantic information. Semantic information thus is subject to mutation, selection and adaptation. The common denominators of all semantic information, namely syntax, semantics and pragmatics (Morris), apply throughout evolution. Syntax initially materialized as rules imprinted in biochemical molecules; later, the biochemical 'hardware' became accompanied by 'software'. Species have taken up more and more complex - and meaningful - information during evolution. With complexity accumulated, semantic information is in various ways bequeathed to the next generations to build upon. By being exposed to evolutionary mechanisms, increasingly complex information is developed, processed and inherited.

  • Use of ontologies for automated data processing and their challenges: a bioinformatics view

    Arzu Tugce Guler, Leiden University Medical Center, Netherlands

    The formalisation of knowledge plays a key role in automated data processing. Describing the entities of a specific domain is usually the starting point for building an architecture to analyse and process data. In data-intensive fields such as bioinformatics, the relationships and hierarchies among entities are also crucial for building complete "workflow" systems. Scientific ontologies define entities, relationships, attributes and hierarchies, and thus help to bring data from different observations and experiments to a common ground. Ontologies come into use, indirectly, to accomplish one of the aims of automated data processing: minimizing, or even discarding, the need for human intervention. The use of ontologies in automated systems helps us avoid building data-dependent processing tools that need to be tuned for each type of data coming from different observations and experiments within the same scope. There are well-established ontologies in various scientific fields, especially in the information sciences and the life sciences. Since bioinformatics is a bridge between those two fields, it is a very suitable setting for discussing the binding force of ontologies, in every sense of the word. In our study, we review the conveniences that ontologies bring to automated data processing, along with their challenges and side effects, from a bioinformatics point of view.
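
    A toy sketch of ontology-driven dispatch, using a hypothetical two-level "is-a" hierarchy rather than a real ontology such as EDAM (all terms and function names are illustrative): a workflow picks a processing step for a concrete data type by generalizing to the nearest ancestor term it has a handler for.

        # Hypothetical mini-ontology: child term -> parent term ("is-a" relations).
        IS_A = {
            "mzML": "mass-spec-data",
            "mass-spec-data": "experimental-data",
            "FASTQ": "sequence-data",
            "sequence-data": "experimental-data",
        }

        # Terms the pipeline has concrete handlers for.
        PROCESSORS = {
            "mass-spec-data": lambda d: f"run peak picking on {d}",
            "sequence-data": lambda d: f"run read alignment on {d}",
        }

        def process(term, dataset):
            """Walk up the is-a hierarchy until a registered processor is found."""
            while term is not None:
                if term in PROCESSORS:
                    return PROCESSORS[term](dataset)
                term = IS_A.get(term)
            raise ValueError("no processor for this data type")

        print(process("mzML", "sample_01"))   # handled by the mass-spec branch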

  • Simulating a Black Hole with a Cellular Automaton based Causal Dynamical Triangulation

    Cagatay Yucel, Yasar University, Turkey

    In this research paper, the results of a computer simulation of the Bekenstein entropy of a black hole by means of a Cellular Automaton based Causal Dynamical Triangulation (CDT) are presented. As a different perspective, the Shannon entropy is also computed, and the two are compared and contrasted. Besides the rules of the CDT simulation, the rules of Conway's Game of Life (GoL) are added to the simulation. GoL is known to construct self-replicating patterns on a grid whose cells are either alive or dead. CDT, on the other hand, has shown the emergence of a 4-d universe with dynamical and deterministic rules for 4-simplices, where the dimension of time defines the causality. By unifying these two simulations, a cellular automaton which alters the geometry and curvature of spacetime, for the specific case in which the entropy is maximized on the triangulation, is implemented as a simulated black hole. In doing so, the information stored in the nearby triangles is collapsed into an area of maximized entropy. This coding allows us to simulate software-based black holes on a grid where 1s and 0s are distributed over the triangulation. With this simulation we show that: (i) the results obtained by computing the surface area and the entropy of a given triangulated spacetime comply with those of Bekenstein; (ii) given the probability distribution of the information on the triangles, the computation of Shannon's information entropy is possible. An evaluation of the two entropy computations is included in this paper as well.
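
    A simplified illustration of the entropy bookkeeping, on a plain square Game of Life grid rather than a triangulated spacetime (illustrative code, not the authors' simulation): one synchronous GoL update followed by the Shannon entropy of the resulting 0/1 distribution.

        # One Game of Life step plus the Shannon entropy of the cell-state distribution.
        from math import log2

        def gol_step(grid):
            """One synchronous Game of Life update on a toroidal grid of 0/1 cells."""
            n, m = len(grid), len(grid[0])
            nxt = [[0] * m for _ in range(n)]
            for i in range(n):
                for j in range(m):
                    alive_nbrs = sum(grid[(i + di) % n][(j + dj) % m]
                                     for di in (-1, 0, 1) for dj in (-1, 0, 1)
                                     if (di, dj) != (0, 0))
                    nxt[i][j] = 1 if alive_nbrs == 3 or (grid[i][j] and alive_nbrs == 2) else 0
            return nxt

        def shannon_entropy(grid):
            """Entropy (bits per cell) of the empirical 0/1 distribution on the grid."""
            cells = [c for row in grid for c in row]
            p = sum(cells) / len(cells)
            return 0.0 if p in (0.0, 1.0) else -(p * log2(p) + (1 - p) * log2(1 - p))

        glider = [[0, 1, 0, 0, 0],
                  [0, 0, 1, 0, 0],
                  [1, 1, 1, 0, 0],
                  [0, 0, 0, 0, 0],
                  [0, 0, 0, 0, 0]]
        print(shannon_entropy(gol_step(glider)))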

  • A Personal Perspective on Cellular Automaton Models of Fundamental Processes in Physics

    Edward Fredkin, Carnegie Mellon University, USA

    NOTE: The following abstract is an abridged version of the original one.
    I started work in this field in 1958, when I first had the idea that microscopic physics might be a consequence of some kind of digital process. Over the years, one problem after another cropped up and required solutions… when this process started, the academic community was, in general, absolutely certain that no universal computational model could be reversible (while the microscopic laws of physics were reversible). My students and I put to rest every obstacle that stood in the way of microscopically reversible computational models of fundamental processes in physics. What we now wish to demonstrate is no more than additional possibilities. We claim that no convincing scientific methodology will be found that rules out the possibility of a Discrete Cellular Automaton model of the physics of our world. Of course, our likely inability to disprove this conjecture is not a proof of its truth; however, it could nevertheless serve to lessen the often vociferous objections expressed by those who support the concept that mathematical analysis exactly conforms to physics....
    While Cellular Automaton models currently include some aspects of physics, it is nevertheless still true that there are many aspects of physics for which we do not yet have competent CA models.... The point is not a claim that all the important problems have been solved; rather, it is a claim that we have evidence assuring us that all of the problems that separate CA from being competent models of microscopic physics can, indeed, be solved. Our talk will explain how CA models of physics could exhibit translation symmetry and how CA models could implement characteristics of quantum field theory.
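
    As a concrete illustration of reversible CA dynamics, here is a minimal sketch of the second-order construction commonly attributed to Fredkin, assuming a cyclic elementary CA (illustrative code only): the next row is the rule applied to the current row, XORed with the previous row, so any two consecutive rows determine the past exactly.

        # Second-order reversible cellular automaton: next = rule(current) XOR previous.
        import random

        def eca_apply(rule, row):
            """Apply an elementary CA rule (0-255) to a cyclic row of 0/1 cells."""
            n = len(row)
            return [(rule >> (4 * row[(i - 1) % n] + 2 * row[i] + row[(i + 1) % n])) & 1
                    for i in range(n)]

        def step_forward(rule, prev_row, cur_row):
            """Reversible second-order update."""
            return [a ^ b for a, b in zip(eca_apply(rule, cur_row), prev_row)]

        def step_backward(rule, cur_row, next_row):
            """Exact inverse: previous = rule(current) XOR next."""
            return [a ^ b for a, b in zip(eca_apply(rule, cur_row), next_row)]

        random.seed(0)
        history = [[random.randint(0, 1) for _ in range(16)] for _ in range(2)]
        for _ in range(10):                                  # 10 steps forward
            history.append(step_forward(110, history[-2], history[-1]))

        a, b = history[-2], history[-1]                      # 10 steps backward
        for _ in range(10):
            a, b = step_backward(110, a, b), a
        print([a, b] == history[:2])                         # True: initial state recovered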

  • Partially-directed evolution: from the Big Bang to the problem of knowledge acquisition

    Alexei Melkikh, Ural Federal University, Yekaterinburg, Russia

    The evolution of complex systems in the Universe is considered on the basis of the previously proposed model of partially-directed evolution of life (Melkikh, 2014). One of the most important properties of partially-directed evolution is the existence of a priori information about the future states of the system. This leads to the need to store such information, including during the early stages of evolution (Melkikh, 2015). Quantum states of biologically important molecules are considered as the storage mechanism for this a priori information. However, such information must have existed before the emergence of life, in particular in the early stages of the Universe's expansion. Mechanisms for the evolution and storage of such information during the expansion of the Universe, associated with the properties of fundamental interactions, are proposed. In this context, the problem of knowledge acquisition in natural and artificial systems is also considered.

  • Understanding the homochirality and the information content of the building blocks of life via theoretical spectroscopy

    Fabiana da Pieve, LSI, CNRS, Ecole Polytechnique, France

    The origin of the homochirality of the building blocks of life is one of the most fascinating mysteries of nature. In the last decades, findings of amino acids with a preferred chirality in meteorites have been reported, suggesting the hypothesis of a selective destruction of the amino acids (trapped in interstellar ices) of one handedness by a chiral perturbation in space, such as the circularly polarized light from young stars. This would have possibly biased the biosphere via the impact on the early Earth of meteorites containing the organic matter of the other handedness. I will present quantum mechanical calculations, based on Time-Dependent Density Functional Theory, of the difference in absorption between left- and right-handed circularly polarized light for some specific amino acids and their precursors observed in meteoritic findings, and of the so-called "index of chirality", which gives the information content carried by the local chirality of each system at different levels of complexity. I find that the precursor has a stronger preference for absorption of right-handed light than the amino acid, suggesting that the handedness of the latter could be biased by asymmetric destruction of the precursor, and that the information on the local chirality does not always change as expected between the two systems.
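
    The quantities such calculations typically target (standard circular-dichroism definitions, quoted for orientation; the talk's exact observables may differ) are the differential absorption and the Kuhn dissymmetry factor,

        \[
          \Delta\epsilon(\omega) = \epsilon_L(\omega) - \epsilon_R(\omega) ,
          \qquad
          g(\omega) = \frac{2\,[\epsilon_L(\omega) - \epsilon_R(\omega)]}{\epsilon_L(\omega) + \epsilon_R(\omega)} ,
        \]

    where \epsilon_{L,R} are the molar absorption coefficients for left- and right-circularly polarized light.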

  • Emulation networks of minimalistic universes

    Jürgen Riedel, LABORES, France

    We explore the reprogramming and emulation capabilities of cellular automata (CA). We show a series of boundary-crossing results, including a Wolfram Class 1 Elementary Cellular Automaton (ECA) emulating a Class 2 ECA, a Class 2 ECA emulating a Class 3 ECA, and a Class 3 ECA emulating a Class 2 ECA, among results of a similar type, including an exhaustive exploration of CA with neighbourhood range r = 3/2. All these emulations occur with only a constant overhead as a result of the block emulation technique and hence are computationally efficient. We also show that topological properties such as in- and out-hub degrees in the emulation network can be used as a finer-grained classification of behaviour under various complexity measures, suggesting a Turing-universality likelihood as a function of the number of emulated rules for a given block size, with results in agreement with current knowledge of the ECA rule space. We also put forward a definition of prime CA rules, in the sense of prime numbers, as basic constructors of the space of all rules under composition, proving that the whole ECA space can be constructed from a small subset of rules, none of which is a Wolfram Class 4 rule. The approach leads to a different topological perspective on the dynamic space, and on the computing, controlling and reprogramming capabilities of these simple computer programs.
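
    A minimal sketch of the block-emulation check under a simplified reading of the technique (cyclic configurations, a single test configuration, illustrative function names): each cell of the emulated rule is encoded as a fixed block of cells of the emulating rule, which is then run for a fixed number of steps per emulated step. Only the trivial self-emulation is verified below; finding nontrivial block codes is a search problem.

        # Block-emulation check for elementary cellular automata (ECA), simplified.
        def eca_step(rule, row):
            """One step of an elementary CA (rule 0-255) on a cyclic row of 0/1 cells."""
            n = len(row)
            return [(rule >> (4 * row[(i - 1) % n] + 2 * row[i] + row[(i + 1) % n])) & 1
                    for i in range(n)]

        def emulates(rule_a, rule_b, block0, block1, steps, test_row):
            """Check on one cyclic test configuration whether rule_a, under the given
            block encoding and constant time overhead, reproduces one step of rule_b."""
            encode = lambda row: [bit for cell in row for bit in (block1 if cell else block0)]
            coarse = eca_step(rule_b, test_row)        # target: one step of rule_b
            fine = encode(test_row)                    # encoded initial condition
            for _ in range(steps):                     # constant-overhead evolution of rule_a
                fine = eca_step(rule_a, fine)
            k = len(block0)
            decoded = []
            for i in range(0, len(fine), k):
                block = fine[i:i + k]
                if block == list(block0):
                    decoded.append(0)
                elif block == list(block1):
                    decoded.append(1)
                else:
                    return False                       # not a valid code word
            return decoded == coarse

        print(emulates(110, 110, (0,), (1,), 1, [0, 1, 1, 0, 1, 0, 0, 1]))   # trivially True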

  • Information Visualization in Neuroscience

    Jos Roerdink, University of Groningen, The Netherlands

    Many new insights into the workings of the human brain are nowadays obtained from neuroimaging data, produced by techniques such as Magnetic Resonance Imaging (MRI), Diffusion Tensor Imaging (DTI), functional Magnetic Resonance Imaging (fMRI), Positron Emission Tomography (PET), or Electroencephalography (EEG). I will discuss how information visualization methods can be used to gain insight into the structure and function of the brain. Particular attention will be paid to brain connectivity and connectomics. For example, multichannel EEG data or fMRI data can be used to study brain coherence networks, while fiber-tract visualization is possible based on DTI data. I will also discuss how the combination of machine learning and visualization can help to identify and understand brain patterns that are characteristic of certain neurodegenerative diseases.
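
    A toy sketch of the coherence-network idea, using synthetic multichannel signals and an arbitrary correlation threshold (illustrative only, not the methods discussed in the talk): pairwise Pearson correlations between channels are thresholded to obtain a functional connectivity graph.

        # Threshold pairwise channel correlations of synthetic signals to get a graph.
        import numpy as np

        rng = np.random.default_rng(0)
        n_channels, n_samples = 8, 1000
        common = rng.standard_normal(n_samples)                # shared underlying signal
        signals = 0.6 * common + rng.standard_normal((n_channels, n_samples))

        corr = np.corrcoef(signals)                            # channel x channel matrix
        threshold = 0.3
        edges = [(i, j) for i in range(n_channels) for j in range(i + 1, n_channels)
                 if abs(corr[i, j]) > threshold]
        print(len(edges), "edges above |r| >", threshold)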