Abstracts

Keynote: Spacetime microstructure, Gravity and the Cosmos
Thanu Padmanabhan  Inter-University Centre for Astronomy and Astrophysics, India
Spacetime behaves like a fluid, made of atoms of spacetime, and its dynamics can be obtained from the kinetic theory for these underlying microscopic degrees of freedom. This allows the interpretation of the field equations, as well as several geometrical variables, in a purely thermodynamic language. This is true for a large class of theories including, but not limited to, general relativity. The cosmological constant, which arises as an integration constant, can then be related to the cosmic information accessible to an eternal observer. This relation, in turn, allows the determination of its numerical value from the microstructure of the spacetime. In fact, this approach provides a refreshingly new perspective on cosmology and offers the possibility of testing quantum gravity in the sky.

Information Retrieval From Black Holes
Sumanta Chakraborty  Indian Association for the Cultivation of Science, India
It is generally believed that, when matter collapses to form a black hole, the complete information about the initial state of the matter cannot be retrieved by future asymptotic observers through local measurements. This is contrary to the expectation from a unitary evolution in quantum theory and leads to (a version of) the black hole information paradox. Classically, nothing apart from mass, charge and angular momentum is expected to be revealed to such asymptotic observers after the formation of a black hole. Semiclassically, black holes evaporate after their formation through Hawking radiation. The dominant part of the radiation is expected to be thermal, and hence one cannot learn anything about the initial data from the resultant radiation. However, there can be sources of distortions which make the radiation non-thermal. Although the distortions are not strong enough to make the evolution unitary, they carry some part of the information regarding the in-state. In this work, we show how one can decipher information about the in-state of the field from these distortions. We show that distortions of a particular kind, which we call non-vacuum distortions, can be used to fully reconstruct the initial data.

Information scan of quantum states based on entropy-power uncertainty relations
Petr Jizba  Faculty of Nuclear Sciences and Physical Engineering, The Czech Republic
In this talk I will use the concept of entropy power to derive a new one-parameter class of information-theoretic uncertainty relations for pairs of observables in an infinite-dimensional Hilbert space. This class constitutes an infinite tower of higher-order cumulant uncertainty relations, which in principle allows one to reconstruct the underlying distribution in a process analogous to quantum state tomography. I will illustrate the power of the new class by studying unbalanced cat states and Cauchy-type heavy-tailed wave functions that are of practical interest in quantum metrology. I will also briefly discuss the connection with generalized Cramér-Rao inequalities and de Bruijn's identity. Finally, I will try to cast some fresh light on the black hole information paradox.
Related works:
[1] P. Jizba, J. A. Dunningham and J. Joo, On the Role of Information Theoretic Uncertainty Relations in Quantum Theory, Annals of Physics 355 (2015) 87
[2] P. Jizba, J. A. Dunningham, A. Hayes and Y. Ma, One-parameter class of uncertainty relations based on entropy power, Phys. Rev. E 93 (2016) 060104(R)
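For background, the lowest member of such a tower is the standard entropy-power uncertainty relation, which follows from the Bialynicki-Birula-Mycielski entropic inequality (standard textbook definitions, not specific to the references above):

```latex
% Entropy power of a random variable $X$ with differential entropy $h(X)$:
N_X \;=\; \frac{1}{2\pi e}\, e^{2h(X)} .
% The Bialynicki-Birula--Mycielski inequality for conjugate position and momentum,
h(x) + h(p) \;\ge\; \ln(\pi e \hbar),
% exponentiates to the entropy-power form, saturated by Gaussian states:
N_x N_p \;=\; \frac{e^{2[h(x)+h(p)]}}{(2\pi e)^2}
\;\ge\; \frac{(\pi e \hbar)^2}{(2\pi e)^2} \;=\; \frac{\hbar^2}{4}.
```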

Invited: Unraveling black holes with gravitational waves
Chris Van Den Broeck  Van Swinderen Institute, the Netherlands
The ability to directly detect gravitational waves has opened up the possibility of empirically studying the detailed structure of black holes. In particular, a wealth of information is contained in the way a newly formed black hole oscillates ("ringdown"), which will enable an indirect test of the celebrated no-hair theorem. If a deviation is seen then this may be an indication of quantum effects near the horizon. Prompted by Hawking's information paradox, alternatives to standard black holes have been proposed, such as firewalls and fuzzballs, which would give rise to gravitational wave "echoes" long after the ringdown signal has died down. In the next few years, after the anticipated upgrades of the LIGO and Virgo detectors, such effects will become accessible, opening up a new chapter in the study of gravitation in the strong-field regime.

Keynote: title TBD
Erik Verlinde
TBD

Hints towards the Emergent Nature of Gravity
Manus Visser  University of Amsterdam, Netherlands
A possible way out of the conundrum of quantum gravity is the proposal that general relativity (GR) is not a fundamental theory but emerges from an underlying microscopic description. Despite recent interest in the emergent gravity program within both the physics and the philosophy community, an assessment of the theoretical evidence for this idea is currently lacking. We intend to fill this gap in the literature by discussing the main arguments in favour of the hypothesis that the metric field and its dynamics are emergent. First, we distinguish between microstructure inspired by GR, such as through quantization or discretization, and microstructure that is not directly motivated by GR, such as strings, quantum bits or condensed matter fields. The emergent gravity approach can then be defined as the view that the metric field and its dynamics are derivable from the latter type of microstructure. Subsequently, we assess to what extent the following properties of (semiclassical) GR are suggestive of underlying microstructure: (1) the metric's universal coupling to matter fields, (2) perturbative non-renormalizability, (3) black hole thermodynamics, and (4) the holographic principle. In the conclusion we formalize the general structure of the plausibility arguments put forward.

Testing Verlinde's Emergent Gravity using Gravitational Lensing
Margot Brouwer  Universities of Groningen and Amsterdam, The Netherlands
With his theory of emergent gravity, Verlinde claims to explain not only the mechanism behind gravity, but also the origin of the mysterious "extra gravity" found in and around galaxies, which is currently attributed to dark matter. We performed the first test of Verlinde's theory by measuring the spacetime curvature around more than 33,000 galaxies using weak gravitational lensing. To achieve this, we employed the 180-square-degree overlap of the Kilo-Degree Survey (KiDS) with the Galaxy And Mass Assembly (GAMA) survey. We compared the observed gravitational lensing effect around galaxies to Verlinde's prediction, which is based only on the normal mass of the galaxies, and found that they are in good agreement without requiring any free parameters.

Emergent Gravity as a Relic of Quantum Spacetime
Dawood Kothawala  Indian Institute of Technology Madras, India
Most approaches to the Emergent Gravity paradigm effectively have two key inputs: (i) an observer-dependent notion of causal horizons with thermal attributes, and (ii) the information content, characterized by the entropy of these horizons. I show how these attributes can arise naturally from the geometric structure of the gravitational Lagrangian of a quantum spacetime with a natural notion of zero-point length built into it. This suggests that the (semi)classical limit of quantum gravity is better described by thermodynamic functionals than by the conventional formalism based on the Einstein-Hilbert action principle.

Public talk: Cosmic History and Mysteries
Thanu Padmanabhan  Inter-University Centre for Astronomy and Astrophysics, India
The spectacular progress in observational cosmology in the last few decades has helped us understand the history of our universe, over a period spanning from a fraction of a second to billions of years. These observations have also shown that the current accelerated expansion of the universe demands the existence of yet another fundamental constant of nature, the cosmological constant. Understanding the origin of the cosmological constant is considered to be the greatest challenge in theoretical physics today. Prof. Padmanabhan will describe its features and a possible way forward towards a better understanding of our cosmos.

On the origin of physical information
Stefano Gottardi  Simbeyond B.V. Eindhoven, the Netherlands
Information may be the key concept for achieving a physical description of the Universe that goes beyond our current understanding of the laws of Nature. However, for such an important concept, we are still struggling to provide a simple, unique and comprehensive definition of information in physics. Is information physical? If so, where is it stored? Can we identify information with a conserved entity of physical reality? In an attempt to answer these questions, I am going to present a possible definition of physical information (p-information). The final goal is to construct a theory where p-information is the ontological description of physical states, the information ontological states (io-states), which encode the complete reality of what we now call particles and fields. I will discuss a description of reality where p-information is conserved and exchanged in quanta during interactions among io-states, and some of the implications of this description for our current understanding of quantum mechanics and quantum field theory.

How the universe computes
Stephen Anastasi  The Scots PGC College, Australia
Empirically founded methods (Bekenstein, Hawking, Lloyd) calculate that the information content of the universe, including dark matter and energy, is bounded by a finite number in the region of 10^122 bits. The presenter develops an a priori investigation of a general principle of equivalence, which produces a mathematical model of a necessarily evolving universe, for which the number of bits contained is calculated as ≈ 6.5×10^121, correlating to (t_univ/t_P)^2 + t_univ/t_P. Under the action of the initiating condition, structure, time and space are emergent, as is time's arrow. The form of evolution implies the second law of thermodynamics, and that entropy is epiphenomenal with respect to the arrow of time, as is dark energy with respect to universal acceleration. A core informational substructure of the model correlates to the Planck mass (m_P). If this correlation is faithful, the Planck mass equation should be modified to m_P = (t/t_now)√(ℏc/G). Accommodating experimental data relating to baryon number implies that the mass of baryons (at least) has increased over time, and that the very early universe was something like a Bose-Einstein condensate. It is conjectured that the implied low-mass condition and rapid expansion of space that would accompany a spherical cosmological surface increasing as (t_univ/t_P)^2 + t_univ/t_P would carry any emerging particles in a way that naturally overcomes gravity in the very early universe, essentially independent of its underlying cause. If the model is faithful, it would explain several cosmic conundrums, for example the initial low-entropy conditions.
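The quoted bit count follows directly from the stated formula; a quick numerical check (a sketch using rounded standard values for the age of the universe and the Planck time):

```python
# Order-of-magnitude check of the quoted bit count (illustrative; rounded values).
t_univ = 13.8e9 * 3.156e7   # age of the universe in seconds (~4.36e17 s)
t_P = 5.391e-44             # Planck time in seconds

# Number of bits according to the abstract's formula (t_univ/t_P)^2 + t_univ/t_P.
n = (t_univ / t_P) ** 2 + (t_univ / t_P)
print(f"{n:.2e}")           # on the order of 6.5e121 bits
```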

How do Inflationary Perturbations become Classical?
Shreya Banerjee  Tata Institute of Fundamental Research, Mumbai, India
The observed classicality of primordial perturbations, despite their quantum origin during inflation, calls for a mechanism for the quantum-to-classical transition of these initial fluctuations. As the literature suggests a number of plausible mechanisms which try to address this issue, it is important to seek concrete observational signatures of the various approaches in order to better understand early universe dynamics. Among these approaches, the spontaneous collapse dynamics of quantum mechanics is the most viable for leaving distinct observational signatures, as the collapse mechanism inherently changes the generic quantum dynamics. We observe in this study that the observables from the scalar sector, i.e. the scalar tilt n_s, the running of the scalar tilt α_s and the running of the running β_s, cannot distinguish a collapse-modified inflationary dynamics in the realm of canonical scalar field and k-inflationary scenarios. The only distinguishable imprint of the collapse mechanism lies in the observables of the tensor sector, in the form of a modified consistency relation and a blue-tilted tensor spectrum, and only when the collapse parameter δ is non-zero and positive.

The complexity and information content of (simulated) cosmic structures
Franco Vazza  Dipartimento di Astronomia, Università di Bologna
The emergence of cosmic structure is commonly considered one of the most complex phenomena in Nature. However, the level of complexity associated with the cosmic web has never been defined or measured in a quantitative and objective way. In my contribution I will present recent results (Vazza 2017 MNRAS) on measuring the information content of simulated cosmic structures, based on information theory. In particular, I will show how the statistical symbolic analysis of the data stream of state-of-the-art cosmological simulations can efficiently measure the bits of information necessary to predict the evolution of gas energy fields in a statistical way, also offering a simple way to identify the most complex epochs and mechanisms behind the emergence of observed cosmic structure on ~Megaparsec scales. In the near future, cosmic information theory will likely represent an innovative framework to design and analyse complex simulations of the Universe in a simple yet powerful way. I will also discuss how a quantitative framework to measure information and complexity in simulations has already allowed us a first (explorative yet quantitative) comparison between the cosmic web and other complex systems, such as the neuronal network of the human brain.
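The symbolic-analysis step can be illustrated with a toy sketch (a minimal construction of my own, not the paper's pipeline): map a continuous data stream to discrete symbols, then measure the Shannon entropy of symbol blocks, i.e. the bits needed to predict the stream statistically.

```python
import math
from collections import Counter

# Toy sketch of statistical symbolic analysis (illustrative, not Vazza 2017's method).
def symbolize(values, n_symbols=4):
    """Map continuous values to discrete symbols by equal-width binning."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_symbols or 1.0   # guard against a constant stream
    return [min(int((v - lo) / width), n_symbols - 1) for v in values]

def block_entropy(symbols, block=2):
    """Shannon entropy (bits) of length-`block` words in the symbol stream."""
    words = [tuple(symbols[i:i + block]) for i in range(len(symbols) - block + 1)]
    counts = Counter(words)
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())
```

A perfectly predictable stream has zero block entropy, while a maximally random binary stream costs one bit per symbol; complex epochs in a simulation would sit in between.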

Keynote: Information in the brain - from understanding to non-Turing computing
Karlheinz Meier  University of Heidelberg, Germany
The brain is a complex assembly of cells connected by a dynamic network of axons, dendrites and synapses. It is not a closed system but rather develops through continuous interaction with the environment and other brains. As a consequence, it has many features that we would like to see in computers. The brain does not require engineered algorithms but learns. The brain does not need reliable components but is fault tolerant. The brain is very energy efficient and solves problems by performing stochastic inference. Theoretical neuroscience is making good progress in uncovering the principles behind these features. At the same time, several research groups in academia and industry are working on transferring these principles to novel computer architectures, often referred to as "neuromorphic". The lecture will give an overview of this emerging field and discuss the development towards a more biologically derived AI.

Explaining information by spontaneous symmetry breaking in the dynamics of mental processes
Gabriel Crumpei  Alexandru Ioan Cuza University of Iasi, Romania
Nowadays, the main difficulty facing neuroscientists is the tendency to study only the neuronal, neuroglial and neurotransmitter structures. Yet starting from quantum theory, according to which every particle has a corresponding wave, one can observe strong spectral wave activity at every scale, from the smallest cell structures, the neurofibrils, up to cells, tissues and organs. This spectral wave component has been understudied, even though it is contained in the theories of quantum physics as well as in neurophysiological concepts, and it is only rudimentarily captured at the level of overall cerebral activity through EEG and EMG. This spectral component, associated with and related to the material, corpuscular one (the neuronal and non-neuronal structures of the brain), must be at least as important as the corpuscular part. Approaching the structure and activity of the brain from a spectral perspective allows the brain to be studied from the perspective of complex systems theory. One can try to identify the unstructured, chaotic, stochastic component, along with the structured, causal component with linear dynamics, as a dynamics between the two components in phase space, in which there is a permanent exchange of energy but also of information. If we come to accept this, then certain principles, properties and characteristics from nonlinear dynamics could be used to better understand mental processes.

Invited: How Much Coral Habitat Is There Globally? An Ecological Modelling and Remote Sensing Journey
Chris Roelfsema, Remote Sensing Research Centre, School of Earth and Environmental Sciences, St. Lucia, AU
Global coral reefs cover 1% of the oceans, yet they host 25% of marine species and 500 million people rely on these natural wonders; at the same time, reefs are being impacted by climate change. Reef conservation requires consistent habitat maps; however, there is no single, consistently created map describing geomorphic zonation (e.g. slope, flat, crest) or bottom composition (e.g. coral, algae, sand) over the full extent of the world's coral reefs. Such maps have not yet been produced because of the costs of mapping the extensive and mostly submerged reefs; hence we do not know the actual extent of coral habitat. This presentation introduces an approach that integrates ecological mapping with empirical modelling to characterise the geomorphic zonation and bottom composition of shallow reefs of the Great Barrier Reef (GBR). Our mapping approach was applied to 237 reefs on the GBR and combined: field data; a Landsat 8 OLI-derived mosaic and water depth (15 m pixels); slope and wave height; object-based analysis; and eco-geomorphological modelling. The methods and digital maps represent a significant advancement in our capability to map at a relatively fine scale over a large extent; hence the approach is now being adapted for application to all global coral reefs. The adaptation includes the use of higher-spatial-resolution Planet Dove imagery (3.7 m pixels), resulting in a significant increase in the size of the data sets and in processing resource needs. The geomorphic and bottom-type maps created for coral reefs globally will support management and science for the conservation of the GBR and other reefs worldwide.

Big data analysis of public datasets to improve the genetic diagnostic process
Patrick Deelen, University Medical Center Groningen, the Netherlands
DNA analysis of patients with a rare disease often yields dozens of candidate genetic variants that could potentially cause the disease. The final manual interpretation of these variants is a complex and time-consuming process. In order to streamline this process, we downloaded data from public repositories containing vast amounts of molecular measurements of gene activity across 30,000 samples. Large-scale analysis of this gene activity information allowed us to predict which genes are more likely to be involved in specific diseases or phenotypes. We show how we can use this information to improve the diagnostic process for patients with rare diseases.

GAVIN: Gene-Aware Variant INterpretation for medical sequencing
Joeri van der Velde, University Medical Center Groningen, the Netherlands
Around 1 in 17 people is affected by one of the 8,000 known rare diseases. Most of these patients do not receive a diagnosis, meaning they remain in uncertainty without a prognosis, are unable to join specific patient support groups, and do not receive the most appropriate treatment. Next-generation sequencing of DNA allows us to establish a molecular diagnosis and help these patients. However, we need to investigate an increasing number of known disease genes for an increasing number of patients. This requires genome diagnostics to rely more and more on computational estimation of how likely a mutation is to be disease-causing. Here, we present Gene-Aware Variant INterpretation (GAVIN), a new method that accurately classifies variants for clinical diagnostic purposes. Classifications are based on gene-specific calibrations of allele frequencies, protein impact, and evolutionary stability. In a benchmark we achieve a sensitivity of 91.4% and a specificity of 76.9%, unmatched by other tools. We are now incorporating machine-learning methods to improve on this further, so we can diagnose more patients.
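The idea of gene-specific calibration can be sketched as follows (hypothetical genes and threshold values for illustration only; these are not GAVIN's published calibrations):

```python
# Toy sketch of gene-calibrated variant classification (hypothetical thresholds).
gene_calibrations = {
    # gene: (max population allele frequency, min impact score)
    "BRCA2": (0.001, 25.0),
    "TTN":   (0.0001, 35.0),  # a large, variation-tolerant gene gets stricter cutoffs
}

def classify(gene, allele_freq, impact_score):
    """Return 'benign', 'pathogenic', or 'VUS' (variant of uncertain significance)."""
    max_af, min_impact = gene_calibrations.get(gene, (0.001, 30.0))
    if allele_freq > max_af:
        return "benign"        # too common in the population to cause a rare disease
    if impact_score >= min_impact:
        return "pathogenic"
    return "VUS"
```

The point of the per-gene table is that a single genome-wide threshold misclassifies variants in genes that tolerate much more (or much less) variation than average.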

Keynote: Making complex data speak
Ashish Mahabal  California Institute of Technology, United States
The big data problem is not so much a problem of volume or scale as it is of complexity arising from factors like diversity and sparsity. At times, representing the data to fit existing solutions can work wonders. This requires being able to abstract the problem, and sometimes to clothe it in a different garb. We provide examples of how large astronomical surveys looking for time-varying phenomena benefit from such approaches, e.g. to find transients or to classify variables. With good provenance and modeling, the same process of abstraction can be applied in a much wider realm through methodology transfer. We demonstrate this through parallels between transient astronomy (the study of outlying and rapidly varying phenomena) and personalized medicine (characterizing individuals through specifics and not just averages).

Learning What Questions to Ask About Dark Energy
Andrew Arrasmith  UC Davis Physics/Los Alamos National Labs, US
The nature of the dominant component of the cosmological energy density, dark energy, is one of the most intriguing puzzles of modern physics. While many models for dark energy have been proposed, such models are difficult to constrain with modern astrophysical data sets, leaving a great deal of ambiguity. Given these difficulties, we propose a different approach. Rather than attempting to constrain the vast space of proposed models, we present a method that uses machine learning to ask which possible features of dark energy can be constrained. Specifically, we begin with a fairly general (and thus high-dimensional) model for the dark energy equation of state and then learn a lower-dimensional approximation that preserves the most detectable features of the general model for a choice of datasets. These "most detectable" features are selected based on a combination of the prior we choose on the general model and the structure of the experimental noise in the included datasets. After demonstrating our method, we present the results of a Markov Chain Monte Carlo analysis of the reduced model we arrive at.
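The dimensional-reduction step can be illustrated with a toy sketch (my own construction under an assumed prior and noise shape, not the authors' pipeline): draw equation-of-state curves from a prior, weight deviations by per-bin detectability, and keep the leading principal components.

```python
import numpy as np

# Toy sketch: compress a binned dark-energy equation of state w(z) to the
# few modes best constrained by noisy data (illustrative assumptions throughout).
rng = np.random.default_rng(0)
n_bins, n_samples = 20, 1000

# Draw w(z) curves from a broad prior around w = -1.
w_samples = -1.0 + 0.3 * rng.standard_normal((n_samples, n_bins))

# Hypothetical per-bin noise: high-redshift bins are measured less well.
noise = 0.05 * (1.0 + np.linspace(0.0, 3.0, n_bins))

# Weight deviations from the prior mean by detectability, then take PCA via SVD.
weighted = (w_samples - w_samples.mean(axis=0)) / noise
_, s, vt = np.linalg.svd(weighted, full_matrices=False)

# Keep the modes carrying most of the detectable variance.
explained = s**2 / np.sum(s**2)
n_keep = int(np.searchsorted(np.cumsum(explained), 0.95)) + 1
reduced_basis = vt[:n_keep]   # rows span the low-dimensional approximation
```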

Invited: Photometric redshifts for large panchromatic surveys
Giuseppe Longo, University Federico II in Napoli, Italy

Detection of Ultra Diffuse Galaxies in the Fornax Deep Survey
Reynier Peletier, Kapteyn Institute, RUG, NL
We present preliminary results of a study of ultra-diffuse galaxies in the new, ultra-deep FDS galaxy survey of the Fornax Cluster, carried out as part of the SUNDIAL EU International Training Network. Ultra-diffuse galaxies (UDGs) are large galaxies with surface brightnesses much lower than those of ordinary galaxies. We discuss a number of automatic detection methods and compare them with visual detection. Based on this, we will draw conclusions about the properties of UDGs and what their origin could be.

MachineLearning Strategies for Variable Source Classification
Nina Hernitschek  Caltech, US
In the era of large-scale surveys, methods to investigate these data, and especially to classify sources, become more and more important. Using the example of the Pan-STARRS1 3π survey, a panoptic high-latitude survey in the time domain, we show how we explored the capabilities of this survey for carrying out time-domain science in a variety of applications. We use structure function fitting, period fitting and subsequent machine-learning classification to search for and classify high-latitude as well as low-latitude variable sources, in particular RR Lyrae, Cepheids and QSOs. As the survey strategy of Pan-STARRS1 3π led to sparse, non-simultaneous light curves, this is also a testbed for exploring the possibilities of investigating the sparse data that will occur during the first months or years of upcoming large-scale surveys, when only a fraction of the total observing is done.
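The period-fitting step for sparse, irregular sampling can be sketched with a Lomb-Scargle periodogram (a synthetic light curve with an assumed RR Lyrae-like period, not the Pan-STARRS1 pipeline):

```python
import numpy as np
from scipy.signal import lombscargle

# Toy sketch: period search in a sparse, irregularly sampled light curve,
# the kind of step that precedes machine-learning classification.
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 400.0, 80))   # sparse observation times (days)
true_period = 0.61                          # hypothetical RR Lyrae-like period (days)
mag = 15.0 + 0.4 * np.sin(2 * np.pi * t / true_period) \
      + 0.02 * rng.standard_normal(t.size)  # magnitudes with photometric noise

# Scan a grid of trial periods; lombscargle expects angular frequencies.
periods = np.linspace(0.2, 1.0, 20000)
ang_freqs = 2 * np.pi / periods
power = lombscargle(t, mag - mag.mean(), ang_freqs)
best_period = periods[np.argmax(power)]     # should land near true_period
```

Irregular (non-simultaneous) sampling actually helps here: it suppresses the regular aliases that plague evenly spaced cadences.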

Keynote: From quantum surprises to quantum devices
Lieven Vandersypen  Delft University of Technology, The Netherlands
Quantum computation has captivated the minds of many for almost two decades. For much of that time, it was seen mostly as an extremely interesting scientific problem that aims at exploring the limits of computation allowed by nature. In the last few years, we have entered a new phase as the belief has grown that a large-scale quantum computer can actually be built. In this talk, I will outline how the most fundamental aspects of quantum physics translate to very real laboratory advances with an increasing potential for delivering a breakthrough technology.

Invited: title TBD
Michael Walter

The signature of accelerated detectors in cavities
Richard Lopp  Institute for Quantum Computing, University of Waterloo, Canada
We examine whether the usual approximations of quantum optics are valid for studying the Unruh effect in a cavity, as these approximations have been applied in past literature to such scenarios. To this end, we consider the behaviour of an accelerated Unruh-DeWitt particle detector interacting with a quantum scalar field while travelling through a cavity in 3+1D. We thereby model a simplified version of the light-matter interaction between an atom and the electric field. We characterize the relativistic and non-relativistic regimes, and show that the energy is in general not localized in a small number of field modes, rendering it impossible to employ the single-mode approximation. Furthermore, we find that neither the massless nor the massive scalar field of a 1+1D theory is a satisfying approximation to the effective 1+1D limit. The ultimate objective is to study whether bombarding the cavity with a stream of accelerated atoms results in the accumulation of a significant signature in the field characteristic of the Unruh effect, thereby avoiding the need for thermalisation of the atoms.

Securing the Environment with Quantum Communication
M. Saravanan  Ericsson India Global Services Pvt. Ltd, IN
The pursuit of secure communication has been challenged by the power of quantum computing, which can break the digital signatures and the RSA protocol that generally establish private communication on the Internet today. IoT devices used for private interaction face a major threat, as they often control access to private property such as houses and cars. Interactive IoT devices have been designed that let the owner control the devices online. One example is the WAG collar [1], which allows the owner to access information from the pet's perspective. It comes with an installed camera that sends a live feed to the owner's app whenever the sensors in the collar detect an intrusion from the pet's perspective, such as an abnormal level of barking. These devices exchange information over a classical channel that is prone to information hacking by a classical or quantum adversary. In the following work we propose a quantum encoding protocol [2] that encodes the information sent over the classical communication channel so that an adversary, upon hacking, remains clueless about the content of the communication. The encoding mechanism uses only two qubits and functions as a layer of quantum encoding over the classical channel, so as to minimize computational overhead. A possible advantage of this encoding mechanism rests on a result in quantum cryptography known as the monogamy of entanglement. Monogamy ensures that the encoded message is only accessible to the possessor of the other qubit of the entangled pair. For example, the WAG collar records details about the location of the pet and about the pet's biological condition. The proposed encoding mechanism offers an approach to encode such associated information in a manner secure not only against classical intrusion but also against a quantum adversary.
Moreover, quantum deep learning can be employed to understand the reasons for barking, through autonomous processing of images and barking styles, and to intelligently set up an alert mechanism directly to the caretaker.
[1] https://www.wagz.com/
[2] Hong Lai, Ming-Xing Luo, Cheng Zhan, Josef Pieprzyk and Mehmet A. Orgun, "An improved coding method of quantum key distribution protocols based on Fibonacci-valued OAM entangled states", Physics Letters A, Volume 381, Issue 35, 18 September 2017, Pages 2922-2926

Keynote: Hidden Order in the Informational Structure of Life
Peter Sloot
Greek mythology tells of Prometheus, who stole fire from heaven to animate his clay men. My central conjecture is that what Prometheus stole was not fire but information, in the form of Gibbs free energy. This resulted in our complex world, with networks ranging from the zillions of molecules in the living cell to the billions of human individuals and countless living organisms that constitute our planet, all interacting in nonlinear and often unpredictable ways. In this talk I'll explore a new way to connect the dots, from the life-bringing free energy of the sun to the information stored and processed by living creatures.

Invited: Information from the cosmic dark ages
Pratika Dayal  Kapteyn Institute, Groningen, NL
Over the next few years, state-of-the-art facilities such as the James Webb Space Telescope, the Low Frequency Array (LOFAR) and the Square Kilometre Array (SKA) will yield petabytes of data from the "cosmic dark ages". Disentangling these diverse pieces of information into a global picture will require cutting-edge theoretical models. In this talk, I will highlight how we can combine galaxy data with 21 cm observations from these earliest epochs to shed light on the dark ages of our Universe.

Invited: Big data and visualization in the Jupyter Notebook
Maarten Breddels, Freelance developer / consultant, the Netherlands
With big astronomical catalogs such as Gaia, containing more than a billion stars, becoming common, we need new methods to visualize and explore these large datasets. Data volumes of this size require different visualization techniques, since scatter plots become too slow and meaningless due to overplotting. We solve the performance and visualization issue using binned statistics, e.g. histograms, density maps, and volume rendering in 3d. The calculation of statistics on N-dimensional grids is handled by a Python library called vaex, which I will introduce. It can process at least a billion samples per second, to produce, for instance, the mean of a quantity on a regular grid. Jupyter notebooks are becoming the default workspace for (data) science and are the main platform vaex is targeting. Using JupyterHub, users can now move their code to where the data is without changing their platform. However, no proper solution existed for interactively visualizing higher-dimensional data in the notebook. This led to the development of ipyvolume, which can render 3d volumes and up to a million points in the Jupyter notebook. It allows for sharing with colleagues, rendering on your tablet (paperless office), outreach and press-release material, and even full-dome and 360-degree video capture. The combination of vaex and ipyvolume in the Jupyter notebook is a good demonstration of the power of exploring big data in the Jupyter ecosystem.
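The binned-statistic idea behind this approach can be sketched in a few lines of plain NumPy (an illustration of the technique with synthetic data, not vaex's own API):

```python
import numpy as np

# Toy sketch of binned statistics: instead of a scatter plot of N samples,
# reduce them to a statistic (here the mean) on a regular grid.
rng = np.random.default_rng(2)
x = rng.uniform(0.0, 1.0, 1_000_000)        # e.g. a sky coordinate in [0, 1)
q = 2.0 * x + rng.standard_normal(x.size)   # quantity whose binned mean we want

shape = 64                                   # grid resolution
bins = np.floor(x * shape).astype(np.intp).clip(0, shape - 1)
sums = np.bincount(bins, weights=q, minlength=shape)
counts = np.bincount(bins, minlength=shape)
mean_on_grid = sums / counts                 # mean of q on a regular 1-d grid
```

The grid (64 numbers) can be plotted instantly regardless of how many samples went into it, which is why this scales where scatter plots do not.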

Title TBD
Edwin Valentijn

Title TBD
Bas Beukers

Title TBD
Steven Bosch

Keynote: Bending the Universe
Alan Heavens  Imperial Centre for Inference and Cosmology, England
Einstein's gravity bends the light from distant objects, so the picture we see is not a true one. The distortion pattern tells us about the mysterious ingredients of the Universe (dark matter and dark energy) and about the law of gravity itself. But how should we extract that information? The answer is to simultaneously determine both the distortion map and the properties of the Universe, a million-dimensional problem that needs sophisticated statistical and numerical analysis techniques to solve. In doing so, we learn where the dark matter is, and we can undo the distortion to reveal a truer picture of the Universe.

Euclid: a data-driven space mission
René Laureijs  ESA/ESTEC, NL
Successive astronomy missions of the European Space Agency with the objective of surveying large fractions of the astronomical sky exhibit a steady increase in output data rate, in step with the contemporary technical capabilities. Euclid (to be launched in 2022) will become ESA's champion in generating astronomical data, surpassing earlier missions such as Planck and Gaia. The Euclid space telescope has been designed to determine the properties of dark matter and dark energy in our universe with unprecedented precision. The experiment has been set up such that all imaging data are collected and sent to Earth for further analysis. Consequently, the Euclid database will contain far more information than required by the experiment, enabling other scientific investigations, which can cover nearly all areas of astronomy. We describe the Euclid experiment: the science drivers, design, and implementation of the mission. We concentrate on the particular spacecraft design drivers for the data collection and processing for Euclid. We will also put the Euclid development in the context of other scientific space missions, and we will address the areas of technology that either limit or support the increasing streams of data.

Invited: Cosmology and more with Euclid
Henk Hoekstra  Leiden University, NL
Euclid is an ESA M-class mission that will survey 15000 square degrees with HST-like resolution. It is scheduled for launch about 4 years from now and can be considered the z~1 equivalent of the highly successful SDSS. The main science driver of Euclid is to study the nature of dark energy, but it will also enable unprecedented tests of our understanding of gravity on cosmological scales, tell us more about the nature of dark matter and the initial conditions of the Universe, and constrain the combined mass of the neutrinos. It will also leave an amazing legacy of deep optical and NIR data that will impact many aspects of astronomy. In this talk I will provide an overview of the mission, focusing on weak lensing, one of the main probes that drives the design of this exciting project.

Invited: Lighting up Einstein's dark Universe
Alessandra Silvestri  Leiden University
This is an exceptional time for modern cosmology, when we can observe the universe with high precision and connect cosmological measurements with theory. The excitement about the advances of observational cosmology is accompanied by the awareness that we face some major challenges: we still lack compelling theoretical models for dark matter, which accounts for the formation of the structure we see around us, and for dark energy, which drives cosmic acceleration, as well as a deeper understanding of the mechanism that set up the primordial conditions. I will discuss the theoretical aspects of these challenges and our approaches to shedding light on them, with particular focus on dark energy.

Geometry of the Local Universe
Johan Hidding  Netherlands eScience Center / Kapteyn Institute, NL
Matter in the Universe is clustered; seen on the largest scales, galaxies group into walls, filaments and clusters. This Cosmic Web of structures shapes the environment in which galaxies form and merge. From the observed distribution of galaxies (including their redshifts) it is very hard to distil a three-dimensional impression of the nature of their environment. We used realisations (Hess et al. 2013) of cosmic initial conditions that were constrained to reproduce observations found in the 2MASS redshift survey (Huchra et al. 2012). To these initial conditions we applied the geometric adhesion model, giving a weighted Voronoi representation of the structures in the Local Universe (z < 0.03). We present visualisations of several specific known structures in the Local Universe, such as the Local Void, the Local Sheet, the Perseus-Pisces supercluster and the Fornax filament, showing their current geometry and past dynamical history.
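The adhesion model builds on the Zel'dovich approximation, in which particles move ballistically from their initial (Lagrangian) positions, x(q) = q + D·ψ(q), and which adhesion then regularises so that streams stick at shell crossing instead of passing through one another. A minimal one-dimensional sketch of that ballistic Zel'dovich step, using a toy sinusoidal displacement field rather than the constrained 2MASS realisations:

```python
import numpy as np

# Lagrangian (initial) particle coordinates on the unit interval
q = np.linspace(0.0, 1.0, 1000, endpoint=False)

# Toy displacement field: particles fall towards the overdensity at q = 0.5
# (illustrative choice, not real initial-condition data)
psi = np.sin(2 * np.pi * q) / (2 * np.pi)

# Growth factor D > 1 so that shell crossing has already occurred; this
# multi-stream region is what the adhesion model replaces by a sticky wall
D = 1.5

# Eulerian (evolved) positions
x = q + D * psi
```

Plotting a histogram of `x` shows the characteristic pile-up of particles around the collapse point, the one-dimensional analogue of the walls and filaments in the visualisations.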

The crumpling of primordial information in the cosmic spider web
Mark Neyrinck  University of the Basque Country, Bilbao, SP
The initial Universe was imprinted with a pattern of primordial density fluctuations, a pattern with easily quantified information. Gravity then used this pattern as a blueprint, folding and stretching it into the cosmic web of galaxies, filaments, walls and voids. We recently showed that the cosmic web has the same geometric properties as both architectural "spiderwebs" (nodes and strands between them that can be pulled entirely into tension) and origami tessellations. These ideas provide a convenient framework to quantify the loss of primordial information, or growth of a kind of entropy, in the Universe.

Invited: Bayesian Inference reconstruction of large-scale structure and Information Theory classification of the Cosmic Web
Florent Leclercq  Imperial Centre for Inference and Cosmology, London, UK
Recent developments of Bayesian large-scale structure inference technology naturally bring in a connection between cosmic web analysis and information theory. I will discuss the Shannon entropy of structure-type probability distributions and the information gain due to Sloan Digital Sky Survey galaxies, propose a decision criterion for classifying structures in the presence of uncertainty, and introduce utility functions for the optimal choice of a cosmic web classifier, specific to the application of interest. As showcases, I will discuss the phase-space structure of nearby dark matter, the discrimination of dark energy models from the cosmic web and an approach inspired by supervised machine learning for predicting galaxy colours given their large-scale environment.
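The two information-theoretic quantities named above can be made concrete for a distribution over four structure types (void, sheet, filament, cluster); the probabilities below are toy numbers, not the SDSS-constrained posteriors of the talk.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def info_gain(posterior, prior):
    """Information gain (Kullback-Leibler divergence) of posterior over prior, in bits."""
    return sum(po * math.log2(po / pr)
               for po, pr in zip(posterior, prior) if po > 0)

# Toy structure-type probabilities: (void, sheet, filament, cluster)
prior = [0.25, 0.25, 0.25, 0.25]      # maximally uncertain: 2 bits of entropy
posterior = [0.05, 0.10, 0.15, 0.70]  # the data favour 'cluster'
```

One possible decision criterion in the presence of uncertainty, in this hedged sketch, is to commit to the most probable type only when the information gain exceeds a chosen threshold, and report "undecided" otherwise.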

Past, Present and Future in cosmology
Daan Meerburg  Cambridge / RUG
In a period of 40 years, observational cosmology has transformed from backyard science led by a small number of people into huge experimental efforts, sometimes counting thousands of members. Initially, big steps were made in our fundamental understanding of the Universe, which has made cosmology the most important tool to understand the origin and fate of our Universe. However, observationally we will eventually be limited by our current horizon and our ability to build experiments that can explore the past light cone of the Universe. I will discuss the big discoveries in cosmology, with a special emphasis on the Cosmic Microwave Background (CMB). I will summarise current experimental efforts (Planck) and discuss the latest results derived from observations of the CMB. The future of cosmology will revolve around very large collaborative efforts which will unite a large fraction, if not all, of the scientists working in a particular field. Again, as an example I will discuss the efforts in the CMB community, which aim to build an instrument with one million detectors in the next 10 years. In parallel, I will show that these huge efforts offer fewer guaranteed results, precisely because we are reaching the limits of the information contained within the observable Universe.