
Science Overview

This supercomputer supports world-leading research in cosmology and exoplanets

COSMOS consortium

Cosmologists require supercomputers to enable them to calculate what their theories of the early universe predict and to test them against observations of the present universe.

The UK Computational Cosmology Consortium (UK-CCC) was forged in 1996 in recognition of the need for high performance computing resources to advance cosmological research and to maintain the UK's leading role in this highly competitive field. HEFCE infrastructure funding in 1997 made possible the purchase of the world's first national cosmology supercomputer - COSMOS - which has now been operating successfully for seven years through five major upgrades in a longstanding collaboration with SGI.

The consortium consists of the major UK groups studying particle cosmology and the cosmic microwave sky with participation from eight Universities, covering ten separate departments. The lead institutional contributors for the present COSMOS (mark VII) are the Universities of Cambridge, Central Lancashire, Portsmouth and Sussex and Imperial College, London.

The present COSMOS investigators are listed on the COSMOS team page. A key strength of the consortium is its interdisciplinary nature which brings together those with interests ranging from the particle physics of the early universe to late galaxy evolution, allowing the UK-CCC to develop a seamless history of the Universe from the earliest moments after the Big Bang through to the present day.

The COSMOS consortium's current programme of research aims to advance our understanding of the origin and structure of our universe, primarily through the scientific exploitation of the cosmic microwave sky. Our interdisciplinary scientific goals fall into three broad categories:

  1. Developing data analysis techniques and software pipelines to extract cosmological information from the CMB.
  2. Characterising the fundamental nature of the primordial perturbations from which the structure in our universe formed.
  3. Understanding the non-equilibrium dynamics of the early universe, particularly phase transitions and inflation.

As a result of HEFCE SRIF funding, the consortium has purchased an SGI Altix 3700 computer with 152 Intel Itanium2 CPUs, 152 GB globally shared memory, and 10 TB storage (later upgraded to the current Altix 4700 with 459 GB RAM). COSMOS configuration details and those of other consortium servers are detailed on the hardware page. However, we note here that this is a state-of-the-art system combining fast floating-point chips with a highly scalable architecture capable of running both OpenMP and MPI parallel applications. The choice of a shared-memory platform has always been driven by the consortium's need for ease of use and rapid time to solution in this fast-moving field.

Below we briefly highlight some of the key areas in which consortium members are working, using updated excerpts from the original HEFCE proposal.

Cosmic Microwave Sky

Observations of the cosmic microwave background (CMB) radiation from the ground, balloons and space will revolutionise our understanding of the universe by providing precise measurements of primordial fluctuations generated in the very early Universe. The data will allow clean tests of fundamental physics applicable at energies over 10 orders of magnitude higher than those achievable in particle accelerators.

Recent CMB experiments such as WMAP have provided strong evidence in favour of the simplest type of primordial perturbations, and a nearly flat Universe. Future experiments will substantially improve on the accuracy of these measurements, definitively answering the question of the form of the primordial perturbations and accurately measuring the basic parameters describing the geometry, constitution and expansion of the Universe. They will also allow us to scan our past light cone for relics of unification physics such as cosmic strings or other signatures of higher dimensions.

In addition, precision measurements of the CMB anisotropies on arcminute angular scales will provide key astrophysical information on the large-scale distribution of matter (via gravitational lensing), the formation of the first stars in the Universe (via signatures of reionization) and the evolution of galaxy clusters (detected via the Sunyaev-Zel'dovich effect).

The Planck Surveyor satellite is an approved ESA mission scheduled for launch in early 2007. It will scan the whole sky over a broad range of frequencies with unprecedented angular resolution and sensitivity. The data analysis for Planck is the responsibility of two international consortia, one for the low frequency instrument (LFI) and one for the high frequency instrument (HFI). The latter will produce the highest resolution and highest sensitivity maps for both temperature and polarisation.

Three Cambridge departments (Cavendish, DAMTP and IoA) have formed the Cambridge Planck Analysis Centre (CPAC) bringing together a wide range of expertise in theoretical astrophysics, cosmology and data analysis. Within the Planck HFI consortium, CPAC has been allocated the primary responsibility for the high level data analysis (Level 3). This work includes removal of contaminant foregrounds such as galactic dust and extragalactic point sources, and production of cleaned real space maps and power spectra of the primordial temperature and polarization anisotropies. This extremely computationally intensive development work is being undertaken on the central COSMOS supercomputer as well as on remote consortium servers at the Cavendish and IoA.
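The foreground-removal step described above can be illustrated with a toy linear component-separation model: if each frequency channel observes the CMB plus a dust foreground with a known frequency scaling, the two components can be recovered pixel by pixel by least squares. Every name and number below is an illustrative assumption, not the CPAC Level 3 pipeline:

```python
import numpy as np

# Illustrative channel frequencies (GHz) and an assumed power-law
# dust emissivity; real foreground models are far more sophisticated.
freqs = np.array([100.0, 143.0, 217.0, 353.0])
dust_scaling = (freqs / 353.0) ** 1.6

# Mixing matrix: column 0 = CMB (achromatic in these units),
# column 1 = dust (scales with frequency).
mix = np.column_stack([np.ones_like(freqs), dust_scaling])

# Simulate noiseless maps: 5 pixels with true CMB and dust amplitudes.
rng = np.random.default_rng(3)
cmb_true = rng.normal(size=5)
dust_true = rng.normal(size=5)
maps = mix @ np.vstack([cmb_true, dust_true])  # shape (4 channels, 5 pixels)

# Least-squares unmixing recovers both components exactly in the
# noiseless case, since the mixing matrix has full column rank.
components, *_ = np.linalg.lstsq(mix, maps, rcond=None)
cmb_est, dust_est = components
```

With noise added, the same linear solve returns the maximum-likelihood component amplitudes for white noise; point-source removal and power-spectrum estimation are separate, much heavier steps.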

Planck consortium members centred around Imperial College have formed the London Planck Analysis Centre (LPAC). They are collaborating with the IAS (Paris) on Level 2 data analysis, that is, the production of maps of the microwave sky from time-ordered data of circular scans by the Planck satellite. Like CPAC, this work again depends on the COSMOS platform especially for data sets approaching realistic Planck scales.
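The Level 2 map-making problem has a simple limiting case worth sketching: for white, uncorrelated noise the maximum-likelihood map (A^T A)^{-1} A^T d reduces to averaging every time-ordered sample that falls in a given pixel. The sketch below is a hypothetical minimal binned map-maker, not LPAC's production code, which must handle correlated noise and realistic scanning:

```python
import numpy as np

def bin_map(tod, pix, npix):
    """Minimal binned map-maker for white noise.

    tod  : 1-D array of time-ordered samples d
    pix  : pixel index hit at each sample (a sparse pointing matrix A)
    npix : number of pixels in the output map
    """
    sky = np.zeros(npix)
    hits = np.zeros(npix)
    np.add.at(sky, pix, tod)   # accumulate samples into their pixels
    np.add.at(hits, pix, 1.0)  # count hits per pixel
    seen = hits > 0
    sky[seen] /= hits[seen]    # average; unobserved pixels stay 0
    return sky, hits

# Toy usage: a 4-pixel sky scanned twice with noiseless data.
true_sky = np.array([1.0, -2.0, 0.5, 3.0])
pix = np.array([0, 1, 2, 3, 0, 1, 2, 3])
tod = true_sky[pix]
recovered, hits = bin_map(tod, pix, npix=4)
```

For 1/f-correlated noise the inverse noise covariance no longer cancels and the full linear system must be solved iteratively, which is where realistic Planck-scale data sets demand the COSMOS platform.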

The full Planck analysis pipeline is scheduled for completion by 2005, so this is a period of intense activity for both CPAC and LPAC with an ongoing program of simulations of increasing complexity. Considerable effort on Planck has also involved developing simulation software in collaboration with IAS (Paris) and MPA (Garching). The simulations are being used to assess map reconstruction errors in both the temperature and polarisation maps resulting from the scanning strategy, while also providing a vital data analysis testbed.

Origin of Structure in the Universe

Establishing the fundamental character of the primordial perturbations in the Universe would be one of the great achievements in the history of science. It would lay the foundations for an understanding of the formation of galaxies, stars, planets and ultimately ourselves. It would also provide vital clues to the physical processes operating at the beginning of the Big Bang.

Our work in this area complements the other two programs of research. The primary aim is the accurate determination of predictions from the full range of theoretical models proposed for the origin of structure in the Universe. Over the coming decade we shall be engaged in a continuous process of developing theories and comparing them with observations. If none of the current theoretical paradigms fit the new data we shall be interested in developing others. If we discover a paradigm which fits the data we shall obtain invaluable clues as to the connection between structure formation in the Universe and fundamental physics.

The cosmic microwave sky will provide a data set of unparalleled scope and accuracy giving us a direct probe of the Universe before the epoch of nonlinear gravitational clustering. Because the initial perturbations were still small at that time, accurate calculations from basic theory are possible. Members of the COSMOS consortium pioneered many of the calculational methods and we seek to develop these in anticipation of the Planck data so that maximal theoretical understanding may be gained.

The microwave anisotropy pattern carries a great deal of information regarding the cosmological parameters - for example, the density of baryons, neutrinos, and hypothetical dark matter candidates, the geometry of the universe and a possible cosmological constant. Should one of the theories of the origin of structure be confirmed, a spin-off will be an accurate determination of the key cosmological parameters.

The cosmic microwave sky also offers us an unequalled possibility of searching our past light cone for exotic relics of Unification physics, produced in the extreme temperatures of the hot big bang and surviving as cosmological fossils in the current era. Members of the COSMOS consortium have pioneered the understanding of such relics and the computation of their distinctive signatures in the cosmic microwave sky.

Braneworld models in which our universe emerges as a three-dimensional membrane from a higher dimensional theory are gaining increasing popularity because of strong motivations from superstring theory. In particular, brane inflation models generically predict the formation of cosmic strings with distinct properties. These signatures, or others for the ekpyrotic braneworld model, offer the prospect of real observational tests of fundamental theory.

The Early Universe

A constellation of important questions about the evolution of the early universe has emerged in the last few years, questions that require the application of supercomputing resources for their answers. Among these are how the baryon asymmetry developed, how we can find evidence today that the universe went through symmetry-breaking phase transitions, and how we can find evidence for an inflationary epoch. Understanding phase transitions and post-inflationary dynamics ('preheating') requires powerful non-perturbative techniques in field theory, pre-eminently real-time lattice field theory.

For the foreseeable future, real-time lattice field theory means field theory in the classical approximation, which holds when temperatures and densities are high. We need to simulate large volumes of the Universe for long periods of time, to include important long-wavelength effects and to see the emergence of late-time behaviour. Furthermore, to study thermal phase transitions we also need to include the effect of high-energy particles with energies around that given by the temperature. Such simulations end up being effectively 6-dimensional, as we must include 2 dimensions of momentum space as well as the normal 3+1 dimensions of spacetime. Hence the need for supercomputing: this is if anything a more demanding problem than that facing the lattice QCD community.
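The classical approximation means the field equations can be integrated directly on a lattice. A toy 1+1-dimensional version makes the method concrete (the production simulations are effectively 6-dimensional, as noted above, and this assumed phi^4 potential and leapfrog scheme are illustrative, not the consortium's code):

```python
import numpy as np

def evolve(phi, pi, dt, dx, steps, lam=1.0):
    """Leapfrog (kick-drift) evolution of a classical 1-D scalar field
    with potential V = lam/4 * phi^4 and periodic boundary conditions."""
    for _ in range(steps):
        # Discrete Laplacian with periodic boundaries
        lap = (np.roll(phi, 1) - 2 * phi + np.roll(phi, -1)) / dx**2
        pi = pi + dt * (lap - lam * phi**3)  # kick: dpi/dt = lap(phi) - V'(phi)
        phi = phi + dt * pi                  # drift: dphi/dt = pi
    return phi, pi

def energy(phi, pi, dx, lam=1.0):
    """Total lattice energy: kinetic + gradient + potential terms."""
    grad = (np.roll(phi, -1) - phi) / dx
    return np.sum(0.5 * pi**2 + 0.5 * grad**2 + 0.25 * lam * phi**4) * dx

# Toy usage: evolve small random initial data and monitor the energy,
# which a symplectic integrator keeps bounded rather than drifting.
rng = np.random.default_rng(1)
N, dx, dt = 64, 1.0, 0.05
phi = 0.1 * rng.normal(size=N)
pi = np.zeros(N)
e0 = energy(phi, pi, dx)
phi, pi = evolve(phi, pi, dt, dx, steps=200)
e1 = energy(phi, pi, dx)
```

The leapfrog update is chosen because it is symplectic, so long classical evolutions stay stable; the real challenge at scale is the memory and communication cost of the extra momentum-space dimensions.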

A major goal is simulating in real time the phase transition in the electroweak theory, in order to compute the matter-antimatter asymmetry that results. Current approaches, which mix perturbative and semiclassical techniques, have all but failed as the Higgs particle refused to appear at LEP, thus putting the electroweak transition into the realm of non-perturbative physics. Hence in order to calculate the baryon asymmetry we will need supercomputing.

A more far-reaching goal is to understand the Universe further back in time, beyond the reach of conventional accelerator-based physics. For this we need to find evidence of early periods when the Universe was out of thermal equilibrium. Over the years attention has focussed on two classes of non-equilibrium behaviour: phase transitions and inflation.

Besides creating a baryon asymmetry, a phase transition can also leave behind topological defects: strings, monopoles and textures, which would survive until today and, if sufficiently energetic, be detectable either through their gravitational effects or through their decay products. After inflation, all the energy in the Universe is in the form of a coherently oscillating field with essentially zero momentum: there is a big gap in our understanding of how this energy changed into thermal energy. This transfer may have happened very rapidly, in a process known as preheating. The more rapidly it happens, the higher the temperature reached, and therefore the greater the likelihood of creating exotic relics such as topological defects, like the cosmic strings produced at the end of inflation.

The COSMOS collaboration is world-leading in the area of numerical particle cosmology. Its members were the first to produce massively parallel simulations of cosmic string networks forming and evolving, and the first to implement the Hard Thermal Loop (HTL) effective Lagrangian in a numerical simulation. In the next few years we expect significant advances in numerical simulations of the very early Universe, and we plan major contributions to the fields of baryogenesis, the formation and evolution of topological defects, and preheating and thermalisation after inflation.

Extra-solar planets and their atmospheres (ExoMol Consortium)

The ExoMol consortium is concerned with the observation and characterisation of extra-solar planets and their atmospheres, developing key numerical codes vital to this international endeavour.

The Universe started with a Big Bang synthesis of a few chemical elements that eventually led to self-replicating, competitive structures of molecules we call life. With ever more powerful telescopes, some of the oldest questions in science can now be addressed: Are there worlds beyond our solar system? Are they numerous or rare? How many of them have the right conditions for life? A scientific approach to these questions, however, needs to start with more fundamental questions such as “How do stars and planets form and evolve?” and “What are they made of?” The detection, observation and characterisation of substellar objects now play a key role in the schedules of many ground-based (e.g. the European Southern Observatory) and space-based (Hubble and Spitzer) telescopes and are major drivers for new telescope developments.

ExoMol science: The wealth of observational discoveries, however, is not well matched by theoretical advances in the same field: modelling the atmospheres of planetary and stellar objects, and hence deriving their chemical composition, temperature and other properties, is still impeded by the lack of fundamental data, especially in molecular spectroscopy. Extremely expensive new and forthcoming European-funded telescopes (Herschel, ALMA, ELT, JWST, SPICA) or proposed ones (THESIS) will only help to characterise such objects if a commensurate effort is made to secure the tools required to interpret the observations. ExoMol aims to provide the information needed to understand the physics and chemistry of astronomical bodies cool enough to form molecules in their atmospheres, such as, in rough order of priority: extrasolar planets, cool stars and planetary disks. ExoMol is a Miracle-supported project based at UCL and funded by the European Research Council. The team is led by Professor Jonathan Tennyson FRS and Dr Sergey Yurchenko, both internationally recognised experts in molecular physics, particularly in the methodologies required for computing the energies and spatial forms of the molecular quantum states associated with the rotational and vibrational motions of these systems.

ExoMol is an ever-evolving and growing database of molecular line lists that can be used for such spectral characterisation and simulation, and as input to atmospheric models of exoplanets, brown dwarfs and cool stars, as well as other models including those for combustion and sunspots. It has attracted approximately €2M of European funding and provided input for a 2007 Nature paper (Tinetti et al.) on the first detection of water molecules in an exoplanet atmosphere. A team of 11 people is involved in this research, supported by a European Research Council grant. One of the ultimate aims of the project is the pioneering prediction of rovibrational spectra of molecules with up to 10 atoms.

The main aim of the project is to consolidate existing databases of astrophysically important molecules such as H2O, ammonia and acetylene. Diagonalizing the larger matrices required (moving from 2×10^5-square matrices to 10^6-square) produces vast improvements in the accuracy and completeness of the numerical wavefunctions and eigenstate properties produced by the code. Without the capability to perform such calculations, information about many potentially important transitions, associated with very high-energy molecular dynamics, would be lost, and it would become impossible to use the 'absorption bands' built up by numerous transitions of this nature for reliable and accurate determination of compositional and physical information from spectroscopic data in the astrophysical environments mentioned above.
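The core numerical task here is the symmetric eigenproblem: diagonalising the rovibrational Hamiltonian gives the energy levels and wavefunctions from which line lists are built. A small-scale sketch using a random symmetric matrix as a stand-in (production runs at 10^6-square are far beyond a dense in-memory solve like this) shows the basic operation:

```python
import numpy as np

# Toy stand-in for a rovibrational Hamiltonian: a small real symmetric
# matrix. Production matrices are ~10^6-square and demand distributed,
# out-of-core eigensolvers rather than a dense numpy call.
n = 200
rng = np.random.default_rng(2)
h = rng.normal(size=(n, n))
h = 0.5 * (h + h.T)  # symmetrize so the spectrum is real

# eigh exploits symmetry; eigenvalues are returned in ascending order,
# with the corresponding eigenvectors as the columns of `states`.
energies, states = np.linalg.eigh(h)

# Check the eigendecomposition: H v_i = E_i v_i for every state.
residual = h @ states - states * energies
```

Transition intensities then come from matrix elements of the dipole operator between these eigenstates, which is why completeness of the high-lying eigenpairs matters for the absorption bands discussed above.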

Recently the ExoMol team produced new high-precision line lists for methane which should help the search for life on exoplanets. This received considerable media attention and you can read more about it on the COSMOS news page.