# News

**Please subscribe to our RSS news feed.**

## John M. Stewart: Python for Scientists

John Stewart, a retired member of the DAMTP faculty, sadly died in November this year. He was a UK pioneer in numerical relativity, and you can read his obituary on the CTC webpages here.

Following his retirement in 2010, John wrote a book on the Python programming language entitled "Python for Scientists", published by Cambridge University Press. An extended version of the book is due to be published next year.

For those interested in Python, we give access here to two files.

The code samples include most of the snippets from the book that are four or more lines long, organised by the section of the book in which they occur.
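As a flavour of the style of snippet collected in these files, here is a short scientific Python example. It is a hypothetical illustration in the spirit of the book, not code taken from it: a classical fourth-order Runge-Kutta integrator applied to the simple harmonic oscillator.

```python
import math

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for the system y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def oscillator(t, y):
    """Simple harmonic oscillator x'' = -x, written as a first-order system."""
    x, v = y
    return [v, -x]

# Integrate from x = 1, v = 0 over many small steps.
t, y, h = 0.0, [1.0, 0.0], 0.01
for _ in range(600):
    y = rk4_step(oscillator, t, y, h)
    t += h
# The exact solution is x = cos(t), v = -sin(t); RK4 tracks it closely.
```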

## Numerical relativity reveals possible violation of cosmic censorship in higher dimensions

Numerical simulations of Einstein’s equations of general relativity have provided the first concrete evidence that the weak cosmic censorship conjecture can be violated in five-dimensional asymptotically flat spacetimes: the simulations show that singularities can generically form without being hidden behind a black hole horizon. Asymptotically flat spacetimes are the class considered most relevant for practical applications.

PhD students Markus Kunesch and Saran Tunyasuvunakool from the CTC, led by Pau Figueras from Queen Mary University of London, have successfully simulated the nonlinear dynamics of ‘black rings’ in five dimensions. These black holes, which are shaped like very thin rings, are subject to the so-called Gregory-Laflamme instability that gives rise to a series of ‘bulges’ connected by strings that become thinner over time. These strings eventually become so thin that they pinch off into a series of miniature black holes, similar to how a thin stream of water from a tap breaks up into droplets.

Black rings were identified as an exact solution to Einstein’s equations in five dimensions by Roberto Emparan and Harvey Reall in 2002. The solution is known to be linearly unstable under small perturbations, but this is the first time that their nonlinear dynamics have been successfully simulated using supercomputers. The evolution points towards the appearance of a ‘naked singularity’, which would cause the equations behind general relativity to break down outside of a black hole. The results are published in the journal Physical Review Letters.

General relativity underpins our current understanding of gravity: everything from the estimation of the age of the stars in the universe, to the GPS signals we rely on to help us navigate, is based on Einstein’s equations. In part, the theory tells us that matter warps its surrounding spacetime, and what we call gravity is the effect of that warp. In the 100 years since it was published, general relativity has passed every test that has been thrown at it, but one of its limitations is the existence of singularities, where Einstein’s equations break down mathematically.

Exact stationary black hole solutions generally contain a singularity at the centre, hidden behind an event horizon. Current theoretical expectation is that any singularity which forms as a result of dynamical physical processes must also be surrounded by a black hole horizon. This allows observers outside of black holes to use Einstein’s equations to predict spacetime behaviour indefinitely far into the future.

“As long as singularities stay hidden behind an event horizon, they do not cause trouble and general relativity holds: the ‘cosmic censorship conjecture’ says that this is always the case,” said study co-author Markus Kunesch, a PhD student at the CTC. “As long as the cosmic censorship conjecture is valid, we can safely predict the future outside of black holes. Because ultimately, what we’re trying to do in physics is to predict the future given knowledge about the state of the universe now.”

A singularity which exists outside of a black hole horizon is called a ‘naked singularity’. Without the horizon, information from a naked singularity can reach an arbitrary distance in space. The failure of Einstein’s equations at the singularity would therefore cascade into a failure to predict the future of the spacetime.

“If naked singularities exist, general relativity breaks down,” said co-author Saran Tunyasuvunakool, also a PhD student at the CTC. “It would no longer have any predictive power – it could no longer be considered as a standalone theory to explain the universe.”

The cosmic censorship conjecture was first formulated informally by Roger Penrose in 1969, and over the years the statement of the conjecture has been made mathematically precise. However, the result remains unproven. One way to gain insight is to study general relativity in higher dimensions and see if cosmic censorship still holds there. Failure of the conjecture in higher dimensions could help identify properties specific to four-dimensional spacetimes which make them particularly suitable for general relativity. The discovery of black rings in five dimensions led scientists to hypothesise that they could break up and give rise to a naked singularity.

Using the COSMOS supercomputer, the researchers were able to perform a full simulation of Einstein’s complete theory in five dimensions, allowing them to not only confirm that these ‘black rings’ are unstable, but to also identify their eventual fate. Most of the time, a black ring collapses back into a sphere, so that the singularity would stay contained within the event horizon. Only a very thin black ring becomes sufficiently unstable as to form bulges connected by thinner and thinner strings, eventually breaking off and forming a naked singularity. The resulting extreme geometry was one of the motivations for the development of the GRChombo adaptive mesh refinement numerical relativity code, along with theoretical improvements to existing gauge conditions.
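The idea behind adaptive mesh refinement is to concentrate grid resolution where the solution varies rapidly, for example around the thinning strings of a black ring. As a toy illustration (not GRChombo code, whose tagging criteria are far more sophisticated), a gradient-based cell-tagging pass in one dimension might look like this:

```python
import math

def tag_cells(field, dx, threshold):
    """Flag cells whose finite-difference gradient exceeds a threshold.

    In adaptive mesh refinement, only the flagged regions are then
    covered by a finer grid, concentrating resolution where needed.
    """
    tags = []
    for i in range(1, len(field) - 1):
        grad = abs(field[i + 1] - field[i - 1]) / (2 * dx)
        if grad > threshold:
            tags.append(i)
    return tags

# A toy field with a sharp feature in the middle of a smooth background.
n, dx = 100, 0.01
field = [math.tanh((i * dx - 0.5) / 0.02) for i in range(n)]
flagged = tag_cells(field, dx, threshold=10.0)
# Only the steep region around x = 0.5 gets flagged for refinement.
```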

“The better we get at simulating Einstein’s theory of gravity in higher dimensions, the easier it will be for us to help with advancing new computational techniques – we’re pushing the limits of what you can do on a computer when it comes to Einstein’s theory,” said Tunyasuvunakool. “If cosmic censorship doesn’t hold in higher dimensions, then maybe we need to look at what’s so special about a four-dimensional universe that means it does hold.”

*Adapted from an article originally published by the University of Cambridge, licensed under a Creative Commons Attribution 4.0 International License.*

**Reference:**

Pau Figueras, Markus Kunesch, and Saran Tunyasuvunakool, ‘End Point of Black Ring Instabilities and the Weak Cosmic Censorship Conjecture’, Physical Review Letters 116, 071102 (2016). DOI: 10.1103/PhysRevLett.116.071102

Below are some films of the simulations.

## COSMOS searches for magnetic monopoles

Artist's impression of a monopole-antimonopole pair production event in the MoEDAL experiment (Copyright: Heikka Valja / MoEDAL)

Magnets always have two poles, north and south... or do they? The existence of magnetic monopoles, elementary particles with a single magnetic north or south pole, was postulated in 1931 by Paul Dirac, who found that they are compatible with quantum mechanics and that their existence would explain the quantisation of electric charge. This July, Arttu Rajantie from the COSMOS consortium led an exhibit called Monopole Quest (http://moedal.web.cern.ch/MonopoleQuest) at the Royal Society Summer Science Exhibition, explaining the physics of magnetic monopoles and the MoEDAL experiment (Monopole and Exotics Detector at the LHC) at CERN, which is searching for them.

In spite of extensive searches, magnetic monopoles have not been found in experiments. If they exist, they should have been produced in the very early universe, and as stable particles they would have survived until today. In that case we would expect to find them in cosmic rays, and also indirectly through their astrophysical effects. The apparent absence of these relic monopoles, known as the monopole problem, was one of the original arguments for cosmological inflation. It should also be possible to produce monopole-antimonopole pairs in particle collider experiments; the new MoEDAL experiment (http://moedal.web.cern.ch/) at the Large Hadron Collider has been designed for this purpose. The experiment started operation in June 2015, and it will publish its first results soon.

To interpret the results of the MoEDAL experiment and derive constraints on theories, one needs reliable and accurate theoretical predictions. This is a major challenge because the Dirac quantisation condition shows that the magnetic charge of a monopole would be very strong, and therefore the usual perturbation theory techniques based on Feynman diagrams, which are used to describe the production of other types of particles, are not applicable.
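Concretely, the Dirac quantisation condition and the resulting coupling strength can be stated as follows (a standard textbook estimate in Gaussian units, included here for orientation):

```latex
e\,g = \frac{n\hbar c}{2}, \quad n \in \mathbb{Z}
\quad\Longrightarrow\quad
g_{\min} = \frac{\hbar c}{2e} = \frac{e}{2\alpha} \approx 68.5\,e ,
```

so that the magnetic analogue of the fine-structure constant, $\alpha_g = g^2/\hbar c = n^2/(4\alpha) \approx 34\,n^2$, is enormous compared with $\alpha \approx 1/137$, which is why a perturbative expansion in the magnetic coupling fails.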

Rajantie and his collaborators have been developing numerical lattice field theory simulation methods for calculating the properties and behaviour of magnetic monopoles in quantum field theory. By using Monte Carlo simulation algorithms together with topologically non-trivial “twisted” boundary conditions, they can go beyond the semiclassical approximation and calculate fully quantum mechanical observables. These simulations will be crucial for understanding the physics implications of the MoEDAL results.
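The group's production codes are far more involved, but the Metropolis update at the heart of any lattice Monte Carlo simulation can be sketched in a few lines. The following toy example uses a one-dimensional free scalar field with ordinary periodic boundary conditions; the "twisted" boundary conditions mentioned above would replace the simple wrap-around used here.

```python
import math
import random

def local_action(phi, i, value, m2=1.0):
    """Local contribution to a free scalar-field lattice action from site i."""
    left = phi[(i - 1) % len(phi)]    # periodic boundary conditions
    right = phi[(i + 1) % len(phi)]
    return (0.5 * ((value - left) ** 2 + (right - value) ** 2)
            + 0.5 * m2 * value ** 2)

def metropolis_sweep(phi, step=0.5):
    """One Metropolis sweep: propose local changes, accept with prob e^{-dS}."""
    accepted = 0
    for i in range(len(phi)):
        proposal = phi[i] + random.uniform(-step, step)
        dS = local_action(phi, i, proposal) - local_action(phi, i, phi[i])
        if dS < 0 or random.random() < math.exp(-dS):
            phi[i] = proposal
            accepted += 1
    return accepted

random.seed(1)
phi = [0.0] * 32          # start from the trivial configuration
for _ in range(200):      # thermalise with repeated sweeps
    metropolis_sweep(phi)
mean_sq = sum(x * x for x in phi) / len(phi)   # a simple observable
```

Averaging observables such as `mean_sq` over many well-separated configurations yields fully non-perturbative expectation values, which is what makes this approach viable where Feynman-diagram techniques are not.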

## COSMOS IPCC receives HPCwire Award at Supercomputing 2015 in Austin, Texas

The Stephen Hawking Centre for Theoretical Cosmology (CTC) at the University of Cambridge has been recognized in the annual HPCwire Readers’ and Editors’ Choice Awards, with the following honour:

- **Readers' Choice: Best Use of High Performance Data Analytics**

CTC operates the COSMOS hybrid shared-memory supercomputer, the largest shared-memory computer in Europe, which in 2014 was awarded the status of Intel Parallel Computing Centre (IPCC). The award was for the impressive many-core acceleration of the MODAL analysis pipeline, which offered new statistical insights from the Cosmic Microwave Background as observed by the ESA Planck Satellite. More information about the HPCwire awards can be found here.

The Readers’ Choice award was presented at the 2015 International Conference for High Performance Computing, Networking, Storage and Analysis (SC15), in Austin, Texas. These coveted annual HPCwire Awards are determined through a nomination and voting process with the global HPCwire community, as well as selections from the HPCwire editors. The awards are an annual feature of the publication and constitute prestigious recognition from the HPC community. They are revealed each year to kick off the annual supercomputing conference.

Please find below the video link to the Opening Plenary Session at SC’15 where Intel’s Senior Vice President, Diane Bryant, highlights work with CTC and the HPCwire award (starts at 4:00 minutes in):

https://www.youtube.com/watch?v=kuh5qzZI2HM&feature=player_embedded

Professor Paul Shellard, CTC Director said: “We are thrilled at the Centre for Theoretical Cosmology and COSMOS IPCC to have received this international award in high performance computing. It is recognition of a unique synergy that we have developed between world-leading researchers from the STFC DiRAC HPC Facility and industry-leading vendors like Intel and SGI which aims to get maximum impact from new many-core technologies for our data analytic pipelines. Dramatic speed-ups have been achieved for our Planck satellite analysis and other codes through a potent combination of new parallel programming paradigms and architectural co-design; these capabilities are opening up new windows on our Universe.”

For many years, STFC-funded scientists in Cambridge have operated COSMOS supercomputer systems with unique shared-memory capabilities in a longstanding collaboration with SGI, together with innovative new processor technology from Intel. In 2014, the CTC with COSMOS was named an Intel Parallel Computing Center, focusing on Xeon Phi porting and optimization efforts on its unique hybrid UV2000 system, co-designed with SGI for many-core acceleration.

This IPCC support coincided with Cosmic Microwave Background (CMB) measurements by the Planck satellite, which provided the first high-resolution temperature and polarization maps of the entire sky. The COSMOS IPCC team adapted the main workhorse non-Gaussian statistical correlation code, MODAL, which is designed to analyze very small CMB fluctuations in the Planck data and aims to provide insight into new physics theories about how structures formed in the Universe. This is a computationally daunting task, and a complete analysis of three-point correlations would have taken unfeasibly long to perform even on the largest supercomputers available to the researchers. The use of the hybrid UV2000 + Xeon Phi system, combined with the optimization and modernization effort of the COSMOS IPCC team, cut runtimes by factors of 100 to 1,000, which allowed the team to meet the tight ESA timescales for the analysis.

DiRAC is the integrated supercomputing facility for HPC-based research in particle physics, astronomy and cosmology, areas in which the UK is world-leading. Supported by the UK Government’s Large Facilities Capital Fund since 2009, the Science and Technology Facilities Council has invested in innovative DiRAC systems which match machine architecture to the requirements and algorithm design of the research problems to be solved. For the COSMOS supercomputer in Cambridge, DiRAC has worked with Intel and SGI to build a data analytics system based on heterogeneous CPU architectures, giving access to more efficient and powerful many-core Intel Xeon Phi chips. The flexible capability to offload detailed analysis functions to faster processors as and when needed greatly decreases the time needed to produce results. These developments offer a hardware and software blueprint for future systems for the detailed analysis of a wide range of datasets.

Tom Tabor, CEO of Tabor Communications, publisher of HPCwire, said “HPCwire readers are among the most informed in the HPC community and these awards are ultimately given to the organizations that are making the greatest impact in advancing technology and humanity itself through high performance computing. The HPCwire Readers’ and Editors’ Choice Awards send a strong message of support and appreciation from those in the global HPC community. We are proud to be able to recognize these efforts each year and our congratulations go out to all the winners.”

**Further highlights from Supercomputing 2015**

Other work by the COSMOS IPCC team from CTC is being highlighted on the SGI and Intel booths at SC’15. The team are working with Intel on the ray-tracing visualization package OSPRay, with demonstrations of realtime visualizations of a huge 10TB dataset on an SGI UV system. These are the largest cosmic wall simulations ever performed, with evolution beginning 200,000 years after the Big Bang and finishing today, some 14 billion years later; the aim is to determine the observable implications of walls for the cosmic microwave background. Their work with Intel on OSPRay was also demonstrated at ISC’15 in Frankfurt.

## Cosmic Microwave Background gives new insights into early universe

CTC scientists are refining our understanding of the formation of the early universe by analyzing the Cosmic Microwave Background (CMB) data captured by the Planck Satellite. The CMB is the relic radiation, or “first light”, released once the hot plasma left over from the Big Bang had cooled, about 300,000 years after the Big Bang. It is like looking at a snapshot of the Universe as it was over 13 billion years ago.

The CTC is home to COSMOS, a Xeon Phi enabled SGI UV2000 supercomputer and the largest shared-memory system in Europe. The COSMOS team at Cambridge has developed a new code called MODAL for analyzing the Planck data. This new code allows us to find and interpret the very tiny fluctuations that have been measured between different CMB data points on the sky. These fluctuations can provide insight into new physics theories about how structures like stars and galaxies formed in our Universe.

Our analysis of the Planck satellite data confirms a signal predicted by general relativity, but it also reveals other possible, as yet unexplained, signals which could tell us much more about the early universe. Below is a film which shows Dr Juha Jäykkä, one of the COSMOS team, giving a demonstration of the analysis at the 2015 International Supercomputing Conference in Frankfurt.

Members of our COSMOS team have also written up the new Planck data results in a chapter in the recently published book High Performance Parallelism Pearls, Volume Two: Multicore and Many-core Programming Approaches, which features our images on the front cover:

You can find out more here:

http://www.hpcwire.com/2015/08/24/cosmos-team-achieves-100x-speedup-on-cosmology-code/

and here: