Posts Tagged ‘supercomputer’

CAT Scan Of Nearby Supernova Remnant Reveals Frothy Interior

February 9, 2015

Image: NASA, ESA, and the Hubble Heritage Team

Cassiopeia A, or Cas A for short, is one of the best-studied supernova remnants in our galaxy, but it still holds major surprises. Harvard-Smithsonian and Dartmouth College astronomers have generated a new 3-D map of its interior using the astronomical equivalent of a CAT scan. They found that the Cas A supernova remnant is composed of a collection of about a half dozen massive cavities – or “bubbles.”

“Our three-dimensional map is a rare look at the insides of an exploded star,” says Dan Milisavljevic of the Harvard-Smithsonian Center for Astrophysics (CfA). This research is being published in the Jan. 30 issue of the journal Science.

About 340 years ago a massive star exploded in the constellation Cassiopeia. As the star blew itself apart, extremely hot and radioactive matter rapidly streamed outward from the star’s core, mixing and churning outer debris. The complex physics behind these explosions is difficult to model, even with state-of-the-art simulations run on some of the world’s most powerful supercomputers. However, by carefully studying relatively young supernova remnants like Cas A, astronomers can investigate various key processes that drive these titanic stellar explosions.

Link To Full Story


A Simulation Of The Universe With Realistic Galaxies

December 29, 2014

Credit: J. Schaye et al. 2015

The simulations took several months to run at the “Cosmology Machine” in Durham and at “Curie” in Paris, among the largest computers used for scientific research in the U.K. and France, respectively. Astronomers can now use the results to study the development of galaxies from almost 14 billion years ago until now. The results will be published in Monthly Notices of the Royal Astronomical Society on 1 January.

For years, astronomers have studied the formation of galaxies using computer simulations, but with limited success. The galaxies that formed in previous simulations were often too massive, too small, too old and too spherical.

The galaxies formed in the EAGLE simulation (Evolution and Assembly of GaLaxies and their Environments) are a much closer reflection of real galaxies, thanks to strong galactic winds that blow away the gas supply needed for the formation of stars. EAGLE’s galaxies are lighter and younger because fewer stars form, and they form later. In the EAGLE simulation these galactic winds – which are powered by stars, supernova explosions and supermassive black holes – are stronger than in earlier simulations.

Link To Full Story

Supercomputer For Astronomy “ATERUI” Upgraded To Double Its Speed

November 29, 2014

The Center for Computational Astrophysics (CfCA) of the National Astronomical Observatory of Japan has upgraded its Cray XC30 supercomputer system, “ATERUI”. By introducing state-of-the-art CPUs, the theoretical peak performance increased from 502 Tflops to 1.058 Pflops, meaning ATERUI has made the leap to become a petaflops computer. The new ATERUI will expand the horizons for simulations to understand the Universe and astrophysical phenomena.
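The quoted figures themselves confirm the “double its speed” headline; a quick sketch of the arithmetic (unit conversion only, using the press release’s numbers):

```python
# Check the "doubled speed" claim from the quoted ATERUI peak-performance figures.
old_tflops = 502.0        # theoretical peak before the upgrade (Tflops)
new_pflops = 1.058        # theoretical peak after the upgrade (Pflops)

new_tflops = new_pflops * 1000.0   # 1 Pflops = 1000 Tflops
speedup = new_tflops / old_tflops  # ~2.1x, i.e. slightly more than double

print(f"{new_tflops:.0f} Tflops -> speedup of about {speedup:.2f}x")
```

So the new peak is just over twice the old one; note these are theoretical peaks, not sustained application performance.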

Data obtained by observations are snapshots of astronomical phenomena. To understand these data, we need to construct theories based on physics, and conduct experiments based on those theories. However, virtually no astronomical phenomena can be reconstructed in a laboratory due to the spatial and time scales involved. On the other hand, theoretical astronomy tries to understand astronomical phenomena by solving equations. In some cases, it is not easy to solve the equations by hand, so powerful computers assist astronomers.

Link To Full Story

A Turbulent Birth for Stars In Merging Galaxies

Using state-of-the-art computer simulations, a team of French astrophysicists has for the first time explained a long-standing mystery: why surges of star formation (so-called ‘starbursts’) take place when galaxies collide. The scientists, led by Florent Renaud of the AIM institute near Paris in France, publish their results in a letter to the journal Monthly Notices of the Royal Astronomical Society.

Stars form when the gas inside galaxies becomes dense enough to collapse, usually under the effect of gravitation. When galaxies merge, however, the collision increases the random motions of their gas, generating whirls of turbulence that should hinder its collapse. Intuitively, this turbulence should slow down or even shut down the formation of stars, but in reality astronomers observe the opposite.

The new simulations were made using two of the most powerful supercomputers in Europe. The team modelled a galaxy like our own Milky Way and the two colliding Antennae galaxies.

For the Milky Way type galaxy, the astrophysicists used 12 million hours of time on the supercomputer Curie, running over a period of 12 months to simulate conditions across 300,000 light years. For the Antennae type system, the scientists used the supercomputer SuperMUC to cover 600,000 light years; this time they needed 8 million hours of computational time over a period of 8 months. With these enormous computing resources the team were able to model the systems in great detail, resolving features only a fraction of a light year across.
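As a back-of-the-envelope check, the quoted CPU-hours and wall-clock times imply a sustained core count for each run. This is a rough sketch only: it assumes the CPU-hours were consumed evenly over the full period, which real batch jobs are not.

```python
# Rough estimate: average number of cores implied by the quoted CPU-hours,
# assuming continuous use over the quoted wall-clock time.
# (Assumption: 1 month ~ 730 wall-clock hours.)
HOURS_PER_MONTH = 730.0

runs = {
    "Milky Way on Curie":   (12e6, 12),  # (CPU-hours, wall-clock months)
    "Antennae on SuperMUC": (8e6, 8),
}

for name, (cpu_hours, months) in runs.items():
    cores = cpu_hours / (months * HOURS_PER_MONTH)
    print(f"{name}: ~{cores:.0f} cores on average")
```

Interestingly, both runs work out to roughly the same average core count (about 1,400), consistent with a sustained allocation of similar size on each machine.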

Link To Full Story

Press Release: Computers Beat Brainpower In Counting Stars

A team of University of Sydney astronomers has developed a new way to automatically classify huge numbers of astronomical objects, and to discover new, exotic ones almost as soon as they happen.

Massive torrents of raw data are now collected by telescopes on a daily basis, creating an urgent need to massively accelerate the reliable classification of millions of stars and galaxies, and to quickly highlight objects that might be new discoveries or that have unusual properties.

“Next generation telescopes like the Square Kilometre Array will produce enough raw data to fill up 15 million iPods every day,” said Kitty Lo, lead author of the research published in The Astrophysical Journal.

“It will be too much for humans to sift through, and this is where computer classification comes in,” said Ms Lo.

Link To Full Story And Video

Astrophysicists Launch Ambitious Assessment Of Galaxy Formation Simulations

December 11, 2013

One of the most powerful tools for understanding the formation and evolution of galaxies has been the use of computer simulations–numerical models of astrophysical processes run on supercomputers and compared with astronomical observations. Getting computer simulations to produce realistic-looking galaxies has been a challenge, however, and different codes (simulation programs) produce inconsistent results.

Now, an international collaboration led by astrophysicists at the University of California, Santa Cruz, aims to resolve these issues through an ambitious multi-year project named AGORA (Assembling Galaxies of Resolved Anatomy). AGORA will run direct comparisons of different codes using a common set of initial conditions and astrophysical assumptions. Each code treats some aspects of the physics differently, especially the way that energy from stars and supernovas is fed back into the simulated galaxies. The simulations are being run at the best resolutions currently possible, and they are using the same input physics as much as possible. The simulation results will be systematically compared with each other and against a variety of observations using a common analysis and visualization tool.

Link To Full Story
Link To Another Story

Team Led By University Of Leicester Sets New Record For Cosmic X-Ray Sightings

Scientists led by the University of Leicester have set a new record for the number of cosmic X-ray sources ever sighted, creating an unprecedented cosmic X-ray catalogue that will provide a valuable resource for astronomers exploring the extreme Universe.

The XMM-Newton Survey Science Centre, led by a team from the University of Leicester’s Department of Physics and Astronomy, used the University’s ‘ALICE’ supercomputer to help them produce a new X-ray catalogue, dubbed “3XMM”.

This new catalogue contains over half a million X-ray source detections, representing a 50% increase over previous catalogues, and is the largest catalogue of X-ray sources ever produced. This vast inventory is also home to some of the rarest and most extreme phenomena in the Universe, such as tidal disruption events – when a black hole tears apart and swallows a star, producing prodigious outbursts of X-ray emission.

Link To Full Story