Neural networks promise sharpest ever images

Published on Thursday, 23 February 2017

Telescopes, the workhorse instruments of astronomy, are limited by the size of the mirror or lens they use. Using 'neural nets', a form of artificial intelligence, a group of Swiss researchers now have a way to push past that limit, offering scientists the prospect of the sharpest ever images in optical astronomy. The new work appears in a paper in Monthly Notices of the Royal Astronomical Society.

The frames show an example of an original galaxy image (left), the same image deliberately degraded (second from left), the image after recovery with the neural net (second from right), and the image processed with deconvolution, the best existing technique (right). Credit: K. Schawinski / C. Zhang / ETH Zurich.


The diameter of its lens or mirror, the so-called aperture, fundamentally limits any telescope. In simple terms, the bigger the mirror or lens, the more light it gathers, allowing astronomers to detect fainter objects and to observe them more clearly. A concept from signal processing, the Nyquist sampling theorem, describes the resulting resolution limit, and hence how much detail can be seen.
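
To put rough numbers on this, the diffraction limit of a telescope aperture is commonly estimated with the Rayleigh criterion, theta ≈ 1.22 λ/D. The short Python sketch below is purely illustrative and not taken from the paper; the Hubble figures used (2.4 m mirror, 550 nm green light) are standard values.

    import math

    def rayleigh_limit_arcsec(wavelength_m, aperture_m):
        """Diffraction-limited angular resolution (Rayleigh criterion),
        theta = 1.22 * lambda / D, converted from radians to arcseconds."""
        theta_rad = 1.22 * wavelength_m / aperture_m
        return math.degrees(theta_rad) * 3600.0

    # Hubble's 2.4 m mirror observing green light (550 nm):
    theta = rayleigh_limit_arcsec(550e-9, 2.4)
    print(f"{theta:.3f} arcsec")          # ~0.058 arcsec
    # Nyquist sampling then requires pixels no larger than theta / 2.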

The Swiss team, led by Prof Kevin Schawinski of ETH Zurich, uses the latest machine learning technology to challenge this limit. The researchers teach a neural network, a computational approach that simulates the neurons in a brain, what galaxies look like, and then ask it to automatically turn a blurred image into a sharp one. Just like a human, the neural net needs examples, in this case a blurred and a sharp image of the same galaxy, to learn the technique.
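
Such training pairs can be produced synthetically by deliberately degrading a sharp image. Below is a minimal Python sketch of that step, assuming NumPy and SciPy, and using a simple Gaussian point spread function plus noise as a stand-in for the study's actual degradation pipeline:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(42)

    def degrade(sharp, psf_sigma=2.5, noise_sigma=0.01):
        """Blur an image with a Gaussian PSF and add noise, producing
        the 'blurred' half of a training pair."""
        blurred = gaussian_filter(sharp, sigma=psf_sigma)
        return blurred + rng.normal(0.0, noise_sigma, sharp.shape)

    sharp = rng.random((64, 64))        # stand-in for a real galaxy cutout
    pair = (degrade(sharp), sharp)      # (network input, training target)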

Their system uses two neural networks competing with each other, an approach known in the machine learning research community as a "generative adversarial network", or GAN. The whole training programme took just a few hours on a high-performance computer.
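
In a GAN, a "generator" network, here mapping a blurred image to a sharpened one, competes against a "discriminator" network that tries to tell generated images from real sharp ones. The following is a deliberately minimal PyTorch sketch of one adversarial training step; the study's actual architecture is much deeper, and its code is available via space.ml.

    import torch
    import torch.nn as nn

    # Generator: maps a blurred 64x64 image to a sharpened estimate.
    G = nn.Sequential(
        nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 1, 3, padding=1),
    )
    # Discriminator: scores an image as real (sharp) or generated.
    D = nn.Sequential(
        nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
        nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
        nn.Flatten(), nn.Linear(64 * 16 * 16, 1),
    )
    bce = nn.BCEWithLogitsLoss()
    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

    def train_step(blurred, sharp):
        """One adversarial update: D learns to separate real sharp images
        from the generator's output; G learns to fool D."""
        ones = torch.ones(sharp.size(0), 1)
        zeros = torch.zeros(sharp.size(0), 1)
        # Discriminator step (generator output detached).
        loss_d = bce(D(sharp), ones) + bce(D(G(blurred).detach()), zeros)
        opt_d.zero_grad(); loss_d.backward(); opt_d.step()
        # Generator step.
        loss_g = bce(D(G(blurred)), ones)
        opt_g.zero_grad(); loss_g.backward(); opt_g.step()
        return loss_d.item(), loss_g.item()

    # One step on a random batch of eight 64x64 blurred/sharp pairs:
    print(train_step(torch.rand(8, 1, 64, 64), torch.rand(8, 1, 64, 64)))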

The trained neural nets were able to recognise and reconstruct features that the telescope could not resolve, such as star-forming regions, bars and dust lanes in galaxies. To test performance, the scientists checked the reconstruction against the original high-resolution image, finding that it recovered features better than any technique used to date, including the 'deconvolution' approach used to improve images made in the early years of the Hubble Space Telescope.
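
For comparison, a classical deconvolution baseline such as the Richardson-Lucy algorithm can be run in a few lines with scikit-image. This sketch uses synthetic data and a toy point spread function; it is a generic illustration of the technique, not the study's exact comparison:

    import numpy as np
    from scipy.signal import convolve2d
    from skimage.restoration import richardson_lucy

    rng = np.random.default_rng(0)
    sharp = rng.random((64, 64))           # stand-in for a galaxy image
    psf = np.ones((5, 5)) / 25.0           # toy point spread function
    blurred = convolve2d(sharp, psf, mode="same")

    # Richardson-Lucy iteratively re-estimates the underlying image,
    # given the blurred observation and a known PSF.
    restored = richardson_lucy(blurred, psf)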

Schawinski sees this as a big step forward: "We can start by going back to sky surveys made with telescopes over many years, see more detail than ever before, and for example learn more about the structure of galaxies. There is no reason why we can't then apply this technique to the deepest images from Hubble, and the coming James Webb Space Telescope, to learn more about the earliest structures in the Universe."

Co-author Professor Ce Zhang, from the computer science department in the same university, also sees great potential: "The massive amount of astronomical data is always fascinating to computer scientists. But, when techniques such as machine learning emerge, astrophysics also provides a great test bed for tackling a fundamental computational question – how do we integrate and take advantage of the knowledge that humans have accumulated over thousands of years, using a machine learning system? We hope our collaboration with Kevin can also shed light on this question."

The success of the project points to a more "data-driven" future for astrophysics, in which information is learned automatically from data rather than encoded in manually crafted physics models. The work is hosted at ETH Zurich under space.ml, a cross-disciplinary astrophysics and computer science initiative, where the code is available to the general public.

 


Media contacts

Dr Robert Massey
Royal Astronomical Society
Mob: +44 (0)7802 877 699

 

Dr Morgan Hollis
Royal Astronomical Society
Tel: +44 (0)20 7292 3977

 


Science contacts

Prof Kevin Schawinski
Institute of Astronomy
ETH Zurich
Switzerland
Tel: +41 44 633 07 51
Mob: +41 79 647 11 56

 

Prof Ce Zhang
Systems Group, D-INFK
ETH Zurich
Switzerland
Tel: +41 44 632 75 29
Mob: +41 79 592 97 40

 


Further information

The new work appears in "Generative Adversarial Networks recover features in astrophysical images of galaxies beyond the deconvolution limit", Kevin Schawinski, Ce Zhang, Hantian Zhang, Lucas Fowler, and Gokula Krishnan Santhanam, Monthly Notices of the Royal Astronomical Society.

 

space.ml: from model-driven astrophysics to data-driven astrophysics

 


Notes for editors

The Royal Astronomical Society (RAS), founded in 1820, encourages and promotes the study of astronomy, solar-system science, geophysics and closely related branches of science. The RAS organizes scientific meetings, publishes international research and review journals, recognizes outstanding achievements by the award of medals and prizes, maintains an extensive library, supports education through grants and outreach activities and represents UK astronomy nationally and internationally. Its more than 4,000 members (Fellows), a third based overseas, include scientific researchers in universities, observatories and laboratories as well as historians of astronomy and others.

The RAS accepts papers for its journals based on the principle of peer review, in which fellow experts on the editorial boards accept the paper as worth considering. The Society issues press releases based on a similar principle, but the organisations and scientists concerned have overall responsibility for their content.
