Processing the vast quantities of data produced by the SKA will require very high performance central supercomputers capable of around 100 petaflops (10^17 floating-point operations per second). This is about 50 times the performance of the most powerful supercomputer in 2010, and equivalent to the combined processing power of about one hundred million PCs.
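The "one hundred million PCs" comparison follows from a back-of-envelope calculation; the per-PC figure below is an assumption (roughly 1 gigaflops sustained for a typical 2010-era desktop), not an official SKA number:

```python
# Rough scale check: how many ~2010-era PCs equal 100 petaflops?
ska_flops = 100e15   # 100 petaflops = 1e17 floating-point operations per second
pc_flops = 1e9       # ~1 gigaflops sustained per desktop PC (assumed figure)

equivalent_pcs = ska_flops / pc_flops
print(f"{equivalent_pcs:.0e} PCs")   # ~1e8, i.e. about one hundred million
```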
Science processing for standard radio astronomy imaging consists of several fundamental steps:
- Removing data that has been corrupted by interference or faults in the system.
- Calibrating each antenna’s signal to remove the effects of instrumental variation and variations in the line-of-sight propagation of the radio signal.
- Transforming the data onto a rectangular grid in what radio astronomers call the “u-v plane” – this is like interpolating a few randomly scattered altitude measurements onto a regular map grid to estimate the altitudes at all grid intersections.
- A mathematical calculation called a Fourier transformation to convert the data into a representation of the object’s image in the sky.
- A further calculation called “deconvolution of the point-response function of the array” to remove the radio equivalent of the spikes around bright stars in an optical image.
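Steps three and four above can be sketched in a few lines of numpy. Everything here is illustrative (the grid size, sample count, and nearest-cell gridding scheme are assumptions, far cruder than real radio-astronomy software), but it shows the shape of the computation: scattered u-v samples are placed on a regular grid, and an inverse Fourier transform turns the gridded data into a "dirty" image of the sky:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64                                     # image/grid size (illustrative)
nsamp = 200                                # number of u-v samples (illustrative)

# Scattered measurements in the u-v plane: coordinates plus complex visibilities
u = rng.integers(0, n, nsamp)
v = rng.integers(0, n, nsamp)
vis = rng.normal(size=nsamp) + 1j * rng.normal(size=nsamp)

# Step 3: grid the scattered samples onto a regular grid.
# Real pipelines use convolutional gridding; nearest-cell is the crudest form.
grid = np.zeros((n, n), dtype=complex)
np.add.at(grid, (v, u), vis)               # accumulate samples falling in each cell

# Step 4: Fourier-transform the gridded data into a "dirty" image.
dirty_image = np.fft.fftshift(np.fft.ifft2(grid)).real
print(dirty_image.shape)                   # (64, 64)
```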
These steps must be done for thousands of separate frequency ranges, and in real time. The steps are combined in iterative loops that typically involve refining estimates of array parameters – such as complex gains – while concurrently creating images that converge towards the transformed observed data. Buffer memory is required to store interim processing results while the processing loops are being executed. The science processing facility will also have large data storage sub-systems. The end results of the converged image processing form the basis of the final astronomical images that are distributed to astronomers and physicists around the world.
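The deconvolution step described above is itself iterative. A minimal sketch, assuming an idealised point-spread function (PSF) the same size as the image, is a Högbom-style CLEAN loop: repeatedly find the brightest residual pixel, subtract a scaled, shifted copy of the PSF there, and accumulate the subtracted flux into a model image. This is a textbook toy, not the SKA pipeline:

```python
import numpy as np

def hogbom_clean(dirty, psf, gain=0.1, niter=50):
    """Toy Högbom-style CLEAN: iteratively subtract scaled copies of the
    PSF at the brightest residual pixel, building up a model image."""
    residual = dirty.copy()
    model = np.zeros_like(dirty)
    centre = np.array(psf.shape) // 2
    for _ in range(niter):
        peak = np.unravel_index(np.argmax(np.abs(residual)), residual.shape)
        flux = gain * residual[peak]           # take only a fraction per pass
        model[peak] += flux
        shift = tuple(np.array(peak) - centre) # move PSF centre onto the peak
        residual -= flux * np.roll(psf, shift, axis=(0, 1))
    return model, residual

# Idealised test case: a delta-function PSF and a single point source.
psf = np.zeros((32, 32)); psf[16, 16] = 1.0
dirty = np.zeros((32, 32)); dirty[10, 20] = 1.0
model, residual = hogbom_clean(dirty, psf)
```

After 50 passes at 10% gain the residual at the source position has fallen to 0.9^50 (about 0.5% of the original flux), with the rest transferred to the model.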
The supercomputers will use millions of processors operating in parallel. To speed up some types of processing, the processors are likely to be assisted by specialised hardware. One of the software challenges for the SKA will be to adapt algorithms to operate on these new types of architectures.
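One reason this workload parallelises well is that the thousands of frequency channels are largely independent, so per-channel processing maps naturally onto many processors. The sketch below uses a thread pool on one machine purely for illustration (the channel count and the `process_channel` stand-in are hypothetical); the real system would distribute this work across many nodes and specialised accelerators:

```python
from concurrent.futures import ThreadPoolExecutor

import numpy as np

def process_channel(channel):
    """Stand-in for one channel's calibrate-and-image work (hypothetical)."""
    rng = np.random.default_rng(channel)   # placeholder for that channel's data
    vis = rng.normal(size=1024)
    return channel, float(np.abs(vis).mean())

# Each channel is independent, so they can be mapped across workers freely.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = dict(pool.map(process_channel, range(64)))

print(len(results))   # 64 channels processed
```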
Several other types of software developments will also be required. These include:
- Observation preparation and scheduling, which are independent of the data-pipeline and data-reduction processing.
- Real-time monitoring and control (albeit with relatively long time constants).
- Other time-critical online data reduction and archiving, in addition to the processing described above.
The project will also require middleware software and tools for ongoing software development.