
Word of the week: HPC

The SKA is often mentioned with excitement among the HPC (High-Performance Computing) community, so we went and asked our expert, Chief Architect Tim Cornwell, what HPC means exactly: 

Circuit boards and racks in the JVLA correlator.  Credit: Brent Carlson

HPC describes the kind of computing used in science and engineering, where today’s research computers must be increasingly powerful to handle the huge amounts of data they have to process.

Specially designed “super” computers are used to perform calculations or to process data such as SKA observations. Usually the power of a supercomputer is measured in Floating Point Operations Per Second (flops) – basically the number of additions or multiplications a computer can do in a second.

An average laptop can probably do a few thousand million flops (a few Gigaflops). When fully built, the SKA will require a supercomputer capable of at the very least a few Exaflops. That’s a million, million, million flops!
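To get a feel for those numbers, here is a rough back-of-the-envelope comparison in Python. The laptop and SKA figures are simply the illustrative values quoted above, not measurements of any real machine.

```python
# Rough scale comparison between a laptop and the SKA's exascale requirement.
# The two figures below are the illustrative numbers quoted above, not
# measurements of any real machine.

laptop_flops = 5e9   # "a few thousand million" operations per second (a few gigaflops)
ska_flops = 2e18     # "a few exaflops": a few million million million operations per second

ratio = ska_flops / laptop_flops
print(f"The SKA's computer would be roughly {ratio:,.0f} times faster than the laptop.")

# How long would the laptop take to do one second's worth of SKA-scale computing?
seconds = ska_flops / laptop_flops
years = seconds / (60 * 60 * 24 * 365)
print(f"About {seconds:,.0f} seconds, i.e. roughly {years:.0f} years.")
```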

The SKA’s Central Signal Processor (CSP)

The CSP is the central processing “brain” of the SKA.  It converts the digitised astronomical signals detected by the SKA’s receivers (dish antennas and dipole, or “rabbit-ear”, arrays) into the vital information the Science Data Processor needs to make detailed images of the deep-space astronomical phenomena the SKA is observing.

The CSP will utilise the latest generations of high-speed digital processing chips, high-speed/high-capacity memory chips, high-capacity fibre communications, high-speed circuit boards, high-speed modelling software and electronic test equipment, and the latest in agile, robust, and intelligent software.

The challenge with the CSP is that it must process enormous amounts of data in real time, and in doing so produce enormous amounts of output to be consumed by the Science Data Processor. Located at remote sites, it must be designed to deliver the maximum science possible within a hard cost cap and an aggressive timeline.
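To give a flavour of the number-crunching involved, here is a toy Python sketch of the basic operation a correlator performs: multiplying together the digitised signals from a pair of antennas, frequency channel by frequency channel, and averaging over time. It only illustrates the principle; the signal sizes, noise levels and the simple FFT-based approach are assumptions made for the example, not the CSP’s actual design.

```python
import numpy as np

# Toy sketch of what a correlator does: turn each antenna's time stream into
# spectra, multiply the spectra from the two antennas together, and average
# over time. Signal sizes and noise levels here are made up for illustration.

rng = np.random.default_rng(42)

n_samples, n_channels = 4096, 64
sky = rng.normal(size=n_samples)                  # signal common to both antennas
ant1 = sky + 0.5 * rng.normal(size=n_samples)     # antenna 1: sky plus its own noise
ant2 = sky + 0.5 * rng.normal(size=n_samples)     # antenna 2: sky plus its own noise

def channelise(signal, n_channels):
    """The 'F' stage: split the time stream into blocks and FFT each block
    into frequency channels."""
    blocks = signal[: len(signal) // n_channels * n_channels].reshape(-1, n_channels)
    return np.fft.rfft(blocks, axis=1)

spec1 = channelise(ant1, n_channels)
spec2 = channelise(ant2, n_channels)

# The 'X' stage: multiply one spectrum by the conjugate of the other and
# average over time, giving one complex 'visibility' per frequency channel
# for this antenna pair.
visibilities = (spec1 * np.conj(spec2)).mean(axis=0)

print(visibilities.shape)          # (33,): one value per channel
print(np.abs(visibilities[:5]))    # correlated power in the first few channels
```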

The SKA’s Science Data Processor (SDP)

The SDP will have to manage the vast amounts of data generated by the telescopes, from sky and continuum surveys through to more targeted observations of objects both near and far. It will take that data, move it through processing pipelines at staggering speeds to form data products that are then passed to scientists, and, in almost real time, make decisions about noise that is not part of those delicate radio signals.

SDP is the part of the SKA system that will accept the data output from the correlator/beam-former and deliver final calibrated data products to scientists. To achieve the fidelity and high dynamic range required, the SDP system will have to employ sophisticated radio frequency interference mitigation, calibration and imaging algorithms, while the sustained input data rates will be in the hundreds of gigabits per second and will need to be processed in soft real time.
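As a small, hedged illustration of one of those SDP tasks, the sketch below flags samples whose amplitude sits far above the surrounding noise – a crude stand-in for radio frequency interference mitigation. The simulated data, the threshold and the method are assumptions chosen for the example; the real SDP algorithms are far more sophisticated.

```python
import numpy as np

# Toy sketch of threshold-based RFI flagging: mark samples whose amplitude is
# far above the typical noise level so they can be left out of imaging.
# The data and the 5-sigma-style cut are assumptions made for this example.

rng = np.random.default_rng(0)

data = rng.normal(0.0, 1.0, size=10_000)        # simulated noise-like amplitudes
data[rng.integers(0, data.size, 20)] += 50.0    # inject a handful of strong RFI spikes

# Flag anything more than about five standard deviations from the median,
# using the median absolute deviation as a robust estimate of the noise.
median = np.median(data)
mad = np.median(np.abs(data - median))
flags = np.abs(data - median) > 5 * 1.4826 * mad

print(f"Flagged {flags.sum()} of {data.size} samples as likely RFI.")
clean = data[~flags]                             # what would be passed on to imaging
```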