Is the commercial use of HPC lagging in SA?

January 13th, 2014, Published in Uncategorised articles


South Africa is not lagging behind when it comes to the world of high performance computing. The Cape Town-based Centre for High Performance Computing (CHPC) has an impressive track record, although mostly in the sphere of academia.

When it comes to commercial applications there appears to be a lack of interest. Is that because the CHPC focuses more on research projects, such as its flagship project at the Centre for Space Research at North-West University in Potchefstroom with Prof. Marius Potgieter? Or is there a lack of interest from industry? Perhaps red tape prevents industry from utilising the facilities?

This flagship project studies the origins of cosmic rays and their propagation through the galactic and interstellar medium, where they encounter the heliosphere and eventually reach planet Earth. It focuses on the computational modelling of heliospheric physics using numerical codes, aiming to design, construct, link and expand numerical models that simulate the transport and acceleration of cosmic rays from their creation in the galaxy to their arrival on Earth. By studying cosmic rays, scientists may be able to forecast some aspects of global climate change influenced by these rays. A single millisecond pulsar (MSP) calculation takes about five hours on a fast desktop PC, which means that computations for a population of 70 MSPs would take approximately two weeks. In contrast, the same run on the CHPC cluster in Cape Town takes a single day. Investigations into the distribution of primary cosmic rays inside and outside galactic spiral arms have been shortened by several months, yielding faster conclusions.
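The timings quoted above can be checked with a quick back-of-the-envelope calculation. The per-pulsar time and population size come from the article; the number of concurrent cluster jobs is a hypothetical assumption chosen to illustrate how the two-week serial run shrinks to roughly a day:

```python
# Back-of-the-envelope check of the serial vs. cluster timings quoted above.
HOURS_PER_MSP = 5    # one millisecond-pulsar calculation on a fast desktop PC
POPULATION = 70      # number of MSPs in the study
PARALLEL_JOBS = 16   # hypothetical number of concurrent cluster jobs

serial_hours = HOURS_PER_MSP * POPULATION      # 350 h
serial_days = serial_hours / 24                # ~14.6 days, i.e. about two weeks
cluster_hours = serial_hours / PARALLEL_JOBS   # ~22 h
cluster_days = cluster_hours / 24              # just under one day

print(f"Serial run: {serial_hours} h ({serial_days:.1f} days)")
print(f"Cluster run with {PARALLEL_JOBS} jobs: {cluster_hours:.1f} h ({cluster_days:.2f} days)")
```

With any cluster able to run a dozen or more of these independent pulsar calculations at once, the "two weeks down to one day" figure is entirely plausible.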

The CHPC is certainly able to deliver in the academic arena.

Today business creates a huge amount of data, but is this data being put to good use or is it simply stored because we generate all this information? Jim Herring, director of IBM HPC products, says that it’s a daunting task for enterprises to determine the core value of this data, and it’s a major reason why most updated data-retention policies, if they exist, are conservative in their safekeeping practices. “All of this causes us to store more stuff longer or even indefinitely, but business analytics is changing this – and so are an increasing number of intense data-crunching practices, such as complex portfolio analysis in the financial industry. Once performed almost exclusively by research laboratories and universities, these practices are now making their way into business. Taken together, big data trends are making a compelling case for the adoption of high-performance computing (HPC) in enterprise data centres.”

HPC is the use of parallel processing to spread machine instructions across multiple processors so that advanced applications run efficiently, reliably and quickly. Applications include weather analytics and forecasting, scientific computations such as genome analysis, oil and gas exploration, financial portfolio analysis, medical and pharmaceutical research, and the processing of big data for business analytics.
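The core idea of spreading work across multiple processors can be illustrated with a minimal sketch in Python. The `analyse` function here is a hypothetical stand-in for an expensive per-chunk computation; real HPC codes typically use frameworks such as MPI across many cluster nodes rather than a single machine's process pool:

```python
from multiprocessing import Pool

def analyse(chunk):
    # Placeholder for an expensive per-chunk computation,
    # e.g. one genome segment or one portfolio scenario.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    # Split the workload into independent chunks.
    chunks = [range(i, i + 1000) for i in range(0, 8000, 1000)]
    # Spread the chunks across four worker processes;
    # each chunk is analysed in parallel with the others.
    with Pool(processes=4) as pool:
        results = pool.map(analyse, chunks)
    print(sum(results))
```

The result is identical to running the chunks one after another; the gain is that independent chunks no longer wait for each other, which is exactly the property that makes the applications listed above suitable for HPC.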

“We see HPC as a solution for enterprise problems that require an intense compute process accompanied by the capability to rapidly process data,” says Meike Chabowski, product marketing manager for SUSE Linux Products. “This could take the shape of HPC clusters that run traditional HPC workloads and that focus on number crunching, or HPC processing that supports specific business applications such as portfolio management, risk management and enterprise data management.”

These new HPC clustering technologies use open source software and are based on either a UNIX/Intel or a Linux/Intel platform. Especially in the case of Linux/Intel clusters, the total cost of ownership can be attractive to data centre decision makers. HPC is highly scalable and delivers high performance and reliability. “More than 90% of the top 500 supercomputer sites implement HPC on Linux,” says Chabowski.

The main objectives of the CHPC are to enable South Africa to become globally competitive and to accelerate Africa’s socio-economic upliftment through the effective application of high-end cyber infrastructure. To meet these objectives, the centre is committed to continuously ensuring that South African and African researchers have access to world-class systems and applications that enable them to accelerate their research and produce quality outcomes and prototypes.

In a recent statement on its website, the CHPC says that it has an objective to broaden and deepen its user base, not only to inspire larger numbers of users from traditionally computational disciplines but also to engage non-traditional areas, e.g. social sciences, emergency services and finance. The CHPC also says that it aims to partner with industrial and commercial organisations active in areas dependent on high-end computing.

The questions are: “Is the CHPC geared to handle commercial projects? Is industry ready to embrace HPC more aggressively?” EngineerIT asked the CHPC to comment on how it sees its role in working with commercial projects, but at the time of going to print no response had been received.
