In conversation with… Dr Happy Sithole

January 25th, 2019, Published in Articles: EngineerIT

HPC focus must shift to SMMEs

Dr Happy Marumo Sithole, director of the Centre for High Performance Computing.

The initiative to establish a High Performance Computing (HPC) facility in South Africa came from the academic research community. When Dr Happy Sithole was working on his PhD in materials science, there was a small supercomputer cluster at the University of Limpopo, but for some very large computational analyses he had to use a much larger HPC facility in the UK. Scientists at other South African universities were in similar positions. Collectively, the science community approached the Department of Science and Technology (DST) to spearhead a project to establish a Centre for High Performance Computing (CHPC) in South Africa. The business case provided a good argument for making such a facility available to researchers and scientists. With long-term funding from the DST, the CHPC was established in Cape Town in 2007 with Dr Sithole at the helm. The CHPC is part of the CSIR but is funded by the DST.

“We have come a long way since our first machines. From a slow start, we unveiled the fastest computer on the continent, a petaflop machine, in 2016. This is a supercomputer capable of a thousand-trillion floating point operations per second. With over 40 000 cores, the machine is the fastest computer on the African continent, running at roughly one petaflop (1000 teraflops). That is 15 times faster than the system named ‘Tsessebe’ (Setswana for antelope), which had a peak performance of 24,9 teraflops, reached number 311 on the world’s top 500 supercomputers and was ranked number one in Africa. The CHPC names its high performance computers after the fastest animals in the country and gave this petaflop machine the name ‘Lengau’ (Setswana for cheetah).
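As a rough check on the scale of these figures, the unit arithmetic can be sketched in a few lines. Only the standard FLOPS unit definitions and the numbers quoted above are used; the per-core figure is a simple average implied by those numbers, not a published specification.

```python
# Unit definitions for floating-point performance (standard SI prefixes)
TERAFLOPS = 10**12  # one trillion floating-point operations per second
PETAFLOPS = 10**15  # one thousand-trillion FLOPS

# Figures quoted in the article
lengau_flops = 1 * PETAFLOPS  # Lengau: roughly one petaflop
lengau_cores = 40_000         # "over 40 000 cores"

# One petaflop expressed in teraflops
print(PETAFLOPS / TERAFLOPS)        # 1000.0

# Average throughput per core implied by the quoted figures (~25 GFLOPS)
print(lengau_flops / lengau_cores)  # 25000000000.0
```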

“From the outset I had a much larger vision for the CHPC: not only to support academic institutions, but to capacitate industry to benefit from big data that is collected but seldom put to good use. Over the past few years my vision has been realised, with more enterprises understanding the value of big data and extracting valuable information with supercomputing applications. We have seen sectors such as bioinformatics, engineering and the petrochemical industry using HPC to analyse big data to improve and streamline their processes. While many industries are moving into HPC, I believe the focus must shift to include HPC at the low end of the market: SMMEs!

“We have started working with the Technology Innovation Agency (TIA) to identify small businesses to pilot HPC technologies. TIA has a programme called technology stations, which we are now engaging with. One of the technology stations is based at the Cape Peninsula University of Technology (CPUT). It is looking at a reactor for textile dye, a complex piece of machinery which they want to redesign. We are working with them to put together a three-dimensional computational model to help them understand the free flow of dye.

“I would like to see more projects like this because, for South Africa to succeed, we must start new industries. However, SMMEs may face barriers to innovation, and we must focus on removing those barriers.”

Fig. 2: Students at the University of the Free State pondering over a problem in the Cyber Security Challenge competition at the 2018 CHPC National Conference.

But can SMMEs afford HPC?

“We have agreed with the Department of Science and Technology that we will support SMMEs in developing proof of concept up to the stage where a project becomes profitable. Only then will we start charging them for our services.”

Given the nature of manufacturing, many small enterprises may have accumulated big data but lack the expertise to extract value from it. Sithole said that the CHPC employs engineers who are available to assist new entrants into HPC. “As in the case of the CPUT technology station, we will make available engineers specialising in computational fluid dynamics (CFD), finite element analysis (FEA) and materials science.”

“The challenge today is that many companies lack expertise in data science and data manipulation. Data science is a relatively new discipline and many people in industry have not been exposed to it. They need support to acquire skills such as how to look at data, how to clean data and how to use an HPC machine to manipulate data. Industry needs to acquire expertise in creating models that deliver the true value hidden in big data. This is our way forward.”

With demand increasing not only from the academic sector but, in particular, from industry, does the CHPC have enough capacity to meet these requirements? It is unlikely that every SMME, or even larger companies, will invest in an HPC machine tomorrow; they will rather turn to the CHPC to provide the compute power.

“It is a huge challenge! We are currently running at close to 100% utilisation, but we are managing the load by scheduling tasks on a fair-share basis. We do, however, have a roadmap to meet increasing future demands. The roadmap is based on the premise that a machine has a service life of six years. After that period some problems may appear, although it is not that the whole machine becomes inoperative; it may be a few nodes. Before the end of the six-year period we start looking at new processor technologies becoming available on the market. We are now in our fourth year.”
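Fair-share scheduling is mentioned only in passing above. The general idea, sketched here with entirely hypothetical user groups and usage figures (this is not the CHPC's actual scheduler), is to prioritise jobs from users who have so far consumed less of their allocated share:

```python
# Minimal fair-share ordering sketch. All names and figures are illustrative.
usage = {"chem-group": 120.0, "astro-group": 40.0, "bio-group": 80.0}    # core-hours used
share = {"chem-group": 100.0, "astro-group": 100.0, "bio-group": 100.0}  # core-hours allocated

def fair_share_key(user):
    # Lower usage relative to allocation -> smaller key -> scheduled sooner
    return usage[user] / share[user]

queue = ["chem-group", "astro-group", "bio-group"]
ordered = sorted(queue, key=fair_share_key)
print(ordered)  # ['astro-group', 'bio-group', 'chem-group']
```

Production schedulers layer decay factors, job size and wait time on top of this ratio, but the core ordering principle is the same.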

In September last year the CHPC added nine nodes of GPUs, comprising 24 top-of-the-range NVIDIA P100 GPU cards. They complement the normal HPC resources, but three of the nodes have been configured as high-density GPU nodes, with four GPUs per node, to perform high performance data analytics (HPDA). This is a process that leverages HPC’s parallel processing to run powerful analytic software at speeds above a teraflop (a trillion floating-point operations per second). Through this approach, it is possible to examine large data sets quickly and draw conclusions about the information they contain.
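The split/analyse-in-parallel/combine pattern behind HPDA can be illustrated with a toy example. The worker count and the analytic kernel below are purely illustrative; real HPDA workloads distribute far heavier analytics across thousands of cores and GPUs.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy split/map/combine pattern: divide a dataset among workers,
# analyse the pieces concurrently, then combine the partial results.
data = list(range(1_000_000))
chunks = [data[i::4] for i in range(4)]  # split the dataset across 4 workers

def analyse(chunk):
    return sum(chunk)  # stand-in for a real analytic kernel

with ThreadPoolExecutor(max_workers=4) as pool:
    partial_results = list(pool.map(analyse, chunks))

total = sum(partial_results)
print(total)  # identical to a serial pass over the full dataset
```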

What happens after the six-year period? A total replacement? Dr Sithole said that yes, that is what it amounts to, but it does not mean that the CHPC discards the current equipment; there is still much use left in it. “We bundle the 1000 nodes into smaller packages and donate them to universities to build capacity and introduce students to entry-level HPC. We are conscious of the dynamics of our country and of institutions that are historically disadvantaged and not able to support HPC. We work with them to create capacity. One example is our support of Sefako Makgatho Health Sciences University (SMU) in a big data project to manage the stock in pharmacies in government hospitals and clinics.”

In his opening address at the annual CHPC National Conference held in Cape Town in December 2018, Dr Sithole said, “Globally, the landscape in HPC has changed significantly, where the advancement of technology sees debates moving from improving the performance of applications based on the increasing processing capabilities driven by Moore’s law to new innovative methods. The transformation of utilisation, with non-traditional use of HPC and cyberinfrastructure in general, drives the design of technology to focus not only on performance, but also on different methods of provisioning HPC systems and models of access.”

My conversation with Dr Sithole convinced me that South Africa is among the leaders in HPC and ready to support South Africa’s quest to embrace the fourth industrial revolution, which will be very much dependent on HPC and HPDA.

Fig. 3: Rhodes University team at the CHPC Cyber Security Challenge.
