The Importance of Supercomputing

Most people use their computers (including mobile phones) for communication, social media, games, entertainment, office applications, and the like. Most of the time these activities are not particularly demanding computationally, nor do they lead to enormous benefits in productivity, invention, and discovery. There is one field, however, rarely discussed, that does do this - and that is supercomputing. It is through supercomputing that we are witnessing some of the most important technological advances of our day, in fields including astronomy, weather and climate forecasting, materials science and engineering, molecular modelling, genomics, neurology, geoscience, and finance - all with numerous success stories (https://web.archive.org/web/20151031013056/https://www.hpcuserforum.com/downloads/HPCSuccessStories.pdf).


Usually, I draw a distinction between supercomputing and high-performance computing. Specifically, a supercomputer is any computer system that has exceptional computational power at a particular point in time, many (but not all) of which are ranked in the twice-yearly Top500 list (https://top500.org/). Once upon a time dominated by monolithic mainframes, supercomputers in a contemporary sense are a subset of high-performance computing, typically arranged as a cluster of commodity-grade servers with a high-speed interconnect and message-passing software that allows the entire unit to be treated as a whole. One can even put together a "supercomputer" from Raspberry Pi systems, as the University of Southampton illustrates (https://www.youtube.com/watch?v=Jq5nrHz9I94).
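The message-passing model described above can be sketched in plain Python, with the standard-library `multiprocessing` module standing in for the cluster interconnect. Real HPC clusters use an MPI implementation (such as Open MPI) across physical nodes; the `worker` and `parallel_sum` names below are purely illustrative.

```python
# A toy sketch of the message-passing model: each "node" (process)
# computes a partial result and sends it back as a message, which the
# coordinator then combines - analogous to an MPI reduce operation.
from multiprocessing import Process, Queue

def worker(rank, chunk, results):
    # Each worker sums its own slice of the data and sends the
    # partial result back over the queue (the "interconnect").
    results.put((rank, sum(chunk)))

def parallel_sum(data, nworkers=4):
    results = Queue()
    # Distribute the data across workers by striding.
    chunks = [data[i::nworkers] for i in range(nworkers)]
    procs = [Process(target=worker, args=(r, chunks[r], results))
             for r in range(nworkers)]
    for p in procs:
        p.start()
    # Gather one message per worker, then combine the partial sums.
    total = sum(results.get()[1] for _ in procs)
    for p in procs:
        p.join()
    return total

if __name__ == "__main__":
    print(parallel_sum(list(range(1000))))  # 499500
```

The essential point is that no memory is shared: workers communicate only by sending messages, which is exactly what lets a cluster of separate servers be treated as a single machine.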


How important is this? For many years now we have known that there is a strong association between research output (https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1679248) and access to such systems. Macroeconomic analysis (https://www.hpcuserforum.com/downloads/idcstudy.zip) shows that every dollar invested in supercomputing returns forty-four dollars in profits or cost savings. Both these metrics are almost certain to increase over time; datasets and problem complexity are growing at a rate greater than the computational performance of personal systems. More researchers need access to supercomputers.


However, researchers do require training to use such systems. The environment, the interface, the use of schedulers on a shared system, and the location of data (https://www.youtube.com/watch?v=9eyFDBPk4Yw) all need to be learned. This is a big part of my life; in the last week, I spent three days teaching researchers everything from the basics of using a supercomputer system, to scripting jobs, to using Australia's most powerful system, Gadi at NCI (https://nci.org.au/news-events/news/australias-gadi-a-recognised-powerhouse-global-supercomputing-ranking), along with contributions at a board meeting of the international HPC Certification Forum (https://www.hpc-certification.org/). It is often a challenging vocation, but I feel confident that it is making a real difference to our shared lives. For that, I am very grateful.
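To give a flavour of the job scripting mentioned above, here is a minimal batch script for Slurm, one widely used HPC scheduler (sites vary - some, including Gadi, run PBS-based schedulers instead, with different directives). The module name, resource values, and `./my_program` are hypothetical placeholders.

```shell
#!/bin/bash
#SBATCH --job-name=example     # name shown in the queue
#SBATCH --nodes=1              # number of compute nodes requested
#SBATCH --ntasks=4             # number of MPI tasks (cores)
#SBATCH --time=0:10:00         # walltime limit (hh:mm:ss)
#SBATCH --mem=4G               # memory per node

# Load site-provided software; module names differ between systems.
module load openmpi

# Launch the program across the allocated cores.
mpirun -np 4 ./my_program
```

A script like this would typically be submitted with `sbatch job.slurm` and monitored with `squeue`; the scheduler queues the job until the requested resources on the shared system become free.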

From: https://levlafayette.com/node/750
