In the front row: news from the show floor

I have just returned from two intensive events (both in the number of meetings and in the sheer quantity of food…): the RISC-V Summit North America in Santa Clara (CA) and SC23 in Denver (CO).

Building on the success of the RISC-V Europe event held in May in Barcelona, this year’s RISC-V Summit North America saw a large crowd gather at the Conference Center in Santa Clara to share technology breakthroughs, industry milestones, and case studies; to network and build relationships; and to discuss how to shape the architecture’s strategic future. The large vendor area showcased existing and upcoming implementations of the ISA.

SC is one of the key events (ISC in Europe and SC Asia being the others) where the high-performance computing community convenes for an exciting week of sessions, speakers, and networking at its finest. The RISC-V Summit and SC are the events where an unparalleled mix of scientists, engineers, researchers, educators, programmers, developers, and vendors intermingle to learn, share, grow and enjoy.


RISC-V Summit North America

Calista Redmond, RISC-V International's CEO

RISC-V adoption is exploding, and inevitably so, as Calista Redmond, CEO of RISC-V International, keeps reminding us. Many vendors presented their products, ranging from small hand-held devices to industry-grade accelerators specialized for AI/ML workloads. I was impressed by the great number of large enterprises, start-ups, SMEs, and small university departments currently developing their own processors or implementing the ISA. That’s the beauty of the open-standard instruction set architecture (ISA). At the Summit, the RISC-V community shared its technical investment and shaped the architecture’s strategic future so that everyone can create more rapidly, enjoy unprecedented design freedom, and substantially reduce the cost of innovation. Anyone, anywhere, can benefit from these contributions.


Here are my takeaways, focusing more on the technology than on specific vendors:

  1. While the embedded market for RISC-V chips has reached sufficient maturity and good market penetration, industry-grade chips for HPC are still in development.
  2. The only notable product suitable for AI/ML workloads is Esperanto Technologies’ ET-SoC-1 inference chip (available now), designed to be the world’s most efficient commercial RISC-V chip, with over a thousand RISC-V processors on a single TSMC 7 nm chip delivering a massively parallel, flexible architecture that combines exceptional performance with ultra-low power consumption.
  3. For designers looking into RISC-V software development without requiring full custom silicon, the OpenHW Group has announced the CORE-V CVA6 Platform project with the goal of bringing RISC-V emulation to more designers. Instead of requiring physical hardware or a software-based emulator, the CVA6 platform offers a one-to-one FPGA hardware mapping to ensure accuracy when moving from emulators to hardware.
  4. Rounding out the RISC-V developments, Ventana held a session covering the Veyron V2, the second generation of its RISC-V data center processors. The company reports improvements in performance and efficiency with the Veyron V2 compared to the Veyron V1 (released at last year’s RISC-V Summit), bringing RISC-V to more applications that demand higher performance. Ventana currently reports up to 40% performance improvements using a 3.6 GHz clock and 4 nm process technology. In addition, ecosystem support has improved thanks to the RISE initiative. Furthermore, using Ventana’s Domain Specific Accelerator, designers can improve workload efficiency without constraining innovation, allowing more developments to be made using RISC-V at all levels of computing.
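A recurring theme behind these massively parallel designs (such as the thousand-core ET-SoC-1) is that raw core count only pays off when the workload itself parallelizes. A quick Amdahl’s-law sketch in Python makes the point; the core count and parallel fractions below are illustrative assumptions, not vendor figures:

```python
# Amdahl's law: speedup on n cores of a workload whose parallelizable
# fraction is p. Illustrative sketch only; numbers are not vendor data.
def amdahl_speedup(p: float, n: int) -> float:
    """Speedup = 1 / ((1 - p) + p / n)."""
    return 1.0 / ((1.0 - p) + p / n)

# Even with ~1000 cores, a 5% serial fraction caps the speedup near 20x,
# which is why highly parallel AI/ML kernels suit these chips so well.
for p in (0.50, 0.95, 0.99):
    print(f"parallel fraction {p:.2f}: {amdahl_speedup(p, 1000):6.1f}x on 1000 cores")
```

The takeaway: the extensible ISA matters, but so does matching the workload’s parallel fraction to the silicon.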

As time goes on, it will be exciting to see how the extensible ISA and designers evolve to provide more performance and better efficiency. Regardless, the trends toward broader adoption and more deployment of the RISC-V ISA highlight its potential to change the way computing is done.

E4 is currently installing platforms in the RISC-V Lab, a laboratory dedicated to research activities exploiting the RISC-V ISA and focused on having a strong technological and innovative impact in the fields of HPC and AI. The RISC-V Lab will give customers and developers supervised, first-hand access to the platforms within a technology-oriented program.


SC23

SC23 in Denver

This year’s show was impressive! The quality of the technical talks improved further compared with previous events, and 438 exhibitors showed their best and latest products.

E4 Computer Engineering SpA participated as a speaker in the “Second International Workshop on RISC-V for HPC” with a speech entitled “E4 Experience with RISC-V in HPC”.

Attendees filled the room: a rough estimate put the number at more than 150 people, some of them standing.

Engagement was intense, with many questions asked after each talk. The presentation on “Monte Cimone”, the RISC-V HPC cluster designed and integrated by E4, was received with considerable interest and attention.

Although not yet widely known, Monte Cimone shows that the whole HPC software stack is mature enough to run complex workloads, and demonstrating that it works correctly is a great achievement for this platform.

A continuation of the workshop will take place at HiPEAC 2024 (17-19 January, Munich), where further in-depth analysis will be performed; an impressive list of speakers has already signed up.


As with the RISC-V Summit, here are my thoughts, centered more on technology than on specific vendors:

QUANTUM COMPUTING OR THE LUCID MADNESS

The basic premise goes: because quantum computing presents revolutionary opportunities for many scientific and industrial markets (e.g., AI, HPC, chemistry, materials science, finance, security), tech giants as well as a large number of startups are involved, developing platforms and/or competing for market dominance. Needless to say, competition is high, and healthy! Different technologies currently sit at different TRL stages, from TRL 3 (experimental proof of concept) to TRL 9 (actual system proven in an operational environment). From this range, one could infer that quantum computing is finally on the verge of becoming usable for ‘standard’ scientific and commercial applications (please note ‘standard’).

  • The basic premise continues: quantum computers are able to solve more complex problems than classical computers (even the latest generation of supercomputers) and can run highly complex simulations intractable for classical computers, because of intrinsic limitations or unacceptable time to solution.
  • No major advancements with respect to the well-known current impediments of QC: quantum decoherence, qubit scalability, quantum hardware reliability, quantum software development, quantum error correction, noise and interference, quantum communication and networking, quantum software verification, and quantum supremacy and benchmarking. On this last key subject: demonstrating quantum supremacy, where quantum computers outperform classical computers in specific tasks, is a key milestone. However, accurately benchmarking quantum devices and defining meaningful metrics for quantum computing’s success remain challenging tasks for the research community.
  • Despite these humongous difficulties, there’s no doubt that the HPC community and industry should embark on this lucid madness, because quantum computing has the potential to transform many industries in the next decade. However, classical computing will always play a role, and classical computers aren’t going away. I concede that quantum technologies will bring huge benefits for specific tasks and have the potential to disrupt many industries and markets, but only in conjunction with, and complementing, classical computers.
  • According to McKinsey, there are four areas (Drug Development, Optimization Problems, Quantum Artificial Intelligence, Cryptography) where quantum computing could yield immense long-term gains. Nevertheless, not every sector of science and engineering is likely to benefit in the same way, and classical computing will remain relevant and necessary in several areas, complementing the benefits of quantum technology.
  • Talking with users, I keep being asked: which problems are better suited for quantum and which for classical computers, and how can the two be combined to achieve optimal exploitation of the technical features of both architectures?
  • Classical and quantum computers have complementary properties. The classes of problems where QCs have an advantage are those, often loosely called non-polynomial, where classical computers need long problem-solving times, with a consequent increase in energy consumption. Therefore, the advent of QCs will not mean the extinction of HPC systems but the beginning of ever tighter integration to take advantage of the combined properties of the two types of machines.
  • The most likely scenario as of now is that researchers in academia and industry will have access to quantum computers through cloud services. Although quantum technology is still in its early stages, providers like Amazon Web Services and Microsoft Azure already offer cloud access to it.
  • It’s crucial to leverage the strengths of both technologies (classical computing and quantum computing) to unlock quantum’s full potential. Quantum computing isn’t going to take over the world. But it’s going to have a major impact in the next decade or two by working in full concert with classical computers.
  • Four key factors will determine the success of quantum computing’s path to commercialization: funding availability to continue research and development; tight co-design with end users to develop platforms for running real-life workloads; the standardization of programming models and paradigms; and the availability of talent.
  • Companies that don’t embrace the power of these new quantum-classical hybrid architectures risk being left behind.
  • At its premises, E4 Computer Engineering is building a Quantum Laboratory aimed at testing the technologies first-hand and providing access to customers and developers. Exploring methodologies for integrating quantum and classical computing can unlock the synergies of these two technologies. E4 places great hopes in QC and is actively collaborating with a number of vendors to co-design their next-gen products.
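The intractability argument above has a simple quantitative core: simulating n qubits on a classical machine requires a dense state vector of 2^n complex amplitudes, so memory (and work) doubles with every added qubit. A minimal stdlib-only Python sketch, purely illustrative and not tied to any vendor’s SDK:

```python
# Memory needed to hold a dense n-qubit state vector on a classical
# machine, assuming complex128 amplitudes (16 bytes each). Illustrative
# sketch of why some quantum workloads are classically intractable.
def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (10, 30, 50):
    gib = state_vector_bytes(n) / 2**30
    print(f"{n} qubits -> {gib:,.0f} GiB")
# 10 qubits fit in kilobytes; 30 need 16 GiB; 50 already need ~16 million GiB.
```

This exponential wall is exactly where a quantum machine complements, rather than replaces, a classical HPC system.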

Bear with us…

COOLING

  • Equipment for cooling classical computer components, reducing both the heat released into the data center and the power consumed, took center stage. Plenty of technologies are currently deployed, and an even greater number are in development. Walking the aisles, one could see equipment based on DLC, two-phase/phase-change cooling, immersion cooling, and other even more exotic technologies.
  • Talking with large, pre-Exascale-class data centers, I gathered that their objective is to eliminate fans. In other words, only a minimal fraction of the heat generated by the equipment (in the form of warm air) will be released into the data center, and non-air cooling technologies are tasked with absorbing at least 95% of the heat generated.
  • E4 is currently in the final validation phase of a two-phase/phase-change technology, developed as part of the Towards EXtreme scale Technologies and Accelerators for EuroHPC HW/SW Supercomputing Applications for Exascale (TEXTAROSSA) project, which holds promise for achieving that goal. The product is currently being engineered for production and deployment and will become a standard offering for a large number of platforms.
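The 95% capture target above translates directly into the residual load left for the room’s air handlers. A back-of-the-envelope Python sketch; the 100 kW rack power is a hypothetical figure for illustration, not measured data:

```python
# Heat released into the data-center air when a liquid-cooling system
# captures a given fraction of a rack's power. Hypothetical numbers.
def residual_air_heat_kw(rack_power_kw: float, liquid_capture: float) -> float:
    return rack_power_kw * (1.0 - liquid_capture)

rack_kw = 100.0  # assumed rack power, for illustration only
for capture in (0.0, 0.70, 0.95):
    print(f"{capture:.0%} liquid capture -> "
          f"{residual_air_heat_kw(rack_kw, capture):.1f} kW to room air")
```

At 95% capture, only about 5 kW of a 100 kW rack reaches the air, which is what makes eliminating fans a plausible goal.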


Conclusions:

Amidst the whirlwind of cutting-edge technologies and groundbreaking innovations showcased at the RISC-V Summit and SC23, the journey through the realms of RISC-V, quantum computing, and advanced cooling solutions underscores the relentless pursuit of progress. As the industry converges to redefine the future of computing, E4 stands at the forefront, bridging the branches of HPC, AI, and Quantum technologies.

Stay tuned for the unfolding saga of transformative possibilities, where the synergy of classical and quantum computing paves the way for a new era in technological evolution, or perhaps just what appears to be a lucid madness.
