Hardware Component of AI

Welcome to the second edition of my newsletter on the intersection of artificial intelligence (AI) and law. In this issue, I'll focus on a part of AI that rarely gets much attention: the hardware.

What does the term "hardware" refer to in the context of AI?

Every AI program needs a physical shell, such as a computer, to function. The more complex the AI, the more computational power it requires.

Computers rely on Central Processing Units (CPUs). The CPU is essentially the machine's brain, containing the circuitry for processing inputs, storing data, and displaying results. It determines how quickly, and how complex, a task a computer can perform. Think of streaming a high-definition movie on an old laptop: you might eventually get to watch the film, but it will take ages to load and probably won't play smoothly. Similarly, running sophisticated AI programs on a personal computer (i.e., a relatively unsophisticated machine) can be slow and cumbersome. Because personal computers are limited in capacity, they cannot handle truly advanced AI tasks, such as modeling medical treatments or recognizing faces.

Because of these limitations, most laypeople who use AI programs rely on services hosted on company infrastructure rather than on their own machines.

Presently, nearly all AI applications depend on the traditional computer architecture, which operates on binary units known as bits, with each bit being either a one or a zero. Consequently, every process and piece of information within this framework is ultimately represented by either a one or a zero. However, a new breed of computer is emerging alongside this conventional binary system: the quantum computer. Unlike classical computers, quantum computers have the potential to revolutionize how information is stored and processed.
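To make the idea of bits concrete, here is a minimal Python sketch (illustrative only, not tied to any particular AI system) showing how both text and numbers ultimately reduce to one-and-zero patterns:

```python
# Everything stored on a classical computer ultimately reduces to bits.
# Here, a short string and an integer are shown as their underlying
# one-and-zero patterns.

text = "AI"
bits_of_text = " ".join(f"{byte:08b}" for byte in text.encode("utf-8"))
print(bits_of_text)  # 01000001 01001001 -- 'A' and 'I' as 8-bit patterns

number = 42
bits_of_number = f"{number:08b}"
print(bits_of_number)  # 00101010 -- the integer 42 as bits
```

Every photo, contract, or AI model weight on a classical machine is, at bottom, a long sequence of such patterns.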

Quantum computers operate with qubits rather than the traditional bits used by classical computers. Thanks to quantum-mechanical principles such as superposition and entanglement, a qubit can encode information as both one and zero simultaneously. This ability to perform certain complex calculations makes quantum computers significantly more powerful than classical ones. To illustrate, Google reported that its quantum computer completed in about 200 seconds a calculation that would have taken a classical supercomputer roughly 10,000 years (see https://www.nature.com/articles/s41586-019-1666-5).
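Superposition can be illustrated with a toy simulation. The sketch below is a purely classical Python simulation of the standard textbook model of a single qubit (real quantum hardware works very differently): the qubit's state is a pair of amplitudes, and measuring an equal superposition yields 0 and 1 with roughly equal frequency.

```python
import random

# Toy classical simulation of a single qubit. Its state is a pair of
# complex amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1.
# Measurement yields 0 with probability |alpha|^2, and 1 otherwise.

def measure(alpha: complex, beta: complex) -> int:
    p_zero = abs(alpha) ** 2
    return 0 if random.random() < p_zero else 1

# Equal superposition: alpha = beta = 1/sqrt(2), so each outcome
# appears with probability 1/2.
amp = 2 ** -0.5
counts = [0, 0]
for _ in range(10_000):
    counts[measure(amp, amp)] += 1
print(counts)  # roughly [5000, 5000] (randomized, so it varies)
```

Note that this only mimics measurement statistics; the power of real quantum hardware comes from interference across many entangled qubits, which classical simulation cannot reproduce efficiently.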

It is often estimated that fully operational quantum computers will be about 100 million times more powerful than today's laptops, and at least 3,500 times more powerful than current supercomputers. For certain classes of problems, this speed advantage means quantum computers could solve problems that were previously considered unsolvable. This does not mean, however, that quantum computers will completely replace classical computers; in the future, the two will excel in different areas. For complex calculations, quantum computers will significantly boost our capacity to harness AI for good. But they will also amplify some of the risks associated with AI use.

How do these technical details relate to law?

The hardware aspect of AI is related to law because it influences human behavior (see Valentin Jeutner, 'The Quantum Imperative: Addressing the Legal Dimension of Quantum Computers' (5 April 2021)).

Technology can shape human decisions

Technology can influence human decision-making by enabling some actions and constraining others, effectively establishing rights and limitations that guide behavior.

Just like software, hardware is designed by human beings who have certain intentions and goals in mind.

"The discovery of the exact features of the atom, for example, did not necessitate the construction of a nuclear bomb. The nuclear bomb was constructed because individual scientists and individual politicians decided that this was a desirable course of action in light of a raging world war." (Sheila Jasanoff, The Ethics of Invention: Technology and the Human Future (Norton 2016))

These are political issues that carry legal implications because they involve conflicting interests within society (Langdon Winner, 'Do Artifacts Have Politics?' (1980) 109 Daedalus 121, 122).

While developers' intentions may not be inherently problematic, it's important to keep this factor in mind. Architecture offers a useful parallel. Just as architects can design buildings that are wheelchair accessible or not, hardware can be developed that facilitates or restricts certain functionalities. The developer of an app can craft features that cater to a specific type of user, and the developer of a device can design machines with specific users in mind. Smartphones, for example, can be designed with interfaces that are intuitive for adults but challenging for young children to navigate. In a similar vein, a car's features could be tailored to drivers of a certain height, making it inaccessible to others. These design choices in both hardware and software determine who can use them effectively, just as an architect's decisions can make a building accessible to certain groups of people.

Like a building's features, a computer's architecture is determined by its intended use. None of these design elements is inherently flawed, but each is a deliberate decision made for a specific purpose.

Some choices in computer infrastructure, like those in building design, are dictated by physical constraints, but many reflect the designers' intentions for the machine's function. A common premise in both quantum and classical computing is that AI, which reflects human decision-making, adheres to certain rules of logic and rationality. While this may capture some aspects of human thought, it doesn't encompass the full spectrum. In itself, this isn't necessarily problematic, but from a legal standpoint, such assumptions, combined with the belief that technology is inherently neutral, can become problematic.

Technology is NOT inherently neutral.

It is also possible that the perspectives and needs of those who have the means to create powerful hardware will take precedence over those of people who do not. This leads us to another legal consideration: the increasingly complex and sophisticated hardware required to operate cutting-edge AI systems.

Technology will amplify inequalities

Building and running quantum computers is an expensive and complex process. They require a vacuum environment and must be cooled to approximately -272 degrees Celsius to function properly.

Since quantum computers are so complex, they can only be constructed and operated by a select number of organizations and nations. This exclusivity is noteworthy.

Often, technological advancements result in a disparity of capabilities between individuals and entities. For instance, classical supercomputers, while less potent than quantum computers, still require considerable technical expertise and energy to operate. The extraordinary advantages of AI are typically accessible only to those with the necessary hardware to run AI programs.

This highlights the importance of equitable access to technology, as access plays an important role in determining the relative standing and opportunities of different groups within society.

It is not surprising that nations and corporations are channeling substantial investments into quantum computing development, with the quantum market already valued at over $1 billion this year. As reported by Forbes, in Europe, Germany has committed to an investment exceeding $3 billion by 2026, while France has pledged nearly $2 billion, with goals to train 5,000 engineers proficient in quantum technology and to generate 30,000 jobs. In the United States, the National Quantum Initiative Act has authorized $1.2 billion over five years for research and development in quantum computing.

It is undeniable that the extraordinary capabilities of quantum computing, if left unregulated, can significantly alter social power dynamics.

To mitigate the risks associated with quantum computing, we need to establish legal frameworks that can steer regulatory measures relating to quantum technologies. From a legal perspective, strategies should be considered to address these disparities. One way to bridge the gap could be to mandate that entities with access to cutting-edge AI technologies allocate a portion of their computational resources to those who would otherwise be excluded.

Regulatory agencies and developers must ensure that quantum computers do not increase disparities or compromise individual autonomy, and that they consult with those whose interests are impacted.

Cybersecurity issues

Another problem is that large-scale quantum computers are expected to break the public-key encryption schemes (such as RSA) that secure most of today's digital communications, rendering many existing security measures obsolete. For businesses to remain secure in the near future, quantum-resistant encryption technologies must be deployed.

In such a scenario, entities equipped with quantum computers—be they individuals, corporations, or nations—would have a distinct strategic advantage over those lacking such technology. The latter group would find themselves unable to shield their information from the advanced decryption capabilities of quantum computers.

Quantum computers may not yet be fully operational, but we must prepare for their arrival, since quantum computing poses substantial risks to current cryptography. In the future, quantum algorithms may be able to decrypt information encoded using current encryption standards (Lukasz Olejnik, Robert Riemann and Thomas Zerdick, 'Quantum Computing and Cryptography' (2020) 2 TechDispatch, accessed 24 March 2021). This implies that data encrypted through conventional methods today could be decrypted in the future, once sufficiently advanced quantum computers are developed (a risk often called "harvest now, decrypt later").
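To see why factoring-based encryption is vulnerable, consider this toy RSA example in Python (deliberately tiny, insecure numbers, chosen purely for illustration). The scheme's security rests entirely on the difficulty of factoring n; Shor's algorithm, run on a sufficiently large quantum computer, would make that factoring step fast.

```python
# Toy RSA with deliberately tiny, insecure numbers (illustration only).
# Public key: (n, e). The private exponent d is derived from the secret
# primes p and q -- so anyone who can factor n can recompute d.

p, q = 61, 53                       # secret primes (real RSA uses ~2048-bit n)
n, e = p * q, 17                    # public key: n = 3233
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent

message = 42
ciphertext = pow(message, e, n)     # encrypt with the public key

# An attacker who factors n = 3233 back into 61 * 53 -- the step that
# Shor's algorithm on a quantum computer would make fast -- can rebuild
# the private exponent and decrypt:
d_attacker = pow(e, -1, (61 - 1) * (53 - 1))
recovered = pow(ciphertext, d_attacker, n)
print(recovered)  # 42 -- the original message
```

With real key sizes, the factoring step takes classical computers longer than the age of the universe; that gap, not any mathematical secret, is what quantum computing threatens to erase.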

The American National Institute of Standards and Technology (NIST) is actively developing and standardizing cryptographic algorithms that are secure against quantum attacks, with plans to mandate the adoption of such quantum-resistant encryption solutions. Companies small and large will need to adopt these post-quantum cryptographic (PQC) algorithms to safeguard their data and communications (see Joseph, D., Misoczki, R., Manzano, M. et al., 'Transitioning organizations to post-quantum cryptography', Nature 605, 237–243 (2022)).

Ultimately, it is important to recognize that the legal challenges posed by digital processes include both software and hardware considerations. Neither is neutral; both are normative constructs influenced by human decisions, open to scrutiny and debate. As we develop technological systems, we must acknowledge the biases and values embedded in them, and we must ensure that these values are critically considered.

Looking Ahead

In the next issue, I will talk about AI governance.

Thank you for joining me on this exploration of AI and law. Stay tuned for more in-depth analyses and discussions in my upcoming newsletters. Let's navigate this exciting and challenging landscape together.

Connect with me

I welcome your thoughts and feedback on this newsletter. Connect with me on LinkedIn to continue the conversation and stay updated on the latest developments in AI and law.

Disclaimer

The views and opinions expressed in this newsletter are solely my own and do not reflect the official policy or position of my employer, Cognizant Technology Solutions. This newsletter is an independent publication and has no affiliation with Cognizant.
