Understanding AI


What are Artificial Intelligence and Data Science?

Back in 1950, a professor named Alan Turing published a paper in which he speculated on the possibility of machines thinking for themselves. He posed an idea, later to become known as the Turing test, to figure out whether a computer had the potential to think intelligently like a human being. The computer had to hold a conversation indistinguishable from a real human conversation, but no computer has ever convincingly passed the test. The term "Artificial Intelligence" was developed during the 1950s and was coined in 1956. In 1959 the MIT AI Lab was established, and research into AI began there and continues to this very day. Jumping forward almost 40 years, IBM's Deep Blue computer beat Garry Kasparov in a game of chess in 1997, and since then we really haven't looked back. Today we surround ourselves with loads of Artificial Intelligence, like production robots, household appliances and chatbots (Siri, Alexa, Google Assistant and the like), which are based on Deep Learning or Machine Learning, and the list of applications is growing exponentially.


How do we define AI?

Google, Microsoft, Facebook, and many others are investing heavily in AI, and with good reason. AI has been predicted to be the most important technology of the future, and the potential for deploying AI in all kinds of industries is almost endless. But before we dig into the heavy Data Science behind AI, it's important to understand the definitions of Artificial Intelligence.

Artificial Intelligence is divided into three kinds of intelligence:

Artificial Narrow Intelligence (Weak AI)

A simple example of Weak AI is the Google search engine, which suggests whole sentences when you search for a specific topic. The suggestions are based on your previous searches, age, location, and lots of other parameters. When Spotify recommends music similar to the genre you are listening to right now, that's Weak AI at work. When Netflix recommends movies for you to watch, it's based on a Deep Learning process that goes through the types of movies or series that you, and other users with the same taste in movies, have watched earlier. When Teams suggests a specific answer in a chat, that's… you guessed it… Weak AI. But it could also be simple recognition of number plates when crossing a toll bridge or road. Some of the cooler use cases we are presented with are things like predictive maintenance on a production line, visual inspection for quality control, fraud detection, or imaging diagnostics for improving patient care. All these examples fall under the term Artificial Narrow Intelligence, or "Weak AI", because they all operate within a certain frame of simple functions, with none of the intellectual thinking process of a human being.

Google Search engine suggests questions for you, based on similar questions posed by other users
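Under the hood, many of these recommendation features boil down to comparing users or items as numeric vectors. As a minimal sketch (pure Python, with made-up ratings; real services like Netflix use far more elaborate deep-learning models), here is "users with the same taste" expressed as cosine similarity:

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two rating vectors (1.0 = identical taste)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Hypothetical ratings: one row per user, one column per movie (0 = not seen).
ratings = {
    "alice": [5, 4, 0, 1],
    "bob":   [4, 5, 0, 2],
    "carol": [1, 0, 5, 4],
}

def most_similar(user):
    """Find the other user whose taste vector is closest to this user's."""
    return max(
        (u for u in ratings if u != user),
        key=lambda u: cosine(ratings[user], ratings[u]),
    )

print(most_similar("alice"))  # bob's ratings track alice's most closely
```

A real recommender would then suggest titles that the similar user rated highly but the current user has not seen yet.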


Artificial General Intelligence (Strong AI)

Basically, this level puts a computer on the same level of intelligence as a human. Computers have very strong processing units and can process millions of documents or tasks in seconds, but they still lack the thinking and reasoning of a human being. Even though we are getting closer to the point where we can tell a machine to do a specific task and the machine makes a rational decision on its own to evaluate whether the order is sane or not, this level has yet to be achieved. A good example of this is ChatGPT, which was launched by OpenAI in late 2022. ChatGPT mimics humans in their conversation style and is even able to write poetry, music, or software. Still, it lacks training and sometimes draws the wrong conclusion on even the most obvious questions, and it has even expressed some radical opinions on, for instance, questions of race or gender. Therefore, ChatGPT cannot pass a Turing test and is still classified as Weak AI, or Narrow Intelligence.

Artificial Super Intelligence

Now we cross the border into machines being cleverer than humans. This is frequently depicted in sci-fi movies like "The Terminator" or "2001: A Space Odyssey", where computers take over the world and doomsday is looming on the horizon. To be totally clear here, this scenario is very unlikely, at least for many years to come.

At this point in time we ONLY have Narrow Intelligence, or Weak AI, available. We can tell a computer to perform specific tasks or processes and learn from them, but common reasoning, like we humans do, is still many years away.


The importance of AI

So why are Artificial Intelligence and Data Science becoming more and more important? For one, the amount of data we create must be processed, and up until now this has been a very data-heavy task. 90% of the world's data has been created within the last two years. Scientists estimate that by 2025 we will have created more than 175 zettabytes (or 175,000,000,000,000 GB) of data, but at present we are only able to process roughly 12% of it. This calls for huge amounts of storage and processing power. The data is for the most part unsorted and needs to be fed into a system that can process, train, and evaluate before it can deliver a result. Imagine going through thousands upon thousands of MR scans to look for patterns or signs of cancer in patients. This would take years for a team of doctors to accomplish, but a computer can process it in a matter of seconds, and with much higher accuracy. Taking the time a process takes down to a fraction of a second is key to developing AI, and here is where a workstation excels!
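To get a feel for the scale of those numbers, here is the arithmetic spelled out in Python (decimal SI units assumed, i.e. 1 ZB = 10^21 bytes and 1 GB = 10^9 bytes):

```python
# Scale of the estimates above, in decimal SI units.
ZB_IN_GB = 10**12          # 1 zettabyte = 10^21 bytes; 1 gigabyte = 10^9 bytes

total_zb = 175             # estimated data created by 2025
processed_share = 0.12     # share we can currently process (~12%)

total_gb = total_zb * ZB_IN_GB
unprocessed_gb = total_gb * (1 - processed_share)

print(f"{total_gb:,} GB created in total")        # 175,000,000,000,000 GB
print(f"{unprocessed_gb:,.0f} GB never processed")
```

Even the leftover 88% is on the order of 154 trillion gigabytes, which is why raw processing power matters so much.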


What do I need to get going?

I have already presented you with some use cases for AI, so what is the next step? The easy answer is to hire an expensive data scientist, and then we are done, right? Wrong. It's a time-consuming task to gather data, then process, train, and deploy it into a usable output.

This process calls for heavy-duty equipment: a workstation, preferably with multiple CPUs and GPUs, as AI datasets require a massive amount of compute power. Many software tools, like Python, TensorFlow, or OmniSci, can leverage the power of a strong set of GPUs, and vendors like Nvidia have even built specially designed AI cores into the GPU itself. Nvidia calls these cores "Tensor cores", and they were first introduced back in late 2017. The word "tensor" is actually a mathematical term that describes a relationship between different mathematical objects that are linked together. It could be an array of numbers, or a matrix used in multiplications.

A small selection of applications for use in a Data Science environment

Tensor math is widely used for calculations in physics and engineering and solves all kinds of complex problems, from fluid mechanics to astrophysics, and crunching these numbers demands immense power from multiple CPUs and GPUs. Deep Learning, which handles huge collections of data in enormous arrays of neural networks, is another field where tensors are used extensively.
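As a concrete illustration, matrix multiplication is the contraction of two rank-2 tensors, and the inner sum-of-products below is exactly the kind of operation GPU tensor cores are built to run in bulk. A minimal pure-Python sketch:

```python
def matmul(a, b):
    """Contract two rank-2 tensors (matrices): C[i][j] = sum_k A[i][k] * B[k][j].
    Deep-learning frameworks run billions of these sum-of-products per layer,
    which is what makes GPU acceleration so valuable."""
    rows, inner, cols = len(a), len(b), len(b[0])
    return [
        [sum(a[i][k] * b[k][j] for k in range(inner)) for j in range(cols)]
        for i in range(rows)
    ]

A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```

A framework like TensorFlow does the same contraction, just vectorized over millions of elements on the GPU instead of element by element in Python.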

Another vital part of a Data Science workstation is the memory. A typical configuration requires at least 64 GB of RAM, way more than the average desktop or notebook can muster, and 512 GB of memory is not uncommon. Data Science workstations usually do not rely on Windows as the operating system like a normal PC does, but are heavily dependent on a Linux kernel-based system.
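A quick back-of-the-envelope calculation shows why 64+ GB fills up fast. The sketch below uses hypothetical dataset dimensions and assumes dense float32 values (4 bytes each):

```python
def dataset_ram_gb(rows, cols, bytes_per_value=4):
    """Rough in-memory footprint of a dense numeric table.
    float32 = 4 bytes per value; 1 GB taken as 10^9 bytes."""
    return rows * cols * bytes_per_value / 10**9

# Hypothetical training set: 100 million rows x 200 features.
needed = dataset_ram_gb(100_000_000, 200)
print(f"{needed:.0f} GB")  # 80 GB -- already beyond a typical desktop's RAM
```

And that is just one copy of the raw data; intermediate results during training can easily double or triple the working set.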

Here are some examples of what a good Data Science workstation should look like:

ENTRY / MID-RANGE

ThinkStation P520 Desktop Workstation

- Intel Xeon W2255 10C CPU
- 64GB DDR4-2933 ECC Memory
- 1x NVIDIA RTX A6000 GPU
- 1x 512GB or 1TB+ M.2 NVMe SSD
- Ubuntu 20.04 LTS w/RAPIDS Linux OS


HIGH-END

ThinkStation P920 Desktop Workstation

- 2x Intel Xeon Gold 6246 12C CPUs (24C Total)
- 192GB DDR4-2933 ECC Memory
- Optional Intel Optane DCPMM (Max. 2TB)
- 2x NVIDIA RTX A6000 GPUs w/NVLINK
- 1x 1TB+ M.2 NVMe SSD
- Ubuntu 20.04 LTS w/RAPIDS Linux OS


ThinkStation P620 Desktop Workstation

- AMD Threadripper Pro 3995WX 64C CPU
- 256GB DDR4-3200 ECC Memory (8Ch.)
- 2x NVIDIA RTX A6000 GPUs w/NVLINK
- 2TB+ M.2 PCIe Gen.4 NVMe SSD
- Ubuntu 20.04 LTS w/RAPIDS Linux OS


ULTRA HIGH-END

ThinkStation P920 Desktop Workstation

- 2x Intel Xeon Gold 6258R 28C CPUs (56C Total)
- 384GB+ DDR4-2933 ECC Memory
- Optional Intel Optane DCPMM (Max. 2TB)
- 2x NVIDIA RTX A6000 GPUs w/NVLINK
- Optional 3rd NVIDIA RTX A6000 GPU
- 2x 2TB+ M.2 NVMe SSD
- Ubuntu 20.04 LTS w/RAPIDS Linux OS


Lenovo's unique Data Science Stack, which simplifies the deployment of performance workstations to AI professionals, makes it easy for any company to embrace Data Science as a tool of their own. Lenovo can optionally preload the application onto any Data Science Workstation customers order.

If you want to learn more about Data Science Workstations from Lenovo, please don't hesitate to contact me or one of my great colleagues.


#lenovo #workstation #artificialintelligence #thinkstation #thinkpad #intel #AMD #python #nvidia #tensor #chatgpt #openai #ai #datascience #wearelenovo
