The 5 things you want to know about Artificial Intelligence

Artificial Intelligence (AI) has been around for a while, the term being coined by John McCarthy in 1956.

In the last few years, interest in AI has grown at immense speed. Many claim to understand AI – Google Translate being their favorite point of reference – when really we have to face the fact that there are only a handful of experts out there who truly know what AI is and what it’s capable of. To give you a starting point for the conversation, here are the top 5 things you should know about Artificial Intelligence.

1) Artificial Intelligence on the rise

AI technology can be found more and more in our everyday personal and work environments. This is due to the capability to store larger amounts of data, the lower cost of such technology (making it accessible to most businesses as well as the private sector), and recent advancements in technology overall. The most common example next to Google would be the smartphone’s integrated personal assistant, such as “Siri”. The software uses speech recognition and natural language processing to react, process, and improve as it communicates with the user. While there are still a few who reject the idea of owning a smartphone, there will be technologies in the near future you can’t afford to miss out on – especially if you are a company looking to stay competitive and increase efficiency. In the pharmaceutical industry, we can also see a strong need to restructure business models and adapt to new technologies. “The litany of concerns that pharmaceutical companies face includes payors tightening up on cost management, strained government healthcare budgets, the need to understand and adopt new technologies, and challenges to their traditional pricing mechanisms by empowered stakeholders, from patient to payors,” state Jo Pisani and Dr. Myrto Lee from Strategy&.

2) AI vs. human intelligence – who is helping whom?

Change is good. This also goes for change in technology. While there is a huge discussion about AI replacing human beings at work and driving up the unemployment rate, the reality is that in the future there will be a hybrid system of machines and human beings working in close relation to each other. AI technologies are still in very early stages of development and always in need of human input and supervision. The real treasure lies in machines’ capability to exceed human brain power in the amount of data they can process and the accuracy with which they do so. Machines liberate workers from repetitive, banal work, making space for them to be more creative, innovative, and efficient. Additionally, machines are capable of covering tasks that otherwise would not have gotten any attention at all.

3) Key challenges

There are four key challenges businesses face with regard to Artificial Intelligence. First, there is an overall lack of understanding of who should implement innovative technologies in the operating business, and of how to predict the return on investment. Second, there are subject matter experts in every business, but no guarantee of their tech savvy; the need to recruit AI specialists or to partner with an expert is substantial. Third, there is the need for data. The results AI technologies deliver are fully reliant on the data they are fed, “with 43% of analytics professionals citing that ‘ensuring data quality from a variety of sources’ as their biggest analytics challenge,” state Rowan Curran and Brandon Purcell from Forrester Research. Finally, there is a big impact on the business itself: restructuring the workforce is an inevitable task many businesses are reluctant to take on, but must if they wish to outperform their competition.

4) Artificial Intelligence in Life Sciences

There are several areas of the Life Science industry in which Artificial Intelligence technologies can be implemented. In the near future, there will be great advancement in this sector, rooted in the wide array of sources of valuable data – large amounts of useful data being the core of AI. Insights can be drawn from hundreds of millions of associations between biologically meaningful entities (e.g., molecules, cells, organs) and unstructured text such as literature references, lab data, insurance data, patient records, research data, and even social media data. AI can filter out the noise and use this data for the identification of new drug candidates, the detection of new indications for existing drugs, the analysis of biological targets and safety profiles, and the recruitment of suitable patients for clinical studies – just to name a few. In conclusion, AI is just starting to play an important role in discovering and developing new solutions and will surely drive drug development in the future, paving the way for optimized, more efficient, and accelerated processes in a highly regulated and rather old-fashioned environment.

5) Proper terminology

When talking about Artificial Intelligence, there are often several terms used in the same context – RPA, which stands for Robotic Process Automation; NLP, short for Natural Language Processing; and some used interchangeably, like Predictive Analytics, Deep Learning, or Machine Learning. AI, however, is the broad field of study under which all these terms fall. Merriam-Webster defines it as “a branch of computer science dealing with the simulation of intelligent behavior in computers”. So, how is AI different from traditional software? Traditional software needs to be told exactly what to do (i.e., programmed with code), while AI needs to be trained so that it can act how you want it to in the future (kind of like training a dog). AI absorbs information and concepts, processes them (creates relationships and context), then acts. Also, like training a dog, you let it know where it made a mistake and it will remember for the next time (machine learning). Just as one example: when doing a traditional search for articles on adverse events, the search string needs to be exactly defined, with the words to be searched combined using the Boolean operators <AND>, <OR>, and <NOT>. When using AI, a human provides a number of examples (data) of the types of articles being sought; the software then searches for articles similar to the ones provided for training, and learns by being instructed by a human trainer whether an article was correctly selected or not.
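The adverse-event search contrast can be sketched in a few lines of code. This is a purely illustrative toy, not any real pharmacovigilance tool: the article texts, the keyword query, and the 0.2 similarity threshold are all made-up assumptions. It contrasts a hand-written Boolean query with a naive “train by example” selector based on bag-of-words similarity.

```python
# Toy comparison: Boolean keyword search vs. example-based selection.
# All texts, keywords, and thresholds are invented for illustration.
import math
from collections import Counter

articles = [
    "patient reported severe headache after drug administration",
    "new factory opens to increase tablet production capacity",
    "case report of liver toxicity following treatment",
]

# Traditional search: the query is spelled out exactly, e.g.
# (headache OR toxicity) AND NOT production.
def boolean_search(texts):
    hits = []
    for t in texts:
        words = set(t.split())
        if ({"headache", "toxicity"} & words) and "production" not in words:
            hits.append(t)
    return hits

# "AI-style" search: instead of keywords, provide example articles;
# candidates are kept if they are similar enough to any example.
def vector(text):
    return Counter(text.split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def similarity_search(texts, examples, threshold=0.2):
    example_vecs = [vector(e) for e in examples]
    return [t for t in texts
            if max(cosine(vector(t), e) for e in example_vecs) >= threshold]

training_examples = ["patient experienced adverse reaction headache nausea"]
print(boolean_search(articles))
print(similarity_search(articles, training_examples))
```

Note how the Boolean query only matches the exact words it was given, while the example-based selector picks up the first article because it shares vocabulary with the training example, even though no query was ever written. In a real system, the human trainer’s yes/no feedback would then be used to refine the model.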

Here are a few helpful definitions of the buzzwords you’ll want to know:

Data mining

The process by which patterns are discovered within large sets of data with the goal of extracting useful information from it.

Machine learning

A field of AI focused on getting machines to act without being explicitly programmed to do so. Machines "learn" from patterns they recognize and adjust their behavior accordingly. Facebook recommending photo tags using image recognition is a good example: the software “learned” what a person looks like from previously tagged images and guesses that it’s them based on that learning. In the financial industry, machine learning predicts bad loans, finds risky applicants, and generates credit scores.
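The “learn from tagged examples” idea can be shown with a deliberately tiny sketch: a nearest-centroid classifier that tags a new data point based on previously labeled ones. The 2-D points stand in for image features; the names and numbers are invented for illustration.

```python
# Toy "learning from examples": summarise each label's examples as a
# centroid, then tag new points by whichever centroid is closest.

def centroid(points):
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

def train(labeled):
    # "Training" here is just summarising the examples per label.
    return {label: centroid(pts) for label, pts in labeled.items()}

def predict(model, point):
    def dist2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    return min(model, key=lambda label: dist2(model[label], point))

examples = {
    "alice": [(1.0, 1.0), (1.2, 0.9)],  # features from photos tagged "alice"
    "bob": [(5.0, 5.0), (4.8, 5.2)],
}
model = train(examples)
print(predict(model, (1.1, 1.0)))  # closest to alice's examples
```

No rule saying “this is what alice looks like” was ever written; the behavior comes entirely from the tagged examples, which is the essence of the definition above.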

Natural language processing (NLP)

Software that analyzes, understands, and generates the languages humans use naturally. An example is the software used by Amazon’s Alexa Voice Service to play music, make calls, send and receive messages, and provide information, news, sports scores, weather, and more.
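A drastically simplified sketch of the idea (nothing like a real voice assistant): map a free-text request to an intent by matching keywords. The intent names and keyword sets are assumptions made up for illustration.

```python
# Toy intent detection: pick the intent whose keywords best match the text.
INTENTS = {
    "play_music": {"play", "music", "song"},
    "weather": {"weather", "forecast", "rain"},
    "news": {"news", "headlines"},
}

def parse_intent(utterance):
    words = set(utterance.lower().replace("?", "").split())
    best = max(INTENTS, key=lambda i: len(INTENTS[i] & words))
    # Return None if nothing matched at all.
    return best if INTENTS[best] & words else None

print(parse_intent("Will it rain tomorrow?"))  # weather
```

Real NLP systems use statistical models rather than keyword lists, but the task is the same: turning naturally phrased language into something software can act on.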

Predictive Analytics

The act of analyzing current and past data to look for patterns that can help make predictions about future events or performance. Such software is used, for example, by services like Spotify, which propose new music to their users based on what they have listened to in the past.
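Predicting from past behavior can be sketched with a toy recommender: suggest a track a user has not heard, weighted by how similar other users’ listening histories are. The users and track names are made up; real services use far richer models.

```python
# Toy recommender: score unheard tracks by overlap with other users' histories.
from collections import Counter

histories = {
    "u1": {"song_a", "song_b", "song_c"},
    "u2": {"song_a", "song_b"},
    "u3": {"song_b", "song_c", "song_d"},
}

def recommend(user, histories):
    heard = histories[user]
    scores = Counter()
    for other, tracks in histories.items():
        if other == user:
            continue
        overlap = len(heard & tracks)  # similarity: shared listening history
        for track in tracks - heard:
            scores[track] += overlap
    return scores.most_common(1)[0][0] if scores else None

print(recommend("u2", histories))
```

Here u2’s history overlaps most with u1’s, so the tracks u1 listened to that u2 has not are predicted to be good candidates.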

Robotic Process Automation

Robotic process automation (RPA) is the application of technology that allows employees in a company to configure computer software or a “robot” (“bot”) to capture and interpret existing applications for processing a transaction, manipulating data, triggering responses and communicating with other digital systems. Such bots may be used by Regulatory staff to automate submission of published dossiers via a gateway to health authorities.

Nice article, Tilo Netzer. Although, I find the ideas that Alan Turing was working on during the late 1940s/early 1950s (Turing, A.M. (1952) The Chemical Basis of Morphogenesis, Philosophical Transactions of the Royal Society, Series B, Biological Sciences, Vol. 237, Issue 641, pp. 37-72), and those of John von Neumann during that period, https://youtu.be/fJltiCjPeMA?t=50m55s , to be much more interesting.

Varinder Singh

Sr. Analyst at Syneos Health (Previously INC Research/inVentiv Health)


Nice article! AI is the future but we might need to proceed with caution given the harmful consequences it can have.
