Power of Natural Language Processing (NLP) and its Applications in Business
1.0 Preliminaries

Conversations with smart assistants like Siri, Cortana, Alexa, and Google Assistant have become very common these days. These intelligent personal assistants are adept at understanding the context of a query and presenting results as spoken language, often also providing useful links, directions on maps, and more. Such systems are based on Natural Language Processing (NLP), a combination of computer science, artificial intelligence, and computational linguistics, aimed at helping humans and machines communicate in natural language, just like a human-to-human conversation.

In 1950, Alan Turing, the well-known mathematician and computer scientist, proposed the Turing Test. In his paper Computing Machinery and Intelligence, he argued that if a machine can hold a conversation with a person and convince them that they are actually speaking to a human, then that machine is artificially intelligent. This idea laid the foundation for NLP technology. An effective NLP system can comprehend a question and its meaning, dissect it, determine the appropriate action, and respond in a language the user will understand. Natural Language Understanding (NLU) and Natural Language Generation (NLG) are two subsets of Natural Language Processing (NLP): NLP = NLU + NLG. Although all three deal with natural language, each plays a different role at a different point. The terms Processing, Understanding, and Generation are self-explanatory here.

· NLP identifies and processes the most significant data and structures it into text, numbers, or computer language

· NLU understands human language and converts it into data

· NLG uses the structured data to generate meaningful narratives from it

Figure 1. NLP, NLU, and NLG

Let us go through each of them separately to better understand the differences and how they relate to one another.

2.0 Natural Language Processing (NLP)

Natural Language Processing (NLP) is a blanket term for a machine’s ability to ingest what is said to it, break it down, comprehend its meaning, determine the appropriate action, and respond in a language the user will understand. In other words, when a command is given to the machine, it responds with the appropriate action. Natural language processing, which evolved from computational linguistics, draws on computer science, artificial intelligence, linguistics, and data science to enable computers to understand human language in both written and verbal forms. NLP emphasizes the use of machine learning and deep learning techniques to complete tasks such as language translation and question answering. It works by taking unstructured data and converting it into a structured format. It does this by identifying named entities (a process called named entity recognition) and word patterns, using methods like tokenization, stemming, and lemmatization, which examine the root forms of words.
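As a rough illustration of the tokenization, stemming, and lemmatization steps just described, here is a minimal Python sketch using NLTK (covered later under NLP Tools). The sample sentence is invented, and the exact NLTK resource names can vary between library versions.

```python
# Minimal sketch of NLP pre-processing with NLTK.
# Assumes NLTK is installed; resource names ("punkt", "wordnet") may vary by version.
import nltk
from nltk.tokenize import word_tokenize
from nltk.stem import PorterStemmer, WordNetLemmatizer

nltk.download("punkt", quiet=True)
nltk.download("wordnet", quiet=True)

text = "The striped bats are hanging on their feet"

tokens = word_tokenize(text)                        # break the sentence into word tokens
stemmer = PorterStemmer()
lemmatizer = WordNetLemmatizer()

stems = [stemmer.stem(t) for t in tokens]           # crude root forms, e.g. "hanging" -> "hang"
lemmas = [lemmatizer.lemmatize(t) for t in tokens]  # dictionary root forms, e.g. "feet" -> "foot"

print(tokens)
print(stems)
print(lemmas)
```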

Figure 2. How NLP works

While a number of NLP algorithms exist, different approaches tend to be used for different types of language tasks. For example,

· Hidden Markov Models tend to be used for part-of-speech tagging. A Hidden Markov Model (HMM) is a probabilistic model for generative sequences, characterized by an underlying process that produces an observable sequence. HMMs have various applications in speech recognition, signal processing, and low-level NLP tasks such as POS tagging, phrase chunking, and extracting information from documents; they are also used to convert speech to text in speech recognition. HMMs are based on Markov chains. A Markov chain is a model that describes a sequence of potential events in which the probability of each event depends only on the state attained in the previous event. A Markov model relies on this Markov assumption when predicting the probability of a sequence.

· Recurrent Neural Networks help to generate the appropriate sequence of text. A Recurrent Neural Network (RNN) is a deep learning and artificial neural network design suited to sequential data processing. It has proven comparatively accurate and efficient for building language models and for speech recognition tasks. The name comes from the fact that these networks operate in a recurrent way: the same operation is performed for every element of a sequence, with the output depending on the current input and on previous operations. This is achieved by feeding the output of the network at time t back in as part of the input at time t + 1. These loops allow information to persist from one time step to the next.

· N-grams, a simple type of language model (LM), assign probabilities to sentences or phrases to predict the likelihood of a response. An N-gram model is a probabilistic model trained on a corpus of text; the Brown, WSJ, and Switchboard corpora are three of the most widely used tagged corpora for English. Such models are useful in many NLP applications, including speech recognition, machine translation, and predictive text input. An N-gram model is built by counting how often word sequences occur in the corpus and then estimating the probabilities (a minimal bigram sketch follows this list).
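To make the N-gram idea concrete, here is a minimal bigram model built by counting word pairs in a toy, invented corpus; real models are trained on large corpora such as those named above and use smoothing to handle unseen word pairs.

```python
# A minimal bigram (2-gram) language model estimated by counting word pairs
# in a toy corpus. The corpus is invented for illustration only.
from collections import defaultdict

corpus = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "the cat chased the dog",
]

bigram_counts = defaultdict(lambda: defaultdict(int))
unigram_counts = defaultdict(int)

for sentence in corpus:
    words = ["<s>"] + sentence.split() + ["</s>"]
    for prev, curr in zip(words, words[1:]):
        bigram_counts[prev][curr] += 1
        unigram_counts[prev] += 1

def bigram_prob(prev, curr):
    """P(curr | prev) estimated by maximum likelihood (no smoothing)."""
    if unigram_counts[prev] == 0:
        return 0.0
    return bigram_counts[prev][curr] / unigram_counts[prev]

# Markov assumption: P("the cat sat") is approximated by P(cat|the) * P(sat|cat)
print(bigram_prob("the", "cat"))   # 2/6 in this toy corpus
print(bigram_prob("cat", "sat"))   # 1/2 in this toy corpus
```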

These techniques work together to support popular technology such as chatbots and speech recognition products like Amazon’s Alexa or Apple’s Siri. However, NLP’s applications are broader than that, extending to other industries such as education and healthcare.

NLP Tools

· Python and the Natural Language Toolkit (NLTK): The Python programming language provides a wide range of tools and libraries for tackling specific NLP tasks. Many of these are found in the Natural Language Toolkit, or NLTK, an open-source collection of libraries, programs, and educational resources for building NLP programs. The NLTK includes libraries for many of the NLP tasks listed above, plus libraries for subtasks such as sentence parsing, word segmentation, stemming and lemmatization (methods of trimming words down to their roots), and tokenization (breaking phrases, sentences, paragraphs, and passages into tokens that help the computer better understand the text). It also includes libraries for implementing capabilities such as semantic reasoning: the ability to reach logical conclusions based on facts extracted from text.

· Statistical NLP, Machine Learning and Deep Learning: The earliest NLP applications were hand-coded, rules-based systems that could perform certain NLP tasks, but couldn't easily scale to accommodate a seemingly endless stream of exceptions or the increasing volumes of text and voice data. Enter statistical NLP, which combines computer algorithms with machine learning and deep learning models to automatically extract, classify, and label elements of text and voice data and then assign a statistical likelihood to each possible meaning of those elements. Today, deep learning models and learning techniques based on convolutional neural networks (CNNs) and recurrent neural networks (RNNs) enable NLP systems that 'learn' as they work and extract ever more accurate meaning from huge volumes of raw, unstructured, and unlabelled text and voice data sets.

· SpaCy: In most cases, spaCy is faster than NLTK, but it offers only a single implementation for each NLP component. It represents data as objects, which simplifies the interface for building applications, and it can integrate with several other tools and frameworks (a short usage sketch follows this list).

· PyTorch-NLP: PyTorch-NLP has been around for only a little over a year, yet it already has a sizeable community. It is a fantastic tool for rapid prototyping. It is primarily designed for researchers, but it may also be used for prototypes and early production workloads.

· Nlp.js: Nlp.js is built on several other NLP tools, such as Franc and Brain.js. Many aspects of NLP, including classification, sentiment analysis, stemming, named entity recognition, and natural language generation, are accessible through it, and it supports a variety of languages. It is a tool with a simple UI that connects to a number of other excellent programs.

· OpenNLP: OpenNLP is hosted by the Apache Foundation and integrates easily with other Apache projects, such as Apache Flink, Apache NiFi, and Apache Spark. It covers all the common processing components of NLP and can be used from the command line or within an application as a library. It supports multiple languages and is ready for production workloads in Java.
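As a small illustration of spaCy's object-based interface mentioned above, the following sketch assumes the small English model en_core_web_sm has been installed (python -m spacy download en_core_web_sm).

```python
# Brief spaCy sketch: tokens, part-of-speech tags, lemmas, and named entities.
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Apple is looking at buying a U.K. startup for $1 billion.")

# Tokens with part-of-speech tags and lemmas
for token in doc:
    print(token.text, token.pos_, token.lemma_)

# Named entities recognized by the pipeline
for ent in doc.ents:
    print(ent.text, ent.label_)
```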

Use Cases

· Spam detection: The best spam detection technologies use NLP's text classification capabilities to scan emails for language that often indicates spam or phishing. These indicators can include overuse of financial terms, characteristic bad grammar, threatening language, inappropriate urgency, misspelled company names, and more. Spam detection is one of a handful of NLP problems that experts consider 'mostly solved'.

· Machine translation: Google Translate is an example of widely available NLP technology at work. Effective translation accurately captures the meaning and tone of the input language and renders it as text with the same meaning and desired impact in the output language. Machine translation tools are making good progress in terms of accuracy. A great way to test any machine translation tool is to translate text into another language and then back to the original.

· Virtual agents and chatbots: Virtual agents such as Apple's Siri and Amazon's Alexa use speech recognition to recognize patterns in voice commands and natural language generation to respond with appropriate action or helpful comments. Chatbots perform the same magic in response to typed text entries. The best of these also learn to recognize contextual clues about human requests and use them to provide even better responses or options over time. The next enhancement for these applications is question answering, the ability to respond to questions, anticipated or not, with relevant and helpful answers in their own words.

· Social media sentiment analysis: NLP has become an essential business tool for uncovering hidden data insights from social media channels. Sentiment analysis can analyse the language used in social media posts, responses, reviews, and more to extract attitudes and emotions in response to products, promotions, and events; companies can use this information in product design, advertising campaigns, and more (a minimal sketch follows this list).

· Text summarization: Text summarization uses NLP techniques to digest huge volumes of digital text and create summaries and synopses for indexes, research databases, or busy readers who don't have time to read full text. The best text summarization applications use semantic reasoning and natural language generation (NLG) to add useful context and conclusions to summaries.
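To illustrate the social media sentiment analysis use case above, here is a minimal sketch using NLTK's VADER analyzer, which is well suited to short, informal posts; the example posts are invented and the vader_lexicon resource is assumed to be downloadable.

```python
# Minimal social-media sentiment sketch using NLTK's VADER analyzer.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)

posts = [
    "Absolutely love the new update, great job!",
    "Worst customer service I have ever experienced.",
]

sia = SentimentIntensityAnalyzer()
for post in posts:
    scores = sia.polarity_scores(post)  # returns neg/neu/pos plus a compound score
    label = "positive" if scores["compound"] > 0 else "negative"
    print(label, scores["compound"], post)
```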

3.0 Natural Language Understanding (NLU)

Natural Language Understanding (NLU) is a subset of NLP that deals with a much narrower, but equally important, facet: how best to handle unstructured inputs and convert them into a structured form that a machine can understand and act upon. While humans effortlessly handle mispronunciations, swapped words, contractions, colloquialisms, and other quirks, machines are less adept at handling unpredictable inputs. When a command is given to a machine, the process of converting that command into a structured form the machine can understand, so that it can act accordingly, is known as Natural Language Understanding (NLU).

NLU uses syntactic and semantic analysis of text and speech to determine the meaning of a sentence. Syntax refers to the grammatical structure of a sentence, while semantics refers to its intended meaning. NLU also establishes a relevant ontology: a data structure which specifies the relationships between words and phrases. While humans do this naturally in conversation, the combination of these analyses is required for a machine to understand the intended meaning of different texts.

It is quite possible that the same text has various meanings, that different words have the same meaning, or that the meaning changes with the context. The understanding of natural language rests on three levels of analysis: syntax, which covers the grammar of the text; semantics, which covers the literal meaning of the text; and pragmatics, which covers what the text is trying to communicate. For example, consider the following two sentences:

· Alice is swimming against the current.

· The current version of the report is in the folder.

In the first sentence, the word current is a noun. The verb that precedes it, swimming, provides additional context, allowing the reader to conclude that the flow of water in the ocean is being referred to. The second sentence also uses the word current, but as an adjective. The noun it describes, version, denotes multiple iterations of a report, enabling us to determine that the most up-to-date status of the file is being referred to.
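Here is a small sketch of how a statistical part-of-speech tagger handles the two sentences above, using NLTK. The exact tags assigned depend on the tagger's training data, so treat the noun (NN) versus adjective (JJ) outcome as illustrative rather than guaranteed.

```python
# Part-of-speech tagging of the two example sentences with NLTK.
# Resource names ("punkt", "averaged_perceptron_tagger") may vary by NLTK version.
import nltk

nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

sentences = [
    "Alice is swimming against the current.",
    "The current version of the report is in the folder.",
]

for sentence in sentences:
    tokens = nltk.word_tokenize(sentence)
    tags = nltk.pos_tag(tokens)  # the tagger chooses between noun (NN) and adjective (JJ) readings from context
    print([t for t in tags if t[0] == "current"])
```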

NLU is used in data mining to understand consumer attitudes. In particular, sentiment analysis enables brands to monitor their customer feedback more closely, allowing them to cluster positive and negative social media comments and track net promoter scores. By reviewing comments with negative sentiment, companies are able to identify and address potential problem areas within their products or services more quickly.

NLU reads, understands, processes, and creates speech. Chat-enabled business bots interact with users just as a real human would, without any supervision. NLU can be applied to gather news, categorize and archive text, and analyse content. API.ai, acquired by Google, provides tools for speech recognition and NLU. NLU mainly scrapes through unstructured language information and converts it into structured data, which is then processed and analysed for the desired results.

Use Cases

· Automatic Ticket Tagging & Routing. With text analysis solutions like MonkeyLearn automating customer service, machines can understand the content of customer support tickets and route them to the correct departments without employees having to open every single ticket, and can help them prioritize urgent tickets. According to Zendesk, tech companies receive more than 2,600 customer support inquiries per month, largely as unstructured data (email, social media, live chat, etc.). NLU can classify these tickets by topic, sentiment, and urgency (among other attributes), so they can be routed directly to the relevant agent and prioritized.

· Machine Translation (MT). Accurately translating text or speech from one language to another is one of the toughest challenges of natural language processing and natural language understanding. Using complex algorithms that rely on linguistic rules and machine learning, Google Translate, Microsoft Translator, and Facebook Translation have become leaders in the field of "generic" language translation. With machine translation tools, text can be typed or whole documents uploaded to receive translations in dozens of languages. Google Translate also includes OCR software, which allows machines to extract text from images, read it, and translate it.

· Automated Reasoning. Automated reasoning is a subfield of cognitive science that is used to automatically prove mathematical theorems or make logical inferences about a medical diagnosis. It gives machines a form of reasoning or logic, and allows them to infer new facts by deduction. Simply put, using previously gathered and analysed information, computer programs are able to generate conclusions using IF-THEN deduction rules.

· Question Answering. Question answering is a subfield of NLP and speech recognition that uses NLU to help computers automatically understand natural language questions. For example, here’s a common question you might ask Google Assistant: “What’s the weather like tomorrow?” NLP tools can split this question into a topic (weather) and a date (tomorrow), understand it, and gather the most appropriate answer from unstructured collections of “natural language documents”: online news reports, collected web pages, reference texts, etc. By default, virtual assistants report the weather for the current location unless a particular city is specified (a toy slot-extraction sketch follows this list).
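Below is a toy, rule-based sketch of splitting such a question into a topic and a date slot. The parse_question helper and keyword lists are hypothetical simplifications; real assistants rely on trained intent and entity recognition models rather than hard-coded keywords.

```python
# Toy rule-based slot extraction in the spirit of the weather question above.
def parse_question(question):
    q = question.lower()
    topic = "weather" if "weather" in q else "unknown"   # crude topic detection
    date = "tomorrow" if "tomorrow" in q else "today"    # default to the current day
    return {"topic": topic, "date": date}

print(parse_question("What's the weather like tomorrow?"))
# {'topic': 'weather', 'date': 'tomorrow'}
```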

4.0 Natural Language Generation (NLG)

Natural Language Generation (NLG) acts as a sort of translator that turns structured data, such as a knowledge base, into human-readable text. It can produce long documents that summarize or justify the contents of databases, for example by summarizing medical records, generating product descriptions from different sources, or writing automated news reports. Automated NLG is often compared to the process humans use when they turn ideas into writing or speech. NLG can also be viewed as the opposite of NLU. Based on its sophistication, NLG can be classified into three types: basic NLG, template-driven NLG, and advanced NLG.

· Basic NLG - The simplest level of NLG identifies and gathers a few data points and transcribes them into sentences. For example, a simple weather report like this: "The humidity today is 78%."

· Template-driven NLG - The next level of NLG uses template-heavy paragraphs to generate language from dynamic data. It relies on hard-coded rules with canned text, placeholders, and special data representations. Here, language is generated by virtue of predefined business rules guided by control structures such as if/else statements and loops (a minimal sketch follows this list). Sports score charts, stock market updates, and basic business reports can be produced with this type of NLG.

· Advanced NLG - Advanced NLG tools are more flexible than basic and template-driven NLG. They use machine learning to convert data into narratives with a distinct introduction, elaboration, and conclusion. Deep learning neural networks that learn lexical, morphological, and grammatical patterns from written language are applied to execute this form of NLG. Phrazor, an augmented analytics tool, uses NLG technology to generate elaborate narratives, be it in sports, finance, or pharma, as per the end user's requirements. It also features report templates built for a wide range of use cases, which help business users generate insights from business data within minutes.
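Returning to the template-driven level described above, here is a minimal sketch built from canned text, placeholders, and if/else rules; the weather data and the rain-chance threshold are invented for illustration.

```python
# Minimal template-driven NLG: placeholders filled from structured data plus hard-coded rules.
def weather_report(data):
    sentences = [f"The humidity today is {data['humidity']}%."]
    if data["rain_chance"] > 50:                       # hard-coded business rule
        sentences.append(f"There is a {data['rain_chance']}% chance of rain, so take an umbrella.")
    else:
        sentences.append("No rain is expected today.")
    return " ".join(sentences)

print(weather_report({"humidity": 78, "rain_chance": 60}))
```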

An automated text generation process involves 6 stages:

Figure 3. How NLG works

· Content Determination. The limits of the content should be determined, since the data often contains more information than necessary. In a football news example, content about goals, cards, and penalties will be important to readers.

· Data Interpretation. The analysed data is interpreted. With machine learning techniques, patterns can be recognized in the processed data; this is where the data is put into context. For instance, information such as the winner of the match, the goal scorers and assisters, and the minutes when goals were scored is identified in this stage.

· Document Planning. In this stage, the structures in the data are organized with the goal of creating a narrative structure and document plan. Football news generally starts with a paragraph that gives the score of the game along with a comment on its intensity and competitiveness; the writer then recalls the teams' pre-game standings, describes other highlights of the game in the following paragraphs, and ends with player and coach interviews.

· Sentence Aggregation. Also called micro-planning, this process is about choosing the expressions and words of each sentence for the end user. In other words, this stage is where different sentences are aggregated in context because of their relevance.

· Grammaticalization. The grammaticalization stage makes sure that the whole report follows correct grammar, spelling, and punctuation. This includes validating the text against the rules of syntax, morphology, and orthography. For instance, football reports are written in the past tense.

· Language Implementation. This stage involves inserting the data into templates and ensuring that the document is output in the right format and according to the user's preferences (a toy end-to-end sketch follows this list).
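Putting the six stages together for the football example, here is a toy sketch; the match data, templates, and stage boundaries are simplified assumptions, not a production pipeline.

```python
# Toy end-to-end walk-through of the six NLG stages for a football report.
match = {"home": "Rovers", "away": "United", "home_goals": 2, "away_goals": 1,
         "scorers": ["Silva (12')", "Mensah (55')", "Okafor (78')"]}

# Stages 1-2. Content determination and data interpretation: pick the facts that matter.
winner = match["home"] if match["home_goals"] > match["away_goals"] else match["away"]
score = f"{match['home_goals']}-{match['away_goals']}"

# Stages 3-4. Document planning and sentence aggregation: order the facts into a narrative.
facts = [
    f"{winner} won the match {score}.",
    f"Goals were scored by {', '.join(match['scorers'])}.",
]

# Stages 5-6. Grammaticalization and language implementation: past tense, punctuation,
# and output in the reader's preferred format (plain text here).
report = " ".join(facts)
print(report)
```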

NLG Tools

NLG relies on machine learning algorithms and other approaches to create machine-generated text in response to user inputs. Some of the methodologies used include the following:

· Markov Chain. The Markov model is a mathematical method used in statistics and machine learning to model systems that make random transitions, such as language generation. A Markov chain starts from an initial state and then randomly generates subsequent states, with the probability of each next state depending only on the current one. In a language-generation context, the algorithm creates phrases and sentences by choosing words that are statistically likely to follow one another.

· Recurrent Neural Network (RNN). These AI systems are used to process sequential data in different ways. RNNs can be used to transfer information from one system to another, such as translating sentences written in one language to another. RNNs are also used to identify patterns in data, which can help in identifying images. An RNN can be trained to recognize different objects in an image or to identify the various parts of speech in a sentence.

· Long Short-Term Memory (LSTM). This type of RNN is used in deep learning where a system needs to learn from experience. LSTM networks are commonly used in NLP tasks because they can learn the context required for processing sequences of data. To learn long-term dependencies, LSTM networks use a gating mechanism that controls how much information from previous steps affects the current step.

· Transformer. This neural network architecture is able to learn long-range dependencies in language and can generate sentences from the meanings of words. The Transformer was introduced by researchers at Google in the 2017 paper "Attention Is All You Need" and consists of an encoder, which processes inputs of arbitrary length, and a decoder, which outputs the generated sentences.

Three widely used Transformer-based models are as follows:

· Generative Pre-trained Transformer (GPT) is a family of autoregressive language models developed by OpenAI, increasingly used with business intelligence (BI) software. When GPT is integrated with a BI system, it uses NLG to write reports, presentations, and other content. The system generates content based on the information it is fed, which can be a combination of data, metadata, and procedural rules.

· Bidirectional Encoder Representations from Transformers (BERT) is a language model from Google built on the Transformer encoder. BERT learns human language by capturing syntactic information, the relationships between words, and semantic information, the meanings of the words, and it underpins improvements to services such as Google Search (a brief usage sketch of GPT- and BERT-style models follows this list).

· XLNet is a Transformer-based artificial neural network trained on large sets of text data. It identifies patterns that it uses to draw logical conclusions, and it aims to teach itself to read and interpret text and use this knowledge to generate new text. Rather than using BERT's masked-word objective, XLNet is trained as an autoregressive language model over permutations of the word order, which allows it to capture context from both directions while still predicting words one at a time.
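As a brief illustration of how such pretrained models are used in practice, the following sketch relies on the Hugging Face transformers library, which is not mentioned in this article and is only one possible toolchain; the models download automatically on first use.

```python
# Using pretrained Transformer models via the Hugging Face `transformers` pipelines.
from transformers import pipeline

# GPT-style autoregressive text generation
generator = pipeline("text-generation", model="gpt2")
print(generator("Natural language generation can", max_length=30, num_return_sequences=1))

# BERT-style masked-word prediction
fill_mask = pipeline("fill-mask", model="bert-base-uncased")
print(fill_mask("Natural language processing is a branch of [MASK]."))
```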

Use Cases

· Weather Forecasting Systems. Generate textual weather forecasts from representations of graphical weather maps.

· Machine Translation. NLG can be considered a translation process that converts an input non-linguistic representation into language-specific output. So, if a system can generate this input representation from a source language, multilingual translation can be achieved efficiently.

· Authoring Tools. NLG technology can also be used to build authoring aids: systems which help people create routine documents.

· Text Summarization. Applications of NLG extend to automatic summary generation in the medical field, news analysis, etc.

· Question Answering. QA is the task of automatically answering a question posed in natural language. To generate an answer, a QA program may use either a pre-structured database or a collection of natural language documents.

5.0 NLP Applications in Business

General

· Translation - One of the top use cases of natural language processing is translation. Today, translation applications leverage NLP and machine learning to understand and produce an accurate translation of global languages in both text and voice formats.

· Autocorrect - NLP is used to identify a misspelled word by cross-matching it to a set of relevant words in the language dictionary used as a training set. The misspelled word is then fed to a machine learning algorithm that calculates the word's distance to correct words in the training set, adds, removes, or replaces letters from the word, and matches it to a word candidate which fits the overall meaning of a sentence.

· Autocomplete - Autocomplete, or sentence completion, combines NLP with machine learning algorithms (e.g. supervised learning, recurrent neural networks (RNNs), or latent semantic analysis (LSA)) to predict the most likely following word or sentence and so complete the meaning.

· Conversational AI - Conversational AI is the technology that enables automatic conversation between computers and humans. It is at the heart of chatbots and virtual assistants like Siri or Alexa. Conversational AI applications rely on NLP and intent recognition to understand user queries, dig into their training data, and generate a relevant response based on automated rule-based tasks, such as answering FAQs or booking flights.

· Voice recognition, automatic speech recognition (ASR), or speech to text (STT) - This is software that converts human speech from its analog form (acoustic sound waves) into a digital form that machines can recognize. ASR works by splitting the audio of a speech recording into individual sounds (tokens), analysing each sound, using algorithms (NLP, deep learning, Hidden Markov Models, N-grams) to find the most probable word fit in that language, and converting the sounds into text. Today, smartphones integrate speech recognition into their systems to conduct voice search (e.g. Siri).

· Automatic text summarization - Automatic text summarization is the process of shortening long texts or paragraphs and generating a concise summary that conveys the intended message. There are two main methods: (i) Extractive summarization: the output text is a combination of meaningful sentences extracted directly from the original text. (ii) Abstractive summarization: a more advanced method in which the output is new text; the system understands the general meaning of the sentences, interprets the context, and generates new sentences based on the overall meaning.

In both methods, NLP is used in the text interpretation steps: cleaning the text of filler words, splitting the text into shorter sentences (tokens), creating a similarity matrix that represents the relations between different tokens, calculating sentence ranks based on semantic similarity, and selecting the top-ranked sentences to generate the summary (whether extractive or abstractive); a compact extractive sketch follows this list.

· Language models - Language models are AI models which rely on NLP and deep learning to generate human-like text and speech as an output. Language models are used for machine translation, part-of-speech (PoS) tagging, optical character recognition (OCR), handwriting recognition, etc. Well-known examples include OpenAI's GPT transformers and Google's LaMDA. These models were trained on large datasets crawled from the internet and web sources in order to automate tasks that require language understanding and technical sophistication.
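As a compact illustration of the extractive summarization steps listed above, the following sketch uses TF-IDF cosine similarity as a stand-in for more sophisticated semantic similarity; the input text and the choice of scikit-learn are assumptions rather than part of the article.

```python
# Extractive summarization sketch: split into sentences, build a similarity
# matrix, rank sentences, and keep the top-ranked ones.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

text = ("NLP lets machines read text. It powers chatbots and translation. "
        "Summarization picks the most informative sentences. "
        "Many businesses use NLP to analyse customer feedback.")

sentences = [s.strip() for s in text.split(". ") if s.strip()]  # naive sentence split (drops trailing periods)
tfidf = TfidfVectorizer().fit_transform(sentences)              # vectorize each sentence
similarity = cosine_similarity(tfidf)                           # sentence-to-sentence similarity matrix
ranks = similarity.sum(axis=1)                                  # simple centrality score per sentence
top = np.argsort(ranks)[::-1][:2]                               # keep the 2 highest-ranked sentences
summary = " ".join(sentences[i] for i in sorted(top))           # restore original order
print(summary)
```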

Healthcare

· Dictation - To document clinical procedures and results, physicians dictate the processes to a voice recorder or a medical stenographer, to be transcribed later to text and entered into the EMR and EHR systems. NLP can be used to analyse the voice records and convert them to text, in order to be fed to EMRs and patients' records.

· Clinical documentation - Primary care physicians normally spend around six hours on EHR data entry during a typical workday. NLP can be used in combination with OCR to extract healthcare data from EHRs, physicians' notes, or medical forms so it can be fed to data entry software (e.g. RPA bots). This significantly reduces the time spent on data entry and improves data quality by reducing manual errors in the process.

· Clinical trial matching - NLP can be used to interpret the descriptions of clinical trials and check unstructured doctors' notes and pathology reports in order to recognize individuals who would be eligible to participate in a given clinical trial. The NLP model uses medical records and research papers as training data so that it can recognize medical terminology and synonyms, interpret the general context of a trial, generate a list of criteria for trial eligibility, and evaluate participants' applications accordingly. A team at Columbia University developed an open-source tool called DQueST which reads trials on ClinicalTrials.gov and then generates plain-English questions such as "What is your BMI?" to assess users' eligibility.

· Computational phenotyping - Phenotyping is the process of analyzing a patient's observable physical or biochemical characteristics (phenotype), as opposed to relying only on genetic data from DNA sequencing or genotyping. Computational phenotyping uses structured data (EHR records, diagnoses, medication prescriptions) and unstructured data (physicians' vocal records which summarize patients' medical history, immunizations, allergies, radiology images, and laboratory test results, as well as progress notes and discharge reports). Computational phenotyping enables patient diagnosis categorization, novel phenotype discovery, clinical trial screening, pharmacogenomics, drug-drug interaction (DDI) analysis, etc. In this context, NLP is used for keyword search in rule-based systems which look for specific keywords (e.g. pneumonia in the right lower lobe) in the unstructured data, filter the noise, check for abbreviations or synonyms, and match the keyword to an underlying event defined previously by rules (a toy keyword-matching sketch follows this list).

· Computer assisted coding (CAC) - Computer assisted coding (CAC) tools are a type of software that screens medical documentation and produces medical codes for specific phrases and terminologies within the document. NLP-based CAC tools can analyse and interpret unstructured healthcare data to extract features (e.g. medical facts) that support the codes assigned.

· Clinical diagnosis - NLP is used to build medical models which can recognize disease criteria based on standard clinical terminology and medical word usage. IBM Watson, a cognitive NLP solution, has been used at MD Anderson Cancer Center to analyse patients' EHR documents and suggest treatment recommendations, with a reported 90% accuracy. However, Watson struggled to decipher physicians' handwriting and generated incorrect responses due to shorthand misinterpretations. According to project leaders, Watson could not reliably distinguish the acronym for Acute Lymphoblastic Leukaemia, "ALL", from the physician's shorthand for allergy, "ALL".

· Virtual therapists - Virtual therapists are an application of conversational AI in healthcare. NLP is used to train the algorithm on mental health conditions and evidence-based guidelines in order to deliver cognitive behavioural therapy (CBT) for patients with depression, post-traumatic stress disorder (PTSD), and anxiety. In addition, virtual therapists can be used to converse with autistic patients to improve their social skills and job interview skills. For example, the Woebot chatbot provides CBT, mindfulness exercises, and Dialectical Behavior Therapy (DBT).
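Here is a toy version of the rule-based keyword search described under computational phenotyping above; the rules, synonyms, and clinical notes are invented for illustration and are far simpler than production phenotyping pipelines.

```python
# Toy rule-based keyword search over unstructured clinical notes.
import re

RULES = {
    "right lower lobe pneumonia": [
        r"pneumonia in the right lower lobe",
        r"\bRLL pneumonia\b",            # abbreviation variant (invented rule)
        r"right lower lobe infiltrate",  # loose synonym for illustration
    ],
}

notes = [
    "CXR shows RLL pneumonia, start antibiotics.",
    "No acute findings; lungs are clear.",
]

for note in notes:
    for concept, patterns in RULES.items():
        if any(re.search(p, note, flags=re.IGNORECASE) for p in patterns):
            print(f"Matched '{concept}' in: {note}")
```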

Finance

· Credit scoring - Credit scoring is a statistical analysis performed by lenders, banks, and financial institutions to determine the creditworthiness of an individual or a business. NLP can assist in credit scoring by extracting relevant data from unstructured documents (loan documentation, income, investments, expenses, etc.) and feeding it to credit scoring software to determine the credit score. Modern credit scoring software also uses NLP to extract information from personal profiles (e.g. social media accounts, mobile applications) and machine learning algorithms to weigh these features and assess creditworthiness.

· Insurance claims management - NLP can be used in combination with OCR to analyse insurance claims. For example, IBM Watson has been used to comb through structured and unstructured text data in order to detect the right information to process insurance claims, and to feed it to an ML algorithm which labels the data according to the sections of the claim application form and the terminology filled into it.

· Financial reporting - NLP can be combined with machine learning algorithms to identify significant data in unstructured financial statements, invoices, or payment documentation, extract it, and feed it to an automation solution, such as an RPA bot used for reporting, in order to generate financial reports.

· Financial auditing - NLP enables the automation of financial auditing by screening financial documents, classifying financial statement content, and identifying document similarities and differences. In turn, this enables the detection of deviations and anomalies in financial statements.

· Fraud detection - NLP can be combined with ML and predictive analytics to detect fraud and misrepresented information in unstructured financial documents. NLP linguistic models have detected deceptive emails, which were identified by a "reduced frequency of first-person pronouns and exclusive words, and elevated frequency of negative emotion words and action verbs". Researchers have also used an SVM classifier to analyse linguistic features of annual reports, including voice, active versus passive tone, and readability, detecting an association between these features and fraudulent financial statements (a schematic sketch follows this list).

· Stock prices prediction - NLP is used in combination with KNN classification algorithms to assess real-time web-based financial news, in order to facilitate 'news-based trading', where analysts seek to isolate financial news that affects stock prices and market activity. To extract real-time web data, analysts can rely on web scraping or web crawling tools. Bright Data's data collector is a web scraping tool that targets websites, extracts their data in real-time, and delivers it to end users in the designated format.
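Below is a schematic sketch of the SVM-based fraud detection idea mentioned above, using TF-IDF features and scikit-learn's LinearSVC. The four labelled snippets are invented stand-ins for real annual-report data, so this shows the shape of the approach rather than a working detector.

```python
# Schematic text-classification sketch: TF-IDF features fed to a linear SVM.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

reports = [
    "Revenue grew steadily and we maintained conservative accounting policies.",
    "Management was forced to restate earnings after irregularities were found.",
    "Cash flows remained stable with transparent related-party disclosures.",
    "Aggressive revenue recognition masked mounting undisclosed liabilities.",
]
labels = [0, 1, 0, 1]  # 0 = unremarkable, 1 = potentially fraudulent (toy labels)

model = make_pipeline(TfidfVectorizer(), LinearSVC())
model.fit(reports, labels)

print(model.predict(["Earnings were restated due to accounting irregularities."]))
```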

Retail and E-commerce

· Customer service chatbots - A 2019 survey revealed that 65% of decision-makers in customer service believe that a chatbot can understand the customer's context, and 52% said that chatbots can automate actions based on customer responses. Chatbots in customer service can answer FAQs, schedule appointments, book tickets, process and track orders, cross-sell, and onboard new users.

· In-store bot - Several retail shops use NLP-based virtual assistants in their stores to guide customers in their shopping journey. A virtual assistant can be in the form of a mobile application which the customer uses to navigate the store or a touch screen in the store which can communicate with customers via voice or text. In-store bots act as shopping assistants, suggest products to customers, help customers locate the desired product, and provide information about upcoming sales or promotions.

· Market intelligence - Marketers can rely on web scraping to extract e-commerce data (e.g. blogs, social media posts, news websites), as well as product data (reviews, ranks, comments), and combine it with NLP capabilities to analyse consumer sentiments, detect market trends, and optimize their marketing strategies.

· Semantic based search - Semantic search refers to a search method that aims to not only find keywords but understand the context of the search query and suggest fitting responses. Many online retail and e-commerce websites rely on NLP-powered semantic search engines to leverage long-tail search strings (e.g. women white pants size 38), understand the shopper's intent, and improve the visibility of numerous products. Retailers claim that on average, e-commerce sites with a semantic search bar experience a mere 2% cart abandonment rate, compared to the 40% rate on sites with non-semantic search.

HR

· Resume evaluation - NLP can be used in combination with classification machine learning algorithms to screen candidates' resumes, extract relevant keywords (education, skills, previous roles), and classify candidates based on their profile match to a certain position in an organization. Additionally, NLP can be used to summarize resumes of candidates who match specific roles in order to help recruiters skim through resumes faster and focus on specific requirements of the job.

· Recruiting chatbots - Recruiting chatbots, also known as hiring assistants, are used to automate the communication between recruiters and candidates. Recruiting chatbots use NLP for screening candidate resumes, scheduling interviews, answering candidates' questions about the position, building candidate profiles, and facilitating candidate onboarding.

· Interview assessment - Many large enterprises, especially during the COVID-19 pandemic, use interviewing platforms to conduct interviews with candidates. These platforms enable candidates to record videos, answer questions about the job, and upload files such as certificates or reference letters. NLP is particularly useful on interview platforms for analysing candidate sentiment, screening uploaded documents, checking references, and detecting specific keywords which can reflect positive or negative behaviour during the interview, as well as for transcribing the video and summarizing it for archiving purposes.

· Employee sentiment analysis - NLP can be used to detect employees' job satisfaction, motivation, friction areas, and difficulties, as well as racial and sexual bias. NLP is used to screen feedback surveys, public emails, employee comments on social media and job employment websites, etc. This enables HR teams to better detect conflict areas, identify potentially successful employees, recognize training requirements, keep employees engaged, and optimize the work culture.

Cybersecurity

· Spam detection - NLP models can be used for text classification in order to detect spam-related words, sentences, and sentiment in emails, text messages, and social media messaging applications. Spam detection NLP models typically follow these steps (a minimal classification sketch follows this list): (i) Data cleaning and pre-processing: removing filler and stop words. (ii) Tokenization: splitting text into smaller sentences and words. (iii) Part-of-speech (PoS) tagging: tagging each word in a sentence or paragraph with its corresponding part-of-speech tag, based on its context and definition. The processed data is then fed to a classification algorithm (e.g. decision tree, KNN, random forest) to classify the data as spam or ham (i.e. non-spam email).

· Data exfiltration prevention - Data exfiltration is a security breach that involves unauthorized copying or transfer of data from one device to another. To exfiltrate data, attackers use techniques such as domain name system (DNS) tunneling (DNS queries that smuggle information from a user's computer, the DNS client, to a DNS server) and phishing emails that trick users into handing personal information to hackers. NLP can be used to detect suspicious DNS queries, malicious language, and text anomalies in order to detect malware and prevent data exfiltration.
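Finally, here is a minimal sketch of the spam/ham classification steps listed above, using scikit-learn; CountVectorizer handles tokenization and English stop-word removal, a decision tree does the classification, the PoS-tagging step is omitted for brevity, and the training messages are invented.

```python
# Minimal spam/ham classification sketch: tokenize, remove stop words, classify.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

messages = [
    "URGENT: your account is suspended, verify your bank details now",
    "Congratulations, you have won a free prize, click here",
    "Can we move tomorrow's meeting to 3pm?",
    "Here are the notes from today's project call",
]
labels = ["spam", "spam", "ham", "ham"]

model = make_pipeline(
    CountVectorizer(stop_words="english"),   # tokenization + stop-word removal
    DecisionTreeClassifier(random_state=0),  # one of the classifiers named above
)
model.fit(messages, labels)

print(model.predict(["Claim your free prize now, verify your details"]))
```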

