History of NLP and NLU to drive machine understanding of language
Nym - Under the Hood V1

Most of us have shopped for a used car before. Some will say the experience was harrowing and included high-pressure sales tactics. Some will say they had a great experience with a no-haggle sales methodology. Some may only buy used cars from private sellers for a better value or fear of dealership antics. Regardless of where or how you purchase your used vehicle, it is essential to research the vehicle's history. Simply popping the hood may not tell you much (unless the engine is smoking). You need to do a deeper dive to understand how your new-to-you vehicle will perform. What good is a shiny engine when the oil hasn't been changed in 30K miles?

We shall not pop Nym's shiny hood just yet. First, let’s dive into the history of language understanding. This history will help build the foundation for how Nym works.

History of Language Understanding 

Interest in machine understanding of language began in the mid-twentieth century with the renowned Turing test, which became a criterion for a machine's ability to understand human language and sparked the idea of machine language understanding and translation. In the 1950s, the field of modern generative linguistics was born; it influenced the cognitive sciences and led to the development of natural language processing (NLP). Among the pioneers in this area was theoretical linguist Noam Chomsky, whose work revolutionized linguistics by arguing that formal models of language should be able to distinguish grammatically correct sentences from incorrect ones, just as people do.

In the decades that followed, the field split into two main branches: a rule-based approach focused on syntax, and a statistical approach aimed at probabilistic language understanding. Ultimately, the statistical approach proved more versatile and became more widely adopted than the rule-based one, with applications in areas such as machine translation, speech recognition, sentiment extraction, and the understanding of expressed intent and meaning.

In the 1960s, Natural Language Understanding (NLU) emerged as a subfield of NLP, focused on human-machine interaction using both rule-based and statistical approaches. NLU is concerned with machine reading comprehension: it addresses challenges such as ambiguity resolution and discourse modeling, and it detects logical relationships between the objects described in a passage of text. To understand text, many NLU approaches use knowledge graphs, known as ontologies, to map linguistic entities to real-world concepts and events.
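
To make the idea of ontology-based entity linking a bit more concrete, here is a deliberately tiny sketch in Python. The phrases, concept identifiers, and matching logic are all hypothetical and purely illustrative; real NLU systems rely on large knowledge graphs and statistical disambiguation rather than exact string lookup.

    # Toy illustration only: a tiny hand-built "ontology" that maps surface
    # phrases to hypothetical concept identifiers.
    ONTOLOGY = {
        "chest pain": "Symptom/ChestPain",
        "aspirin": "Medication/Aspirin",
        "myocardial infarction": "Diagnosis/MyocardialInfarction",
    }

    def link_entities(text: str) -> list[tuple[str, str]]:
        """Return (phrase, concept) pairs for every ontology phrase found in the text."""
        lowered = text.lower()
        return [(phrase, concept) for phrase, concept in ONTOLOGY.items() if phrase in lowered]

    if __name__ == "__main__":
        note = "Patient reports chest pain; given aspirin to rule out myocardial infarction."
        for phrase, concept in link_entities(note):
            print(f"{phrase!r} -> {concept}")

Even this toy version shows the core idea: once a phrase in the text is linked to a concept, downstream logic can reason over concepts and their relationships rather than over raw strings.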

Nym has leveraged these advances in linguistic theory and syntactic analysis to develop a new form of NLU designed specifically for understanding clinical language. By combining computational linguistics with clinical knowledge, Nym's Clinical Language Understanding (CLU) technology has created the first effective solution for machines to understand the narrative of a patient chart.

Are you ready to take a look under the hood? Let's run through what we know.

  1. We have had a chance to review the history of the technology, and it looks promising.
  2. Nym has made advancements that enable it to understand clinical language and reconstruct the narrative for autonomous medical coding.

So far, so good, right? The history checks out, but does it work as described? Let's dive deeper! Stay tuned for the next edition, where we will take a peek under the hood and break down each component of Nym one by one.

I look forward to your thoughts below and at [email protected]
