Web Site Interaction Use Case – Loosely Coupling ChatGPT to Enterprise Knowledge Graphs

Large Language Models (LLMs) are a powerful innovation that enables software application functionality to be driven via conversational UI/UX. To put it mildly, this is a game-changer already disrupting the software industry as we know it—despite the well-documented issues LLMs have with factual inaccuracies, commonly referred to as “hallucinations.”

In this post, we explore the common challenge of website interaction (a sub-category of customer support) to demonstrate the utility of ChatGPT when loosely coupled with a Knowledge Graph, facilitated by the OpenLink Personal Assistant (OPAL) Chat Widget. Fundamentally, nobody wants to navigate a web of documents on a website or corporate intranet if they can obtain the information they seek via conversation with a Smart Agent.

What is the OPAL Chat Widget?

The OPAL Chat Widget is a JavaScript-based client library for interacting with an OPAL Server instance. The OPAL Server itself is a Virtuoso Server module comprising a collection of Stored Procedures that provide guard-railed interactions with ChatGPT via its Completions API, courtesy of that API's external function (function-calling) integration capability.
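
To make the widget's role concrete, here is a minimal sketch of how a JavaScript chat widget of this kind might be embedded in a web page. The script URL, the OpalChatWidget global, and the configuration options are illustrative assumptions for this sketch, not the widget's documented API.

```javascript
// Hypothetical embed sketch: the script URL, global name, and options below are
// assumptions made for illustration, not the widget's documented API.
(function () {
  const script = document.createElement("script");
  script.src = "https://example.com/opal-chat-widget.js"; // placeholder bundle URL
  script.async = true;
  script.onload = function () {
    // Assumed initializer exposed by the widget bundle.
    window.OpalChatWidget.init({
      serverUrl: "https://my-virtuoso-host/opal",       // OPAL Server endpoint (assumed path)
      anonymous: true,                                  // anonymous access, as discussed below
      openers: ["How do I install a Virtuoso license?"] // example conversation opener
    });
  };
  document.head.appendChild(script);
})();
```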

Why is this useful?

It enables controlled interactions with ChatGPT that minimize the effects of hallucinations.

How does it work?

Loosely Coupled Interactions across OPAL Widget, Server, Knowledge Graph, and ChatGPT

The interaction process is straightforward:

  1. The user submits a prompt to the OpenLink Personal Assistant (OPAL), which acts as a protective layer around ChatGPT or other LLMs, such as Mistral.
  2. Through external function integration between OPAL and ChatGPT, a context for interaction is established that drives the prompt completion pipeline, including Knowledge Graph lookups (see the sketch after this list).
  3. The response is returned to the user, with a notice indicating whether the response was sourced from our knowledge base or inferred by ChatGPT.
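
For readers who want to see the shape of this pipeline, the following Node.js sketch approximates steps 1-3 using the OpenAI Chat Completions API's function-calling mechanism plus a standard SPARQL Protocol lookup. It is not the OPAL Server implementation (which lives in Virtuoso Stored Procedures); the endpoint URLs, the model choice, and the knowledge_graph_lookup tool name are assumptions made for illustration.

```javascript
// Sketch of the prompt-completion pipeline. Only the OpenAI Chat Completions API
// and the standard SPARQL Protocol are taken as given; endpoints and the tool
// name are illustrative assumptions. Requires Node.js 18+ (built-in fetch).

const OPENAI_URL = "https://api.openai.com/v1/chat/completions";
const SPARQL_ENDPOINT = "https://my-virtuoso-host/sparql"; // assumed Virtuoso endpoint

// Declare a Knowledge Graph lookup as an external function (tool) the model may call.
const tools = [{
  type: "function",
  function: {
    name: "knowledge_graph_lookup", // hypothetical tool name
    description: "Run a SPARQL query against the enterprise Knowledge Graph",
    parameters: {
      type: "object",
      properties: { query: { type: "string", description: "SPARQL query text" } },
      required: ["query"]
    }
  }
}];

async function chat(messages) {
  // Step 2: each turn of the completion pipeline goes through the Completions API.
  const res = await fetch(OPENAI_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`
    },
    body: JSON.stringify({ model: "gpt-4o", messages, tools })
  });
  return (await res.json()).choices[0].message;
}

async function sparqlLookup(query) {
  // Standard SPARQL Protocol over HTTP, returning JSON results.
  const res = await fetch(`${SPARQL_ENDPOINT}?query=${encodeURIComponent(query)}`, {
    headers: { Accept: "application/sparql-results+json" }
  });
  return res.json();
}

async function answer(userPrompt) {
  // Step 1: the user's prompt enters the pipeline.
  const messages = [{ role: "user", content: userPrompt }];
  const first = await chat(messages);

  if (!first.tool_calls) {
    // Step 3: no lookup was requested, so the answer is inferred by the model.
    return { source: "inferred by ChatGPT", text: first.content };
  }

  // The model asked for a Knowledge Graph lookup; run it and feed the results
  // back so the final completion is grounded in the knowledge base.
  const call = first.tool_calls[0];
  const results = await sparqlLookup(JSON.parse(call.function.arguments).query);
  messages.push(first, {
    role: "tool",
    tool_call_id: call.id,
    content: JSON.stringify(results)
  });
  const grounded = await chat(messages);

  // Step 3: annotate the response with its provenance.
  return { source: "knowledge base", text: grounded.content };
}
```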

To support anonymous access while managing OpenAI usage costs, we implement fine-grained attribute-based access controls. These controls create a sandbox environment in which steps 1-3 operate under a strictly enforced usage policy.
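
As a rough illustration of what such a usage policy could look like, the sketch below checks a few request and session attributes before a prompt is forwarded. The limits, attribute names, and allowed origins are invented for this example and do not reflect OPAL's actual policy.

```javascript
// Hypothetical usage-policy check for an anonymous sandbox. All limits and
// attribute names here are illustrative, not OPAL's actual policy.
const POLICY = {
  maxPromptsPerSession: 20,   // assumed cap to keep OpenAI costs bounded
  maxPromptLength: 2000,      // characters
  allowedOrigins: ["https://www.openlinksw.com", "https://virtuoso.openlinksw.com"]
};

function checkUsagePolicy(request, session) {
  if (!POLICY.allowedOrigins.includes(request.origin)) {
    return { allowed: false, reason: "origin not permitted" };
  }
  if (request.prompt.length > POLICY.maxPromptLength) {
    return { allowed: false, reason: "prompt exceeds length limit" };
  }
  if (session.promptCount >= POLICY.maxPromptsPerSession) {
    return { allowed: false, reason: "session quota exhausted" };
  }
  session.promptCount += 1;
  return { allowed: true };
}

// Example: a request from an allowed page within quota is let through.
console.log(checkUsagePolicy(
  { origin: "https://www.openlinksw.com", prompt: "How do I install my license?" },
  { promptCount: 0 }
));
```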

Live Usage

Virtuoso License Installation Assistance via Virtuoso Support Assistant

Simply visit any of the following webpages and engage with the chatbot by asking questions or selecting one of the conversation openers:

  1. OpenLink Software Home Page
  2. Virtuoso Home Page
  3. ODBC & JDBC Data Access Drivers Home Page

Critical Technologies Used

The system I describe relies on several key components:

  1. RDF: A data definition language that accommodates a variety of notations, syntaxes, and data serialization formats.
  2. SPARQL: A query language for performing declarative operations on RDF statements stored in a Knowledge Base or Knowledge Graph, delivering the performance and scalability required for public web deployment (via our Virtuoso platform); see the example after this list.
  3. Reasoning and Inference: Integrated into the underlying data management platform (our Virtuoso platform) to enhance data manipulation and query outcomes.
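
As a concrete illustration of how these pieces are exercised, the snippet below issues a schema-agnostic SPARQL query over HTTP to a Virtuoso SPARQL endpoint and prints the results. The endpoint URL is an assumption for this example; the query syntax and JSON result format follow the standard SPARQL specifications.

```javascript
// Query a Virtuoso SPARQL endpoint using the standard SPARQL Protocol.
// The endpoint host is an assumed placeholder; the query is schema-agnostic.
const endpoint = "https://my-virtuoso-host/sparql";
const query = `
  SELECT DISTINCT ?type (COUNT(?s) AS ?instances)
  WHERE { ?s a ?type }
  GROUP BY ?type
  ORDER BY DESC(?instances)
  LIMIT 10
`;

fetch(`${endpoint}?query=${encodeURIComponent(query)}`, {
  headers: { Accept: "application/sparql-results+json" }
})
  .then(res => res.json())
  .then(json => {
    // SPARQL Protocol JSON results: one binding object per solution row.
    for (const row of json.results.bindings) {
      console.log(row.type.value, row.instances.value);
    }
  });
```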

Conclusion

As demonstrated in this post via a deployed solution, loosely coupling LLMs with enterprise Knowledge Graphs provides a powerful example of the new kinds of software solutions now possible with Smart Agents driven by conversational interfaces.
