Integrating MCP (Model Context Protocol) with LangChain4j to access GitHub
Large Language Models (LLMs) have transformed how we interact with software, but they often lack direct access to real-world data and services. The Model Context Protocol (MCP) addresses this limitation by providing a standardised way for LLMs to interact with external systems.
In this article, you'll learn how to use LangChain4j to connect an LLM with GitHub through MCP, fetching and summarising the commits of a GitHub repository.
What is MCP?
The Model Context Protocol (MCP) is an open protocol that standardises how Large Language Model (LLM) applications communicate with external data sources and tools. It enables integration between LLMs and various services, allowing them to access real-world data and perform actions through a well-defined interface. MCP follows a client-server architecture built on JSON-RPC where:
* MCP hosts are the LLM applications (chat assistants, IDEs, agents) that want to access external capabilities;
* MCP clients run inside the host and maintain a one-to-one connection with a server;
* MCP servers expose data and functionality to the clients through the protocol.
MCP servers can expose three main types of capabilities:
* Tools: functions that the LLM can invoke to perform actions, such as calling an API;
* Resources: data that the client can read, such as file contents or database records;
* Prompts: reusable prompt templates that guide the interaction with the model.
Each of these server capabilities is accessed through specific protocol messages. For example, to use a tool:
* the client sends a tools/list request to discover the tools the server exposes;
* the LLM picks the tool it considers appropriate for the task at hand;
* the client sends a tools/call request with the name of the tool and its arguments;
* the server executes the tool and returns the result to the client.
The GitHub MCP Server
The MCP ecosystem includes numerous servers, primarily implemented in TypeScript and Python, each providing specific functionality to LLMs. These servers enable LLMs to interact with various services and data sources, from file systems to databases, and from search engines to version control systems.
The GitHub MCP server exposes GitHub's functionality through a set of standardised tools that LLMs can discover and use. It can be executed locally on your machine using Docker. When an LLM connects to this MCP server, it can retrieve a list of available tools through the tools/list endpoint (a truncated response is sketched below). These tools include operations such as:
* list_commits to retrieve the commits of a branch in a repository;
* search_repositories to search for GitHub repositories;
* get_file_contents to read the contents of a file or directory;
* create_issue to open a new issue;
* create_pull_request to open a new pull request.
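As an illustration, here is a heavily truncated sketch of a tools/list response; the exact tool names, descriptions, and JSON schemas are defined by the server:

{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {
        "name": "list_commits",
        "description": "Get list of commits of a branch in a GitHub repository",
        "inputSchema": { "type": "object", "properties": { "owner": { "type": "string" }, "repo": { "type": "string" } } }
      }
    ]
  }
}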
When an LLM decides to use a tool, it sends a tools/call request with the appropriate parameters (the name of the tool and its arguments). The GitHub MCP server then executes the corresponding GitHub API operation and returns the results to the MCP client in a standardised format that the LLM can then process.
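For example, a tools/call request targeting the list_commits tool could look roughly like this (the argument names follow the tool's input schema; treat this as an illustrative sketch rather than a verbatim exchange):

{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "list_commits",
    "arguments": {
      "owner": "langchain4j",
      "repo": "langchain4j",
      "perPage": 3
    }
  }
}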
MCP in LangChain4j
LangChain4j has introduced a new module that simplifies the integration of MCP servers into Java applications. This module provides a clean API for establishing connections with MCP servers and executing tools through them.
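To use it, add the langchain4j-mcp module to your project; with Maven, the dependency looks like this (replace the version property with the latest release):

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-mcp</artifactId>
    <version>${langchain4j.version}</version>
</dependency>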
The core components of the LangChain4j MCP implementation include:
* McpTransport, which handles the low-level communication with the server (for example over standard input/output for a local process);
* McpClient, which manages the protocol exchanges (requests, responses, and notifications) with the server;
* McpToolProvider, which exposes the tools discovered on the server to LangChain4j AI services.
Here's how you create an MCP client in LangChain4j:
import dev.langchain4j.mcp.McpToolProvider;
import dev.langchain4j.mcp.client.DefaultMcpClient;
import dev.langchain4j.mcp.client.McpClient;
import dev.langchain4j.mcp.client.transport.McpTransport;
import dev.langchain4j.mcp.client.transport.stdio.StdioMcpTransport;
import java.util.List;

// Transport: starts the MCP server as a subprocess and talks to it over stdin/stdout
McpTransport transport = new StdioMcpTransport.Builder()
    .command(List.of("your-command"))
    .logEvents(true)
    .build();
// Client: handles the MCP protocol exchanges over the transport
McpClient mcpClient = new DefaultMcpClient.Builder()
    .transport(transport)
    .build();
// Tool provider: makes the server's tools available to AI services
McpToolProvider toolProvider = McpToolProvider.builder()
    .mcpClients(List.of(mcpClient))
    .build();
Using LangChain4j and MCP to Connect to GitHub
Let's examine a practical example that uses the GitHub MCP server to summarise commits from a public repository. First, to interact with GitHub, you need to run the GitHub MCP server in Docker. To build the Docker image, you can follow the instructions in the server's repository; the commands below sketch the idea.
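Assuming the server still lives under src/github in the modelcontextprotocol/servers repository (check the repository for the current layout), building the image looks like this:

git clone https://github.com/modelcontextprotocol/servers.git
cd servers
# Build the image under the name used later in the docker run command
docker build -t mcp/github -f src/github/Dockerfile .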
Once the GitHub MCP server is up and running, you can connect to it using LangChain4j. For that, you first need to set up the AI model (here we use OpenAI's GPT-4o-mini model):
// Configure the OpenAI chat model that will drive the conversation
ChatLanguageModel model = OpenAiChatModel.builder()
    .apiKey(System.getenv("OPENAI_API_KEY")) // the API key is read from the environment
    .modelName("gpt-4o-mini")
    .logRequests(true)   // log requests and responses to observe the tool calls
    .logResponses(true)
    .build();
Next, you configure the MCP transport layer to communicate with the GitHub MCP server through Docker. Notice the docker command that starts the mcp/github container: the -e flag forwards the GITHUB_PERSONAL_ACCESS_TOKEN environment variable (which must be set on your machine) into the container, and -i keeps stdin open so the client can exchange messages with the server:
// Start the GitHub MCP server in Docker and communicate with it over stdin/stdout
McpTransport transport = new StdioMcpTransport.Builder()
    .command(List.of("/usr/local/bin/docker", "run", "-e", "GITHUB_PERSONAL_ACCESS_TOKEN", "-i", "mcp/github"))
    .logEvents(true)
    .build();
You then create an MCP client and tool provider:
McpClient mcpClient = new DefaultMcpClient.Builder()
.transport(transport)
.build();
McpToolProvider toolProvider = McpToolProvider.builder()
.mcpClients(List.of(mcpClient))
.build();
Finally, you define a Bot interface, create an AI service implementing it, and use it to interact with GitHub by asking to summarise the last 3 commits:
// The contract of the AI service; LangChain4j generates the implementation
interface Bot {
    String chat(String userMessage);
}

Bot bot = AiServices.builder(Bot.class)
    .chatLanguageModel(model)
    .toolProvider(toolProvider)
    .build();

String response = bot.chat("Summarize the last 3 commits of the LangChain4j GitHub repository");
System.out.println("RESPONSE: " + response);
The response will be something like:
Here are the summaries of the last three commits in the LangChain4j GitHub repository:
1. Commit SHA: 1a8a70e
   * Date: February 18, 2025
   * Author: Dmytro Liubarskyi
   * Message: "docu: added tools <-> MCP links"
   * Details: This commit added documentation links between tools and MCP.
2. Commit SHA: d755ae5
   * Date: February 18, 2025
   * Author: minglu7
   * Message: "Fix deprecated methods in DefaultContentInjector (#2505)"
   * Details: This commit addressed deprecated methods in the `DefaultContentInjector` class.
3. Commit SHA: 55484f2
   * Date: February 18, 2025
   * Author: SylvestrePinget
   * Message: "Removing amazon.titan-text-express-v1 and ai21.jamba-instruct-v1:0 in ITS (#2532)"
   * Details: This commit removed the legacy models `amazon.titan-text-express-v1` and `ai21.jamba-instruct-v1:0` from the integration tests.
Under the Hood
For the LLM to be able to answer the user's question about the last commits, many interactions need to happen:
* the MCP client sends a tools/list request to the GitHub MCP server and receives the list of available tools;
* the application sends the user's prompt to the LLM, together with the descriptions of these tools;
* the LLM picks the tool it thinks is appropriate, here list_commits, and asks for it to be invoked;
* the MCP client sends a tools/call request to the GitHub MCP server;
* the server calls the GitHub REST API and returns the commits to the client;
* the result is passed back to the LLM, which produces the final summary for the user.
Conclusion
MCP provides a powerful way to extend LLM capabilities by enabling standardised access to external tools and data sources. LangChain4j's implementation makes it straightforward to integrate MCP servers into Java applications, requiring minimal setup and configuration.
MCP still faces the challenge of establishing itself in a rapidly evolving AI landscape. Wider adoption and community contributions will be crucial for MCP to become the de facto standard for connecting LLMs with external tools and services.
You can find all the code of this article in the LangChain4j Sample repository.
Where to Go Next
To learn more about MCP and LangChain4j, you can explore the MCP specification, the GitHub MCP server repository, and the LangChain4j documentation on MCP.