Building an MCP Server for Pinata’s API
Written by: Steve Simkins
Republished from here: https://bit.ly/4iT6qwX
What if AI could help manage your files? Not only on your computer, but in the cloud too? What if it could not only create content, but work with file storage to upload it, then perhaps share it to socials or put it on a t-shirt? We can all agree that AI is in quite the hype bubble, and perhaps it will pop eventually, but there are also parts of it that feel like magic. For a while, the magic came whenever a new model dropped and changed how AI chat or processing worked, but now it's all wrapped up in one word: context. AI models are limited by what they know and what they can interact with. Most people have hit a knowledge cutoff, the point where a model's training data stopped, but the magic of context is that it doesn't necessarily need to be trained in. Having worked with RAG before, we know very well how powerful context can be.
This is where the Model Context Protocol, also known as MCP, comes into play. MCP is a protocol built by Anthropic in an attempt to standardize how AI models can get extended context and take actions with it. The most common form of MCP, and the one that is all the rage right now, is the MCP server. These are essentially specialized API servers that provide resources, tools, and prompts to clients. The easiest way to think of one is as a plugin for an AI model. With something like a Notion MCP server, I can attach it to my chat session, give it access to my Notion files, and now the AI client can not only read the content inside, but create content and move things around if I want it to. Take it a step further: if I have multiple plugins that the AI uses in coordination, then we have an AI hand that stretches into multiple places on the internet with one goal: whatever I tell it to do!
Of course, we had to give it a shot ourselves, so we built pinata-mcp, a server that runs locally and gives Claude Code or Claude Desktop access to your Pinata account. You can do things like ask questions about your files on IPFS, upload a file, or even retrieve one! In this post, we'll go over the pieces that make the Pinata MCP run, so even you can have more context :)
Resources
MCP defines resources as data that a server can expose to clients. That can include things like file contents, database records, or API responses.
We stuck with the basics and used local files and directories as the resource piece of the Pinata MCP.
import { ListResourcesRequestSchema } from "@modelcontextprotocol/sdk/types.js";

// List available resources
server.server.setRequestHandler(ListResourcesRequestSchema, async () => {
  return {
    resources: [
      {
        uriTemplate: "file://{path}",
        name: "Local Files",
        description: "Access local files to upload to Pinata IPFS (only from allowed directories)"
      }
    ]
  };
});
Our server takes an argument for which directories it's allowed to read, so you can keep sensitive info secure and only grant access to what you want it to see. That can be a single folder, a collection of folders, or simply all of them. With this access, the Pinata MCP can read files and then upload them to Pinata.
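For illustration, here's a minimal sketch of how a read handler could enforce that allowlist. The allowedDirectories variable and the exact handler body are assumptions for this example, not necessarily how pinata-mcp implements it:

import { readFile } from "node:fs/promises";
import path from "node:path";
import { ReadResourceRequestSchema } from "@modelcontextprotocol/sdk/types.js";

// Hypothetical: directories passed as CLI arguments when the server starts
const allowedDirectories = process.argv.slice(2).map((dir) => path.resolve(dir));

server.server.setRequestHandler(ReadResourceRequestSchema, async (request) => {
  // Strip the file:// scheme and normalize the path
  const filePath = path.resolve(new URL(request.params.uri).pathname);

  // Only serve files that live under one of the allowed directories
  const isAllowed = allowedDirectories.some(
    (dir) => filePath === dir || filePath.startsWith(dir + path.sep)
  );
  if (!isAllowed) {
    throw new Error(`Access denied: ${filePath} is outside the allowed directories`);
  }

  const contents = await readFile(filePath, "utf-8");
  return {
    contents: [{ uri: request.params.uri, mimeType: "text/plain", text: contents }]
  };
});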
Tools
At the core of MCP are tools. These are like functions that the AI model has access to. They include descriptions and context that show the model what they can do, and tools are also strictly typed to define what can go in and what will come out. With tools, the Pinata MCP can upload new files, update existing ones, query and list them, and use groups to organize them. Here's an example for searching files:
import { z } from "zod";

server.tool(
  "searchFiles",
  {
    network: z.enum(["public", "private"]).default("public"),
    name: z.string().optional(),
    cid: z.string().optional(),
    mimeType: z.string().optional(),
    limit: z.number().optional().default(100),
  },
  async ({ network, name, cid, mimeType, limit }) => {
    try {
      // Build query parameters
      const params = new URLSearchParams();
      if (name) params.append("name", name);
      if (cid) params.append("cid", cid);
      if (mimeType) params.append("mimeType", mimeType);
      if (limit) params.append("limit", limit.toString());

      const url = `https://api.pinata.cloud/v3/files/${network}?${params.toString()}`;
      // getHeaders() builds the Pinata auth headers (a sketch follows this example)
      const response = await fetch(url, {
        method: "GET",
        headers: getHeaders(),
      });

      if (!response.ok) {
        throw new Error(`Failed to search files: ${response.status} ${response.statusText}`);
      }

      const data = await response.json();
      return {
        content: [{ type: "text", text: JSON.stringify(data, null, 2) }],
      };
    } catch (error) {
      return {
        // Flag the result as an error so the client can distinguish failures
        isError: true,
        content: [{ type: "text", text: `Error: ${error}` }],
      };
    }
  }
);
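The getHeaders helper isn't shown in the excerpt above; a minimal sketch, assuming the PINATA_JWT environment variable from the config in the next section, might look like this:

// Build the auth headers for Pinata's API using the JWT from the environment
function getHeaders(): Record<string, string> {
  const jwt = process.env.PINATA_JWT;
  if (!jwt) {
    throw new Error("PINATA_JWT environment variable is not set");
  }
  return {
    Authorization: `Bearer ${jwt}`,
  };
}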
MCP in Action
To actually use an MCP server, you run its code with arguments for what it's allowed to do, along with environment variables such as API keys. For Claude Desktop, this takes the form of a JSON config that looks something like this:
{
  "mcpServers": {
    "pinata": {
      "command": "npx",
      "args": [
        "pinata-mcp",
        "/path/to/allowed/directory"
      ],
      "env": {
        "PINATA_JWT": "<YOUR_JWT>",
        "GATEWAY_URL": "example.mypinata.cloud"
      }
    }
  }
}
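If you haven't edited this file before: at the time of writing, Claude Desktop reads it from ~/Library/Application Support/Claude/claude_desktop_config.json on macOS (or %APPDATA%\Claude\claude_desktop_config.json on Windows), and the app needs a restart to pick up changes.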
When we first started testing the Pinata MCP, we instantly had the "aha" moment. It was obvious how powerful MCPs could be, especially with the right access and context. The first thing we did was try it in Claude Code, which runs in the terminal. We started with file queries, asking about specific files and their details.
Then we felt poetic and asked it to write a poem, then upload it as a text file.
Once we had effectively tested Claude Code, it was time to move on to Claude Desktop. This is where it was truly impressive to watch. When using Claude 3.7 Sonnet with extended thinking, you can watch it make API requests, then realize it needs to refine and change the request, over and over until it has all the info it needs.
Then, of course, we had to try some content generation and uploading it to Pinata. My personal favorite was this sonnet about IPFS.
Tempted to try it out? Pinata MCP is open source and is available for anyone to use! Check out the repo below.
Wrapping Up
Despite how amazing the MCP experience has been, it still has a while to go before it's consumer ready. Installing and using these servers still requires a fair amount of technical ability, but it's a glimpse into the future nonetheless. With tools and standards like MCP, we see that the AI revolution doesn't have to be driven by new models or better hardware; it can simply take a smarter route to the end goal. AI won't be limited by how much it's trained. Instead, it will reach as far as its plugins can take it, and we're curious what that could look like in the next few years.
Happy Pinning!