FastAPI Cache: A Simple Caching System
Jeremiah Talamantes
Appsec @ Podium, Founder @ Compliiant.io, Founder @ Mitigated.io (Sold), Founder @ RedTeam Security (Sold), Author of Building Security Partner Programs, Social Engineer's Playbook and Physical Red Team Operations
Caching is an essential tool for optimizing web applications and other systems. The basic idea is to minimize server load and access times by storing a copy of frequently used data in a quickly accessible location. This approach offers several advantages that boost efficiency, user satisfaction, and overall system performance.
For starters, caching dramatically improves application performance and responsiveness. By keeping copies of frequently accessed data in a cache, an application can avoid repeatedly hitting slower storage layers such as databases and external APIs. The result is shorter response times for user queries and a smoother, more seamless user experience.
If you like my content, please visit Compliiant.io and share it with your friends and colleagues. Cybersecurity services, like Penetration Testing and Vulnerability Management, for a low monthly subscription. Pause or cancel at any time. See https://compliiant.io/
Caching also helps improve system efficiency and scalability by reducing the strain on external services and underlying databases. Serving data from the cache reduces the need to query the database directly, which in turn lessens the computational strain on the database server and the likelihood of bottlenecks during periods of high traffic. Consequently, systems may scale up to accommodate a larger user base and more transactions without breaking the bank on new gear or resources.
In addition, caching helps reduce costs and environmental impact. By optimizing resource consumption, organizations can achieve the same level of system performance with fewer servers or cloud resources, lowering operating expenses. That same efficiency shrinks the carbon footprint of IT operations and makes computing more sustainable.
Why Use FastAPI Cache?
Before we dig into code, it's worth noting why FastAPI Cache is a good fit: it integrates cleanly with FastAPI's dependency injection, supports multiple backends (including Redis and simple in-memory caches), and keeps caching logic close to your route handlers.
Getting Started with FastAPI Cache
Assuming you've got a FastAPI application up and running, integrating FastAPI Cache involves several key steps: installing the package, setting up the cache backend, and applying caching to your endpoints.
Download the sample code from my GitHub repo:
Step 1: Install FastAPI Cache
First, you'll need to install the FastAPI Cache package and its dependencies. In this example, we'll use Redis as our caching backend.
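As a minimal sketch, assuming the fastapi-cache distribution and its Redis extra (if the extra isn't available in your version, install the Redis client library separately), the install looks like this:

    pip install fastapi-cache[redis]  # assumes the Redis extra; adjust if your version packages dependencies differently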
This command installs the fastapi-cache package, equipping you with the tools needed to implement caching in your application.
Wait, does this use Redis?
Yes. But here's a short primer on Redis for those who are unfamiliar. Redis stands for Remote Dictionary Server, and it's an open-source, in-memory data structure store. It can be used as a database, cache, and message broker. Redis supports various data structures such as strings, hashes, lists, sets, sorted sets with range queries, bitmaps, hyperloglogs, geospatial indexes, and streams.
Redis is renowned for its speed, thanks to its in-memory storage. It's capable of handling millions of requests per second for real-time applications in industries like gaming, technology, financial services, and more. Beyond raw performance, Redis also offers persistence, replication, and built-in publish/subscribe messaging.
I chose Redis here because, although it requires additional setup, you can start with a very basic configuration and dramatically increase complexity as your needs grow. https://redis.com/
Setting Up Your FastAPI Application with Redis Cache
Now, let's set up FastAPI Cache using Redis as the backend. Redis is chosen for its efficiency and widespread use as a caching solution. Ensure you have a Redis server accessible for your application to connect to.
Define the Redis Cache Dependency
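Here is a minimal sketch of that dependency, assuming the fastapi-cache package's caches registry and its Redis backend:

    from fastapi_cache import caches
    from fastapi_cache.backends.redis import CACHE_KEY, RedisCacheBackend

    def redis_cache() -> RedisCacheBackend:
        # Return the Redis backend registered under CACHE_KEY during application startup.
        return caches.get(CACHE_KEY)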
In the snippet above, we define a dependency function redis_cache that retrieves the configured Redis cache backend. This function will be used with FastAPI's Depends to inject the cache into your route functions.
Initialize the Cache on Application Startup
To make sure your cache is ready to go when your application starts, just set up an event handler for the startup event.
Here, you'll initialize the Redis cache backend.
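A sketch of that startup handler, continuing with the same assumed fastapi-cache API and registering the backend when the app starts:

    from fastapi import FastAPI
    from fastapi_cache import caches
    from fastapi_cache.backends.redis import CACHE_KEY, RedisCacheBackend

    app = FastAPI()

    @app.on_event('startup')
    async def on_startup() -> None:
        # Create a Redis-backed cache and register it so redis_cache() can find it.
        rc = RedisCacheBackend('redis://redis')
        caches.set(CACHE_KEY, rc)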
This code connects to a Redis instance at the specified URL (redis://redis). Simply adjust the URL to match your Redis server configuration.
Using the Cache in Your Application Routes
With the cache setup complete, you can now use it within your application routes. The following example demonstrates fetching and setting cache values.
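A sketch of such a route, reusing the app object and redis_cache dependency from the snippets above (the key, value, and TTL are placeholders):

    from fastapi import Depends

    @app.get('/')
    async def hello(cache: RedisCacheBackend = Depends(redis_cache)):
        # Try the cache first; on a miss, store a placeholder value with a 5-second TTL.
        in_cache = await cache.get('some_cached_key')
        if in_cache is None:
            await cache.set('some_cached_key', 'new_value', 5)
        return {'response': in_cache or 'default'}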
In this route, we attempt to retrieve a value from the cache. If it's not found (in_cache is None), we set a new cache entry with a key of 'some_cached_key', assign it a value of 'new_value', and specify a Time-To-Live (TTL) of 5 seconds.
Cleaning Up on Application Shutdown
Finally, ensure you clean up the cache connection when your application is shutting down by handling the shutdown event.
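A sketch of the shutdown handler, again assuming the fastapi-cache package's close_caches helper:

    from fastapi_cache import close_caches

    @app.on_event('shutdown')
    async def on_shutdown() -> None:
        # Close any registered cache backends and their Redis connections.
        await close_caches()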
This code ensures that the cache connections are properly closed when your FastAPI application exits, preventing resource leaks or other potential issues. Remember, the specific Redis URL and cache keys should be tailored to fit your application's requirements and environment settings.
Download the code from my GitHub repo here.