Redis API response caching with Python

Published on 2025-01-02

Docker WebTech

A quick note: it only took me ~3 hours to re-deploy my FastAPI app with Redis, lol.

Redis in three sentences

(a) Redis (Remote Dictionary Server) is an open-source, in-memory (RAM, not disk) data structure store used as a database, cache, and message broker, offering ultra-fast performance thanks to its memory-based architecture. (b) It supports various data structures like strings, hashes, lists, and sets, making it versatile for real-time applications. (c) As you can tell from the name, it stores key-value pairs, hence a dictionary.

Caching 101

Caching is an effective way of storing the most frequently accessed data in a temporary storage layer to save on data retrievals and improve latency.

- For the front end / client side, we have the browser cache.
- For static assets such as images and scripts, we use a CDN (content delivery network) / edge servers.
- For the back end / server side, we have solutions like Redis or Memcached.

Data in the cache is not meant to be permanent (otherwise it will become stale) and requires expiration policies such as Time-to-Live (TTL) and manual eviction or cache invalidation.
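To make the TTL idea concrete, here is a tiny in-memory TTL cache as a sketch — a hypothetical stand-in for what Redis does at scale (the class and method names are my own illustration, not part of any library):

```python
import time

class TTLCache:
    """A minimal in-memory cache with per-key Time-to-Live expiration."""

    def __init__(self):
        self._store = {}  # key -> (value, absolute expiry timestamp)

    def set(self, key, value, ttl_seconds):
        # record the value together with the moment it expires
        self._store[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            # lazy eviction: the entry has expired, drop it on access
            del self._store[key]
            return None
        return value

cache = TTLCache()
cache.set("quote:AAPL", {"price": 123.45}, ttl_seconds=10)
print(cache.get("quote:AAPL"))  # within 10s -> {'price': 123.45}
```

Redis gives us the same semantics (plus eviction policies, persistence, and shared access across processes) without having to hand-roll this.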

Why Redis Cache

For my use case (a stock analysis app), I need to call multiple API endpoints from multiple services. My main purpose in injecting a Redis dependency into my FastAPI app is to store this frequently accessed data (API responses) in memory instead of calling the APIs live or storing the responses in the database. It is also a great way to stay under rate limits, since API throttling is a common constraint. Another cool thing is that we can set expiration times for cached data, so our cached data will always be fresh. Plus, Redis has a built-in persistence feature that saves the in-memory state to dump files. When the system starts up, the dump files are loaded and the data is ready to use again.

Redis Implementation

With that, let's begin: 1. Install Redis and run it (we can install it locally or with Docker; I'm going with Docker). For Docker, see this post. Below is a simple docker-compose file with two services: redis and redisinsight.

# docker-compose.yml

services:
  redis:
    container_name: redis
    image: "redis:alpine"
    env_file:
      # Remember to create .env file in the current working directory. 
      # The .env file should contain the key-value pair REDIS_PASSWORD=<super-secret>
      - .env
    environment:
      - REDIS_PASSWORD=${REDIS_PASSWORD}
    ports:
      # Redis listens on 6379 by default; map that port so later steps can connect on it
      - 6379:6379
    volumes:
      # mounts a directory (./redis-data) in the current working directory for persistent storage
      - ./redis-data:/data
    command:
      # Save a snapshot of the DB every 100 seconds if at least 10 write operations were performed
      - "--save 100 10"
      # Set password
      - "--requirepass ${REDIS_PASSWORD}"
      - "--loglevel warning"

  redisinsight: # redis db visualization dashboard
    container_name: redisinsight
    image: redis/redisinsight:latest
    ports:
      - 5540:5540
    volumes:
    # docker-managed named volume
      - redisinsight:/data

volumes:
  redis-data:
  redisinsight:
2. Install redis-py (the Python Redis client) in the Python environment.
Side note: Python comes bundled with the venv module for creating virtual environments. It is better to create a virtual environment for each Python project: the more projects we have, the more likely we are to need different versions of Python libraries, or even of Python itself, and newer library versions for one project can break compatibility in another. After activating the virtual environment, install redis with pip3, the package manager for Python 3 environments (much like npm for Node.js or Homebrew for macOS).
pip3 install redis
3. We can test the connection to Redis in a Python terminal session. Just type python in the terminal to open a Python session.
import redis 
pool = redis.ConnectionPool(host='localhost', port=6379, password='<super-secret>', db=0)
r = redis.Redis(connection_pool=pool)
r.ping() # should return True
To query memory usage metrics and their values (in bytes):
r.memory_stats()
4. Test data insertion and retrieval
r.set('foo', 'bar', ex=10, nx=True) # ex=10 expires the key in 10 seconds; nx (Not eXists) only sets the key if it does not already exist
r.get('foo') # should return b'bar'
Note: In Python, when interacting with Redis using a library like redis-py, the data retrieved from Redis is returned as bytes by default. This is because Redis itself communicates using raw byte strings for maximum efficiency and flexibility, as it does not assume any specific encoding. To get decoded strings back, simply add decode_responses=True to the connection pool.
redis.ConnectionPool(host='localhost', port=6379, db=0, password='<super-secret>', decode_responses=True)
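One related gotcha: Redis stores values as strings/bytes, so structured API responses (dicts, lists) need to be serialized before caching. A quick sketch with json round-tripping, no live server needed (the api_data payload is a made-up example):

```python
import json

# hypothetical API response we want to cache
api_data = {"symbol": "AAPL", "price": 123.45}

payload = json.dumps(api_data)   # string we would pass to r.set(key, payload)
restored = json.loads(payload)   # what we would do with the string from r.get(key)

print(restored == api_data)  # True
```

With decode_responses=True, r.get returns a str, so json.loads works on it directly.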
5. Wrapping up with the Redis implementation in the FastAPI project
Following the file structure proposed by FastAPI, we should create a dependency file for the Redis connection, which is what we will do here.
# redis_client.py ## this is the redis dependency file
import redis
import settings

# Create the Redis connection pool
def create_redis_pool() -> redis.ConnectionPool:
    return redis.ConnectionPool(
        host=settings.REDIS_URL,
        password=settings.REDIS_PASSWORD,
        port=settings.REDIS_PORT,
        db=0,
        decode_responses=True,
    )

# One shared pool, created once at module import time
pool = create_redis_pool()

# Get a Redis client backed by the shared connection pool
def get_redis_client() -> redis.Redis:
    return redis.Redis(connection_pool=pool)
And then, in the router file, we can inject the Redis client as a dependency. That way, the pool is created at module level and all Redis instances are created out of that one pool. Another aspect we should implement is a cache-invalidation strategy, which I will leave for now.
# In router.py
from fastapi import Depends
from redis_client import get_redis_client
import redis

# inject the Redis client as a dependency in the router function's signature
redis_client: redis.Redis = Depends(get_redis_client)

# inside the router function: fetch from Redis if the key exists,
# otherwise create the record; this is very skeleton code:
if redis_client.ping():
    redis_cache_key = f"<unique_key>"
    cached_data = redis_client.get(redis_cache_key)
    if cached_data is not None:
        return cached_data
    else:
        <some-logic-code-here>
        # cache the fresh response for 10 minutes
        redis_client.set(redis_cache_key, api_data, ex=600, nx=True)
        return api_data
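The <unique_key> placeholder above usually derives from the endpoint and its query parameters. A hypothetical helper (the function name and key format are my own, not from any library) that builds a deterministic key:

```python
import hashlib

def make_cache_key(endpoint: str, **params) -> str:
    """Build a deterministic Redis key from an endpoint and its parameters."""
    # sort the params so the same call always produces the same key,
    # regardless of keyword-argument order
    canonical = "&".join(f"{k}={params[k]}" for k in sorted(params))
    digest = hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]
    return f"cache:{endpoint}:{digest}"

key = make_cache_key("quote", symbol="AAPL", interval="1d")
print(key)  # e.g. cache:quote:<16 hex chars>
```

Hashing keeps keys short and avoids awkward characters from raw parameter values; the cache: prefix makes the cached keys easy to spot in RedisInsight.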
6. Monitoring
We should also monitor the cache hit rate and Redis memory usage, which is why we have redisinsight in our docker-compose file.
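Redis reports keyspace_hits and keyspace_misses via the INFO command (r.info("stats") in redis-py), and the hit rate is hits / (hits + misses). A quick computation with made-up numbers in the shape of that stats dict:

```python
# hypothetical counters, mimicking the shape of r.info("stats")
stats = {"keyspace_hits": 900, "keyspace_misses": 100}

hits = stats["keyspace_hits"]
misses = stats["keyspace_misses"]
# guard against division by zero on a freshly started server
hit_rate = hits / (hits + misses) if (hits + misses) else 0.0
print(f"cache hit rate: {hit_rate:.1%}")  # cache hit rate: 90.0%
```

A persistently low hit rate suggests the TTLs are too short or the cache keys are too granular.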
