Caching¶
1. What is Caching?¶
1.1 Definition¶
Caching is a technique that temporarily stores frequently accessed data in a fast storage layer (such as memory) to reduce retrieval time and improve application performance. Instead of recalculating or reloading the same data repeatedly from a slow source (e.g., a database or an API), the system can fetch it from a cache, significantly speeding up responses.
1.2 Real-World Analogy¶
Imagine a busy coffee shop. If customers keep ordering the same drink, instead of preparing it from scratch each time, the barista can pre-make a batch and serve it instantly on order. Caching works the same way: results are stored once and retrieved quickly instead of being recomputed for every request.
1.3 Basic Example of Caching in Python¶
import time

cache_store = {}

def expensive_computation(x: int) -> int:
    if x in cache_store:
        print("Returning cached result")
        return cache_store[x]
    print("Performing expensive computation...")
    time.sleep(3)  # Simulating a slow operation
    result = x * x
    cache_store[x] = result
    return result

# First call (slow)
print(expensive_computation(10))  # Computes and stores result

# Second call (fast)
print(expensive_computation(10))  # Fetches from cache
Explanation¶
- We use a dictionary `cache_store` to store computed results.
- If the result exists in the cache, it is returned instantly.
- If not, the function performs the slow operation and caches the result for future use.
- The second call avoids recomputation, making it much faster.
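The standard library packages this exact memoization pattern as `functools.lru_cache`, which keys cached results by the function's arguments:

```python
import time
from functools import lru_cache

@lru_cache(maxsize=128)  # keep up to 128 distinct results in memory
def expensive_computation(x: int) -> int:
    time.sleep(0.1)  # simulate a slow operation
    return x * x

print(expensive_computation(10))           # computed (slow)
print(expensive_computation(10))           # served from the cache (fast)
print(expensive_computation.cache_info())  # hits=1, misses=1
```

Unlike the hand-rolled dictionary, `lru_cache` also evicts the least recently used entries once `maxsize` is reached, so the cache cannot grow without bound.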
2. Benefits of Caching¶
2.1 Faster Response Times¶
Caching speeds up data retrieval by serving precomputed or preloaded results instead of performing expensive operations.
2.2 Reduced Load on Databases & APIs¶
Fetching data from cache reduces the number of database queries or API calls, minimizing load and costs.
2.3 Improved Scalability¶
With caching, applications handle more users efficiently without overwhelming the database or backend services.
2.4 Example: Database Query Without vs. With Caching¶
Without Caching (Slow Execution)¶
import sqlite3
import time

conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
cursor.execute("INSERT INTO users (name) VALUES ('Alice')")
conn.commit()

def get_user_slow(user_id: int):
    time.sleep(2)  # Simulating a slow DB query
    cursor.execute("SELECT * FROM users WHERE id = ?", (user_id,))
    return cursor.fetchone()

# Slow execution
print(get_user_slow(1))
print(get_user_slow(1))
With Caching (Fast Execution)¶
import sqlite3
import time

conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
cursor.execute("INSERT INTO users (name) VALUES ('Alice')")
conn.commit()

cache_store = {}

def get_user_fast(user_id: int):
    if user_id in cache_store:
        print("Returning from cache")
        return cache_store[user_id]
    time.sleep(2)  # Simulating a slow DB query
    cursor.execute("SELECT * FROM users WHERE id = ?", (user_id,))
    result = cursor.fetchone()
    cache_store[user_id] = result
    return result

# First call is slow, second call is instant
print(get_user_fast(1))
print(get_user_fast(1))
Explanation¶
- Without caching, every call queries the database, making repeated calls slow.
- With caching, the first call fetches from the database, but subsequent calls retrieve the stored value instantly.
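One caveat the example glosses over: a cached row goes stale if the table is updated behind it. A common remedy is to invalidate the cache entry on every write. A minimal sketch, building on the same dict-based cache:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
cursor.execute("INSERT INTO users (name) VALUES ('Alice')")
conn.commit()

cache_store = {}

def get_user(user_id: int):
    if user_id in cache_store:
        return cache_store[user_id]
    cursor.execute("SELECT * FROM users WHERE id = ?", (user_id,))
    result = cursor.fetchone()
    cache_store[user_id] = result
    return result

def update_user(user_id: int, name: str) -> None:
    cursor.execute("UPDATE users SET name = ? WHERE id = ?", (name, user_id))
    conn.commit()
    cache_store.pop(user_id, None)  # invalidate so the next read is fresh

print(get_user(1))         # (1, 'Alice')
update_user(1, "Alicia")
print(get_user(1))         # (1, 'Alicia'), not the stale cached row
```

Explicit invalidation keeps reads fast while guaranteeing that a write is never masked by an old cache entry; a time-to-live, covered next, is the lazier alternative.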
3. What is Caching in Esmerald?¶
Esmerald provides a built-in caching system to speed up responses, reduce redundant processing, and optimize performance. It supports multiple backends, including:
- In-Memory Caching (default)
- Redis Caching
- Custom Backends
Esmerald’s caching system integrates seamlessly with request handlers via the `@cache` decorator.
4. How to Use Caching in Esmerald¶
4.1 Using the `@cache` Decorator¶
The `@cache` decorator allows caching responses for a defined `ttl` (time-to-live) and a chosen backend. It can be imported from:
from esmerald.utils.decorators import cache
Basic Example¶
from esmerald import Esmerald, Gateway, get
from esmerald.utils.decorators import cache

@get("/expensive/{value}")
@cache(ttl=10)  # Cache for 10 seconds
async def expensive_operation(value: int) -> dict:
    return {"result": value * 2}

app = Esmerald(routes=[Gateway(handler=expensive_operation)])
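Under the hood, a `ttl`-based cache decorator behaves roughly like the framework-free sketch below. `simple_cache` is an illustrative name, not Esmerald's actual implementation:

```python
import functools
import time
from typing import Any, Callable

def simple_cache(ttl: float) -> Callable:
    """Illustrative ttl-based cache decorator (not Esmerald's implementation)."""
    def decorator(func: Callable) -> Callable:
        store: dict[tuple, tuple[float, Any]] = {}

        @functools.wraps(func)
        def wrapper(*args: Any) -> Any:
            entry = store.get(args)
            if entry is not None and time.monotonic() < entry[0]:
                return entry[1]  # entry still within its ttl: cache hit
            value = func(*args)  # miss or expired entry: recompute
            store[args] = (time.monotonic() + ttl, value)
            return value

        return wrapper
    return decorator

calls = {"n": 0}

@simple_cache(ttl=10)
def double(value: int) -> dict:
    calls["n"] += 1
    return {"result": value * 2}

print(double(21))   # computed
print(double(21))   # served from the cache
print(calls["n"])   # 1: the body ran only once
```

Each cached entry stores its expiry timestamp alongside the value; once `time.monotonic()` passes it, the next call recomputes and refreshes the entry.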
4.2 Specifying a Cache Backend¶
Using Redis as a Backend¶
from esmerald import get
from esmerald.caches.redis import RedisCache
from esmerald.utils.decorators import cache

redis_cache = RedisCache(redis_url="redis://localhost:6379")

@get("/data/{key}")
@cache(backend=redis_cache, ttl=30)
async def fetch_data(key: str) -> dict:
    return {"key": key, "value": key[::-1]}  # Simulating an expensive operation
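Whatever the backend, each cached response needs a deterministic key. A common scheme, shown here purely for illustration (Esmerald derives its keys internally), hashes the handler name together with the call arguments:

```python
import hashlib

def make_cache_key(func_name: str, *args: object, **kwargs: object) -> str:
    # Stable textual form of the call, hashed into a short fixed-length key
    raw = f"{func_name}:{args!r}:{sorted(kwargs.items())!r}"
    return hashlib.sha256(raw.encode()).hexdigest()[:16]

print(make_cache_key("fetch_data", "hello"))  # same input always yields the same key
```

Hashing keeps keys short and safe for any backend (file names, Redis keys) regardless of what characters appear in the arguments.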
5. Customizing Caching in Esmerald¶
5.1 Using Esmerald Settings to Set a Default Cache Backend¶
Instead of specifying the backend every time, we can configure a global cache backend using `EsmeraldAPISettings`.
Example: Setting Redis as the Default Backend¶
from esmerald import EsmeraldAPISettings
from esmerald.caches.redis import RedisCache

class CustomSettings(EsmeraldAPISettings):
    cache_backend = RedisCache(redis_url="redis://localhost:6379")
✅ Now, all `@cache` decorators without a specified backend will use Redis.
Tip
You can set the default backend to any supported backend, including custom ones. This allows you to maintain a consistent caching strategy across your application.
The default cache backend is the InMemoryCache, which is used if no backend is specified.
6. Building Custom Caching Backends¶
You can extend Esmerald’s caching system by creating your own backend.
6.1 Custom File-Based Cache Backend¶
To create a custom backend, you need to implement the `CacheBackend` interface, which can be imported from:
from esmerald.protocols.cache import CacheBackend
Example¶
import json
import os
from typing import Any

from esmerald.protocols.cache import CacheBackend

class FileCache(CacheBackend):
    def __init__(self, directory: str = "cache_files") -> None:
        self.directory = directory
        os.makedirs(directory, exist_ok=True)

    async def get(self, key: str) -> Any | None:
        filepath = os.path.join(self.directory, key)
        if os.path.exists(filepath):
            with open(filepath) as f:
                return json.load(f)
        return None

    async def set(self, key: str, value: Any, ttl: int | None = None) -> None:
        filepath = os.path.join(self.directory, key)
        with open(filepath, "w") as f:
            json.dump(value, f)

    async def delete(self, key: str) -> None:
        filepath = os.path.join(self.directory, key)
        if os.path.exists(filepath):
            os.remove(filepath)
✅ This custom backend caches data in files instead of memory or Redis.
6.2 Using the Custom Backend in Esmerald¶
Now you can use the custom backend in your Esmerald application.
from esmerald import get
from esmerald.utils.decorators import cache

file_cache = FileCache()

@get("/file-cache/{data}")
@cache(backend=file_cache, ttl=60)
async def file_cached_endpoint(data: str) -> dict:
    return {"data": data, "cached": True}
✅ Data is now cached in files instead of memory or Redis.
Recap¶
✅ Esmerald provides an easy-to-use caching system with multiple backends.
✅ You can use the `@cache` decorator to cache responses.
✅ You can set a global cache backend via `EsmeraldAPISettings`.
✅ You can create custom caching backends to store data in different ways.