What Is Caching?
Caching is a technique where the result of an expensive function call is stored so that subsequent calls with the same arguments can return instantly without recomputation.
Why Use Caching?
- Speeds up repetitive operations
- Reduces CPU usage
- Useful for recursive functions, API responses, and database queries
Built-in Caching with functools.lru_cache
Import and Apply Decorator
from functools import lru_cache

@lru_cache(maxsize=None)  # Unlimited cache
def fibonacci(n):
    if n < 2:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(50))  # Fast thanks to caching!
Without caching, calculating fibonacci(50) would take exponential time, repeating billions of redundant recursive calls.
Parameters of lru_cache
- maxsize: Maximum number of cached results. Use None for unlimited.
- typed: Set to True to cache different argument types separately (e.g., 1 and 1.0).
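To see what typed=True does in practice, here is a small sketch (the double function is just an illustrative example): with typed caching, an int argument and a float argument get separate cache entries even though they compare equal.

```python
from functools import lru_cache

@lru_cache(maxsize=None, typed=True)
def double(x):
    return x * 2

double(1)    # cached under the int key 1
double(1.0)  # cached separately under the float key 1.0
print(double.cache_info().currsize)  # 2 entries
```

Without typed=True, both calls would share a single cache entry, since 1 == 1.0.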
Checking Cache Stats
print(fibonacci.cache_info())
Output:
CacheInfo(hits=48, misses=51, maxsize=None, currsize=51)
Clearing the Cache
fibonacci.cache_clear()
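Putting cache_info() and cache_clear() together, here is a quick sketch (using a hypothetical square function) showing how clearing resets both the stored results and the statistics:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def square(n):
    return n * n

square(4)
square(4)                   # second call is a cache hit
print(square.cache_info())  # hits=1, misses=1, currsize=1

square.cache_clear()        # empty the cache entirely
print(square.cache_info())  # hits=0, misses=0, currsize=0
```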
Real Use Case: Slow Computation or Database Query
import time
from functools import lru_cache

@lru_cache(maxsize=100)
def expensive_lookup(user_id):
    # Simulate slow DB/API call
    time.sleep(2)
    return f"User {user_id}"

print(expensive_lookup(5))  # Slow on first call
print(expensive_lookup(5))  # Instant from cache
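One way to confirm the cache hit yourself is to time both calls with time.perf_counter (a self-contained sketch that repeats the function above):

```python
import time
from functools import lru_cache

@lru_cache(maxsize=100)
def expensive_lookup(user_id):
    time.sleep(2)  # simulate a slow DB/API call
    return f"User {user_id}"

start = time.perf_counter()
expensive_lookup(5)
first = time.perf_counter() - start

start = time.perf_counter()
expensive_lookup(5)
second = time.perf_counter() - start

print(f"First call:  {first:.2f}s")   # roughly 2 seconds
print(f"Second call: {second:.2f}s")  # near-instant cache hit
```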
Manual Caching Example (Using Dictionary)
cache = {}

def factorial(n):
    if n in cache:
        return cache[n]
    if n == 0:
        result = 1
    else:
        result = n * factorial(n - 1)
    cache[n] = result
    return result
Useful if you want full control over cache storage and eviction. Note that this simple version never evicts anything, so the dictionary grows without bound.
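As a sketch of that "full control", here is a size-bounded variant built on collections.OrderedDict that evicts the least recently used entry once the cache exceeds a limit (MAX_ENTRIES is an illustrative name, not part of any library):

```python
from collections import OrderedDict

MAX_ENTRIES = 3
cache = OrderedDict()

def factorial(n):
    if n in cache:
        cache.move_to_end(n)  # mark as most recently used
        return cache[n]
    result = 1 if n == 0 else n * factorial(n - 1)
    cache[n] = result
    if len(cache) > MAX_ENTRIES:
        cache.popitem(last=False)  # evict least recently used entry
    return result

print(factorial(5))  # 120
print(list(cache))   # only the 3 most recent keys: [3, 4, 5]
```

This is essentially what lru_cache does internally, which is why the built-in decorator is usually the better choice unless you need custom eviction logic.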
Advanced: Third-Party Caching with joblib
Install:
pip install joblib
Use:
import time
from joblib import Memory

cachedir = "./cache"
memory = Memory(cachedir)

@memory.cache
def slow_add(a, b):
    time.sleep(2)
    return a + b
Summary
Method | Use Case | Built-in?
---|---|---
functools.lru_cache | Recursive or repetitive functions | Yes
Manual dictionary | Custom cache control | Yes
joblib.Memory | Disk-based caching for large data | No (external)
Caching is a powerful optimization technique in Python that can drastically reduce execution time in compute-heavy or repetitive tasks. With tools like lru_cache, you can speed up functions with just one line of code.