A few months ago, I built a Django application for managing customer orders. At first, everything worked smoothly because I was testing with a small amount of data. However, as the number of users and records grew, the application became noticeably slower. Some pages took several seconds to load, and certain API responses were unacceptably slow.
Initially, I assumed the issue was due to server limitations, but after profiling the application I realized that the problem was not Django itself: it was how I had structured my database queries, caching strategy, and background tasks. Through trial and error, I found several techniques that significantly improved performance. In this post, I will share the key optimizations that made a real difference.
Identifying the Performance Issues
Before making any optimizations, I needed to understand what was slowing down the application. Django has great built-in tools for debugging, but I also used external libraries to get deeper insights.
1. Django Debug Toolbar: This helped me see the number of queries and their execution times.
pip install django-debug-toolbar
After enabling it in my project (the few lines of configuration I used are shown just after this list), I immediately noticed that some pages were making hundreds of database queries.
2. Silk Profiler: A Django profiling tool that helped me track slow API calls.
pip install django-silk
This tool revealed that some endpoints were running expensive queries repeatedly.
3. PostgreSQL EXPLAIN ANALYZE: Running queries with EXPLAIN ANALYZE showed which ones were scanning too many rows or not using indexes efficiently.
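Neither of the first two tools shows anything until it is wired into the project. The snippet below is the standard configuration from each package's documentation, condensed into one place; adjust it to your own settings and URL modules:
# settings.py
INSTALLED_APPS = [
    # ... existing apps ...
    "debug_toolbar",
    "silk",
]

MIDDLEWARE = [
    # ... existing middleware ...
    "debug_toolbar.middleware.DebugToolbarMiddleware",
    "silk.middleware.SilkyMiddleware",
]

INTERNAL_IPS = ["127.0.0.1"]  # the toolbar only renders for these addresses

# urls.py
from django.urls import include, path

urlpatterns = [
    # ... existing URL patterns ...
    path("__debug__/", include("debug_toolbar.urls")),
    path("silk/", include("silk.urls", namespace="silk")),
]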
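For the query plans, you do not even have to leave Django: on PostgreSQL, a queryset's explain() method accepts the ANALYZE option and returns the same plan you would get from psql.
# Prints the PostgreSQL plan with actual row counts and timings
print(Order.objects.filter(customer_id=123).explain(analyze=True))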
Optimizing Database Queries
Reducing Unnecessary Queries
One of the biggest performance issues in Django applications is making too many database queries, especially the “N+1 query problem.” At first, I wrote my queries like this:
orders = Order.objects.all()
for order in orders:
    print(order.customer.name)
The problem with this code is that accessing order.customer triggers a separate database query for every single order. With hundreds of orders, this resulted in hundreds of queries.
I fixed this by using select_related, which performs a SQL join and retrieves related data in a single query:
orders = Order.objects.select_related('customer').all()
for order in orders:
    print(order.customer.name)
Now, only one query is made instead of hundreds.
For reverse and many-to-many relationships, where each order can have many related objects, I used prefetch_related, which fetches the related rows in a second query and joins them in Python:
orders = Order.objects.prefetch_related('items').all()
for order in orders:
    for item in order.items.all():
        print(item.product.name)
This significantly reduced the number of queries and improved response times.
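One detail worth calling out in the loop above: item.product still costs one query per item unless the product is joined in as well. A Prefetch object handles that; the OrderItem name and import path below are assumptions about how the item model is defined.
from django.db.models import Prefetch
from shop.models import OrderItem  # adjust to wherever the item model lives

orders = Order.objects.prefetch_related(
    Prefetch('items', queryset=OrderItem.objects.select_related('product'))
)
for order in orders:
    for item in order.items.all():
        print(item.product.name)  # no extra query per item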
Adding Indexes to Speed Up Queries
Another issue I found was that certain queries were slow because they were scanning large tables instead of using indexes.
For example, filtering orders by customer ID was slow:
Order.objects.filter(customer_id=123)
After running EXPLAIN ANALYZE, I found that this query was using a sequential scan, which means it was checking every row in the table. The fix was simple:
class Order(models.Model):
    customer = models.ForeignKey(Customer, on_delete=models.CASCADE)

    class Meta:
        indexes = [
            models.Index(fields=['customer']),
        ]
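Like any other schema change, the index only exists in the database once it has been applied through a migration:
python manage.py makemigrations
python manage.py migrate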
After adding the index, the query time dropped from hundreds of milliseconds to just a few milliseconds.
Using Caching to Reduce Database Load
Some queries were being run repeatedly even though the data did not change often. For example, the homepage displayed the top five best-selling products, but every time a user visited the page, it queried the database again.
To fix this, I used Django’s caching framework with Redis.
First, I installed Redis and Django Redis:
sudo apt install redis
pip install django-redis
Then, I updated settings.py:
CACHES = {
"default": {
"BACKEND": "django_redis.cache.RedisCache",
"LOCATION": "redis://127.0.0.1:6379/1",
"OPTIONS": {
"CLIENT_CLASS": "django_redis.client.DefaultClient",
}
}
}
Now, I cached the expensive query:
from django.core.cache import cache
def get_top_products():
    cache_key = "top_products"
    products = cache.get(cache_key)
    if products is None:  # only hit the database on a cache miss
        products = list(Product.objects.order_by('-sales')[:5])
        cache.set(cache_key, products, timeout=600)  # Cache for 10 minutes
    return products
This change meant that instead of hitting the database on every request, the top products list was fetched from Redis, making the homepage much faster.
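A ten-minute timeout was good enough for my use case, but if stale data is a concern, the key can also be dropped whenever a product changes. A post_save signal is one way to do it; the import path for Product is an assumption, and the receiver has to live in a module Django imports at startup (for example, a signals module loaded from the app's AppConfig.ready()).
from django.core.cache import cache
from django.db.models.signals import post_save
from django.dispatch import receiver

from shop.models import Product  # adjust to wherever Product lives

@receiver(post_save, sender=Product)
def clear_top_products_cache(sender, **kwargs):
    # Invalidate the cached list so the next request rebuilds it
    cache.delete("top_products")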
Optimizing Static and Media Files
When I tested page load speed using Chrome DevTools, I saw that some CSS and JavaScript files were taking a long time to load. Initially, I served static files using Django, but this is inefficient for production.
I installed WhiteNoise to serve static files efficiently:
pip install whitenoise
Then, I updated settings.py:
MIDDLEWARE = [
"django.middleware.security.SecurityMiddleware",
"whitenoise.middleware.WhiteNoiseMiddleware",
]
STATICFILES_STORAGE = "whitenoise.storage.CompressedManifestStaticFilesStorage"
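The compressed and hashed copies are generated at deploy time, so collectstatic has to run as part of every release:
python manage.py collectstatic --noinput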
With WhiteNoise, static files were compressed and cached properly, making the front-end load much faster.
Handling Long-Running Tasks with Celery
Some operations, such as sending emails and generating reports, were blocking requests and making the user wait. I moved these tasks to Celery, which runs background jobs asynchronously.
First, I installed Celery:
pip install celery
Then, I configured Celery with Redis as the message broker:
CELERY_BROKER_URL = "redis://localhost:6379/0"
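Celery also needs an application instance that knows about Django's settings. The usual pattern from the Celery documentation is a celery.py module next to settings.py; myproject below is a placeholder for the actual project package:
# myproject/celery.py
import os
from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

app = Celery("myproject")
# Pick up every CELERY_* setting from settings.py
app.config_from_object("django.conf:settings", namespace="CELERY")
# Discover tasks.py modules in all installed apps
app.autodiscover_tasks()

# myproject/__init__.py
from .celery import app as celery_app

__all__ = ("celery_app",)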
I created a background task for sending emails:
from celery import shared_task
from django.core.mail import send_mail
@shared_task
def send_order_confirmation_email(user_email):
    send_mail(
        "Order Confirmation",
        "Your order has been placed successfully.",
        "orders@example.com",  # sender address placeholder
        [user_email],
    )
Now, instead of sending emails synchronously in views, I called the task asynchronously:
send_order_confirmation_email.delay(user.email)
This allowed the system to remain responsive while emails were sent in the background.
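None of this runs until a worker process is consuming the queue, so alongside the web server I also had to start one (again, myproject is a stand-in for the project package):
celery -A myproject worker --loglevel=info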
Final Results
After making these optimizations, my Django application became significantly faster.
- The number of database queries per request dropped from hundreds to just a few.
- API response times improved from several seconds to less than 200 milliseconds.
- The homepage loaded almost instantly thanks to caching.
- Long-running tasks like email sending no longer blocked the user experience.
Happy coding!