Memory efficient Django Queryset Iterator

Author: WoLpH
Posted: March 3, 2010
Language: Python
Version: 1.1
Tags: queryset iterator memory gc
Score: 10 (after 10 ratings)

While checking up on some cronjobs at YouTellMe we had problems with large cronjobs that took far too much memory. Since Django normally loads all objects into its memory when iterating over a queryset (even with .iterator(), although in that case it is not Django holding them in memory but your database client), I needed a solution that chunks the querysets so that only a small subset is kept in memory at a time.
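
To make the problem concrete, this is roughly the kind of loop that caused the trouble (a sketch; MyItem stands in for whatever model the cronjob touches):

# Plain iteration: the queryset caches every row it has yielded, so the
# whole table ends up in memory by the time the loop finishes.
for item in MyItem.objects.all():
    item.do_something()

# With .iterator() Django no longer caches the rows, but the database
# client may still buffer the entire result set on its side.
for item in MyItem.objects.all().iterator():
    item.do_something()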

Example of how to use it:

my_queryset = queryset_iterator(MyItem.objects.all())
for item in my_queryset:
    item.do_something()

More info on my blog

import gc

def queryset_iterator(queryset, chunksize=1000):
    '''
    Iterate over a Django Queryset ordered by the primary key

    This method loads a maximum of chunksize (default: 1000) rows in its
    memory at the same time while Django normally would load all rows in
    its memory. Using the iterator() method only prevents Django from
    caching the rows; your database client may still hold the full result
    set in memory.

    Note that the implementation of the iterator does not support ordered
    querysets.
    '''
    pk = 0
    # The highest primary key tells the loop when to stop.
    last_pk = queryset.order_by('-pk')[0].pk
    queryset = queryset.order_by('pk')
    while pk < last_pk:
        # Fetch the next chunk, starting just past the last pk we have seen.
        for row in queryset.filter(pk__gt=pk)[:chunksize]:
            pk = row.pk
            yield row
        # Release the objects from the finished chunk before fetching more.
        gc.collect()
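
As a rough sketch of how this might be wired into a cronjob, the management command below is hypothetical (the app, model, and do_something() method are placeholders), but it shows the intended call pattern:

# myapp/management/commands/process_items.py (hypothetical module path)
from django.core.management.base import BaseCommand

from myapp.models import MyItem                 # placeholder model
from myapp.utils import queryset_iterator       # wherever the snippet lives

class Command(BaseCommand):
    help = 'Process every MyItem row without keeping them all in memory.'

    def handle(self, *args, **options):
        # A smaller chunksize trades a few extra queries for a lower
        # memory ceiling; 500 is an arbitrary choice for this sketch.
        for item in queryset_iterator(MyItem.objects.all(), chunksize=500):
            item.do_something()                 # placeholder per-row work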

More like this

  1. Send large files through Django, and how to generate Zip files by jcrocholl 7 years, 8 months ago
  2. Batch querysets by jkocherhans 6 years, 5 months ago
  3. Queryset Foreach by kcarnold 6 years ago
  4. ModelChoiceField with optiongroups by anentropic 5 years ago
  5. Mini issue tracker by pbx 8 years, 1 month ago

Comments

guettli (on April 15, 2010):

Django does not load all rows into memory up front, but it caches the results while you iterate over them, so at the end you have everything in memory (unless you use .iterator()). For most cases this is no problem.

I had memory problems when looping over huge querysets. I solved them with this:

Check that connection.queries is empty; with settings.DEBUG == True Django stores every executed query there. (Or replace the list with a dummy object that does not store anything):

from django.db import connection
assert not connection.queries, 'settings.DEBUG=True?'

Use queryset.iterator() to disable the internal cache.

Use values_list() if you know you need only some values.
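
Put together, those three suggestions might look roughly like this (a sketch; MyItem and the per-row work are placeholders, not part of the snippet above):

from django.db import connection

# With settings.DEBUG = True Django appends every executed query to
# connection.queries, which grows without bound during a long loop.
assert not connection.queries, 'settings.DEBUG=True?'

# iterator() disables the queryset's internal result cache, so rows that
# have already been processed are not kept alive by the queryset.
for item in MyItem.objects.all().iterator():
    item.do_something()  # placeholder per-row work

# values_list() fetches only the named columns instead of full model
# instances, which keeps each row much smaller.
for pk in MyItem.objects.values_list('pk', flat=True).iterator():
    pass  # work with the bare primary key values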


barthed (on September 8, 2010):

Thanks. It saved my day!

The only issue I encountered is that the function throws an exception when the queryset is empty (no result).
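
One possible way to guard against that, as a sketch (this is not part of the original snippet), is to catch the IndexError from the last_pk lookup and simply stop:

import gc

def queryset_iterator(queryset, chunksize=1000):
    pk = 0
    try:
        # An empty queryset has no last row, so indexing raises IndexError.
        last_pk = queryset.order_by('-pk')[0].pk
    except IndexError:
        return  # nothing to iterate over
    queryset = queryset.order_by('pk')
    while pk < last_pk:
        for row in queryset.filter(pk__gt=pk)[:chunksize]:
            pk = row.pk
            yield row
        gc.collect()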


tomgruner (on November 17, 2010):

Thanks, this worked great for a table I have with 250,000 rows in it!

