Snippet List
You can use this cache backend to cache data in-process and avoid the overhead of pickling. Because values are stored by reference, make absolutely sure you never modify data you have stored in or retrieved from the cache; make deep copies first if necessary.
The backend is identical to Django's stock locmem cache (as of r15852, shortly after 1.3rc1) with the pickling removed. It has been tested against that specific Django revision, so it should be compatible with Django >= 1.3.
See [Django ticket #6124](http://code.djangoproject.com/ticket/6124) for some background information.
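Since the snippet is essentially the locmem backend with the pickle.dumps()/pickle.loads() calls taken out, here is a rough sketch of the same idea against a recent Django release (not r15852): subclass the stock LocMemCache and bypass pickling on the main read/write paths. The `NonPicklingLocMemCache` name is made up here, and the overridden methods rely on private internals (`_lock`, `_cache`, `_has_expired()`, `_set()`, `_delete()`) that have changed between Django versions, so treat this as illustrative rather than a drop-in copy of the snippet.

```python
# Sketch only: stores objects by reference instead of pickling them.
# Relies on private LocMemCache internals that differ across Django versions.
from django.core.cache.backends.base import DEFAULT_TIMEOUT
from django.core.cache.backends.locmem import LocMemCache


class NonPicklingLocMemCache(LocMemCache):
    """Like LocMemCache, but values are not pickled.

    Every cache hit returns the same object, so callers must treat
    cached values as read-only (or deep-copy them before mutating).
    """

    def set(self, key, value, timeout=DEFAULT_TIMEOUT, version=None):
        key = self.make_key(key, version=version)
        self.validate_key(key)
        with self._lock:
            # Store the object itself rather than pickle.dumps(value).
            self._set(key, value, timeout)

    def add(self, key, value, timeout=DEFAULT_TIMEOUT, version=None):
        key = self.make_key(key, version=version)
        self.validate_key(key)
        with self._lock:
            if self._has_expired(key):
                self._set(key, value, timeout)
                return True
            return False

    def get(self, key, default=None, version=None):
        key = self.make_key(key, version=version)
        self.validate_key(key)
        with self._lock:
            if self._has_expired(key):
                self._delete(key)
                return default
            # Return the stored object directly rather than pickle.loads(...).
            return self._cache[key]
```

It would be wired up like any other cache backend in settings.py (the dotted path below is hypothetical):

```python
CACHES = {
    "default": {
        "BACKEND": "myproject.cache.NonPicklingLocMemCache",  # hypothetical module path
        "LOCATION": "non-pickling",
    },
}
```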
- cache
- pickle
- backend
- locmem
- memory
- process
While checking up on some cron jobs at [YouTellMe](http://www.youtellme.nl/) we ran into jobs that used far too much memory. Django normally loads all objects of a queryset into memory when iterating over it (even with .iterator(), although in that case it is not Django holding everything in memory but your database client), so I needed a solution that chunks the queryset and keeps only a small subset in memory at a time.
Example of how to use it:
```python
my_queryset = queryset_iterator(MyItem.objects.all())
for item in my_queryset:
    item.do_something()
```
[More info on my blog](http://www.mellowmorning.com/2010/03/03/django-query-set-iterator-for-really-large-querysets/)
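The post above has the full implementation. As a rough sketch of the approach (assuming an integer primary key and no ordering requirement on the queryset): order by pk, fetch fixed-size slices, and call gc.collect() after each chunk so objects from the finished chunk are reclaimed promptly.

```python
import gc


def queryset_iterator(queryset, chunksize=1000):
    """Yield rows of `queryset` in chunks of `chunksize`, ordered by pk.

    Only one chunk is held in memory at a time. Assumes an integer
    primary key and overrides any ordering already set on the queryset.
    """
    if not queryset.exists():
        return
    pk = 0
    last_pk = queryset.order_by('-pk').values_list('pk', flat=True)[0]
    queryset = queryset.order_by('pk')
    while pk < last_pk:
        for row in queryset.filter(pk__gt=pk)[:chunksize]:
            pk = row.pk
            yield row
        # Collect reference cycles left over from the chunk just processed.
        gc.collect()
```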
- queryset
- iterator
- memory
- gc
2 snippets posted so far.