Overcome the bulk_create() size limitation using SQLite

Author:
alpar
Posted:
March 28, 2012
Language:
Python
Version:
Not specified
Score:
0 (after 0 ratings)

As of Django 1.4, the newly introduced bulk_create() method has a serious limitation on the SQLite backend: it raises an error if you try to create more than 999/F objects at once, where F is the number of fields in the model.

The safe_bulk_create(objs) function below solves this issue by splitting the list of objects into appropriately sized batches.

This solution also works fine with other database backends, and in my experiments it adds no significant overhead compared to calling bulk_create() directly.
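The batching arithmetic can be sketched without Django or a database. This is a hypothetical helper (the name and signature are mine, not part of the snippet) that shows how the 999-variable limit translates into a per-batch object count:

```python
# Minimal sketch of the splitting logic, using plain lists instead of
# Django model instances. split_into_bulks is a hypothetical name.
def split_into_bulks(objs, num_fields, max_vars=999):
    """Split objs into chunks small enough for SQLite's 999-variable limit."""
    bulk_size = max_vars // num_fields  # objects allowed per INSERT
    return [objs[i:i + bulk_size] for i in range(0, len(objs), bulk_size)]

# Example: a model with 4 fields allows 999 // 4 = 249 objects per batch,
# so 1000 objects are split into 5 batches.
batches = split_into_bulks(list(range(1000)), num_fields=4)
print(len(batches))     # 5
print(len(batches[0]))  # 249
```

The real safe_bulk_create() applies exactly this slicing, but passes each chunk to bulk_create() instead of collecting it into a list.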

For more details on the issue, see https://code.djangoproject.com/ticket/17788

Thanks to charettes for pointing out how to calculate the number of fields in an object.

def safe_bulk_create(objs):
    """Wrapper to overcome the size limitation of standard bulk_create()"""
    if objs:
        cls = objs[0].__class__
        # SQLite allows at most 999 query variables per statement; each object
        # consumes one variable per field, so stay safely below that limit.
        BULK_SIZE = 900 // len(cls._meta.fields)
        for i in range(0, len(objs), BULK_SIZE):
            cls.objects.bulk_create(objs[i:i + BULK_SIZE])

Comments

ML-Chen (on September 28, 2020):

Note that in Django 1.5, bulk_create has been modified so that on a SQLite backend, it automatically splits it into batches with no more than 999 fields each. So this safe_bulk_create function is only necessary in Django 1.4. In my experience with Django 3.1, using this function was considerably slower (~1.6×) than just using bulk_create.
