For a site I’ve been working on, I’ve scripted a full data load in a single Python script. Darned handy. The magic bit that does all the nice resetting is:
from django.core import management
from django.contrib.auth.create_superuser import createsuperuser
from django.db import connection

cursor = connection.cursor()
cursor.execute("select tablename from pg_tables where schemaname = 'public'")
tables = cursor.fetchall()
for table in tables:
    # fetchall() returns one-tuples, so grab the first element
    cursor.execute('drop table "%s" cascade' % table[0])
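That `table[0]` indexing matters: the DB-API's `fetchall()` hands back a sequence of row tuples, not bare strings, even when you selected a single column. A quick standalone sketch with `sqlite3` shows the shape (the table names here are made up for illustration):

```python
import sqlite3

# In-memory database with a couple of throwaway tables
conn = sqlite3.connect(":memory:")
conn.execute("create table geography (id integer)")
conn.execute("create table widgets (id integer)")

cursor = conn.cursor()
cursor.execute("select name from sqlite_master where type = 'table'")
rows = cursor.fetchall()

# Each row is a one-tuple like ('geography',), so index into it
names = [row[0] for row in rows]
print(names)
```

Forget the `[0]` and you end up formatting a tuple into your SQL string, which is exactly the kind of quiet breakage that only shows up at run time.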
And then I go on to run a bunch of other methods to do all that cool initial data loading goodness. Only we ran into a problem:
Traceback (most recent call last):
File "initial_data_load.py", line 641, in ?
File "initial_data_load.py", line 474, in create_geography_table
latitude = radians(float(lat)),)
File "/users/home/joseph/local/Django-0.95/django/db/models/manager.py", line 73, in create
File "/users/home/joseph/local/Django-0.95/django/db/models/query.py", line 223, in create
File "/users/home/joseph/local/Django-0.95/django/db/models/base.py", line 203, in save
File "/users/home/joseph/local/Django-0.95/django/db/backends/util.py", line 19, in execute
I couldn’t figure out what was happening for a while, since it was only happening on one system (and, annoyingly, not my laptop…). Turns out the detail is in that last line of the traceback. We had DEBUG=True set in settings.py, and that there Geography table — well, it’s a big’un. We blew out the memory for the Python process, because when DEBUG=True, Django’s database cursor wrapper does a really nice little thing: it keeps every SQL query you run in memory (in `connection.queries`). Which is great until you hit the per-process memory limit and your script terminates unexpectedly.
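To see the mechanism without a Django install handy, here’s a toy version of what that debug cursor wrapper does. This is my own simplification, not Django’s actual code (Django’s is a class called CursorDebugWrapper in db/backends/util.py, the file in the traceback above):

```python
class DebugCursor:
    """Toy stand-in for a cursor that logs every query while
    debug is on -- a simplification of Django's DEBUG=True behavior."""

    def __init__(self, debug=True):
        self.debug = debug
        self.queries = []  # grows without bound while debug is on

    def execute(self, sql, params=()):
        if self.debug:
            # Every statement is remembered: handy for inspection,
            # fatal for memory on a million-row load
            self.queries.append({"sql": sql, "params": params})
        # ... the real cursor work would happen here ...

cursor = DebugCursor(debug=True)
for i in range(1000):
    cursor.execute("insert into geography values (%s)", (i,))
print(len(cursor.queries))  # all 1000 statements are still in memory
```

Multiply those 1000 rows by a table the size of ours and the per-process memory limit stops being theoretical.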
Switching settings.py to DEBUG=False stopped those queries from piling up in memory, and the memory error went away. Yay for the wisdom of DEBUG=False!
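If you genuinely need DEBUG=True during a big load, another option is to empty the query log every so often. The log is just a list, so the version-agnostic trick is to clear it in place every N rows (the batch size and loop below are hypothetical; later Django versions also ship a `django.db.reset_queries()` helper that does the clearing for you):

```python
# Sketch: assumes the query log is a plain list you can empty in
# place, as Django's connection.queries is. BATCH is arbitrary.
BATCH = 10000

query_log = []  # stand-in for django.db.connection.queries

def load_row(i):
    # Stand-in for an ORM save; under DEBUG=True each save would
    # append one entry to connection.queries
    query_log.append("insert ... row %d" % i)

for i in range(25000):
    load_row(i)
    if i % BATCH == 0:
        del query_log[:]  # in Django: django.db.reset_queries()

print(len(query_log))
```

The log never holds more than one batch worth of queries, so memory stays bounded while you keep the debug goodies for everything else.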