At least in theory. It can be quite jarring to come across situations where this is not the case. And that is exactly what happened to me today.
Imagine a simple Django model:
>>> class Foo(models.Model):
...     date = models.DateField()
Now assign a somewhat unusual date to a model instance:
>>> f = Foo()
>>> f.date = datetime.date(1,2,3)
>>> f.date.year
1
>>> f.date.month
2
>>> f.date.day
3
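As a side note, the date itself is perfectly fine at the Python level: datetime.date supports years from 1 to 9999, and its isoformat() method zero-pads the year to four digits, so the value is unambiguous before it ever reaches the database. A quick check (plain Python, no Django needed):

```python
import datetime

# datetime.date supports years from MINYEAR (1) to MAXYEAR (9999).
d = datetime.date(1, 2, 3)

# isoformat() zero-pads the year to four digits, so the string
# representation is unambiguous.
print(d.isoformat())                    # 0001-02-03
print(datetime.MINYEAR, datetime.MAXYEAR)  # 1 9999
```

So whatever ends up unpadded in the database is produced further down the stack, not by datetime itself.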
You can see that we are talking about a date a long time ago. It is nevertheless a valid date, as we can confirm by querying the individual elements. Now save the instance:
>>> f.save()
If you are using sqlite and run
select * from foo;
in your database shell, you will see the following as the output:
1-02-03
If you load a Django object from that table, you will find everything in order and as expected. However, now try the same with PostgreSQL as the database. After doing the very same thing in Python, take a look at what's written in the database table:
2003-01-02
Oops! Now that shouldn't happen!
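A likely explanation is that the date reaches PostgreSQL as the unpadded string "1-02-03", which its datestyle heuristics then read as month-day-year rather than as year 1. If you are building the value yourself, one way to sidestep the ambiguity is to format the year zero-padded to four digits; pg_safe_date below is a hypothetical helper sketching that idea, not part of the Django API:

```python
import datetime

def pg_safe_date(d):
    # Hypothetical helper: render a date with a four-digit, zero-padded
    # year so an ISO-datestyle parser reads it unambiguously as YYYY-MM-DD.
    return "%04d-%02d-%02d" % (d.year, d.month, d.day)

print(pg_safe_date(datetime.date(1, 2, 3)))   # 0001-02-03
print(pg_safe_date(datetime.date(99, 2, 3)))  # 0099-02-03
```

This is just the string-level workaround; the real fix would have to live wherever the unpadded string is generated.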
The interesting thing is that as soon as you specify a year that has at least 3 digits, PostgreSQL stores the correct date. Do
f.date = datetime.date(101,2,3)
and after you save the model, the database will contain what we expect to see:
0101-02-03
Also interesting: if you specify a two-digit year greater than 12, you get an exception. For example,
f.date = datetime.date(99,2,3)
raises:
Traceback (most recent call last):
  .....
DataError: date/time field value out of range: " 99-02-03"
HINT: Perhaps you need a different "datestyle" setting.
I checked the DATESTYLE setting for my database, and indeed, it was ISO, which defines a date as YYYY-MM-DD. In theory, therefore, this should work. But obviously, something is not quite lining up. To summarize, while it all works fine with sqlite, in PostgreSQL I get:
- A wrong date stored for years between 1 and 12
- An exception for years between 13 and 99
- Correct behaviour for years 100 or up
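These three ranges line up with what an unpadded year looks like on the wire. Here is a small sketch (pure Python, only illustrating the string-level ambiguity, not PostgreSQL's actual parser): with one or two leading digits the first field could plausibly be a month or day, while from three digits on, only a year makes sense.

```python
import datetime

for year in (1, 99, 100, 1999):
    d = datetime.date(year, 2, 3)
    # Unpadded rendering, as it apparently reaches the database,
    # next to the unambiguous zero-padded ISO form.
    unpadded = "%d-%02d-%02d" % (d.year, d.month, d.day)
    print("%-10s vs %s" % (unpadded, d.isoformat()))
```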