Adam Thurgar

Choose the right datatype


Recently I read about the outage at Basecamp 3 that occurred when they ran out of integers.

https://m.signalvnoise.com/update-on-basecamp-3-being-stuck-in-read-only-as-of-nov-8-9-22am-cst-c41df1a58352

This reminded me of a time when I was working on a very busy, fast-growing web site.

The initial database design used an IDENTITY(1,1) column with an int datatype.
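A minimal sketch of that kind of design (the table and column names here are hypothetical, not the actual schema):

-- Hypothetical table with an int identity column seeded at 1, incrementing by 1
CREATE TABLE dbo.Orders
(
    OrderId      int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    OrderDate    datetime2         NOT NULL,
    CustomerName nvarchar(100)     NOT NULL
);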

We ran out of integers.

There were two choices: reseed the identity column and start using a negative increment (-1),

OR

change the datatype to a bigint.

The first choice seemed a bit quick and dirty to me. The better architectural choice was to take the outage and change the datatype. After some quick testing, that change was made.
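Both options are sketched below against the same hypothetical table; the exact steps on the real system may well have differed (in particular, widening a clustered identity key normally means dropping and recreating constraints around the change, which is why it needed an outage):

-- Option 1 (a common variant of the negative-range trick): reseed the identity
-- to the bottom of the int range and keep incrementing by +1.
-- Changing the increment itself to -1 would require recreating the column as
-- IDENTITY(-1,-1), because the increment is fixed when the column is defined.
DBCC CHECKIDENT ('dbo.Orders', RESEED, -2147483648);

-- Option 2: widen the column to bigint. On a real table the primary key and any
-- referencing foreign keys usually have to be dropped and recreated around the
-- ALTER, hence the planned outage.
ALTER TABLE dbo.Orders
    ALTER COLUMN OrderId bigint NOT NULL;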

Here are the different int datatypes.

Datatype   Range                                                                       Storage
bigint     -2^63 (-9,223,372,036,854,775,808) to 2^63-1 (9,223,372,036,854,775,807)    8 bytes
int        -2^31 (-2,147,483,648) to 2^31-1 (2,147,483,647)                            4 bytes
smallint   -2^15 (-32,768) to 2^15-1 (32,767)                                          2 bytes
tinyint    0 to 255                                                                    1 byte

The moral of the story is to choose the right datatype and be aware of the integer ranges and the space they require. Sometimes it is better to waste a little space and never have to worry about running out.
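As a rough illustration (not something from the original incident), a query along these lines can be run periodically to see how far each identity column has progressed towards its type's maximum; it assumes plain tinyint/smallint/int/bigint identity columns:

-- Report the last value used by each identity column alongside its type's maximum
SELECT
    OBJECT_SCHEMA_NAME(ic.object_id) AS schema_name,
    OBJECT_NAME(ic.object_id)        AS table_name,
    ic.name                          AS column_name,
    t.name                           AS data_type,
    CONVERT(bigint, ic.last_value)   AS last_value,
    CASE t.name
        WHEN 'tinyint'  THEN 255
        WHEN 'smallint' THEN 32767
        WHEN 'int'      THEN 2147483647
        WHEN 'bigint'   THEN 9223372036854775807
    END                              AS max_value
FROM sys.identity_columns AS ic
JOIN sys.types AS t
    ON ic.user_type_id = t.user_type_id
WHERE t.name IN ('tinyint', 'smallint', 'int', 'bigint')
ORDER BY CONVERT(bigint, ic.last_value) DESC;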

