How best to deal with MySQL implicit-conversion performance hits between table columns of different data types (i.e. where varchar = bigint)

Posted 4 months ago by lambooni

Most of my tables with a user-visible ID use a bigint ID, e.g. the orders table has an ID that is also used as the customer-facing order number.

All of my other tables use a UUID primary key.

I have a few other tables that store information relating to all the others, e.g. a stock_movements table holds information on how stock has moved for a delivery, an order, a delivery exception (and so on), so the related ID column can hold both bigints and UUIDs.

I have therefore set this column to a varchar, but had not realised that searching a varchar column with a bigint value carries a huge MySQL performance hit. On a simple count query, just manually adding single quotes around the integer in the query took it from 400ms down to 20ms.


// SLOW - 400ms

SELECT COUNT(id) FROM stock_movements WHERE entity_type = 'order' AND entity_id = 123456

// FAST - 20ms

SELECT COUNT(id) FROM stock_movements WHERE entity_type = 'order' AND entity_id = '123456'

(entity_id being a varchar in the database)
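The reason for the gap: when one side of the comparison is an integer, MySQL converts both sides to numbers, so it must cast every row's entity_id and cannot use an index on the varchar column; with a quoted literal it is a plain string comparison and the index applies. You can confirm this with EXPLAIN. A sketch, assuming a composite index on (entity_type, entity_id) exists (the index itself is my assumption, not from the question):

```sql
-- Integer literal: entity_id must be cast to a number row by row,
-- so the index should not be used (expect type: ALL, or a full index scan).
EXPLAIN SELECT COUNT(id) FROM stock_movements
WHERE entity_type = 'order' AND entity_id = 123456;

-- String literal: straight varchar = varchar comparison,
-- so the index should be usable (expect type: ref).
EXPLAIN SELECT COUNT(id) FROM stock_movements
WHERE entity_type = 'order' AND entity_id = '123456';
```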

What is the best way to deal with this? Options:

  • Start again and keep all IDs consistent (probably best, but not practical at this stage).
  • Cast all IDs to strings on the model. I am not sure if this has any negative impact elsewhere.
  • Cast the IDs at query time. I think doing this with the query builder or Laravel relationships may be tricky.
  • ?
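For what options 2 and 3 might look like, here is a minimal Eloquent sketch. It assumes a StockMovement model and an $orderId variable; note that Laravel's built-in 'string' cast affects attribute reads/writes on the model, so ad-hoc where() clauses may still need an explicit cast as shown:

```php
<?php

// Option 2 (sketch): cast entity_id to a string on the model, using
// Laravel's built-in 'string' attribute cast. Model and column names
// are taken from the question; everything else is illustrative.
class StockMovement extends \Illuminate\Database\Eloquent\Model
{
    protected $casts = [
        'entity_id' => 'string',
    ];
}

// Option 3 (sketch): cast at query time so the binding is sent as a
// string and the comparison stays varchar = varchar on the MySQL side.
$count = StockMovement::query()
    ->where('entity_type', 'order')
    ->where('entity_id', (string) $orderId)
    ->count();
```

Whichever you pick, the goal is the same: make sure the value bound into the query is a string before it reaches MySQL, so the implicit numeric conversion never happens.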
