JSON sucks. Don't get me wrong, I love the simplicity of it: it's easy, it's portable, it's ubiquitous at this point. None of that means it doesn't suck. Even outside of Javascript (hence the portability), JSON limits you to the native types in its grammar:

  • null
  • object (like a hash)
  • array
  • string
  • integer (signed 32-bit)
  • number (double)
  • boolean

Really? It's 2010 and we're all flocking to a grammar where we can only accurately represent integers up to 2^31-1. WTF? Any new standard we adopted should certainly have had grammar specifications for: 1-, 8-, 16-, 32-, 64-, and 128-bit signed and unsigned integers, 32-bit and 64-bit IEEE floating point (float and double), as well as arbitrary-precision real numbers. I'd much rather have a grammar that I struggle to translate into my native language's data types (like, "hmmm, what am I supposed to do with a real in C?") than one in which I cannot precisely express myself.
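Here's a quick C sketch of the problem, under the usual assumption that a parser backs every JSON number with an IEEE 754 double; run it yourself if you don't believe the rounding:

    #include <inttypes.h>
    #include <stdio.h>

    int main(void) {
        /* The 64-bit value I actually want to ship. */
        int64_t original = 4611686018427387900LL;

        /* What a typical JSON parser hands back: the number coerced
         * through a double, which only has 53 bits of mantissa. */
        double as_json_number = (double)original;
        int64_t round_tripped = (int64_t)as_json_number;

        printf("original:      %" PRId64 "\n", original);
        printf("round-tripped: %" PRId64 "\n", round_tripped);
        /* Round-to-nearest lands on 4611686018427387904 -- off by 4. */
        return 0;
    }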

Every time I need to (correctly) represent a large integer such as 4611686018427387900, I'm forced to do so in a string. It causes me to throw up in my mouth a little. Everyone seems dead set on this, so I suppose I'll learn to cherish the flavor of bile.
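For the record, here's what swallowing that bile looks like in C, a minimal sketch with a made-up "big_id" field; the digits survive because they never touch a double:

    #include <inttypes.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        int64_t original = 4611686018427387900LL;

        /* Producer: quote the value so no parser runs it through a double.
         * The field name "big_id" is invented for this example. */
        char json[64];
        snprintf(json, sizeof(json), "{\"big_id\": \"%" PRId64 "\"}", original);
        printf("payload:   %s\n", json);

        /* Consumer: a real program would pull the quoted digits out with its
         * JSON library of choice; here we just convert them directly. */
        const char *digits = "4611686018427387900";
        int64_t recovered = strtoll(digits, NULL, 10);
        printf("recovered: %" PRId64 "\n", recovered);   /* lossless */
        return 0;
    }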