Why JSON sucks.
- object (like a hash)
- integer (signed 32-bit)
- number (double)
Really? It's 2010 and we're all flocking to a grammar where we can only accurately represent integers up to 2³¹-1. WTF? Any new standard we adopted should certainly have had grammar specifications for 1-, 8-, 16-, 32-, 64-, and 128-bit signed and unsigned integers, 32-bit and 64-bit IEEE floating point (float and double), as well as arbitrary-precision real numbers. I'd much rather have a grammar that I struggle to translate into my native language's data types (like, "hmmm, what am I supposed to do with a real in C?") than one in which I cannot precisely express myself.
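To make that ceiling concrete, here's a minimal C sketch (hypothetical, not taken from any particular library) of what a binding that maps JSON integers to a plain int does with the first value past 2³¹-1:

```c
#include <stdio.h>

int main(void) {
    /* 2147483648 is 2^31, one past INT_MAX. */
    long long on_the_wire = 2147483648LL;

    /* A binding that stores JSON integers in a signed 32-bit int has
       nowhere to put it; on the usual two's-complement platforms the
       value wraps around. (Strictly speaking, the conversion is
       implementation-defined in C.) */
    int stored = (int)on_the_wire;
    printf("%d\n", stored);  /* prints -2147483648 */
    return 0;
}
```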
Every time I need to (correctly) represent a large integer such as 4611686018427387900, I'm forced to do so in a string. It causes me to throw up in my mouth a little. Everyone seems dead set on this, so I suppose I'll learn to cherish the flavor of bile.
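For anyone who wants a taste themselves, here's a minimal C sketch of both the problem and the string workaround, assuming (like most parsers) that JSON numbers land in a double; strtod stands in for the parser's number routine, and the value is the one from above:

```c
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    const char *digits = "4611686018427387900";

    /* Doubles hold integers exactly only up to 2^53, so a parser that
       stores this number in a double silently rounds it to the nearest
       representable value. */
    double lossy = strtod(digits, NULL);
    printf("%.0f\n", lossy);      /* 4611686018427387904 -- off by 4 */

    /* The string workaround: ship the digits as a JSON string, then
       re-parse them with an integer routine on the receiving end. */
    long long exact = strtoll(digits, NULL, 10);
    printf("%lld\n", exact);      /* 4611686018427387900 -- exact */
    return 0;
}
```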