Number

The number type is inhabited by real numeric values. It is implemented using double-precision (IEEE 754) floating-point numbers, which carry 53 bits of significand precision. This means it can exactly represent every integer between $$-2^{53}$$ and $$2^{53}$$; beyond that range, not all integers are representable, and arithmetic operations will silently lose precision.
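
As a minimal sketch of this boundary (shown here in TypeScript, assuming the number type follows standard IEEE 754 double-precision semantics):

```typescript
// Precision loss at the 2^53 boundary, assuming IEEE 754 doubles.
const limit = 2 ** 53; // 9007199254740992, the last contiguous integer boundary

console.log(limit - 1);             // 9007199254740991 — still exactly representable
console.log(limit + 1 === limit);   // true: 2^53 + 1 rounds back down to 2^53
console.log(limit + 2);             // 9007199254740994 — odd integers are now skipped
console.log(-limit - 1 === -limit); // true: the same loss occurs on the negative side
```

Above $$2^{53}$$, the gap between adjacent representable values grows to 2 (then 4, 8, and so on), which is why `limit + 1` cannot be distinguished from `limit`.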