and there you have an unanswerable question! :-)
At an atomic level in computing, as others have said, data is a stream of ones and zeroes.
Everything beyond that is implementation: how you interpret, use, and store those bits.
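To make that concrete, here's a small Python sketch (byte values chosen purely for illustration) showing the exact same four bytes yielding completely different "data" depending on how you choose to interpret them:

```python
import struct

# One stream of bits, written here as four bytes.
raw = b"\x42\x28\x00\x00"

# Interpreted as a big-endian 32-bit unsigned integer:
as_int = struct.unpack(">I", raw)[0]    # 1109917696

# Interpreted as a big-endian IEEE 754 32-bit float:
as_float = struct.unpack(">f", raw)[0]  # 42.0

# The first two bytes interpreted as ASCII text:
as_text = raw[:2].decode("ascii")       # "B("

print(as_int, as_float, as_text)
```

Same ones and zeroes every time; the "meaning" lives entirely in the interpretation you apply.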
In fact, one of the bugbears of computing is that there are so many "data format standards" (using the term loosely, not definitionally)..
or the age old joke:
"The good thing about (data) standards is that there are so many of them!"