Or why not actually come up with a standard for what a byte is so there aren't so many discrepancies between each machine?
Ever looked up the definition of a byte?
There is no universally followed memory standard, so if there is a discrepancy of 768 bytes at 32K because someone decided to round, how big does that discrepancy become when you get to a yottabyte?
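The gap compounds with every prefix step, since binary units multiply by 1024 where decimal units multiply by 1000. A minimal Python sketch (illustrative only; prefix names follow the usual SI ladder) of how that rounding discrepancy grows:

```python
# Compare the binary (2^10 per step) and decimal (10^3 per step)
# interpretations of each size prefix, from kilo up to yotta.
prefixes = ["kilo", "mega", "giga", "tera", "peta", "exa", "zetta", "yotta"]

# The 32K case from above: 32 * 1024 - 32 * 1000 = 768 bytes.
print("32K gap:", 32 * 1024 - 32 * 1000, "bytes")

for i, name in enumerate(prefixes, start=1):
    decimal = 1000 ** i   # decimal (SI) interpretation
    binary = 1024 ** i    # binary interpretation
    gap_pct = (binary - decimal) / decimal * 100
    print(f"{name:>5}byte: binary is {gap_pct:.1f}% larger than decimal")
```

At the kilobyte level the two readings differ by only 2.4%, but by the yottabyte level a binary yottabyte is about 20.9% larger than a decimal one, so the absolute discrepancy runs into the hundreds of zettabytes.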