Gibibyte Versus Gigabyte

When you buy a hard drive, you usually take note of the capacity, and depending on whether you’re a metric kind of person or a computer tech, you expect the number of bytes on the drive to match that figure.

The metric type of person would say that 320 Gigabytes is 320,000,000,000 bytes, while the computer tech would say that 320 Gigabytes is 320 x 2^30 bytes (exactly 343,597,383,680 bytes).  The confusion is further compounded by the way operating systems report the available space on your hard drive.  Note that most hard drive manufacturers nowadays use the metric “definition” of Gigabyte (i.e. 320 GB is 320,000,000,000 bytes).
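To make the gap concrete, here’s a quick sketch in Python comparing the two readings of “320 GB” (the constant names are my own, for illustration):

```python
# The two readings of "320 GB": metric (powers of 10) vs. binary (powers of 2).

METRIC_GB = 10**9      # 1 Gigabyte as hard drive manufacturers count it
BINARY_GB = 2**30      # 1 "Gigabyte" in the traditional power-of-two sense

capacity_metric = 320 * METRIC_GB   # 320,000,000,000 bytes
capacity_binary = 320 * BINARY_GB   # 343,597,383,680 bytes

print(f"Metric reading: {capacity_metric:,} bytes")
print(f"Binary reading: {capacity_binary:,} bytes")
print(f"Difference:     {capacity_binary - capacity_metric:,} bytes")
```

The difference works out to more than 23 billion bytes, which is why the number your operating system reports can look so far off from the number on the box.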

Around 2008, however, a standard known as IEEE 1541-2002 was reaffirmed, and it is hoped to remedy the situation by defining the Gigabyte in terms of the metric system and the Gibibyte (“giga binary byte”) in terms of powers of 1024 (that is, powers of 2^10).  At last, we can differentiate between the two old views of Gigabyte:  1 Gigabyte (1000^3 bytes) is NOT equal to 1 Gibibyte (2^30, or 1024^3, bytes)!
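Under these definitions, a single byte count maps cleanly to both units. A minimal sketch (the function name is mine):

```python
# Convert a raw byte count into both units as the standard defines them:
# 1 Gigabyte (GB) = 1000^3 bytes, 1 Gibibyte (GiB) = 1024^3 bytes.

def report_capacity(num_bytes: int) -> str:
    gigabytes = num_bytes / 1000**3   # metric Gigabytes (GB)
    gibibytes = num_bytes / 1024**3   # Gibibytes (GiB)
    return f"{num_bytes:,} bytes = {gigabytes:.2f} GB = {gibibytes:.2f} GiB"

# The "320 GB" drive from above: exactly 320 GB, but only about 298 GiB.
print(report_capacity(320_000_000_000))
# 320,000,000,000 bytes = 320.00 GB = 298.02 GiB
```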

So far, Snow Leopard properly reports drive sizes according to this standard.  I’m not sure about Windows 7; perhaps it will follow the standard as well.

This standard should make its way into current and future computer books soon.