
When 1 GB Is Really 0.9313 Gigabytes

An anonymous reader writes "When it comes to RAM, as every geek knows, 1 GB does not mean 1 billion bytes; it means 2**30 (1,073,741,824) bytes. However, several decades ago 'they' decided that GB, MB, and KB would be interpreted differently when it comes to disk drives: 1 GB means exactly 1 billion bytes. Ed Bott points out that Microsoft's marketers and Windows kernel developers aren't on the same page when it comes to these units: the marketers use the more generous decimal interpretation, while Windows measures and reports capacity using the binary (2**30) measure. Careful customers who bother to check what they've got have been known to get peeved by the discrepancy."
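
For the arithmetic behind the headline number, here is a minimal, purely illustrative Python sketch:

    # A drive sold as "1 GB" (decimal) holds 10**9 bytes;
    # Windows divides by 2**30 to report capacity in binary gigabytes.
    advertised_bytes = 10**9
    print(round(advertised_bytes / 2**30, 4))  # 0.9313
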
  • by liamevo ( 1358257 ) on Monday February 11, 2013 @06:53AM (#42857061)

    Article is a forum post from 2008 talking about things we knew before then.

    Why was this posted?

  • Even worse! (Score:5, Insightful)

    by Anonymous Coward on Monday February 11, 2013 @06:56AM (#42857075)

    To make it even worse, the first comment in that forum states:

    > this is common knowledge for most ppl here.

    timothy should get fired. He's failing to do his job in a grossly incompetent, outright insulting way.

  • by mrbluze ( 1034940 ) on Monday February 11, 2013 @06:56AM (#42857077) Journal

    Article is a forum post from 2008 talking about things we knew before then.

    Why was this posted?

    Extra slow news day?

  • by DiSKiLLeR ( 17651 ) on Monday February 11, 2013 @06:58AM (#42857089) Homepage Journal

    Yeah yeah, this is old news, but nevertheless, this is one of the only things in the IT industry that really peeves me.

    It's not just Windows; Linux and every other OS also use base 2 notation for KB, MB, GB, TB, etc.

    Why can't we just call out hard drive manufacturers for what they're really doing (ripping us off) and force them to use base 2 notation :/

  • by StoneyMahoney ( 1488261 ) on Monday February 11, 2013 @07:02AM (#42857107)

    If the computer industry can't adapt to counting the way the rest of the world does, that's our problem. We should be pointing at whoever originally decided to usurp the already established term kilo to mean 1024 and slapping them upside the head. Anything less is pure arrogance on our part.

  • by dosius ( 230542 ) <bridget@buric.co> on Monday February 11, 2013 @07:07AM (#42857153) Journal

    Likewise I say "true GB" for 1024-based and "salesman's GB" for 1000-based. Because the 1024-based units ARE the true units, and the 1000-based units WERE created just to make hard drives look bigger than they actually were.

    -uso.

  • by mwvdlee ( 775178 ) on Monday February 11, 2013 @07:08AM (#42857155) Homepage

    Because the rest of the world refuses to use the obvious and standardized solution of using Ki and Mi.

  • Not this again. (Score:5, Insightful)

    by serviscope_minor ( 664417 ) on Monday February 11, 2013 @07:09AM (#42857165) Journal

    There is no grand conspiracy of evil marketing people versus the noble world of computer people.

    1G = 10^9 in every area.

    1Gbit/s = 1e9 bits per second (no one complains)
    1GHz = 1e9 cycles per second (no one complains)
    1GT/s = 1e9 transfers per second (no one complains)
    1GB = 1e9 bytes (oh the horror! the evil marketing! oh woe woe woe)

    The only reason that 1GB = 1GiB ever caught on is that RAM genuinely relies on a power-of-2 address bus, so its sizes are always very closely tied to powers of 2, and it's convenient to round those to the nearest decimal label in order to talk about them succinctly.

    There was never any reason to do it for anything else, and hard disk manufacturers pretty much never used GiB when they meant GB.

    And even the venerable 3.5" floppy was an unholy mixture of KB and KiB multiplied together: its "1.44 MB" is 1440 × 1024 bytes.

  • by Anonymous Coward on Monday February 11, 2013 @07:20AM (#42857235)

    The fault lies 100% at the feet of marketers, who are typically totally technically incompetent.

    In the long-ago good old days, hard drives used to be measured in base 2 sizes. Back in the days of 20 Meg and 40 Meg and 80 Meg drives, they were measured using base 2, so buying an 80 Meg HD got you 80 "computer" Megs. This was also back in the days of 10+ different HD makers (lots of competition).

    Then at some point an idiot marketer, looking for any edge to make his/her company's product look different from the competition, discovered that if instead of dividing the count of bytes by 1024*1024 they divided by 1000*1000, the result was a larger number. I.e., a 200 Meg hard drive could now be advertised as 209 Meg. Since 209 is larger than 200, they felt this gave them a "one-up" on the other guys. And once the first idiot marketer did this, the rest soon followed suit, because they could not have their own products looking "smaller" on the shelf. The result is that HDs are now the only computer component advertised in base 10 sizes.
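
    (The arithmetic checks out; a quick illustrative Python sketch:)

        # 200 binary megabytes re-expressed as decimal megabytes
        bytes_total = 200 * 1024 * 1024        # 209,715,200 bytes
        print(bytes_total / (1000 * 1000))     # 209.7152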

    The idiot marketers are also why, when you go to buy a hard disk that is only about 15 cubic inches by itself, you find the box to be about 5 cubic feet on the store shelf. Not all of that 5 cubic feet is "padding": 99% of it is to make the box look larger on the shelf.

  • by serviscope_minor ( 664417 ) on Monday February 11, 2013 @07:22AM (#42857247) Journal

    1000-based units WERE created just to make hard drives look bigger than they actually were.

    Invoke Poe's law.

    I honestly can't tell.

  • by Anonymous Coward on Monday February 11, 2013 @07:26AM (#42857269)

    Back when I first studied c.eng, it was drummed into us that base 2 units were ONLY to be used for references to perfectly binarily addressable devices. RAM as we have it today, with word lengths that are also powers of two, is one; CPU registers are another. Some displays at the time were as well, though no longer, and the origin there was RAM-based.

    Everything else, such as file sizes, card, tape or disk storage, network bandwidth, logic frequency and the like, was strictly base 10.

    Then small systems crept in and base 2 assumptions began to spread. The 1980s brought hard drives marketed with base 2 units. In the 1990s people started to believe a 10MHz CPU was 1024*1024*10 Hz.

    Now, this century, it's not uncommon to find self-professed geeks calculating, say, theoretical throughputs based on the idea that their gig-ethernet is 1073741824 bits per second, or that their CPU/RAM speeds use similar numbers in GHz.

    It amuses and saddens me to see newbie geeks calling base 10 hard drive sizes "marketing units" when they simply haven't been taught correctly.

  • by Canazza ( 1428553 ) on Monday February 11, 2013 @07:39AM (#42857325)

    and by the same standards, 2^10 bytes is a KiB [wikipedia.org]

    and yes, why is this geek news when anyone with a passing interest, or who has ever done a wiki crawl, will know this?

  • by gravis777 ( 123605 ) on Monday February 11, 2013 @07:44AM (#42857345)

    My thoughts exactly. This is an article appropriate for The Today Show or something where you are informing the illiterate masses, not something worthy of posting on Slashdot.

    BTW, this reminds me - a couple of weeks ago on the Today show, they were talking about cool new computer terms. One they mentioned was "animated GIFs". I felt like I had jumped into a time machine and gone back 20 years.

  • Re:Terabytes (Score:5, Insightful)

    by neyla ( 2455118 ) on Monday February 11, 2013 @07:44AM (#42857353)

    It does indeed get worse and worse with increasing size of the units.

    The difference between 1 KB in base 10 and base 2 is 2.4%

    The difference between 1 MB in base 10 and base 2 is 4.9%

    The difference between 1 GB in base 10 and base 2 is 7.4%

    The difference between 1 TB in base 10 and base 2 is 10%

    The difference between 1 PB in base 10 and base 2 is 13%

    The difference between 1 EB in base 10 and base 2 is 15%

    2.4% difference isn't a huge deal, but 15% difference is much more noticeable.
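
    (These figures are easy to verify; a short Python sketch, added for illustration:)

        # Percentage by which each binary prefix exceeds its decimal one
        for n, p in enumerate("KMGTPE", start=1):
            binary, decimal = 2 ** (10 * n), 10 ** (3 * n)
            print(f"{p}B: {(binary / decimal - 1) * 100:.1f}%")
        # KB: 2.4%  MB: 4.9%  GB: 7.4%  TB: 10.0%  PB: 12.6%  EB: 15.3%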

  • Re:GiB (Score:3, Insightful)

    by Psychotria ( 953670 ) on Monday February 11, 2013 @07:48AM (#42857379)

    Sorry, but that's rubbish. GiB was "invented" to justify the incorrect marketing. 1GB has always been, and forever will be, 2^30.

  • Re:GiB (Score:2, Insightful)

    by Psychotria ( 953670 ) on Monday February 11, 2013 @07:52AM (#42857393)

    The United States of America can't convert to metric and SI units, so it's not reasonable to expect they could convert to any standard. It is a country full of dumb arses (or asses, because they also cannot spell).

  • by isorox ( 205688 ) on Monday February 11, 2013 @07:53AM (#42857405) Homepage Journal

    My thoughts exactly. This is an article appropriate for The Today Show or something where you are informing the illiterate masses, not something worthy of posting on Slashdot.

    BTW, this reminds me - a couple of weeks ago on the Today show, they were talking about cool new computer terms. One they mentioned was "animated GIFs". I felt like I had jumped into a time machine and gone back 20 years.

    Slashdot is full of illiterate masses now

  • by equex ( 747231 ) on Monday February 11, 2013 @07:57AM (#42857419) Homepage
    SI is irrelevant in this case, because it obviously does not match the physical reality of a RAM chip. Some things are just not a multiple of 10; get over it.
  • by Twinbee ( 767046 ) on Monday February 11, 2013 @07:58AM (#42857427)
    Not all of us like kilo to mean 1024. I don't. However, there's a good argument for getting the world to switch to base 8 or 16 for the basic number system. That would be trickier to achieve, but we would all be happier in the end, and everything would be consistent (I do like base 12, however, sigh...).
  • by Anonymous Coward on Monday February 11, 2013 @08:01AM (#42857443)

    A terminology they just up and made up later. I have never heard anyone actually use it.

    Ask yourself: when was the last time you heard someone refer to mebibytes and gibibytes? Everyone uses metric prefixes.

  • by Twinbee ( 767046 ) on Monday February 11, 2013 @08:14AM (#42857515)
    It's not the news in itself - it's the discussion which comes from it. I'm firmly in the camp that we should swallow our collective (ahem) 'pride' and realize that it's actually a good thing to standardize and be consistent with the rest of the scientific world in saying that yes, 1kB = 1000 bytes.

    Failing a switch to a base-16 number system (which I think is an admirable goal for humanity, or maybe base 12), that's how it should stay.
  • by ByOhTek ( 1181381 ) on Monday February 11, 2013 @08:32AM (#42857597) Journal

    I'd pretty much agree with the "we should use base 2, computers are base 2" line.

    However, the article makes a bit of an overstatement. This is not a kernel dweeb vs. marketing dweeb issue. This is a software developer vs. hardware developer issue.

    Software developer: Base 2 is easier to work with. We use base 2 (or more precisely, the base 2^10 derivative).
    Hardware developer: If we use base 10 (or more precisely, the base 10^3 derivative), our drives appear larger.

    The point of the system is that it is easy to calculate and work with. While base 10 is easier for humans, base 2 allows some efficiency shortcuts (e.g., using shifts instead of multiplies/divides). Since the vast majority of the time we see the data at a twice-abstracted level (simplified and abbreviated, through CLI or GUI applications), and the exceptions are almost always still slightly abstracted (through code, using hex/octal/etc. rather than the native bits and bytes), what is easier for the humans (who rarely deal with it directly) is less important than what is more efficient for the computer.
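
    (The shortcut mentioned above is plain bit-shifting; a minimal Python illustration:)

        size_bytes = 5_368_709_120        # 5 GiB, as an example

        # Power-of-two units reduce to shifts and masks...
        kib = size_bytes >> 10            # same as // 1024
        rem = size_bytes & 0x3FF          # same as % 1024

        # ...while decimal units need a real division.
        kb = size_bytes // 1000

        print(kib, rem, kb)               # 5242880 0 5368709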

  • by peppepz ( 1311345 ) on Monday February 11, 2013 @08:34AM (#42857609)

    Please remind me: how many bits are there in an SI byte? Is it 10, 100 or 1000?

    There is no "byte" in the SI. The question is therefore irrelevant. There's an IEC standard containing prefixes for 2^10, 2^20, 2^30 etc., and those prefixes are kibi-, mebi-, gibi- and so on. The SI officially references them, even if they're not strictly part of it.

    If your byte contains 8 bits, you are either using the binary sizes, or you are mixing things to fool the customer.

    What's the relationship between the number of bits in a byte being 8 and 2 being the base for the multiples of the byte? Moreover, deciding that "a byte" is *the* unit of the smallest addressable memory cell of machines is an oversimplification, because there were in the past, and there might be in the future, machines having a word size which is not even a power of two. If anything, one might think that using powers of two to "size" memory comes from the fact that the widths of the ranges addressable by a bus made of binary wires are by nature powers of two - but that has nothing to do with whether the addressed items are bytes, 37-bit words or whatever.

    Hard disks are memory, and counting that memory in powers of two makes no sense for them, since they store bits in very strange patterns, therefore hard disk manufacturers never adopted it. Computer networks transfer memory, and counting that memory in powers of two makes no sense, especially since they often transfer bits and not bytes, hence network designers prefer using bits and their decimal multiples rather than their binary counterparts, and they've always done so.

    If you broaden your vision, you'll see that it's transistor-based memory that is "the exception". Therefore the onus should be on operators of that field to adopt the standard binary prefixes, as ugly as they may sound (and no, I don't like them either), in order to avoid ambiguity with the terms used by the rest of the world.

  • Oh puh-lease (Score:4, Insightful)

    by Excelcia ( 906188 ) <slashdot@excelcia.ca> on Monday February 11, 2013 @08:39AM (#42857637) Homepage Journal

    I think it's safe to say that most people who have been in the scene from the beginning think the "correct" SI definition of kilo can also fuck off when it comes to computers. As can kibi, mebi and friends. It had been accepted from the beginning that "kilo" meant something just a little different when it came to describing bytes. I accepted that. Everyone accepted that. There was no problem. Even in academic circles, there were no issues.

    The problem came with the storage industry and their pious "oh, but that's not what SI says the units mean". If you think that conforming to strict SI is the reason they made their change, then I'd suggest you not accept kool-aid from strangers. Ever. It was marketing greed, nothing more.

    However, while I think kibi, mebi and friends can fall down a deep dark hole, I actually don't mind using their unit symbols. At least that way there is no misunderstanding in writing about what is meant, and the trickle-down effect from academic papers, where it's vital to specify what a value means, to more lay writings can occur without changing the unit symbols. But I do not now, nor will I ever, read 500MiB as "five hundred mebibytes".

  • Re:Even worse! (Score:5, Insightful)

    by Bogtha ( 906264 ) on Monday February 11, 2013 @08:41AM (#42857655)

    Don't be ridiculous, a 5-line Perl script would do a much better job. I suspect he is a 10 million-line Brainfuck program.

  • by staalmannen ( 1705340 ) on Monday February 11, 2013 @08:50AM (#42857707)

    and by the same standards, 2^10 bytes is a KiB [wikipedia.org]

    and yes, why is this geek news when anyone with a passing interest, or who has ever done a wiki crawl, will know this?

    Indeed, and since when did it matter what Microsoft does on /.? Stuff on /. seems to get less and less "nerd" (figuring out how stuff works / hacking together solutions) and more and more "geek" (the "tech hipster" buying the latest stuff, preferably before it is cool).

  • by Skapare ( 16644 ) on Monday February 11, 2013 @09:01AM (#42857789) Homepage

    The basic issue is Marketing Speak. Those people don't understand how to use the Geek Speak values of 1024, 1048576, and 1073741824. They are going to use 1000, 1000000, and 1000000000. Just understand that and live with it. I do. As long as the sectors come across as sizes 512 and 4096 (instead of 500 and 4000), the device can work. I remember working with mainframes and having sector sizes of 800 on some drives.

    I don't use this KiB, MiB, and GiB crap in my software. The standards group that created them doesn't have oversight over software; they were intended for hardware and marketing, which hardly ever use them. I have code for doing number conversion with metric-LIKE suffixes, but that specifically needs a single letter, so that's just gonna be the way it is. Use it where the binary-ish values apply and don't use it where you need powers of ten.

    It's all about knowing which way to interpret the numbers. For disk drives I know they are talking about k=1000, M=1000000, and G=1000000000.
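
    (Something in the spirit of the single-letter suffix conversion the parent describes might look like the following sketch; the table and function are hypothetical, not the parent's actual code:)

        # Hypothetical single-letter suffix parser using binary-ish values
        SUFFIXES = {"k": 2**10, "M": 2**20, "G": 2**30}

        def parse_size(text):
            if text and text[-1] in SUFFIXES:
                return int(text[:-1]) * SUFFIXES[text[-1]]
            return int(text)

        print(parse_size("4k"))    # 4096
        print(parse_size("512"))   # 512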

  • by neokushan ( 932374 ) on Monday February 11, 2013 @09:22AM (#42857917)

    and yes, why is this geek news when anyone with either a passing interest, or who has ever done a wiki crawl, will know this?

    Easy, because it's bashing Windows and this is Slashdot.

  • Re:Not this again (Score:2, Insightful)

    by Anonymous Coward on Monday February 11, 2013 @09:42AM (#42858059)

    This is really quite simple: we cannot allow nomenclature abuse. If you let people redefine all the terms as they like, it all breaks down. So 1K is one thousand, no matter what you want to tell me it is. 1Ki was defined to make your life easier, so quit complaining.

  • by nedlohs ( 1335013 ) on Monday February 11, 2013 @09:57AM (#42858209)

    What does the units you use in memory allocation have to do with how you define kilobyte? The computer doesn't care if you call 1024 bytes a kilobyte or a foomboozlebyte so what possible difference can it make? Nor does the computer care if what you call a kilobyte is 1000 bytes or 1024 bytes or 27 bytes.

    Do you do malloc(1000) to get 1024 bytes allocated on your weird computer or something??? If not, then how does 1 kilobyte == 1000 bytes stop you from allocating memory by powers of 2? Surely your logic has to be doing that calculation already and really doesn't care what you call 2^10 bytes.

  • Re:GiB (Score:5, Insightful)

    by L4t3r4lu5 ( 1216702 ) on Monday February 11, 2013 @10:51AM (#42858699)
    Sorry, but that's rubbish. "Giga is a unit prefix in the metric (base 10) system denoting 10^9. Its symbol is G." Quoted from Wikipedia, GB = Gigabyte = 10^9 bytes.

    The fact that the IT tech industry refused to go by this standard first used in 1947 is our (their) own fault. They should have invented a new unit prefix and stuck to it.

    Other areas where they got it right: Gigabit networking, Gigahertz clock speeds. Why the fuck should they get a free pass for Gigabyte?!
  • by steelfood ( 895457 ) on Monday February 11, 2013 @01:00PM (#42860555)

    When you read GiB in your head, do you say "gigabyte" or "gibibyte?"

  • by Anonymous Coward on Monday February 11, 2013 @01:24PM (#42861021)

    Right, which is why disk sectors are 500 or more recently 4000 bytes...

  • by gomiam ( 587421 ) on Monday February 11, 2013 @01:40PM (#42861315)
    I say "binary gigabytes" in my head. I certainly dislike the sound of "gibibytes".
  • by bws111 ( 1216812 ) on Monday February 11, 2013 @03:36PM (#42863339)

    None of what you are talking about has anything to do with what I said. I am talking about the measurement of things, not the things themselves.

    Memory components are sized on power-of-two boundaries. This is necessary because if they were other than a power of two in size, math would have to be performed on each memory access. For instance, if you had memory chips that were 1000 bytes in size and you wanted to access byte 1024, you would have to perform a calculation to find that the byte is at location 24 in the second chip. With binary sizes, however, all you need to do is use the address lines to directly access the correct location in the correct chip. Also note that the word size of the data does not matter: you could return 1 bit, 8 bits, 10 bits, anything at all. What matters is that the number of 'things' (whatever the size of the 'thing' itself) is always a power of two.
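
    (A quick Python sketch of that difference, using the parent's hypothetical chip sizes:)

        address = 1024

        # 1000-byte chips: real arithmetic on every access.
        chip, offset = divmod(address, 1000)   # -> chip 1, offset 24

        # 1024-byte chips: the address lines split directly.
        chip2 = address >> 10                  # high bits select the chip -> 1
        offset2 = address & 0x3FF              # low bits select the byte  -> 0

        print(chip, offset, chip2, offset2)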

    Network speeds are not dependent in the slightest on a power of two, regardless of the data being transported. There is absolutely no reason to say that a network that can transfer 1024 bits per second is in any way better or more natural than one that can transfer 1000 bps or one that can transfer 1100 bps. There is no reason to assume that a 'kilobit per second' is anything other than 1000 bps. And if you change the measurement to count bytes instead of bits, a network can transfer 137.5 Bps as easily as it can transfer 1100 bps, or 1.1 kbps.

    Hard disk sizes are like network speeds: there is no inherent power-of-two to their size. There is no reason why a disk could not be made to hold exactly 1000000 bytes (excluding the fact that you would have a partial sector). Therefore, trying to force some power-of-two based prefix on those sizes is just silly.

"Ninety percent of baseball is half mental." -- Yogi Berra

Working...