this post was submitted on 27 Nov 2023
887 points (96.8% liked)

[–] tromars@feddit.de 167 points 11 months ago (6 children)

I know this is a joke, but in case some people are actually curious: the manufacturer gives the capacity in terabytes (= 1 trillion bytes) and the operating system probably shows it in tebibytes (1024^4 bytes ≈ 1.1 trillion bytes). So 2 terabytes are two trillion bytes, which is approximately 1.82 tebibytes.
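For anyone who wants to check the arithmetic themselves, here's a quick sketch (plain Python; the 2 TB figure is the one from the meme):

```python
# Marketing capacity: 2 terabytes = 2 * 10^12 bytes
capacity_bytes = 2 * 10**12

# What the OS shows if it divides by 1024^4 (tebibytes)
capacity_tib = capacity_bytes / 1024**4

print(f"{capacity_bytes:,} bytes")   # 2,000,000,000,000 bytes
print(f"{capacity_tib:.2f} TiB")     # ~1.82 TiB
```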

[–] takeda@lemmy.world 133 points 11 months ago (6 children)

They could easily use the proper units, but at some point someone decided to cheat, and now everyone does it, to the point that it has become the standard.

[–] accideath@lemmy.world 21 points 11 months ago

Before mebi-, gibi-, tebibytes, etc. were a thing, it was the hard drive manufacturers who were cheating a little. Everyone saw a kilobyte as 1024 bytes, but the storage manufacturers used the SI definition of kilo = 1000 to their advantage.

By now, however, kibibytes being 1024 bytes and kilobytes being 1000 bytes is pretty much the standard that most agree on. One notable exception is, of course, Windows…

[–] sudoku@programming.dev 8 points 11 months ago (2 children)

Indeed, Windows could easily stop mislabeling TiB as TB, but it seems it's too hard for them.

[–] guy@lemmy.world 8 points 11 months ago* (last edited 11 months ago) (1 children)

The IEC changing the definition of 1 KB from 1024 bytes to 1000 bytes was a terrible idea that's given us this whole mess. Sure, it's nice and consistent with the scientific prefixes now... except it's far from consistent in actual usage. So many things still treat it as a binary prefix, following the JEDEC standard. Just like KiB is always 1024 bytes, I really think they should've introduced another new, unambiguous unit, e.g. KoB, that's always 1000 bytes, and deprecated the poorly defined KB altogether.

[–] sudoku@programming.dev 7 points 11 months ago (2 children)

M stands for mega, an SI prefix that existed long before the computer data it's now used to label. MB being 1,000,000 bytes was always the correct definition; it's just that someone decided they could somehow change it.

[–] guy@lemmy.world 3 points 11 months ago

Consistency with the proper scientific prefixes is nice to have, but consistency within the computing industry itself is really important, and now we have neither. In this industry, binary calculations were central, and powers of 2 were much more useful. They really should've picked a different prefix to begin with, yes. But as for the IEC correcting it retroactively: that has failed. It's a mess that's far from actually standardised now.

[–] barsoap@lemm.ee 0 points 11 months ago (1 children)

B and b have never been SI units. The closest is Bq. So if people hadn't been insisting that it's confusing, no one would've been confused.

[–] sudoku@programming.dev 2 points 11 months ago

That does not mean you can misuse SI prefixes just because the unit itself is not part of the system.

[–] TechAdmin@lemmy.world 6 points 11 months ago

I think there were some court cases in the US that the HDD manufacturers won, which allow them to keep using those stupid crap units and continue to mislead people. It's been a minor annoyance for decades, but since all the competition does it and no government is willing to do anything, everyone is stuck accepting it as is. I should start writing down the capacity in multiple units in reviews whenever I buy storage devices going forward.

[–] henfredemars@infosec.pub 4 points 11 months ago

And as far as my wife is concerned, I'm definitely 6 ft tall. Height ain't what it used to be.

[–] ininewcrow@lemmy.ca 3 points 11 months ago (1 children)

So what you're saying is that ... we can make up whatever number and standard we want? ... In that case, would you like to buy my 2 Tyrannosaurusbyte hard drive?

[–] Knusper@feddit.de 7 points 11 months ago

Nah, the prefixes kilo-, mega-, giga- etc. are defined precisely how hard drive manufacturers use them, in the SI standard: https://en.wikipedia.org/wiki/International_System_of_Units#Prefixes

The 1024-based magnitudes, which the computing industry introduced, were non-standard. These days, those prefixes are officially called kibi-, mebi-, gibi- etc.: https://en.wikipedia.org/wiki/Binary_prefix
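To make the two prefix systems concrete, here's a small sketch (values taken straight from the SI and IEC definitions linked above):

```python
# Decimal (SI) prefixes vs. binary (IEC) prefixes
si  = {"kB": 10**3, "MB": 10**6, "GB": 10**9, "TB": 10**12}
iec = {"KiB": 2**10, "MiB": 2**20, "GiB": 2**30, "TiB": 2**40}

for (s, sv), (i, iv) in zip(si.items(), iec.items()):
    print(f"1 {s:>3} = {sv:>16,} bytes | 1 {i} = {iv:>16,} bytes | ratio {iv / sv:.3f}")
```

The gap grows with each step: about 2.4% at kilo vs. kibi, and almost 10% at tera vs. tebi, which is exactly the "missing" 0.18 TB from the meme.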

[–] CosmicTurtle@lemmy.world 32 points 11 months ago (3 children)

You're missing a huge part of the reason why the term 'tebibytes' even exists.

Back in the 90s, when USB sticks were just coming out, a megabyte was still 1024 kilobytes. Companies saw the market get saturated with drives but they were still expensive and we hadn't fully figured out how to miniaturize them.

So some CEO got the bright idea of changing the definition of a "megabyte" to mean 1000 kilobytes. That way they could say their drive had more megabytes than their competitors'. "It's just 24 kilobytes. Who's going to notice?"

Nerds.

They stormed various boards to complain, but because the average user didn't care, sales went through the roof and soon the entire storage industry changed. Shortly after that, they started cutting costs to actually make smaller-sized drives while still calling them by their original size, i.e. 64 MB* (where 64 MB now means 64,000,000 bytes).

The people who actually cared had to invent the term "mebibyte" purely because of some CEO wanting to make money. And today we have a standard that only serves to confuse people who actually care whether their 2 TB is 2048 GiB or just 1.82 TiB.

[–] pHr34kY@lemmy.world 19 points 11 months ago* (last edited 11 months ago) (1 children)

Dude, a "1.44MB" floppy disk was 1.38MiB once formatted (1,474,560 B raw). It's been going on for eternity.

It's inconsistent across time though. 700MB on a CD-R was MiB, but a 4.7GB DVD was not.
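Putting numbers on that inconsistency (a rough sketch; the 700 MB and 4.7 GB figures are the nominal ones from the comment above):

```python
# A "700 MB" CD-R actually holds 700 MiB of data
cd_bytes = 700 * 2**20        # 734,003,200 bytes
# A "4.7 GB" DVD uses decimal gigabytes
dvd_bytes = 4.7 * 10**9       # 4,700,000,000 bytes

print(f"CD-R: {cd_bytes:,} bytes = {cd_bytes / 10**6:.0f} MB (decimal)")
print(f"DVD : {dvd_bytes:,.0f} bytes = {dvd_bytes / 2**30:.2f} GiB")
```

So the "700 MB" disc is really 734 decimal MB, while the "4.7 GB" disc is only about 4.38 GiB.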

[–] davidgro@lemmy.world 4 points 11 months ago

One thing to point out: the floppy thing isn't due to formatting, the units themselves were screwed up. It's not 1.44 million bytes or 1.44 MiB regardless of formatting; it's 1440 KiB! (Which produces the raw size you gave.) That's about 1.406 MiB unformatted.

The reason is that they were doubled from 720 KiB disks*, and the largest standard 5¼ inch disks ("1.2 MB") were doubled from 600 KiB*. I guess it seemed easier or less confusing to users than having double 600k become 1.17M.

(* Those smaller sizes were themselves already doubled from earlier sizes. The "1.44 MB" ones are double-sided, high-density.)
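The floppy arithmetic, spelled out (a minimal sketch using the numbers above):

```python
# "1.44 MB" floppy: 1440 KiB, i.e. 1440 * 1024 bytes
floppy_bytes = 1440 * 1024    # 1,474,560 bytes, the raw size mentioned above

print(f"{floppy_bytes:,} bytes")
print(f"{floppy_bytes / 10**6:.4f} MB  (decimal)")   # ~1.4746
print(f"{floppy_bytes / 2**20:.4f} MiB (binary)")    # ~1.4063
# So "1.44" is neither a decimal nor a binary megabyte: it's 1440 KiB,
# a unit that mixes the two systems.
```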

[–] wischi@programming.dev 10 points 11 months ago* (last edited 11 months ago)

That's just wrong. "Kilo" is ancient Greek for "thousand". It always meant 1000. Because bytes are grouped in powers of two, and because of the pure coincidence that 10^3 (1000) is almost the same size as 2^10 (1024), people colloquially said kilobyte when they meant 1024 bytes, but that was always wrong.

Update: To make it even clearer, try to imagine what would have happened historically if, instead of binary, most computers used ternary. Nobody would even think about reusing kilo for 3^6 (= 729) or 3^7 (= 2187), because they are not even close.

Reusing well-established prefixes like kilo was always a stupid idea.

[–] gornius@lemmy.world 6 points 11 months ago

Or - you know - for consistency? In physics, kilo, mega, etc. are always 10^(3n), but then, for some bizarre reason, units of information use the same prefixes as 2^(10n).

[–] IWantToFuckSpez@kbin.social 3 points 11 months ago

Depends on the OS. For some reason macOS uses base 10.