this post was submitted on 05 Jun 2024
224 points (97.1% liked)

Showerthoughts


A "Showerthought" is a simple term for the thoughts that pop into your head while you're doing everyday things like taking a shower, driving, or just daydreaming. The best ones are thoughts that many people can relate to, finding something funny or interesting in ordinary stuff.


founded 1 year ago
[–] noxy@yiffit.net 63 points 1 month ago (10 children)

if you use .rar you're an asshole

[–] kratoz29@lemm.ee 17 points 1 month ago (1 children)

What the hell, how so?

Now that I think about it not much software comes in rar nowadays.

[–] bjoern_tantau@swg-empire.de 53 points 1 month ago (4 children)

Because it's a garbage proprietary format that needs extra software on every OS. But for some inane reason it's become the standard for piracy stuff. I think that's the only reason it's still alive.

[–] Anticorp@lemmy.world 27 points 1 month ago (2 children)

It's not garbage. It's used in the pirate community and elsewhere because back in the day things were shared on the Usenet before they were shared anywhere else. There's a limit for file size on the Usenet, so we needed to be able to break compressed files into multiple parts and have an easy way to put them back together when uncompressing. WinZip did not have that functionality. You can thank WinRar for powering the entire sharing scene for decades. When torrents were becoming popular, NO distributors shared on torrent. They shared on the Usenet. Then someone would take a Usenet share and post it to the torrent network. Torrents wouldn't have had much success, or would have taken much longer to catch on, if it wasn't for WinRar and the Usenet.

[–] noxy@yiffit.net 8 points 1 month ago (2 children)

7z works fine, and isn't proprietary.

[–] Anticorp@lemmy.world 9 points 1 month ago

7 zip didn't gain popularity until years later. WinRar was essentially free, since most people never bought the lifetime license.

[–] BigDanishGuy@sh.itjust.works 4 points 1 month ago (1 children)

There's a limit for file size on the Usenet

No, there is no limit on the file size on usenet. There's a limit on the individual article size, but larger files just require more articles.

The reason why files were split on usenet was completion and corruption, and probably also media size originally. Say you need to post a 700MB file to alt.binaries.erotica.grannies.diapers, then you could just split those 700MB into 477867 articles of 1.5kB each, but if a single article is then corrupted or dropped, then nobody can get the file. If you split the 700MB into 35 files of 20MB each, and each 20MB file into 13654 articles, then a dropped article only corrupts a single file. Add to that, that completion issues often occured (or is it occurs? it's been a long while since I got my Linux iso files from usenet) close to each other. So there might be a bunch of corruption in a single file, but everything else is fine. This is useful if your main provider was your ISPs complimentary usenet server, and you only got the rest from a pay by download service.

About the media comment earlier, I can't be sure. I wasn't around in the early days, but I know that the 700MB file size for movies came from the limitations of CDs. Splitting files quite possibly stems from similar restrictions on some removable media.
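The split-size arithmetic above can be sketched out (the 1.5 kB article size and 20 MB volume size are taken from the numbers in this comment, not any usenet standard):

```python
import math

ARTICLE = 1536                   # ~1.5 kB per usenet article (assumed)
FILE_SIZE = 700 * 1024 * 1024    # one 700 MB release
VOLUME = 20 * 1024 * 1024        # one 20 MB rar volume

# posted as one big file: a single dropped article corrupts everything
articles_total = math.ceil(FILE_SIZE / ARTICLE)

# split into 20 MB volumes: a dropped article only costs one volume
volumes = math.ceil(FILE_SIZE / VOLUME)
articles_per_volume = math.ceil(VOLUME / ARTICLE)

print(articles_total, volumes, articles_per_volume)  # 477867 35 13654
```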

You can thank WinRar for powering the entire sharing scene for decades

And the saints behind WinRar for only bugging you to pay. TBH, the first time installing 7z instead of WinRar on a fresh Windows install felt a bit sad.

[–] frezik@midwest.social 9 points 1 month ago

RAR has internal file checking and redundancy that allows it to recover from a level of transmission errors. Some of the more clandestine ways pirate teams transfer things are by means that aren't totally reliable, so this is very important. BitTorrent uploaders tend to take the file exactly as they get it, so there you go.

BitTorrent has more sophisticated ways of checking correctness than RAR, so the RAR layer isn't really necessary there. It's just too much effort for uploaders to bother repackaging.

[–] shalafi@lemmy.world 8 points 1 month ago (2 children)

Windows opens RAR files right out of the box. Just tested.

And if you need a separate unzipper for whatever reason, 7-Zip opens all the things.

[–] konalt@lemmy.world 17 points 1 month ago (3 children)

Only WinRAR can create RAR files if I recall correctly. That's the proprietary part.

[–] SquigglyEmpire@lemmy.world 4 points 1 month ago

Windows now handles 7z files natively too (as of the upcoming Windows 11 24H2 version). I'm glad they've at least added some legit new features to File Explorer.

[–] bitchkat@lemmy.world 6 points 1 month ago (1 children)

I rarely get rars any more. Almost always a single .mkv and a .nfo.

[–] lunarul@lemmy.world 5 points 1 month ago

That's because you're not getting them from the original source. Scene releases come in multi-volume zipped rars. I don't know why they need to be double archived, but they are. But lots of people will take those, unarchive, then re-upload or put them up in a torrent.

[–] Damage@feddit.it 12 points 1 month ago (1 children)

Every scene releaser is an asshole then

[–] noxy@yiffit.net 14 points 1 month ago

YEP!

tho I still appreciate the work, just.....why that

[–] aeronmelon@lemmy.world 11 points 1 month ago

.rar, .r00, .r01, .r02...

[–] db2@lemmy.world 41 points 1 month ago (1 children)

For a few hundred kilobyte file sure, the difference is like pocket change. For a larger one you'd choose the right tool for the job though, especially for things like a split archive or a database.

[–] Im_old@lemmy.world 16 points 1 month ago (1 children)

Username checks out! Also, you're absolutely right. Just last month I was looking for the best compression algorithm/package to archive a 70GB DB.

[–] aard@kyu.de 32 points 1 month ago (1 children)

Nowadays it matters if you use a compression algorithm that can utilize multiple cores for packing/unpacking larger data. For a multiple GB archive that can be the difference between "I'll grab a coffee until this is ready" or "I'll go for lunch and hope it is done when I come back"

[–] quicksand@lemm.ee 7 points 1 month ago (1 children)

In that case, which file type would you recommend?

[–] aard@kyu.de 12 points 1 month ago (1 children)

I personally prefer bzip2 - but it needs to be packed with pbzip2, not the regular bzip2, to generate archives that can be extracted on multiple cores. Not a good option if you have to think about Windows users, though.
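pbzip2 gets its parallelism by compressing independent blocks and concatenating the resulting bz2 streams, which decompressors accept as one file. A rough sketch of the same idea in Python (the thread pool and 1 MB block size are my choices, not pbzip2's actual internals):

```python
import bz2
from concurrent.futures import ThreadPoolExecutor


def parallel_bzip2(data: bytes, block_size: int = 1024 * 1024) -> bytes:
    """Compress each block independently and concatenate the streams.

    CPython's bz2 releases the GIL while compressing, so threads give real
    parallelism, and concatenated .bz2 streams still form a valid .bz2 file.
    """
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    with ThreadPoolExecutor() as pool:
        return b"".join(pool.map(bz2.compress, blocks))


# round trip: bz2.decompress handles multi-stream input
payload = b"hello usenet " * 200_000
assert bz2.decompress(parallel_bzip2(payload)) == payload
```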

[–] qjkxbmwvz@startrek.website 29 points 1 month ago (2 children)

In before the .tar.gz/.tar.bz2 gang...

[–] boreengreen@lemm.ee 27 points 1 month ago (3 children)

Why isn't everyone using .7z ?

[–] dustyData@lemmy.world 23 points 1 month ago (3 children)

Because gzip and bz2 exist. 7z is almost always a plugin, addon, or extra application, while the first two work out of the box pretty much everywhere. It also depends on frequency of access, frequency of addendum, size, type of data, etc. If you have an archive that you have to add new files to frequently, 7z is gonna start grating on you with the compression times. But it's OK if you are going to extract very frequently from an archive that will never change. gz and bz2 are overall the "good enough at every use case" formats.

[–] Willy@sh.itjust.works 5 points 1 month ago* (last edited 1 month ago) (1 children)

Those were the days. For anyone under 40, see this for what we dealt with: https://support.usr.com/support/s-modem/s-modem-docs/usrv90.pdf The plug and play section is especially amusing these days.

[–] Willy@sh.itjust.works 4 points 1 month ago* (last edited 1 month ago) (1 children)

I tried to look it up, but Google was useless and Bing seems to stall when you type it. If I remember, AT&FM1 is return to factory settings, option 1. What does the rest of it mean? Is S11 the dial speed?

[–] IHawkMike@lemmy.world 4 points 1 month ago (2 children)

Yeah &F is factory default, M1 is speaker on only until connect, S11=35 is the dial speed (although we later learned that 50 ms is the minimum). Dial speed was important because we'd have Telemate on constant redial trying to get into the BBSes that were popular but were busy because they only had one or two phone lines.

[–] Willy@sh.itjust.works 6 points 1 month ago (1 children)

Nice. Thank you. I'm proud I almost remembered that. I never used Telemate, but I did partner in hosting a small BBS with a friend, FarpointBBS. We had TradeWars, and the main dude even got it hooked up to the internet in the late days. As you know, way before the www took off.

[–] IHawkMike@lemmy.world 5 points 1 month ago

Sooo many good memories.

[–] Willy@sh.itjust.works 3 points 1 month ago

M1 is mute true. Now it’s coming back!

[–] ssm@lemmy.sdf.org 17 points 1 month ago

Now we have so much bandwidth it doesn't matter

Squints eyes

Now we just don't care about even the slightest modicum of efficiency

[–] Swarfega@lemm.ee 13 points 1 month ago (1 children)

In the early days of the internet, WinZip was a must have tool. My college had a fast internet connection. I say fast but I bet it was less than 1Mb shared between everyone. Way faster than the 33k modem I had at home.

I used my college connection to download so much and then took it home on floppy disks. For files larger than 1MB I'd use WinZip to split files up.
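Splitting a download into floppy-sized pieces, as WinZip's disk-spanning did, is conceptually just chunking bytes and reassembling them in order. A minimal sketch (the `.000`/`.001` naming and the 1.44 MB chunk size are illustrative assumptions, not WinZip's actual format):

```python
from pathlib import Path

FLOPPY = 1_440_000  # roughly the usable bytes on a 1.44 MB floppy


def split_file(path: str, chunk_size: int = FLOPPY) -> list[Path]:
    """Split a file into path.000, path.001, ... pieces."""
    data = Path(path).read_bytes()
    parts = []
    for n, i in enumerate(range(0, len(data), chunk_size)):
        part = Path(f"{path}.{n:03d}")
        part.write_bytes(data[i:i + chunk_size])
        parts.append(part)
    return parts


def join_files(parts: list[Path], out: str) -> None:
    """Reassemble the pieces, in order, into one file."""
    Path(out).write_bytes(b"".join(p.read_bytes() for p in parts))
```

A 3 MB file would come back from `split_file` as three parts, the last one only partially full.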

[–] Brunbrun6766@lemmy.world 5 points 1 month ago (2 children)

And then you get it all off the floppies only to realize 10% of it is corrupted.

[–] kepix@lemmy.world 12 points 1 month ago

Still using 7z. Less space, and easier to browse, since the operating system doesn't have to deal with all the individual files and it's easier for the cloud to tag. Not caring about space makes storage more expensive, and even games are bigger now with little to no added content.

pkz204g.exe will always hold a place in my heart

[–] azimir@lemmy.ml 11 points 1 month ago (1 children)

How about when people's websites would list the sizes of linked images and files so you could estimate how long a given download would take? Basically anything 30KB and above would have a size warning attached.

[–] Damage@feddit.it 9 points 1 month ago (2 children)

I used to use Opera with image loading disabled

[–] RememberTheApollo_@lemmy.world 10 points 1 month ago (2 children)

Just reserve your dislike for the ones still doing .bin, .img, and .cue.

[–] KickMeElmo@sopuli.xyz 12 points 1 month ago

Depends on what you're doing. Dumps of multitrack CD media should always be bin+cue or a compressed version thereof, such as chd. DVDs and Blu-rays can dump as iso. There are also some extremely niche cases such as specific copy protection that require mdf+mds for a proper dump, but that won't be something the average user ever encounters. Basically, those formats exist and are still used for a reason, whether you understand them or not.

I do reserve some hatred for people who dump PS1 games as iso, or who use ccd+img+sub for things where the subchannels have no valid usage.

[–] aeronmelon@lemmy.world 8 points 1 month ago

1:1 copies of the bits on the disc are a valid option that some people prefer, especially if you want to burn your own physical disc or make compressed files encoded in a very specific way. It's also the most reliable way to archive a disc for long-term storage.

[–] Trainguyrom@reddthat.com 6 points 1 month ago

In many cases there is some network-level compression going on, particularly on higher-speed LANs.

[–] Magister@lemmy.world 6 points 1 month ago

I still used uuencode/uudecode to transfer some files between terminals a few weeks ago.
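For anyone who never ran into it: uuencode wraps binary data in printable ASCII lines so it survives text-only channels like terminals and usenet articles. A simplified sketch of the format using Python's binascii (the header fields here are minimal; the real tool also handles file permissions properly):

```python
import binascii


def uuencode(data: bytes, name: str = "file", mode: str = "644") -> bytes:
    """Encode bytes as a uuencoded text block, 45 raw bytes per line."""
    lines = [f"begin {mode} {name}\n".encode()]
    for i in range(0, len(data), 45):
        lines.append(binascii.b2a_uu(data[i:i + 45]))
    lines.append(b"`\nend\n")  # zero-length line, then the end marker
    return b"".join(lines)


def uudecode(text: bytes) -> bytes:
    """Decode a block produced by uuencode() above."""
    out = []
    for line in text.splitlines()[1:]:   # skip the "begin" header
        if line in (b"`", b"end"):       # terminator lines
            break
        out.append(binascii.a2b_uu(line))
    return b"".join(out)


assert uudecode(uuencode(b"hello from the terminal")) == b"hello from the terminal"
```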
