No Stupid Questions
No such thing. Ask away!
!nostupidquestions is a community dedicated to being helpful and answering each other's questions on various topics.
The rules for posting and commenting, besides the rules defined here for lemmy.world, are as follows:
Rules
Rule 1- All posts must be legitimate questions. All post titles must include a question.
All posts must be legitimate questions, and all post titles must include a question. Joke or trolling questions, memes, song lyrics as titles, etc. are not allowed here. See Rule 6 for the exceptions.
Rule 2- Your question subject cannot be illegal or NSFW material.
Your question subject cannot be illegal or NSFW material. You will be warned first, banned second.
Rule 3- Do not seek mental, medical, or professional help here.
Do not seek mental, medical, or professional help here. Breaking this rule will not get you or your post removed, but it will put you at risk, and possibly in danger.
Rule 4- No self promotion or upvote-farming of any kind.
That's it.
Rule 5- No baiting or sealioning or promoting an agenda.
Questions which, instead of being of an innocuous nature, are specifically intended (based on reports and in the opinion of our crack moderation team) to bait users into ideological wars on charged political topics will be removed and the authors warned - or banned - depending on severity.
Rule 6- Regarding META posts and joke questions.
Provided it is about the community itself, you may post non-question posts using the [META] tag on your post title.
On Fridays, you are allowed to post meme and troll questions, on the condition that they're in text format only and conform with our other rules. These posts MUST include the [NSQ Friday] tag in their title.
If you post a serious question on a Friday and are looking only for legitimate answers, then please include the [Serious] tag in your post title. Irrelevant replies will then be removed by moderators.
Rule 7- You can't intentionally annoy, mock, or harass other members.
If you intentionally annoy, mock, harass, or discriminate against any individual member, you will be removed.
Likewise, if you are a member, sympathiser, or supporter of a movement that is known to largely hate, mock, discriminate against, and/or want to take the lives of a group of people, and you are provably vocal about your hate, then you will be banned on sight.
Rule 8- All comments should try to stay relevant to their parent content.
Rule 9- Reposts from other platforms are not allowed.
Let everyone have their own content.
Rule 10- The majority of bots aren't allowed to participate here.
Credits
Our breathtaking icon was bestowed upon us by @Cevilia!
The greatest banner of all time: by @TheOneWithTheHair!
With computer displays, the only limitation is hardware. If I had to hazard a guess, 144Hz is there because that's approximately the maximum supported across the widest range of hardware, and 144Hz crystals were widely available and therefore cheap. Kind of like how there's a huge market for rollerblade ball bearings: pretty much all power tools use them. They're simply everywhere because they're cheap.
I was really hoping you were Lemmy's 1996 rage in the cage account making every conversation about ball bearings
Haha, never heard of that.
Probably a reference to shittymorph? https://knowyourmeme.com/memes/the-undertaker-threw-mankind-off-hell-in-a-cell
Tell me more about the ball bearing industry please!
Also subscribing for roller blade ball bearing facts
I remember getting ABEC-5 bearings for my blades back in the day. Felt like you were rolling on ice. ABEC-7 was an option, but they were so expensive and the gains were supposedly marginal. Still, I sometimes wonder about what they would've been like.
Really no different. The ABEC rating is about machining tolerances, so they can spin really fast.
Roller blades and skateboards just don't go that fast. Also, the impacts and crap they pick up off the ground damage them far more than an industrial setting would.
They're just fleecing customers
I had huge 100mm wheels, so I thought I felt the difference between ABEC-3 and -5, but maybe that was just placebo.
To quote Wikipedia:
The ABEC rating does not specify many critical factors, such as load handling capabilities, ball precision, materials, material Rockwell hardness, degree of ball and raceway (cone) polishing, noise, vibration, and lubricant. Due to these factors, a high-quality ABEC 3 classified bearing could actually perform better than a lower-quality bearing which satisfies (the stricter) ABEC 7 requirement.
ABEC only rates tolerances, nothing else. The bearings you had were rated, so they performed better than Chinese knockoffs. If you wanted good stuff, you went with Japanese, German, or Korean.
Divide. They needed buffer room because 30, 60, or 120Hz aren't always exactly 30, 60, or 120Hz. Like you said, 144 was just the cheapest that met or exceeded spec.
LCD crystals do have a theoretical maximum, but we don't have display drivers or transmission standards that support those frequencies.
Didn't mean LCD crystals, but just crystal oscillators that are used for timing.
I have 160Hz screen.
Also, 144/24 = 6, and 24fps is the traditional frame rate of film. So 160 is more puzzling from this perspective: it is not divisible by 24 or 30.
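The divisibility argument above is easy to check. A quick sketch (the rate lists are just the examples mentioned in this thread, not an exhaustive survey):

```python
# Which common refresh rates are integer multiples of typical
# content frame rates (24fps film, 25fps PAL, 30fps NTSC)?
refresh_rates = [60, 75, 120, 144, 160, 165]
content_rates = [24, 25, 30]

for hz in refresh_rates:
    multiples = [fps for fps in content_rates if hz % fps == 0]
    print(hz, "divides evenly by:", multiples or "none of them")
```

Running this shows 144 is a clean multiple of 24, 120 covers both 24 and 30, while 160 and 165 divide by none of the common content rates, which is why those numbers look odd.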
Mine is an odd number, 165hz
75hz here. I thought it was pretty weird. It's basically extra spicy 60hz.
75Hz was because of CRT monitors. This very old Tom's Hardware article goes into the math behind the reasoning.
72Hz was used as a refresh rate for CRT monitors back in the day, specifically because it was the average threshold at which no users reported discomfort from CRT flicker. And 144 is 72 * 2.
It is likely a holdover from that era. I think from there, it is a multiple of 24 HZ so movie content scaled smoothly without tearing before vsync? Last part is a guess.
Old reel projectors actually flashed their light at 72Hz. They had to turn off the light to advance the reel to the next frame so you couldn't see the picture moving up off the screen, and human eyes are better at spotting quickly flashing lights than at spotting microstuttery motion, so flashing the bulb once per frame at 24Hz in a dark room was headache inducing. The solution they came up with was to flash the bulb 3 times per frame, which is 72Hz.
144Hz is not a holdover in the case of computer monitors. It's the maximum you can push through dual-link DVI-D at 1080p, which was the only standard that could support that refresh rate when manufacturers began producing LCD monitors built to run 144Hz.
I had to scroll a bit to make sure this answer was here before I wrote the same. :)
The reason 60Hz was so prominent has to do with the power line frequency. Screens originated as cathode ray tube (CRT) TVs that could only use a single frequency, the one chosen by TV networks. They chose the power line frequency because it minimizes flicker when recording under lighting powered at the same frequency you record with, and you want to play back at that same frequency for normal content.
This isn't as important for modern monitors, however. You have image sources other than video content produced for TV, which benefit from higher rates but don't need to match a multiple of 60. So nowadays manufacturers go as high as their panels allow; my guess is 144 exists because that's 6*24Hz (the latter being the "cinematic" frequency). My monitor, for example, is 75Hz, which is 1.5*50Hz (50Hz being the European power line frequency), but the refresh rate is variable anyway, so it can match full multiples of the content frequency dynamically if desired.
ITT: A ton of people who think computer displays can only sync at a single clockrate for some reason.
The numbers are a maximum, and software can lower them or split them up. I worked in a visualization lab and we would often mess with the refresh rates. That being said, sometimes you could alter it and the screen would not respond (show an image), so there must be some limitations.
Wait until you find out why 23.976 was a standard, and still is for most movies. Logical at the time, completely pointless today.
Is that the same reason that 30fps and 29.97 fps are two different things?
They are related. Black-and-white TV ran at 30 frames per second, an obvious easy choice for timing since the US power grid is 60Hz. But the introduction of color added two more channels and caused interference between them. The signal stayed backwards compatible (the luminance channel was black and white, while color TVs used two additional channels for the color information), but color TVs had an issue. The whole 29.97 figure is the result of halving 60/1.001 ≈ 59.94. That 0.1% slowdown was to prevent dot crawl, or chroma crawl (that interference). So all of today's videos in 29.97, even digital ones, are in fact due to backwards compatibility with B&W TVs that no longer exist, which is certainly pointless when it comes to digital formats.
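The arithmetic in that explanation checks out. A quick sketch of the NTSC timing math as described above:

```python
# NTSC color timing: the 60Hz field rate was slowed by a factor
# of 1.001, then halved because interlaced video uses two fields
# per frame.
field_rate = 60 / 1.001        # slowed field rate, ~59.94
frame_rate = field_rate / 2    # two interlaced fields per frame, ~29.97
print(round(field_rate, 2), round(frame_rate, 2))
```

That's where both of the familiar "broken" numbers, 59.94 and 29.97, come from.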
On the other hand, 24fps was just a convenience pick. It was easily divisible by 2, 3, 4, 6... and it was good enough, since film stock was expensive. Europe rolled with half of its power grid frequency, which is 50Hz, so 25fps... and movies stuck with 24, which was good enough and close enough to all the others. They still use this frame rate today, which is a joke considering you can get 8K resolution but have the frame rate of a lantern show from last century.
movies stuck with 24 which was good enough but close enough to all the others. They still use this framerate today which is a joke considering you can get 8K video in resolution but have frame rate of a lantern show from last century.
"But when I saw The Hobbit with 48fps it looked so cheap and fake!"
:)
Because it was fake. :) It's much harder to hide an actor's inability to fight when you see things moving instead of a blurry frame. Or to hide poor animations, when your eyes have time to see the details. Watch a good fighting movie like Ong Bak or anything by Jackie Chan and you'll be fine, because they actually know how to fight. No faking needed.
Yep! Not the only issue with it, but certainly one of them.
We also have everyone associating smooth motion with soap operas because of cheap digital television cameras (IIRC).
I like higher framerates. Sweeping shots and action scenes in 24fps can be so jarring when you're used to videogames.
it did
Of course it did, Weta had no lead time at all. They had years for the original LotR trilogy. They were set up for failure.
But unfortunately it ruined the industry's perception of 48fps movies for years. To the point that when the new Avatar came out last year, they were like "it's 48fps, but we promise we double up frames for some scenes so it's only 24fps for those ones, don't worry!"
I forgot about that. It's true I didn't notice any problems in Avatar 2.
Well, share with the class
Did so, in other comment in this thread. :)
My iMac says it's 77hz.
60Hz was the original clock rate, determined by US power cycles way back in the day. This was 50Hz in some countries.
With LCD screens, higher frame rates became easier to achieve. Manufacturers began to advertise 120Hz TVs and monitors, which set a new bar for frame rates. Some advertise 75Hz monitors, slightly better than 60Hz on paper. 75Hz is achieved by overclocking standard 60Hz control boards; most can reach this refresh rate if they allow it. Later HDMI standards, DisplayPort, and DVI-D support this frame rate at least up to 2K.
144Hz is the same trick as 75Hz, this time with a 120Hz control board. The true standard frame rate is 120Hz; it is clocked higher to achieve 144Hz. Why 144 exactly? This was most likely due to the lack of standards that originally supported higher frame rates. Dual-link DVI-D was the only one that could push 144Hz at 1080p. Any higher frame rate (or resolution) and the signal would exceed its bandwidth. Now 144Hz is simply a new standard number, and plenty of 1440p monitors are set to this frame rate.
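You can sanity-check the dual-link DVI-D claim with rough numbers. This sketch assumes a ~330 MHz maximum pixel clock for dual-link DVI (2 links x 165 MHz) and illustrative reduced-blanking frame totals; real monitor timings vary, so the figures are approximate:

```python
# Rough pixel-clock estimate for 1080p at various refresh rates,
# compared against the dual-link DVI-D limit.
H_TOTAL = 2080   # 1920 active pixels + horizontal blanking (assumed)
V_TOTAL = 1100   # 1080 active lines + vertical blanking (assumed)
DUAL_LINK_DVI_MAX = 330e6  # Hz, approximate (2 x 165 MHz TMDS links)

for hz in (120, 144, 165):
    pixel_clock = H_TOTAL * V_TOTAL * hz  # pixels per second
    verdict = "fits" if pixel_clock <= DUAL_LINK_DVI_MAX else "too much"
    print(f"1080p @ {hz}Hz needs ~{pixel_clock / 1e6:.0f} MHz -> {verdict}")
```

With these assumed blanking figures, 1080p144 lands just under the ~330 MHz cap while 1080p165 blows past it, which matches the comment: 144Hz was about the most you could squeeze through the link.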
Just to point out. I had 120hz on a CRT monitor back in the late 90s/early 2000s. The resolution was terrible though (either 640x480 or 800x600). At good resolutions (1024x768 or 1280x960) you were generally stuck with 75 to 90 at best.
60Hz LCD screens were one of the reasons there was resistance among gamers to move to LCD. Not to mention earlier units took a VGA input, so the picture quality was usually bad compared to CRT, and they added latency. People buying LCDs when they first became available did it for the aesthetics. Where I worked, for example, only the reception desk had an LCD screen.
Also, on a more pedantic point. 50hz is the power line frequency in the majority of the world.
Clear explanation! I assume the overclocking is the reason why my monitor goes to 165Hz.