    [–] areyouevenreal@lemm.ee 0 points 7 months ago (1 children)

Not really. There is a compromise between output resolution, refresh rate, bit depth (think HDR), number of displays, and overall system performance. Another computer might technically support more monitor outputs, but it probably sacrificed something to get there, such as resolution, HDR, power consumption, or cost. Apple is doing 5K output with HDR on their lowest-end chips. Think about that for a minute.
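To put rough numbers on that compromise (back-of-envelope sketch, my own figures: uncompressed pixel rates at 60 Hz, ignoring blanking intervals and link compression like DSC, so real cable rates differ a bit):

```python
# Back-of-envelope illustration of the resolution/refresh/bit-depth
# trade-off: uncompressed bandwidth = width * height * refresh * bpp.

def raw_gbps(width: int, height: int, hz: int, bits_per_pixel: int) -> float:
    """Uncompressed pixel data rate in gigabits per second."""
    return width * height * hz * bits_per_pixel / 1e9

configs = {
    "3x 1080p @ 60 Hz, 8-bit":     3 * raw_gbps(1920, 1080, 60, 24),
    "1x 4K @ 60 Hz, 8-bit":            raw_gbps(3840, 2160, 60, 24),
    "1x 5K @ 60 Hz, 10-bit (HDR)":     raw_gbps(5120, 2880, 60, 30),
}

for name, gbps in configs.items():
    print(f"{name}: {gbps:5.1f} Gbit/s")

# 3x 1080p @ 60 Hz, 8-bit:      9.0 Gbit/s
# 1x 4K @ 60 Hz, 8-bit:        11.9 Gbit/s
# 1x 5K @ 60 Hz, 10-bit (HDR): 26.5 Gbit/s
```

One 5K HDR stream costs close to three times the bandwidth of three 1080p streams combined.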

A lot of people like to blame AMD for high idle power usage when they are running multi-monitor setups with mixed refresh rates and resolutions. Likewise, I have seen Intel systems struggle to drive a single 4K monitor because their memory was running in single-channel mode. Apple probably wanted to avoid those issues on their lower-end chips, which have much less bandwidth to play with.
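For the single-channel case, the rough memory math (my own assumptions, not measurements: a 32-bit framebuffer and about three memory passes per frame for compositing plus scanout; real desktops do more work than this):

```python
# Approximate shared-memory traffic just to display a 4K desktop on an
# iGPU: 4 bytes/pixel, ~3 passes/frame (compositor read + write, then
# display scanout). Actual rendering work is on top of this.

BYTES_PER_PIXEL = 4
PASSES = 3

def desktop_gbs(width: int, height: int, hz: int) -> float:
    """Approximate GB/s of memory traffic to show the desktop."""
    return width * height * hz * BYTES_PER_PIXEL * PASSES / 1e9

need = desktop_gbs(3840, 2160, 60)  # ~6.0 GB/s
single = 21.3                       # DDR4-2666 peak, one 64-bit channel
dual = 2 * single

print(f"4K @ 60 Hz desktop: ~{need:.1f} GB/s")
print(f"single-channel DDR4-2666: {need / single:.0%} of peak")
print(f"dual-channel DDR4-2666:   {need / dual:.0%} of peak")
# Nearly a third of single-channel peak goes to merely displaying the
# desktop, before the CPU or GPU has rendered anything.
```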

    [–] lud@lemm.ee 2 points 7 months ago (1 children)

There is no reason they couldn't do three 1080p monitors or more, especially when the newer-generation chips are supposedly so much faster than the generation before.

    [–] areyouevenreal@lemm.ee -1 points 7 months ago (1 children)

Well yeah, no shit, Sherlock. They could have done that in the first generation. It takes four 1080p monitors to equal the resolution of one 4K monitor. Apple, though, doesn't have a good enough reason to support many low-res monitors. That's not their typical customer base, who mostly use Retina displays or other high-res screens. Apple only sells high-res displays, and the panels in the laptops themselves are way above 1080p. In other words, they chose quality over quantity as a design decision.
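To put numbers on that (quick illustration, standard panel resolutions assumed):

```python
# Pixel counts for common panels: four 1080p monitors exactly match one
# 4K monitor, and one 5K monitor carries ~7x the pixels of a 1080p one.

p1080 = 1920 * 1080   # 2,073,600 pixels
p4k   = 3840 * 2160   # 8,294,400 pixels
p5k   = 5120 * 2880   # 14,745,600 pixels

print(p4k / p1080)    # 4.0
print(p5k / p1080)    # ~7.1
```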

    [–] lud@lemm.ee 2 points 7 months ago* (last edited 7 months ago)

1080p is perfectly fine for getting actual work done, though.

And there is no reason why they couldn't allow you to have multiple normal-res monitors. It's not a technical limitation; it's there to get you to overspend on a more expensive computer.