this post was submitted on 12 Jul 2024
970 points (95.6% liked)

Technology

[–] Tattorack@lemmy.world 13 points 3 months ago (2 children)

Wouldn't just one GPU be enough to run the Sphere, or am I getting something wrong?

I remember hearing that it's not exactly high resolution, each "pixel" being a bunch of pretty large lamps.

[–] ilinamorato@lemmy.world 17 points 3 months ago* (last edited 3 months ago) (3 children)

Wikipedia says it's 16,000x16,000 (which is way less than I thought). Doing the math, that's roughly 16x the pixels of a 4K monitor (if you round 4K up to about 4,000x4,000), so 16 GPUs would make sense. And there's a screen inside and one outside, so double that. But I still can't figure out why it needs five times that. Redundancy? Poor optimization? I dunno.
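Quick back-of-the-envelope version of that math, purely illustrative: it assumes the 16,000x16,000 figure above, one GPU per "4K screen's worth" of pixels, and two screens total, none of which are official specs.

```python
# Back-of-the-envelope GPU count for the Sphere's interior screen.
# Assumed figures (not official specs): a 16,000 x 16,000 pixel display,
# one GPU per "4K monitor's worth" of pixels, and two screens (inside + outside).

sphere_pixels = 16_000 * 16_000  # ~256 million pixels

for label, w, h in [("4K rounded to 4,000x4,000", 4_000, 4_000),
                    ("4K UHD (3840x2160)", 3_840, 2_160)]:
    per_screen = sphere_pixels / (w * h)
    print(f"{label}: ~{per_screen:.0f} GPUs per screen, "
          f"~{2 * per_screen:.0f} for both screens")
# The loose rounding gives the 16-per-screen figure above; strict 4K UHD
# pushes it closer to 31 per screen.
```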

[–] Tattorack@lemmy.world 12 points 3 months ago (3 children)

But wouldn't that only be necessary if it needed to render real-time graphics at such a scale? If I'm correct, all it's doing is playing back videos.

[–] ilinamorato@lemmy.world 5 points 3 months ago

I think it's doing a non-trivial amount of rendering, since it's often syncing graphics with music played live.

[–] st14@lemmus.org 1 points 3 months ago

Live audio visualization in game engines is definitely a thing, e.g. https://youtu.be/IZL7VAt97ws?si=H74SwrLZYfsYNTY8
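For a rough, engine-agnostic sense of the idea (the function, bands, and thresholds here are made up for illustration): FFT an audio frame and let band energies drive a visual parameter.

```python
import numpy as np

def audio_reactive_color(samples: np.ndarray, sample_rate: int = 48_000) -> tuple:
    """Map one audio frame to an RGB color: bass drives red, treble drives blue."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)

    bass = spectrum[freqs < 200].mean()       # low-frequency energy
    treble = spectrum[freqs > 4_000].mean()   # high-frequency energy
    total = spectrum.mean() + 1e-9            # avoid division by zero

    return (min(bass / total, 1.0), 0.2, min(treble / total, 1.0))

# Example: a 440 Hz tone lands in neither band, so red and blue stay near zero.
frame = np.sin(2 * np.pi * 440 * np.arange(1024) / 48_000)
print(audio_reactive_color(frame))
```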

[–] stormeuh@lemmy.world 1 points 3 months ago

Even if it's just playing back videos, it still has to compensate for the distortion of the spherical display. That's a "simple" 3D transformation, but with that many pixels, coordination between the GPUs, and some redundancy, it doesn't seem like an excessive amount of computing power. The whole thing is still an impressive excess, though...
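A rough sketch of what that kind of remap could look like, assuming the source content is an equirectangular (lat/long) video frame and each physical display pixel is addressed by its direction on the sphere; the names and layout are illustrative, not how the Sphere actually does it.

```python
import numpy as np

def remap_equirect_to_sphere(frame: np.ndarray, directions: np.ndarray) -> np.ndarray:
    """Sample an equirectangular frame for each display pixel's 3D direction.

    frame:      (H, W, 3) equirectangular source image
    directions: (N, 3) unit vectors, one per physical display pixel
    returns:    (N, 3) colors for those pixels (nearest-neighbor sampling)
    """
    h, w, _ = frame.shape
    x, y, z = directions[:, 0], directions[:, 1], directions[:, 2]

    # Direction -> longitude/latitude -> source texel coordinates.
    lon = np.arctan2(x, z)                 # [-pi, pi]
    lat = np.arcsin(np.clip(y, -1, 1))     # [-pi/2, pi/2]
    u = ((lon / (2 * np.pi) + 0.5) * (w - 1)).astype(int)
    v = ((0.5 - lat / np.pi) * (h - 1)).astype(int)

    return frame[v, u]

# Tiny usage example with a random frame and a few made-up pixel directions.
frame = np.random.rand(512, 1024, 3)
dirs = np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
print(remap_equirect_to_sphere(frame, dirs).shape)  # (3, 3)
```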

[–] Anyolduser@lemmynsfw.com 8 points 3 months ago (1 children)

I'm guessing it's the department of redundancy department, is my guess.

[–] ilinamorato@lemmy.world 2 points 3 months ago

Someone elsewhere in the thread suggested it might be a marketing thing on Nvidia's part, and that makes a lot of sense.

[–] markpaskal@lemmy.ca 5 points 3 months ago (1 children)

I work for a digital display company, and it is definitely redundancy. There will be at least two redundant display systems that go to the modules separately so they can switch between them to solve issues. If a component fails on one side they just switch to the other.

[–] ilinamorato@lemmy.world 2 points 3 months ago

Ah, nice. Thank you for bringing your expertise to my nonsense.

[–] umbraroze@lemmy.world 5 points 3 months ago

The way I think of it, it's possible that a really small number of GPUs would be enough to render the framebuffer; you'd just need an army of low-power graphics units to receive the data and put it on the screens.

Having a high-power GPU for every screen is definitely a loss unless the render job is distributed really well and there are also people around to admire the results at a distance where the pixel differences no longer matter. Which is to say, not here.
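A toy sketch of that split, under made-up assumptions (tile size, receiver interface, and frame size are all illustrative): render one big framebuffer centrally, then carve it into per-panel tiles so each low-power receiver only ever handles its own slice.

```python
import numpy as np

TILE = 256  # made-up per-panel tile size in pixels

def tiles_for_panels(framebuffer: np.ndarray, tile: int = TILE):
    """Split one big rendered frame into per-panel tiles for low-power receivers."""
    h, w, _ = framebuffer.shape
    for row in range(0, h, tile):
        for col in range(0, w, tile):
            # Each receiver only ever sees its own small slice of the frame.
            yield (row // tile, col // tile), framebuffer[row:row + tile, col:col + tile]

# Usage: a (scaled-down) 4096x4096 frame split into 256x256 tiles -> 16x16 = 256 panels.
frame = np.zeros((4096, 4096, 3), dtype=np.uint8)
panels = dict(tiles_for_panels(frame))
print(len(panels), panels[(0, 0)].shape)  # 256 (256, 256, 3)
```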