this post was submitted on 17 Mar 2024
473 points (99.4% liked)

How does this KEEP GETTING WORSE??

[–] starman2112@sh.itjust.works 104 points 6 months ago (5 children)

"And this is not even beginning to touch content and features from other released versions of these games from 20 years ago not present, like four-screen splitscreen."

It's so cool and amazing that we finally have home theatre systems in every fucking house, and that's when devs decided we don't get split screen anymore. Modern hardware is wasted on modern devs. Can we send them back in time to learn how to optimize, and bring back the ones that knew how to properly utilize hardware?

[–] Saik0Shinigami@lemmy.saik0.com 39 points 6 months ago (2 children)

It’s so cool and amazing that we finally have home theatre systems in every fucking house

Yeah, I've noticed this too and it bothers me. We had 4-way split on 20-inch tube TVs, on hardware that measured its RAM in MBs... but on modern 75+ inch TVs, on consoles with GBs of RAM... nah, too hard. You need to buy 4 copies of the game and have 4 separate setups... and probably need to be in 4 separate houses.

Couch co-op dying is basically when I stopped bothering with consoles altogether. If I'm going to use a glorified PC, I might as well just use a full-fat PC and skip consoles entirely. I miss the N64 days.

[–] barsoap@lemm.ee 10 points 6 months ago (2 children)

We had 4-way split on 20-inch tube TVs, on hardware that measured its RAM in MBs

And that hardware was still compute-bound. Things like the N64 pretty much spent their resources per pixel, with mesh data so light that the whole level could sit in the limited RAM at once -- and it needed to, because there weren't CPU cycles left over to implement asset streaming. Nowadays the only stuff in RAM is what you actually see, and with four perspectives, yes, you need four times the VRAM, because every player can be looking at something completely different.
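As a very rough back-of-the-envelope sketch (completely made-up numbers, just to show the shape of the problem):

```python
# Toy numbers only -- nothing here comes from a real engine.

# N64-style: the whole level is resident in RAM all the time,
# so extra viewports cost compute (fill rate, CPU), not memory.
level_assets_mb = 4
old_style_mb = level_assets_mb            # same for 1 or 4 players

# Modern streaming: only what each camera currently sees is resident.
# If the four players look at completely different parts of the world,
# their working sets barely overlap, so memory scales with player count.
per_view_working_set_gb = 3               # hypothetical per-camera budget
players = 4
modern_worst_case_gb = players * per_view_working_set_gb   # ~12 GB

print(f"{old_style_mb} MB then vs ~{modern_worst_case_gb} GB worst case now")
```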

Sure, you can write the game to use 1/4 of the resources, but then you either use that for singleplayer and get bad reviews for bad graphics, or you develop two completely different sets of assets, exploding development costs. I'm sure there also exist shady kitten-drowning marketing fucks who would object along the lines of "but hear me out, let's just sell them four copies instead", but they don't even get to object, because production-wise split-screen simply isn't an option nowadays for games that aren't specifically built around that kind of thing. You can't just add it to any random title for a tenner.

[–] Couldbealeotard@lemmy.world 3 points 6 months ago (1 children)

You know, a fun game doesn't have to be built to use all available resources.

[–] barsoap@lemm.ee 2 points 6 months ago (1 children)

I completely agree, but I doubt you can afford a Star Wars license if you're making an indie game. It needs oomph and AAA to repay itself, and that's before Disney marketing gets their turn to say no, because they've seen the walk cycles in Clone Wars and went "no, we can't possibly go worse than that, that'd damage the brand". I'm digressing, but those walk cycles really are awful.

[–] Couldbealeotard@lemmy.world 2 points 6 months ago

I feel like your comment isn't a reply to what I've said.

[–] isles@lemmy.world 1 points 6 months ago (1 children)

Aren't you also effectively down-resing the 4 screens? You're not running 4x 1080p streams, you're running 4x 540p, and textures can be downscaled with no perceptual loss. Non-console games are already designed to adapt to system resources, so I don't see why you need two completely different sets of assets. (IANA dev)

[–] barsoap@lemm.ee 2 points 6 months ago* (last edited 6 months ago)

Textures can be scaled quite easily, at least when talking about 4k down to 1k; below that, the GFX people might shout at you, because automatic processes are bound to lose the wrong details. Meshes, OTOH -- forget it.

Also, I kinda was assuming 4x 1k thrown at a 4k screen, since people were talking about home theatre screens. The "uses 4x VRAM" equation is for displaying 4 views at 1/4 the resolution, as opposed to 1 view at the full resolution, whatever that resolution may be, and assuming you don't have a second set of close-up assets -- you can't just reuse LOD assets, because being in the distance is a different thing from being in the foreground at a lower resolution: foreground assets get way more visual attention, it's just how human perception works, so you can't get away with auto-decimating and whatnot.
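To put rough numbers on the texture half of that (a sketch assuming square textures, 4 bytes per texel, and a full mip chain -- not figures from any particular game):

```python
# Each mip level has 1/4 the texels of the one above it, so dropping
# from a "4k" texture to a "1k" texture is a big memory win.

def texture_mem_mb(side, bytes_per_texel=4):
    """Approximate memory for a square texture plus its full mip chain."""
    base = side * side * bytes_per_texel
    return base * 4 / 3 / 2**20      # the full mip chain adds ~1/3 on top

print(round(texture_mem_mb(4096)))   # ~85 MB for a 4k texture
print(round(texture_mem_mb(1024)))   # ~5 MB for the 1k version
```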

[–] catalyst@lemmy.world 5 points 6 months ago

Yes, so much this! We played 4-player GoldenEye on screens the size of a postage stamp, for goodness' sake.

[–] ipkpjersi@lemmy.ml 21 points 6 months ago* (last edited 6 months ago) (1 children)

Modern hardware is wasted on modern devs. Can we send them back in time to learn how to optimize, and bring back the ones that knew how to properly utilize hardware?

I think a lot of the blame is erroneously placed on devs, or "devs" is just being used as shorthand for the whole studio. Anyone who has worked in a corporate environment as a developer knows that the developers are not the ones making the decisions. Do you really think developers want to create a bad game, to have their name attached to something bad, and to know that they created something bad? No, developers want to make a good game, but time constraints and horrible management prioritizing the wrong things (mostly microtransactions and monetizing the hell out of games) result in bad games being created. Also, game development has gotten harder: games are more complex, hardware is more complex, and developers are expected to produce results in less time than ever before - it's not exactly easy, either.

I'm sure you meant no harm by it, but as a developer (and someone who has done game development on the side and knows the industry fairly well), it bothers me when people blame bad games solely on devs and not on the management whose decisions left those games in a bad state.

With that said, I agree with your sentiment that modern hardware goes unused for long-forgotten cool features like four-screen splitscreen, offline modes (mostly in online games), arcade modes, etc. I really wish these features were prioritized.

[–] almar_quigley@lemmy.world 4 points 6 months ago

I agree with you on this point. I think "devs" gets conflated with the developing company and its management, not the individual devs.

[–] catloaf@lemm.ee 18 points 6 months ago (2 children)

It's not a question of capability. It's a question of the cost-benefit of spending developer time on a feature not many people would use.

Couch co-op was a thing because there was no other way to play together from your own homes. Nowadays it's a nice-to-have, because you can jump online any time and play together from anywhere in the world, without organizing everyone to show up at one house.

[–] chiliedogg@lemmy.world 13 points 6 months ago

It also requires multiple copies of the game.

[–] ICastFist@programming.dev 11 points 6 months ago (1 children)

It's a question of the cost-benefit of spending developer time on a feature not many people would use

Which is super ironic when you look at games that had an obviously tacked-on, rushed multiplayer component in the first place, such as Spec Ops: The Line, BioShock 2, and Mass Effect 3.

[–] Piemanding@sh.itjust.works 9 points 6 months ago

GoldenEye 007. Yeah, seriously. The most famous multiplayer game of its generation very nearly didn't have multiplayer. It was tacked on when they finished the game and had a little bit of extra time and ROM storage.

[–] PillowTalk420@lemmy.world 4 points 6 months ago (1 children)

I would go back in time to 1995 and give John Carmack modern tools and maybe UE5 and see what happens.

[–] ICastFist@programming.dev 4 points 6 months ago

Even if you gave him a current-day computer to play with (otherwise even supercomputers of the time would struggle to run UE5), he wouldn't achieve much; consumer-grade computers back then really struggled with 3D graphics. Quake, released in 1996, would usually run at around 10-20 FPS.

[–] barsoap@lemm.ee 3 points 6 months ago (2 children)

4x splitscreen needs approximately 4x the VRAM with modern approaches to graphics: if you're looking at something sufficiently different from another player, there's going to be nearly zero data in common between the two views, and you need VRAM for both sets. You go ahead and make a game run in 1/4 of its original budget.
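A toy model of that worst case, with made-up budget numbers (nothing from any real title):

```python
# VRAM needed is roughly what every view shares plus each view's unique
# working set. With players staring at different corners of the map,
# the shared part shrinks toward zero and the total approaches N times.

def vram_estimate_gb(players, shared_gb, unique_per_view_gb):
    return shared_gb + players * unique_per_view_gb

print(vram_estimate_gb(1, shared_gb=1.0, unique_per_view_gb=5.0))  # 6.0 GB
print(vram_estimate_gb(4, shared_gb=1.0, unique_per_view_gb=5.0))  # 21.0 GB
```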

[–] starman2112@sh.itjust.works 18 points 6 months ago* (last edited 6 months ago) (1 children)

I can't do that, but you know who could? The people who originally made the game. Had they simply re-released the game they already made, it wouldn't be an issue. We fans of the old games didn't stop playing because the graphics got too bad. Even if we did, this weird half step towards updating the graphics doesn't do anything for me. Low poly models with textures that quadruple the game's size are the worst possible middle ground.

My flatmates and I actually played through a Galactic Conquest campaign on the OG Battlefront 2 like 2 months ago. It holds up.

[–] barsoap@lemm.ee -3 points 6 months ago (1 children)

I can’t do that, but you know who could? The people who originally made the game.

How to tell me you're not a gamedev without telling me you're not a gamedev. You don't just turn a knob and the game uses less VRAM; a 4x budget difference is a completely new pipeline, including assets.

Low poly models with textures that quadruple the game’s size are the worst possible middle ground.

Speaking of redoing mesh assets: textures are easy, especially if they already exist in a higher resolution, which will be the case for a 2015 game, but producing slightly higher-res meshes from the original sculpts is manual work. Topology and weight-painting at minimum.

So, different proposal: don't do it yourself. Scrape together a couple of million to have someone do it for you.

[–] starman2112@sh.itjust.works 8 points 6 months ago (1 children)

Wait, are you under the impression that this is a rerelease of the 2015 battlefront?

[–] barsoap@lemm.ee -1 points 6 months ago* (last edited 6 months ago)

I had no idea; it was the game listed without a "2" or "3" etc., so those are the specs I used.

Yeah no, the 2004 one should run in 16x splitscreen on a quad-core Ryzen with a 4 GB RX 570. With headroom.

The general point still stands, though: you can't do the same thing with a 2015 game. On the flipside, you should be able to run the 2004 game in different VMs on the same box, no native support required.

[–] MonkderZweite@feddit.ch 3 points 6 months ago (1 children)

and 1/4 of its original resolution. Seems doable.

[–] barsoap@lemm.ee 7 points 6 months ago* (last edited 6 months ago)

Output resolution has negligible impact on VRAM use: 32 MB for a 4-byte-per-pixel buffer at 4k, 8 MB at 1080p. It's texture and mesh data that eats VRAM -- texture and mesh data that's bound to be different between different cameras and thus, as I already said, can't be shared. You need to calculate with 4x the VRAM use because you have to cover the worst-case scenario.
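Spelled out, assuming a plain 4-byte RGBA colour target and ignoring depth and intermediate buffers:

```python
# Output buffers are a rounding error next to asset data.
fourk   = 3840 * 2160 * 4   # 33,177,600 bytes ~= 32 MB
full_hd = 1920 * 1080 * 4   #  8,294,400 bytes ~=  8 MB
print(fourk / 2**20, full_hd / 2**20)   # ~31.6 vs ~7.9 (MB)
```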