[-] Deestan@beehaw.org 25 points 9 months ago

Atrio: The Dark Wild - has you control a clone with a limited lifespan. When you die and resume from a new clone, the old clone's corpse is left lying around, and you can harvest it for parts necessary to continue the story.

Sifu - when you "die", your character ages and grows stronger before trying again.

Karateka - plays a lot like a regular game with lives, but it's not the same life each time. Every time you resume from a new life, it's a different person attempting to get to the end.

Shadow of Mordor - when you are killed by an orc, you are resurrected by a spirit. The orc, however, gets high-fives from all his mates, gets promoted, and picks up some new skills. Next time you see him, he will call you out.

Hades - the entire story is based around you repeatedly failing and dying.

Super Meat Boy - you basically die and restart, but when you finally beat the level, you get an instant replay with all your failed attempts playing simultaneously on top of it. The effect is more glorious the more you struggled to beat the level.

[-] Deestan@beehaw.org 22 points 10 months ago

This sucks.

Can't ditch it completely due to family, but got a few more contacts over on Signal after this announcement.

[-] Deestan@beehaw.org 13 points 10 months ago

Completely agree.

It's just a popular quasi-religion for rich people to keep doing what they do while coming off as megabrain angels.

[-] Deestan@beehaw.org 30 points 10 months ago* (last edited 10 months ago)

Longtermism is a cardboard halo. A thin excuse to act in complete self-interest while pretending it is good for humanity.

The further into the future we try to think, the more uncertainty and confounding factors dominate. This leaves you room to put in any argument you feel like and make any prediction you feel like. So you pick something vaguely romantic, or something that appeals to a relatively popular opinion, and hey, you're golden.

I am approached by a beggar. What do I - the longtermist - do?

I feel like being kind today. My longtermist argument is that every bit of happiness and relief today carries compound interest into the future: by giving this person some money today, they are content and don't have to resort to thievery, which in turn lets another person have a safe day and the mental energy to do a lot of good tomorrow. The goodness grows bigger at every step, over time. I give them $100. It's pretty obvious, really.

They smell and I don't want to deal with that right now. My longtermist argument is that helping out beggars actually just perpetuates a problem in society. If people can't function in society without random help, it's just a ticking bomb of a humanitarian disaster. Giving them money only postpones the moment the crisis becomes too big to ignore, and allows it to grow further. No, this is a problem that society needs to handle right now, and by giving money to this person I'm just helping the problem stay hidden. I ignore them and walk on by. It's pretty obvious, really.

My wife left me and I want other people to hurt like I do. My longtermist argument is that, unfortunately, these people are rejects of society and I can't fix that. But we can prevent them from harassing productive citizens who work hard to create a better future. If fewer beggars making commuters sad gives a 1% improvement in productivity, that's a huge compound improvement over a few hundred years. So I kick him in the leg, yell at him, call the police on him, and say he tried to assault me. It's a bit cold-hearted, but it's obviously good long term.
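
For scale, here is the compound math that makes any tiny made-up number sound like a civilizational windfall. A minimal sketch; the 1% rate and the 300-year horizon are illustrative assumptions of mine, nothing more:

```python
# Toy compound-growth math behind the "1% improvement" hand-wave.
# Both numbers below are made up purely for illustration.
rate = 0.01    # claimed yearly productivity gain from sadness-free commutes
years = 300    # "a few hundred years"

multiplier = (1 + rate) ** years
print(f"Productivity multiplier after {years} years: {multiplier:.1f}x")
# -> about 19.8x. Compound anything far enough into the future and it
#    dwarfs whatever harm you did today, which is exactly the trick.
```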

[-] Deestan@beehaw.org 80 points 11 months ago

"Arr, this ❌ marks where I buried 44 billion doubloons!"

3
submitted 11 months ago by Deestan@beehaw.org to c/technology@beehaw.org

As far as I understand this, they seem to think that AI models trained on a set of affluent westerners with unknown biases can be told to "act like [demographic] and answer these questions."

It sounds completely bonkers, not only from a moral perspective: scientifically and statistically, this is basically just making up data and hoping everyone is too impressed by how complicated the data faking is to care.
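
To make the technique concrete (the prompt below is my own guess at what such a pipeline does, not anything from the article):

```python
# Hypothetical sketch of the "synthetic respondent" idea being criticized:
# ask an LLM to roleplay a demographic and treat its answer as survey data.
# Any biases the model absorbed in training ride along unmeasured.
def synthetic_respondent_prompt(demographic: str, question: str) -> str:
    return (
        f"Act like a {demographic} and answer this survey question "
        f"as that person would: {question}"
    )

print(synthetic_respondent_prompt(
    "retired farmer from a rural village",
    "How much do you trust your local government?",
))
```

There is no call to an actual model here; the point is that whatever comes back is generated data, not sampled data.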

[-] Deestan@beehaw.org 11 points 11 months ago

Most chatbots are speed bumps. Like phone menu trees and hold times, they slow you down on your way to get actual help.

Sometimes that means you give up before getting to the real help, which saves money on support.

Whether it's the intended effect or not, it is so well known at this point that we shouldn't excuse anyone using this tactic. It's malicious.

[-] Deestan@beehaw.org 15 points 11 months ago

Black & White

It has a mechanic where you bless a stone, then throw it across the map, and you get to build in and influence the area around the rock. It's basically the only sane way to expand.

I did not know this. I spent painstaking hours slowly growing my village, trying to get its area of influence to spread to where I needed to go.

[-] Deestan@beehaw.org 17 points 11 months ago

So in legalese, it basically says "we are scared but not sure why. GRR!"

[-] Deestan@beehaw.org 23 points 1 year ago

I did not think that through.

Oh well, exclusive Lemmy access promo I guess. I'll throw the Escapist an extra $10 on their next donation-enabled stream as an apology.

[-] Deestan@beehaw.org 28 points 1 year ago* (last edited 1 year ago)

Well, inflation is real. And they are using sales income to fund current development. That's as fair as it gets.

Would you be happy if they released it at $60 and had periodic 60% sales?

[-] Deestan@beehaw.org 40 points 1 year ago* (last edited 1 year ago)

Me, I wish more games respected my time like that, instead of costing $40 and going on 20% sale every few weeks, leaving me to hunt bargain bins to be able to get it at its "efficient" price.

128
submitted 1 year ago by Deestan@beehaw.org to c/gaming@beehaw.org
