Deleted (lemmy.dbzer0.com)
submitted 1 year ago* (last edited 1 year ago) by IsThisLemmyOpen@lemmy.dbzer0.com to c/asklemmy@lemmy.ml
 

Deleted

[–] kthxbye_reddit@feddit.de 4 points 1 year ago (2 children)

The best-case result is 1.001.000.000 (A+B) vs. 1.000.000.000 (B only). The worst case is that I have 1.000.000 only.

I go with B only because the difference feels tiny / irrelevant.

Maybe I actually have free will and this is not determinism kicking in, but who knows. I'm not going to play the odds for such a tiny benefit.
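
For anyone who wants to check the numbers, here is a minimal sketch in Python (the amounts are my reading of this thread, not the classic formulation: box A holds 1.000.000, box B holds 1.000.000.000 if the machine predicted "B only" and is empty otherwise):

```python
# Payoffs as discussed above. Assumed amounts (taken from this thread):
# box A = 1,000,000; box B = 1,000,000,000 if the machine predicted "B only",
# otherwise box B is empty.
A = 1_000_000
B_IF_PREDICTED_B = 1_000_000_000

def payoff(prediction: str, choice: str) -> int:
    """Total winnings for a given machine prediction and player choice."""
    b_content = B_IF_PREDICTED_B if prediction == "B" else 0
    return b_content + (A if choice == "A+B" else 0)

best_case_two_boxing  = payoff("B", "A+B")    # 1,001,000,000 (machine was wrong)
one_boxing            = payoff("B", "B")      # 1,000,000,000
worst_case_two_boxing = payoff("A+B", "A+B")  # 1,000,000 (machine was right)

# The extra upside of taking both boxes is only about 0.1%:
print((best_case_two_boxing - one_boxing) / one_boxing)  # 0.001
```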

[–] OptimusFine@kbin.social 2 points 1 year ago (1 children)

> The worst case is that I have 1.000.000 only.

Except that's not the worst case. If the machine predicted you would pick A+B, then B contains nothing, so if you then pick only B (i.e. the machine's prediction was wrong), you get zero. THAT'S the worst case. The question doesn't assume the machine's predictions are correct.
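
To make all four cases explicit, a quick enumeration under the same assumed amounts as above (A = 1.000.000; B = 1.000.000.000 only if the machine predicted "B only"):

```python
# All four (prediction, choice) combinations and their payoffs.
A = 1_000_000
B_FULL = 1_000_000_000  # contents of box B when the machine predicted "B only"

for prediction in ("B", "A+B"):
    for choice in ("B", "A+B"):
        b_content = B_FULL if prediction == "B" else 0
        total = b_content + (A if choice == "A+B" else 0)
        print(f"predicted {prediction:>3}, chose {choice:>3}: {total:>13,}")

# predicted   B, chose   B: 1,000,000,000
# predicted   B, chose A+B: 1,001,000,000
# predicted A+B, chose   B:             0   <- the actual worst case
# predicted A+B, chose A+B:     1,000,000
```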

[–] kthxbye_reddit@feddit.de 2 points 1 year ago* (last edited 1 year ago)

Good point. I was actually assuming that the machine's predictions are never wrong. That's also how it's defined on the Newcomb's paradox wiki page.

If that's not a 100% given, you are definitely right.

[–] wols@lemmy.ml 2 points 1 year ago (1 children)

Well, if you actually have free will, how can the machine predict your actions?

What if someone opened box B and showed you what was in it? What would that mean? What would you do?

[–] kthxbye_reddit@feddit.de 1 points 1 year ago* (last edited 1 year ago)

I meant: let's imagine the machine predicted B and was wrong (because I take A+B). I would call that scenario "I have free will, no determinism." Then I would have 1.001.000.000 "only". That's a good result.

Maybe interesting: Wiki - Determinism