[-] intensely_human@lemm.ee 1 points 7 minutes ago

Oh really! Can you link to one of these documented instances? Since there’s so many …

[-] intensely_human@lemm.ee 1 points 11 minutes ago

Americans aren’t saying that because they forgot that they were once immigrants. They’re saying it because they remember, and they remember how they themselves immigrated, then scrambled to learn and speak English.

[-] intensely_human@lemm.ee 1 points 13 minutes ago

This is a myth as far as I know. I’ve never seen it happen.

[-] intensely_human@lemm.ee 1 points 18 hours ago

I wonder why it’s the deranged ones that are doing it.

[-] intensely_human@lemm.ee 5 points 21 hours ago

poly fill indeed

[-] intensely_human@lemm.ee 1 points 1 day ago

Cross-posting will do that. Mods can cross-post.

[-] intensely_human@lemm.ee 1 points 1 day ago

> Yeah there’s only so many actually open-ended questions you can ask without being repetitive

lmao. I’m curious what you think this number is. How many open-ended questions are there? Just a ballpark.

[-] intensely_human@lemm.ee 1 points 1 day ago

Asking open-ended questions that generate discussion is a good policy regardless of hating Reddit.

[-] intensely_human@lemm.ee 7 points 3 days ago

Yup, racism. Right out in the open. Upvoted, even.


An O’Neill cylinder is that big rotating-cylinder space station design that uses its spin for artificial gravity.

At higher elevations the gravity will be lower. BMX bikes will be fun too. Make a big enough jump and you can cross the center and land on the other side, or hang in the zero-gee zone in the middle, which works out because you’re always on the inside of the curve.
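For a rough sense of the numbers: the felt gravity in a spinning habitat is just a = ω²·r, so it falls off linearly as you climb toward the axis and hits zero at the center. A quick sketch, using a made-up radius roughly on the scale of O’Neill’s Island Three proposal (not a figure from anything above):

```python
import math

# Sketch only: centripetal "gravity" inside a spinning habitat, a = omega^2 * r.
# The radius is an illustrative assumption, roughly Island Three scale.
R = 4000.0          # hull radius in meters
g_surface = 9.81    # desired gravity at the hull floor, m/s^2

omega = math.sqrt(g_surface / R)   # spin rate so that omega^2 * R = g_surface

for elevation in (0, 100, 1000, R / 2, R):
    r = R - elevation              # distance from the spin axis at this "altitude"
    a = omega**2 * r               # felt gravity drops linearly toward the axis
    print(f"{elevation:7.0f} m up: {a:5.2f} m/s^2 ({a / g_surface:4.0%} of floor gravity)")
```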


I’ve noticed that the longer a conversation gets, the less able ChatGPT is to do precise reasoning or follow instructions.

It felt exactly like working with a student who was getting tired and needed to rest.

Then I had the above shower thought. Pretty cool, right?

Every few months a new ChatGPT v4 is deployed. It’s got new training data, up through X date. They train up a new model on the new content in the world, including ChatGPT conversations from users who’ve opted into that (or didn’t opt out, can’t remember how it’s presented).

It’s like GPT is “sleeping” to consolidate “the day’s” knowledge into long-term memory. All the data in the current conversation is its short-term memory. After handling a certain amount of complexity in one conversation, the coherence of its responses breaks down; they become more habitual and less responsive to nuance. It gets tired and can’t go much further.
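To put the “short-term memory” bit in concrete terms: with the chat-style APIs the model doesn’t remember anything between calls, so the client re-sends the whole conversation on every turn, and longer chats mean more and more context to juggle at once. A minimal sketch using the openai Python package (the model name and details here are illustrative assumptions, not anything from the post above):

```python
from openai import OpenAI  # assumes the official openai Python package

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The model is stateless between calls, so the whole conversation so far
# gets re-sent on every turn and acts as its "short-term memory".
history = [{"role": "system", "content": "You are a helpful tutor."}]

def ask(question: str) -> str:
    history.append({"role": "user", "content": question})
    reply = client.chat.completions.create(
        model="gpt-4",        # model name is just an assumption for the sketch
        messages=history,     # the entire conversation rides along every time
    )
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    return answer
```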

