this post was submitted on 31 Dec 2023
-9 points (35.5% liked)

Showerthoughts


I’ve noticed that the longer a conversation goes on, the less able ChatGPT is to do precise reasoning or follow instructions.

It felt exactly like working with a student who was getting tired and needed to rest.

Then I had the above shower thought. Pretty cool, right?

Every few months a new version of ChatGPT v4 is deployed, with new training data up through X date. They train up a new model on the new content in the world, including ChatGPT conversations from users who’ve opted into that (or didn’t opt out, I can’t remember how it’s presented).

It’s like GPT is “sleeping” to consolidate “the day’s” knowledge into long-term memory, and all the data in the current conversation is its short-term memory. After handling a certain amount of complexity in one conversation, the coherence of its responses breaks down: they become more habitual and less responsive to nuance. It gets tired and can’t go much further.
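The “short-term memory” here is literally just the chat history: every API request resends the entire conversation so far, and nothing persists between calls except what is packed into that list. A minimal sketch of that loop, assuming the official `openai` Python client (the model name and system prompt are just illustrative):

```python
# Minimal sketch of how a chat "conversation" actually works: the full message
# history is resent with every request, so the model's "short-term memory" is
# just this growing list. (Assumes the official `openai` Python client; the
# model name and system prompt are illustrative.)
from openai import OpenAI

client = OpenAI()
messages = [{"role": "system", "content": "You are a helpful tutor."}]

def ask(user_text: str) -> str:
    messages.append({"role": "user", "content": user_text})
    response = client.chat.completions.create(
        model="gpt-4",          # illustrative model name
        messages=messages,      # the entire history goes along every time
    )
    answer = response.choices[0].message.content
    messages.append({"role": "assistant", "content": answer})
    return answer

# Each call makes `messages` longer; nothing is "remembered" between calls
# except what is packed into this list.
```

Once that list approaches the model’s context window, older turns have to be truncated or compressed, which lines up with the feeling that coherence breaks down late in a long conversation.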

top 1 comment
[–] cheese_greater@lemmy.world 2 points 1 year ago* (last edited 1 year ago)

I wonder if it has more to do with the fact that the more inputs are available, the more complicated and error-prone something is. Also, ChatGPT doesn't "get tired", at least not in the human sense; exhausting the memory of whatever it's running on would be the closest analogy, if my understanding is correct...

Re: complexity, this is true of all things, and computer/programming stuff is no different. The more complex it is or the larger the dataset, the greater the potential for noise vs. signal.
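To make the "exhaust the memory" analogy concrete: the usual workaround is to count how many tokens the conversation occupies and drop the oldest turns once it gets close to the model's context window. A rough sketch, assuming the `tiktoken` library for counting (the 8192 limit is illustrative, not a claim about any particular deployment):

```python
# Rough illustration of the "memory exhaustion" analogy: count the tokens the
# conversation occupies and drop the oldest turns once it approaches the
# model's context window. (Uses `tiktoken` for counting; the 8192 limit is
# illustrative.)
import tiktoken

enc = tiktoken.encoding_for_model("gpt-4")
CONTEXT_LIMIT = 8192

def trim_history(messages: list[dict], limit: int = CONTEXT_LIMIT) -> list[dict]:
    """Drop the oldest non-system messages until the rough token count fits."""
    def total_tokens(msgs):
        return sum(len(enc.encode(m["content"])) for m in msgs)

    trimmed = list(messages)
    while total_tokens(trimmed) > limit and len(trimmed) > 1:
        # Keep the system prompt (index 0), drop the oldest turn after it.
        trimmed.pop(1)
    return trimmed
```

Trimming keeps the request within the window, but the model genuinely loses whatever was dropped, which is another way the "more inputs, more noise" problem shows up in practice.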