For a long time I’ve thought it would be cool to upload my consciousness into a machine and be able to talk to a version of myself that didn’t have emotions and cravings.

It might tell me that being around my parents has consistently had a negative effect on my mood for years now, even if I don’t see it. Or that I don’t really love X, I just like having sex with her. Maybe it could determine that Y makes me uncomfortable, but has had an overall positive effect on my life. It could mirror myself back to me in a highly objective way.

Of course this is still science fiction, but @TheOtherJake@beehaw.org has pointed out to me that it’s now just a little bit closer to being a reality.

With PrivateGPT, I could set up my own local AI.

https://generativeai.pub/how-to-setup-and-run-privategpt-a-step-by-step-guide-ab6a1544803e

https://github.com/imartinez/privateGPT

I could feed this AI information that I wasn’t comfortable showing to anyone else. I’ve been keeping diaries for most of my adult life. Once PrivateGPT was set up with a base language model, I could feed it my diaries and then have a chat with myself.
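(For the curious: as I understand it, PrivateGPT doesn’t retrain the model on your files. It indexes your documents and, for each question, retrieves the most relevant chunks to paste into the prompt of a local model. Here’s a toy sketch of just that retrieval step — the diary lines are made up, and the bag-of-words “embedding” is a stand-in for the real dense vectors such tools use:)

```python
import re
from collections import Counter
from math import sqrt

def embed(text):
    """Toy 'embedding': a bag-of-words count (real systems use dense vectors)."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical diary snippets standing in for ingested documents.
diary = [
    "Visited my parents today and felt drained afterwards",
    "Spent the evening with X and felt happy",
    "Work on project Y was stressful but rewarding",
]

def retrieve(question, docs, k=1):
    """Return the k chunks most similar to the question; these would be
    handed to a local model as context for its answer."""
    q = embed(question)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

print(retrieve("How do I feel about my parents?", diary))
# -> ['Visited my parents today and felt drained afterwards']
```

The point being: the model never “understands” the diary, it just surfaces the entries that share the most words (or, in real systems, the most meaning) with your question.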

I realize PrivateGPT is not sentient, but this is still exciting, and my mind is kinda blown right now.

Edit 1: Guys, this isn’t about me creating a therapist-in-a-box to solve any particular emotional problem. It’s just an interesting idea about using a pattern recognition tool on myself and having it create summaries of things I’ve said. Lighten up.

Edit 2: It was anticlimactic. This thing basically spits out word salad no matter what I ask it, even if the question has a correct answer, like a specific date.

  • ivanafterall@kbin.social · 1 year ago

    Yeah, so the AI would STILL be very favorable about having sex with X, for example, because it’s trained on your writing/speaking/whatever.

    “What do I feel about this?”

    “Well, an average of what you’ve always felt about it, roughly…”

    • jecxjo@midwest.social · 1 year ago

      Well, sort of. If you never talked about dating, for instance, and then started talking to the AI about dating, it may not put two and two together and realize that it relates to sex. It wouldn’t be able to infer anything about the topic, since it only knows what the statistically most likely next word is.
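      A toy illustration of what “statistically most likely next word” means (this is a bigram counter, far cruder than a real LLM, but the same basic idea — the corpus here is made up):

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count, for each word, which words follow it and how often."""
    words = text.lower().split()
    follows = defaultdict(Counter)
    for cur, nxt in zip(words, words[1:]):
        follows[cur][nxt] += 1
    return follows

def most_likely_next(follows, word):
    """No inference, no 'two and two': just the most common follower."""
    if word not in follows:
        return None  # never seen this word -> nothing to say about it
    return follows[word].most_common(1)[0][0]

# Hypothetical mini-corpus standing in for years of diary text.
corpus = "i went on a date . then i went on a walk . i went home"
model = train_bigrams(corpus)
print(most_likely_next(model, "went"))    # 'on' (seen twice, vs 'home' once)
print(most_likely_next(model, "dating"))  # None: the word never appeared
```

      A word that never appeared in the training text gets you nothing — the model can’t connect it to related topics it has seen.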

      That’s what I feel like most people don’t get. Even uploading years and years of your own text will only match your writing style and the very specific things you’ve said about specific topics. That’s why the writers’ strike is kind of dumb. This form of AI won’t invent new stories, just rehash old ones.

      …oh…now I see why they are on strike.