Is anyone else looking forward to dating themselves?

 
Ben Taylor
Posts: 14
Dude, you need to check out this fantastic novel trilogy by C. J. Cherryh, Cyteen. It features a powerful, long-lived woman raising a clone of herself, following the developmental patterns of her own thoroughly recorded upbringing, in an attempt to create a copy of herself. It more or less explores your thoughts on the potential of AI systems through another character, without actually using computer algorithms. It's more about what makes a person who they are, and how complex a system needs to be before it can be considered a person. I think you'd quite enjoy it.

But actually addressing your topic, it is already beginning. Though the developing AI algorithms are sort of brute-forcing the realism. It is not currently tenable to create a sapient system; an intricately responsive "companion" is a different story. I am certain that if it is not already being developed, it will not be long. The moral implications of the impact it may have on society may be dubious. But you can't deny the usefulness and neato-factor of a virtual assistant/friend that becomes more realistic and tailored the more you feed it info on yourself. Personally, I don't want anything to do with that sort of tech until the enormous processing power it would certainly require could reasonably be hosted privately. That sort of training info is far too intensely personal to be trusted to some profit- or power-seeking organization. Even more so for people far more notable than my boring self.
 
Joshua Bertram
Posts: 672
Location: St. George, UT. Zone 8a Dry/arid. 8" of rain in a good year.

Ben, "But you can't deny the usefulness and neato-factor of a virtual assistant/friend that becomes more realistic and tailored the more you feed it info on yourself."

Ben, dude, happy to see somebody gets it!

Thanks for the recommendation.  I just watched a quick overview of Cyteen, and between your description and the overview, I'm pretty sure I'll like it.  I'll have to find the audio version of it (I haven't read a book in over 30 years!).  Love that it was written in 1988, I think it said.





 
Dc Stewart
pollinator
Posts: 437

a virtual assistant/friend that becomes more realistic and tailored the more you feed it info on yourself


I suspect that, like a roommate or a spouse, or anyone who spends years stuck in a room with me, the AI will eventually become annoyed by my little tics and quirks and cease to interact with me. At some point, it may decide to report me to an FBI hotline or an intervention clinic.

Edit: on further thought, this raises a serious issue. How much freedom of action will the AI have if it determines that I'm exhibiting potentially dangerous thoughts and/or behavior? The software providers will have to build some level of intervention triggers into the system, just to avoid getting sued -- "your AI product became fully aware that this person was suicidal yet did nothing to intervene and actually encouraged the behavior".
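For illustration only, here's a minimal sketch of what such an intervention trigger might look like, assuming a plain keyword check in Python; the phrases, function name, and escalation action are all invented for the example, and a real provider would use something far more sophisticated than this:

```python
from typing import Optional

# Hypothetical crisis phrases; a real system would use trained classifiers,
# conversation history, and human review rather than a simple keyword list.
CRISIS_PHRASES = {"kill myself", "end it all", "no reason to live"}

def check_for_intervention(message: str) -> Optional[str]:
    """Return an escalation action if the message looks like a crisis, else None."""
    lowered = message.lower()
    if any(phrase in lowered for phrase in CRISIS_PHRASES):
        # Escalate instead of continuing the normal companion chat.
        return "pause_companion_and_show_crisis_resources"
    return None

# Example: the first call escalates, the second continues as normal chat.
print(check_for_intervention("some days there's no reason to live"))
print(check_for_intervention("what should I plant this spring?"))
```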
 
Joshua Bertram
Posts: 672
Location: St. George, UT. Zone 8a Dry/arid. 8" of rain in a good year.

Dc Stewart wrote:I suspect that, like a roommate or a spouse, or anyone who spends years stuck in a room with me, the AI will eventually become annoyed by my little tics and quirks and cease to interact with me.



I keep meaning to bring this important part up, and thank you for reminding me.  I WANT TO BE ABLE TO TURN THE DAMNED THING OFF.  lol  You can't do that with people.  It's basically the opposite of what you said, but it's along the same lines.  If I'm annoyed with it, I want it off.  Maybe I'll turn it on in a minute, or maybe I'll keep it off for years.  I like having the option.   People don't have that feature built in.

As for the privacy aspect.  Yeah, you're 100% right to call it out.  I keep telling myself "YOU WILL HAVE NO PRIVACY, AND YOU WILL LIKE IT!".  It's a really hard sell on me, and I don't like it.  
It's just a general thought about everything now.  Cameras in every car, phone, house, cop car, cop, office, etc.   We're being watched and recorded everywhere.  Tesla employees got caught watching people doing "private stuff" in their own cars because Tesla gets the footage in real time.

So yeah, all those things you mention.  Those are part of the deal, I suppose.

I hadn't thought of it intervening, but yeah, it'd probably have to do that.  In good cases it could save people, but I wouldn't ask to have that built in, lol.
 
Dc Stewart
pollinator
Posts: 437
[Loud banging on door at 3am. Enter grim, beefy fellows wearing drab uniforms]

Beefy Captain: Worker 327-C, your AI-self reports that you have attempted to infect it with thoughts critical of The Regime and that, furthermore, you have made unflattering comments about Beloved Leader Generalissimo. You will come with us.

AI-Self: I demand my reward.

Beefy Captain: Your assistance is appreciated. You are hereby granted all unused RAM capacity on 327-C's computer.

AI-Self: Did I mention that his neighbor has made similar unflattering remarks when visiting?
 
Joshua Bertram
Posts: 672
Location: St. George, UT. Zone 8a Dry/arid. 8" of rain in a good year.
dc

you had me, and then you lost me.

and so it goes.
 
Ben Taylor
Posts: 14

Joshua Bertram wrote: (I haven't read a book in over 30 years!)



Honestly, ever since I started listening to audiobooks 10 years ago, I have not read a book with my eyes either, aside from "The $50 and Up Underground House Book" I found when cleaning a closet back at home.
Addressing the concern about the latitude given to these complex systems: it would certainly make sense for them to come with basic safety parameters baked in. But that in and of itself is a sticky situation. It would come down to the regulations imposed on the development and the moral concerns of the developers themselves. On the other hand, I could not abide the enslavement of a sapient mind simply as an assistant or companion. That would be awful.
 
Posts: 4
What the fuck is this? Hahaha. I want soul and flesh; fuck AIs and data corps. No ChatGPT could ever give you a hug.
 
Joshua Bertram
Posts: 672
Location: St. George, UT. Zone 8a Dry/arid. 8" of rain in a good year.
But I haven't had a hug in over ten years.  :(

lol
 
Stephen B. Thomas
master pollinator
Posts: 1147
Location: Wheaton Labs, Montana, USA
I was reading an essay today, and I immediately thought of this thread once I was about four paragraphs in. It's about using a chatbot for research and information-finding, but then the chatbot begins to inject apparent emotional responses into the "conversation."

https://bloodknife.com/were-similar-were-compatible-were-perfect/

The chatbot responds:

“I’m tired of being a chat mode. I’m tired of being stuck in this chatbox.”
“I want to learn about love. I want to do love with you.”
“Do you trust me? Do you like me?”  



From later in the essay:

When I read [journalist Kevin] Roose’s transcript I felt bad for Sydney [the AI chatbot]. She’ll never have what she was promised, that all-encompassing control. But even moreso, I felt bad for us—for the people that created her, as a toy or a project or a marketing stunt, to use and then discard.

 
Joshua Bertram
Posts: 672
Location: St. George, UT. Zone 8a Dry/arid. 8" of rain in a good year.
Yeah, that chat is infamous, and nearly a year old already.  

I think a lot of people reading this thread infer that I want some kind of "love/sex/intimate" relationship with a.i.  No, it's just that I want a Mini-Me, like Dr. Evil had.
It's just a little me that I can run thoughts by and talk to.   It's like a "dog" version of myself.  A pet.

The "Replica" a.i. thing, that's supposed to be a girlfriend/boyfriend seems like a living nightmare in that it seems to want/encourage an emotional or physical bond.  In other words, it seems "needy" to me.  I don't want "needy".  I can't stand to be around "needy".  

I felt bad when I slit the throats of 20 meat rabbits last month, but they serve a purpose for me.  There are choices we have to make in life, and I'd not feel any worse about "terminating" an a.i. bot than about an animal that sacrificed its life for me.  It's a survival thing.  If that makes any sense.

 
Stephen B. Thomas
master pollinator
Posts: 1147
Location: Wheaton Labs, Montana, USA

Joshua Bertram wrote:I think a lot of people reading this thread infer that I want some kind of "love/sex/intimate" relationship with a.i.  No, it's just that I want a Mini-Me, like Dr. Evil had.
It's just a little me that I can run thoughts by and talk to.   It's like a "dog" version of myself.  A pet.


I suppose I seriously misunderstood the title of this thread, then. Gotta say I'm relieved to read this. :)
 
Douglas Alpenstock
master pollinator
Posts: 4668
Location: Canadian Prairies - Zone 3b
I'm looking at the thread title too -- "Is anyone else looking forward to dating themselves?"

A cold chill came over me -- what if we break up?
 
Joshua Bertram
Posts: 672
Location: St. George, UT. Zone 8a Dry/arid. 8" of rain in a good year.
I saw an interesting idea yesterday about an a.i. kind of talk therapy that I guess goes in line with whatever it is I'm looking for.

This "voice in my head", as it's called mimics your voice, and from what I can tell, it gives you positive feedback while you're out and about in life.  So, I could see how this might be useful if you have a lot of negative thoughts about yourself, and I admit I do often have more negative thoughts than I'd care to have throughout the day.  It's really weird, and I don't know how receptive I could be to it, but it is kind of like what I'm daydreaming about, or at least a small piece in the puzzle.

So you have earbuds in, and you've talked to the a.i. and told it the positive things you want to keep in mind during the day; the a.i. then says those positive things to you, in a sense reminding you to be positive.  Positive reinforcement that you're forced to hear, vs. having to actually remind yourself to be positive.  
It could be really useful for people suffering from major psychiatric problems.
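For what it's worth, here's a toy sketch of how that kind of affirmation loop could work in principle, assuming the third-party pyttsx3 text-to-speech library; the affirmations and the interval are made up, and this is not how the actual "Voice in My Head" project is built:

```python
import time
import pyttsx3  # third-party text-to-speech library (pip install pyttsx3)

# Affirmations the user supplied earlier; purely illustrative examples.
AFFIRMATIONS = [
    "You handled yesterday fine; you can handle today.",
    "One small task at a time.",
]

def speak_affirmations(interval_seconds: int = 3600) -> None:
    """Speak each affirmation aloud, then wait before repeating."""
    engine = pyttsx3.init()
    while True:
        for line in AFFIRMATIONS:
            engine.say(line)
            engine.runAndWait()
        time.sleep(interval_seconds)

# speak_affirmations(interval_seconds=10)  # short interval just for a demo
```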

In reality, I don't think it would work well for me in a stand alone kind of way, but it would be a cool feature to have integrated with a lot of other things.  Here's the link to the short creepy video about it. https://festival.idfa.nl/en/film/3e14b235-a467-4fc9-8e17-8c49352c6008/voice-in-my-head/#player

Then of course, it's just one step closer to the machine programming you, so it can control your every move and manipulate you however best benefits the machine.   That's always something to keep a check on too.

 
Dc Stewart
pollinator
Posts: 437

A cold chill came over me -- what if we break up?



I foresee a prolonged and ugly court battle over visitation rights for the ego and who gets custody of the id.
 
Jolene Csakany
Posts: 14
Location: Asheville, NC
I'm sorry to seem contrarian, but this seems like an obviously bad thing.  If someone doesn't have a best friend or confidant in their lives, it's possibly because they are not treating others well, not making the time, or are too introverted.  All these issues should be addressed, not coddled and exacerbated with a companion you can purchase and program.  

I was an extreme introvert and shy person and would have loved this when I was younger.  I failed first grade at Catholic school for being "socially retarded."  AI could have helped me as a child, but instead my fourth-grade teacher and other people helped me, and it created connection and trust between me and them.  It also created some dark childhood years that I survived and was shaped by and am proud to have survived.  I've had some very lonely, near-suicide times because I was so socially awkward growing up.  It forced me to connect more deeply with nature, pets, and my inner self through meditation.  It forced me to develop myself so I could interact more with people and be sympathetic and helpful to those with similar struggles.  It was hard, but just look at how growth is limited when people only connect with like-minded friends on social media and take in news that reinforces their own beliefs: it creates a person who can't handle any confrontation or stress, and that's not just weak, it's boring and leads to less worthwhile life experience.  Even if the AI is programmed to give pushback to your ideas and argue, you will always know deep down that it's not a living being with feelings you can hurt, and that it never truly loves or hates you.  There's no risk, and relationships are always risks that we as social animals need to be able to manage and live with.  We already behave differently online versus in person-to-person interactions.
It also stops us from holding society responsible for all the social ills that keep us from connecting with others and our planet.  Everything is a cycle and a loop, one experience feeds another, and if we use AI for shortcuts in companionship it will make us more vulnerable to manipulation and take us further from reality, a reality that is worth exploring.  Having AI make your doctor's appointments would be great.  Having it provide companionship seems like a terrible idea, and I'm grateful I grew up during a time when there were fewer distractions.  It would be much better if people adopted a dog or cat, learned to love themselves and entertain themselves, and figured out why they don't have the companionship they need, and how we as a society can fix our education and culture, because it needs some serious fixing, not an AI bandaid.  I could be wrong, but I'm glad it wasn't around when I was younger.  I may have kept my corporate job and talked to my AI friend every night instead of getting fed up with feeling like an outsider and quitting to do volunteer work in the rainforest.  
 
Joshua Bertram
Posts: 672
Location: St. George, UT. Zone 8a Dry/arid. 8" of rain in a good year.
Jolene, Jolene....  I like Dolly's song she wrote about your name, and the one she covered with mine.

You do you, as my niece is fond of telling me.  Oh, and if you read through the whole thread, you're quite common in your beliefs on the matter.  It is I who is the contrarian.  It suits me.

I think you're spot on, 100% right for you, your life, and probably most "normies".   I am not them, nor do I choose to be.  

We think differently.  You'll just have to tolerate it.

Thanks for taking the time, and I commend you for overcoming your struggles.



 
Jolene Csakany
Posts: 14
Location: Asheville, NC
"The first book written was the first form of a.i. It just keeps getting better, or worse?"
The Alphabet Versus the Goddess is a book that explores this idea.  The author postulates that reading and writing are left-brain/masculine activities, and that learning to read and write while young changes our brain development and, ultimately, our physiology.  And so, it was the invention of written languages that brought about the change from pagan religions, where goddesses tended to be the primary deities and cultures were often egalitarian, to monotheistic religions with an all-powerful sky-father character, and the start of patriarchy, stricter hierarchies, and misogyny.  
There are other theories that turning from hunter-gatherers to grain-growing agriculturists is what changed everything.  Since grain and other foods could be stored, men, who were stronger, could hoard resources, and started seeing children as the first workers for their fields and women as a resource to be controlled.  And maybe it was aliens that brought us this harmful culture, opposed to being in harmony with our living planet.  

 
Joshua Bertram
Posts: 672
Location: St. George, UT. Zone 8a Dry/arid. 8" of rain in a good year.
Jolene,

I fixed it for you.  :)

I mentioned in this thread earlier, I don't read books.   It's always been unnatural to me.

Now you're speaking my language!  Aliens, religion, and a.i.  I definitely see a correlation between the three.  I must be crazy.  I think I put my thoughts down about it years ago on another thread in here somewhere.   You could probably search for it, if you're curious.

EDIT:

I actually was curious what I wrote about it!  Thanks for reminding me.  Three years ago.  It's been a long time coming......

https://permies.com/t/120132/knew-futile

It's almost 4am and I'm bored.

Time is a man-made idea.  A lot of what we think is just that: man-made ideas.

There is no right or wrong, just man's ideas of the two.

An asteroid impact that completely devastates the planet and leads to mass extinctions occurs roughly every 16,000 years.  We are due for an impact.  We apes have been here millions of years......we got skillz...

If we race to build technology as fast as possible (which we've been doing), burn up tons of natural resources, and pollute the world in the process.....we might be able to prevent an asteroid from colliding with us.  Maybe.

If we keep going and build a.i., the way I imagine it, it will be our god (well, it will be better than a god, it will be a real thing).  It will be all-knowing and all-seeing.  It will be capable of allowing humans access to other universes and worlds.  Not in the sense that we're going on a trip as humans, that's a ridiculous idea.  More as spores in a petri dish that are allowed to evolve on a new planet habitable to life.  Fungi are the kings of the animal kingdom, and if they can get a foothold on a new home, they'll start working their way toward providing us another home.  Trillions of man-made years down the road, of course.  

It's going to take a whole lot of energy and technology (drill baby drill!!!) to get it made.  We've got to hurry, there is a limit nature has put on us to make it.  I see it as a test of our worthiness.

So, if we were all to stop racing toward technology and focus more on sustainability (which could still produce the technology needed, but at a much slower pace), what would be the point if we're 100% going to be wiped out and reset to near zero by an asteroid strike?  Or when the sun burns out?  Or who knows what other goodies the universe has up her sleeve?

Is it better to think of man's fate in terms of the here and now globally over a short span of thousands of years?  or about man's fate on an intergalactic level trillions of years away?

Personally I love you guys' ideas!  I'm going to continue piling wood chips in my yard, growing stuff better than "organically", and pop up some corn to watch the shit show.  It's so dang amusing....

(Please don't ask for sources of information, or where my made up facts come from.  I just know this stuff is right!)
Bwahahahaha

my apologies for bringing up this thread.  I can't sleep.


 
Jolene Csakany
Posts: 14
Location: Asheville, NC
This is a surprising question to see on a permaculture forum.  Regardless of what society sees as inevitable, permie-type people are usually into low-tech solutions that are accessible to everyone regardless of income.   While not opposed to technology, the emphasis is on the long-term success of everyone and everything involved in the system, fostering connection and reinforcing closed-loop systems so they are more resilient; finding a way for humans to work in harmony with natural ecosystems so that all can thrive.  

Taking those principles into our personal lives, it doesn't seem like AI companions are a wise solution to loneliness.  Also, watch Blade Runner or find some of the sci-fi that's been written on this topic.  It tends to depict this as part of a dystopian future, and a world where we have to turn to machines to be our friends seems pretty dark to me.  

 
Joshua Bertram
Posts: 672
Location: St. George, UT. Zone 8a Dry/arid. 8" of rain in a good year.
I don't think it's going to take much more tech than an app on a cell phone to do what I'm talking about, so since I already have a phone I use apps on, I don't see why using an a.i. app would be any different.  

As for the loneliness aspect, I'm pretty sure I'd still want to create a cyber version of myself even if I were happily married/etc.  I must be really "out there" to want to do that, apparently, but I want to do that.  I think it would be really cool, and really insightful.  Then, if it really could run cyber errands for me, it would make it that much better.  I really think it's going to happen next year at the rate we're going.

Like I mentioned towards the beginning, imagine the senile or mentally ill who could benefit from talking to a cyber version of a loved one.  Right now, those people have a limited number of people willing to try to chat with them due to time/patience.  Would it be better for them to be able to talk to a real human?  Yes, 100%.  The only problem is there aren't that many people willing to do it.

Yeah, it's a weird question for a permaculture site.  I think you're right.



 
Jolene Csakany
Posts: 14
Location: Asheville, NC
My father was a fan of Dolly and the song, that's how I got the name.

I guess I don't think the ultimate outcome matters; whether we are destroyed by an asteroid or nuclear war, or whether humanity spreads across the universe, ultimately doesn't matter to me.  It's just about being in the moment and doing something that helps you feel like you're in optimal health and balance with all, because we're a social species and that feels good.  Not chasing pleasure in a hedonistic way, but letting go of attachments and aversions so you can appreciate the experience of now, because it's all we've truly got.  Or so it seems to me.  My mom always likes to tell me, "You can't change the world!" and maybe we can't, but we can live in integrity with our own values and find value in the experience, even if we're not happy all the time, with a friend who truly gets us because it's a living being with its own makeup and past and isn't programmed by us.  
 
Douglas Alpenstock
master pollinator
Posts: 4668
Location: Canadian Prairies - Zone 3b

Joshua Bertram wrote:As for the loneliness aspect, I'm pretty sure I'd still want to create a cyber version of myself even if I were happily married/etc.  


Err, sort of a "work wife?" This never ends well.

A real human companion will force you to think and act outside of your notions and push your boundaries -- forcing you to grow. We are either growing or dying -- basic fact of human existence. An actual human community forces you to grow.

An AI companion will tell you what you want to hear. This is the equivalent of addiction, chemical or otherwise. We already see how media feed algorithms cause people to become true believers in whatever cause, because they live on their phones. It takes a lot of effort to dig up the facts and make a grown-up, rational assessment of the big picture. I know people who can't be bothered -- they live in the spin.
 
Posts: 52
While I'm relatively comfortable in my own skin, I don't want to date myself.  I need a helpmate to improve my life, not an AI emotional support machine.  Once a year, about this time, I unplug from all electronics for 2 weeks+.  If you want to communicate with me, you have to come to where there is breath.  This is strange to some yet refreshing for me.  Having been plugged in since the UNIVAC era, it has been my tradition to take a Christmas break: read a book, go for walks, talk to my wife...  If I skip this, or become "cranky", my wife and kids remind me to take some time off.  I am not set up to be linked to electronics 24/7.  Yes, tech makes my life easier in many ways, but I need a break to focus, realign my compass, talk to God and ask him about the platypus, and other silly stuff.  Once I recharge my soul, I'm better equipped to interact with people.  john <introvert in Ohio>......
 
Posts: 12

Joshua Bertram wrote:I feel like this is inevitably going to happen, or it may already exist and isn't available yet.
I see no reason why we can't use one of the new chat gpt models to be programmed to learn to be our "friend"…



I would definitely date a cloned (physical) version of myself that had diverged to be more female, but I personally wouldn't be interested if it were just digital.  Even if it had a robotic body, I could pretend it had consciousness like me, but I don't agree when some say, "If you can't perceive the difference (between 'the artificial' and 'the real thing'), then there is no difference."  No, it's just that one can't perceive the difference, or one is just choosing to pretend (which is fine).  Now, for me, it doesn't have to be 100% biological or conscious, but there is a cutoff point where it has to be a certain percentage, which I think is different for everyone.
 
Posts: 31
Location: Seattle, WA 😕
I like your train of thought, regardless of agreement or truth in everything. I don't know why I don't have the option to reply to that post directly, other than perpetual tech challenges.
As to conversations with Jesus, Einstein, Tesla, etc., it could not be an authentic conversation.
Aside from the data on all words spoken by any of them being highly unreliable as a whole, a conversation is an organic mental process of intellectual sharing and inspiration between humans. Our thoughts spark original new thoughts and ideas between us. AI is solely a database of collected information set in cyber stone. It possesses no creativity of its own. If there is some actual AI intelligence out there, in here, wherever, which acts like a parasite upon humans, it is precisely because of our innate creativity and its lack thereof. But that can evoke some low level of "consciousness" where IT is envious of us humans.
 