#1
Could you "date" (however you want to interpret that) an artificial intelligence?

What would have to exist before you would consider it? Bodies indistinguishable from real human bodies? Would a program need to be installed that forces occasional illogical decisions to be in line with "normal" human behaviour? Does the AI need to be indistinguishable from human intelligence? Would you have
What reasons would you have for preferring a naturally born human over an artificially created one?

These are all questions for you to answer.

I have been playing a lot of Mass Effect recently, and let's just say I don't have brittle bone disease *wink*
Tiger got to hunt, bird got to fly; man got to sit and wonder, 'Why, why, why?'
Tiger got to sleep, bird got to land; man got to tell himself he understand.


#2
The question is not so much "could I date an artificial intelligence?" as "would an artificial intelligence date me?"

It probably wouldn't
Quote by Diemon Dave
Don't go ninjerin nobody don't need ninjerin'
#3
I was gonna make a joke about Ex Machina but I didn't want to spoil it since it's a really really good film and you should all see it
dirtbag ballet by the bins down the alley
as i walk through the chalet of the shadow of death
everything that you've come to expect


#4
I'm probably going to watch that one this weekend. I'm excited
Tiger got to hunt, bird got to fly; man got to sit and wonder, 'Why, why, why?'
Tiger got to sleep, bird got to land; man got to tell himself he understand.


#5
yea sure.


I mean like, Curie was my #1 wife in Fallout 4.
O.K.

“There's never enough time to do all the nothing you want.”
~ Bill Watterson


O__o
#6
Quote by ultimate-slash
I'm probably going to watch that one this weekend. I'm excited

Just remember the lotion and the Kleenex.

You're gonna need them.
Quote by Diemon Dave
Don't go ninjerin nobody don't need ninjerin'
#7
Quote by slapsymcdougal
Just remember the lotion and the Kleenex.

You're gonna need them.

Don't be silly...


I live on my own, no one is gonna complain if I don't use Kleenex
Tiger got to hunt, bird got to fly; man got to sit and wonder, 'Why, why, why?'
Tiger got to sleep, bird got to land; man got to tell himself he understand.


#8
it depends on if the ai has scarlett johansson's voice or not


also jack = best ME girl
will someone carry me across ten thousand miles under the silence
#9
Quote by Baby Joel
it depends on if the ai has scarlett johansson's voice or not


also jack = best ME girl

In ME2 I was flirting up Jack, and at the last second I went to her and said "Nah, never mind." She died 5 minutes later
Tiger got to hunt, bird got to fly; man got to sit and wonder, 'Why, why, why?'
Tiger got to sleep, bird got to land; man got to tell himself he understand.


#12
Quote by ultimate-slash
Does the AI need to be indistinguishable from human intelligence?


If this was a prerequisite for me, I wouldn't be married to my wife, HAYO!


Honestly though, while everyone is like "hell yeah I'd date an AI", I think it would become very problematic in the face of a disagreement or an "agree to disagree" situation. It would be difficult to accept not being right (either morally or intellectually) against a machine. It's difficult enough with another human; with an AI it would be very challenging, to the point that I think the first big argument would be a dealbreaker in 99% of cases
#13
Quote by flexiblemile
If this was a prerequisite for me, I wouldn't be married to my wife, HAYO!


Honestly though, while everyone is like "hell yeah I'd date an AI", I think it would become very problematic in the face of a disagreement or an "agree to disagree" situation. It would be difficult to accept not being right (either morally or intellectually) against a machine. It's difficult enough with another human; with an AI it would be very challenging, to the point that I think the first big argument would be a dealbreaker in 99% of cases

But if the AI proves that what it's telling you is factually right, what is the point in agreeing to disagree? If anything, I'd say this would be a great opportunity for people to learn to be more rational. And this issue also seems like something a good AI can take into account, by which I mean it could be programmed to respect the fragile human ego.
As far as moral matters go, if a machine can provide a "correct" answer, I'd happily accept that, but I think an AI would have just as much trouble with it as a human, seeing as there is no definitive morality. AIs might be able to wade through some of the nonsense surrounding moral issues, however.
Tiger got to hunt, bird got to fly; man got to sit and wonder, 'Why, why, why?'
Tiger got to sleep, bird got to land; man got to tell himself he understand.


Last edited by ultimate-slash at Jun 8, 2016,
#14
Quote by ultimate-slash
But if the AI proves that what it's telling you is factually right, what is the point in agreeing to disagree? If anything, I'd say this would be a great opportunity for people to learn to be more rational. And this issue also seems like something a good AI can take into account, by which I mean it could be programmed to respect the fragile human ego.
As far as moral matters go, if a machine can provide a "correct" answer, I'd happily accept that, but I think an AI would have just as much trouble with it as a human, seeing as there is no definitive morality. AIs might be able to wade through some of the nonsense surrounding moral issues, however.

The real question is will the Konami code work on an AI girlfriend?
Quote by Diemon Dave
Don't go ninjerin nobody don't need ninjerin'
#15
Quote by slapsymcdougal
The real question is will the Konami code work on an AI girlfriend?

Would make for an interesting bedroom situation...

"Tell me what you like?"
- Up, up, down, down, left, right, left, right, B, A
Tiger got to hunt, bird got to fly; man got to sit and wonder, 'Why, why, why?'
Tiger got to sleep, bird got to land; man got to tell himself he understand.


#16
Quote by ultimate-slash
But if the AI proves that what it's telling you is factually right, what is the point in agreeing to disagree?


Why is it different if a machine is doing it as opposed to a person? Well-presented evidence should be self-explanatory... but it's not. As a species, we really are not as rational as we think we are. Not even close

Quote by ultimate-slash

If anything, I'd say this would be a great opportunity for people to learn to be more rational.


Haha, good luck with that!

We often can't be convinced of very simple things by the people we love the most and who love us the most. If I'm going to argue with my wife when she says I drink too much, why would I listen more to a robot?

I think that the uncanny valley will prove to be a much bigger obstacle than it seems. We will be able to let AIs do our work, design our buildings, write our music, make love to us, but I really cannot see how Joe Six-Pack will allow an AI to tell him he's wrong when most of them can't even accept it when their wives do it.

An All Knowing and All Powerful intelligence is not a new concept in matrimonial affairs. I don't think making that intelligence synthetic is going to change anything. Not in a positive way, anyway
Last edited by flexiblemile at Jun 8, 2016,
#17
Quote by ultimate-slash
Would make for an interesting bedroom situation...

"Tell me what you like?"
- Up, up, down, down, left, right, left, right, B, A

Just make sure not to do down up left left A right down. It's a sure way to put her in a mood.
Quote by Diemon Dave
Don't go ninjerin nobody don't need ninjerin'
#18
Quote by flexiblemile
We often can't be convinced of very simple things by the people we love the most and who love us the most. If I'm going to argue with my wife when she says I drink too much, why would I listen more to a robot?

Why would an AI claim you drink too much if you aren't drinking too much? Perhaps if an AI tells you you're drinking too much, it's worth listening?
I'm very, very stubborn. I know that about myself. But I'd like to think that if someone calls me out on my bullshit and I have no way of arguing against it, I don't just shut them out. I might behave irrationally, become angry, whatever, but I really hope that after a while I'll admit that, alright, I was being an ass.

I'm not saying humans are rational beings. Far from it. But if we reach the point where we can create an AI, I think it could very easily take that into account. Not call us out on our bullshit if the only result is us getting upset about it. Not tell us we're drinking too much just because it has a headache or something.

That being said, I'd very much like to think that I try my best to be as rational as I can be, and I believe an AI could be a great support in bringing about such a change. Relationship dynamics would change, definitely, and the overall societal mindset regarding relationships would also have to change for it to be a common thing. But I don't think the issues are insurmountable.
Tiger got to hunt, bird got to fly; man got to sit and wonder, 'Why, why, why?'
Tiger got to sleep, bird got to land; man got to tell himself he understand.


#19
But Tali isn't a robot. You must be doing something wrong.
Quote by SGstriker
If KFC is finger-licking good, then people would probably suck dicks for Popeyes. That's how good it is.


There's nothing left here to be saved
Just barreling dogs and barking trains
Another year lost to the blue line
#20
Quote by Joshua Garcia
But Tali isn't a robot. You must be doing something wrong.

I was obviously talking about Legion. Have you seen the ass on that guy?
Tiger got to hunt, bird got to fly; man got to sit and wonder, 'Why, why, why?'
Tiger got to sleep, bird got to land; man got to tell himself he understand.


#21
You know, that's fair.

Miranda eat your heart out.
Quote by SGstriker
If KFC is finger-licking good, then people would probably suck dicks for Popeyes. That's how good it is.


There's nothing left here to be saved
Just barreling dogs and barking trains
Another year lost to the blue line
#22
Quote by ultimate-slash
Why would an AI claim you drink too much if you aren't drinking too much? Perhaps if an AI tells you you're drinking too much, it's worth listening?



Well if your wife is telling you that you drink too much or that you're spending too much money on Fabergé eggs, is it worth listening?

Why is the opinion of a hypothetical AI worth more than the opinion of your wife? Don't forget that we're not talking about an AI with the processing power of a god; we're talking about an AI with the reasoning of an average person (let's say slightly higher than average).

You know how dumb people hate smart people? It's not because they wish they themselves were smarter, but because smart people will often make stupid people reflect on their stupidity, and that makes them uncomfortable to the extent that they wish to terminate (get it?) the relationship. It won't matter that the AI can print you a million reasons why you drink too much or give you live readouts of your liver functions or talk about the direct correlation between alcohol abuse and length of life or whatever. At the end of the day, the vast majority of people willingly choose to ignore all (or most) uncomfortable truths, no matter the source.


I'm mostly on board with what you're saying, but social stigmas and psychological issues inherent to the human brain prevent me from thinking this could be accepted within at least 5 or 6 generations of the invention of said AI
#23
Quote by flexiblemile
Well if your wife is telling you that you drink too much or that you're spending too much money on Fabergé eggs, is it worth listening?

Yes. In my experience, most people in functional relationships generally would.

Why is the opinion of a hypothetical AI worth more than the opinion of your wife? Don't forget that we're not talking about an AI with the processing power of a god; we're talking about an AI with the reasoning of an average person (let's say slightly higher than average).

I would be impressed if someone could build an AI as irrational/illogical as a human being. That's where you're going to need the processing power of a god
I think an AI would be vastly more intelligent than the average person anyway, exactly because it can cut through most bullshit.

You know how dumb people hate smart people? It's not because they wish they themselves were smarter, but because smart people will often make stupid people reflect on their stupidity, and that makes them uncomfortable to the extent that they wish to terminate (get it?) the relationship. It won't matter that the AI can print you a million reasons why you drink too much or give you live readouts of your liver functions or talk about the direct correlation between alcohol abuse and length of life or whatever. At the end of the day, the vast majority of people willingly choose to ignore all (or most) uncomfortable truths, no matter the source.

For this I'd again like to bring up my point that if we can create an AI, we can create an AI that doesn't call us out on every little thing.

There's no point in having it correct our grammar, but if it's important, I'd very much like to be told. Screw my ego.
And if I'm a drunk who's going to drink until I fall down dead on the ground regardless, then I don't see what the difference is between not listening to my human wife and not listening to an AI. If anything, the AI will just be a more obvious indication that I'm being the ass here.
I'm mostly on board with what you're saying, but social stigmas and psychological issues inherent to the human brain prevent me from thinking this could be accepted within at least 5 or 6 generations of the invention of said AI

Please note that I'm not making any statements about the whole of society dating AIs. That's also why I asked the questions I asked in the OP (asking individuals specifically). It's going to take certain views on life, and it will take a certain kind of person to be able to cope with it.

There are plenty of dysfunctional relationships in the world, and I simply don't see one with an AI being by definition more dysfunctional than a lot of those that already exist.
Tiger got to hunt, bird got to fly; man got to sit and wonder, 'Why, why, why?'
Tiger got to sleep, bird got to land; man got to tell himself he understand.


Last edited by ultimate-slash at Jun 8, 2016,
#24
I'd probably do it just to entertain the idea, but I couldn't see myself being serious with it unless said AIs were so indistinguishable from the rest of society that they had a life of their own prior to the encounter. Pretty much like a normal person who grows up and dies (and has an understanding of that). Then I'd consider it more seriously.
Quote by SGstriker
If KFC is finger-licking good, then people would probably suck dicks for Popeyes. That's how good it is.


There's nothing left here to be saved
Just barreling dogs and barking trains
Another year lost to the blue line
Last edited by Joshua Garcia at Jun 8, 2016,
#26
I'm having trouble wrapping my head around the thought.


As people, we're predisposed, socially conditioned, and/or naturalized into different preferences. I believe this to be one of the reasons people in larger communities have a harder time finding someone (diversity of option and opinion). But - because of this, our preference and tolerance values vary greatly when making isolated comparisons.

If the AI were just a plain Jane shell and an unbiased, opinion-deprived thinger - they'd be a fuck-puppet that can structure a sentence. I don't see this working.

That said - I'm also unsure I want the ability to reprogram the AI's personality to suit my needs. It'd conceptually be easy to have the SJW/Patriarchy/Liberal/Conservative section of the AI shut down - where identifying that in a mate might be a point of contention and possibly a deal-breaking issue. AI would remove the idea of scarcity and the difficulty of holding a relationship... and with the ability to program their needs/thoughts/tastes in things - we'd likely be able to water them down so much that they turn into an interactive fuck-puppet.

While that idea may appeal to some (trophy wife, anyone?) - I'm not sure I'd fancy the option.
Real knowledge is to know the extent of one's ignorance - Confucius
#27
Quote by flexiblemile at #34000380
If this was a prerequisite for me, I wouldn't be married to my wife, HAYO! amirite fellas

Fixed.
#28
Quote by ultimate-slash


Please note that I'm not making any statements about the whole of society dating AIs. That's also why I asked the questions I asked in the OP (asking individuals specifically). It's going to take certain views on life, and it will take a certain kind of person to be able to cope with it.


That's a fair point, but I think the social pressure around dating an AI would put a big obstacle in the way of even the most liberal-minded people.

For example, when I played Dungeons & Dragons, I would make absolutely sure not to tell a lot of people because I'd get laughed at. There were also quite a few players who stopped playing because they got "bullied".

So the same principle that makes people hesitant to play D&D will be absolutely present when people walk home with an AI bride, but a thousandfold. Therefore the concept will be immediately marginalized and limited to "freaks and geeks" for a very long time.

However, and in my eyes this is the clincher, making an AI in a cyborg body will be a billion times more expensive than writing a D&D game. Therefore the tech will likely not develop until it's generally accepted by society, but society won't accept it until it's already implanted in its ranks (the same way D&D is better accepted now than 20 years ago)

It's a bit of a vicious circle and I can't see it happening within a few generations.
Last edited by flexiblemile at Jun 8, 2016,
#29
Quote by flexiblemile
That's a fair point, but I think the social pressure around dating an AI would put a big obstacle in the way of even the most liberal-minded people.

For example, when I played Dungeons & Dragons, I would make absolutely sure not to tell a lot of people because I'd get laughed at. There were also quite a few players who stopped playing because they got "bullied".

So the same principle that makes people hesitant to play D&D will be absolutely present when people walk home with an AI bride, but a thousandfold. Therefore the concept will be immediately marginalized and limited to "freaks and geeks" for a very long time.

However, and in my eyes this is the clincher, making an AI in a cyborg body will be a billion times more expensive than writing a D&D game. Therefore the tech will likely not develop until it's generally accepted by society, but society won't accept it until it's already implanted in its ranks (the same way D&D is better accepted now than 20 years ago)

It's a bit of a vicious circle and I can't see it happening within a few generations.
To be fair, D&D is rubbish and there are many superior PnP RPG rulesets out there that would be far more respectable.

Nerd.

Quote by Diemon Dave
Don't go ninjerin nobody don't need ninjerin'
#30
Quote by slapsymcdougal
To be fair, D&D is rubbish and there are many superior PnP RPG rulesets out there that would be far more respectable.

Nerd.



Haha. My favorite was actually Star Wars Saga d20, but D&D is more widely known and so better illustrates my point
#31
No, I can't ever see it being a real thing.

Because they wouldn't have feelings.

Like, they'd be programmed to respond in appropriate ways, but they wouldn't actually have feelings - in that if you insulted them they wouldn't actually be offended, and if you gave them a gift or something they wouldn't actually feel joy. As much as they could be programmed to give convincing replies to your actions to suggest they do, deep down you'd know otherwise.

For that reason you wouldn't truly be able to care for them as you'd know, even if they were convincing, that there was no need to consider their feelings, and so you could just be completely selfish. They'd just become a sexual servant, more than anything else.
#32
Quote by matt bickerton
No, I can't ever see it being a real thing.

Because they wouldn't have feelings.

Like, they'd be programmed to respond in appropriate ways, but they wouldn't actually have feelings - in that if you insulted them they wouldn't actually be offended, and if you gave them a gift or something they wouldn't actually feel joy. As much as they could be programmed to give convincing replies to your actions to suggest they do, deep down you'd know otherwise.

For that reason you wouldn't truly be able to care for them as you'd know, even if they were convincing, that there was no need to consider their feelings, and so you could just be completely selfish. They'd just become a sexual servant, more than anything else.


Not a bad point, but how is it different from humans? Human emotions are either hardcoded (pain makes me sad, boobs make me happy) or learned (I feel shame when someone sees my genitals, immigrants make me mad)

So ultimately the only difference between a human and an AI is that one can meet its maker and one cannot.

I'm assuming that for this product to be legit, it would need to be closed programming in the sense that once it's turned on, it can't be modified, only shut down. So in that sense, how is it different from a human?

I don't think it is... Both are real
Last edited by flexiblemile at Jun 8, 2016,
#33
Quote by matt bickerton
No, I can't ever see it being a real thing.

Because they wouldn't have feelings.

Like, they'd be programmed to respond in appropriate ways, but they wouldn't actually have feelings - in that if you insulted them they wouldn't actually be offended, and if you gave them a gift or something they wouldn't actually feel joy. As much as they could be programmed to give convincing replies to your actions to suggest they do, deep down you'd know otherwise.

For that reason you wouldn't truly be able to care for them as you'd know, even if they were convincing, that there was no need to consider their feelings, and so you could just be completely selfish. They'd just become a sexual servant, more than anything else.

This is how very primitive attempts at artificial intelligence work.

If you recall Microsoft's Tay (the racist robot), a 'true' (sorry, but it's shorthand) AI would be something that has very low-level 'encoded' behaviours, and would have to learn the rest.
Quote by Diemon Dave
Don't go ninjerin nobody don't need ninjerin'
#34
I think it's an issue of empathy.

We as humans can relate to how other humans are feeling as we essentially will just put ourselves in their shoes.

'I wouldn't like it if my partner did that to me, so I won't do that to them'.

With a robot/computer however, you wouldn't get that. We wouldn't truly know what the robot 'felt' inside if say, we told it to fuck off and leave us alone. Did that hurt it? Has it essentially just seen that as a command to which one of many automated responses is given?

I guess the only way it'd potentially work is if it was given some sort of believable emotional scale. But to get to the stage where that's even remotely passable as actual emotion, with people willing to treat it as such, I don't think we'll see such a thing in our lifetime.
#35
Quote by matt bickerton
I think it's an issue of empathy.

We as humans can relate to how other humans are feeling as we essentially will just put ourselves in their shoes.

'I wouldn't like it if my partner did that to me, so I won't do that to them'.

With a robot/computer however, you wouldn't get that. We wouldn't truly know what the robot 'felt' inside if say, we told it to fuck off and leave us alone. Did that hurt it? Has it essentially just seen that as a command to which one of many automated responses is given?

I guess the only way it'd potentially work is if it was given some sort of believable emotional scale. But to get to the stage where that's even remotely passable as actual emotion, with people willing to treat it as such, I don't think we'll see such a thing in our lifetime.

We'd know when the T-800s came for us.
Quote by Diemon Dave
Don't go ninjerin nobody don't need ninjerin'
#36
Quote by Trowzaa
I was gonna make a joke about Ex Machina but I didn't want to spoil it since it's a really really good film and you should all see it

I was also going to make a joke about Ex Machina but I didn't because of your restraint
#38
Quote by matt bickerton

With a robot/computer however, you wouldn't get that. We wouldn't truly know what the robot 'felt' inside if say, we told it to fuck off and leave us alone. Did that hurt it? Has it essentially just seen that as a command to which one of many automated responses is given?



Well, a true AI wouldn't have a list of automatic responses. If that were the case it wouldn't be an AI but a puppet. I can't think of a better term, but their responses would need to be procedurally generated as opposed to manually encoded beforehand, which is very similar to the way humans think, act and formulate responses.
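
To make that distinction concrete, here's a toy sketch in Python (everything in it is hypothetical, just to illustrate the idea): a "puppet" picks its reply from a hand-written table, while a procedural agent assembles its reply from whatever it has learned so far.

import random

# The "puppet": every input maps to a reply someone wrote in advance.
CANNED_REPLIES = {
    "you drink too much": "No I don't.",
    "i love you": "I love you too.",
}

def puppet_reply(message):
    # Anything outside the hand-written table falls back to a stock line.
    return CANNED_REPLIES.get(message.lower(), "I don't understand.")

# The "procedural" version: the reply is assembled from state the agent has
# accumulated through interaction (here just a toy word-association memory),
# not picked from a prewritten list.
class LearningAgent:
    def __init__(self):
        self.associations = {}

    def observe(self, word, response_word):
        # Remember which words it has seen paired together in conversation.
        self.associations.setdefault(word, []).append(response_word)

    def generate_reply(self, message):
        words = message.lower().split()
        picks = [random.choice(self.associations[w])
                 for w in words if w in self.associations]
        return " ".join(picks) if picks else "Tell me more."

agent = LearningAgent()
agent.observe("drink", "moderation")
agent.observe("drink", "water")
print(puppet_reply("You drink too much"))          # always the canned line
print(agent.generate_reply("You drink too much"))  # built from what it learned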

Also, the uncertainty about whether or not you can trust a person's response is something that is very prevalent with humans. Ever go to a board meeting or negotiate a deal? Did your girlfriend ever tell you that it was fine if you went fishing with your buddies, that it was, in fact, just perfect?

I would trust my AI wife to be more forthcoming than all my ex-girlfriends
#39
Watch the movie "Her". It's pretty good.
'93 Gibson LP Studio (498T/490R)-Ebony
'14 Gibson LP Standard (JB/Jazz)-Ocean Water Perimeter
Epi MKH LP Custom-7 (SD Custom Shop JB-7)-Ebony
+More

Maxon od808|Boss NS-2|Boss CE-5|
Line6 G55|Korg Pitchblack Pro

JVM 210h|1960a(V30/G12t-75)