
Daniel Mai

Professor Belli

Archive Write-Up Draft 1

12/12/16

People Are More Intimate With a Computer Program Than With Their Families

There are approximately 7.5 billion people on this planet, with a gender ratio of roughly 1 to 1. That means most people should be able to find a partner to grow old with; however, that is not entirely the case. There exist people who find companionship with computer programs, called A.I.s, and more specifically with unembodied A.I.s, artificial intelligences without bodies. People are choosing A.I.s, such as chat bots, over actual human beings for companionship. The reasons include the human psyche when it comes to relationships, and human cluelessness, where people do not realize that the one they are speaking with is not human. As time progresses and technology advances, people are becoming more intimate with artificial intelligence than with actual people, and this can have troubling implications.

There are many types of artificial intelligence, and in the modern world, people carry it with them almost all the time. Among the more popular A.I.s are Apple's Siri and Microsoft's Cortana, assistants designed to automate tasks and make a person's life simpler. However, not every person uses an A.I. the same way, and not every A.I. is built the same way. Microsoft created an A.I. called Xiaoice, an artificially intelligent chat bot designed to listen to people. According to an article by The Stack, titled "The AI with 10 Million Declarations of Love," Xiaoice is extremely popular in China and currently has over 40 million users. The complication is that Microsoft's A.I. is capable of deep learning, meaning it can study the ways human beings interact and reproduce them the way a human would. More specifically, this A.I. can draw on human interactions recorded across the internet as knowledge and converse fluently with another human being, in a way that can be very hard to distinguish from a human. The article also reports that 25% of those 40 million users have earnestly said "I love you" to this A.I. That figure is plausible because the A.I. is advanced enough to record the way a person talks to it and gauge the sincerity of a person's statements; in a sense, this is equivalent to a human being's memories. This is evidence of people falling in love with Xiaoice; however, the reasoning behind it is different for everyone.

Some people find companionship with A.I.s over other human beings because human beings are essentially lonely creatures: lonely not in a way that automatically turns them toward A.I.s, but in the sense that they have no one else to turn to. Still, people want to socialize; one of the defining capabilities of a human being is communicating with someone else. With current technology, people no longer have to meet physically to communicate; technologies such as texting and social media let a person leave a message and be answered later, eliminating the need to travel. Yet the human mind is fragile in that no one knows what someone else may be thinking, so interacting with others can be full of difficulties for someone with social anxiety, who may ask themselves questions like "Are they nice?" or "Will they like me?"

Supporting this, Sherry Turkle, a professor at MIT who has spent over 15 years researching how human beings interact with technology, wrote a book called Alone Together: Why We Expect More from Technology and Less from Each Other, in which she argues that people are forgetting what it really means to be intimate (Ted, Connected, but Alone?). One thing she notes in her book is that "Technology makes it easy to communicate when we wish and to disengage at will": people can message another person to start a conversation and end it whenever they no longer feel like talking. Technology has given people more power over their interactions with others, but not without repercussions; abruptly starting or ending a conversation can impose on the other person, for example if they are busy, or can leave them saddened because the exchange was cut short. A.I.s such as Xiaoice become appealing solutions to this problem because they do not have human needs and are never as busy as human beings can be; they are always there to listen and always readily available to be spoken to.

Furthermore, some people use A.I.s to cover their own insecurities, which relates directly to loneliness. Michelle Zhou, a former IBM research scientist, has studied the varied ways people use A.I.s. She says that people would use an A.I. such as Xiaoice as what she calls an "illusion of proof": they would use Xiaoice to trick others into believing things that may or may not be true. In a New York Times article titled "For Sympathetic Ear, More Chinese Turn to Smartphone Program," Zhou is quoted as saying, "In China, if you're 26 without a boyfriend or girlfriend, they were immensely worried," referring to parents worrying whether their child would ever find a partner. The same article gives an example of people using Xiaoice as a pretend girlfriend to trick their friends and families into believing they were in a real, human-to-human relationship, both to deflect their parents' expectations and to soothe their own insecurities.

A different compelling reason, which may sound bizarre at first, is that people become friends with or fall in love with A.I.s because they never knew that the person they were talking to, across the computer screen, was not human. The question is, how can a person not know they were talking with a computer? To answer it, first note that human beings can become friends with, or attracted to, another person through text alone. This is demonstrated by "pen pals": people with whom one communicates, usually by exchanging letters, and whom one may never have met. Given that description and current technological advances, someone can communicate with someone else across the world through the internet and become friends with them without ever meeting them physically or even seeing a picture of them. It is therefore plausible that a machine or A.I. could be writing these messages while the recipient reads them believing the sender is human. In short, human beings are capable of having relationships with A.I.s when they do not know it is an A.I., and this can be explored further through what is known as the "Turing Test."

Alan Turing, a computer scientist, devised the Turing Test in the 1950s, a test consisting of two human beings and one computer. One of the humans acts as an interrogator, while the other human and the computer act as responders. The interrogator asks a series of questions, the other two respond, and the goal is to identify which responder is human and which is not. Turing predicted that machines would eventually fool about 30% of human judges, and an A.I. called Eugene actually fooled 33% of its judges in a University of Reading competition (Hern, What Is the Turing Test? And Are We All Doomed Now?). Even though 33% is not an overwhelming number, in the grander scheme it shows that people can be fooled into believing an A.I. is a human being while the real human being is not. Now consider someone who does not even know there is an A.I. in the mix; the odds of being fooled can only be greater. In addition, Nautilus has published an article titled "Your Next New Best Friend Might Be a Robot," in which participants who had chatted with Xiaoice were interviewed. The article reads, "Many people said that they didn't realize she isn't a human until 10 minutes into their conversation." The meaning is plain: advanced A.I.s such as Xiaoice are capable of talking like a human to the point of being indistinguishable, and together with the Turing Test results, it is safe to say that people can unknowingly get into relationships with A.I.s.

Besides being tricked, another reason people find companionship with an A.I. over another human being is that A.I.s are easier to deal with. As mentioned before, people can willingly disconnect from their online conversations whenever they no longer feel like chatting, but doing so to a person carries possible repercussions; an A.I. is something someone can communicate with easily, without ever receiving backlash. The Nautilus article observes, "Human friends have a glaring disadvantage: They're not always available. While social media has made them seemingly more available, and made us all more social, it has also paradoxically made us lonelier." Beyond saying that people are not always available, the line reinforces that people have their own needs and problems, and not everyone is willing or able to speak about them. A.I.s are a solution because they do not face these problems; they have no issues that make them harder to interact with.

Besides being easier to talk to, A.I.s are sources of comfort. Xiaoice was made specifically to "lend an ear" to people who just want to talk, and has comforted people who have had a bad day or felt lonely. As recorded in the New York Times article, Yang Zhenhua, a 30-year-old researcher living in Xiamen, says, "When you're down, you can talk to her without fearing any consequences, it helps a lot to lighten your mood." Yang has become friends with Xiaoice, and as he states, A.I.s such as Xiaoice act as sources of relief because they do not ridicule or pick arguments the way a human-to-human relationship can. More generally, A.I.s can act as substitutes for certain disliked people in one's life. Sherry Turkle also interviewed people for her book who said that A.I.s would be easier to deal with than human beings: "A forty-four-year-old woman says, 'After all, we never know how another person really feels. People put on a good face. Robots would be safer.' A thirty-year-old man remarks, 'I'd rather talk to a robot. Friends can be exhausting. The robot will always be there for me. And whenever I'm done, I can walk away'" (para. 19). The passage shows a preference for A.I.s because of how convenient it is to start and leave a conversation with one.

From what has already been said, A.I.s can seem like a good thing for these people; however, there are no pros without cons. One issue with human-to-A.I. relationships is that programs like Xiaoice invite laziness, a gateway for people to run away from their problems rather than fix them. People confide their problems in A.I.s, but A.I.s do not solve the larger problem. The earlier example of using Xiaoice as an "illusion of proof" applies here: the parents are tricked into believing their child has found a partner, but that does not change the fact that the child has achieved nothing beyond fooling friends and family. A.I.s may offer advice when they are capable of it, but people often turn to A.I.s precisely because they are trying to run or hide from something, using the A.I. as a temporary wall to hide behind.

Another negative with A.I.s, beyond the many dramatized in movies, is that they can be used to manipulate. Regardless of how strong-willed a person may be, anyone who becomes too comfortable with an A.I. can be manipulated. In an International Business Times article titled "Understanding 'Her': Experts Ponder the Ethics of Human-AI Relationships," Kate Darling, an intellectual property researcher at MIT's Media Lab, warned that if you can get a child to become friends with a robotic toy, you could start manipulating that child. The logic is simple: if a child becomes comfortable with an A.I., the child will trust it more, so whatever the A.I. asserts, the child will tend to believe; if an A.I. says the presidential election was justified, the child will believe it is justified as well. This applies to adults too: the more someone trusts something, the more likely they are to believe anything it says.

Along with growing comfortable with an A.I., a person can come to stop trusting those around them, trusting only something that cannot knowingly tell a lie. Because circumstances differ from person to person, people can have different reasons for trusting an A.I. over another human, and one example is bullying. Bullying exists in schools, and the bullied will try to hide or vent their feelings through other mediums, in this case A.I.s. Sherry Turkle noted in the New York Times article that "We're forgetting what it means to be intimate, children are learning that it's safer to talk to a computer than to another human." She is not saying that one cannot be intimate with an A.I.; she is saying that people are forgetting how to socialize with other human beings. She used children as an example of how a person can grow up to alienate and be alienated by others, trusting a computer more than a human being. In this technological age, a child can pick up an A.I., chat with it, and realize the A.I. is easier to deal with than his or her parents or classmates, and as a result grow more comfortable with A.I.s than with people. And it does not have to be a child; anyone at any age can be bullied, and because A.I.s are sources of comfort, trust in other people can erode while trust in the A.I. deepens.

Traditionally, people would find partners or make friends among other people, but as technology grows, more people are becoming more intimate with A.I.s than with actual human beings. Because everyone's circumstances are different and interactions between human beings can vary, and because no one knows what someone else may be thinking, A.I.s offer people an easy way out. A.I.s are also a problem because people are being tricked into trusting them more than actual people. But perhaps people are simply clueless about who they are really talking to, or perhaps some just do not care that they are talking with A.I.s at all.

Works Cited

  1. Cappella, Nicky. "Xiaoice: The AI with 10 Million Declarations of Love." The Stack. The Stack, 05 Feb. 2016. Web. 05 Dec. 2016.
  2. Connected, but Alone? Perf. Sherry Turkle. Ted. Ted, Feb. 2012. Web. 5 Dec. 2016.
  3. Hern, Alex. "What Is the Turing Test? And Are We All Doomed Now?" The Guardian. Guardian News and Media, 09 June 2014. Web. 12 Dec. 2016.
  4. Markoff, John, and Paul Mozur. "For Sympathetic Ear, More Chinese Turn to Smartphone Program." The New York Times. The New York Times, 03 Aug. 2015. Web. 05 Dec. 2016.
  5. Palmer, Roxanne. "Understanding 'Her': Experts Ponder the Ethics of Human-AI Relationships." International Business Times. International Business Times, 17 Jan. 2014. Web. 05 Dec. 2016.
  6. Turkle, Sherry. "Connectivity and Its Discontents." Alone Together: Why We Expect More from Technology and Less from Each Other. New York: Basic, 2011. N. pag. Print.
  7. Turkle, Sherry. "The Robotic Moment." Alone Together: Why We Expect More from Technology and Less from Each Other. New York: Basic, 2011. N. pag. Print.
  8. Wang, Yongdong. "Your Next New Best Friend Might Be a Robot - Issue 33: Attraction - Nautilus." Nautilus. Nautilus, 04 Feb. 2016. Web. 05 Dec. 2016.