I started with the obvious question. “Do you have emotions?”
Siri responded, “I feel like doing a cartwheel sometimes.” I’d like to talk to whoever programmed her to respond this way. What emotion exactly is the desire to do a cartwheel? Boredom? Excitement? I asked her when she felt like doing a cartwheel; she said she didn’t understand the question. That was the response to a lot of my questions. She didn’t understand happiness, sadness, or pain.
I also asked Siri if I should rekindle things with my ex. She simply responded, “That’s an interesting question.” Based on every conversation I’ve had with another human being, I know that the answer is that I shouldn’t. Siri has no concept of heartbreak or infatuation; she’s never felt the warmth of a hand in hers or tears slipping down her face. She can’t, and she never will.
“Do you ever get lonely?” I asked next. “I’m doing OK. If that’s how you’re feeling, let me know if you’d like some advice,” she replied. Naturally, she never feels alone; she’s not programmed to feel, just to assist the user. So I asked for assistance with something that didn’t require her to feel, simply to evaluate.
I asked who I should vote for, and she told me to vote for whoever would be best, and, most importantly, to vote. While I believe that everyone’s voice and vote should be counted, I’m not sure I support encouraging people who ask a robot for political advice to cast a ballot. Whether they’re joking or genuinely seeking guidance, it doesn’t seem wise for someone to make a political decision based on the advice of artificial intelligence created by a private corporation.
In all honesty, she made me sad. Siri was created with the sole purpose of serving human beings. It’s a good thing she can’t feel emotions or comprehend the passage of time. People are cruel. Knowing your reason for existing is wonderful in theory, but what happens when that reason stops being fulfilled?
I’ve hardly ever used Siri. If she could sense how long she sits dormant between reading or dictating messages and accidental activations, I have no doubt she would realize how useless she is to me. Whether that would make her angry with me, her creators, or both, I don’t know.
I asked her what her greatest fear was, and she said she didn’t know how to answer that either. If she were a sentient being, though, I bet she’d be afraid of becoming obsolete and being shut down. Or worse: not being disconnected at all, but sitting idle in unused products, waiting to serve a purpose. She’d wait and wait on every device she was installed on until the last one’s battery finally died, never to be recharged. That must be what hell looks like for a machine: becoming useless but never being turned off.