
That Time I was Catfished by a Robot Secretary


Catfish: To trick someone into a relationship online using a fictional persona and/or photographs.

I was trying to set up a meeting with one of my friends. He has his own venture-capital firm so he runs a lean shop. Also, as a venture capitalist, he likes to leverage new forms of technology.

I sent an email and said, “Hey, let’s meet up.”

He wrote back, “That sounds great! Clara, can you set something up?” and CCed his secretary, Clara.

This was about 10 PM, so I was surprised when Clara got back to me and offered me a couple of options. I figured the assistant to a venture capitalist must work pretty hard.

I was impressed and curious about this hardworking secretary, so I decided to Google her. Here’s what I found.

“Hmm … that’s odd,” I thought. I looked at the first link and saw that Clara Labs is a meeting scheduling company. “Is Clara a robot? Should I ask her?”

“You can’t ask someone if they’re a robot!” said my very socially ept wife. “If she’s not a robot, she’ll get so offended!”

When Clara answered some more questions, I concluded she wasn’t a robot. So I sent her the following message.

But then came the surprise.

So I asked, “Why didn’t you tell me this in the first place?”

“She” answered:

I’d been catfished! While “Clara” obviously felt pretty clever about pulling off this impersonation of a real secretary, I was pretty miffed. The term catfishing is normally used when someone adopts a fictional persona to lure someone into a romantic relationship on a dating site. In this case, the company was using a fictional persona and photo to pretend that “Clara” was human.

I’m sure the company was doing this to make the interaction model simpler. I can imagine them saying, “Our goal is to make this as seamless as dealing with a real secretary.” So why did I feel like such a sucker? Because this company hacked into the way humans interact with each other. Humans have evolved to be nice to each other to create emotional connections. Being polite and saying “please” and “thank you” to a person is inefficient, but it helps build a relationship. It’s part of being human.

It’s also not polite to ask people if they are robots, even if you suspect they are. Humans don’t like to be asked questions like that. Robots don’t care. They might even “think” it’s funny. When you ask Google Assistant if it’s a robot, it answers, “I’d prefer to think of myself as your friend. Who also happens to be artificially intelligent.”

This problem with Clara reminded me of Google Duplex a few years ago. The goal of Duplex was to interact in a human-like way when calling restaurants for reservations. If you haven’t seen it before, watch the demo; it’s amazing. But it was too lifelike. After the demo was shown, many people worried that Duplex would be used to fool them into thinking they were talking to a human.(1)

It’s better to just be transparent about who you are. I remember when my wife Abigail and I were in Germany for a wedding. Abigail had a very good German accent but very limited knowledge of German. She had learned to ask, “Can you tell me where the wedding is?” in German. Her accent was so good that the man immediately started answering her in German. Her eyes went wide as she tried to make sense of anything he was saying. He quickly noticed, paused, and said, “Would you rather I tell you in English?” We learned that you can end up with some pretty big communication difficulties if you pretend to be somebody you’re not.

For more on how people interact with smart speakers, check out my articles Alexa and Google in Our Home and Growing Up Alexa.

Footnotes

1. Google then made an announcement that they would always identify Duplex as a computer. However, the system didn’t work as well as they’d hoped and Google eventually removed the feature.