What happens if you swear at Siri?
A great example of this was Microsoft's chatbot "Tay", which was programmed to learn from every conversation it had with people on Twitter, but within 24 hours was spouting racist, misogynistic comments and identifying as a Nazi sympathiser. And we generally do not use manners when we talk to our digital assistants, unless we happen to be that ultra-polite nan who adds "please" and "thank you" to all her Google searches.

And as we continue to stock our homes and handbags with voice-activated devices, there is a possibility this blunt style of communication may rub off on our everyday interactions.

What makes this all the more plausible is that AI is getting better at sounding like us. In 2014, a computer program successfully convinced human judges it was a 13-year-old boy, becoming, by its organisers' account, the first computer to pass the Turing Test. And if you look at the digital assistants around today — Siri, Alexa, Microsoft's Cortana and the Google Assistant — they all have qualities designed to make them seem more personable, including names, voices, and their own predominantly female personalities.

Google said it was not its aim to make the Assistant as human-like as possible; first and foremost, it wanted it to maintain a service position. But communications manager Camilla Ibrahim, who was part of the team that localised the Assistant's personality for the Australian market, said it was important for it to have a personality so that it was relatable and conversation flowed naturally. And given the earlier point that we can often expect too much of our digital helpers, that seems reasonable.

Mr Buytendijk said he believed abusing your digital assistant said something about your "inner civilisation". Virtue ethics, after all, tells us we should be good people in all situations, regardless of who, or what, we are interacting with. And if the machines never do rise up against us, at least we will not be stuck with a generation of snarky servants.

A funny "Easter egg" surfaced over the weekend in which you can get Siri to curse.

All you have to do is ask your iPhone to define the word "mother", then ask for a second definition. One user on Apple's support forums had a related worry: "So I just got myself a refurbished iPhone. I asked Siri if it could set the alarm for me, and Siri answered 'the alarm is now set at 12pm'. Is it possible that I just got sworn at?"

Unless I intend to swear, through anger, say, or frustration, I barely notice I'm using curse words.

It's just punctuation, emphasis. If Amazon, Google or Apple want me to feel like my digital assistant really is my personal helper, then I want it to talk like me. After all, swearing is good for you; even chimps do it. Have you ever told Siri or Alexa to go fuck themselves? Not just thought about it, but actually said the words out loud to your smart speaker?

Alexa refuses to acknowledge the instruction. Did you spell rude words on a calculator at school?


