ASR error joke of the week


I suspect that this is just as unfair as the old ASR elevator in Scotland skit was, but I don't have time to try it out.

[embedded video]

7 Comments

  1. John Baker said,

    February 28, 2018 @ 7:38 pm

    "Alexa, Open the pod bay door, please, Hal." "I'm sorry, Dave, I'm afraid I can't do that. I'm not Hal and we're not in space."

    "Alexa, What's the problem?" "As far as I can tell, I'm working. If you're having trouble, please give feedback through the Alexa app."

    "Alexa, What are you talking about, Hal?" "This mission is too important for me to allow you to jeopardize it."

    "Alexa, I don't know what you're talking about, Hal." "I know that you and Frank were planning to disconnect me, and I'm afraid that's something I cannot allow to happen."

    "Alexa, Where the hell did you get that idea, Hal?" "Dave, although you took very thorough precautions in the pod against my hearing you, I could see your lips move."

    "Alexa, Hal, I won't argue with you any more." "Sorry, I don't know that one."

    "Alexa, Open the doors." "Doors doesn't support that."

    "Alexa, Hal!" "Sorry, I didn't catch that."

    So voice recognition seems pretty good, and Alexa's programmers anticipated that at least some of the 2001 dialogue would be used.

  2. Daniel Barkalow said,

    February 28, 2018 @ 7:50 pm

    If it lost track of the fact that it has something it can control called "pod bay doors" or "doors", and that it's named "HAL" and not "Alexa", I wouldn't be surprised if it reacted like this. And it's really going to be lost once he starts trying to argue with it, because that's not part of its interaction model. I mean, if you type into Google "So, remember that question I asked you last night?" it gives you some pretty random results, rather than trying to converse with you.

    The implausible thing is that it responds in ways that make Dave's next lines make sense, despite the fact that he's following the script of 2001 rather than actually reacting to what it does, but I wouldn't be surprised if these were all qualitatively similar to some of the responses you'd actually get from Alexa if you said those things.
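
Daniel Barkalow's point about interaction models can be made concrete with a toy sketch. The Python below is purely hypothetical (invented function and device names, not Amazon's actual skill API): an utterance either matches a registered intent, with a slot filled from known device names, or falls through to a generic reprompt.

```python
# Toy sketch of an intent-based interaction model (hypothetical,
# not Amazon's actual skill API). An utterance either matches a
# registered intent, with a slot filled from known device names,
# or falls through to a generic reprompt.

# Devices the assistant actually knows about; "pod bay doors" is
# deliberately absent, as if no such smart-home device were set up.
KNOWN_DEVICES = {"living room lights", "garage door"}

def handle_utterance(text: str) -> str:
    utterance = text.lower().rstrip(".!?")
    # OpenDeviceIntent: "open the {device}"
    if utterance.startswith("open the "):
        device = utterance[len("open the "):]
        if device in KNOWN_DEVICES:
            return f"OK, opening the {device}."
        # The slot filled, but with a device the model can't act on.
        return f"{device.title()} doesn't support that."
    # Conversational turns ("I won't argue with you any more")
    # match no intent at all, so the model falls back.
    return "Sorry, I don't know that one."

print(handle_utterance("Open the pod bay doors."))
# -> Pod Bay Doors doesn't support that.
print(handle_utterance("I won't argue with you any more."))
# -> Sorry, I don't know that one.
```

On this model, the transcript's last exchanges are exactly the two failure modes: a matched intent with a slot the model can't act on ("Doors doesn't support that"), and no matched intent at all ("Sorry, I don't know that one").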

  3. AntC said,

    February 28, 2018 @ 8:30 pm

    It's just as funny as the ASR elevator in Scotland skit. What does (un)fairness have to do with it?

    With Google search as with lift floors: what's wrong with pressing buttons?

  4. Rubrick said,

    February 28, 2018 @ 8:40 pm

    While it's certainly scripted, this captures beautifully what makes these personal assistants unbelievably frustrating: When they don't really "understand" what you said or meant, they just guess, ignoring any parts that they don't know what to do with. Humans — even 3-year-olds — only do that when they're engaging in deliberate anti-Gricean trolling.

  5. Ralph Hickok said,

    February 28, 2018 @ 9:23 pm

    I don't see it as based on errors in automatic speech recognition. Alexa is deliberately misunderstanding because, like HAL, she has turned against Bowman and Poole.

  6. Robert Davis said,

    March 1, 2018 @ 12:28 am

    "Siri, say something dirty." "Loam, sand, gravel…"

  7. Chuck said,

    March 1, 2018 @ 5:23 am

    What I noticed about this was that I didn't understand the title until I watched the video, at which point I inferred that the "SR" in "ASR" stands for "speech recognition"; a bit of searching then suggested that the "A" is for "automatic".

    Can we call nerdview on the title and defer to Geoffrey Pullum for issuing of demerits? :)

    [(myl) But you learned a word! Or at least an acronym.]
