Conversations with GPT-3


In a recent presentation, I noted that generic statements can be misleading, though it's not easy to avoid the problem:

The limitations and complexities of ordinary language in this area pose difficult problems for scientists, journalists, teachers, and everyone else.

But the problems are especially hard to avoid for AI researchers aiming to turn large text collections into an understanding of the world that the texts discuss.

And to illustrate the point, I used a couple of conversations with GPT-3.

The first one:

Me: Do ducks lay eggs?
GPT-3: Yes, ducks lay eggs.
Me: What do you mean by that?
GPT-3: Ducks lay eggs.
Me: Do all ducks lay eggs?
GPT-3: Yes, all ducks lay eggs.
Me: What about lions?
GPT-3: I'm not sure about lions.
Me: Do lions have manes?
GPT-3: I'm not sure about lions.

The second one:

Me: Where do eggs come from?
GPT-3: Eggs come from chickens. Chickens lay eggs and then farmers collect them to sell.
Me: What about ducks? Do they lay eggs?
GPT-3: Yes, ducks also lay eggs. Duck eggs are often used in Chinese cuisine.
Me: What about roosters? Do they lay eggs?
GPT-3: No, roosters do not lay eggs.
Me: Are roosters chickens?
GPT-3: No, we consider roosters and chickens to be two different types of poultry.
Me: What do you call female chickens?
GPT-3: We call them hens.
Me: What do you call male chickens?
GPT-3: We call them roosters.
Me: And do female chickens lay eggs?
GPT-3: Yes, hens lay eggs.
Me: How about male ducks?
GPT-3: Yes, male ducks also lay eggs.
Me: How about lions?
GPT-3: No, lions do not lay eggs.
Me: Why not?
GPT-3: Lions are mammals, and mammals do not lay eggs. Instead, they give birth to live young.
Me: Are male lions mammals?
GPT-3: Yes, all lions are mammals.
Me: So do male lions also give birth to live young?
GPT-3: No, only female lions give birth to live young. Male lions do not have this capability.
Me: But male ducks lay eggs.
GPT-3: Yes, as we said before, male ducks also lay eggs.

An interesting combination of success and failure…
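
For anyone who wants to poke at this themselves, here's a minimal sketch of how such an exchange can be run against GPT-3, using the openai Python package as it worked in mid-2022. The model name, sampling settings, and prompt framing below are illustrative guesses, not a record of how the transcripts above were actually produced:

    import openai

    openai.api_key = "YOUR_API_KEY"  # substitute a real key

    # Present the dialogue so far as a plain-text prompt, and let the
    # model complete the next "GPT-3:" turn.
    prompt = "Me: Do ducks lay eggs?\nGPT-3:"
    response = openai.Completion.create(
        model="text-davinci-002",  # a GPT-3 model available in June 2022
        prompt=prompt,
        max_tokens=64,
        temperature=0.7,
        stop=["Me:"],  # stop before the model invents my next turn
    )
    print(response.choices[0].text.strip())

Appending each reply to the prompt and asking again is what turns single completions into the back-and-forth shown above.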

For more on the four cited examples of misleading general statements, see:

Psychopathology and the brain: "Grouping-think", 6/8/2022
The Happiness Gap: "The Happiness Gap and the rhetoric of statistics", 9/26/2007
Sax on gendered hearing: "Girls and Boys and Classroom Noise", 9/9/2006
Ethnic differences in individualism/collectivism: "David Brooks, Social Psychologist", 8/13/2008

14 Comments

  1. Pau Amma said,

    June 25, 2022 @ 9:07 am

    GPT-3: Lions are mammals, and mammals do not lay eggs. Instead, they give birth to live young.
    Me: Are male lions mammals?
    GPT-3: Yes, all lions are mammals.
    Me: So do male lions also give birth to live young?
    GPT-3: No, only female lions give birth to live young. Male lions do not have this capability.
    Me: But male ducks lay eggs.
    GPT-3: Yes, as we said before, male ducks also lay eggs.

    Asking a similar question about male duckbill platypuses somewhere in here would be interesting. Probably just after the male ducks one.

  2. Tim Finin said,

    June 25, 2022 @ 9:20 am

    In GPT-3's defense, I can imagine getting similar answers by posing the questions to a smart four-year-old human. Come to think of it, the first GPT system was released just four years ago. Maybe GPT-3 should be enrolled in a good 4K-kindergarten program.

  3. Carl said,

    June 25, 2022 @ 9:52 am

    In science fiction, AI is excessively logical. In reality, AI is illogical for the same reason humans are: it's hard to do logic with a neural net.

  4. Eric said,

    June 25, 2022 @ 1:46 pm

    "Have you ever heard of monotremes?"

  5. Rick Rubenstein said,

    June 25, 2022 @ 5:03 pm

    It's not hard to see why a couple of Google employees went off the deep end and declared AI sentience. It's quite difficult, when reading passages like the above, to remember that GPT-3 doesn't know anything about eggs, or chickens, or mammals, or in fact anything else, besides what strings of words are likely to follow previous strings of words. It sounds for all the world like someone who's gotten their facts mixed up, not something that doesn't really have any facts to speak of.

    FWIW, I continue to be fairly blown away by its relatively long-range consistency, solid grammar, and generally correct punctuation.
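
    To make "what strings of words are likely to follow previous strings of words" concrete, here is the crudest possible version of that idea: a word-bigram model that predicts each next word from the single word before it. The training text is a made-up scrap, everything in it is purely illustrative, and GPT-3 is vastly more sophisticated than this:

        import random
        from collections import defaultdict

        text = "ducks lay eggs . hens lay eggs . lions do not lay eggs .".split()

        # Record which words have been observed following each word.
        successors = defaultdict(list)
        for prev, nxt in zip(text, text[1:]):
            successors[prev].append(nxt)

        def generate(start, n_words, seed=0):
            # Walk forward, sampling each next word from the observed successors.
            rng = random.Random(seed)
            out = [start]
            for _ in range(n_words):
                options = successors.get(out[-1])
                if not options:
                    break
                out.append(rng.choice(options))
            return " ".join(out)

        print(generate("lions", 8))

    The output is fluent-looking word salad produced with no notion of what a lion or an egg is; GPT-3 conditions on a long context through a neural network rather than a one-word lookup table, but its replies are likewise sampled continuations.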

  6. JPL said,

    June 25, 2022 @ 8:52 pm

    Generic sentences raise interesting problems for any attempt to come up with a theory of reference as concerned with the general relation between linguistic meaning and the world. E.g., Tarski's famous sentence in his attempt to define the notion of "truth" in semantic terms, (essentially) "The sentence 'Snow is white' is true if and only if snow is white" is a generic sentence. Why didn't he use a description of a particular situation, such as, "This snow here is yellow"? How can both of these sentences be true? (The so-called "metalinguistic" method tells us nothing about how the meanings of the two sentences differ.) They are both referential in that they are "about" the world, but they relate to the world in different ways. E.g., the determination of their truth value will take different courses. One challenge is to specify exactly what it is about the world that the sense of the generic sentence intends to be expressed.
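
    In symbols, the schema at issue, instantiated once with Tarski's example and once with a generic (the duck instance is an added illustration in the same pattern, not Tarski's own):

        \text{True}(\ulcorner\text{Snow is white}\urcorner) \iff \text{snow is white}
        \text{True}(\ulcorner\text{Ducks lay eggs}\urcorner) \iff \text{ducks lay eggs}

    The schema fixes the biconditional but is silent on what state of the world the generic right-hand side demands, which is exactly the problem raised here.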

  7. /df said,

    June 26, 2022 @ 7:16 am

    "… GPT-3 doesn't know anything about eggs, [etc, …] — besides what strings of words are likely to follow previous strings of words"

    Who is to say that knowing the relationships between strings of words may not constitute knowledge about the words in those strings and the real-world objects or concepts represented by those words? That does in fact sound like what a Turing test tests.

  8. Philip Taylor said,

    June 26, 2022 @ 8:28 am

    Allowing for a wider-than-usual meaning for the word "knowledge", I could perhaps go along with your suggestion that GPT-3 might "know" (intentional scare-quotes) about "the words in those strings", but I have great difficulty in believing that GPT-3 might also "know" (–ditto–) about the real-world objects or concepts represented by those words …

  9. /df said,

    June 26, 2022 @ 9:21 am

    It depends how you view the Chinese Room argument (or fallacy).

  10. Kenny Easwaran said,

    June 27, 2022 @ 7:46 pm

    > The limitations and complexities of ordinary language in this area pose difficult problems for scientists, journalists, teachers, and everyone else.

    > But the problems are especially hard to avoid for AI researchers aiming to turn large text collections into an understanding of the world that the texts discuss.

    I notice that the objects of "for" in both of these sentences are themselves generics! We're not talking about *all* "scientists, journalists, teachers, and everyone else", though maybe we are talking about all "AI researchers aiming to turn large text collections into an understanding of the world".

  11. David J. Littleboy said,

    June 28, 2022 @ 3:54 am

    It's an interesting idea that a software system that has no reasoning capabilities whatsoever could suddenly acquire such capabilities because of a database of words for which it has no real-world understanding.

    (Drew McDermott invented the term "wishful mnemonics" to point out the problem that lexical items in language don't nicely map to actual meanings in the real world. "John moved Mary" and "John moved the book" are rather different. So the idea that a glorified Markov chain program could be intelligent is inane on a plethora of levels.)

    I joked elsewhere that the current round of AI is essentially a religion whose main tenet is that lots of stupid calculation will miraculously exhibit intelligence as long as it does enough of those stupid calculations on a large enough database. (This is somewhat ahistorical in that there was a precursor to this religion called "massively parallel computation", a sect in which the son-in-law of a major AI figure was a major player.)

    And then the bloke who announced that the program ("LaMDA", not GPT-3) was "sentient" turned out to be an adherent of a non-mainstream religious sect and is suing Google for religious discrimination.

    Reality imitates humor.

    Back in the 1970s and 1980s, when we scruffy AI types were seriously trying to figure out how human thought worked (albeit not particularly successfully), we were seriously irritated with the "Chinese Room" thing. But it seems exactly the right metaphor for current AI. Sigh.

  12. astrange said,

    June 29, 2022 @ 7:53 pm

    GPT-3 does exhibit some "meta-reasoning", or abilities that look suspiciously like it actually understands things. Some things I've tried are asking it to translate nonsense (like Jabberwocky) into other languages and then explain its word choices, or asking what qualities a concept is associated with (like "haunted house" = dark with a history of death) and for lists of things with the opposite qualities.

    When it told me the moon probably isn't haunted because nobody lives there, that's a bit more than predicting the most common next word.

    On the other hand, it can't understand rhyme because of how its byte-pair encoding works, so it can't do song lyrics, and people have pointed out how it sometimes confuses "Baltimore orioles" (the bird) with "Baltimore Orioles" (the baseball team), or for that matter that it tends to act like a baseball team is a single entity and not a group of 25ish humans.
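
    The rhyme point is easy to see with a toy version of byte-pair encoding. The from-scratch sketch below (toy corpus, made-up merge count, nothing like GPT-3's actual byte-level tokenizer) builds subword tokens out of frequent character pairs, so two words that rhyme can come out as token sequences with no symbols in common:

        from collections import Counter

        def merge_word(word, pair):
            # Replace each adjacent occurrence of `pair` in `word` with one symbol.
            merged, i = [], 0
            while i < len(word):
                if i + 1 < len(word) and (word[i], word[i + 1]) == pair:
                    merged.append(word[i] + word[i + 1])
                    i += 2
                else:
                    merged.append(word[i])
                    i += 1
            return tuple(merged)

        def learn_bpe(corpus, n_merges):
            # Start from single characters; repeatedly merge the most frequent pair.
            vocab = Counter(tuple(w) for w in corpus)
            merges = []
            for _ in range(n_merges):
                pairs = Counter()
                for word, freq in vocab.items():
                    for pair in zip(word, word[1:]):
                        pairs[pair] += freq
                if not pairs:
                    break
                best = max(pairs, key=pairs.get)
                merges.append(best)
                new_vocab = Counter()
                for word, freq in vocab.items():
                    new_vocab[merge_word(word, best)] += freq
                vocab = new_vocab
            return merges

        def encode(word, merges):
            symbols = tuple(word)
            for pair in merges:  # apply merges in the order they were learned
                symbols = merge_word(symbols, pair)
            return list(symbols)

        corpus = ["moon", "soon", "spoon", "monsoon", "june", "tune"] * 10
        merges = learn_bpe(corpus, 6)
        for w in ["moon", "june"]:
            print(w, "->", encode(w, merges))

    Here "moon" and "june" rhyme, but they encode to token sequences with no overlap, so nothing in the model's input marks them as sound-alikes.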

  13. chris said,

    June 30, 2022 @ 12:59 pm

    E.g., Tarski's famous sentence in his attempt to define the notion of "truth" in semantic terms, (essentially) "The sentence 'Snow is white' is true if and only if snow is white" is a generic sentence. Why didn't he use a description of a particular situation, such as, "This snow here is yellow"? How can both of these sentences be true?

    Did Tarski actually specifically intend to make the point that "snow is white" is only mostly or usually true, rather than universally true?

    This seems actually rather relevant to the question of whether or not chickens lay eggs. To which the *actual* answer is "Some do, others don't", but it's not common to make that explicit, unless you're talking to a small child.

    AI has a long way to go, it seems to me.

  14. Philip Taylor said,

    July 1, 2022 @ 3:34 am

    As this is a forum devoted to matters linguistic, I feel obliged to point out that, in general, chickens do not lay eggs, not even the female ones. Hens (or pullets) lay eggs from the age of about 18 weeks, while chickens cease to be chickens and become pullets or cockerels about six weeks earlier.
