Battling proscriptions


I posted yesterday about (among other things) the idea that that should never be omitted as the mark of a complement to a verb, as in the putatively offending

(1) I know he is a good man.

versus the prescribed

(2) I know that he is a good man.

Now Geoff Pullum reminds me that he posted back in 2004 on the opposed advice (a student of his had been taught this), according to which (2) is unacceptable and (1) is the prescribed alternative: complementizer that must be omitted wherever possible.

Both proscriptions — of zero as in (1), of that as in (2) — are of course silly, but it might be useful to speculate about where they come from.

Geoff Pullum titled his 2004 posting "Omit stupid grammar teaching", pointing to Strunk's maxim Omit Needless Words (ONW) as the source of the No That advice. Someone took ONW as a first principle of grammar (not just style) and applied it to the fullest, disregarding actual practice in favor of a stipulation about the way the language should be.

The No Zero advice might have its source in a different line of reasoning from first principles, namely via the maxim Avoid Ambiguity (even if only potential, even if only temporary). That is, since there's a temporary potential ambiguity in sentences like

(1') I know George is a good man.

(where the first three words might be parsed as the sentence "I know George"), but no such ambiguity in

(2') I know that George is a good man.

the complementizer that should not be omitted in such cases — and, by generalization, not in any cases. (This is extraordinarily tenuous reasoning, since "I know that" and "I know that George" (with demonstrative that) are both sentences.) But potential ambiguity (including temporary potential ambiguity) is everywhere and is troublesome only in certain circumstances, so Avoid Ambiguity is worthless if it's understood as cautioning against ambiguity wherever it can be found.

Note that this account of No Zero involves an extension of Avoid Ambiguity to examples like (1), where in fact it doesn't apply (since "I know he" is not a sentence). The idea would be that if a configuration is sometimes problematic, it should be banned in general — a principle that is often appealed to in grammatical advice, and might contribute to No Zero in an additional way.

As I pointed out in my earlier posting, there are verbs (like add) that strongly prefer, or even require, that in their complements. (It's also true that nouns taking clausal complements — like observation — generally require that.) So this constraint might be extended to all verbs taking clausal complements. (I'm not recommending this step, only reporting on a line of reasoning that might lead someone to espouse No Zero.)

No That and No Zero are two different (and opposed) ways of maintaining the principle One Right Way — regulating the language so as to eliminate variation. I've never understood the antipathy some people have towards variation, but a drive to eliminate variation lies behind a great many pieces of bad advice on grammar, usage, and style.


  1. RNB said,

    January 4, 2009 @ 2:53 pm

    It's surely not being "prescriptivist" to suggest that, usually, inclusion of the word "that" is less likely to give rise to unintentional ambiguity and consequently should be "recommended", as it is in German, French and Spanish so far as I know.

  2. Karen said,

    January 4, 2009 @ 3:59 pm

    It's "prescriptivist" to insist that it never be omitted, though.

  3. Tim Silverman said,

    January 4, 2009 @ 4:55 pm

    It's prescriptivist to make claims about how likely it is that a particular construction will result in ambiguity, without backing up those claims with any evidence. It's prescriptivist to assume that a construction in widespread use makes it difficult to communicate, despite the fact that the people who produce it and hear it are surely in an excellent position to judge whether they are communicating successfully. It's prescriptivist to assume that potential ambiguity actually causes problems in real conversations or writing, even though people are capable of adjusting for problematic contexts as they see fit.

    In general, it's prescriptivist to issue one-size-fits-all advice which runs against widespread current usage.

  4. Bob Ladd said,

    January 4, 2009 @ 6:18 pm

    @RNB: It's not a question of "recommended" in German, French and Spanish; those languages have different grammars. It's simply ungrammatical in French to say Je sais Georges est un honnête homme, and no native speaker says it; you have to say Je sais que Georges est un honnête homme. In English, both I know George is a good man and I know that George is a good man are grammatical. In German there are further complications, and I don't know enough Spanish to say how it works, but in no case are the differences simply due to different stylistic recommendations, or to different sensitivities to potential ambiguity.

  5. The other Mark P said,

    January 4, 2009 @ 7:12 pm

    This is extraordinarily tenuous reasoning, since "I know that" and "I know that George" (with demonstrative that) are both sentences.

    Not if you parse the sentences as needing to keep going, they're not. Since few readers work only word by word, this would be very rare.

    "I know George .. " can be a leader into two quite different structures:

    1) I know George from way back
    2) I know George is a Lutheran.

    Adding a "that" gives two different alternative short sentences:

    3) "I know that George" and

    4) "I know that, George"

    but each would have a comma or suchlike if they are to continue (say with ".., you nincompoop"). If there were no comma, then the omission would be the cause of the ambiguity, not the added "that".

    Basically I feel it is wrong to look for ambiguity without taking into account the knowledge a reader has of following words or punctuation. This is why we trip up so often with sentences that end halfway through due to a page turn which wouldn't cause us a moment's hesitation if laid out normally — those vital signals are missing.

    I personally omit "that" as much as I can.

  6. Ran Ari-Gur said,

    January 4, 2009 @ 8:52 pm

    @Tim Silverman: Are you using "prescriptivist" to mean "wrongheaded", "idiosyncratic", or "unscientific"? Because that's not what it means; a single statement can be descriptivist, baseless, and incorrect, or prescriptivist, well-supported, and sound. (I don't say "correct", because prescriptions don't really have truth values in the way that descriptions do, since the former are deontic while the latter are epistemic.)

  7. Arnold Zwicky said,

    January 4, 2009 @ 8:56 pm

    To "the other Mark P": you're making exactly my point about potential temporary ambiguity, which is that it is in general innocuous. See my posting on the matter here (a link I gave in my earlier posting). I wasn't recommending this way of thinking, but reporting (critically) on it in others.

  8. Mark Liberman said,

    January 4, 2009 @ 9:12 pm

    Ran Ari-Gur: … a single statement can be […] prescriptivist, well-supported, and sound.

    I've occasionally aspired to the role of scientific prescriptivist, despite the skepticism of some. But I'll confess that I haven't aspired very vigorously.

  9. Simon Spero said,

    January 5, 2009 @ 12:28 am

    Don't give in to the dark side!
    Let your Descriptivist Flag Fly.

  10. jk said,

    January 5, 2009 @ 12:32 am

    I would only add to the general "that" discussion a note that, even without Strunk & White, a rule to avoid unneeded words would have an application among print journalists, where space is at a premium. This, I think, might in some way contribute to its application in education, given the crossover between English and journalism teachers in high schools. Hard to imagine who, other than the peevish, would weigh in on the side of always using "that."

    Re "add": I don't think my normal usage would be as strict about "that." Similarly to "say," I would find it unremarkable to replace "that" with a pause in spoken sentences — and, at least in some circumstances, to replace it with a comma in written ones: "Let me say, this is a bad idea. And, I might add, a dangerous one."

  11. The other Mark P said,

    January 5, 2009 @ 4:08 am

    you're making exactly my point about potential temporary ambiguity, which is that it is in general innocuous.

    I agree with your general point that most sentences can be deliberately misread, but in practice aren't.

    But I think you are wrong in this particular case when you describe the reasoning as "exceptionally tenuous" because the amended form can potentially be misread too. I don't believe anyone could ever possibly misread the "I know that George …" in the way you suggest because readers know that more of the sentence is to follow. Because people read in blocks of text, this effectively prohibits the readings you suggest, neither of which makes sense except as a stand-alone phrase.

    For example, you don't have any ambiguity with "I love Jim" being read as "I love, Jim", because people read the comma as part of the sentence and process it as a whole.

    Conversely "I know George .. " can lead into two quite different sentence structures. This sort of thing trips readers up all the time. We do a double-take and then read it properly.

  12. Aaron Davies said,

    January 5, 2009 @ 9:37 am

    i, for one, did stumble, mentally, over (1), as i was beginning to read this post–i hit the "he" and had a couple milliseconds of "hang on, shouldn't that be 'him'?" before i picked up the rest of the sentence, went back, and corrected myself. (in dialog, of course, it'd really be, "I know he's a good man", which doesn't present the same problem.) (i suppose this constitutes defense of "no zero" account one, and of The other Mark P)

  13. Arnold Zwicky said,

    January 5, 2009 @ 10:56 am

    The other Mark P: "But I think you are wrong in this particular case when you describe the reasoning as "exceptionally tenuous" because the amended form can potentially be misread too. I don't believe anyone could ever possibly misread the "I know that George …" in the way you suggest because readers know that more of the sentence is to follow."

    I wasn't espousing the position that "I know that George …" is actually problematic, only pointing out that the reasoning people have used to deprecate zero-marked complements like "I know George is a good man" (because "I know George" is a sentence) can be applied as well to "I know that George is a good man". The argument is equally stupid in the two cases.

  14. Arnold Zwicky said,

    January 5, 2009 @ 11:29 am

    Aaron Davies: "i, for one, did stumble, mentally, over (1), as i was beginning to read this post–i hit the "he" and had a couple milliseconds of "hang on, shouldn't that be 'him'?" before i picked up the rest of the sentence, went back, and corrected myself."

    I've gotten similar reports (of "stumbling" in processing) in many other cases, even where the examples are of very frequent types and strike me as unproblematic. I suspect that what's going on here is that some people, when they are reading (or hearing) sentences *as examples*, are inclined to be hypervigilant. Yes, people do sometimes get hung up when they're reading or listening — I've reported on a number of such cases myself — but thinking about sentences as examples can cause people to reflect on their processing, rather than just processing them.

    Googling on {"I know he"} gets ca. 8,040,000 raw hits. Some are irrelevant, and there are certainly dupes, but there's still an enormous number of relevant hits, and they strike me as utterly unremarkable. I'd imagine that everyone has read such sentences many times without experiencing any problem.

  15. Jonathan said,

    January 5, 2009 @ 11:35 am

    I suppose the ambiguity is with cases like this.

    –Is Felipe coming?

    –Which Felipe?

    –The one we met in Buenos Aires.

    –María said that Felipe is coming, but probably not the other one. [María said that that Felipe is coming; less ambiguous?]

  16. Arnold Zwicky said,

    January 5, 2009 @ 12:11 pm

    To jk: of course, it's good advice to omit unnecessary words, and every writing guide gives this advice. ONW was scarcely a novel insight of Strunk's. But what counts as (un)necessary? As I've pointed out any number of times, ONW has been wielded as a club against various non-standard usages, and it's also been used as the basis for advice insisting on shorter variants over longer (when both are standard), across the board.

    As for "add", my discussion was about this verb taking clausal complements. The verb has a variety of other uses — as a quotative ("He added, 'I'm leaving now'") and in parentheticals ("This is a foolish idea, I might add"), for instance. The claim is *not* that "add" must be followed by "that", in general, but that when it has a clausal complement, this complement is marked by "that" rather than zero-marked.

  17. James Wimberley said,

    January 5, 2009 @ 12:17 pm

    Three cheers for Arnold's closing shot against Procrustean uniformity. English was the richer in Shakespeare's day for allowing both the endings wrong'd and wrongéd, a great convenience for poetry.

  18. Coby Lubliner said,

    January 5, 2009 @ 3:55 pm

    @Bob Ladd:
    In French, omission of que is common in the Cajun dialect (which, by the way, is not a patois but a variant of français populaire), as in the song J'ai fait une grosse erreur: Je croyais j'avais raison and Moi je savais j'avais fait une grosse erreur. This may be due to English influence; I don't know if it also occurs in Acadian French.
    In Spanish, omission of que is characteristic of a somewhat formulaic epistolary or legalistic style; espero estés bien is a common formula.

  19. Marc A. Pelletier said,

    January 5, 2009 @ 4:34 pm

    My ESL ears see both of those statements (I know he is/I know that he is) as equally valid but with a difference in meaning:

    To me, the former appears to be an observation about the fundamental nature of the man, while the latter simply states one property of the man (I keep expecting the statement to continue with "… but X").

  20. Tim Silverman said,

    January 5, 2009 @ 4:36 pm

    @Ran Ari-Gur: Well, "prescriptivist" is often used basically as a pejorative (at least around these parts), so I guess I was reacting (maybe a bit too sharply) to that sense.

    Beyond that, however, I think that there are certain fatal characteristics of the sort of thing that tends to get called "prescriptivism", that mark it off from the more general category of what we might (more neutrally) call "writing advice", the sort of thing writers receive every day from editors, teachers, colleagues etc. And I think these characteristics doom any attempt to construct a "sound" or "scientific" prescriptivism, as opposed to sound (if perhaps not "scientific") advice.

    The first big problem is over-generalisation.

    For example: one well-known problem with some novice writers, which you might encounter teaching a composition or creative writing class, is that they write absurdly flowery, overblown, adjective-laden descriptions. There are several possible responses to this sort of problem, for instance—

    a) tell those students to lay off the adjectives and adverbs a little;

    b) set those students some writing exercises in which they have to write descriptions without using any adjectives and adverbs;

    c) set the whole class some exercises of that sort (such exercises can be quite interesting even if you don't have that particular writing problem); or

    d) put on your cloth-of-gold robes, climb onto your mile-high throne, and issue a stern diktat to the entire human race forbidding the use of any adjectives or adverbs at all, for any reason whatsoever, in perpetuity.

    Obviously, a through c are reasonable and possibly helpful responses, while d is stark raving bonkers (even without the robes and throne). There are two aspects to this. On the one hand, how could it possibly be that an essential tool that everybody uses constantly, like adjectives, is hopelessly defective, and yet hardly anybody has noticed? It would be like failing to notice that cars were soluble in water, or some equally obvious practical problem.

    But even granting that complete proscription is deliberately hyperbolic, and the advice really intended is something more mild, there is another fundamental reason why prescriptivism of this sort is bonkers even in soft form. The reason, which should really be no less obvious than the first, is that, while "too many adjectives" may be a real problem for a particular writer or a particular piece of writing, it can't be a problem for all writers, all the time, or for all pieces of writing. In fact, there's another well-known problem of (different) novice writers, where their descriptions are so skimpy and generic that all their stories appear to take place in an empty white room. Obviously, for them, advice to cut down on description is likely to be useless or harmful.

    This problem is compounded by two further problems, of correct diagnosis and correct prescription.

    As to diagnosis, it is generally the case that it is much easier to detect that there is a problem with a piece of text than to identify what the problem actually is—and this holds for all sorts of problems ranging from some screwy interaction of negation with modality in a subordinate clause up to contradictory motivation of the hero's love-interest in the scenes leading up to the denouement.

    And as to correct prescription, even once we have correctly identified the problem, it is a whole separate issue to say how to fix it. Blanket bans on adjectives or complementizers are clearly not the answer. In general, for a given problem, there are multiple solutions which get rid of (or at least sufficiently mitigate) the original issue, but each of them will have ramifying effects on the style, tone, clarity etc, depending on their interaction with the surrounding text. It is this fact that really makes writing hard. Obviously the best solution, even if one exists, will be particular to the passage of text, and what that particular writer (perhaps in consultation with their editor) wants to do with it.

    Also, what counts as the problem may depend on what sort of solution we have in mind.

    By way of example, we've seen that here just recently with an example of ambiguous phrasing ("the fact he married") which someone, at first, put down to a missing "that"—but, behold! inserting "that" merely causes another, related, ambiguity. Perhaps the problem is not the fact that there is a missing complementizer; or perhaps inserting "that" is not a good solution. Or perhaps it is—the second ambiguity may be less severe than the first (or perhaps it is more severe! It's a judgement call).

    On top of all this, there is a second whole major class of problem with generalised prescriptions, also a kind of problem of over-generalisation, but this time involving readers and their reading, rather than writers and their texts.

    Readers differ. They have different tastes. They have different backgrounds. They have different grammatical and stylistic idiosyncrasies and preferences. They have different preconceptions. They have different criteria for deciding what to pay attention to and what to skip or skim. They make different mistakes. They differ in what they will reliably infer and what they have to be told or guided to.

    The extreme case of this is the various examples we've seen here of completely idiosyncratic quirks of individual grammars or stylistic preferences being promoted as universal rules of writing. But the problem is present even with more widely shared preferences. In this thread, we've seen mention of prescriptions demanding the presence of "that" and other prescriptions forbidding it. These readers are not both going to be satisfied. By anything.

    Of course, readers (including editors, colleagues, critics and reviewers) are entitled to complain about writing they don't like, just as any consumer is entitled to grumble about any product. But some of those grumbles are likely to be both more widely shared and more easily addressed than others. Consequently, a sensible reader is going to pay some attention to weighing the importance of each of their particular gripes, before wading in handing out advice in all directions. And a silly reader is going to demand attention to their least whims. But no reader is going to be able to claim that any of their complaints is the True, Scientific Complaint. Even widely shared grumbles are not universally shared, and few grumbles are all that widely shared. To some extent, readers and writers have to adapt to each other. Not only should writers try to please their audiences (or at least avoid driving them away), but readers need to (and do!) select what they want to read, and adapt their expectations to what is available.

    I think these two problems are fatal for any attempt to come up with a Scientific Prescriptivism. This is not to say there aren't interesting facts to discover about the way writers achieve their effects and readers respond, or about the way that writers try to convey meanings and indicate grammatical and discourse structure, while readers try to extract meanings and structure from the stream of words in front of them. But I do not see how discoveries in these areas can result in universal prescriptions (as opposed to piecemeal critique). Any really accurate "rules" one might extract would surely be too complex to follow in practice, involving some huge chain of decisions based on what the writer is trying to achieve and what they expect their audience to be like (which is itself rather unpredictable).

    So truly general writing advice is not going to be like "don't use adjectives" (which is plainly silly) or even "that before a complement prevents ambiguity" (nothing truly prevents ambiguity, and ambiguity arises in complex and variable ways which depend subtly on surrounding context and particular readers). It's going to be things like "don't patronise your audience by talking down to them", "don't try to impress your audience by talking over their heads", "read through what you've written and make sure it makes sense", "say something substantive rather than reaching for stock phrases", "don't pretend to understand things when you don't", etc. These are, I think, good pieces of advice (large chunks of Fowler seem, to me, to be trying to impress maxims like these on the reader). But they are not so much writing advice as applications to writing of general maxims on how to be a decent human being—don't be rude, arrogant or lazy, or try to pass off shoddy goods as quality stuff. Which is advice better directed at prescriptivists than likely to issue from them.

  21. Ran Ari-Gur said,

    January 5, 2009 @ 7:31 pm

    @Tim Silverman: I think you've accurately summarized how some descriptivists use the term "prescriptivist", and while I don't agree with your use of that term, I do agree with your comments about what you call prescriptivism (and its promulgators). But prescriptivism is a lot broader than that, and I think you risk attacking a straw-man position and thereby rendering your comments irrelevant.

  22. John Cowan said,

    January 5, 2009 @ 8:18 pm

    But if my character is tall, florid, and overbearing, how would I describe him except as "a tall, florid, and overbearing man"?

  23. sptrashcan said,

    January 6, 2009 @ 10:20 am

    He had legs like stilts, a face like a Braeburn apple, and the manners of a bull?

    (Maybe I'm not getting the joke…)

  24. Arnold Zwicky said,

    January 6, 2009 @ 11:40 am

    John Cowan's "tall, florid, and overbearing" comment would appear to be a follow-up to the part of Tim Silverman's long comment that dealt with adjectives and adverbs.

  25. Tim Silverman said,

    January 6, 2009 @ 11:58 am

    And sptrashcan's comment is notably adjective-free …

  26. Dal Jeanis said,

    January 7, 2009 @ 4:26 pm

    The "stilts/Braeburn/bull" sentence avoids adjectives, but it consists of three different similes that do not work together to form a single impression.

    Fad editors who complain about the use of adjectives would also complain about the "tall, florid and overbearing" sentence because it is "telling, not showing". Of course, a smart writer should "tell" any point that isn't important enough to dramatize.

    The best rule is, as in the "that" discussion, to use words when they are appropriate and helpful to the reader, and when they make sentences that sound good in the mind.

    Although the current fad is to omit anything that can be done without, I return the word "that" to any sentence where it makes the meaning clearer or the sound more flowing.

  27. David Marjanović said,

    January 9, 2009 @ 6:50 pm

    It's surely not being "prescriptivist" to suggest that, usually, inclusion of the word "that" is less likely to give cause to unintentional ambiguity and consequently should be "recommended", as it is in German, French and Spanish so far as I know.

    As mentioned above, it's not merely recommended in German and French (no idea about Spanish). In these two languages, you can say "I know, George is a good man", but that's something different, with different intonation, usually a pause, and different intended emphasis ("yes, sure, I know, George is a good man, that goes without saying, no need to remind me all the time").

    The complication in German that is alluded to above is that "that" triggers verb-final word order: ich weiß, dass Georg ein guter Mann ist — as opposed to ich weiß, Georg ist ein guter Mann.

  28. Merri said,

    January 13, 2009 @ 11:05 am

    I feel - and I'm not alone - that (!) there is a pragmatic distinction between 'I know George is a good man' and 'I know that George is a good man'.
    The word 'that' sounds rather sharp; it makes sentences with a non-necessary 'that' stronger in meaning.

    For this reason, if no other, disallowing one or the other construction would be wrong.
