For a teenage boy, according to this joke, the idea of cleaning up his own messes is so alien that learning to understand its expression in simple English is part of learning a foreign language. I suspect that the stereotype is at least somewhat unfair, in terms of age as well as sex; but this comic strip also mocks (and thus illustrates) a common tendency to equate language and thought.
The Sapir-Whorf hypothesis, as advanced by its original authors, involved the rather limited idea that the obligatory morpho-syntactic categories of a language influence the habitual thought patterns of its speakers. Even in this form, the hypothesis remains controversial. But it's often distorted into various extreme and easily-refuted forms, such as the idea that a culture's interest in X can reliably be calibrated in terms of its number of monomorphemic words for X-related concepts; or worse, the truly bizarre idea that if you can't express a concept in a single word, then you can't understand it.
It's easy to see why people fall into these fallacies, because there are closely-related concepts that are pretty clearly true. I encountered one example a few months ago in reading James Flynn's book What is Intelligence? (for the context, see "One question, two answers, three interpretations", 8/14/2008).
Flynn writes (p. 146):
IQ gains are less than half the story of the cognitive history of the twentieth century. There are other intellectual qualities, namely, critical acumen and wisdom, that IQ tests were not designed to measure and do not measure and these are equally worthy of attention. […]
There is one encouraging development. Over the last century and a half, science and philosophy have invaded the language of educated people … by giving them words and phrases that can greatly increase their critical acumen. Each of these terms stands for a cluster of interrelated ideas that virtually spell out a method of critical analysis applicable to social and moral issues. I will call them shorthand abstractions (or SHAs), it being understood that they are abstractions with peculiar analytic significance.
He gives a sort of top-ten list of SHAs along with "the date they entered educated usage" according to the OED. His list is market, percentage, natural selection, control group, random sample, naturalistic fallacy, charisma effect, placebo, falsifiable/tautology, and tolerance school fallacy.
[The last of these terms somewhat undermines Flynn's optimism about the historical trend, since it's his own invention, and as he wryly notes, "somehow my coining this term has not made it into common currency". It means "the fallacy of concluding that we should respect the good of all because nothing can be shown to be good".]
Anyhow, Flynn's idea is that these words and phrases are the outward and visible sign of increasing "critical acumen and wisdom" in the culture that developed them. Some people will surely disagree, but I'm not among them.
In fact, I'd like to add a few things to Flynn's list, starting with a list of shorthand abstractions that have to do with ways of thinking and talking about properties of sets, and have been featured in many Language Log posts over the years. The outward and visible signs of this set of interrelated concepts are terms like odds ratio, percentile, standard deviation, effect size, and contingency table. Here's an illustrative sample of relevant posts: "Thou shalt not report odds ratios", 7/30/2007; "Gabby guys: the effect size", 9/23/2006; "The 'gender happiness gap'", 10/4/2007; "The Pirahã and us", 10/6/2007; "Scrupulously avoiding sigma", 3/2/2008; "Is autism the symptom of an 'extreme white brain'?", 3/26/2008; "Steven D. Levitt: pwned by the base rate fallacy?", 4/10/2008.
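To make a couple of these terms concrete, here's a minimal sketch in plain Python of the kind of set-property calculation they name: an odds ratio and a risk ratio computed from a 2×2 contingency table. The numbers are made up for illustration, not taken from any of the posts above; the point is that the two ratios diverge even on the same table, which is exactly the confusion that "Thou shalt not report odds ratios" warns about.

```python
# Made-up 2x2 contingency table: rows = group, columns = outcome (yes / no)
table = {
    "group_a": {"yes": 30, "no": 70},
    "group_b": {"yes": 20, "no": 80},
}

def odds(row):
    """Odds of 'yes' within one group: yes / no."""
    return row["yes"] / row["no"]

def risk(row):
    """Probability ('risk') of 'yes' within one group."""
    return row["yes"] / (row["yes"] + row["no"])

odds_ratio = odds(table["group_a"]) / odds(table["group_b"])
risk_ratio = risk(table["group_a"]) / risk(table["group_b"])

print(f"odds ratio: {odds_ratio:.2f}")   # (30/70) / (20/80) = 1.71
print(f"risk ratio: {risk_ratio:.2f}")   # 0.30 / 0.20 = 1.50
```

Reporting the 1.71 as if it were the 1.50 overstates the group difference, and the gap between the two widens as the outcome gets more common.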
Although these are all statistical terms, I'd argue that the associated concepts belong to linguistics in the same sense that the concepts developed by logicians for talking about truth and consequences do. (These include things like sense and reference, de re vs. de dicto, entailment, implicature, quantifier scope, and so on.)
The crucial ways of thinking and talking about the properties of sets don't require anything beyond junior-high-school math to understand. But the associated methods of critical analysis seem still to be mostly lacking among the sample of educated people represented by journalists in general, science journalists in particular, and even scientists writing for a general audience. As a result, genuine scientific results are garbled, and emerge as misleading or entirely false general statements that guide public policy choices as well as decisions in private life.
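One concrete instance of such garbling is the base rate fallacy mentioned in the post list above. A short sketch, with made-up numbers, shows why an impressively "accurate" screening test can still yield mostly false positives when the condition it screens for is rare:

```python
# Illustrative (invented) parameters for a screening test
base_rate = 0.001           # 1 in 1,000 people has the condition
sensitivity = 0.99          # P(positive | condition)
false_positive_rate = 0.01  # P(positive | no condition)

# Bayes' rule: P(condition | positive)
p_positive = (sensitivity * base_rate
              + false_positive_rate * (1 - base_rate))
p_condition_given_positive = sensitivity * base_rate / p_positive

print(f"{p_condition_given_positive:.1%}")  # about 9.0%
```

A "99% accurate" test, applied to this rare condition, leaves roughly nine out of ten positive results false; the arithmetic is junior-high-school level, but the conclusion routinely gets garbled in popular reporting.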
Paradoxically, you could argue that the net impact of statistical concepts on public discourse in our society has been negative. They underlie scientifically valid forms of argument, but the authority of these arguments is then abused in discussions among people who overwhelmingly misunderstand and misuse them.
I wouldn't go that far — statistical reasoning has surely done far more good than harm, overall — but it's certainly long past time to increase the percentage of the population that understands it. However, it probably takes just about as much systematic practice to get comfortable with these ways of thinking and talking about the properties of groups as it does to learn to count and do basic arithmetic. So we'd be talking about a fairly large change in our culture's educational system.