Computers to become 10 times more powerful in 2015

« previous post | next post »

With respect to a headline in the Washington Post yesterday (Jason Samenow, "Weather Service forecasting computers to become 10 times more powerful in 2015", Washington Post 1/5/2015), Eugene Volokh writes:

My first thought:  Come now – how would computers generally become 10 times more powerful just in the span of a year?  (In the span of five years, according to Moore’s law, maybe).  

My second thought:  Since when is the Weather Service forecasting trends in computing technology?  

My third thought, shamefully after I clicked on the link:  Ah, it’s the Service’s computers used for forecasting that are going to be upgraded to top-of-the-line models.

The start of the article:

The National Weather Service’s primary computer model trails competitors in Europe in overall forecasting accuracy.  But today it announced upgrades to its supercomputers that hold great promise to improve its predictions.  

By October this year, the capacity of the two National Weather Service (NWS) supercomputers will increase by nearly a factor of ten, it said.  

“By increasing our overall capacity, we’ll be able to process quadrillions of calculations per second that all feed into our forecasts and predictions,” said Louis Uccellini, director of the National Weather Service, in a press release. “This boost in processing power is essential as we work to improve our numerical prediction models for more accurate and consistent forecasts required to build a Weather Ready Nation.”



  1. Rich Daley said,

    January 6, 2015 @ 7:36 am

    A minor note: Moore's law would predict a 10x increase in less than 4 years (2^4 = 16x).

    [(myl) We need log2(10) = 3.321928 doublings to get an improvement of 10 times. With Gordon Moore's estimate of transistor-number doubling every 2 years, we'd need 2*3.321928 or about 6.6 years for a factor of 10. But I think that Prof. Volokh is working with the doubling period of 18 months, as estimated by David House [Wikipedia: "The period is often quoted as 18 months because of Intel executive David House, who predicted that chip performance would double every 18 months (being a combination of the effect of more transistors and their being faster)"].

    1.5*log2(10) = 4.982892, which is close enough to "five years"…

    You're assuming that the doubling period is one year, which is faster than any credible estimate that I'm aware of.]
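
    The arithmetic in the reply above can be checked quickly. Here's a minimal Python sketch (the function name is mine, for illustration):

```python
from math import log2

def years_for_factor(factor, doubling_period_years):
    """Years needed for a `factor`-fold increase, given a fixed doubling period."""
    return doubling_period_years * log2(factor)

# Gordon Moore's 2-year doubling: log2(10) ~ 3.32 doublings needed
print(round(years_for_factor(10, 2.0), 2))  # 6.64 years for a factor of 10
# David House's 18-month doubling: close enough to "five years"
print(round(years_for_factor(10, 1.5), 2))  # 4.98 years
# Even a (too-fast) 1-year doubling period still takes more than 3 years
print(round(years_for_factor(10, 1.0), 2))  # 3.32 years
```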

  2. Brett said,

    January 6, 2015 @ 10:36 am

    Besides increases in the speeds of individual transistors, there's another reason why processing power increases more rapidly than the density of transistors on a chip (which was what Moore originally said was doubling every two years): chips get physically bigger over time as well.

  3. David J. Littleboy said,

    January 6, 2015 @ 10:51 am

    As a computer nerd, I had no trouble with the title of the article. That's because I realize that big organizations replace their computer infrastructure on a rather longer time scale than peecee/cell phone users, and the supercomputer blokes with infinite budgets, unlike the peecee processor mfrs, actually are making faster and faster machines. The first two words of the title make it linguistically fine for people who have even a vague understanding of supercomputers and the horrendous computational requirements involved in weather forecasting.

    From the peecee user's perspective, though, Moore's law has been on hold for an age: the quad-core i86 running at 3 GHz has been the best you could get in a PC/Mac for about 10 years. This is all very depressing, since for a while (1990 to 2004), the peecee was beating the pants off the infinitely expensive supercomputers (at least in terms of relative improvements), and that was quite fun.

    Of course, Moore's law isn't about the _power_ of computers, it's about the number of transistors on a chip, and it's harder to run circuits fast when there's more of them on a chip. That's why it's mobile devices (which I mostly don't use) that have seen the fun advances in the last few years.

    There are also some nasty gotchas in basic computer science: doubling the size of the on-chip memory (the caches) provides only a very small increase in performance once you get past a certain size, and doubling the number of processors only doubles performance for a specific class of applications (chess programs, for example, see only small (but still significant) gains from extra processors), and even then memory bandwidth quickly limits those gains. So Moore's law (which was originally stated as 18 months to double, but quickly became 2 years to double in real life) doesn't double _processing_ power.

    Sorry about the nerdiness of the above; a bit overmuch for language log, I'm afraid.

  4. chips mackinolty said,

    January 6, 2015 @ 11:08 am

    "infinitely expensive supercomputers"?

    Surely a nerdy exaggeration?

  5. Howard Oakley said,

    January 6, 2015 @ 11:43 am

    Bad news for the NWS: late last year, the UK Met Office announced that it was replacing its now 'slow' supercomputer with the latest, fastest beast. It sounds like the NWS is chasing a moving target…

    [(myl) In the digital hardware arena, we're all chasing a moving target.]

  6. EndlessWaves said,

    January 6, 2015 @ 5:39 pm

    @David J. Littleboy
    "From the peecee user's perspective, though, Moore's law has been on hold for an age: the quad-core i86 running at 3 GHz has been the best you could get in a PC/Mac for about 10 years. "

    The state of the art from Intel on the 6th of January 2005 was the single-core Pentium 4 Extreme Edition at 3.73GHz.

    True, later that year would see the launch of dual cores, and the first 3GHz quad core, the Core 2 Extreme QX6850, appeared two years later in mid-2007.

    But that was an £800, 286mm² chip, the current equivalent of which is the eight core 3.5Ghz i7-5960X.

    The first sensibly priced 3Ghz quad core showed up around 2009/2010, five years ago, with the i5-750 and Phenom II X4 940.

    Performance since then hasn't improved as much as you'd expect, but look at the advancements elsewhere.

    Eight years ago the lowest-power quad core was the 2.4GHz Q6600, which required a cooling system capable of handling 125W; these days you can pick up a tablet with a 2.4GHz quad-core Z3795 which will run in a 2W thermal envelope.

    Granted, it's 'only' half as fast, but at 1/50th of the heat output (and similarly reduced power consumption) that's some pretty impressive technological advancement.
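
    The performance-per-watt claim works out roughly like this (a back-of-the-envelope Python sketch using the comment's figures, which are rough characterizations rather than benchmarks):

```python
# Rough figures from the comment: the Q6600 as the baseline (125W),
# the Z3795 at about half the performance in a 2W thermal envelope.
q6600_perf, q6600_watts = 1.0, 125.0
z3795_perf, z3795_watts = 0.5, 2.0

improvement = (z3795_perf / z3795_watts) / (q6600_perf / q6600_watts)
print(improvement)  # roughly a 31x gain in performance per watt
```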

    And that's where the priorities have been in the last few years. Lighter, smaller and thinner devices are now a much more obvious improvement for many people than more processing power.

    Once the technology has been pushed to (or past) the point of usefulness in that direction the focus might return to performance or it might go off in a third direction but there's certainly been no halt in advancement.

  7. Bob Ladd said,

    January 6, 2015 @ 6:08 pm

    Surely this post should be classified under "Crash blossoms" as well as "Language and the media", no?

    [(myl) I thought about that, but decided that in the original sense (where a "crash blossom" creates an unintended noun phrase), this one didn't count. But if we generalize the term to include any garden-pathy headline, then OK. So I added "crash blossom", since we don't have any better category, despite having way too many categories.]

  8. TheStrawMan said,

    January 6, 2015 @ 6:25 pm

    I had no trouble parsing this headline.

    Also: "peecee"?

  9. Gregory Kusnick said,

    January 6, 2015 @ 9:38 pm

    I'm surprised nobody has yet suggested that the headline should be read in binary.

  10. David J. Littleboy said,

    January 6, 2015 @ 9:59 pm

    "Lighter, smaller and thinner devices are now a much more obvious improvement for many people than more processing power."

    Well, yes. But if the claim is that there's some increase in processing power, then that's not being coughed up. (Thanks for the i86 history detail; you're right, it's not quite as bad as I thought.) I have folks like Kurzweil and hyperventilating tech reporters in mind. I translate press releases that overstate the claims for increased computational power quite regularly…

    "'infinitely expensive supercomputers'? Surely a nerdy exaggeration?"

    Not by much. These things are in the hundreds-of-millions-of-dollars range for generic commercial ones (with running costs in the tens of millions per year range), and who wins this round of the who's-got-the-fastest-machine game is largely a matter of how much your government thinks bragging rights are worth (the machine at RIKEN set the Japanese government back $1 billion). And they go obsolete about as fast as iPhones.

  11. maidhc said,

    January 6, 2015 @ 11:55 pm

    The big advance in computing power in the last few years has been the use of graphics processors in non-graphics applications (general purpose graphics processing units, GPGPUs). This makes it possible to apply massive amounts of parallelism at a relatively low price. (Example)

    In a system like this, general-purpose microprocessors are only used to set up the system and the GPUs do the real number-crunching.

    This approach is only applicable to certain types of problems, but weather forecasting would certainly be one of those.

  12. errorr said,

    January 7, 2015 @ 12:26 am

    The NWS has been horribly behind the times due to budget constraints. Most of the NOAA comps went to climate change research and not operational forecasts.

    This blew up when they had the worst Sandy forecast compared to the Euro. By law all the runs are public as well while the joint Euro funds itself by user fees and can afford the new toys regularly.

    The benefit will mostly be in the 5-7 day range and perhaps 2nd week forecasts.

  13. Alan Palmer said,

    January 7, 2015 @ 7:09 am

    I'm British and had no trouble understanding. There are probably a couple of reasons for this, though. As already mentioned in the comments above the UK Met Office only recently announced the replacement of its supercomputers (in fact I thought at first this was the same story). Also the British media tend to produce crash blossoms and the like in headlines quite frequently, so I'm used to decoding them.

  14. George said,

    January 7, 2015 @ 7:49 am

    I got this right away too and, like others, didn't immediately see where the misunderstanding could come from. I think that's at least partly because, had the headline been intended to express the idea that the Weather Service is predicting something, the '-ing' would have been lopped off as well as the 'is'.

  15. Catherine Lincoln said,

    January 7, 2015 @ 11:54 am

    "Weather Service's forecasting computers" would have quickly fixed any crash blossoming.

  16. Steve said,

    January 7, 2015 @ 2:35 pm

    I am easily led astray by garden paths as a general matter, but was not by this one.

    In fact, I began to get paranoid and wonder if the seemingly obvious reading (that the weather service was replacing outdated computers with new ones, ten times more powerful than the old ones) WAS itself a garden path, and if the true meaning was that the weather services's computers were predicting that something weather-related (but not actually mentioned in the title) was going to be ten times stronger in '15. But I couldn't see a plausible way to get there from the title.

  17. Mike Maxwell said,

    January 7, 2015 @ 7:34 pm

    @Gregory Kusnick: with this posting, 10 of us have suggested binary.

  18. chh said,

    January 8, 2015 @ 1:34 am

    Mark, can "forecast" normally take a to-infinitive complement for you?

    Something like "They're forecasting/They forecast the temperature to drop" is not so good for me. I totally get the wrong reading you're talking about here, although I'm not sure it's the first reading I'd go for.

    It's funny how we can initially parse something in a way that leads to both a weird meaning and a weird syntactic structure.

  19. Rich Daley said,

    January 8, 2015 @ 6:28 am

    Mark: you are right, of course. Please ignore my hurried and inaccurate pedantry.

  20. Elonkareon said,

    January 8, 2015 @ 7:19 am

    Catherine Lincoln: Except that headlinese normally eschews possessive markers in favour of attributive nouns, as here.

  21. Elonkareon said,

    January 8, 2015 @ 7:22 am

    Also, the possessive marker here does nothing to solve the problem, since "Weather Service's forecasting computers" can also be read "Weather service *is* forecasting computers", which is the same alternate reading that engendered this post.

  22. Azimuth said,

    January 12, 2015 @ 8:54 pm

    I take it the increasingly formidable weather calculations will require more electricity to perform—perhaps not ten times as much, but there is also the air conditioning. So these new computers will themselves help to generate the very climatic changes they labor to predict….

  23. Asad said,

    January 13, 2015 @ 3:58 am

    As a software engineer, I had a bit of difficulty with the title because I was writing content as well as reading your article. Finally I read the whole article; you are right about the idea that you presented, and I also agree with @Azimuth.

  24. Elonkareon said,

    January 14, 2015 @ 9:45 am

    @Azimuth Because of Dennard scaling and other improvements in efficiency, these new systems are unlikely to require more power than the old ones. They may even consume less electricity than the old machines.

  25. Ellie Kesselman said,

    January 16, 2015 @ 10:45 am

    David Littleboy: I like your website! You were clearly a very sweet child:

    peecee was novel usage to me. I "parsed" it as a way of typing "politically correct" in order to evade the spam filter at first glance. Has LL done any posts on l337 speak, I wonder….

RSS feed for comments on this post