Saturday, 30 October 2010

Comparing HPC across China, USA and Europe

In my earlier blog post today on China announcing the world's fastest supercomputer, I said I'd be back with more later on the comparisons with the USA, Europe and others. In this morning's blog, I made the point that the world's fastest supercomputer, in itself, is not world changing. But leading supercomputers, critically matched with appropriate expertise in programming and using them, together with the vision to ensure use across basic research, industry and defence applications, can indeed be strategically beneficial to a nation - including real economic impact.

There are plenty of reports and studies describing the strategic impact of HPC within a given organisation or at national levels (some are catalogued by IDC here), so let's take it as a premise for the following thoughts.

With this in mind, there are some comparisons to be made between the approaches to supercomputing across the USA, Europe and China.

The USA has long enjoyed near total dominance of the hardware technology underlying the leading supercomputers. The USA has invested repeatedly in ensuring that American supercomputer manufacturers have the technology to deploy the world's largest supercomputers. The last time the USA lost the public leadership crown of the fastest supercomputer, a huge investment in American supercomputer technology followed. As well as national support for development, the strong implicit requirement for USA organisations to "buy American" ensures a continued USA supercomputer manufacturing industry. As a result of the sheer size of the country, the USA has a large HPC R&D community. There is also significant usage of HPC in American industry, and additional support for this through government initiatives like INCITE.

Despite the recent growth of Bull, with their highly-rated supercomputers and in-house HPC expertise, Europe has not seen it as essential to have a home-grown supercomputer manufacturing industry (beyond component-level R&D, which is strong in Europe). Europe has always highlighted its expertise in applied HPC software development and in software-related HPC R&D as its distinguishing strength on the international stage. European organisations have always bought supercomputers from around the world - and since they are likely to continue to enjoy access to products from around the world, they have no need to develop independent supply from wholly European sources.

But the news of China's investment in hardware, software and people (and its stated ambition of independence of supply) should send a clear message to the USA, Europe and others that they cannot rely on their continued leadership of HPC. And thus the economic benefits of HPC might soon be driving Chinese growth rather than the European or American economies.

There is one other potentially killer advantage that China might have. All the predictions of the path to the next levels of supercomputer performance (Exascale) involve major changes in the technology - much greater levels of parallelism (we are seeing this now with GPUs), memory performance challenges, resilience, etc. As Dave Turek of IBM has said before, China is mostly not hindered by legacy code - it can start anew with the best HPC ideas and methods of today, looking to the future. In the USA and Europe, our obsession with "protecting our investment" in established applications means we first have to figure out how to get from yesterday's software technology to current methods, and only then to the future. Is "protecting our investment" actually constraining our future?


[More on legacy code and revolutions vs evolution in a blog coming here soon ...]

Friday, 29 October 2010

Why does the China supercomputer matter to western governments?

There has been a lot of fuss in the mainstream media (BBC, FT, CNET, even the Daily Mail!) over the last few days about the world's fastest supercomputer being in China for the first time. And much ado on Twitter (me too - @hpcnotes).

But much of the mainstream reporting, twitter-fest and blogging is missing the point, I think. China deploying the world's fastest supercomputer is news (the fastest supercomputer has almost always been American for decades, with the occasional Japanese crown). But the machine alone is not the big news.

Imagine that China announced a new prototype passenger aircraft, at half the cost of the latest Boeing or Airbus. It has 50% greater fuel efficiency too. And an order of magnitude better predicted reliability. That would be major news. Sure, it uses a lot of US-designed components too.

But what if China announced this new aircraft wasn't just a prototype. It was a commercially available product now. And they have the capacity to make lots of them - faster than Boeing or Airbus. And they have a plan to train huge numbers of future aircraft maintenance engineers, aerodynamic designers, structural engineers, etc. In other words, China can not only build a world-beating passenger aircraft, but it is building the capability to do so without US-designed components in the future. And it is building the expertise capacity to be a world leader in aircraft maintenance.

That would be very important.

And, while we are not quite there yet, that is where this China supercomputer news is going. American scientists and HPC professionals have been calling for rounded investment in people and software, not just hardware, for years. Europe has been proud of its relative HPC software expertise, but the recent IDC-led EU HPC strategic recommendations report shows that much more investment is needed.

The mass commentary talks about the Chinese hardware milestone. But public material from Chinese experts also talks about a plan to deploy several top supercomputers, to train huge numbers of HPC programmers, to invest in applications development and commercial use of HPC, and to develop end-to-end nationally independent supercomputing technology.

If that happens, then China will have the ability to develop that super aircraft industry. And automobile. And the many household products that are designed with supercomputers. And materials science. And ...

Get the point? It's not the world's fastest supercomputer that matters most. It's not just national pride. It's the ambition and comprehensive plan behind the world's leading supercomputer that matters.




[More later today on the comparison with USA, EU and others]

Friday, 22 October 2010

Carbon Footprint

In these austere times I am always on the lookout for ways to reduce our expenditure without reducing our service to customers. NAG have just been awarded a matched-funding grant of £1000 from http://www.sustainableroutes.co.uk/ to help us reduce our carbon footprint, and we intend to use the grant (and some) to improve our video conferencing and cut down on travel costs and CO2 for meetings and training. Currently we are looking at some state-of-the-art video conferencing equipment which we hope to start trialling with our Manchester office. There are some other useful tips and grants available on the Sustainable Routes website, so I would recommend taking a look.

Tuesday, 19 October 2010

Source-level debugging of Python in Emacs

To help me track problems in Python code (yes, even sometimes in my own...) I usually rely on good old print/trace debugging. Owing to Python's speed of interpretation and execution, this is a pretty convenient approach. Python does have its own interactive debugger though, pdb, for those odd occasions when it's desirable to poke about in a program while it's running. The debugging mode in Emacs even supports pdb by default, but there's a little snag: you probably don't have a pdb in your path, so M-x pdb will just fail. The solution? Add a pdb script to your path and make it executable:
#!/bin/sh
# Ask Python where its pdb module lives, then hand our script (and any
# arguments) straight to it. Quoting pdb_path guards against odd paths.
pdb_path=`python -c "import pdb, sys; sys.stdout.write(pdb.__file__ + '\n')"`
exec python "${pdb_path}" "$@"
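Incidentally, the wrapper above is essentially a stand-in for running pdb as a module with python -m pdb. A minimal sketch of that equivalence, driving the debugger non-interactively from Python (the throwaway script and the "c"/"q" commands fed to the debugger are purely for illustration):

```python
import os
import subprocess
import sys
import tempfile

# A throwaway script to debug (illustration only).
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write("x = 1 + 1\nprint(x)\n")
    path = f.name

# "c" tells pdb to continue to the end; "q" quits when pdb offers a restart.
result = subprocess.run(
    [sys.executable, "-m", "pdb", path],
    input="c\nq\n",
    capture_output=True,
    text=True,
)
os.unlink(path)

print(result.stdout)
```

The output interleaves pdb's prompts with the script's own output, so you should see the printed value 2 amid the (Pdb) chatter - the same session you would get at the wrapper's prompt in Emacs.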

Thursday, 7 October 2010

2010: A Retail Store Odyssey

Stanley Kubrick’s “2001: A Space Odyssey” was the epic 1968 science fiction film that explored human evolution, technology and artificial intelligence with both realism (and surrealism), and it remains one of the top films of all time. In it, two astronauts battle the computer HAL for control of their spaceship and for their lives while investigating a series of strange monoliths left by an earlier civilization. For many years and for many people, the film has been symbolic of our struggle to master, and not be mastered by, computers. Kubrick and his co-author Sir Arthur C. Clarke were both brilliant and far ahead of their time. In many respects, they still are.


In 2010 we are certainly wrestling with computers that occasionally seem to get the better of us. In our world, computers are ubiquitous and the software behind them has a pervasive impact on our daily lives, but hardly in the way Kubrick and Clarke envisioned. Consider this absolutely mundane sequence of events, at least in the developed world. We hop into our car to go shopping for groceries. As we turn the key, one or more microprocessors start, employing sophisticated software to optimize efficiency, performance and emissions. Our cars talk to us, connecting phone calls routed through a cellular network. They entertain us with hundreds of channels from a satellite radio connection and give us visual and voice directions to where we want to go. All of this amazing hardware comes to life through the software that makes it work. And, of course, the software makes considerable use of mathematics to accomplish what it does for us. So, you may ask, what’s the reference to a “retail odyssey”? We haven’t even gotten to the grocery store yet.

In my view, the most amazing thing at the Sainsbury’s I frequent in the UK or the Dominick’s near home isn’t the checkout where they let me scan my items and coupons and pay with my credit card, all with a few taps of the touch-screen. It’s the realization that I’ve just walked through a store with literally thousands of unique items to meet my needs, each residing in a database linking the bar code on the package with a price, an inventory level, a cost, a supplier and even a “loyalty card” database that permits analysis of which shoppers bought which products and in which combinations. While you are pondering this miracle of modern technology, ponder this question: who set the price of the 2-pound package of Folgers coffee at the end of Aisle 2, and how did they do it?

The answer, if it’s not already obvious, is sophisticated software from companies like NAG partner DemandTec (NASDAQ: DMAN) whose demand management software is helping retailers and manufacturers worldwide optimize revenues, prices and inventories. We’ve worked with DemandTec since 2004, providing them sophisticated mathematical and statistical software to enable their application to help retailers manage demand.

One of the benefits of working with cutting-edge companies like DemandTec is that we get to participate in events such as the one I have been involved with lately. NAG has partnered with DemandTec to sponsor the Chicago Regional scholarship competition in a national event called the DemandTec Retail Challenge. In it, high school seniors in teams of two get to apply their problem-solving and mathematical skills as pricing analysts managing an assortment of products in a grocery store. They set the price, order inventory and make decisions about promotions in a simulation-based competition with other teams in the Chicago area. The eventual winners will have maximized profitability and successfully communicated their approach to the problem to experts in the field. The winners earn a college scholarship and the right to compete in the national version of the contest at NASDAQ in New York City in January 2011. The national champions get a significant additional scholarship and the right to ring the closing bell for the trading day. For those of us at NAG it’s both a way of giving back to the community and helping the next generation apply their academic skills to real-world problems. From the conversations I’ve had with them thus far, I suspect that the computer HAL would be no match for them.