Wednesday, 24 October 2012

Now the games are over – what’s our legacy? How can we inspire a generation?

London 2012 Olympic Park
I really didn’t expect to love the Olympics as much as I did. I’m not really into sport and have never taken much interest before, but this time I was absolutely enthralled. London being the host city obviously had a bearing on my enthusiasm and enjoyment, but what now that the Games are over? A lot of the talk centred around the London 2012 ethos of ‘inspiring a generation’, and this got me thinking about NAG’s legacy and how we too can inspire.

NAG has had a busy year as well; we've managed to produce significant new releases of our four main numerical software products: the C and Fortran Libraries, the Library for SMP & Multicore and the Toolbox for MATLAB®. If you study the history of the Library you start to understand how truly amazing it is. So is the Library NAG’s legacy? Well, funnily enough, our legacy is right there in each of the new releases. That’s why the NAG Library is amazing. It’s an ever-changing, ever-improving set of codes developed by people who work here, or contributed by well-known – and maybe not quite so well-known – numerical analysts, statisticians and computer scientists from all around the world.

Back in the day – 1970 to be precise – Brian Ford, NAG’s Founding Director, was inspired with a vision of a collective inter-university numerical library and set about creating the first NAG Library. Mentored by the legendary Jim Wilkinson, Brian established NAG as an organisation that was, and still is, not-for-profit. Brian said of Jim,
“he gave us his invaluable numerical linear algebra software, and his contacts".
Brian's key ideals were voluntary collaboration and quality in every phase of the activity. Over the years mathematical and statistical routines have been contributed by some highly regarded people:

Friday, 12 October 2012

The making of “1000x” – unbalanced supercomputing

I had a bit of a rant in my article published at HPCwire this week – “Chasing 1000x: The future of supercomputing is unbalanced”.

The gist of my rant is that the supercomputing community pays great attention to the next factor of 1000x in performance – and I firmly agree that the next 1000x is highly valuable to the HPC community and the wider economy. But we should give equal attention to 1000x in other areas, notably ease-of-use and growth of the user base. And, critically, we should give equal peer recognition to those promoting that growth and pursuing ease-of-use and ease-of-access, rather than reserving all our “visionary” accolades for those figuring out the details of the first exascale computers.

However, I planted an appropriate pun in the title of the article itself. The obvious meaning, in the context of the article, is that the future of supercomputing is unbalanced with respect to the focus on performance versus community growth. But the double meaning should be readily recognizable to anyone actively watching or developing the HPC technology roadmaps. The future of supercomputers is unbalanced – i.e., the broad summary of the many technology roadmaps out there is that future supercomputers (at all scales) will be less balanced in some key performance attributes than current technology.
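One common way to quantify this kind of balance is "machine balance": the memory bandwidth available per floating-point operation (bytes per flop). The sketch below illustrates the idea with purely made-up system figures (not vendor specifications) – the point is only that when peak flops grow faster than memory bandwidth, the bytes-per-flop ratio shrinks and the machine becomes less balanced:

```python
# Machine "balance" is often expressed as memory bandwidth divided by
# peak floating-point rate (bytes per flop). The systems and figures
# below are illustrative assumptions, not real hardware specifications.
systems = {
    "older system": {"peak_gflops": 100.0, "mem_bw_gbs": 50.0},
    "newer system": {"peak_gflops": 1000.0, "mem_bw_gbs": 200.0},
}

for name, s in systems.items():
    # GB/s divided by Gflop/s gives bytes per flop.
    balance = s["mem_bw_gbs"] / s["peak_gflops"]
    print(f"{name}: {balance:.2f} bytes/flop")
```

In this toy example the newer system delivers 10x the flops but only 4x the bandwidth, so its balance drops from 0.50 to 0.20 bytes/flop – exactly the trend the roadmaps point to.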