Wednesday, 24 March 2010

My early career at NAG…

“Don’t worry, you’ve done a mathematics and finance degree, you have an engineering masters, you’ve programmed in Pascal and you can write VB macros. You’ll be just fine.”

“You’ve far more technical experience than any previous salesperson (the last manager used to sell used cars!). If you’re not careful you’ll end up in the technical division.”

Well with such resounding support and confidence from my manager how could I fail? Buoyed by a good track record at previous companies I now felt less daunted by this technology shift… industrial and electronic weighing scales to telecoms to numerical software… what could be more natural?

Thinking about my days selling weighing scales, I can recall the wise words of a previous manager, John Ruskin Utley (that isn’t quite his real name; in fact I suspect his middle name didn’t even begin with R).

“Listen to your customers, listen to your distributors/resellers and ensure those in technical development get the market feedback. Rest assured laddy if you don’t hit your sales target and you blame the product those in development will be asking why you didn’t communicate what you wanted when you had the chance.”

What’s so complicated about weighing scales I hear you ask? Let me give you a simple example… in 1998 £1 = 380,000 Turkish lira… clearly an electronic supermarket scale in Turkey is going to have the capability to display a few more digits than an electronic weighing scale in England!

How does this all relate to NAG? Well it’s all about listening to customers and distributors and feeding back market needs or not as you will see!

One of my first overseas trips for NAG was to one of our European distributors (who have since changed ownership and sales staff). While giving the distributor’s sales manager a hard time about the lack of sales, we chatted through the NAG product portfolio. “Give me your SWOT analysis on our products,” I prompted. The distributor said, “Well, I can’t sell the compiler as the users complain that it throws up too many error messages and won’t allow them to compile their code.”

“I’ll get to the bottom of this”, I said firmly. “Let me feed this back to technical management. I’m sure they’ll want to fix this especially if it is impacting the reputation of our product and limiting sales.”

Tee hee, ha ha. Well you can imagine the reaction of my technical colleagues. Oh how they must have laughed.

The NAG Fortran Compiler is the world’s best checking compiler. Clearly my distributor’s client was an inexperienced programmer, probably used to programming with a compiler that had limited error-checking capabilities. Of course the distributor (and this enthusiastic salesman, i.e. me) were subsequently given training about compilers generally and the NAG Fortran Compiler in particular.

“How fast do you want the wrong answer?” Well, I get it now… fast code is only any good if the answer is correct. NAG’s Numerical Libraries are very much based on the philosophy: accuracy first, performance second. Of course many of you will know, and be grateful, that NAG generally achieves both.

A few more anecdotes about compilers. Ian Chivers of Rhymney Consulting and Fortranplus (a well-known Fortran specialist) once told me, “of course any decent programmer has at least two compilers: the NAG Fortran Compiler for checking and perhaps another for comparison or perhaps speed.”

Finally, some background on the NAG Fortran Compiler. NAG developed this in its early days of Numerical Library development not because NAG wanted to become a compiler company, but because the compilers available were inadequate for building accurate, robust numerical libraries; by producing our own we were able to advance the Fortran language. In fact NAG produced the world’s first Fortran 90 compiler and still keeps pace with the latest Fortran standards, with lead developer Malcolm Cohen playing an active role on the Fortran standards committee.

Well, that’s quite enough from me in my first blog. More stories from an enthusiastic salesman next time ;-)

Tuesday, 23 March 2010

What’s the next revolution in technical computing?

It’s a question that absorbs the attention of the technical computing community, especially those working at the leading edge of technology and performance (high-performance computing, HPC). What is the next disruptive technology? In other words, what is the next technology that will replace a currently dominant technology? Usually a disruptive technology presents a step-change in performance, cost or ease of use (or a combination of these) compared to the established technology. The new technology may or may not be disruptive in the sense of a discontinuous change in user experience.

Why is identifying disruptive technology so important? First, those who spot the right change early enough and deploy it effectively can attain a significant advantage over competitors as a result of a substantial improvement in technical computing capability or reduction in cost. Second, identifying the right technology change in time can help ensure that future investments (whether software engineering, procurement planning, or HPC product development) are optimally spent.

However, in a field as fast moving as technical computing, spotting the next disruptive technologies of specific relevance to your individual needs can easily become a full time activity (which is why NAG helps to do this for others).

One very credible candidate for disruptive change in HPC right now is GPU computing (or related products that might be in development). However, at the Newport conference recently, the discussion turned to what the next disruptive technology to hit HPC would be (after the possible GPU disruption). One suggestion, made by John West (of InsideHPC fame), was that the next disruptive technology could be in software, especially programming tools and interfaces. This builds on the fact that parallel computing is no longer a specialist activity unique to the HPC crowd – parallel processors are becoming pervasive across all areas of computing, from embedded to personal to workgroup technical computing. Parallel programming is thus heading towards being a mass-market activity – and the mass market is unlikely to view what we have in HPC currently (Fortran plus MPI and/or OpenMP, limited tools, etc.) with much favour. I’m not knocking any of these, but they are not mass-market interfaces to parallel computing. So perhaps the mass market, through the sheer volume of people in need – and companies driven by economics – will come up with a “better” solution for interfacing with supercomputers.

As an HPC community we lost control of much of our hardware to the commodity market some years ago. Maybe we now face losing control of our software to the commodity community too.

Thursday, 18 March 2010

Extending, imitating and collaborating

As the name suggests, NAG is most widely-known for its library of numerical algorithms - in fact, it's been around for nearly forty years. Back when the first version was written, the coding language of choice was Fortran and, although it's still widely-used (particularly in high-performance computing), most developers today are working in a myriad of other languages and packages. Accordingly, we've put a lot of work into making the routines from our libraries accessible from these environments, in order to (hopefully) extend the areas in which our algorithms can be used by programmers when building their applications.

I was reminded of this extension into other environments recently when I saw a demo written by my colleague Marcin Krzysztofik. Marcin's background is in financial computation, and his application nicely demonstrated some of the models that have been developed for calculating the prices of financial options. New routines for evaluating the results from these models were added to the latest release of the NAG Library, but Marcin wanted to build his demo as an Excel spreadsheet, knowing that would be a familiar environment for users who'd be interested in this type of application. Calling NAG routines from within Excel - more specifically, from a Visual Basic for Applications (VBA) program - is straightforward, and Marcin's demo (see the screenshot above) allows the user to select the pricing model and set values for dependent parameters before calling the appropriate NAG routine. It then uses Excel's plotting capabilities to visualize the behaviour of both the price and the so-called Greeks, which provide a measure of the sensitivity of the price to the dependent parameters of interest.
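The pricing models and Greeks mentioned above can be made concrete with the classic Black-Scholes formula for a European call. The following is a minimal textbook sketch in Python, not NAG's routine: the names `bs_call` and `norm_cdf` are mine, and `s`, `k`, `r`, `sigma` and `t` are the usual spot price, strike, risk-free rate, volatility and time to maturity.

```python
import math

def norm_cdf(x):
    """Standard normal cumulative distribution function, via erf."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(s, k, r, sigma, t):
    """Black-Scholes European call price and its delta (one of the Greeks)."""
    sqrt_t = math.sqrt(t)
    d1 = (math.log(s / k) + (r + 0.5 * sigma * sigma) * t) / (sigma * sqrt_t)
    d2 = d1 - sigma * sqrt_t
    price = s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)
    delta = norm_cdf(d1)  # sensitivity of the price to the spot s
    return price, delta

price, delta = bs_call(100.0, 100.0, 0.05, 0.2, 1.0)
```

For an at-the-money option with s = k = 100, r = 0.05, sigma = 0.2 and t = 1, this gives a price of about 10.45 and a delta of about 0.64 – the kind of values a demo like Marcin's plots as the parameters vary.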

I was so impressed with Marcin's demo that I did what any good software developer would do: I copied the idea, and tried to implement it in MATLAB. Here, I was able to make use of the NAG Toolbox for MATLAB, which makes routines from the NAG Library accessible from within that popular environment. The resultant demo (see screenshot below) has a similar functionality to Marcin's Excel application, with a couple of extensions - for example, parameter values can be automatically swept back and forth within a user-defined range, which produces animated plots of the price and Greeks that might be useful for teaching purposes (or perhaps merely provides an eye-catching demo on a NAG exhibition stand).

Having created the demo, I wanted to write it up as the latest installment of an occasional series of articles which highlights the functionality of the NAG Toolbox (the previous article in this series, written in collaboration with Nicolas Esteves and Nathaniel Fenton, described an application that uses several NAG routines to solve a partial differential equation over a two-dimensional region) and it was at this point I realized that, for some time, I'd - not uncharacteristically - been working beyond the limits of my knowledge in this field. Accordingly, I went back to Marcin, who patiently explained the theory of financial option pricing to me (yet again). The eventual result of these collaborative endeavours was a jointly-written white paper which contains an introduction to option pricing that's elementary enough for me to understand, technical details of how NAG routines can be called from VBA, as demonstrated by Marcin's application, and information on Excel examples that can be downloaded from the NAG website.

We aim to follow this note up with a description of the MATLAB option pricing demo, which should be made available later this year. I enjoy collaborating with Marcin, even though I discovered during the course of our work together that he's exactly half my age, and hence has probably never been near a Fortran program (or a Fortran programmer - until now, that is).

Tuesday, 16 March 2010

Using the NAG Library for .NET from F#

NAG is currently running a beta test of a NAG Library for .NET. One noticeable feature of the comments received so far is the relatively large number of users interfacing to the library from F# rather than C# or VB.NET.

F# is a functional programming language derived from OCaml, which in turn shares ancestry with the Standard ML programming language that I used over 20 years ago while working on theorem-proving systems at Manchester University…

The .NET area of the NAG website has been updated with a discussion article and example programs showing how to write console applications and Windows Forms applications from F#.

Using small wrappers one may provide functional interfaces to the NAG Library that work well with the interactive loop of the F# interpreter, allowing calls to the NAG Library with essentially no syntactic overhead. The following is a complete F# expression (integrating the sin function between 0 and pi), with the results echoed back by the F# top level using a read-eval-print interface that should be familiar to users of Lisp, MATLAB and similar systems.

// using myd01ah
> myd01ah Math.Sin (0.0,Math.PI) 0.01 1000 ;;
val it : d01ahresults = {result = 2.0;
                         npts = 7;
                         relerr = 0.0006944567037;
                         ifail = 0;}
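That result of 2.0 is easy to check independently: the integral of sin from 0 to pi is exactly 2. The few lines of Python below reproduce it with a plain composite Simpson's rule – just a sketch for comparison, not the adaptive algorithm that d01ah actually uses (and `simpson` here is my own helper, not a NAG or library routine).

```python
import math

def simpson(f, a, b, n=1000):
    """Composite Simpson's rule over [a, b] with n (even) subintervals."""
    if n % 2:
        n += 1  # Simpson's rule needs an even number of subintervals
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += f(a + i * h) * (4 if i % 2 else 2)
    return total * h / 3.0

# Integrate sin between 0 and pi; the exact answer is 2.
approx = simpson(math.sin, 0.0, math.pi)
```

With 1000 subintervals the error is far below the 0.01 tolerance requested in the F# call above.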

The F# article has several more examples, together with a discussion of the differences between the F# and C# interfaces. Feel free to use the comment feature here on the blog to discuss any issues related to calling the NAG Library from F#.

Wednesday, 10 March 2010

What makes a good software developer?

In NAG's development division we hire a variety of people from a variety of backgrounds - in fact we have a very diverse community of technical staff. When we hire them we usually have a role in mind, whether it's someone with a specialism that enables them to work on a particular piece of software, or someone we want as an "all-rounder" who needs to make software work on lots of different hardware.

However, it's not always clear what direction their talents will ultimately take them, and when we're interviewing we can't always tell what virtues a person has. It would be nice if we could know in advance, but usually we just can't.

I got to thinking - what are the attributes of a good software developer? I came up with the following list.

  1. A logical mind
  2. Impatience
  3. Patience
  4. Curiosity
  5. Ability to transfer knowledge
  6. Knowing when to let go
  7. Knowing the right people

Number 1 seems obvious. You can't write software without using logic; or at least, not the kind of software that we build.

Number 2. A degree of impatience is vital. It's often what drives us to make improvements to software. Is the software build system too slow for your liking? Find a way to speed it up so you don't need to wait around so much for it to complete.

Number 3 - patience. Does this contradict number 2? Not at all; sometimes you need to take your time. When you're wading through a mountain of code trying to track down an obscure bug in, for example, an optimization routine, you might need to re-start the debugger, or re-compile your code, a hundred times before you pin down what's going wrong. (Of course you might never pin it down at all - then you might need to call on attribute 7).

Number 4. Curiosity - never mind that it kills cats - it can be very useful. How can a routine that multiplies a vector by a scalar sometimes give slightly different results when called with the same data? Well, it can if it turns out that the location of the data in memory might affect the result - for example if a fast algorithm using 64 bit registers is used when the data is nicely aligned, but a slower algorithm using 80-bit registers is used when it's not. If the vector and scalar contain complex rather than real numbers, the arithmetic operations involved are sufficiently complicated that the results can differ by about one unit in the last place. Knowing this enables us to make a judgement on whether the difference is important or not.
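The phrase "one unit in the last place" can be made concrete in a couple of lines. As an illustrative sketch (not the vector-scalar example above, which involves register widths we can't easily reproduce here), Python's `math.ulp` – available from Python 3.9 – measures the spacing of doubles, and the classic pair 0.1 + 0.2 versus 0.3 differs by exactly one ulp:

```python
import math

# Two routes to "the same" value that land on adjacent doubles.
a = 0.1 + 0.2          # 0.30000000000000004
b = 0.3
diff = abs(a - b)
one_ulp = math.ulp(b)  # spacing of double-precision numbers around 0.3
print(diff <= one_ulp) # True: the two answers agree to within one ulp
```

Knowing the difference is one ulp, rather than something larger, is exactly the judgement call described above.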

Number 5 - ability to transfer knowledge. At NAG it's no use being an expert in one thing. Good at Fortran? Well, our customers want to use C, C++, C#, Java, Basic, Python, and anything else you can think of, so you'd better be flexible. And you need to be able to show other people how to do it too. (As an aside, I wonder if anyone knows a way of calling a NAG library routine from emacs lisp? Not that I'd want to do it - just curious.)

Number 6 - knowing when to let go. Sometimes a developer doesn't want to stop work on a piece of software until it's absolutely right. Being a perfectionist is no bad thing, but sometimes - you just need to stop. Otherwise no-one else will ever get the chance to use it.

Number 7 - knowing the right people. When you get stuck on something, or just need a bit of advice, there's nothing like being able to ask your colleagues for help at tea time. And a good way to make them happy to help you is by trying to be helpful to them in return when they need it.

How do I myself fit into this list of attributes? Well, I have to admit that I'm very good on the impatience front. I can't stand the few seconds pause while Visual Studio fires up. But my best attribute is probably number 7. I'm lucky to be able to work with a whole bunch of talented people who don't mind helping me out when I'm stuck.

Calling the NAG Fortran Library from Freemat

Black-Scholes equation solved with d03ndf in Freemat
We like to make our libraries easily accessible from languages and environments other than Fortran and C - for example Excel via VBA, or Matlab through our toolbox. This lets users quickly and simply access Library functionality without the hassle of writing and compiling a C or Fortran program. Environments like Matlab or Octave are particularly interesting because we can combine their strong codebase and powerful visualisation tools with our libraries (we provide technical reports on how to use the NAG Libraries from Scilab and Octave).
A few days ago I discovered Freemat, another prototyping tool inspired by Matlab and IDL. Like Octave and Scilab, Freemat is an open-source program which supports Matlab scripting, MEX interfaces and more. But its particularly interesting feature is its import function, which allows users to dynamically load a shared library into Freemat and use it directly, without having to write an interface script or C code (as you would in Matlab, Scilab or Octave). I decided to try calling our Fortran Library this way, and I will describe how I did it and the results.
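For readers more familiar with Python than Freemat, ctypes offers a close analogue of Freemat's import function: load a shared library at run time, declare the argument and return types, and call it directly. A small sketch using the system maths library rather than the NAG Library (the `find_library("m")` lookup assumes a typical Linux or similar Unix system):

```python
import ctypes
import ctypes.util
import math

# Locate and dynamically load the system maths library (libm).
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature - much as Freemat's import takes a list of
# types for the arguments and one for the return value.
libm.sin.restype = ctypes.c_double
libm.sin.argtypes = [ctypes.c_double]

print(libm.sin(math.pi / 2.0))  # 1.0
```

The Freemat import calls below do essentially the same job for the NAG Fortran Library.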

Starting easy

When we try to interface our Library with something else, we start easy. I usually start with a00aaf because it's the simplest routine we have: no arguments, no return value. It only prints the Library description to stdout. Looking at Freemat's import documentation, I can see that the call is really simple: I just need to make two lists of types, one for the arguments and one for the return value.
I saved this in load_fll3a22dfl.m so I can test it.
--> source load_fll3a22dfl.m

--> a00aa
ans =

As expected I don't get any result, but disappointingly I don't see my library information anywhere either. That is because a00aaf prints it to stdout, not in the Freemat window. Looking at my calling shell, I can see it.
[nicolas@stretton:~]$ freemat
*** Start of NAG Library implementation details ***
Implementation title: Linux/gfortran 32-bit
Precision: FORTRAN double precision
Product Code: FLL3A22DFL
Mark: 22.0 (self-contained)
*** End of NAG Library implementation details ***

The S chapter: Bessel functions
a00aaf is nice, but not really useful. My next step when calling the library is always the S chapter, and especially s17acf. It takes two arguments and returns a double, which makes it the perfect candidate for my next test.
import(lib,'s17acf_','s17acf','double','double &x, int32 &ifail');
The important thing to remember when calling a Fortran library is that all arguments are passed by reference. That means we have to give pointers from Freemat even when the values are declared as INTENT(IN), hence the & before x and ifail.
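This by-reference convention appears in any foreign-function interface, not just Freemat. As an illustrative sketch in Python's ctypes (again using the system libm rather than the NAG Library, and assuming a typical Unix system), C's frexp writes one of its results through a pointer - the same pattern a Fortran routine uses for every argument:

```python
import ctypes
import ctypes.util

# Load the system maths library and declare frexp's C signature:
# double frexp(double value, int *exp);
libm = ctypes.CDLL(ctypes.util.find_library("m"))
libm.frexp.restype = ctypes.c_double
libm.frexp.argtypes = [ctypes.c_double, ctypes.POINTER(ctypes.c_int)]

e = ctypes.c_int()
# frexp returns the mantissa and writes the exponent through the pointer,
# so we pass e by reference - the analogue of Freemat's & markers.
m = libm.frexp(8.0, ctypes.byref(e))
print(m, e.value)  # 0.5 4  (8.0 == 0.5 * 2**4)
```

Forgetting the by-reference step in either system typically means a crash rather than a polite error, which is why the & markers in the imports above matter.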
Doing the same for the other Bessel functions gives a nice graph.
Bessel functions from the S chapter in Freemat
The E02 chapter: Curve fitting functions
For this example I adapted the curve-fitting demo from our NAG Toolbox for MATLAB.
The routines I'm using here are e02bef and e02bbf. Why these? Because e02bef takes a character as its first argument, and passing character arguments is always tricky.
e02bef takes 15 arguments when called from Fortran, but a sixteenth argument is hidden from the user: the length of the character string. If this last argument isn't specified, it causes a segmentation fault. The proper imports are

import(lib,'e02bbf_','e02bb','void',' int32 &ncap7, double &lamda, double &c, double &x, double &s, int32 &ifail');

import(lib,'e02bef_','e02be','void',' string start, int32 &m, double &x, double &y, double &w, double &s, int32 &nest, int32 &n, double &lamda, double &c, double &fp, double &wrk, int32 &lwrk, int32 &iwrk, int32 &ifail, int32 &start_Len');
The rest is pretty straightforward and the demo works after some modifications.
Curve fitting with E02 chapter in Freemat


Unfortunately, I couldn't import routines expecting a user callback. There doesn't seem to be support for passing pointers to Freemat user-defined functions through to the routines in the library, but I might be wrong. Freemat limits the import function to avoid problems such as badly allocated memory, and keeping a clean interface while still allowing callbacks as we would like is not easy to do.
Another limitation in our scenario is getting the routines' output inside Freemat. The Freemat developers are pretty smart and already have a mechanism for this, which works by exporting a special function from the library. But in our case, where we want to use the library without additional modifications or code, it doesn't work. If you want to experiment with that, it's described in the import function documentation.
Again, my experience with Freemat is limited and I haven't explored all of their code yet. If any experienced Freemat users have solutions or suggestions on these points, feel free to post them here.


I have to say that I was really impressed when trying this for the first time. When I read the documentation I thought "Well, it can't be that easy", but in fact it was. I quickly made a script to generate imports for almost all of our routines (except the ones taking callbacks) and got them to work. This approach is really interesting compared to generating MEX files and getting them to work, as we avoid the hassle of converting types or copying data.
Here is the file containing the imports for the ~1560 routines I generated for FLL3A32DFL. You can use it to try the library from a friendly environment. With slight modifications you should be able to use this file for every version of the FL22 Library; if you have trouble doing so, I can certainly help you. I haven't tested them all, but they should work. Be aware that if one of the imports is wrong, it will crash Freemat. If you test it, it would be nice to have some feedback on errors or usage examples. If you manage to do anything with it, share it in the comments! Is anyone interested in having the callback routines as well?
Here are some links to things mentioned in this post:
Links to technical reports from NAG explaining how to call our libraries from similar environments
Source code for the examples above. Change the lib variable to point to your installed library.

Wednesday, 3 March 2010

Beta testing and feedback

It is easy to see the value of software Beta releases – they provide the opportunity for potential users to get new functions early, and they give the development team excellent feedback about what needs to be altered to advance from a potentially good product to an excellent product that meets users’ needs precisely. However, perhaps like many aspects of life, the way to gather the most benefit from a Beta program is to listen really hard to what Beta users are saying. This can be time consuming and quite difficult.

The current Beta program that we are running is for the NAG Library for .NET. We have received excellent, detailed feedback covering issues from logging exceptions from Excel to further tuning some routines for particular environments (all things that the NAG developers can cover). While this is great – and let me take a moment to thank those in the Beta program for their feedback so far – the question remains: are we listening hard enough?

So, for the first time, we are planning to use an on-line survey tool (called Survey Monkey) to try to draw out more comments. It will be interesting to see how it is received, because what any Beta program wants is for the ‘testers’ to think of one more thing that they noticed about the release but have not yet mentioned. So the ‘any other comments?’ question may be the most important. I’ll be sending the questionnaire to the test sites in a few weeks and I’m looking forward to the feedback. However, it’s certain that I’ll want to have a few relaxed chats by phone to really uncover the most valuable details. After all, it’s the unsolicited comments from a Beta test that tend to help create the very best products, and perhaps this feedback can only happen if enough time is given to the process. Thanks again to all test sites for their help.