## Archive for July, 2012

One of my hobbies is retro video games, and tonight’s opening ceremony for the 2012 Olympics inspired me to take a look at Olympic video games over my lifetime. Where games were released on multiple platforms, I’ve simply chosen the one most relevant to me.

**Video Olympics (Atari 2600, 1977)**

Released in the year of my birth, the Atari 2600 holds a special place in my heart. The hardware was incredibly primitive and yet some of the games were surprisingly playable; I must have spent weeks of my childhood playing Combat, for instance. Sadly, Video Olympics is one of the less playable games for the 2600 and should really be renamed ‘Variations on the Theme of Pong.’

**Hyper Sports (Arcade, 1984)**

One of my abiding memories of the early 80s is spending Sunday afternoons in the children’s room of our village’s local pub. This particular pub was a geek child’s paradise as the children’s room included up to 3 arcade games at any one time. My brother and I would each be given 20p to play on these games, a sum of money that was expected to last us at least an hour, while dad enjoyed a quiet pint in the bar.

I remember Konami’s Hyper Sports very clearly and the YouTube video above brings back a flood of memories for me. Hyper Sports was released in time for the 1984 Los Angeles Olympics and was the sequel to Konami’s superb Track & Field.

**Micro Olympics (BBC Micro, 1984)**

If you had walked into any UK primary school in the early 80s you’d have found a BBC Micro, an 8-bit computer developed by Acorn Computers (the company that went on to develop the ARM processor used in the vast majority of mobile devices). My primary school had exactly one of these high-powered beasts and each pupil only got a few minutes on it a month on average. My dad had a chat with the headmaster, though, and scored me a lot of extra time on it: as long as I didn’t make any noise whatsoever, I could use the computer just outside the headmaster’s office for an hour after school, and I used the time to work through my collection of Marshall Cavendish Input magazines. Happy days.

The BBC Micro wasn’t known for its games, however, and Micro Olympics was rubbish!

**Daley Thompson’s Olympic Challenge (Sinclair Spectrum, 1988)**

Ahhh, the humble Speccy, how I loved thee! The Spectrum was my first ‘proper’ computer and I received it for my 8th birthday. All I wanted to do was play games but my father insisted that I also learn how to program it, so I probably owe my career to dear old dad and Sinclair’s 48K wonder.

Released in time for the 1988 Seoul Olympics, Daley Thompson’s Olympic Challenge was a joystick waggler pure and simple. The game covered all ten decathlon events: 100 metres, Long Jump, Shot Put, High Jump, 400 metres, 110 metres Hurdles, Discus, Pole Vault, Javelin and 1500 metres, but gameplay consisted of little more than frantically waggling your joystick from side to side and occasionally pressing the fire button.

**Olympic Gold (Sega Mega Drive, 1992)**

I remember reading articles that previewed Sega’s Mega Drive. Back then its power seemed nothing short of astonishing but, sadly, I didn’t have one. One of my friends did, however, and many a happy hour was spent at his house playing Mortal Kombat and Sonic the Hedgehog.

Olympic Gold was the first officially licensed Olympic video game and was released in time for the Barcelona Olympics. Although the graphics are much better than those of the older games, the game mechanic is essentially the same: mash buttons as fast as you can.

**1996 and beyond**

By the time the 1996 Atlanta games came around, I had better things to do than play video games. That summer was my last before starting my undergraduate studies in theoretical physics. Many Olympic video games have since been released of course but I haven’t played them and neither do I want to.

So, I’ll hand over to The Complete History of Official Olympic Video Games which picks up where I left off.

A MATLAB user at Manchester University contacted me recently asking about Black-Scholes option pricing. The MATLAB Financial Toolbox has a range of functions that can calculate Black-Scholes put and call option prices along with several of the sensitivities (or ‘greeks’), such as blsprice, blsdelta and so on.

The user’s problem is that we don’t have a site-wide license for the Financial Toolbox. We do, however, have a full site license for the NAG Toolbox for MATLAB, which has a nice set of option pricing routines. Even though they calculate the same things, the NAG Toolbox option pricing functions look very different from the Financial Toolbox ones, so I felt that a Rosetta Stone-type article might be useful.

For Black-Scholes option pricing, there are three main differences between the two systems:

- The Financial Toolbox has separate functions for calculating the option price and each greek (e.g. blsprice, blsgamma, blsdelta, etc.) whereas NAG calculates the price and all greeks simultaneously with a single function call.
- Where appropriate, the MATLAB functions calculate Put and Call values with one function call, whereas with NAG you need to explicitly specify Call or Put.
- NAG calculates more greeks than MATLAB.

The following code example pretty much says it all. Any variable calculated with the NAG Toolbox is prefixed NAG_ whereas anything calculated with the Financial Toolbox is prefixed MW_. When I developed this, I was using MATLAB R2012a with NAG Toolbox Mark 22.

```matlab
%Input parameters for both NAG and MATLAB
Price = 50;
Strike = 40;
Rate = 0.1;
Time = 0.25;
Volatility = 0.3;
Yield = 0;

%Calculate the price and all greeks for a Put using NAG
[NAG_Put, NAG_PutDelta, NAG_Gamma, NAG_Vega, NAG_PutTheta, NAG_PutRho, NAG_PutCrho, NAG_PutVanna, ...
 NAG_PutCharm, NAG_PutSpeed, NAG_PutColour, NAG_PutZomma, NAG_PutVomma, ifail] = ...
    s30ab('p', Strike, Price, Time, Volatility, Rate, Yield);

%Calculate the price and all greeks for a Call using NAG
[NAG_Call, NAG_CallDelta, NAG_Gamma, NAG_Vega, NAG_CallTheta, NAG_CallRho, NAG_CallCrho, NAG_CallVanna, ...
 NAG_CallCharm, NAG_CallSpeed, NAG_CallColour, NAG_CallZomma, NAG_CallVomma, ifail] = ...
    s30ab('c', Strike, Price, Time, Volatility, Rate, Yield);

%Calculate the Elasticity (Lambda)
NAG_CallLambda = Price/NAG_Call*NAG_CallDelta;
NAG_PutLambda = Price/NAG_Put*NAG_PutDelta;

%Calculate the same set of prices and greeks using the MATLAB Financial Toolbox
[MW_Call, MW_Put] = blsprice(Price, Strike, Rate, Time, Volatility, Yield);
[MW_CallDelta, MW_PutDelta] = blsdelta(Price, Strike, Rate, Time, Volatility, Yield);
MW_Gamma = blsgamma(Price, Strike, Rate, Time, Volatility, Yield);
MW_Vega = blsvega(Price, Strike, Rate, Time, Volatility, Yield);
[MW_CallTheta, MW_PutTheta] = blstheta(Price, Strike, Rate, Time, Volatility, Yield);
[MW_CallRho, MW_PutRho] = blsrho(Price, Strike, Rate, Time, Volatility, Yield);
[MW_CallLambda, MW_PutLambda] = blslambda(Price, Strike, Rate, Time, Volatility, Yield);
```

Note that NAG doesn’t output the elasticity (Lambda) directly, but it is trivial to obtain from values that it does output. Also note that, as far as I can tell, NAG outputs more greeks than the Financial Toolbox does.

I’m not going to show the entire output of the above program because there are a lot of numbers. However, here are the Put values as calculated by NAG shown to 4 decimal places. I have checked and they agree with the Financial Toolbox to within numerical noise.

```
NAG_Put       =   0.1350
NAG_PutDelta  =  -0.0419
NAG_PutLambda = -15.5066
NAG_Gamma     =   0.0119
NAG_Vega      =   2.2361
NAG_PutTheta  =  -1.1187
NAG_PutRho    =  -0.5572
NAG_PutCrho   =  -0.5235
NAG_PutVanna  =  -0.4709
NAG_PutCharm  =   0.2229
NAG_PutSpeed  =  -0.0030
NAG_PutColour =  -0.0275
NAG_PutZomma  =   0.0688
NAG_PutVomma  =  20.3560
```
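The closed-form Black-Scholes formulas behind the first few of these numbers are easy to reproduce independently. Here is a short Python sketch (the helper names are my own, not NAG’s or MathWorks’; the standard library’s erf gives the normal CDF) that recomputes the put price, delta, gamma and vega for the inputs above:

```python
from math import log, sqrt, exp, erf, pi

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def norm_pdf(x):
    """Standard normal probability density function."""
    return exp(-0.5 * x * x) / sqrt(2.0 * pi)

def bs_put_greeks(S, K, r, T, sigma, q=0.0):
    """Black-Scholes put price, delta, gamma and vega with dividend yield q."""
    d1 = (log(S / K) + (r - q + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    price = K * exp(-r * T) * norm_cdf(-d2) - S * exp(-q * T) * norm_cdf(-d1)
    delta = -exp(-q * T) * norm_cdf(-d1)
    gamma = exp(-q * T) * norm_pdf(d1) / (S * sigma * sqrt(T))
    vega = S * exp(-q * T) * norm_pdf(d1) * sqrt(T)
    return price, delta, gamma, vega

# Same inputs as the MATLAB/NAG example:
# Price=50, Strike=40, Rate=0.1, Time=0.25, Volatility=0.3, Yield=0
price, delta, gamma, vega = bs_put_greeks(50, 40, 0.1, 0.25, 0.3)
```

To four decimal places these agree with the NAG_Put, NAG_PutDelta, NAG_Gamma and NAG_Vega values shown above.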

This is a guest post by colleague and friend, Ian Cottam of The University of Manchester.

I recently implemented a lint program for Condor (unsurprisingly called: condor_lint), analogous to the famous one for C of days of yore. That is, it points out the fluff in a Condor script and suggests improvements. It is based on our local knowledge of how our Condor Pool is set up, here in Manchester, and also reflects recent changes to Condor.

Why I did it is interesting and may have wider applicability. Everything it reports is already written up on our extensive internal web site, which users rarely read. I suspect the usual modus operandi of our users is to find, or be given, a Condor script relevant to their domain and make the minimum modifications that mean it ‘works’. Subsequently, its basic structure is never updated (apart from referencing new data files, etc).

To be fair, that’s what we all do — is it not?

Ignoring our continually updated documentation means that a user’s job may make poor use of the Condor Pool, affecting others, and costing real money (in wasted energy) through such “bad throughput”.

Now, although we always run the user’s job if Condor’s basic condor_submit command accepts it, we first automatically run condor_lint. This directly tells them any “bad news” and also, in many cases, gives them the link to the specific web page that explains the issue in detail.
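The general shape of such a tool is easy to sketch. The toy Python lint below is purely illustrative: the rules and documentation URLs are inventions of mine, not the actual checks condor_lint performs (those encode Manchester-specific pool knowledge), but it shows the approach of parsing the submit script, reporting problems, and linking each warning to the page that explains it.

```python
# A toy lint for Condor submit-description files. The rules and URLs
# below are made-up examples, not the real condor_lint checks.
def lint_submit_script(text):
    """Return a list of (message, doc_url) warnings for a submit script."""
    # Strip comments and lowercase for simple matching
    lines = [line.split('#')[0].strip().lower() for line in text.splitlines()]
    keys = {line.split('=')[0].strip() for line in lines if '=' in line}
    warnings = []
    if 'notification' not in keys:
        warnings.append(("no 'notification' setting; Condor may e-mail you "
                         "about every job", "http://example.org/condor/notification"))
    if not any(line.startswith('queue') for line in lines):
        warnings.append(("no 'queue' statement; no jobs will be submitted",
                         "http://example.org/condor/queue"))
    if 'requirements' not in keys:
        warnings.append(("no 'requirements' expression; jobs may land on "
                         "unsuitable machines", "http://example.org/condor/requirements"))
    return warnings

script = """universe = vanilla
executable = myprog
queue 100
"""
for msg, url in lint_submit_script(script):
    print("WARNING:", msg, "-- see", url)
```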

Clearly, even such “in your face” advice can still be ignored, but we are starting to see improvements.

Obviously such an approach is not limited to Condor, and we would be interested in hearing of “lint approaches” with other systems.


If you have an interest in mathematics, you’ve almost certainly stumbled across The Wolfram Demonstrations Project at some time or other. Based upon Wolfram Research’s proprietary Mathematica software and containing over 8000 interactive demonstrations, The Wolfram Demonstrations Project is a fantastic resource for anyone interested in mathematics and related sciences; and now it has some competition.

Sage is a free, open source alternative to software such as Mathematica and, thanks to its interact function, it is fully capable of producing advanced, interactive mathematical demonstrations with just a few lines of code. The Sage language is based on Python and is incredibly easy to learn.
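To give a feel for how little code is involved, here is the sort of cell you would type into a Sage notebook (a sketch from memory rather than a tested demonstration; interact, slider and plot are genuine Sage built-ins, but this will only run inside Sage itself):

```python
# Sage notebook cell: plot sin(k*x) with an interactive slider for k
@interact
def _(k=slider(1, 10, step_size=1, default=2, label='frequency')):
    show(plot(sin(k * x), (x, 0, 2 * pi)))
```

Sage renders the slider in the notebook and re-evaluates the plot every time you move it.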

The Sage Interactive Database has been launched to showcase this functionality and it’s looking great. There are currently only 31 demonstrations available but, since anyone can sign up and contribute, I expect this number to increase rapidly. For example, I took the simple applet I created back in 2009 and had it up on the database in less than 10 minutes! Unlike the Wolfram Demonstrations Project, you don’t need to purchase an expensive piece of software before you can start writing Sage Interactions: Sage is free to everyone.

Not everything is perfect, however. For example, there is no native Windows version of Sage; Windows users have to make use of a VirtualBox virtual machine, which puts many people off trying this great piece of software. Furthermore, the interactive ‘applets’ produced by Sage’s interact function don’t run as smoothly as those produced by Mathematica’s Manipulate function. Finally, Sage’s interact doesn’t have as many control options as Mathematica’s Manipulate (there’s no Locator control, for example, and my bounty still stands).

The Sage Interactive Database is a great new project and I encourage all of you to head over there, take a look around and maybe contribute something.

I work for The University of Manchester where, among other things, I assist in the support of various high performance and high throughput computing systems. Exchanges such as the following are, sadly, becoming all too commonplace:

**Researcher:** “Hi, I have an embarrassingly parallel research problem that needs a lot of compute resource. Can you help?”

**Support:** “Almost certainly: you could have access to our 2500 core Condor pool, or maybe our 2000 core HPC system, or any number of smaller systems depending on the department you are in. Let’s meet to discuss your requirements in more detail.”

**Researcher:** “Sounds great. I am using [insert expensive commercial package here], could we install that on your systems?”

**Support:** “Not unless you pay a HUGE amount of money, because you’ll need dozens or maybe hundreds of licenses. The licenses will cost more than our machines! Could you use [insert open source equivalent here] instead?”

**Researcher:** “A member of your team suggested that about 2 years ago, but [insert expensive commercial package here] is easier to use, looks pretty, and a single license didn’t seem all that expensive. It’ll take me ages to convert to [insert open source equivalent here]. Instead of splitting the job up and spreading it around lots of machines, can’t I just run it on a faster machine?”

**Support:** “Sorry, but parallelism is the only real game in town when it comes to making stuff faster these days. I’m afraid that you’ll have to convert to [insert open source equivalent here], open your chequebook, or wait until 2076 for your results to complete on your workstation.”

The moral of the story is that if you want your compute to scale, you need to ensure that your licensing scales too.

As everyone knows, graphene is the Chuck Norris of materials. Sometimes pure, sometimes mixed with a little of this or that, it has an astonishing array of uses. Here’s a list of some of the things it can do:

- Amplify light
- Cool microchips
- Detect diseases
- Enable CPUs that operate at hundreds of GHz
- Desalinate water
- Generate electricity from flowing water
- Generate electricity from heat
- Generate electricity from light
- Generate electricity by being squeezed (Piezoelectricity)
- Kill Bacteria
- Make alcohol stronger
- Make an invisibility cloak
- Measure the fine structure constant
- Produce gas sensors capable of detecting a single atom
- Produce a better MRI contrast agent
- Support hyperfast optical networks

Here’s a brief list of some of its more interesting properties:

- Highest known thermal conductivity
- Longest mean free path of any material at room temperature
- Ability to repair itself
- Stiffest known material
- Strongest known material
- Thinnest known material (one atom thick)

Related post: Fun with Graphene from the Wolfram Demonstrations project