Earlier this year I was awarded a fellowship from the Software Sustainability Institute, an organization that works to improve all aspects of research software. During their recent Collaborations Workshop in Oxford, it occurred to me that I was aware of only a tiny number of software projects at my own institution, The University of Manchester. I decided to change that and started contacting our researchers to see what software they had released freely to the world as part of their research activities.
Research software comes in many forms: from small but useful MATLAB, Python or R scripts with just a handful of users and one developer, right through to fully-fledged applications used by large communities of researchers and supported by teams of specialist developers. I’m interested in knowing about all of it. After all, we live in a time when even a mistake in an Excel spreadsheet can change the world.
The list below is what’s been sent to me so far and is a mirror of an internal list that’s been doing the rounds at Manchester. I’ll update it as more information becomes available. If you are at Manchester and know of a project that I’ve missed, feel free to contact me.
Faculty of Life Sciences
- antiSMASH – Genome annotation tool for secondary metabolite gene clusters.
- MultiMetEval – Flux-balance analysis tool for comparative and multi-objective analysis of genome-scale metabolic models.
- mzMatch/mzmatch.R/mzMatch.ISO – Comprehensive LC/MS metabolomics data processing toolbox.
- Rank Products – Statistical tool for the identification of differentially expressed entities in molecular profiles.
- openCDMS – The openCDMS project is a community effort to develop a robust, commercial-grade, full-featured and open source clinical data management system for studies and trials.
- idiffh – Research software can produce huge text files (e.g. logs). The GNU diff program needs to read the files into memory and therefore has an upper bound on file size. idiffh uses only a simple heuristic, but it is bounded only by the maximum file size (and free file store).
- ParaFEM – A portable library for parallel finite element analysis. Contributions from MACE, SEAES, School of Materials.
- Shadow – This is an Apple Mac OS X shell level application that can monitor Dropbox shared folders for file deletions and restore them.
- The Reality Grid Steering Library – A software library for steering and monitoring numerical simulations, with APIs available for Fortran/C++/Java and steering clients available for installation on laptops and mobile devices. Developed in collaboration with the School of Computer Science.
Manchester Institute of Biotechnology
- Copasi – COPASI is a software application for simulation and analysis of biochemical networks and their dynamics.
- Condor Copasi – Condor-COPASI is a web-based interface for integrating COPASI with the Condor High Throughput Computing (HTC) environment.
School of Chemical Engineering & Analytical Science
- SurfaceSpectra Identity – Free software that allows you to view and export isotope patterns.
School of Chemistry
- DOSY Toolbox – A free, open source program for processing PFG NMR diffusion data (a.k.a. DOSY data).
School of Computer Science
- Clinical NERC – Clinical NERC is simple, customizable, state-of-the-art named entity recognition and classification software for clinical concepts or entities.
- GPC – The University of Manchester GPC library is a flexible and highly robust polygon set operations library for use with C, C#, Delphi, Java, Perl, Python, Haskell, Lua, VB.Net (and other) applications.
- HiPLAR – High Performance Linear Algebra in R. A collaboration between Manchester and Imperial.
- INSEE – Interconnection Networks Simulation and Evaluation Environment
- KUPKB (The Kidney & Urinary Pathway Knowledge Base) – The KUPKB is a collection of omics datasets that have been extracted from scientific publications and other related renal databases. The iKUP browser provides a single point of entry for you to query and browse these datasets.
- ManTIME – ManTIME is an open-source machine learning pipeline for the extraction of temporal expressions from general domain texts.
- MethodBox – MethodBox provides a simple, easy to use environment for browsing and sharing surveys, methods and data.
- myExperiment – myExperiment makes it easy to find, use and share scientific workflows and other Research Objects, and to build communities.
- Open PHACTS Discovery Platform – Freely available, this platform integrates pharmacological data from a variety of information resources and provides tools and services to query this integrated data to support pharmacological research.
- OWL API – A Java API and reference implementation for creating, manipulating and serialising OWL Ontologies. The latest version of the API is focused towards OWL 2. The OWL API is open source and is available under either the LGPL or Apache Licenses.
- RightField – Semantic annotation by stealth. RightField is a tool for adding ontology term selection to Excel spreadsheets, creating templates which are then reused by scientists to collect and annotate their data without any need to understand, or even be aware of, RightField or the ontologies used. Later the annotations can be collected as RDF.
- SEEK – SEEK is a web-based platform, with associated tools, for finding, sharing and exchanging Data, Models and Processes in Systems Biology.
- ServiceCatalographer – ServiceCatalographer is an open-source Web-based platform for describing, annotating, searching and monitoring REST and SOAP Web services.
- Simple Spreadsheet Extractor – A simple ruby gem that provides a facility to read an XLS or XLSX Excel spreadsheet document and produce an XML representation of its content.
- Taverna – Taverna is an open source and domain-independent Workflow Management System – a suite of tools used to design and execute scientific workflows and aid in silico experimentation.
- TERN – TERN is temporal-expression identification and normalisation software designed for clinical data.
- Utopia Documents – Utopia Documents brings a fresh new perspective to reading the scientific literature, combining the convenience and reliability of the PDF with the flexibility and power of the web.
School of Electrical and Electronic Engineering
- LDLTS – Laplace Transform Transient Processor and Deep Level Spectroscopy. A collaboration between Manchester and the Institute of Physics, Polish Academy of Sciences, in Warsaw.
- Raspbian – Raspbian is a free operating system based on Debian optimized for the Raspberry Pi hardware.
- Signal Wizard – Digital signal processing software.
School of Mathematics
- EIDORS – Electrical Impedance Tomography and Diffuse Optical Tomography Reconstruction Software.
- Fractional Matrix Powers – MATLAB functions to compute fractional matrix powers with Frechet derivatives and condition number estimates.
- IFISS – IFISS is a graphical package for the interactive numerical study of incompressible flow problems which can be run under Matlab or Octave.
- MARKOVFUNMV – An adaptive black-box rational Arnoldi method for the approximation of Markov functions.
- Matrix Computation Toolbox – The Matrix Computation Toolbox is a collection of MATLAB M-files containing functions for constructing test matrices, computing matrix factorizations, visualizing matrices, and carrying out direct search optimization.
- Matrix Function Toolbox – The Matrix Function Toolbox is a MATLAB toolbox connected with functions of matrices.
- Matrix Logarithm – MATLAB Files. Two functions for computing the matrix logarithm by the inverse scaling and squaring method.
- Matrix Logarithm with Frechet Derivatives and Condition Number – MATLAB files
- NLEVP A Collection of Nonlinear Eigenvalue Problems – This MATLAB Toolbox provides a collection of nonlinear eigenvalue problems.
- oomph-lib – An object-oriented, open-source finite-element library for the simulation of multiphysics problems.
- Simfit – Free software for simulation, curve fitting, statistics, and plotting.
- SmallOverlap – SmallOverlap is a GAP 4 package which implements new, highly efficient algorithms for computing with finitely presented semigroups and monoids whose defining presentations satisfy small overlap conditions (in the sense of J. H. Remmers).
- Symmetric eigenvalue decomposition and the SVD – MATLAB files
School of Mechanical, Aerospace and Civil Engineering (MACE)
- DualSPHysics – DualSPHysics is based on the Smoothed Particle Hydrodynamics model named SPHysics and makes use of GPUs.
- FLIGHT – FLIGHT specialises in the prediction and modelling of fixed-wing aircraft performance.
- SPHYSICS – SPHysics is a platform of Smoothed Particle Hydrodynamics (SPH) codes inspired by the formulation of Monaghan (1992) developed jointly by researchers at the Johns Hopkins University (U.S.A.), the University of Vigo (Spain), the University of Manchester (U.K.) and the University of Rome La Sapienza (Italy).
- SWAB Online – A user-friendly web application for running Fortran-based 1-D shallow water nearshore wave simulations.
School of Physics and Astronomy
- Herwig++ – Herwig++ is a new event generator, written in C++, built on the experience gained with the well-known event generator HERWIG, which was used by the particle physics community for nearly 30 years. Herwig++ is used by the LHC experiments to predict the results of their collisions and as an essential component of their data analysis. It is developed by a consortium of four main nodes, including Manchester, and its published write-up has been cited over 500 times.
- im3shape - Im3shape measures the shapes of galaxies in astronomical survey images, taking into account that they have been distorted by a point-spread function.
- MAD8/madinput – Mathematica code and MAD8 installer for performing optics calculations for particle accelerator design.
- PolyParticleTracker – MATLAB code for particle tracking against complex optical backgrounds
One of the benefits of working at a university is that you are surrounded by a lot of smart people doing very interesting things and it usually doesn’t take much effort to get them to talk about their research. I work in the Faculty of Engineering and Physical Sciences, which means that I’m pretty well covered in subjects such as mathematics, physics, chemistry, engineering, computer science, materials and earth sciences, but I have to go all the way to the other side of campus if I want to learn a little about the life sciences.
Last week, I attended a free event called The Rogue Cell, which was arranged by The Wellcome Trust Centre for Cell-Matrix Research and hosted by The Manchester Museum as part of World Cancer Day. I had no idea what to expect from the evening but if you were to press me I would have guessed that there was going to be a lot of PowerPoint slides and row upon row of gently dozing delegates. I could not have been more wrong.
The event was arranged in a workshop format where all of the attendees were split into five groups of six or so. Each group was then assigned a Wellcome Trust researcher whose job it was to explain to us one of five defining characteristics of a cancer cell, among them:
- Evading the immune system
- Angiogenesis (formation of blood vessels)
- Lack of apoptosis (programmed cell death).
Each group kept their researcher for 20 minutes or so before they got assigned a new one who discussed a different topic from the five. So, by the end of the evening we had covered the lot. The presentations were intimate, informal and highly interactive and it felt to me like I was having a good chat down my local pub with a group of people who just happened to be world-class cancer researchers. If only all learning experiences were like this one!
There was a great cross section of attendees from PhD biology students through to clinicians, undergraduates, random people off the street and, of course, the occasional math software geek. One of the great things about this event was the fact that everyone seemed to get a lot out of it, no matter what their background. I asked a lot of questions, many of which would have been blindingly obvious to a student of the life sciences but not once was I made to feel stupid or out of place. It must have been exhausting for the presenters but I can honestly say that it was one of the most enjoyable learning experiences I’ve had for quite some time.
I sincerely hope that The Wellcome Trust Centre for Cell-Matrix Research and The Manchester Museum will be arranging more events like this in the future.
The following links were sent to us following the event. I include them here for anyone who’s interested.
Evasion of Immune System:
- New cancer therapy: http://www.youtube.com/watch?v=t3bI9nBa4FY
- Herceptin mechanism of action: http://www.youtube.com/watch?v=IeE3K7U9fTQ
As everyone knows, graphene is the Chuck Norris of materials. Sometimes pure, sometimes mixed with a little of this or that, it has an astonishing array of uses. Here’s a list of some of the things it can do:
- Amplify light
- Cool microchips
- Detect diseases
- Enable CPUs that can operate at 100s of GHz
- Desalinate water
- Generate electricity from flowing water
- Generate electricity from heat
- Generate electricity from light
- Generate electricity by being squeezed (Piezoelectricity)
- Kill Bacteria
- Make alcohol stronger
- Make an invisibility cloak
- Measure the fine structure constant
- Produce gas sensors capable of detecting a single atom
- Produce a better MRI contrast agent
- Support hyperfast optical networks
Here’s a brief list of some of its more interesting properties:
- Highest known thermal conductivity
- Longest mean free path of any material at room temperature
- Repair Itself
- Stiffest known material
- Strongest known material
- Thinnest known material (since it is one atom thick)
Back in the good old days when I was a freshly minted postgraduate student I had big plans: in short, I was going to change the world. Along with a couple of my friends I was going to revolutionize the field I was working in, win the Nobel prize and transform the way science and mathematics is taught at university. Fast forward four years and it pains me to say that my actual achievements fell rather short of these lofty ideals. I considered myself lucky to simply pass my PhD and land a job that didn’t involve querying members of the public on their preferences regarding potato based products. The four subjects of Laura Snyder’s latest book, The Philosophical Breakfast Club, had broadly similar aims to my younger self but they actually delivered the goods and they did so in spades.
In this sweeping history of nineteenth century science, Snyder gives us not one biography but four — those of Charles Babbage, John Herschel, William Whewell and Richard Jones. You may not have heard of all of them but I’d be surprised if you didn’t know of some of their work. Between them they invented computing, modern economics, produced the most detailed astronomical maps of their age, co-invented photography, made important advances in tidology, invented the term scientist (among many other neologisms) and they are just the headliners! Under-achievers they were not.
These four men met while studying at Cambridge University way back in 1812, where they held weekly meetings which they called The Philosophical Breakfast Club. They took a look at how science was practiced in their day, found it wanting and decided to do something about it. Remarkably, they succeeded!
I found Snyder’s combination of biography, history and science to be utterly compelling…so much so that during my time reading it, my beloved iPad stayed at home, lonely and forgotten, while I undertook my daily commute. This is no dry treatise on nineteenth century science; instead it is a living, breathing page-turner about a group of very colourful individuals who lived in a time where science was done rather differently from how it is practiced today. This was a time where ‘computer’ meant ‘a person who was good at arithmetic’ and professors would share afternoon champagne with their students after giving them advice. Who would have thought that a group of nineteenth century geeks could form the basis of one of the best books I’ve read all year?
In my previous blog post I mentioned that I am a member of a team that supports High Throughput Computing (HTC) at The University of Manchester via a 1600+ core ‘condor pool’. In order to make it as easy as possible for our researchers to make use of this resource one of my colleagues, Ian Cottam, created a system called DropAndCompute. In this guest blog post, Ian describes DropAndCompute and how it evolved into the system we use at Manchester today.
The Evolution of “DropAndCompute” by Ian Cottam
DropAndCompute, as used at The University of Manchester’s Faculty of Engineering and Physical Sciences, is an approach to using network (or grid or cloud based) computational resources without having to know the operating system of the resource’s gateway or any command line tools of either the resource itself (Condor in our case) or in general. Most such gateways run a flavour of Unix, often Linux. Many of our users are either unfamiliar with Linux or just prefer a drag-and-drop interface, as I do myself despite using various flavours of Unix since Version 6 in the late 70s.
Why did I invent it? On its original web site description page (wiki.myexperiment.org/index.php/DropAndCompute) the following reasons are given:
- A simple and uniform drag-and-drop graphical user interface, potentially, to many resource pools.
- No use of terminal windows or command lines.
- No need to login to remote hosts or install complicated grid-enabling software locally.
- No need for the user to have an account on the remote resources (instead they are accounted for by having a shared folder allocated). Of course, nothing stops the users from having accounts should that be preferred.
- No need for complicated Virtual Private Networks, IP Tunnelling, connection brokers, or similar, in order to access grid resources on private subnets (provided at least one node is on the public Internet, which is the norm).
- Pop-ups notify users of important events (basically, log and output files being created when a job has been accepted, and when the generated result files arrive).
- Somewhat increased security as the user only has (indirect) access to a small subset of the computational resource’s commands.
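The submit-node side of such a scheme can be imagined as a small polling loop: watch the shared folder, and hand any newly dropped job bundle to the batch system. The sketch below is purely illustrative and not the real DropAndCompute code; the folder layout, the "a `.zip` file means a job" convention and the bare `condor_submit` call are all assumptions made for the example.

```python
import os
import subprocess
import time

def new_jobs(previously_seen, current_entries):
    """Return the job bundles that have appeared since the last poll."""
    return sorted(set(current_entries) - set(previously_seen))

def watch(drop_folder, poll_seconds=5):
    """Poll a shared folder and hand each newly dropped bundle to Condor."""
    seen = set()
    while True:
        entries = [f for f in os.listdir(drop_folder) if f.endswith(".zip")]
        for job in new_jobs(seen, entries):
            # The real system also unpacks the bundle, triggers the pop-up
            # notifications and copies results back; this just shows the
            # shape of the submission step.
            subprocess.run(["condor_submit", os.path.join(drop_folder, job)])
            seen.add(job)
        time.sleep(poll_seconds)
```

Because the synchronisation is done by Dropbox (or a mounted folder), the user never sees any of this; they only ever drag files in and watch results appear.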
The first version was used on a Condor Pool within our interdisciplinary biocentre (MIB). A video of it in use is shown below
Please do take the time to look at this video as it shows clearly how, for example, Condor can be used via this type of interface.
This version was notable for using the commercial service Dropbox and, in fact, my being a Dropbox user inspired the approach and its name. Dropbox is trivial to install on any of the main platforms, on any number of computers owned by a user, and has a free version giving 2GB of synchronised and shared storage. In theory, only the computational resource supplier need pay for a 100GB account with Dropbox, have a local Condor submitting account, and share folders out with users of the free Dropbox-based service.
David De Roure, then at the University of Southampton and now at Oxford, reviewed this approach at blog.openwetware.org/deroure/?p=97, offering his view as to why it is important in helping scientists start on the ‘ramp’ to using what can be daunting, if powerful, computational facilities.
Quickly the approach migrated to our full, faculty-wide Condor Pool and the first modification was made. Now we used separate accounts for each user of the service on our submitting nodes; Dropbox still made this sharing scheme trivial to set up and manage, whilst giving us much better usage accounting information. The first minor problem came when some users needed more (much more, in fact) than 2GB of space. This was solved by them purchasing their own 50GB or 100GB accounts from Dropbox.
Problems and objections
However, two more serious problems impacted our Dropbox-based approach. First, the large volume of network traffic across the world to Dropbox’s USA-based servers and then back down to local machines here in Manchester resulted in severe bottlenecks once our Condor Pool had reached the dizzy heights of over a thousand processor cores. We could have ameliorated this with extra resources, such as multiple submit nodes, but the second problem proved to be more of a showstopper.
Since the introduction of DropAndCompute several people, at Manchester and beyond, have been concerned about research data passing through commercial, USA-based servers. In fact, the UK’s National Grid Service (NGS), who have implemented their own flavour of DropAndCompute, did not use Dropbox for this very reason. The US Patriot Act means that US companies must surrender any data they hold if officially requested to do so by Federal Government agencies. One approach to this is to do user-level encryption of the data before it enters the user’s Dropbox folder. I have demonstrated this approach, but it complicates the model and it is not so straightforward to use exactly the same method on all of the popular platforms (Windows, Mac, Linux).
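To make the idea of user-level encryption concrete, here is a tiny, illustrative sketch of encrypting data before it ever touches the synchronised folder. This is not the demonstrated implementation, and the hash-based stream cipher below is for illustration only (it provides no authentication); a real deployment would use a vetted tool such as GnuPG or a maintained cryptography library.

```python
import hashlib
import secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """SHA-256 in counter mode: hash key+nonce+counter until enough bytes."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Prepend a fresh random nonce and XOR the data with the keystream."""
    nonce = secrets.token_bytes(16)
    stream = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ s for p, s in zip(plaintext, stream))

def decrypt(key: bytes, blob: bytes) -> bytes:
    """Split off the nonce and reverse the XOR."""
    nonce, ciphertext = blob[:16], blob[16:]
    stream = _keystream(key, nonce, len(ciphertext))
    return bytes(c ^ s for c, s in zip(ciphertext, stream))
```

The point of the sketch is the placement of the step, not the cipher: the job data is opaque before Dropbox sees it, and only the user and the submit node (which share the key) can recover it.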
To tackle the above issues we implemented a ‘local version’ of DropAndCompute that is not Dropbox based. It is similar to the NGS approach but, in my opinion, much simpler to set up. The user merely has to mount a folder on the submit node on their local computer(s), and then use the same drag-and-drop approach to get the job initiated, debugged and run (or even killed, when necessary). This solves the above issues, but could be regarded as inferior to the Dropbox-based approach in five ways:
1. The convenience and transparency of ‘offline’ use. That is, Dropbox jobs can be prepared on, say, a laptop with or without net access, and when the laptop next connects the job submission just happens. Ditto for the results coming back.
2. When online and submitting or waiting for results with the local version, the folder windows do not update to give the user an indication of progress.
3. Users must remember to request an email notification that a job has finished, or poll to check its status.
4. The initial setup is a little harder for the local version compared with using Dropbox.
5. The computation’s result files are not copied back automatically.
So far, only item 5 has been remarked on by some of our users, and it, and the others, could be improved with some programming effort.
A movie of this version is shown below; it doesn’t have any commentary, but essentially follows the same steps as the Dropbox-based video. You will see the network folder’s window having to be refreshed manually (this is necessary on a Mac, but could be scripted; other platforms may be better) and results having to be dragged back from the mounted folder.
I welcome comments on any aspect of this (still evolving) approach to easing the entry ‘cost’ of using distributed computing resources.
Our Condor Pool is supported by three colleagues besides myself: Mark Whidby, Mike Croucher and Chris Paul. Mark, inter alia, maintains the current version of DropAndCompute that can operate locally or via Dropbox. Thanks also to Mike for letting me be a guest on Walking Randomly.
A couple of years ago I wrote an article called Christmas gifts for math geeks and it has proven to be quite popular so I decided to write a follow up. As I started thinking about what I might include, however, I started to realise that I had produced a list for science geeks instead. So, here it is – my recommendations for gifts for the scientist in your life.
Mathematica 8 Home Edition – This is the full version of Mathematica, possibly my favourite piece of mathematical computer software, at the extremely low price of 195 pounds + VAT. I know what you are thinking: ‘Over 200 quid is not an extremely low price’ and I would tend to agree. It is, however, very good value since a commercial licence costs several thousand pounds and Mathematica is as good as MATLAB with a whole slew of toolboxes. Mathematica is possibly the most feature-complete piece of mathematical software available today and is infinitely better than any dedicated graphical calculator.
Bigtrak – I don’t have a Bigtrak but I used to have one back in the 1980s. Is the science geek in your life into computers and 30-40 years old? If so then there is a distinct possibility that their first foray into the world of computer programming was with a Bigtrak back when they were 8 or so – I mean, this thing can even do loops! This isn’t identical to the original but it is a very close facsimile and would be great for budding computer nerds or their misty-eyed old dad.
200-in-1 electronic project lab – This one brings back fond memories for me since it was given to me for my 10th birthday and is probably the reason I studied physics at A-Level, since A-Level physics included the study of basic electronics. I did well in A-Level physics and enjoyed it, so I chose theoretical physics for my degree, later moving on to a PhD, so you could argue that this piece of kit changed my life!
I was overjoyed when I discovered that it was still being sold and was immensely pleased when I received it as a birthday present once again when I was 28.
The first thing you need to know about this wonderful piece of kit is that it requires no soldering; you wire up all of the components using bendy little springs – nothing could be simpler. There is also no need to be able to read schematic diagrams (although this can be a great way to learn how to) since each spring is numbered, so producing your own AM radio transmitter can be as simple as joining spring 1 to spring 10 to spring 53 and so on.
The practical upshot of all of this is that you can approach this thing at a variety of levels. In the first instance you can just have fun building and playing with the various circuits which include things like a crystal set radio, a Morse code transmitter, a light detector, a sound detector and basic electronic games. Once you’ve got that out of your system you can start to learn the basics of electronics if you wish.
Wonders of the Solar System – I have always loved (although never practised) astronomy and avidly followed the adventures of Voyagers 1 and 2 when I was small. Since then, modern space probes such as Cassini-Huygens, Galileo and Mars Odyssey have added more to our knowledge of our astronomical backyard and we now know a tremendous amount about the solar system. In this series, Brian Cox of the University of Manchester takes us on a grand-tour around the solar system. The imagery is fantastic, Cox’s enthusiasm is infectious and the science is awesome. Yep, I quite like this DVD :)
2011 ‘Lightning Calculation’ calendar – Ron Doerfler writes a blog called Dead Reckonings that specialises in the lost arts of the mathematical sciences. Last year he designed a 2010 Graphical Computing calendar and made the designs available for free to allow you to print your own. Centred around ancient computing devices called nomograms, the calendar was beautiful and after Ron very kindly sent me a copy, I encouraged him to make a version that he could sell. Well, I guess he took my advice because Ron is back with a 2011 calendar with the theme of ‘Lightning Calculations’ and this time he is selling it from Lulu.com.
Since Ron is an all round nice guy, he also offers a high resolution pdf of the calendar to allow you to print it off yourself but personally I plan on showing my support by putting an order in with Lulu.com. Nice work Ron!
May 2009 saw the launch of STS-125, a NASA Space Shuttle mission which conducted a set of upgrades on the Hubble Space Telescope. As well as making the space telescope more powerful than it has ever been before, the upgrades included repairs and replacements that would help Hubble last until at least 2014.
As of today we can see some of the results of all this work since NASA has released the first set of pictures from the upgraded system including the one below which is of NGC 6302 or, to give it its more snappy title, the Butterfly Nebula. Head over to NASA’s website for the full story along with high resolution images.
Some discoveries in science and mathematics can be rather technical to say the least, understood and appreciated by relatively few people who have worked for years in some esoteric subject. Other discoveries can touch the imagination of pretty much every one of us. While reading through a collection of news articles on the train this morning I came across one of the latter, which caused me to become so excited that I temporarily forgot that I was British.
I turned to the random woman sat next to me and said in an excited voice “They’ve found water on Mars – beyond a shadow of doubt. They’ve as good as tasted it. Look…” and I presented my phone to her with the news article on it. To be fair to her she did make a show of reading it and nodded in encouragement before moving to a seat far, far away from me.
Over the last few years scientists have tantalised us with various snippets of evidence that point to the possibility of there being water on Mars. For example, back in 2002, the NASA space probe Odyssey discovered large quantities of hydrogen under the surface of Mars via gamma ray spectrometry. From this data it was inferred that there might be large quantities of water under the Martian surface, but no one could be totally sure until a sample of it was scooped up and tested in a laboratory. More information on the history of the discovery of water on Mars can be found at this link.
Recently, NASA sent a laboratory to Mars in the form of the Phoenix Mars Lander which scooped up a sample of Martian soil, tested it and confirmed that it did indeed contain water. So now it’s official – Mars has water beyond any doubt which is a very cool scientific fact in my humble opinion.
This news has been spreading through the web like wildfire – and rightly so – but before we get too carried away we need to take a step back and think about what we actually know. (Take the following with a handful of salt – I am not a planetary scientist and, therefore, may be talking rubbish, but I have included the links that led me to these statements.)
They have found water but they do not know its chemical composition yet. It may need a lot of processing before it’s drinkable due to a high concentration of salt and acid or it may be as pure as (Earth) snow – we simply don’t know. The answer to this question is important if humans are ever going to visit Mars – they’ll need a drink!
NASA have not detected signs of life (check out their Twitter feed to see what they have to say about this) but the presence of water makes it more statistically likely. That’s more likely, not almost certain. If I buy 2 lottery tickets this week then I am more likely to win compared to only buying one (twice as likely in fact – unless I am stupid and choose the same numbers twice!) but I still probably won’t win. Little green men there are not!
Oh, but imagine if we did discover life there – how wonderful would it be? We have to face facts – any life we might find would be no more complicated than viruses or bacteria, but even that would almost certainly open up a whole new field of biology. They might be completely different to anything we have on Earth, with different biochemistry and everything…or they could look very similar to life here. Who knows? It’s going to be fun finding out though.
For more information – check out Popular Mechanics.
We live in interesting times.
Here’s one for pub quiz fans – name the only planet that has not yet been visited by a man-made probe. Some of you will say ‘Pluto’ and this would have been the correct answer back on 19th January 2006 when the NASA space probe, New Horizons, was launched.
Two years later and, according to some astronomers, this is no longer true because the definition of a planet was altered in 2006 and Pluto no longer fits the bill. Pluto is now officially known as a dwarf planet along with the asteroid, Ceres, and the Kuiper belt object, Eris. From what I can gather there is still a lot of controversy about this ruling among the astronomical community with lots of people arguing over which balls of rock we should designate as planets and which we should not.
A wise man once said “You can know the name of a bird in all the languages of the world, but when you’re finished, you’ll know absolutely nothing whatever about the bird… So let’s look at the bird and see what it’s doing — that’s what counts. I learned very early the difference between knowing the name of something and knowing something.”
Damned right! Whatever you choose to designate Pluto as, it is something we know very little about, and the New Horizons team are doing something about that – and THAT is what is important here. The probe won’t reach Pluto until July 2015 – over 9 years after it was launched – and yet it is traveling very quickly. As it left Earth orbit it was doing something like 35,800 miles per hour and, thanks to a gravity assist from Jupiter, it is now going even faster at over 50,000 mph.
When it gets there it will do things like map the surface composition of Pluto and its largest moon, Charon. It will also look at the composition of Pluto’s tenuous atmosphere, map the surface temperature and look for rings around Pluto, along with various other things. Our level of knowledge about the Pluto-Charon system will increase by orders of magnitude, which will hopefully lead to even more interesting questions for future missions to work on.
For me it does not matter what you choose to call Pluto – what matters is that in a few years’ time we are going to know a LOT more about that enigmatic little ball of rock, which is so far away that it ties my mind in knots just trying to visualize it. Happy birthday New Horizons – I wish you the best of luck.
A combination of 2 New Horizons images taken on March 2 2007 of the Jovian moons Io and Europa. The original source is here.