Fig 1. One of the two system cabinets.


The Greyhound Tracking System automatically tracked 6 dogs and the mechanical hare, and it detected race launches by sensing the opening of the starting cage. These coordinates were streamed across the Internet in real time so that pundits could watch the racing on their PCs.

Between December 2003 and April 2004 I built a live tracking system for greyhound (dog) racing in Swindon. The system used 52 low-grade video cameras distributed evenly around the track, and processing their feeds required up to nine Pentium 4-class computers housed in two cabinets (you can imagine me looking like a bit of a mad scientist with all this computing gear at the race course). To my knowledge, this was the first sport tracking system to track all participants in real time and stream the data over the Internet directly to end users.

Fig 2. Example camera input – due to the speed of the dogs, one would typically get only 2–3 usable frames per camera at 25 FPS. The low image quality increased the computational effort of accurately identifying vests.

Key lessons of this project:

  • Simplification of the problem is key – in this case I found it computationally advantageous to re-project the race course into a straight line (and perform the necessary physics & predictions in this simplified space).
  • Certain materials are near-invisible to low-end cameras equipped with IR sensors.
  • Keep your tracking cameras away from floodlights and other light sources, as these will attract moths at night.
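The first lesson above can be made concrete. The article doesn't show the original implementation, but "unrolling" a curved course amounts to mapping a world-space position onto a (distance along centreline, lateral offset) pair. A minimal sketch, assuming the centreline is sampled as a polyline of waypoints (all names and the waypoint representation are illustrative):

```python
import math

def unroll(point, centreline):
    """Map a 2D point to (arc_length, lateral_offset) relative to a
    polyline centreline given as a list of (x, y) waypoints.
    Hypothetical sketch, not the original implementation."""
    px, py = point
    best = None
    s_start = 0.0  # arc length at the start of the current segment
    for (ax, ay), (bx, by) in zip(centreline, centreline[1:]):
        dx, dy = bx - ax, by - ay
        seg_len = math.hypot(dx, dy)
        # Parameter t of the point's projection onto this segment, clamped
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len**2))
        cx, cy = ax + t * dx, ay + t * dy    # closest point on the segment
        dist = math.hypot(px - cx, py - cy)  # unsigned lateral distance
        if best is None or dist < best[0]:
            # Signed offset: positive to the left of the travel direction
            side = (dx * (py - ay) - dy * (px - ax)) / seg_len
            best = (dist, s_start + t * seg_len, side)
        s_start += seg_len
    return best[1], best[2]
```

Once every dog is expressed in these straightened coordinates, physics and overtaking predictions reduce to essentially one-dimensional problems.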

As with the old Speedway system I had built a year prior, one could these days recreate a system like this for a fraction of the cost.


Fig 3. Track camera video on the left; on the right, the virtual output viewers experienced on their home PCs. The software application allowed home viewers to choose from 4 different viewing angles.


  • Nine Pentium 4 class computers
  • 52 QPAL camera feeds in CCTV enclosures
  • Automatic tracking of 6 dogs and 1 mechanical hare
  • Automatic identification of race launches
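The article says race launches were identified from the cage opening, but not how. A minimal hypothetical sketch of one plausible approach: watch a fixed crop of the cage camera and flag a launch when enough pixels change between consecutive frames (the thresholds and the cropped-region idea are my assumptions, not the documented method):

```python
# Assumed thresholds, purely illustrative
LAUNCH_PIXEL_DELTA = 40   # per-pixel grey-level change counted as "movement"
LAUNCH_FRACTION = 0.25    # fraction of the region that must change

def cage_opened(prev_region, curr_region):
    """prev_region/curr_region: same-sized 2D lists of 0-255 grey values
    cropped around the cage doors. Returns True when the frame-to-frame
    change suggests the doors have swung open."""
    changed = total = 0
    for prev_row, curr_row in zip(prev_region, curr_region):
        for p, c in zip(prev_row, curr_row):
            total += 1
            if abs(p - c) > LAUNCH_PIXEL_DELTA:
                changed += 1
    return changed / total >= LAUNCH_FRACTION
```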

The Greyhound Tracking System in Pictures – Clickable Gallery

The Business Case and Outcome

The raison d’être for the project was to enable the viewing of live sports events over the slow dial-up connections that were the norm for British households at the time. With the aid of this technology, the races were recreated virtually by a PC application which rendered the race tracking data in 3D (see the clickable screenshot on the right, or the video in Fig 3 further up). Monetisation would come from bookies taking bets on live racing results.
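The dial-up constraint is less daunting than it sounds: seven runners' positions fit in a tiny per-frame packet. The article doesn't describe the actual wire format, so the following is a hypothetical back-of-the-envelope sketch, packing each runner as two 16-bit quantised coordinates:

```python
import struct

RUNNERS = 7  # 6 dogs + the mechanical hare

def pack_frame(frame_no, positions):
    """positions: list of (along_dm, offset_cm) int pairs, one per runner.
    Illustrative format, not the original protocol."""
    assert len(positions) == RUNNERS
    payload = struct.pack("<I", frame_no)
    for along_dm, offset_cm in positions:
        payload += struct.pack("<Hh", along_dm, offset_cm)
    return payload

def unpack_frame(data):
    (frame_no,) = struct.unpack_from("<I", data, 0)
    positions = [struct.unpack_from("<Hh", data, 4 + 4 * i)
                 for i in range(RUNNERS)]
    return frame_no, positions
```

At 10 updates per second this is (4 + 7 × 4) × 10 = 320 bytes/s before any transport overhead, comfortably inside even a 33.6 kbit/s modem's capacity.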

Ultimately the cost of equipping more stadiums with such systems (£50K per stadium) turned out to be too high relative to the likely return on investment. Domestic broadband was also becoming more widely available, with BT rapidly upgrading its exchanges. As such, no additional systems were ever built.

All was not lost, however. As household broadband penetration increased in the UK I eventually ended up hosting the biggest CDN for British greyhound video streaming for many years to come (thanks to the contacts and relationships established during this pioneering project). That, however, is a story of its own!

The System Technical Details

The overall system architecture can be seen in the diagram above (Fig 4). The 52 input sensors (covering 52 “Zones”) were fed through a controllable Video Matrix which delivered 16 live feeds as output. These were processed across 8 ingest PCs, each equipped with two analog video frame grabbers which turned the analog video signal into bitmaps in PC memory. Each digitised feed was then processed with basic computer vision algorithms to isolate interesting movement, and the resulting regions of interest were handed to more complex high-level algorithms (tracking the dogs, the mechanical hare, and the starting gate). All of this interim “front-end” data was then ingested by a Main Server process, which produced the final 3D “back-end” outputs by fusing the data from the 52 individual zones.
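The per-zone movement-isolation stage can be sketched in miniature. The original per-zone algorithms are not documented, so this is only a minimal frame-differencing example: diff the current frame against the previous one and return a bounding box around the changed pixels as the region of interest (the threshold and function names are assumptions):

```python
MOTION_THRESHOLD = 30  # grey-level change treated as movement (assumed)

def motion_roi(prev_frame, curr_frame):
    """Return a (top, left, bottom, right) bounding box of moving pixels,
    or None if nothing moved. Frames are 2D lists of 0-255 grey values."""
    top = left = None
    bottom = right = -1
    for y, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for x, (p, c) in enumerate(zip(prev_row, curr_row)):
            if abs(p - c) > MOTION_THRESHOLD:
                if top is None:
                    top = y
                left = x if left is None else min(left, x)
                bottom, right = y, max(right, x)
    if top is None:
        return None
    return top, left, bottom, right
```

Only the pixels inside the returned box would then need the expensive high-level processing (vest identification and so on), which is what made the pipeline tractable on 2004-era hardware.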

Admittedly the design was simple, but keep in mind that this was developed in 2004 when much of the capability was limited by hardware, and there were few if any off-the-shelf modules available for solving the computational problems. In essence, I was forced to custom-build all of the software and algorithms.

In the clickable gallery below you can see a bit more of my ‘programmer’s view’ of the project’s development stages: