I am a recent graduate of St Catharine's College, Cambridge, where I studied for a Master's degree in information and bioengineering, particularly focusing on network dynamics within computational neuroscience. This webpage keeps a record of some miscellaneous software and engineering projects that I work on now and then.
My curriculum vitae is available here.
My master's thesis, 'Continuous Attractor Networks with Realistic Neural Dynamics', is available here. If you don't want to read the whole thesis but do want to enjoy my favourite figure, then see the animation below (section 3.4 of my thesis explains it).
This has been a long-running and fairly sophisticated project to implement a "Shazam"-like service for traditional Scottish and Irish music. A service that does this already exists, called "Tunepal", which was completed as a PhD project around 2010. However, FolkFriend works offline, is much faster and more accurate, runs on a greater range of devices (it can even run on a smartwatch), and is completely free; you can see feedback from users on The Session here. These improvements come from a more sophisticated algorithm, which uses advances in signal processing and mobile processing power to transcribe live music in noisy environments (e.g. traditional pub sessions) and search against a database of 40,000 tunes. Everything runs in the browser on the client device, such as a mobile, works entirely without an internet connection (after first load), and processes queries in around 200-1000 milliseconds. A query involves recording the audio samples, performing frequency-domain analysis and filtering, using dynamic programming and probabilistic techniques to infer the sequence of notes, and querying the database with an alignment algorithm - all running on the user's mobile in under a second.
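For a flavour of that final search step, here is a rough Python sketch (not the actual FolkFriend code, which is considerably more involved) of how a transcribed query might be scored against candidate tunes using a dynamic-programming alignment; the note sequences and the database here are just stand-ins.

```python
def alignment_score(query, tune, gap=1, mismatch=1):
    """Smith-Waterman-style local alignment score between two note sequences
    (e.g. lists of MIDI-style pitch numbers). Higher scores mean a better match."""
    prev = [0] * (len(tune) + 1)
    best = 0
    for q in query:
        curr = [0] * (len(tune) + 1)
        for j, t in enumerate(tune, start=1):
            match = 2 if q == t else -mismatch
            curr[j] = max(0,                      # local alignment may restart anywhere
                          prev[j - 1] + match,    # align q with t (match or mismatch)
                          prev[j] - gap,          # gap in the tune
                          curr[j - 1] - gap)      # gap in the query
            best = max(best, curr[j])
        prev = curr
    return best

def search(query, database):
    """Rank every tune (a dict of name -> note sequence) against the query."""
    return sorted(database, key=lambda name: alignment_score(query, database[name]), reverse=True)
```

A local alignment like this tolerates the inevitable transcription errors and lets a short query match anywhere within a longer tune.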
You can read more about this open-source project on the Github page. You can access the service for free at folkfriend.app. I intend to write a series of blog posts during Spring 2022 detailing each of the key steps of FolkFriend, partly to share some of the main problems I've spent years solving, but also to document coherently how the algorithm actually works, in a more accessible format than comments spread throughout source code.
Over the summer of 2018, between parts IA and IB of Cambridge Engineering, I spent several weeks on a coursework assignment which involved numerically integrating Newton's equations of motion to accurately simulate the physics of a 3D game, the skeleton of which was provided in C++. We used the more accurate and sophisticated Verlet integration rather than the Euler integration I had unknowingly used before. I spent almost all of my time on this project working on extensions to the original assignment, and ended up rewriting the entire 3D game to run in the browser using WebGL, with the Three.js library (so I didn't actually write any shaders by hand, unlike with FolkFriend where I had to learn GLSL).
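As a toy illustration of the difference between the two schemes (this is not the coursework code, and the real simulation involves thrust, atmosphere and so on, not just point-mass gravity):

```python
import numpy as np

GM = 1.0  # toy gravitational parameter

def accel(pos):
    """Acceleration due to a point mass at the origin."""
    r = np.linalg.norm(pos)
    return -GM * pos / r**3

def euler_step(pos, vel, dt):
    """Explicit Euler: first order, and energy drifts badly over long runs."""
    return pos + vel * dt, vel + accel(pos) * dt

def verlet_step(pos, vel, dt):
    """Velocity Verlet: second order and symplectic, so orbits stay bounded."""
    a0 = accel(pos)
    new_pos = pos + vel * dt + 0.5 * a0 * dt**2
    new_vel = vel + 0.5 * (a0 + accel(new_pos)) * dt
    return new_pos, new_vel
```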
This assignment also introduced basic control theory, with a simple PD controller forming part of the task. There was an open-ended task to optimise the parameters of this controller, which I tackled using genetic algorithms to optimise for various metrics (minimum fuel usage, minimum peak G-force, minimum time to surface). The resulting models can be tested in the game, which is playable in the browser. The game itself, and a much more in-depth discussion of the problems I overcame to build this project, are on the project page. This project subsequently won that year's Airbus Prize, with the judge describing it as "The best in the 10 year history of the competition". The prize included a guaranteed place on a trip with some other engineering students to Mars, where we saw some rovers.
Just kidding, it's Stevenage not Mars, at the rover test facility. We had a tour of Airbus' facility too, but no photos were allowed due to the sensitivity of their work on UK government and military satellites.
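Returning to the actual engineering: for anyone curious what 'optimising a PD controller with a genetic algorithm' looks like in practice, here is a heavily simplified Python sketch. This is not the assignment code: `simulate_descent` is a hypothetical stand-in for the physics simulation, and the fitness function just illustrates one way of trading fuel usage off against peak deceleration.

```python
import random

def pd_throttle(error, error_rate, Kp, Kd):
    """Simple PD law: throttle proportional to the error and its rate of change,
    clipped to the physically possible range [0, 1]."""
    return min(1.0, max(0.0, Kp * error + Kd * error_rate))

def fitness(gains, simulate_descent):
    """Score a (Kp, Kd) pair. `simulate_descent` is assumed to run one landing and
    return (fuel_used, max_g, landed_safely); crashing scores minus infinity."""
    fuel, max_g, safe = simulate_descent(*gains)
    return -(fuel + 10.0 * max_g) if safe else float("-inf")

def evolve(simulate_descent, pop_size=50, generations=100):
    """Bare-bones genetic algorithm over (Kp, Kd) gain pairs."""
    pop = [(random.uniform(0, 5), random.uniform(0, 5)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda g: fitness(g, simulate_descent), reverse=True)
        parents = pop[: pop_size // 4]            # keep the fittest quarter
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            child = tuple((x + y) / 2 + random.gauss(0, 0.1) for x, y in zip(a, b))
            children.append(child)                # crossover by averaging, plus mutation
        pop = parents + children
    return max(pop, key=lambda g: fitness(g, simulate_descent))
```

Swapping the weights inside the fitness function is all it takes to optimise for minimum fuel, minimum peak G-force, or minimum descent time.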
The largest piece of coursework in the 2nd year of engineering at CUED is the Integrated Design Project, an intensely competitive full-time four-week project integrating robotics, electronics and software, in teams of six. I was both team leader and the sole software team member, developing from scratch the computer vision and navigation system for our robot. The only inputs for navigation are the pixels from the camera, and control is achieved by altering the power and direction of each of the two wheels, and via the servo motors controlling the scoop and front scanner. The magnetic sensor for scanning blocks is a separate subsystem built by the electronics team.
The navigation system used image processing and computer vision to extract the locations of the targets and to detect the position of the robot, at 30 fps in real time. This was done using pure numpy with no other image-processing or recognition libraries, as I found that this was the fastest and easiest way to get the level of control I wanted. Our fully-automatic robot won the final competition, scoring 94% of the available marks compared with a mean score of less than 50%. You can view a video of the final run, including the overhead perspective from the image-recognition navigation system, below.
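As a rough illustration of what 'pure numpy' vision can look like (this is not the competition code), the snippet below finds the centroid of a coloured marker by thresholding the frame and averaging the matching pixel coordinates:

```python
import numpy as np

def find_marker(frame, lower, upper):
    """Return the (row, col) centroid of pixels inside an RGB range, or None.

    `frame` is an (H, W, 3) uint8 array; `lower` and `upper` are length-3 RGB bounds."""
    mask = np.all((frame >= lower) & (frame <= upper), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return ys.mean(), xs.mean()
```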
Right at the start of lockdown I had my last supervision of third year, in which there was a discussion about a graph in the lecture notes (taken from a textbook) that was presented confusingly. We all agreed a 3D animation would give a much more intuitive understanding of the concept, and the lecturer (supervisor) suggested that would be a fun extension after the supervision. That afternoon I put together some Python to generate the animations from the equations in the lecture notes, which demonstrate how blood flows in arteries under a periodically varying pressure gradient, and how that varies with the Womersley number, the dimensionless quantity that governs the relative effects of viscosity and inertia.
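For a flavour of the underlying maths, here is a compressed Python sketch (not the code in my repository) that evaluates the standard Womersley solution at a few phases of the pressure cycle; all of the constants are arbitrary toy values rather than physiological ones.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.special import jv  # Bessel function of the first kind, accepts complex arguments

def womersley_profile(r, t, alpha, K=1.0, rho=1.0, omega=2 * np.pi, R=1.0):
    """Axial velocity u(r, t) for oscillatory pipe flow at Womersley number alpha,
    driven by a pressure gradient of the form K * exp(i * omega * t)."""
    i32 = 1j ** 1.5
    u = (K / (1j * rho * omega)) * (1 - jv(0, i32 * alpha * r / R) / jv(0, i32 * alpha))
    return (u * np.exp(1j * omega * t)).real

r = np.linspace(0, 1, 200)
for t in np.linspace(0, 1, 8, endpoint=False):   # eight snapshots over one cycle
    plt.plot(r, womersley_profile(r, t, alpha=10.0))
plt.xlabel("r / R")
plt.ylabel("axial velocity (arbitrary units)")
plt.title("Womersley velocity profiles over one cycle, alpha = 10")
plt.show()
```

At low Womersley number the profile stays close to the familiar parabola; at high values the core flow lags the pressure gradient and the peak velocity shifts towards the walls (the annular effect).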
These animations are great fun to watch, and the code to generate them is on my Github profile. One such animation is shown below.
Whilst working as a software developer on my gap year I had some fun with electronics projects. I thought it'd be fun to build a CNC machine of some description, and had seen that other people online had done so with scrap. After a trip to my local recycling centre, I salvaged some old e-waste PCs to get a few disc drives and a floppy disc reader. I then resoldered and reprogrammed the stepper-motor-driven linear drives to allow precise, programmatic control of their movement. By mounting two of these linear drives perpendicular to each other, I was able to accurately control position along both the X and Y axes. By attaching a pen to one linear drive, I was able to start using this simple 'plotter' to draw some basic shapes.
In the early days of this plotter, I programmed shapes by typing numbers into the code by hand, sometimes drawing out the shapes manually and using a ruler to measure the dimensions I was trying to replicate. The quality of this plotter was very poor, but given it had been built entirely from parts found at a local tip, plus some Lego, with many hacks to get things working, I was pleased to get any sort of result at all.
The four-phase stepper motor is a common sight in consumer disc drives; to date I've probably disassembled a dozen, and all but one have used a stepper motor combined with a worm gear to slide the carriage back and forth. To control them myself, I resoldered new wires onto the motors and used an L293D dual H-bridge driver to deliver the sequence of pulses for each axis.
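The control side is conceptually simple: step the motor by energising the coils in a repeating sequence. The sketch below is illustrative rather than my original script - the pin numbers are made up, the L293D enable pins are assumed to be tied high, and the exact coil wiring is glossed over.

```python
import time
import RPi.GPIO as GPIO

PINS = [17, 18, 27, 22]        # GPIO pins wired to the four L293D inputs (made-up numbers)
SEQUENCE = [                   # wave-drive sequence: one coil energised at a time
    (1, 0, 0, 0),
    (0, 1, 0, 0),
    (0, 0, 1, 0),
    (0, 0, 0, 1),
]

GPIO.setmode(GPIO.BCM)
for pin in PINS:
    GPIO.setup(pin, GPIO.OUT)

phase = 0

def step(n, delay=0.002):
    """Advance the axis by n steps; negative n reverses the direction."""
    global phase
    direction = 1 if n >= 0 else -1
    for _ in range(abs(n)):
        phase = (phase + direction) % 4
        for pin, level in zip(PINS, SEQUENCE[phase]):
            GPIO.output(pin, level)
        time.sleep(delay)      # the delay sets the step rate, i.e. the carriage speed
```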
By writing some very basic drawing software in Java, which exports a G-code-like file readable by a control script on the Raspberry Pi, I was able to turn these linear drives into a simple but reasonably accurate CNC drawing machine.
One of the first plots I tried was a basic sinusoid, to test how well the CNC handled both curves and straight lines. I later made a standalone Python script that could process and plot monochrome bitmap images directly, which allowed the printing of more complex images; I tested this by printing off several crests and emblems.
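The bitmap plotting boils down to something like the following sketch (not the actual script): scan the image row by row, alternating direction, and dot each dark pixel. Here `move_to`, `pen_down` and `pen_up` are placeholders for whatever drives the hardware.

```python
def plot_bitmap(image, move_to, pen_down, pen_up):
    """`image` is a 2D boolean array (nested lists work too); True pixels get drawn."""
    for y, row in enumerate(image):
        xs = range(len(row)) if y % 2 == 0 else reversed(range(len(row)))
        for x in xs:
            if row[x]:
                move_to(x, y)   # travel to the pixel with the pen raised
                pen_down()
                pen_up()
    # A fuller implementation would keep the pen down across runs of adjacent
    # dark pixels rather than dotting each one individually.
```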
Although my first plotter worked very well, I had essentially rebuilt a printer with a far more convoluted and less effective design, and whilst a fun gimmick it had limited practical use. I felt the ink pen was limiting the possibilities of the design, so I decided to upgrade the pen to a laser. For around £25 I purchased a 200mW 650nm laser module - the diode was housed in a heatsink and had focusable optics - which I powered with an ATX power supply salvaged from the same discarded PCs as the disc drives. I managed to get some great results burning away into some old cardboard, although the laser had no problem scorching wood either.
The final step for this small-scale CNC was to modulate the laser using PWM, and to vary the speed of the lower tray, so that the intensity of the burn could be controlled. This allowed the processing of greyscale bitmaps, culminating in a very nice laser-engraved cardboard Mona Lisa.
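The PWM idea itself is tiny; as an illustration (not my actual control code - the pin number and frequency are made up), darker pixels simply map to a higher duty cycle:

```python
import RPi.GPIO as GPIO

LASER_PIN = 12                      # made-up pin number
GPIO.setmode(GPIO.BCM)
GPIO.setup(LASER_PIN, GPIO.OUT)
laser = GPIO.PWM(LASER_PIN, 1000)   # 1 kHz software PWM
laser.start(0)                      # begin with the laser off

def set_burn(pixel_value):
    """Map an 8-bit greyscale value (0 = black, 255 = white) to laser power:
    darker pixels get a higher duty cycle and therefore a deeper burn."""
    laser.ChangeDutyCycle(100.0 * (255 - pixel_value) / 255)
```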
For a time-lapse of the laser CNC printing out the Mona Lisa, and some footage of how the CNC works, please see the videos below.
After the success of the small-scale CNCs, I set to work on building a full-scale version capable of wielding a rotary tool to engrave and/or cut a wider range of materials. This next iteration was also built from scrap, this time mostly using discarded printers instead of disc drives. The whole CNC was extremely economical, using almost entirely recycled parts; my only purchases were the rotary tool itself (a cheap Dremel knockoff from Amazon for £11), three stepper motors (£1.75 each), 20 ball bearings (£3-4), some threaded rods (£2), and miscellaneous nuts and bolts. I also built a much more sophisticated control script, which runs in Python and interprets G-code files to drive the CNC accordingly.
The (outdated, see below) code for the G-code interpreter can be found here.
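For a rough idea of what a G-code interpreter has to do, here is a bare-bones sketch of the parsing step; the real interpreter linked above handles far more of the spec, and the steps-per-millimetre constant here is made up.

```python
STEPS_PER_MM = 50                        # made-up calibration constant

def parse_gcode(lines):
    """Yield (x, y, z) step targets from the linear-move (G0/G1) commands in a G-code file."""
    position = {"X": 0.0, "Y": 0.0, "Z": 0.0}
    for line in lines:
        line = line.split(";")[0].strip()                    # drop comments and whitespace
        tokens = line.split()
        if not tokens or tokens[0].upper() not in {"G0", "G00", "G1", "G01"}:
            continue
        for word in tokens[1:]:
            axis, value = word[0].upper(), word[1:]
            if axis in position:
                position[axis] = float(value)                # absolute coordinates in mm
        yield tuple(round(position[a] * STEPS_PER_MM) for a in "XYZ")
```

Each yielded target can then be handed to the stepper-driving code, which works out how many steps each axis needs to move and in which direction.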
As can be seen from the video above, the scrap-based full-scale CNC was functional, but had several systemic problems due to its bare-bones design. As such I decided to up the ante a little: after buying some rectangular metal bars to use as rails, wooden planks for the frame, and a few other miscellaneous bits and bobs, I was able to put together a more sophisticated CNC that was not limited by working with scrap materials. Many, many failed designs, ball bearings, and trips to B&Q later, I eventually came up with a working design, which I replicated on each axis to build a functional large-scale milling machine.
The much-updated and far more sophisticated Python program to control the full-scale CNC can be found here.
My first full scale tests of the new CNC are now on Youtube and they worked fantastically! More to follow in the next couple of weeks...
I have been building hybrid rocket engines from time to time for several years now, all of which have been mounted statically and tested in place; I am more interested in the engineering of the engine than in actually launching metal projectiles into the sky above residential Edinburgh, which, for obvious reasons, isn't a good idea. I started off small, using simple aluminium tubes loaded with polythene as fuel, with gaseous oxygen as oxidiser, supplied from a small oxygen tank designed for use with a welding torch. There are several videos of these early burns on my Youtube channel.
My latest hybrid rocket iteration is a much larger motor, with a 30mm diameter, 1.5mm thick mild steel tube making up the main body of the rocket, and a stainless steel hip flask and kitchen funnels providing the convergent/divergent shape for the all-important nozzle. The nozzle assembly is sealed together using exhaust paste, designed for sealing up cracks in car exhausts at high temperatures, and my educated guess is that the key areas (mostly around the nozzle) that get hottest during a burn should be able to withstand temperatures of around 1000°C. This design may seem excessive, but after previous rockets failed due to the aluminium body melting, or softening to the point of structural failure, these measures are likely to be necessary.
I have only conducted one full test with this larger rocket to date, and sadly the paraffin wax I was using as fuel had such a low regression rate that it burned far too slowly to produce any serious thrust, even with high-energy fuel additives in the form of powdered aluminium and/or magnesium. Rather underwhelmingly, the fuel just melted, and wax poured out of the nozzle before it could burn. However, I have now received a delivery of pelleted polythene fuel, which I expect will burn far more vigorously (the granulated nature of the fuel should make a big difference, due to the increased surface area).
Before I run the burn with the plastic fuel, I plan on setting up a proper 'newtonmeter' to measure the thrust produced by the rocket. Purpose-built force meters are very expensive and offer far more precision than I need, so for a mere £4 I purchased a very basic set of scales, which I plan to mount so that the rocket thrusts down onto them and gives a measurable reading. In future rocket tests I will set up a camera to record the analogue reading from the scales, and extract quantitative values from the footage afterwards.
I built a small solar-powered, Arduino-based weather station to log temperature, humidity, and air pressure. Once stored, the data provided a live feed to this website, although that feed is no longer running.
Here is an image of the first build, about to begin testing:
The components on-board are:
Several issues existed with the weather station. The clear acrylic causes a major greenhouse effect: when it gets humid, the humidity reading is often exaggerated and will frequently max the sensor out at its highest possible value. When the station is in direct sunlight (which is good for the solar panels!), the same issue affects the temperature; on sunny days, large spikes can be seen at the time of day when the sun shines directly onto the weather station. The location of the station is also limited, as the low power of the transceiver requires reasonable proximity to the 'base' transceiver, attached to a Raspberry Pi indoors.
Long before I undertook the CUED IA Mars Lander project, I had actually forayed into numerical methods in high school without really realising it, or in fact having any idea what I was doing. Having taught myself Java from programming books, I decided to take Advanced Higher Computing Science as a crash subject in my final year at school, thinking it would strengthen my university application. The coursework for the subject constituted 60% of the final mark, and was by far the largest and most sophisticated programming project I had done at the time. It consisted of an orbital-mechanics simulation game, in which the player must fly around avoiding gravitational wells and passing through goalposts to complete each level.
The .jar file to run the program, and the source code for the project, can be viewed and downloaded here. I put this together a long time ago when I was just learning to program, but it should still run just fine :)
One of the most interesting aspects of this program was just how realistic the incredibly simple physics simulation was, and I had fun using some basic signal processing techniques to analyse the simulations afterwards - a simple Fourier transform correctly found the periodicity of the orbit, which matched the theoretical value. I had no idea what Euler integration was when I wrote this, but reckoned that approximating derivatives with a very small time step should work fine, which it does (and also doesn't, hence Euler integration's well-known problems, e.g. instability!). If you watch the simulations closely you can observe numerical errors propagating through the simulation as supposedly stable orbits shift over time, but of course this didn't matter for the Advanced Higher qualification, which was just meant to demonstrate basic use of a programming language.
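To give a flavour of that analysis, here is a small Python sketch (the original project was written in Java) that Euler-integrates a near-circular two-body orbit and then recovers the period from the dominant FFT peak; the constants are arbitrary toy values.

```python
import numpy as np

GM, dt, n_steps = 1.0, 1e-3, 100_000
pos = np.array([1.0, 0.0])
vel = np.array([0.0, 1.0])               # roughly circular orbit, so the period should be ~2*pi

xs = np.empty(n_steps)
for i in range(n_steps):
    acc = -GM * pos / np.linalg.norm(pos) ** 3
    pos = pos + vel * dt                  # explicit Euler, the same naive scheme as the game
    vel = vel + acc * dt
    xs[i] = pos[0]

spectrum = np.abs(np.fft.rfft(xs - xs.mean()))
freqs = np.fft.rfftfreq(n_steps, d=dt)
period = 1.0 / freqs[spectrum.argmax()]
print(f"measured period ~ {period:.2f}, theoretical ~ {2 * np.pi:.2f}")
```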
By expanding my code to simulate the gravity of not just one entity (the player) but an arbitrary number of entities, I was able to build a simple N-body simulation. I remember wanting to do this after the detection of gravitational waves by LIGO, as I thought my simple simulation should be able to model clusters of masses collapsing inwards. Some of the results of this more general simulation were very aesthetic; here three clusters of entities collapse into one and coalesce under their own gravity.
I then wrote an Android version of the N-body simulator, published on the Google Play Store, which can replicate the same coalescence simulation above. It is still on Google Play some years later, and the listing can be seen here. I wrote this software when I was fairly new to programming, and may go back and rewrite it at some point, replacing the simplistic O(n²) 'brute force' gravitational computation with a better method. There's also a mistake in the simulation - larger objects are not weighted with the correct mass - which, as one reviewer noted, leads to inaccurate results. I'm still pretty pleased with it though, as I built it in my last year of school with little idea of how either Android apps or numerical methods should be put together.
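For completeness, here is a sketch of what a correct brute-force step looks like, written in Python for readability rather than taken from the app itself, with each body's pull weighted by its mass (the detail the app currently gets wrong):

```python
import numpy as np

def nbody_step(pos, vel, mass, dt, G=1.0, softening=1e-3):
    """One O(n^2) Euler step: pos and vel are (n, 2) float arrays, mass is (n,)."""
    acc = np.zeros_like(pos)
    for i in range(len(pos)):
        for j in range(len(pos)):
            if i == j:
                continue
            diff = pos[j] - pos[i]
            dist = np.sqrt(diff @ diff) + softening   # softening tames close encounters
            acc[i] += G * mass[j] * diff / dist**3    # each pull weighted by the other body's mass
    return pos + vel * dt, vel + acc * dt
```

A rewrite would replace the double loop with something like a Barnes-Hut tree, bringing the cost down from O(n²) towards O(n log n).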