There's normally some useful industry news on the blog page, but today it's about ill-advised, time-consuming home DIY projects. On the plus side, it's not about putting up shelves or getting a horizontal row of tiles.
Last year, I joined the group of about 10 million people who have bought a Raspberry Pi.
The Raspberry Pi is a range of UK-designed, low-cost computers originally created for school and educational projects. They also make the perfect platform for botchy (or elegant) home DIY computing projects. So, the first thing you do when your Pi arrives is scratch your head and try to decide what to do with it.
There are lots of ‘Amazing Things to do with a Pi’ listicles on the web, but nothing caught my eye until I came across videos and pictures of polar plotters. These humble, often scrappy homemade devices are used to draw pen-on-paper images at large scale. They ‘draw’ using a pen suspended between two motors: as the two motors move, the pen is dragged across the wall's surface and leaves a line.
This looked like an interesting caper to try, and I had a dull weekend in February to fill, so I thought I’d have a go.
Several months later, the idea of making something useful in a weekend was long gone, but I’d got a prototype running. Here’s a video of it doing its thing.
How is the drawing made?
It all starts with a picture or a photo, like this one of the moon that I got on a (very) cold November night a couple of years ago.
I reduce the image size so that it's simpler to process and to suit the rendering algorithms. Most of the images I use are no more than 100 pixels wide but get drawn out 50cm wide. The image is heavily pixelated, but once it's drawn out, your brain fills in all the details of the drawing.
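The shrinking step can be pictured as a simple box filter: average each block of pixels down to a single value. This is only an illustration of the idea (any image library's resize does the same job); the function name and pure-list representation are my own, not the plotter's actual code.

```python
def downsample(gray, factor):
    """Shrink a 2D list of grayscale values (0-255) by averaging each
    factor x factor block of pixels down to one output pixel."""
    h, w = len(gray), len(gray[0])
    out = []
    for by in range(0, h - factor + 1, factor):
        row = []
        for bx in range(0, w - factor + 1, factor):
            block = [gray[by + dy][bx + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) // len(block))
        out.append(row)
    return out
```

Even after this, a 100-pixel-wide image holds plenty of information once your brain gets to work on the finished 50 cm drawing.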
Simplified image ready for the plotter
The image is passed to the polar plotter, a home-built device cobbled together from the following parts, some nifty but mainly naff:
- a computer (raspberry pi)
- motor controller circuits
- stepper motors
- a servo to lift the pen off the paper
- a 'print head' to hold a pen
The Pi (see below) runs some Python software that I wrote, which interprets the darkness of the picture at each point (pixel) and then decides what pattern to draw on the paper to represent that light level.
The computational powerhouse, er... a Pi B+. Note attractive black ribbon cable which carries the signals for the motor drivers/pen lift.
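That decision boils down to a quantisation step: map each pixel's brightness onto one of a handful of tone levels, each with its own drawing pattern. A minimal sketch of the idea (the function name and the integer maths are my illustration, not the plotter's real code):

```python
def tone_level(brightness, levels=5):
    """Map a grayscale value (0 = black, 255 = white) onto one of
    `levels` discrete tones; 0 is darkest, levels - 1 is white."""
    return min(brightness * levels // 256, levels - 1)
```

Each tone level then selects a pre-defined pattern of pen strokes, denser for the darker levels.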
That pattern is then translated into an amount of motor movement and an instruction to make the stepper motors move. The Pi has an interface that takes these signals and passes them on to the motor driver circuits.
The motor driver electronics amplify the signal from the computer, and the boosted signal then drives the stepper motors, one of which is shown in the video below. The motors are capable of very precise movement. This is the right-hand motor in situ, clamped in place at the top of the plotter. The belt running over the top has a counterweight on one end (the right) and the 'print head' on the left.
One of the stepper motors, note the toothed belt (which can't slip)
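Working out "an amount of motor movement" is just triangle geometry: with the two motors a fixed distance apart along the top, each cord length is the hypotenuse from its motor down to the pen. A hedged sketch of the sums (the spacing, the steps-per-millimetre figure and the function names are all assumptions for illustration, not the plotter's real values):

```python
import math

MOTOR_SPACING_MM = 600  # assumed distance between the two motor shafts
STEPS_PER_MM = 5.0      # assumed motor steps per millimetre of belt travel

def cord_lengths(x, y):
    """Cord lengths for a pen at (x, y), with the left motor at (0, 0)
    and the right motor at (MOTOR_SPACING_MM, 0); y increases downwards."""
    left = math.hypot(x, y)
    right = math.hypot(MOTOR_SPACING_MM - x, y)
    return left, right

def steps_between(p_from, p_to):
    """Whole steps each motor must turn to move the pen between two points
    (positive = pay out cord, negative = wind it in)."""
    l0, r0 = cord_lengths(*p_from)
    l1, r1 = cord_lengths(*p_to)
    return round((l1 - l0) * STEPS_PER_MM), round((r1 - r0) * STEPS_PER_MM)
```

Because the toothed belt can't slip, counting steps is enough to know exactly where the pen is; there's no position feedback at all.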
Suspended between the two motors is the print head, made out of a new 3D modelling material I've been prototyping called cardboard. An old coat hanger and some Velcro were also used.
The 'print head'
Viewed closely, the drawing machine makes patterns with varying density that don't look like much.
When viewed from a distance, the eye interprets the density of these patterns as a whole image.
The finished plot
A lot of the early pictures I made use a regular (repeatable) rendering method - you can see this in the image above. If the picture were replotted, it would always be rendered using the same lines. It also means there are distinct image density levels - in the moon, there are five different tones, from black through to white.
My later work uses a random pen motion based on Brownian motion. The pen point is guided around the image, but moves to each new point entirely at random. Up close this looks like a chaotic squiggle, but from a distance of a couple of meters, the human eye (and brain) makes order from the chaos, perceiving countless shades and a smoother, less mechanical image.
This render method makes each image unique. If the same image was re-plotted a million times, a different inky line would be drawn each time.
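One way to picture the random walk: propose a random neighbouring pixel, then accept the move with a probability tied to how dark the image is there, so the pen lingers and scribbles most in the dark regions. This is my own toy version of the idea, with made-up names and acceptance rule; it's not the plotter's actual rendering code.

```python
import random

def brownian_path(gray, n_moves, seed=0):
    """Wander a pen over a 2D list of grayscale values (0 = black,
    255 = white), biased so darker pixels attract more of the line."""
    rng = random.Random(seed)
    h, w = len(gray), len(gray[0])
    x, y = w // 2, h // 2          # start in the middle of the image
    path = [(x, y)]
    for _ in range(n_moves):
        # propose a random neighbouring pixel, clamped to the image edges
        nx = min(w - 1, max(0, x + rng.choice((-1, 0, 1))))
        ny = min(h - 1, max(0, y + rng.choice((-1, 0, 1))))
        # accept the move with probability proportional to darkness there
        if rng.random() * 255 < 255 - gray[ny][nx]:
            x, y = nx, ny
            path.append((x, y))
    return path
```

A fixed seed makes one particular walk repeatable, but the real plots use fresh randomness each time, which is why no two are ever the same.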
Below is an image drawn using this more chaotic sequence, and if you want to watch its creation, follow this link to a time-lapse video of the plotter confusing people in a local art gallery.
Different rendering techniques
This image was drawn from a single line that's 213.704 meters long.
There really are loads of things you can do with a Raspberry Pi. I avoided projects where you’re creating a device that can be cheaply bought from Maplin (e.g. ‘make a wi-fi router’). I found it more interesting to try to make something that you can’t buy. Drawing machines are a good example of that, and there’s plenty of inspiration out there. Here are some interesting links if you’ve got the time or inclination: