Monday 30th Apr 2012

Lightplot is a robotic 3D light painting system. Animation is exported from 3DS Max, and imported into the Lightplot software which then drives a robotic arm to draw the models in the air. The software also controls a DSLR camera to take long exposure photographs of each frame of animation.

The project grew from early experiments with Lego NXT and robotics. It currently comprises a custom-built robotic arm controlled via Phidgets boards, which are driven by a stand-alone Windows application written in C# on Microsoft .NET. The exporting software is written in Maxscript within Autodesk 3DS Max.


Early experiments
Mechanical construction
Testing & the future
Useful resources

Before I started this project I’d never coded in C# or WPF, nor had I any engineering or electrical experience, so the past few months have been a huge learning curve. The Maker communities and DIY ethos thriving on the web have been incredibly helpful and inspiring. I’ve benefited hugely from others’ project write-ups, so I’ve attempted to document my process and lessons learnt, and to list the resources I’ve relied upon.

Early experiments

The idea of Lightplot sprang from some experiments with a Lego NXT set. I’d had an idea to build a motion control rig for use in a stop motion project. I quickly realised that Lego wasn’t suitable for the task, but continued to experiment. Looking at some light painting images online sparked my imagination, and I began to wonder how I could combine my experience working in animation with a light painting technique.

I bought a laser pointer and built a simple Lego rig to move it about in a pan and tilt arrangement. I decided to check my logic and maths by scripting a prototype in Maya with Python. I also started searching for a suitable file format to read images from, and stumbled across HPGL, a plotter language created to drive old HP pen plotters. The format was perfect: it is organised in a logical order for plotting, it has pen up and pen down commands, and, best of all, it is stored as plain text.

The open source image editing software Inkscape allowed me to trace images and output them in HPGL format. My Python script in Maya then read them in and animated the virtual rig to plot the images.
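The reading step is simple enough to sketch. Here’s a minimal Python parser, in the spirit of the Maya prototype script (this is my own illustrative version, not the original code): it walks the semicolon-separated commands and turns PU (pen up) / PD (pen down) moves into a list of polylines to plot.

```python
# Minimal sketch of parsing HPGL pen commands into polylines.
def parse_hpgl(text):
    """Split HPGL text into pen-down polylines using PU / PD commands."""
    polylines = []   # finished pen-down strokes
    current = []     # points of the stroke currently being drawn
    pos = None       # last known pen position
    for cmd in text.split(';'):
        cmd = cmd.strip()
        if len(cmd) < 2:
            continue
        op, args = cmd[:2].upper(), cmd[2:]
        values = [int(v) for v in args.replace(' ', ',').split(',') if v]
        points = list(zip(values[0::2], values[1::2]))
        if op == 'PU':                    # pen up: close the stroke, just move
            if len(current) > 1:
                polylines.append(current)
            current = []
            if points:
                pos = points[-1]
        elif op == 'PD':                  # pen down: draw from pos through points
            if not current and pos is not None:
                current = [pos]
            current.extend(points)
            if points:
                pos = points[-1]
    if len(current) > 1:
        polylines.append(current)
    return polylines
```

Each returned polyline is one continuous stroke: the rig (virtual or real) sweeps along it with the laser on, then jumps to the next stroke with the laser off.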

I had a virtual prototype, but now I needed to get the Lego rig moving. More web searching led me to the Aforge library, a great collection of code that could control Lego NXT via Bluetooth. Aforge is written in C# and so I set about working my way through tutorials and trying to get up to speed.

I soon had a working Lego robot; however, I came up against a very frustrating wall. The Aforge library communicates with the NXT via Bluetooth direct commands, but NXT direct commands don’t allow for exact positioning of the motors. I spent countless weeks trying to get around this, including a stab at extending Aforge to support Linus Atorf’s motor control software, which allows much more precise control of the motors. Despite excellent help from the Aforge forums, however, I failed to get motor control responding correctly, and began looking for something other than Lego NXT.

At this point I became aware of Phidgets, a collection of boards allowing control of various devices via USB. Phidgets coupled with hobby servos seemed like the answer!


I ordered a Phidgets servo controller and a Lynxmotion pan and tilt kit, along with two Hitec HS422 servos. I also found an eBay store selling small 5 V laser modules that could be run off the mains. The Phidgets board was exceptionally easy to control via C#, and I quickly had a demo up and running. I created an interface and began testing the plotter.

I had hoped to use Canon’s SDK to control my camera, but getting hold of it through official channels proved fruitless. Canon are missing a trick by not making the SDK more widely available, given the number of people out there using their DSLRs for projects like this.

A bit of hacking around with the SDK proved this route wouldn’t work anyway, as my 20D doesn’t support shooting long exposures when controlled via USB. Once again Phidgets came to the rescue: I used a small Phidgets 2/2/2 interface kit and linked it to the camera via its remote trigger port. (More information on hacking the remote trigger port.)

Early tests were pretty exciting, although it was tricky to get the servos to move in perfectly straight diagonal lines, probably because they are analogue servos and not incredibly accurate. It was while struggling with these issues that the idea of a 3D plotter occurred to me. I started to think about how I could expand the idea into three dimensions, and quickly started to lose focus on my troubleshooting.

Laserplot Rig


As my attention drifted toward the 3D plotter idea I began to realise that I’d have to build much of it from scratch, plus coding would be a little trickier. But I couldn’t shake the idea that a 3D plotter would produce much more interesting images.

I shifted my CGI prototyping efforts over to 3DS Max and designed a rig along with some Maxscript code to control it. I also wrote an object exporter. I went with a binary file format based upon RealFlow .bin files, partly because I’d wanted to learn to export RealFlow binary files for a project at work.
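To give a flavour of the binary approach, here’s a small Python sketch of writing and reading a frame of mesh data with the standard `struct` module. The field layout here is made up purely for illustration; it is not the RealFlow .bin specification, nor my actual exporter (which is Maxscript on the export side and C# on the read side).

```python
import struct

# Illustrative layout: a counts header, then packed vertices and edge indices.
# NOT the real RealFlow .bin format; purely an example of binary frame I/O.
def write_frame(path, vertices, edges):
    with open(path, 'wb') as f:
        f.write(struct.pack('<II', len(vertices), len(edges)))  # counts header
        for x, y, z in vertices:
            f.write(struct.pack('<fff', x, y, z))               # 32-bit floats
        for a, b in edges:
            f.write(struct.pack('<II', a, b))                   # vertex indices

def read_frame(path):
    with open(path, 'rb') as f:
        nv, ne = struct.unpack('<II', f.read(8))
        vertices = [struct.unpack('<fff', f.read(12)) for _ in range(nv)]
        edges = [struct.unpack('<II', f.read(8)) for _ in range(ne)]
    return vertices, edges
```

The appeal over a text format is speed and compactness: a whole animated sequence is just one such file per frame, read back in a few milliseconds.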


Once I was happy that the rig would work, and that I could export sequences of animated objects, I turned my attention to rewriting the Laserplot software to support 3D sequences and to control the new plotter I was planning. I moved the project over from WinForms to WPF to allow for a 3D viewport. Like many of the UI inclusions this was an indulgence rather than a necessity, but it got me up to speed on WPF and let me check that the 3D data was coming across accurately before I’d finished building the plotter itself.

The core process behind the Lightplot software is to take sequences of objects, convert their coordinates to 3D polar coordinates to match the rig, plan the shortest route through the edges for a quick plot, and then control the rig and camera while plotting.
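The first two steps can be sketched in a few lines of Python. This is an illustrative version under assumed conventions (pan about the vertical axis, tilt up from the horizontal plane, arm extension as radius), not the actual C# code, and the route planner here is a simple greedy nearest-neighbour pass rather than a true shortest route.

```python
import math

def to_polar(x, y, z):
    """Cartesian point -> (pan°, tilt°, radius) for a pan/tilt/extend rig."""
    radius = math.sqrt(x * x + y * y + z * z)
    pan = math.degrees(math.atan2(y, x))                 # rotation about vertical
    tilt = math.degrees(math.asin(z / radius)) if radius else 0.0
    return pan, tilt, radius

def order_edges(edges, start=(0.0, 0.0, 0.0)):
    """Greedy ordering: always plot the nearest unvisited edge next,
    flipping an edge when its far end is closer to the light's position."""
    remaining = list(edges)
    pos, route = start, []
    while remaining:
        best = min(remaining,
                   key=lambda e: min(math.dist(pos, e[0]), math.dist(pos, e[1])))
        remaining.remove(best)
        a, b = best
        if math.dist(pos, b) < math.dist(pos, a):
            a, b = b, a                                  # traverse edge reversed
        route.append((a, b))
        pos = b                                          # light ends at edge end
    return route
```

Greedy ordering isn’t optimal (the underlying problem is travelling-salesman-like), but it drastically cuts the dark travel between strokes compared with plotting edges in export order.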

After my early plot experiments I added some extra features to the Max exporter to export only the edges around the exterior and to ignore those facing away from the camera. I also added a gamma correction to the colour data so the output of the LED more closely represents the intended colours.
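The gamma correction idea, sketched in Python for clarity (my exporter does this in Maxscript): colour values from 3DS Max are perceptually encoded, while an LED’s brightness is roughly linear in its drive level, so raising the normalised channel value to a gamma power before scaling restores the mid-tones. The 2.2 default is an assumption; the real value wants tuning against the camera.

```python
def gamma_correct(value, gamma=2.2, out_max=255):
    """Map a 0-255 colour channel to a gamma-corrected LED drive level."""
    normalised = max(0, min(255, value)) / 255.0   # clamp, then 0..1
    return round((normalised ** gamma) * out_max)  # linearise for the LED
```

Without this, mid-grey source colours come out far too bright in the long exposures, washing out the shading exported from Max.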

Mechanical Construction

My 3DS Max prototype had also helped me plan out the construction of the rig. I knew I wanted to gear down the servos, so I opted for three ServoCity gearboxes: two for the tilt and one for the pan. After some initial testing I realised I’d need metal gears and a much stronger servo for the tilt, so I ordered a Hitec 7955TG, which is strong enough to tilt the rig, whilst the 5485HB on the pan is just fine. The servos are all running off a Phidgets servo controller.

The arm is powered by a DC motor coupled with a rotary encoder to measure each move, and there is a microswitch at the home position to provide calibration on startup. Each DC motor move runs in a separate thread inside the Lightplot software, allowing me to control the pan, tilt and arm at the same time. The DC motor and rotary encoder are managed by a Phidgets DC motor controller.
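The threading pattern is simple: start one move per axis, then wait for all of them before triggering the next plot point. A Python sketch of the idea (the real software is C#; `move_axis` here is a hypothetical stand-in for the Phidgets motor commands):

```python
import threading
import time

def move_axis(name, target, log):
    """Hypothetical stand-in for commanding one axis and waiting for it."""
    time.sleep(0.01)            # pretend the hardware takes time to settle
    log.append((name, target))  # record completion for the sketch

def plot_point(pan, tilt, arm):
    """Run all three axis moves concurrently, return when every axis is done."""
    log = []
    threads = [
        threading.Thread(target=move_axis, args=("pan", pan, log)),
        threading.Thread(target=move_axis, args=("tilt", tilt, log)),
        threading.Thread(target=move_axis, args=("arm", arm, log)),
    ]
    for t in threads:
        t.start()               # all three axes begin moving together
    for t in threads:
        t.join()                # block until every axis reports done
    return log
```

Running the axes concurrently rather than sequentially matters for plot speed: the point-to-point time becomes the slowest axis, not the sum of all three.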

The arm slides on a Teflon bracket and is driven via pinion gears from Technobots and a rack bought from MotionCo. One of the many bits of trivia I picked up is that Europe and the US use different systems to measure and spec gears: the arm uses Mod 1 gears, whilst the US-made ServoCity gearboxes use 48-pitch gears. Who knew!

Lastly, the light is an RGB LED running off a Phidgets LED controller. I wanted an RGB LED as I planned to export colour information from 3DS Max.

The speccing and sourcing of parts seemed endless, every gear, bearing, washer and socket had to be found in the correct size. A lot of the parts I found on eBay, whilst Servo City, Active Robots and Technobotsonline were all very useful. A number of the parts had to be custom built, so luckily Santa brought me a Dremel for Christmas!

All the Phidgets are installed in an enclosure, with a small fan mounted inside to prevent overheating. The rig also has a mini enclosure mounted on the side of the pan gearbox. This takes all the cabling from the rig, which is then routed through three XLR cables for the motors and one VGA cable for the optical encoder data, microswitch and LED.

See more construction photographs in my gallery.

Testing and the future

My testing period has been a little rushed with my impending move, but it’s been interesting how some issues I anticipated haven’t really bothered me, while those I hadn’t imagined have vexed me.

There’s a huge number of issues still to solve, mainly involving plotting speed and accuracy, some of which can be addressed with mechanical tweaks. The most critical, however, is a more sophisticated algorithm for the arm: I need to control it via a PID loop rather than the crude point-to-point commands I’m sending now. A PID loop will also allow me to vary the arm’s length to counteract the curved lines caused by the sweeping arcs of the pan and tilt.
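For reference, the PID loop I have in mind is the textbook form, sketched here in Python (the eventual implementation will live in the C# software; the gains are placeholders to be tuned against the real motor): each tick, the drive signal combines the current position error, its accumulation over time, and its rate of change.

```python
class PID:
    """Textbook PID controller sketch; gains must be tuned on the real rig."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0        # accumulated error, scaled by ki
        self.prev_error = None     # last error, for the derivative term

    def update(self, target, measured, dt):
        """Return the motor drive signal for this control tick."""
        error = target - measured
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Fed with the encoder reading each tick, this replaces blind point-to-point moves with continuous correction, so the arm can track a changing length target while the pan and tilt sweep.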

Plotting speed is the big one: it currently takes about a minute to plot 50 polygonal edges, so much of my time has been spent trying to balance accuracy against speed. There’s still much to do here; even simple changes such as tightening up the construction and adding a sandbag to weigh down the rig will help accuracy. Early mechanical tests also showed that the DC motor controller would continually disconnect due to EMI, so I put ferrite rings on the cables, which has reduced the number of dropped connections with the Phidgets boards.

I’ve spent a fair bit of time refining the Max exporter; simple additions such as planning the camera position and using it to cull back-facing edges reduced plot times by half. I’ve also begun to extend the exporter with an option to export only the edges which form an outline, the results of which can be seen in the ‘dancer’ test. It’s fairly crude at the moment and I’m keen to explore it further.
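The back-face culling test is the classic one, shown here as a Python sketch rather than the actual Maxscript: a face whose outward normal points away from the camera can never be seen, so none of its edges need plotting (an edge survives only if at least one face using it is front-facing).

```python
def is_back_facing(face_centre, face_normal, camera_pos):
    """True if the face's outward normal points away from the camera."""
    # View vector from the camera to the face.
    view = tuple(c - p for c, p in zip(face_centre, camera_pos))
    # Positive dot product: normal and view agree, so the face looks away.
    dot = sum(v * n for v, n in zip(view, face_normal))
    return dot >= 0.0
```

Since the camera is fixed for a whole sequence, this test runs once per face at export time, roughly halving the edge count for closed meshes before any plotting begins.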

As for the aesthetic issues, I’m very keen to get the robot outside; plotting against a real sky or a slick wet ground should look great. I also want to try a diffused LED for softer light trails.

Unfortunately I’m not going to have access to the robot for a couple of months, as I’m about to pack up and ship my life to India. But I can spend that time working on better options for the Max exporter and planning additions to the Lightplot software. Check back in a few months to see how I’m getting on!

Useful resources

Stack Overflow – a brilliantly simple site that heavily regulates answers to programming questions, ensuring you get the right answer quickly.
Aforge – I was using the Aforge robotics library for many of my early experiments. I gained a huge amount of help from the forums and learnt much about C# from picking through the Aforge source code.
Phidgets – I’m using a number of Phidgets boards and I’d highly recommend them. Again, their forums are an invaluable source of advice on electronics, motor control and programming.
Trossen Robotics – Again their forums were a fantastic source of advice and help.
Servo City – I’m using their servo gearing systems to gain more accuracy and torque. I also got a huge amount of advice on servos from their sales team via email.
Active Robots – I bought all my Phidgets boards from Active Robots.
Technobots – I bought a number of small and hard to find components from them, again they were very helpful.
Fix 8 – The Servo City parts have American gauge holes, after much searching I came across Fix 8 who sold both European and US style fasteners.
Motion Co – A great source for gears, racks and various other parts.
A history of lightpainting – some great images here.

Other great projects

Immaterials – a light painting WiFi project.

Making future magic – iPad light painting by BERG and Dentsu London.

Return to the project summary.

To stay up to date:

Follow me on Twitter @bencowellthomas
Watch my videos on Vimeo