Timelapse videos can be nice, especially of the night sky. Basically all you need is a tripod, a camera with an interval shooting option, and a little piece of software to compose a video out of the hundreds or thousands of photos you took.
To spice up the video you might want to add some extra movement, but for that you need extra equipment. A dolly or camera slider is the obvious way of moving the camera while the timelapse sequence is ongoing. You need constant and stable movement to get good results. Dollies with constant movement are of course available, and most seem to be of the constant-speed type. If you would like to turn the camera as well, and possibly even track some point while the sequence is ongoing, your options are quite limited or at least very expensive.
Timelapse with a DSLR at night can be hard on the battery, as keeping the shutter open for long exposures consumes a lot of energy. Coupled with freezing cold temperatures, the camera's internal battery is quickly drained even if it normally allows thousands of photos on one charge. Speaking of freezing temperatures, frost on the lens is also a major problem that easily ruins the chances of getting a decent image sequence.
So I took an interest in making some winter timelapse videos, which turned out reasonably okay. Of course I quickly developed a desire to get a bit more complex videos. Since I regularly work with IoT systems, it felt natural to apply those principles to the camera dolly case as well. So I came up with a timelapse camera dolly that uses a Raspberry Pi as its backbone and is capable of the following:
- Move camera along rail system using stepper motor
- Control camera via USB port
- Rotate camera with stepper motor
- Heat lens with PWM control while monitoring temperature
- Provide energy for camera via dummy battery
- Run from a 4S LiPo/LiFePO4/Li-ion battery
- Connect over 3G/4G for remote control from a mobile phone app
And those are just the mechanical and electrical abilities; with software you can do all sorts of neat things on top of the basic operation.
The major enablers for doing anything as complex as this with relatively little effort are the existence of 3D printers and the Raspberry Pi. The mechanics are all 3D printed, and while custom electronics are used with the Raspberry Pi, it could all be done with readily available boards plus some wire and a soldering iron.
Mechanically, the camera slider rides on two round rails. 25 mm and 20 mm diameter aluminium pipes are readily available at the local hardware store in two-metre lengths, so the design started based on those. Later on I thought it would be nice to modify the design to use a rail assembled from one-metre carbon fibre pipes. Carbon fibre is more rigid than aluminium and lighter to travel with.
Then there is the timing belt: a 6 mm wide belt with 2 mm pitch, and pulleys and guides are easily obtained from eBay thanks to the DIY 3D printing scene. A long timing belt is perhaps not the most optimal way of doing this, as it tends to vibrate and is a pain to set up. Future development might be to use rubber wheels to move the dolly along the rail. That takes a bit more tinkering though, and it remains to be tested how well it works. The timing belt is an easy solution also in the sense that the distance of each step is easy to calculate. With a rubber wheel driving the dolly, the distance would be affected by compression of the rubber, for example, and would have to be calibrated. Then again, perhaps dimensional accuracy is not the most important thing, as this is not a CNC machine after all.
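To illustrate why the belt makes step distance easy to calculate, here is a minimal sketch. The belt pitch is the 2 mm from above; the pulley tooth count and microstepping factor are assumptions for the example, not measured from the actual build.

```python
# Linear travel per motor step with a 2 mm pitch timing belt.
# PULLEY_TEETH and MICROSTEPS are illustrative assumptions.

BELT_PITCH_MM = 2.0   # belt pitch from the text
PULLEY_TEETH = 20     # hypothetical drive pulley
STEPS_PER_REV = 200   # typical NEMA17: 1.8 deg per full step
MICROSTEPS = 16       # assumed driver microstepping factor

def travel_per_step_mm(microsteps: int = 1) -> float:
    """Belt travel for a single (micro)step of the motor."""
    travel_per_rev = BELT_PITCH_MM * PULLEY_TEETH  # 40 mm per revolution
    return travel_per_rev / (STEPS_PER_REV * microsteps)

print(travel_per_step_mm())           # 0.2 mm per full step
print(travel_per_step_mm(MICROSTEPS)) # 0.0125 mm per microstep
```

With a friction wheel the effective circumference would depend on rubber compression, which is why that approach would need calibration.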
Getting the mechanics right took more than a few iterations and is still an ongoing effort.
The initial construction had a rotating head on an axle going through the slider, with a 4:1 reduction using timing pulleys. The downfall of that arrangement was the usual tendency to set the slider up at an angle, so the camera rotation did not work properly.
The second (and current) iteration has the rotating head sit on top of a swiveling mount that can be adjusted to roughly a +/-45° angle. The 4:1 reduction was also updated to a 40:1 worm gear type, and a 3D position sensor was added. The idea behind the position sensor is to help take astrophotos with the setup: its reading can be used to align the camera to north and at the correct angle, parallel to the Earth's axis.
This second approach works pretty well, except that the whole thing becomes quite tall, with the center of gravity high above the rail. In combination with the relatively flimsy aluminium rail this results in vibration, and some settling time after each movement is required before you can take the image.
So to summarize mechanical problems with this second iteration:
- Sub optimal center of gravity
- 40:1 reduction for rotation not high enough for astrophotography
- Setting up the damn thing is a bit tedious especially with timing belt
The gear reduction for tracking the sky has to be quite high. The Earth rotates about 0.004° per second. With the 40:1 reduction gear you turn 9° per motor revolution, and as the stepper motor has 200 steps per revolution, that is 0.045° per step. So during a 10-second exposure you would step only about once on average. With microstepping that is not so bad, as you can multiply the step count by a maximum of 16. Still, the movement of a stepper motor is jerky, and any play in the gear system amplifies that. I am sure it would be possible to hone the system so that it produced adequate results.
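The arithmetic above can be packed into a couple of helper functions to see how the step interval changes with reduction and microstepping. The sidereal rate is a known constant; the rest follows from the gear and motor figures in the text.

```python
# How often must the stepper step to follow the sky, for a given
# gear reduction? Numbers follow the 40:1 worm gear from the text.

SIDEREAL_DEG_PER_S = 360.0 / 86164.1  # ~0.00418 deg/s (sidereal day)

def deg_per_step(reduction: float, steps_per_rev: int = 200,
                 microsteps: int = 1) -> float:
    """Camera rotation produced by one (micro)step of the motor."""
    return 360.0 / (reduction * steps_per_rev * microsteps)

def seconds_per_step(reduction: float, microsteps: int = 1) -> float:
    """Time between steps when tracking at the sidereal rate."""
    return deg_per_step(reduction, microsteps=microsteps) / SIDEREAL_DEG_PER_S

print(deg_per_step(40))          # 0.045 deg per full step
print(seconds_per_step(40))      # ~10.8 s between steps
print(seconds_per_step(40, 16))  # ~0.67 s with 16x microstepping
```

Even at 16x microstepping the motion is a staircase rather than a smooth rotation, which is what motivates the DC motor approach below.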
For the third iteration I wanted to change the approach to a continuously driven DC motor, for two reasons: first, I hope it will produce smoother movement, and second, it reduces power consumption. A stepper motor holds its position by continuously powering the coils, so the power use is quite substantial. The electronics are also simpler, but more on that in the next chapter. The worm gear setup is boosted with a two-stage approach: a 1:90 main gear driven by the DC motor through a 1:30 worm gear. A 3 rpm motor is used that draws a measly 180 mA at full load and produces high torque with its internal gearbox.
My hope is that the inevitable play resulting from the many gears involved is negated by the constant and smooth operation in one direction, in contrast to a stepper motor, whose movement is a series of stop-and-go steps.
Another mechanical change that I am planning is to automate alignment, so that the leftover 1:40 worm gear is used to tilt the head into the correct position automatically.
I will update this as I get the new version built. Meanwhile, the design files for the 2nd revision can be downloaded from here: https://www.thingiverse.com/thing:4117947
The electronic part of the implementation was at first just a Raspberry Pi 3 with an Adafruit (or similar) stepper motor HAT. And it did work OK. Battery life of the camera was an issue though, the wiring of the whole setup was a major mess, and a lot of additional things needed to be added.
So the first attempt at a more integrated solution was to take the stepper motor HAT schematic and add a few things to it, like:
- 5V buck converter to power the Raspberry Pi
- 8V LDO from 12VDC input to power the camera
- I2C input for 3D position sensor
- FET controlled output for lens heater
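The FET output for the lens heater pairs naturally with the temperature monitoring: drive the FET with a PWM duty cycle derived from how far the lens is below a setpoint. This is only a sketch of that idea; the setpoint and gain values are made up for illustration and are not from the dolly software.

```python
# Proportional duty-cycle calculation for a FET-driven lens heater.
# Setpoint and gain are illustrative assumptions.

def heater_duty(lens_temp_c: float, setpoint_c: float = 5.0,
                gain: float = 0.2) -> float:
    """Return a PWM duty cycle (0.0-1.0) from the measured lens temperature."""
    error = setpoint_c - lens_temp_c  # positive when the lens is too cold
    return min(1.0, max(0.0, error * gain))

print(heater_duty(-10.0))  # 1.0 -> full power well below the setpoint
print(heater_duty(4.0))    # 0.2 -> gentle heating near the setpoint
print(heater_duty(10.0))   # 0.0 -> heater off above the setpoint
```

The returned fraction would then be fed to whatever PWM facility drives the FET gate, e.g. a GPIO PWM channel on the Raspberry Pi.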
That alone helped to reduce the wire clutter significantly. But as the project grew, more things came onto the nice-to-have list. First of all, the stepper motor driver was not working very well for the NEMA17 motors that I had: movement was quite jerky and noisy. Still usable, and able to provide nice enough timelapse footage. Monitoring of battery voltage and lens heat called for an ADC and associated circuits, so a second revision of the electronics was necessary.
After a short round of googling I ended up with a dedicated stepper motor IC and a different H-bridge for the motors. The only problem is that the chip doesn't have an existing Python library like the Adafruit LED driver chip does. Also, the newer version of the Adafruit software made the stepper motors work much better, so switching chips was not strictly necessary. But the PCBs and parts were ordered, so might as well try it.
As for the ADC, the Texas Instruments ADS1115 is easy to use, well supported, and uses I2C, so it was a natural choice for the added functionality. It has four 16-bit channels which can be used standalone or as pairs for differential measurement. Temperature monitoring of the lens heater is done with a 10 kΩ NTC thermistor, so one channel goes to that and a second to battery voltage. Then I also added a current sensing resistor with an amplifier, so the two remaining channels are used for that.
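Turning the NTC channel's voltage into a temperature is a standard beta-equation calculation. The sketch below assumes a divider with the NTC to ground and a 10 kΩ pull-up to 3.3 V, and a typical beta of 3950; the actual divider topology and thermistor parameters on the board may differ.

```python
# Converting a voltage read from the 10 kOhm NTC divider into a
# temperature via the beta equation. Divider topology (NTC to ground,
# 10k pull-up to 3.3 V) and BETA are assumptions.
import math

V_SUPPLY = 3.3        # divider supply voltage (assumed)
R_PULLUP = 10_000.0   # fixed divider resistor
R_NOMINAL = 10_000.0  # NTC resistance at 25 C
BETA = 3950.0         # typical beta for a 10k NTC (assumed)
T_NOMINAL_K = 298.15  # 25 C in kelvin

def ntc_temp_c(v_measured: float) -> float:
    """Lens temperature in Celsius from the voltage across the NTC."""
    r_ntc = R_PULLUP * v_measured / (V_SUPPLY - v_measured)
    inv_t = 1.0 / T_NOMINAL_K + math.log(r_ntc / R_NOMINAL) / BETA
    return 1.0 / inv_t - 273.15

print(round(ntc_temp_c(1.65), 1))  # 25.0 -> half-supply means exactly 10 kOhm
```

With this topology a higher measured voltage means a higher NTC resistance, i.e. a colder lens.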
Endstop sensing was also added, with simple push buttons on both sides of the dolly. So the added features of the second revision of the electronics include:
- Four-channel ADC for:
  - Battery voltage
  - Battery current
  - Lens temperature
- Buttons for end stop sensing
- Two PCA9629 Stepper drivers for stepper motors
One continuing headache has been enabling battery-backed soft shutdown in the event that power is switched off. The idea is to have a small NiMH battery as a backup that can provide enough energy for the Raspberry Pi to shut down. A sure way of corrupting a Linux filesystem is to toggle power on and off, as happens for example when the battery voltage drops too low. So in the third revision of the dolly electronics there is an interrupt on GPIO17 that triggers when battery power is removed, and a Texas Instruments TPS2115 chip switches power to the backup battery, which should power the Raspberry Pi long enough for it to sync the filesystem and shut down. The backup battery is charged through a diode and a resistor whenever normal power is applied. A NiMH battery is the simple choice because of its ease of charging, where Li-ion would require much more electronics.
The first generation of the stepper dolly used a plain Adafruit Stepper Motor HAT as its basis. The second iteration had a custom HAT based on the same Adafruit Stepper Motor HAT, but added an 8V DC output for the camera as well as PWM control for the lens heater. At the time, the Adafruit software was not doing a very good job driving the NEMA17 steppers I had, so for the third revision I changed the stepper motor driver to the PCA9629. It however uses 5V levels for I2C communication, while the Raspberry Pi works at 3.3V, so a TCA9406 level shifter had to be added to be able to communicate with both the 3.3V and 5V devices.
I made some stupid mistakes with the H-bridges I used in the third revision, so those bugs had to be fixed to get the thing working. At the same time I changed the 5V buck converter to a different model, as my puny infrared oven was not doing a very good job soldering the bulky Sumida SPM1004 DC converter, which is otherwise a very nice component with a high 6A output. The current DC converter, the TS30013, is able to provide 3A, which is adequate for a Raspberry Pi 3+.
Moving away from the stepper motor based approach means that this third revision is no longer relevant. The PCA9685 based second version can also control up to four DC motors with the Adafruit library, so it is natural to revert back to that. The only "problem" with a DC motor is that knowing the exact distance travelled is a bit of guesswork. To counter that, I plan to put the 12-position rotary encoders that I have into use: knowing the speed of camera rotation and the distance travelled on the rail would be possible that way. If possible, I will try to fit all the electronics into the Raspberry Pi Zero form factor to save space.
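The encoder bookkeeping is simple: each of the 12 positions corresponds to a fixed fraction of a pulley turn, so counting ticks gives distance, and ticks over time give speed. The pulley circumference below is a placeholder value, not a measurement from the dolly.

```python
# Estimating distance and speed from a 12-position rotary encoder on
# the drive pulley. PULLEY_CIRCUMFERENCE_MM is an assumed value.

ENCODER_POSITIONS = 12
PULLEY_CIRCUMFERENCE_MM = 40.0  # e.g. a 20-tooth, 2 mm pitch pulley (assumed)
MM_PER_TICK = PULLEY_CIRCUMFERENCE_MM / ENCODER_POSITIONS

def distance_mm(ticks: int) -> float:
    """Rail distance covered after a number of encoder ticks."""
    return ticks * MM_PER_TICK

def speed_mm_per_s(ticks: int, elapsed_s: float) -> float:
    """Average speed over a measurement window."""
    return distance_mm(ticks) / elapsed_s

print(round(distance_mm(120), 1))       # 400.0 mm after ten pulley turns
print(round(speed_mm_per_s(12, 60), 2)) # 0.67 mm/s at one turn per minute
```

Twelve ticks per turn is coarse, but for slow timelapse moves averaged over many seconds it should be plenty.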
Here is the schematic for the custom Raspberry Pi HAT, made with Autodesk Eagle, for the version 2 electronics. I should note that the battery cutoff electronics on the board do not work as intended; other than that it is functional.
Software is the most crucial part of the whole thing; without any software it would do nothing more than provide power to the camera. In a broad sense the software consists of two parts: the software on the dolly, and a mobile app that can talk with the dolly software.
The dolly and the mobile app have to talk to each other even when they are far apart. I want to be able to take the camera out into nature and leave it there for several hours while sitting comfortably inside, which is important for me especially in winter; there is a 4G connection for this very reason. There are several ways to facilitate communication between two mobile devices, but a direct connection is not one of them, because neither device has a fixed IP address. MQTT (Message Queuing Telemetry Transport) is a perfect fit for this problem, but it involves a third-party broker that does have a fixed address.
Both the app and the dolly software subscribe to predefined MQTT topics on the MQTT broker. They also send messages to those topics, which are then received by all subscribers; the messaging scheme is therefore of the publish-subscribe type. If you are using a public broker service, some authentication scheme would be necessary to prevent someone from messing around with your image sequence.
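A minimal sketch of what such a message scheme could look like: commands serialised as JSON, published to a topic the dolly subscribes to. The topic names and payload fields here are illustrative only; the real ones are defined in the CameraDolly sources.

```python
# Sketch of a publish-subscribe command scheme between app and dolly.
# Topic names and payload fields are assumptions, not the real protocol.
import json

TOPIC_CMD = "dolly/command"    # app -> dolly (hypothetical topic)
TOPIC_STATUS = "dolly/status"  # dolly -> app (hypothetical topic)

def make_command(mode: str, **params) -> str:
    """Serialise a control command for publishing to TOPIC_CMD."""
    return json.dumps({"mode": mode, "params": params})

def parse_command(payload: str) -> dict:
    """Decode a command payload on the dolly side."""
    return json.loads(payload)

msg = make_command("track_linear", step_mm=0.5, interval_s=10)
print(msg)
print(parse_command(msg)["mode"])  # track_linear
```

On the wire this would be handed to an MQTT client library such as paho-mqtt, with both ends subscribed to the broker rather than connecting to each other directly.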
The dolly software is available on GitHub (https://github.com/AbyssCoreInc/CameraDolly). I should emphasize that it is very much a quick and dirty hack, and some features are not working properly yet. I have only done the bare minimum to get it to work for the specific tasks that I have wanted to do. It is open source, so you are welcome to pitch in.
The mobile software is implemented for Android, but no platform-specific features are in use. Right now the only communication between the dolly and the mobile app is through MQTT, but in the future it might be nice to add socket-based communication with some zeroconf setup.
The main features of the application are the control and setup of the dolly. In the settings tab you can request the active parameters from the dolly, change them any way you want, and write them back to the dolly.
The dolly supports several operating modes, from plain linear movement to object tracking based on constant linear or angular speed. In the case of tracking, you need to specify the X,Y distance of the target from the rail start.
The purpose of object tracking is to keep the specified object (or location, in practice) at the center of the image. The software calculates the required size of each step based on the chosen operation mode. If you want to keep the rotation of the camera (i.e. the angular velocity) constant, you choose "Track Angle" as the operation mode; the dolly will then alter the linear step size according to its location so that the angular change per step remains constant.
In the "Track Linear" mode the opposite is true: the linear movement is kept constant and the angular movement is changed based on the location of the dolly.
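The geometry behind both modes boils down to one arctangent: with the target at (X, Y) measured from the rail start (X along the rail, Y perpendicular to it), the pan angle needed at each dolly position follows directly. The axis conventions here are my assumption, not taken from the dolly sources.

```python
# Pan angle needed to keep a target at (X, Y) centred, as a function
# of the dolly position along the rail. Axis conventions are assumed:
# X along the rail, Y perpendicular, 0 deg = looking straight across.
import math

def pan_angle_deg(dolly_pos: float, target_x: float, target_y: float) -> float:
    """Camera pan angle for a given dolly position on the rail."""
    return math.degrees(math.atan2(target_x - dolly_pos, target_y))

# "Track Linear": constant linear steps, recompute the angle each time.
positions = [0.0, 0.5, 1.0, 1.5, 2.0]  # metres along the rail
angles = [round(pan_angle_deg(p, 1.0, 1.0), 1) for p in positions]
print(angles)  # [45.0, 26.6, 0.0, -26.6, -45.0]
```

"Track Angle" is the inverse problem: pick equal angle increments and solve for the rail position that produces each one, which is why the linear step size varies in that mode.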
I will release the Android app on GitHub as well, as soon as I have some spare time to clean up the project. It is annoying how project files get corrupted between different Android Studio updates.
To be continued…