digital:pardoe

My Little Piece Of The Internet

Recommended Podcasts

Since I started working remotely I've been listening to a lot of podcasts, pretty much all of them tech related (and many of them Apple related). Here are the few that have kept me interested enough to stay in rotation:

Some of these have been discontinued but they're still worth subscribing to: for the past episodes, and because they'll soon be resurrected on Relay FM.

My favourite podcast client (podcatcher?) at the moment is Overcast; both Smart Speed and Voice Boost are definitely worth the IAP. There's no Mac app for now, but the web player works in a pinch (albeit without the best features of the app).

Tagged: Podcasts, Review
Published by digitalpardoe on Tuesday 12 August 2014 at 10:27 AM

Digitising My Negatives

As I mentioned before, I recently began shooting on film again (a little bit, anyway). Of course, this meant I wanted some way of getting my images from the negatives into my digital library for a bit of light editing and sharing.

I found myself on Amazon looking at the plethora of cheap negative scanners. Most of these consist of a 5mp CCD and a backlight; photos are scanned quickly, to JPEG, and mostly without the need to involve a computer. From what I could find though, this type of scanner has three problems: highly compressed JPEG output, relatively low-resolution 'scans' and extremely variable output quality.

Maybe I was worrying too much about image quality, but they weren't good enough for me. I wanted RAW images and higher-resolution output.

The most obvious choice would've been something like the Plustek OpticFilm 8100 or a negative-compatible flatbed scanner. I could've scanned as TIFF at high resolution and been done. The main problem with this solution was the price: I couldn't justify the high cost for something I probably wouldn't be doing often.

To this end, I decided to make use of what I already owned (or could make pretty easily).

The Setup

The camera setup itself wasn't too complicated: I used my Nikon D700 with an old 105mm Micro-Nikkor. The exact equipment isn't massively important though; as long as you have a decent DSLR, mirrorless or high-end compact / bridge camera with a lens that can get close enough to fill the frame with a negative, you're probably going to get higher-quality shots than a cheap dedicated negative scanner. RAW shooting is a massive plus though, as the results will need some white-balance correction.

All of this needs to be mounted on a fairly sturdy tripod that can take the weight of your setup pointing straight down.

There are a couple of things you will need to be able to do though: focus manually (or at least fix the focus) and use a self-timer / cable release. Shooting so close, things can get blurry quickly.

One useful little accessory is a macro focusing rail; it allows you to finely tune the focus without having to mess around with the camera's settings too much. It can be especially helpful with older, heavier lenses that tend to fall out of focus when the camera is pointing towards the ground and gets nudged slightly.

Probably the most difficult bit of the whole setup was coming up with some way to backlight the negative evenly and at a suitable brightness. Fortunately an Amazon shipping box, printer paper, a torch and a lot of packing tape came to the rescue.

As I didn't have anything suitable to diffuse the light directly under the negative, I made a relatively long tube (approx. 30cm) and lined it with printer paper that curved up towards the negative-sized aperture I cut at one end of the box. This produced light diffuse enough to illuminate the negative evenly.

A special shout-out should probably go to the torch I used: the extremely bright LED Lenser P7. This is probably the best torch I've ever bought; super-bright for normal torching, with a neutral enough colour temperature for small photography-related projects like this.

Now for the stuff that really matters...

The Settings

For my negatives I shot in manual mode: 1/50s, f/7.1, ISO 200. I left automatic white-balance enabled as I was shooting in RAW and the white balance would definitely have to be corrected in post-processing anyway.

I chose not to quite fill the frame with the negative to ensure I made the most of the lens' sharpness in the centre. After cropping, most of my shots worked out at around 8mp, which was pretty good going and definitely better than the cheap negative scanners.

The Results

Straight out of the camera this is how the negatives looked:

Inverting the photo quickly got me to something that looked more sensible. The blue cast is the inverted orange mask of the colour negative film, and this is what needs to be white-balanced away. It can take a lot of playing with to get right, but once you've done it for a single image it should be the same for the whole roll.

After a little pushing & prodding with your image editor of choice (mine is Aperture, but I guess that won't be true for much longer), you can get something that looks perfectly acceptable.

To be honest, this photo probably isn't the best example, but you can find some of the better ones (B&W and colour) in my Flickr album.

Something that did surprise me during this process was the amount of dynamic range I got from the negatives by digitising them this way; I could see details in the negatives that the original prints didn't even give the smallest clue to. The large RAW photos also gave me a lot of latitude when editing; it was nice to maintain the atmosphere of film with the advantages of a digital workflow.

Did it take a while to do all this? Yes. Would I have been better off getting a scanner? Possibly. Would it have been anywhere near as satisfying or fun? Definitely not!

Tagged: Film, Photography, Tutorial
Published by digitalpardoe on Friday 8 August 2014 at 01:55 PM

Shooting Film Again

I've recently started shooting a bit of film again, so I 'scanned' and uploaded some of the results to Flickr. They're all in the linked album (along with some old ones I uploaded a good while ago). My scanning process isn't exactly typical, but that'll all be explained in a post that's coming soon...

Goto Site →

Tagged: Film, Flickr, Photography
Published by digitalpardoe on Saturday 12 July 2014 at 08:45 AM

Using GitLab Omnibus With Passenger

GitLab is a great self-hosted alternative to GitHub. I'd set it up for other people before, but it always seemed to be more hassle than it should be to update and maintain (especially with its monthly update cycle), so I'd never set it up for myself.

Thankfully GitLab now has omnibus packages available to make installation and maintenance much easier. Unfortunately these packages contain all of GitLab's dependencies, including PostgreSQL, Nginx, Unicorn etc. This is great for running on a server dedicated to GitLab, but not terribly useful for my setup.

I already had a Postgres database I wanted to make use of, along with an Nginx + Passenger setup for running Ruby applications. The following describes the configuration changes I needed to make to fit GitLab omnibus into my setup.

The first steps are to create a PostgreSQL user and database for your GitLab instance and install your chosen omnibus package from www.gitlab.com/downloads. After performing the installation, do not run sudo gitlab-ctl reconfigure as per GitLab's own installation instructions; we need to add some config first.
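For the database step, something like this does the job (a sketch: the 'gitlab' user and 'secret' password are placeholders of my own, though gitlabhq_production is the database name GitLab expects by default):

    sudo -u postgres psql -c "CREATE USER gitlab WITH PASSWORD 'secret';"
    sudo -u postgres psql -c "CREATE DATABASE gitlabhq_production OWNER gitlab;"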

The first bit of config goes in /etc/gitlab/gitlab.rb; this sets up the external URL for your GitLab instance, configures the database and disables the built-in Postgres, Nginx and Unicorn servers:
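Something along these lines should do it (a sketch; exact option names can vary between omnibus releases, and the database details should match whatever you created earlier):

    # /etc/gitlab/gitlab.rb
    external_url 'http://gitlab.example.com'

    # Use the existing PostgreSQL server instead of the bundled one.
    postgresql['enable'] = false
    gitlab_rails['db_adapter'] = 'postgresql'
    gitlab_rails['db_host'] = '127.0.0.1'
    gitlab_rails['db_port'] = 5432
    gitlab_rails['db_database'] = 'gitlabhq_production'
    gitlab_rails['db_username'] = 'gitlab'
    gitlab_rails['db_password'] = 'secret'

    # Passenger will serve the app, so disable the bundled web stack.
    nginx['enable'] = false
    unicorn['enable'] = false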

Now you can run sudo gitlab-ctl reconfigure; this should set up all of GitLab's configuration files correctly, with your settings, and migrate the database. You'll also need to run sudo gitlab-rake gitlab:setup to seed the database (this is a destructive task, do not run it on an existing database).
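For reference, that's the following, in order:

    sudo gitlab-ctl reconfigure
    sudo gitlab-rake gitlab:setup   # destructive: only for seeding a fresh database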

The final bit of configuration goes in /etc/nginx/sites-enabled/gitlab.conf (this assumes you have Nginx + Passenger installed from Phusion's repositories and configured as per their instructions):
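A rough sketch of that configuration (gitlab.example.com is a placeholder, and which Passenger directives are available, passenger_env_var in particular, depends on your Passenger version):

    # /etc/nginx/sites-enabled/gitlab.conf
    server {
        listen 80;
        server_name gitlab.example.com;

        # Serve the Rails app bundled inside the omnibus package.
        root /opt/gitlab/embedded/service/gitlab-rails/public;

        passenger_enabled on;

        # Run as the git user so file permissions match the omnibus install.
        passenger_user git;
        passenger_group git;

        # Use the omnibus-bundled Ruby and put its tools on the $PATH.
        passenger_ruby /opt/gitlab/embedded/bin/ruby;
        passenger_env_var PATH "/opt/gitlab/bin:/opt/gitlab/embedded/bin:/usr/local/bin:/usr/bin:/bin";
    }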

Most of the above configuration comes directly from GitLab's omnibus configuration, with a few customisations for Passenger. The main ones are setting the user and group correctly so there are no permission issues, and ensuring that the correct directories are present in the $PATH variable to prevent errors in GitLab's admin interface.

Currently, files uploaded into GitLab may not appear correctly using these instructions due to a permission issue; this should be corrected in a future omnibus release. More discussion can be found in this merge request.

And we're done. That's about everything; hope it works for you too.

Tagged: GitLab, Nginx, Passenger, Tutorial
Published by digitalpardoe on Monday 12 May 2014 at 11:55 AM

Monitoring Internet Connection Speed With Munin

Considering that an internet connection is now deemed to be a human right, you'd think that ISPs, who generally seem to rip us off anyway, would've managed to make their connections nice and reliable, especially since the internet has been around for a good few years now and they've had plenty of time to get it right. Unfortunately this isn't the case, & I decided that I wanted a way to make sure that I wasn't getting (too) ripped off by my ISP.

To this end I decided to make use of the tools I already had available: a Raspberry Pi that I've been running as a home server, the Munin monitoring tool that I use to keep track of it & the always-reliable Speedtest to test the connection.

(The following is entirely Debian & Munin 2 biased, you may need to tweak it for your particular setup.)

The first job was to find some way to run Speedtest from the command line; fortunately, while I was stumbling around the internet, I came across speedtest-cli, which makes life much easier. So first we need to get a copy of the script and put it somewhere:
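Something like the following works (the raw URL and script filename come from the project's GitHub page and may have moved since; the /usr/local/bin location is just my choice):

    sudo wget -O /usr/local/bin/speedtest-cli \
        https://raw.githubusercontent.com/sivel/speedtest-cli/master/speedtest_cli.py
    sudo chmod +x /usr/local/bin/speedtest-cli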

(You'll probably need to make sure you have a copy of Python installed too; for more info check out speedtest-cli's GitHub page.)

Then we need to get some useful numbers from the script. We do this as a cron job because the script can take a while to run, uses a lot of bandwidth & tends to time out when run in Munin directly.

Create the file /etc/cron.d/speedtest containing the following (modifying the speedtest-cli path to suit your needs of course):
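As a sketch (the half-hourly interval and the /tmp/speedtest.out cache location are my own choices; pick whatever suits you):

    # /etc/cron.d/speedtest
    # Cache speedtest-cli results for the Munin plugin to read.
    */30 * * * * root /usr/local/bin/speedtest-cli --simple > /tmp/speedtest.out 2>/dev/null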

Finally we need a Munin script to create the graphs. The script below should go in /etc/munin/plugins/speedtest; don't forget to make it executable too, or it might not run (chmod +x /etc/munin/plugins/speedtest):
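A minimal sketch of such a plugin, assuming the cron job above writes the speedtest-cli --simple output to /tmp/speedtest.out:

    #!/bin/bash
    # Munin plugin: graph download & upload speed from cached speedtest-cli output.
    CACHE=/tmp/speedtest.out   # must match the path used in the cron job

    # Munin calls the plugin with "config" to learn about the graph...
    case $1 in
        config)
            echo "graph_title Internet Connection Speed"
            echo "graph_category network"
            echo "graph_vlabel Mbit/s"
            echo "graph_args --base 1000 -l 0"
            echo "down.label Download"
            echo "up.label Upload"
            exit 0;;
    esac

    # ...and with no arguments to fetch the values. speedtest-cli --simple
    # output looks like:
    #   Ping: 23.456 ms
    #   Download: 40.23 Mbit/s
    #   Upload: 10.12 Mbit/s
    echo "down.value $(awk '/^Download:/ {print $2}' "$CACHE")"
    echo "up.value $(awk '/^Upload:/ {print $2}' "$CACHE")"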

Once that's done, restart munin-node (probably something along the lines of /etc/init.d/munin-node restart), give it an hour or so and enjoy your new statistics.

Check back soon.

Tagged: Bash, Munin, Raspberry Pi, Tutorial
Published by digitalpardoe on Monday 29 April 2013 at 01:20 PM