Amazon Web Services (VPC + NAT + OpenVPN)

So I’m in the process of setting up a few bits and pieces on AWS, and the first task (well, second, after a couple of quick deploys using Elastic Beanstalk) is to get a Jenkins server up and running.

I’m looking to deploy the Jenkins box within a Virtual Private Cloud (VPC) to block off access to Jenkins and to any test slaves it will eventually spin up.  To ensure smooth access into the VPC I’m using OpenVPN.  The first step is to use the VPC wizard to create the basics; I went with the “VPC with Public and Private Subnets” option, as this handily creates the NAT box that allows servers inside the VPC to access the interwebs.



I’ve recently had to update my bio for a pitch document. Clearly I’m getting better at this sort of thing (until the grammar police catch up with me), as this was my first attempt at a more relevant, high-level summary of my professional life.  I’d hire this guy, would you?

“David has worked with internet-related technologies for over 15 years, from small CGI scripts for the BBC to the world’s first package collection service with TNT.  David has worked with various programming languages and database technologies to enable small and large organisations to use the internet for business.  Projects have included working with TNT, Acer, Sky, Unilever, GSK, British Airways, Colgate-Palmolive and BAT.  Technologies have included PHP, Python, JavaScript, MySQL, Postgres, MongoDB and various cloud service providers, plus the use of API-based services including YouTube, Google Analytics, Twitter and Facebook, as well as building mobile applications using PhoneGap.”

Day 26

OK, I’ll soon stop doing posts based on the working day of the year (probably), and of course there’s a good chance I’ll revert to type and just stop blogging for a while again.

However, in the meantime I’ve been playing with building virtual machines from the command line, and I think Vagrant with Puppet may have made me see local development and my development environment in a completely different manner. I’m not sure that’s happened since I stopped FTPing code to sites to put them live (yep, deploy is the future, kids!)

So the next addition to the learning so far is building a virtual machine, with the LAMP setup I want, from a single command; clearly it’s the future.  I need to refine the thoughts swimming around my head slightly and think about the workflow more, but it should be pretty amazing ;-)  Stay tuned and watch out for a post further along the line with some stuff on GitHub (I won’t be the first to do it, but it will work for me, and maybe you…)

SQL Query Performance v Stored Procedure Issues

So I have a lovely big SQL query, a recursive Common Table Expression (CTE), which returns over 2,000 rows of data (going back to 2005). To perform certain calculations we have to compare each row with the previous day’s data and update it. It’s quite expensive, but as a plain piece of SQL it runs in around 1 minute on SQL 2K8 with 8GB RAM under VMware. The same query as a stored procedure takes 27 minutes to run!

Yep, the exact same SQL, when run as a stored procedure, is over 20 times slower at producing the same result set.

Time to hit Google, and the second result for the query “” brings up this little beauty of a posting.

So the first check:


Yep, got that one covered. The second suggestion, however, of using local variables in the stored procedure query, made all the difference.  In this stored procedure’s case I added three local variables which took the values of the passed-in parameters, and that was it.
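To make that concrete, here’s a minimal sketch of the local-variable trick. The procedure, column and parameter names are made up for illustration (the real query isn’t shown here); the point is that SQL Server “sniffs” the parameter values at first compile and can cache a plan that’s terrible for other values, whereas copying the parameters into local variables makes the optimiser estimate from column statistics instead.

```sql
-- Hypothetical procedure; names are for illustration only.
CREATE PROCEDURE dbo.GetDailyFigures
    @StartDate DATETIME,
    @EndDate   DATETIME
AS
BEGIN
    -- The fix: copy the passed-in parameters into local variables
    -- and reference only the locals in the query below.
    DECLARE @LocalStart DATETIME, @LocalEnd DATETIME;
    SET @LocalStart = @StartDate;
    SET @LocalEnd   = @EndDate;

    -- Stand-in recursive CTE: one row per day in the range.
    WITH Daily AS (
        SELECT @LocalStart AS FigureDate
        UNION ALL
        SELECT DATEADD(DAY, 1, FigureDate)
        FROM Daily
        WHERE FigureDate < @LocalEnd
    )
    SELECT FigureDate
    FROM Daily
    OPTION (MAXRECURSION 0);
END
```

(Adding OPTION (RECOMPILE) to the statement is another way to sidestep the cached plan on SQL 2K8, at the cost of recompiling on every call; the local variables were the quick win here.)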


(Note: the computer didn’t actually blow up, that would be irritating.)

The stored procedure now executes as fast as the raw SQL query.  Amazing. Now that I have a working and more timely query, hopefully I’ll have a happy client.  Next, to update all of the stored procedures in the application.

Day 11

Yep if I don’t keep this up I’ll lose count.

Today started off well, with me playing with D3.js on the train on the way in, and finding out that the RoR code I have on GitHub had passed its tests in Travis CI. OK, so it had also passed the tests on my laptop, but it had taken me a couple of steps to get the right YML file to make it work in Travis.  Thanks to this article on Stack Overflow I found the magic was to do with the Rake db:migrate task (I was on the right track, which was nice).

My .travis.yml file is now like this:

language: ruby
rvm:
  - 1.9.3
env:
  - DB=sqlite
before_script:
  - RAILS_ENV=test bundle exec rake db:migrate --trace
  - bundle exec rake db:test:prepare
script:
  - bundle exec rspec spec/
bundler_args: --binstubs=./bundler_stubs

Which made things work. Now every time I push my code to GitHub it will be grabbed and tested by Travis: overkill for a sample app with a known outcome, but good practice.

The down part was getting into the office where I was working today and losing one of the rubber earbuds from my earphones. As I type this I’m listening to music in half stereo (not mono) and being subjected to the noise of my fellow commuters.

Once into the office du jour it was down to some data and SQL checking. Not too much of an issue, except this was SQL including some lovely CTE queries that I hadn’t looked at in over three months, so it took a while to get going with it, and there were some changes to make (of course) with the client “helping” me write my code. Still, I managed to validate the results from the queries and the client is happy, and of course a happy client is what we all want.

So now it’s heading back home and more D3.js awesomeness (and writing a blog post).