Jenkins and SCM Sync

So if you plan on scaling out Jenkins with slaves and the like, you’ll probably want to ensure your configuration is kept somewhere secure; handily, there’s a plugin for that.

Jenkins SCM Sync, which (to be fair) I have a bit of a love/hate relationship with, and since building up a new Jenkins server last week it hasn’t got any better: the config loaded in fine, but then it would error on every save or change.


AWS and a bit of Slack

So now that we’re able to have code deploying into AWS and notifications from Jenkins into Slack, it would make sense if we could check what’s happening with Elastic Beanstalk when the code is deployed (and also get a heads-up on any issues with our environments).

So, Lambda and SNS to the rescue; here’s what you need:

This post about Lambda and SNS, as well as this updated code, which gets a mention in the comments.
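
For reference, here’s a minimal sketch (not the code from the linked post, and the webhook URL is a placeholder) of a Lambda handler that takes the SNS notification from Elastic Beanstalk and forwards it to a Slack incoming webhook:

import json
import urllib.request

# Placeholder - swap in your own Slack incoming webhook URL
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"

def lambda_handler(event, context):
    # SNS invokes Lambda with one record per notification
    sns = event["Records"][0]["Sns"]
    payload = {
        "text": "*{}*\n{}".format(sns.get("Subject") or "Elastic Beanstalk",
                                  sns.get("Message", ""))
    }
    req = urllib.request.Request(
        SLACK_WEBHOOK_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
    return "ok"

Point the Elastic Beanstalk environment’s notifications at an SNS topic, subscribe the Lambda function to that topic, and the messages land in Slack.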

Amazon Web Services (Jenkins)

Right, so we have our handy Virtual Private Cloud (VPC) with access via OpenVPN (and the awesome Viscosity OS X VPN client); now we need to start adding useful things into it.

(See this blog post for info)

For me the next step was looking at how we could automate deployments using our own toolchain; part of the reason we’re looking at AWS is to get a bit more flexibility, plus the benefits of greater automation. We’ve already had success using BitBucket -> Codeship -> Heroku as a workflow to make our code visible and available in a readily shareable environment, and it took less than 5 minutes to get it up and running ;-)


Auto Start / Stop servers FTW

So one of the things I’ve always liked about AWS^Wcloud-based services is the ability to just spin up development instances and use them. However, I’m generally working for between 8 and 10 hours a day, and not so often at weekends, yet my servers are running round the clock, not doing much for the rest of the time while still racking up costs (OK, so not too much for a t1.micro, but the point still stands). I knew of companies shutting down unused servers during the night and at weekends, and thought I should give that a go.

So the first attempt at this uses a scheduled Data Pipeline to run an AWS CLI command to either stop or start the servers. Sadly, due to the lack of flexibility in the console’s scheduler (please, AWS, just put in a text box so I can add a crontab line), the servers get started and stopped at weekends too. Still, I’ve now reduced the daily uptime by 14 hours, which is 98 hours a week or 5,096 hours a year (you get the point); if you take weekends into account it’s even more than this.

And doing this took 10 minutes, thanks to this handy tutorial provided by AWS.
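
For reference, a rough boto3 sketch of what the scheduled stop/start boils down to (the instance IDs and region are placeholders, not from the tutorial):

import sys
import boto3

# Placeholder IDs - list the dev instances you want on the schedule
INSTANCE_IDS = ["i-0123456789abcdef0"]

def main(action):
    ec2 = boto3.client("ec2", region_name="eu-west-1")
    if action == "start":
        ec2.start_instances(InstanceIds=INSTANCE_IDS)
    elif action == "stop":
        ec2.stop_instances(InstanceIds=INSTANCE_IDS)
    else:
        raise SystemExit("usage: ec2_schedule.py start|stop")

if __name__ == "__main__":
    main(sys.argv[1] if len(sys.argv) > 1 else "")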

Amazon Web Services (VPC + NAT + OpenVPN)

So I’m in the process of setting up a few bits and pieces on AWS, and the first area (well, second, after a couple of quick deploys using Elastic Beanstalk) is getting a Jenkins server up and running.

So I’m looking to deploy the Jenkins box within a Virtual Private Cloud (VPC) to block off access to Jenkins and also to any test slaves it will eventually spin up.  To ensure smooth access into the VPC I’m using OpenVPN.  The first step is to use the VPC wizard to create the basics; I went with the “VPC with Public and Private Subnets” option, as this handily creates the NAT gateway box that allows servers inside the VPC to access the interwebs.
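
If you’d rather script the basics than click through the wizard, a minimal boto3 sketch of the same shape might look like this (the CIDR ranges are my own illustrative choices, and the NAT box is left out):

import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

# One VPC with a public and a private subnet (CIDRs are assumptions)
vpc_id = ec2.create_vpc(CidrBlock="10.0.0.0/16")["Vpc"]["VpcId"]
public_id = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.0.0/24")["Subnet"]["SubnetId"]
private_id = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")["Subnet"]["SubnetId"]

# Internet gateway plus a route so the public subnet can reach the interwebs;
# the private subnet would instead route 0.0.0.0/0 through the NAT box
igw_id = ec2.create_internet_gateway()["InternetGateway"]["InternetGatewayId"]
ec2.attach_internet_gateway(InternetGatewayId=igw_id, VpcId=vpc_id)
rt_id = ec2.create_route_table(VpcId=vpc_id)["RouteTable"]["RouteTableId"]
ec2.create_route(RouteTableId=rt_id, DestinationCidrBlock="0.0.0.0/0", GatewayId=igw_id)
ec2.associate_route_table(RouteTableId=rt_id, SubnetId=public_id)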


Bio

I’ve recently had to update my bio for a pitch document. Clearly I’m getting better at this sort of thing (until the grammar police catch up with me), as this was my first attempt at a more relevant, high-level summary of my professional life.  I’d hire this guy, would you?

“David has worked with internet related technologies for over 15 years, from small CGI scripts for the BBC to the world’s first package collection service with TNT.  David has worked with various programming languages and database technologies to enable small and large organisations to use the internet for business.  Projects have included working with TNT, Acer, Sky, Unilever, GSK, British Airways, Colgate-Palmolive and BAT.  Technologies have included PHP, Python, JavaScript, MySQL, Postgres, MongoDB, various cloud service providers and API-based services including YouTube, Google Analytics, Twitter and Facebook, as well as building mobile applications using PhoneGap.”

Day 26

OK, I’ll soon stop doing posts based on the working day of the year (probably), and of course there’s a good chance I’ll revert to type and just stop blogging for a while again.

However, in the meantime I’ve been playing with building virtual machines from the command line, and I think Vagrant with Puppet may have made me see local development and my development environment in a completely different way; I’m not sure that has happened since I stopped FTPing code to sites to put them live (yep, deploy is the future, kids!).

So the next addition to the learning so far is building a virtual machine, with the LAMP setup I want, from a single command; clearly it’s the future.  I need to refine the thoughts swimming around my head slightly and think about the workflow more, but it should be pretty amazing ;-)  Stay tuned and watch out for a post further along the line with some stuff on GitHub (I won’t be the first to do it, but it will work for me, and maybe you…)

SQL Query performance vs Stored Procedure issues

So I have a lovely big piece of SQL, a recursive Common Table Expression (CTE), which returns over 2,000 rows of data (going back to 2005). To perform certain calculations we have to compare each row with the previous day’s data and update it; it’s quite expensive, but as a plain query it runs in around 1 minute on SQL 2K8 with 8GB RAM under VMware. The same query as a stored procedure takes 27 minutes to run!

Yep, the exact same SQL, when run as a stored procedure, is over 20 times slower to produce the same result set.

Time to hit Google, and the second result for the query “” brings up this little beauty of a post.

So the first check:

SET ANSI_NULLS ON

Yep, got that one covered. The second suggestion, however, of using local variables in the stored procedure query, made all the difference: in this stored procedure’s case I added three local variables that took the values of the passed-in parameters, and that was it.
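
The pattern looks something like this (a minimal sketch with made-up procedure, table and parameter names, not the client’s actual code):

CREATE PROCEDURE dbo.usp_GetDailyFigures
    @StartDate DATE,
    @EndDate   DATE
AS
BEGIN
    -- Copy the parameters into local variables so the optimiser can't
    -- "sniff" the parameter values and reuse a plan built for an
    -- unrepresentative first call
    DECLARE @LocalStart DATE = @StartDate;
    DECLARE @LocalEnd   DATE = @EndDate;

    SELECT FigureDate, FigureValue
    FROM dbo.DailyFigures
    WHERE FigureDate BETWEEN @LocalStart AND @LocalEnd;
END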

*BOOM*

(Note: the computer didn’t actually blow up, that would be irritating.)

The stored procedure now executes as fast as the SQL query.  Amazing, and now that I have a working and more timely query, hopefully I’ll have a happy client.  Now to update all of the stored procedures in the application.

Day 11

Yep if I don’t keep this up I’ll lose count.

Today started off well, with me playing with D3.js on the train on the way in, and also finding out that my RoR code in GitHub had passed its tests in Travis CI. OK, so it had also passed the tests on my laptop, but it had taken me a couple of steps to get the right YML file to make it work in Travis. Thanks to this article on Stack Overflow, I found the magic was to do with the rake db:migrate task (I was on the right track, which was nice).

My .travis.yml file is now like this:

language: ruby
rvm:
- 1.9.3
env:
- DB=sqlite
script:
- RAILS_ENV=test bundle exec rake db:migrate --trace
- bundle exec rake db:test:prepare
- bundle exec rspec spec/
bundler_args: --binstubs=./bundler_stubs

Which made things work. Now every time I push my code to GitHub it will be grabbed and tested by Travis; overkill for a sample app with a known outcome, but good practice.

The down part was getting into the office I was working at today and losing one of the rubber earbuds from my earphones; as I type this I’m listening to music in half stereo (not mono) and being subjected to the noise of my fellow commuters.

Once in to the office du jour, it was down to some data and SQL checking. Not too much of an issue, except this was SQL including some lovely CTE queries that I hadn’t looked at in over 3 months, so it took a while to get going with it, and there were some changes to make (of course) with the client “helping” me write my code. Still, I managed to validate the results from the queries and the client is happy, and of course a happy client is what we all want.

So, heading back home, it’s more D3.js awesomeness (and writing a blog post) ;-)