AWS and a bit of Slack

So now that we have code deploying into AWS and notifications from Jenkins into Slack, it would make sense if we could check what’s happening with Elastic Beanstalk when the code is deployed (and also get a heads-up on any issues with our environments).

So, Lambda and SNS to the rescue; here’s what you need:

This post about Lambda and SNS, as well as this updated code, which gets a mention in the comments.
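
To give a flavour of the Lambda side, here’s a minimal sketch (not the code from the linked post) of a handler that relays an SNS notification, such as an Elastic Beanstalk environment event, on to a Slack incoming webhook; the webhook URL and channel name are placeholders you’d swap for your own:

import json
import urllib.request

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder webhook

def lambda_handler(event, context):
    # SNS invokes the function with the notification wrapped in Records[0]
    sns = event["Records"][0]["Sns"]
    payload = {
        "channel": "#aws",  # assumed channel name
        "username": "AWS notifications",
        "text": "{}\n{}".format(sns.get("Subject"), sns.get("Message")),
    }
    req = urllib.request.Request(
        SLACK_WEBHOOK_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
    return "ok"

Subscribe the function to the SNS topic Elastic Beanstalk publishes its notifications to and the events start turning up in Slack.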

Amazon Web Services (Jenkins)

Right, so we have our handy services Virtual Private Cloud (VPC), with access via OpenVPN (and the awesome Viscosity OS X VPN client); now we need to start adding useful things into it.

(See this blog post for info)

For me the next step was looking at how we could automate deployments using our own tool chain; part of the reason we are looking at AWS is to get a bit more flexibility, along with the benefits of greater automation.  We’ve already had success using BitBucket -> Codeship -> Heroku as a workflow to make our code visible and available in a readily shareable environment, and it took < 5 minutes to get it up and running ;-)


Auto Start / Stop servers FTW

So one of the things I’ve always liked about AWS^Wcloud-based services is the ability to just spin up development instances and use them. However, I’m generally working for between 8 and 10 hours a day, and not so often at weekends, yet outside those hours my servers are still running, not doing much, and racking up costs (OK, not too much for a t1.micro, but the point still stands). I knew of companies shutting down unused servers overnight and at weekends and thought I should give that a go.

So the first attempt at this uses a scheduled Data Pipeline to run an AWS CLI command to either stop or start the servers. Sadly, due to the lack of flexibility in the scheduler in the console (please AWS, just put in a text box so I can add a crontab line), the servers get started and stopped at weekends too, but I’ve still reduced the uptime by 14 hours a day, 98 hours a week, or 5,096 hours a year (you get the point); if weekends could be taken into account it would be even more than this.

And doing this took 10 minutes, thanks to this handy tutorial provided by AWS.
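
If you’d rather script it than click through Data Pipeline, something along these lines does the same job; it’s a rough boto3 sketch (the auto-stop tag name is made up, not the tutorial’s approach), and it skips weekends, which the console scheduler couldn’t:

import datetime
import boto3

ec2 = boto3.client("ec2")

def set_dev_instances(state):
    """Start or stop every instance tagged auto-stop=true (the tag is an assumption)."""
    if datetime.date.today().weekday() >= 5:  # 5 = Saturday, 6 = Sunday
        return  # leave everything stopped over the weekend
    reservations = ec2.describe_instances(
        Filters=[{"Name": "tag:auto-stop", "Values": ["true"]}]
    )["Reservations"]
    ids = [i["InstanceId"] for r in reservations for i in r["Instances"]]
    if not ids:
        return
    if state == "start":
        ec2.start_instances(InstanceIds=ids)
    else:
        ec2.stop_instances(InstanceIds=ids)

set_dev_instances("start")  # run via cron or a scheduled job, e.g. 08:00 and 18:00 on weekdays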

Amazon Web Services (VPC + NAT + OpenVPN)

So I’m in the process of setting up a few bits and pieces on AWS, and the first area (well, second, after a couple of quick deploys using Elastic Beanstalk) is getting a Jenkins server up and running.

So I’m looking to deploy the Jenkins box within a Virtual Private Cloud (VPC) to block off access to Jenkins and also to any test slaves it will eventually spin up.  To ensure smooth access into the VPC I’m using OpenVPN.  The first step is to use the VPC wizard to create the basics; I went with the “VPC with Public and Private Subnets” option, as this handily creates the NAT Gateway box to allow servers inside the VPC to access the interwebs.
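
For reference, here’s roughly what that wizard builds, sketched with boto3; the CIDR ranges are just examples, and this uses a managed NAT gateway where the wizard spins up its own NAT box:

import boto3

ec2 = boto3.client("ec2")

# A VPC with one public and one private subnet (example CIDR ranges)
vpc_id = ec2.create_vpc(CidrBlock="10.0.0.0/16")["Vpc"]["VpcId"]
public_subnet = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.0.0/24")["Subnet"]["SubnetId"]
private_subnet = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")["Subnet"]["SubnetId"]

# Internet gateway plus a public route table so the public subnet can reach the internet
igw_id = ec2.create_internet_gateway()["InternetGateway"]["InternetGatewayId"]
ec2.attach_internet_gateway(InternetGatewayId=igw_id, VpcId=vpc_id)
public_rt = ec2.create_route_table(VpcId=vpc_id)["RouteTable"]["RouteTableId"]
ec2.create_route(RouteTableId=public_rt, DestinationCidrBlock="0.0.0.0/0", GatewayId=igw_id)
ec2.associate_route_table(RouteTableId=public_rt, SubnetId=public_subnet)

# NAT in the public subnet so instances in the private subnet can get out
eip = ec2.allocate_address(Domain="vpc")["AllocationId"]
nat_id = ec2.create_nat_gateway(SubnetId=public_subnet, AllocationId=eip)["NatGateway"]["NatGatewayId"]
ec2.get_waiter("nat_gateway_available").wait(NatGatewayIds=[nat_id])
private_rt = ec2.create_route_table(VpcId=vpc_id)["RouteTable"]["RouteTableId"]
ec2.create_route(RouteTableId=private_rt, DestinationCidrBlock="0.0.0.0/0", NatGatewayId=nat_id)
ec2.associate_route_table(RouteTableId=private_rt, SubnetId=private_subnet)

The OpenVPN box then sits in the public subnet (with a security group allowing UDP 1194 in), while Jenkins and its slaves live in the private subnet.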


AWSUGUK #3

So in the general tradition of me and blogging we reach the point where it goes a bit quiet.

Of course there are reasons / excuses etc.  Anyway, last night I went to the AWSUGUK meetup hosted at The Pregnant Man.  It was the second event I had attended (and the third event overall), and again I was impressed by the venue (a private pub will always make me happy), and the clearly knowledgeable and enthusiastic crowd helped make for a great meetup.

The host, Norman Driskell (@n0rm), introduced the evening and set the tone with a brief presentation on how Razorfish had recently leveraged AWS for an ad tied into the Super Bowl. The figures were certainly impressive and show just what you can do with AWS with a bit of planning. A couple of key takeaways:

  • Pre-warm the servers if you are expecting huge traffic spikes
  • Leverage the services that AWS offer you, rather than rolling your own in an EC2 box

The sessions were:

  • Managing your apps on AWS: Real life lessons with GigaSpaces
  • Quarterbacking the AWS Estate
  • The CentraStage experience

And I have to say all of the speakers were well prepared and rehearsed, and gave a great insight into their areas of expertise. There were good tips across all of the sessions; in particular I found the AWS Estate session very worthwhile, with some dull but useful tips regarding consolidated billing and the billing API, and of course an element of insight into some of the newer technology releases, especially Redshift and OpsWorks.  Hopefully the slides will be up soon.

Adding an SSL cert to an Amazon ELB

So recently I needed to add SSL capability to an Amazon Elastic Load Balancer (ELB), which actually meant:

– Getting the certificate, having created a new CSR and private key on the machine of your choice
– Uploading the private key, certificate and chain into Amazon using the Amazon Web Services (AWS) Identity and Access Management (IAM) service

So the first challenge was getting the command line tools and creating the relevant identity files.

Download the AWS IAM command line tools and put them somewhere you want to use them from; I put them in /usr/local/IAMCli, which I then added to my .bash_profile using the settings below (this bit is optional, but makes your life easier):

# Added for AWS CLI
export AWS_IAM_HOME=/usr/local/IAMCli
export PATH=${AWS_IAM_HOME}/bin:$PATH
export AWS_CREDENTIAL_FILE=${HOME}/path_to_credential_file/credential_file

The AWS_CREDENTIAL_FILE format is shown below. The information to put in the file comes from the “Security Credentials” tab under your account settings: add in the ID of the access key you want to use, click on “show” to reveal the secret key, then create the file and make sure you put it in the location you added to your .bash_profile. Observant people will notice this doesn’t work if you deal with multiple AWS accounts; you can always use the optional -aws-credential-file option when using the command line tools to point to the credential file you want to use.

AWSAccessKeyId=STUPID_LONG_ID
AWSSecretKey=Stupid_long_key

To upload the certificate:

$ iam-servercertupload -b public-key.pem -c cert-chain-file.pem -k private-key.pem -s domain.name

To check the certificate is in place:

$ iam-servercertgetattributes -s domain.name

And should you need to delete the certificate:

$ iam-servercertdel -s domain.name

Now, when you create the ELB, select “Secure HTTP Server” from the common applications list and save; when you continue to the next page you should be given the option to “Choose from your existing SSL Certificates”.
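
For completeness, the same upload / check / delete steps can be scripted with boto3 rather than the old IAM CLI tools; here’s a rough sketch, reusing the example file and certificate names from above:

import boto3

iam = boto3.client("iam")

# Upload the certificate, private key and chain under a name of your choosing
with open("public-key.pem") as cert, open("private-key.pem") as key, open("cert-chain-file.pem") as chain:
    iam.upload_server_certificate(
        ServerCertificateName="domain.name",
        CertificateBody=cert.read(),
        PrivateKey=key.read(),
        CertificateChain=chain.read(),
    )

# Check the certificate is in place
print(iam.get_server_certificate(ServerCertificateName="domain.name")
      ["ServerCertificate"]["ServerCertificateMetadata"])

# And should you need to delete it again:
# iam.delete_server_certificate(ServerCertificateName="domain.name")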