Hatching chicks at school

At Jessica’s school, where I am a governor and Laura works, they’ve got an incubator of eggs from Living Eggs, so that the school can watch as they hatch into chicks.

On Monday I got a text from Laura wondering if it would be possible to set up a webcam to watch them hatch. We happened to have a wireless webcam that wasn’t being used, so that evening I got it out, made sure it was working, and configured it to upload photos to an FTP server every minute and whenever motion was detected.
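The camera handled the uploads itself, but the behaviour it was configured for, pushing a timestamped snapshot to an FTP server every minute, can be sketched roughly like this. The host, credentials, capture step and filename scheme here are all placeholders, not the camera's actual settings:

```python
# Sketch of a minute-by-minute FTP snapshot upload (placeholder details).
import io
from datetime import datetime
from ftplib import FTP

def snapshot_name(now=None):
    """Build a timestamped filename, loosely like the camera's own naming."""
    now = now or datetime.now()
    return "chickcam_{}.jpg".format(now.strftime("%Y%m%d%H%M%S"))

def upload_snapshot(image_bytes, host, user, password):
    """Push one snapshot to the FTP server."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.storbinary("STOR " + snapshot_name(), io.BytesIO(image_bytes))
```

In practice you would call `upload_snapshot` from a loop or a cron job that grabs a fresh frame each minute.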

The next day I took the webcam into school and got it set up with the help of James who looks after the school computers and network. You can see the latest pictures from the webcam on the school website.

In the early hours of this morning the first egg hatched:

[Webcam stills of the first egg hatching, early morning of 2 January 2013]


Automatically publish your API when you push to github

In less than half an hour I was able to update my project to automatically publish my API to the new IBM API Management beta. Here are the steps…

Sign up for the new API Management beta: click through to ‘Cloud’ and log in with your IBMID (if you don’t have one you can create one). Once you’ve accepted the terms and conditions, your organisation will be created.

Install and configure the new toolkit CLI:

npm install -g https://beta.apim.ibmcloud.com/apimanager/toolkit/apim.toolkit
apim config:set server=beta.apim.ibmcloud.com
apim login

Create a product definition for your API:

apim create --type product --title "Travel Information" --apis product.yaml

Adjust the product definition as needed in your favourite editor

Add the x-ibm-configuration extensions to your swagger document to configure what happens when someone calls the API – in my case, invoking the backend API:

x-ibm-configuration:
  enforced: true
  phase: realized
  testable: true
  cors:
    enabled: true
  assembly:
    execute:
      - invoke:
          title: invoke
          target-url: '<backend url>'

Now switch over to your CodeShip account, load your project and go to the Deployment section of your project.

Add a custom script option and configure the following script (adding your details as needed):

npm install -g https://beta.apim.ibmcloud.com/apimanager/toolkit/apim.toolkit
apim config:set server=beta.apim.ibmcloud.com
apim login -u <username> -p <password>
apim config:set organization=<org>
apim push docs/swagger.yaml
apim stage --catalog=sb docs/travel-information.yaml
apim publish --catalog=sb docs/travel-information.yaml

Commit and push to your repository, and your updated API will be published to API Management! Here is my example API.

If you don’t already have a CodeShip account, you can sign up to CodeShip with your github account and link it to your github repository. You can then set up the tests and deployment steps in the project settings.


Great South Run 2014

Great South Run weekend is here! Today we had the 5k run, which Laura, Anne and Des took part in and all did very well, and Abi’s 1.5k Mini Run. Even Jessica enjoyed running on the race track they had there and is keen to do the mini run next time round.

Now all that’s left is my one tomorrow – I’m going to be running the 10 mile Great South Run for the first time to raise money for gain. If the technology works you should be able to watch live at http://runkeeper.com/user/rickymoorhouse and you can sponsor me at http://justgiving.com/rickymoorhouse . I’ll update this again tomorrow after the race!

My run went well – I really enjoyed it and there was a fantastic atmosphere around the course. I managed to beat my target and come in with a time of 1:59:42.


Disabling SSLv3

With POODLE, the time has come to disable SSLv3 everywhere. There will be clients that break and need fixing, but it needs doing. You can read more details and background on the vulnerability.

Here’s a few useful snippets from my experience with it this week:

Apache

Make sure the combination you have for the SSLProtocol line disables SSLv2 and v3 – something like:
SSLProtocol All -SSLv2 -SSLv3

DataPower

Ensure your crypto profiles have SSLv2 and v3 disabled in the options line:

  switch <domain>
  co 
  crypto 
  profile <profile>
  option-string OpenSSL-default+Disable-SSLv2+Disable-SSLv3
  exit 
  exit 
  write mem 

Java

If you have problems with handshakes from a Java client process, force the protocols to use with:
-Dhttps.protocols=TLSv1

nginx

Make sure the ssl_protocols line in your SSL configuration doesn’t have SSLv3 in it.
ssl_protocols TLSv1 TLSv1.1 TLSv1.2;

nodejs

Make sure you don’t have secureProtocol: SSLv3_method anywhere in your https options – use TLSv1_method instead if forcing a protocol is really needed.

Websphere

See the security bulletin.


Traffic Pi

Using my Raspberry Pi, a PiGlow and the traffic API feeds, I have created a script to give me a visual representation of the journey time to work. This gives me an idea of the traffic before I leave the house in the morning, and when I’m working at home I can look at it and see how glad I am that I’m not sitting in traffic on the way to work :)

https://github.com/rickymoorhouse/trafficpi
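The real script lives in that repository; as a minimal sketch of the idea, with invented thresholds and without the PiGlow output, the journey time could be mapped to a colour like this:

```python
# Minimal sketch of the Traffic Pi idea: classify a reported journey time as
# red/amber/green. The baseline and thresholds are invented examples; the real
# logic (and the PiGlow LED output) is in the trafficpi repository.

def traffic_colour(journey_minutes, baseline_minutes=20):
    """Classify a journey time against a free-flowing baseline."""
    if journey_minutes <= baseline_minutes * 1.2:
        return "green"   # roughly normal traffic
    if journey_minutes <= baseline_minutes * 1.6:
        return "amber"   # noticeably slower than usual
    return "red"         # glad I'm not in it
```

The colour would then drive which PiGlow LEDs light up each time the traffic feed is polled.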


Review of Elasticsearch Server 2nd Edition

Elasticsearch Server Second Edition is a good book to read if you’re getting started with Elasticsearch or considering using it. It goes through all the main areas of getting your data indexed and then searching and analysing it.

The book is well written and easy to read, and serves well as a reference guide to refer back to later. It has helped me get an overview of some of the features of Elasticsearch that I’ve not yet used, some of which I hope to explore in more depth following on from the examples in the book. All of the chapters include useful references to sources of further information on the topic covered, and for more in-depth coverage the authors recommend going on to read their other book, Mastering Elasticsearch, which I hope to read as a follow-on.
