Migrating away from WordPress… but not really

For as long as I can remember I have hosted this blog on DreamHost using WordPress. Last year I migrated to their DreamPress service, but for the tiny amount of traffic I get it wasn't worth it. That, and the non-stop emails about my WordPress install being vulnerable.

The cost and the hassle are what prompted my move away from this setup. I wanted to start serving a static blog using something like Hugo, Jekyll, Nikola or Pelican, but that meant importing all my WordPress posts, which I didn't fancy doing, so I settled on a local install of WordPress (on my FreeNAS server) and the excellent Simply Static plugin to generate a static site from it.

The install of WordPress is only accessible on my network, so no more vulnerability issues. I get all the benefits of WordPress, like social-link and analytics plugins, with the added bonus of a blazing fast static site.

So far I have been very happy with the setup. If you notice any issues please let me know @philroche.

Ubuntu cloud images and how to find the most recent cloud image – part 2/3

TLDR;

sudo snap install image-status

This will install a snap of the very useful `image-status` utility.

image-status cloud-release bionic

This will show you the serial for the most recent Ubuntu 18.04 Bionic cloud image in QCOW format.

image-status ec2-release bionic

This will show you the AWS EC2 AMIs for the most recent Ubuntu 18.04 Bionic AWS EC2 cloud images.


Part two of a three-part series.

Following on from part 1, where I detailed simplestreams and sstream-query, I present the `image-status` utility, a very neat and useful wrapper around sstream-query.

image-status is hosted on GitHub as part of Scott Moser's talk-simplestreams repo.

I recently submitted a pull request which added the ability to package image-status as a snap. This was merged, and you can now install image-status on any Linux distribution that supports snaps using the following command.

sudo snap install image-status

Once installed, you can start querying the simplestreams feeds for details on the most recent Ubuntu cloud images.

Usage:

image-status --help # to see all available options

image-status cloud-release bionic # to see most recent Ubuntu Bionic release images on http://cloud-images.ubuntu.com/
image-status cloud-daily bionic # to see most recent Ubuntu Bionic daily images on http://cloud-images.ubuntu.com/

image-status gce-release bionic # to see most recent Ubuntu Bionic release images on GCE
image-status gce-daily bionic # to see most recent Ubuntu Bionic daily images on GCE

image-status ec2-release bionic # to see most recent Ubuntu Bionic release AMIs on EC2
image-status ec2-daily bionic # to see most recent Ubuntu Bionic daily AMIs on EC2

image-status azure-release bionic # to see most recent Ubuntu Bionic release images on Azure
image-status azure-daily bionic # to see most recent Ubuntu Bionic daily images on Azure

image-status maas-release bionic # to see most recent Ubuntu Bionic release images for MAAS v2
image-status maas-daily bionic # to see most recent Ubuntu Bionic daily images for MAAS v2

image-status maas3-release bionic # to see most recent Ubuntu Bionic release images for MAAS v3
image-status maas3-daily bionic # to see most recent Ubuntu Bionic daily images for MAAS v3

I find this very useful when trying to quickly see the most recent Ubuntu image on any particular public cloud, e.g.:

image-status ec2-release bionic | grep eu-west-1 | grep hvm | grep ssd | awk '{split($0,a," "); print a[6]}'

This will return the AMI ID for the most recent HVM EBS-backed Ubuntu 18.04 (Bionic) image in the eu-west-1 AWS EC2 region. The same can be achieved using sstream-query, but I find filtering with grep easier to understand and iterate on.
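
If you use a pipeline like this often, it is easy to wrap in a small shell function. A minimal sketch, assuming image-status keeps the AMI ID in the sixth column of its ec2-release output (as in the pipeline above):

# latest_ami: print the most recent Ubuntu AMI ID for a given release and region.
latest_ami() {
    local release="${1:-bionic}"
    local region="${2:-eu-west-1}"
    image-status ec2-release "${release}" \
        | grep "${region}" | grep hvm | grep ssd \
        | awk '{print $6}'
}

latest_ami bionic eu-west-1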

I hope the above is helpful with your automation.

Google Photos Programming “API” Hack

When investigating the Python API for Google Photos, it soon became apparent that it was no longer possible to add existing photos to an existing album.

The video shows how I managed to do this by recording HTTP requests in Google Chrome and exporting them as cURL commands.

You will have to export the request every time your logged-in session expires, but for my use case this is not a problem.
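
If you want to replay the request from a script, the shape is roughly the following. This is only a sketch: the URL, cookie and body below are placeholders for whatever Chrome's "Copy as cURL" gives you, and they stop working once the session expires.

# upload.sh - replay a request recorded in Chrome DevTools.
# Every value here is a placeholder; paste the real command from
# DevTools -> Network -> right-click on the request -> Copy as cURL.
curl 'https://photos.google.com/...' \
  -H 'cookie: %PASTED_SESSION_COOKIES%' \
  --data-binary @request_body.bin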

I hope this helps someone.

Ubuntu cloud images and how to find the most recent cloud image – part 1/3

TLDR;

sstream-query --json --max=1 --keyring=/usr/share/keyrings/ubuntu-cloudimage-keyring.gpg http://cloud-images.ubuntu.com/releases/streams/v1/com.ubuntu.cloud:released:download.sjson arch=amd64 release_codename='Xenial Xerus' ftype='disk1.img' | jq -r '.[].item_url'

This will show you the URL for the most recent Ubuntu 16.04 Xenial cloud image in QCOW format.


Part one of a three-part series.

There are a few ways to find the most recent Ubuntu cloud image, and the simplest method is to view the release page, which lists the most recent release.

Another method is to use the cloud image simplestreams data, which we (I work on the Certified Public Cloud team @ Canonical) update every time we publish an image.

We publish simplestreams data for the major public clouds too, but this post deals with the base Ubuntu cloud image. I will follow up with details on how to use the cloud-specific streams data.

Simplestreams

Simplestreams is a structured format describing the Ubuntu cloud image releases.

You can parse Ubuntu's released cloud image stream JSON yourself, or you can use a combination of sstream-query and jq (install the packages "ubuntu-cloudimage-keyring", "simplestreams" and "jq") to get all or specific data about the most recent release.

Query all data for the most recent release

sstream-query --json --max=1 --keyring=/usr/share/keyrings/ubuntu-cloudimage-keyring.gpg http://cloud-images.ubuntu.com/releases/ arch=amd64 release='xenial' ftype='disk1.img'

This will return all data on the release, including the release date and the checksums of the file.

[
  {
    "aliases": "16.04,default,lts,x,xenial",
    "arch": "amd64",
    "content_id": "com.ubuntu.cloud:released:download",
    "datatype": "image-downloads",
    "format": "products:1.0",
    "ftype": "disk1.img",
    "item_name": "disk1.img",
    "item_url": "http://cloud-images.ubuntu.com/releases/server/releases/xenial/release-20180126/ubuntu-16.04-server-cloudimg-amd64-disk1.img",
    "label": "release",
    "license": "http://www.canonical.com/intellectual-property-policy",
    "md5": "9cb8ed487ad8fbc8b7d082968915c4fd",
    "os": "ubuntu",
    "path": "server/releases/xenial/release-20180126/ubuntu-16.04-server-cloudimg-amd64-disk1.img",
    "product_name": "com.ubuntu.cloud:server:16.04:amd64",
    "pubname": "ubuntu-xenial-16.04-amd64-server-20180126",
    "release": "xenial",
    "release_codename": "Xenial Xerus",
    "release_title": "16.04 LTS",
    "sha256": "da7a59cbaf43eaaa83ded0b0588bdcee4e722d9355bd6b9bfddd01b2e7e372e2",
    "size": "289603584",
    "support_eol": "2021-04-21",
    "supported": "True",
    "updated": "Wed, 07 Feb 2018 03:58:59 +0000",
    "version": "16.04",
    "version_name": "20180126"
  }
]

Query only the URL of the most recent release

sstream-query --json --max=1 --keyring=/usr/share/keyrings/ubuntu-cloudimage-keyring.gpg http://cloud-images.ubuntu.com/releases/streams/v1/com.ubuntu.cloud:released:download.sjson arch=amd64 release_codename='Xenial Xerus' ftype='disk1.img' | jq -r '.[].item_url'

This will show you the URL for the most recent Ubuntu 16.04 Xenial cloud image in QCOW format.

"http://cloud-images.ubuntu.com/releases/server/releases/xenial/release-20180126/ubuntu-16.04-server-cloudimg-amd64-disk1.img"

Query only the serial of the most recent release

sstream-query --json --max=1 --keyring=/usr/share/keyrings/ubuntu-cloudimage-keyring.gpg http://cloud-images.ubuntu.com/releases/ arch=amd64 release_codename='Xenial Xerus' ftype='disk1.img' | jq ".[].version_name"

This will show you the serial of the most recent Ubuntu 16.04 Xenial cloud image.

"20180126"

The above streams are signed using keys in the ubuntu-cloudimage-keyring keyring, but you can replace the --keyring option with --no-verify to bypass any signing checks. Another way to bypass the checks is to use the unsigned streams.

It is also worth noting that OpenStack can be configured to use streams too.

I hope the above is helpful with your automation.

Xerox DocuMate 3220 scanner on Ubuntu

TLDR; This blog post confirms that the Xerox DocuMate 3220 does work on Ubuntu and shows how to grant non-root users permission to use it.

I was using my wife's old all-in-one printer/scanner for scanning documents and it worked well, but it was a pain to scan multiple documents, so I decided to get a business scanner with auto feed and duplex scanning.

I went for the Xerox DocuMate 3220 as it was stated to be SANE compatible, so it would work on Linux.

With an RRP of ~€310, I managed to get a refurbished model for €98 delivered from eBay. Sadly I hadn't done enough research, as the scanner is not in fact supported by SANE.

While researching how to add the scanner to the xerox_mfp SANE backend config (which didn't work), I discovered that VueScan is available for Linux and that its list of supported scanners includes some of the Xerox DocuMate series. I had used VueScan on my old MacBook Pro and was very happy with it, so I gave it a shot. Note that VueScan is neither open source nor free, but it is excellent software and well worth the €25 purchase price.

Lo and behold, it found the scanner and supported all of the scanner's features:

  • Flatbed scanning
  • Auto feed
  • Duplex auto feed

However, VueScan would only detect the scanner when run as root, due to libusb permissions.

To give non-root users permission to use the scanner I made the following changes. This guide should also be helpful when changing permissions for any USB device. The following changes were made on an Ubuntu 17.10 machine.

# Add myself to the scanner group. You can do this through the “Users and Groups” GUI too.

philroche@bomek:$ sudo usermod -a -G scanner philroche

# Find the scanner vendor id and product id

Running dmesg, we can see the scanner listed with idVendor=04a7 and idProduct=04bf:

philroche@bomek:$ dmesg
usb 1-2.4.3: new high-speed USB device number 26 using xhci_hcd
usb 1-2.4.3: New USB device found, idVendor=04a7, idProduct=04bf
usb 1-2.4.3: New USB device strings: Mfr=1, Product=2, SerialNumber=3
usb 1-2.4.3: Product: DM3220
usb 1-2.4.3: Manufacturer: Xerox
usb 1-2.4.3: SerialNumber: 3ASDHC0333

Note: The device number will most likely be different on your system.

Running lsusb, we can see that the scanner is also listed as “Visioneer”:

philroche@bomek:$ lsusb
Bus 001 Device 026: ID 04a7:04bf Visioneer

Note: As with the device number, the bus number is likely to be different on your system.

We can see above that the device is on bus 001 as device 026. Using this info, we can get the full udev (dynamic device management) info.

philroche@bomek:$ udevadm info -a -p $(udevadm info -q path -n /dev/bus/usb/001/026)
looking at device '/devices/pci0000:00/0000:00:14.0/usb1/1-2/1-2.4/1-2.4.3':
 KERNEL=="1-2.4.3"
 SUBSYSTEM=="usb"
 DRIVER=="usb"
 ATTR{authorized}=="1"
 ATTR{avoid_reset_quirk}=="0"
 ATTR{bConfigurationValue}=="1"
 ATTR{bDeviceClass}=="00"
 ATTR{bDeviceProtocol}=="00"
 ATTR{bDeviceSubClass}=="00"
 ATTR{bMaxPacketSize0}=="64"
 ATTR{bMaxPower}=="0mA"
 ATTR{bNumConfigurations}=="1"
 ATTR{bNumInterfaces}==" 1"
 ATTR{bcdDevice}=="0001"
 ATTR{bmAttributes}=="c0"
 ATTR{busnum}=="1"
 ATTR{configuration}==""
 ATTR{devnum}=="26"
 ATTR{devpath}=="2.4.3"
 ATTR{idProduct}=="04bf"
 ATTR{idVendor}=="04a7"
 ATTR{ltm_capable}=="no"
 ATTR{manufacturer}=="Xerox"
 ATTR{maxchild}=="0"
 ATTR{product}=="DM3220"
 ATTR{quirks}=="0x0"
 ATTR{removable}=="unknown"
 ATTR{serial}=="3ASDHC0333"
 ATTR{speed}=="480"
 ATTR{urbnum}=="1251"
 ATTR{version}==" 2.00"

This is the info we need to create our udev rule.

# Add udev rules allowing non-root users access to the scanner

Create a new udev rule:

philroche@bomek:$ sudo nano /etc/udev/rules.d/71-xeroxdocument3220.rules

Paste the following text into that new file:

SUBSYSTEM=="usb", ATTR{manufacturer}=="Xerox", ATTR{product}=="DM3220", ATTR{idVendor}=="04a7", ATTR{idProduct}=="04bf", MODE="0666", GROUP="scanner"

This adds a rule allowing any user in the "scanner" group (which we added ourselves to earlier) permission to use the USB device with vendor ID 04a7 and product ID 04bf.

Note: you will have to log out and back in for the group change to take effect, or run su - $USER.

# Reload the udev rules

philroche@bomek:$ sudo udevadm control --reload-rules
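
Reloading only re-reads the rules; they are applied the next time the device is seen. Re-plug the scanner, or trigger the rules manually and check the device node's permissions (using your own bus/device numbers):

philroche@bomek:$ sudo udevadm trigger
philroche@bomek:$ ls -l /dev/bus/usb/001/026
# The listing should now show mode crw-rw-rw- and group "scanner".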

# Test these new udev rules

philroche@bomek:$ udevadm test $(udevadm info -q path -n /dev/bus/usb/001/026)

You shouldn't see any permissions-related errors in the output.

VueScan should now detect the scanner when run as a non-root user.

# Start VueScan

philroche@bomek:$ ./vuescan

Creating a VPN server on AWS using PiVPN

One of the streaming services I use, NowTV, recently launched an Irish service alongside the UK service I was using. The Irish service costs double what the UK one does. They have also begun geoblocking Irish users, as well as users of VPN services like ExpressVPN and Private Internet Access, from the UK service.

To get around this I decided to set up my own VPN server on AWS in their London datacentre.

The easiest way I have found to set up a VPN server is to use PiVPN (http://www.pivpn.io/), which was designed for use on the Raspberry Pi but can be installed on any Debian-based machine.

There have been a few recent guides on how to install PiVPN, but this one focuses on installing on AWS.

A prerequisite for this guide is that you have an AWS account. If this is your first time using AWS you can avail of their free tier for the first year, which means you could run a reliable VPN server free for a whole year. You will also need an SSH key pair.

The steps are as follows:

  1. Start up an instance of Ubuntu Server on AWS in the London region
  2. Install PiVPN
  3. Download VPN configuration files for use locally

1. Start up an instance of Ubuntu Server on AWS in the London region

Log in to your AWS account and select the London region, also referred to as eu-west-2.

Create a new security group for use with your VPN server.

This new group sets up the firewall rules for our server and will allow access only on TCP port 22 for SSH traffic and UDP port 1194 for VPN traffic.
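
If you prefer the command line, the same group can be created with the AWS CLI. A sketch, assuming your credentials are configured and that you are using the default VPC (with a custom VPC you would pass --group-id instead of --group-name when authorising the rules):

# Create the security group and open SSH (TCP 22) and OpenVPN (UDP 1194).
aws ec2 create-security-group --region eu-west-2 \
    --group-name pivpn --description "PiVPN server"

aws ec2 authorize-security-group-ingress --region eu-west-2 \
    --group-name pivpn --protocol tcp --port 22 --cidr 0.0.0.0/0

aws ec2 authorize-security-group-ingress --region eu-west-2 \
    --group-name pivpn --protocol udp --port 1194 --cidr 0.0.0.0/0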

Launch a server instance

We will choose Ubuntu Server 16.04 as it is a Debian-based distro, so PiVPN will install on it.

Choose the t2.micro instance type. This is the instance type that is free-tier eligible.

Leave the instance details at their defaults.

Leave storage as the default 8GB SSD.

No need to add any tags.

Choose the security group we previously created.

Review settings – nothing to change here.

Click Launch and specify either a new SSH key pair or an existing one. I have chosen an existing pair called "philroche".

Check the checkbox about key access and click Launch Instances. Your instance will now launch.

Click View Instances and, once the state has changed to running, note the IPv4 public IP. You now have an instance of Ubuntu Server running on AWS in their London datacentre.

2. Install PiVPN

SSH in to your new server using the private key from the pair specified when launching the server.

ssh -i ~/.ssh/philroche ubuntu@%IPV4IPAddress%

replacing %IPV4IPAddress% with the IP address of your server.

Once logged in, update the package lists on the server.

sudo apt-get update

Start the PiVPN installer.

curl -L https://install.pivpn.io | bash

For more detail on this, see http://www.pivpn.io/#tech

You are then guided through the process of installing all the required software and configuring the VPN server:

Choose the default ubuntu user.

We do want to enable unattended upgrades of security patches.

Choose UDP as the protocol to use.

Choose the default port 1194.

Create a 2048-bit encryption key.

Choose to use your server's public IP address.

Choose whichever DNS provider you would like to use. I chose Google.

Installation is now complete 🙂

Choose to reboot the server.

Once the server has rebooted (check the AWS dashboard for its status), SSH back in to the server.

Now we need to configure a VPN profile that we can use to connect to the VPN server.

The easiest way to do this is to use the pivpn command-line utility.

pivpn add

This will guide you through the process of creating a profile. Make sure to use a strong password, and note both the profile name and the password as you will need these later.
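
PiVPN ships a few more subcommands that come in handy later, for example:

pivpn list # list the profiles that exist on the server
pivpn revoke # revoke a profile you no longer want to allow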

Setup is now complete, so you can log out.

3. Download VPN configuration files for use locally

The only thing left to do is to download the profile you created from the server so that you can use it locally.

scp -i ~/.ssh/philroche ubuntu@%IPV4IPAddress%:/home/ubuntu/ovpns/%PROFILENAME%.ovpn .

replacing %IPV4IPAddress% with the IP address of your server and %PROFILENAME% with the name of the profile you created.

This will download the .ovpn file to your current directory.

Once downloaded, you can import this profile into your VPN client software of choice.
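
On a Linux machine, for example, you can test the profile directly with the OpenVPN client (assuming the openvpn package is available on your distribution):

sudo apt-get install openvpn
sudo openvpn --config %PROFILENAME%.ovpn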

I used the profile on a small Nexx WT3020 I had with OpenWrt installed. I connect this to my NowTV box so I can continue to use NowTV UK instead of the overpriced NowTV Ireland.

I hope this guide was helpful.

The ultimate wifi upgrade

I have been procrastinating for a very long time about whether or not to take the plunge and upgrade my office/home WiFi setup. The goal of the upgrade is to have complete high-speed WiFi coverage throughout my house and seamless handover between access points.

TOUGHSwitch TS‑8‑PRO

Today I bit the bullet and decided to buy a five-pack of Ubiquiti UniFi AC Lite APs and one Ubiquiti TOUGHSwitch TS‑8‑PRO. I could have gone for the Pro or HD access points, but for my use case they would be overkill.

Ubiquiti products seem to be the industry go-to; we use them at Canonical sprints and have never had a problem. I also purchased a 305m spool of Cat 6 cable, a Platinum Tools EZ-RJPRO crimp tool and connectors to make it easier to properly terminate the connections.

UniFi AC Lite AP

All the access points are powered over Ethernet (PoE), so they will not require power points in the ceiling.

This setup does require running Ubiquiti's UniFi controller software, but thankfully there is a Docker image which sets this up and which I can run on my FreeNAS box.

All this means I should achieve my goal: high-speed WiFi throughout the house and seamless handover between access points. It will also hopefully mean that I no longer require any Ethernet-over-powerline adapters.

I plan on taking a few pictures of the setup as it progresses, as well as performing speed tests. Watch this space.

Configuring Arduino IDE for use with Autonomo board and RN2483 Lora Shield

Much of the following setup is from the official Autonomo docs, but with a bit more detail and more screenshots.

  • Download the latest version of the Arduino IDE (1.6.8 at time of writing) from https://www.arduino.cc/en/Main/Software
  • Extract the archive and run the install script, which will add a shortcut for the application to your main applications menu.
  • If you are using Linux you will need to add yourself to the ‘dialout’ user group (see the command after this list) and log out and back in for the change to take effect. This is so that you have permission to access the COM ports.
  • We then need to tell the Arduino IDE about our SODAQ Autonomo board. The Autonomo’s board profile is available through the Arduino Boards Manager. In order to install the board files you will need to first add the SODAQ board manager URL (http://downloads.sodaq.net/package_sodaq_index.json) to File->Preferences->Additional Board Manager URLs.
  • Once this is done we need to download the board profile for the Autonomo using the Arduino IDE’s Boards Manager.
  • Search for ‘sodaq’ and click install for the latest SODAQ SAMD boards.
  • You will now see the Autonomo board listed.
  • Now that we have the board installed, we need to install the SODAQ-specific libraries that we are likely to use. We can do this using the Library Manager.
  • Search for ‘sodaq’ in the Library Manager and install the libraries you are likely to use.
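
For reference, the group change mentioned in the list above is a one-liner; log out and back in afterwards:

# Add yourself to the dialout group for serial/COM port access.
sudo usermod -a -G dialout $USER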

Your Autonomo board is now configured on Arduino IDE and you can continue development as you would with any Arduino board.

Deploying Keys and Certs to a Node.js app on AWS OpsWorks

I have a Node.js app running on AWS, deployed using AWS OpsWorks. The app relies on an AWS IoT certificate and an AWS IoT private key being present, and I don't want to add the key and certificate to my Git repo.

The solution I ended up with was to use the AWS OpsWorks app environment variables to pass in the certificate and key, and to read these from the Node.js app.

OpsWorks replaces all newline characters with spaces, so in our app we have to reverse this:

// OpsWorks collapsed the newlines in the PEM data to spaces; restore them,
// then repair the header/footer lines whose real spaces were also converted.
var iotcert = process.env.IOTCERT.split(" ").join("\n")
    .replace("BEGIN\nCERTIFICATE", "BEGIN CERTIFICATE")
    .replace("END\nCERTIFICATE", "END CERTIFICATE");
var iotkey = process.env.IOTKEY.split(" ").join("\n")
    .replace("BEGIN\nRSA\nPRIVATE\nKEY", "BEGIN RSA PRIVATE KEY")
    .replace("END\nRSA\nPRIVATE\nKEY", "END RSA PRIVATE KEY");
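
To exercise the restore logic locally you can reproduce the mangling OpsWorks performs. A sketch, where app.js, certificate.pem and private.key are assumed file names:

# Simulate OpsWorks' newline-to-space mangling and run the app.
IOTCERT="$(tr '\n' ' ' < certificate.pem)" \
IOTKEY="$(tr '\n' ' ' < private.key)" \
node app.js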

… Problem solved 🙂

I suppose it is a little less secure than having the certificate and key on the file system with read-only access for the Node.js process, but it's a lot more secure than the certificate and key being hosted on GitHub.

Python command history

Obviously IPython is the bee's knees when it comes to Python shells, but if you don't have IPython installed then getting command history can be a pain. Not any more 🙂

import readline; print('\n'.join(str(readline.get_history_item(i)) for i in range(1, readline.get_current_history_length() + 1)))

This will print all Python commands run during that session (readline numbers history items from 1, hence the range starting at 1).

Also as a gist

If you have IPython installed, it's as simple as:

%history