npm bash-conf, for sharing configs between bash and node

API keys, DB authentication details, passwords, system paths and other things of that nature don't belong in a Git repo.

They belong in a lovely config file created by an install script run after cloning a repo. That’s how I try and do things these days anyway.

I use a lot of bash scripting for big queries and file manipulation, often creating strings and files in Coldfusion and saving them to Amazon S3 or the HDD for a bash script to sort out later.

So these install scripts ask for db credentials, API keys etc and you type your responses in to match the environment you’re on (production, staging, dev) and the bash scripts work away after importing the data.

I've started using Node.js for these back-end processes more and more frequently now, allowing me to develop more complex scripts that are easier to debug, easier to maintain and, best of all, mean my team mates can work on them as well!

In order to avoid having another JS-friendly config file for my node scripts to read, and also to avoid passing data in when the scripts are called, I've written an npm module that can read the same config file bash uses.

To quickly summarise how it works, I'll just dump the usage snippet from the README. You just have to give it the path to a config file, which is just a list of simple FOO=BAR style declarations:

var path = process.argv[ 2 ],
    BashConf = require( 'bash-conf' ),
    bashConf = new BashConf();

bashConf
    .read( path )
    .then( function( data ) {
        console.log( 'what is foo', data.FOO );
        console.log( 'this should be empty -> [', data.EMPTY_VAR, ']' );
    })
    .catch( function( err ) {
        console.log( err );
    });

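To make that concrete, here's a rough sketch of what one of these config files looks like from the bash side – the FOO and EMPTY_VAR names come from the snippet above, and the /tmp path is made up for the example:

```shell
# a made-up config file in the simple FOO=BAR format
cat > /tmp/example.conf <<'EOF'
FOO=BAR
EMPTY_VAR=
EOF

# bash just sources it, and each declaration becomes a variable
source /tmp/example.conf
echo "what is foo: $FOO"
echo "this should be empty -> [ $EMPTY_VAR ]"
```

The node script then points bash-conf at the very same file, so there's only one place to keep the credentials.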
If you want more info, get in touch, use Github issues or RTFM!
Also get in touch if the manual needs updating…

Mac vs PC

Before I get started, I need to say a couple of things.

First of all, I'm not really sure I want to write this post – I had a bit of a rant a while back and was encouraged to post the transcript. I can't remember it all so I'm just going to start fresh – hopefully it won't just be a load of rage waffle, I was quite caught up in the moment at the time.

Secondly, I hate the phrase "Mac vs PC" – I know everyone now thinks of a PC as a Windows computer, but it's really just an abbreviation for Personal Computer… which, I think you'll find, is also a Mac, or a Linux desktop, and so on and so forth. So it's a bit stupid anyway.

Right then, let’s get properly started…

Many years ago, I was a complete and utter Anti-Mac bastard! I’d always had Windows towers, I’d build them myself, spec them high for gaming and on top of that I had Nokia mobile phones. It was great. I was such a cool techy person.

I knew Macs were for artists, or designers, or people in the creative industry or some crap like that and I just believed what I was told – I KNEW it. It was a fact.

Then I met Mike Holman – the least geeky geek in the world (as in he really is technically a geek, but he doesn't do all the geeky stuff like thinking Star Wars is amazing). Mike is a very capable programmer, he's far too clever, knows about loads of cool open source software, and understands much more about the real technical side of things than I do… but he was using a Mac! How could I take him seriously?! Real techies use Windows, right? (Unless they live in basements, then they use Linux.)

So for fun, me and Mike went full on into Mac vs PC mode, and we ran sister sites of decent software to add to your Mac or WINDOWS MACHINE (both, sadly, now dead).

I continued to Mac bash for many years, then the iPhone came out and all the other fancy smart phones. I bought a Nokia N900 – better than anything that ever was or ever will be (again, a fact). I continued with my desktop PCs, I pulled the case off, hoovered it, decoded the weird light patterns the motherboard would tell me when stuff broke. I returned shitty RAM to ebuyer because of bad memory addresses, I worried about my RAID arrays failing on me (do I want redundancy or performance?!?!??! Totes performance, obvs).

It went on like this, and everything was fine. Until someone wanted me to build them an iOS app. I tried to convince them a website would do. They still wanted an app. So I said fine, I’ll build one.

This is where everything changed.

I didn't have a Mac, so I borrowed one. My previous experience with Apple products was the old Mac Pros at UWE, which were pre-Intel, and rubbish, and mostly just showed the spinning beach ball.

They genuinely did suck. Seriously. Literally (they had fans that suck hot air out the case). They were also bad computers.

So I borrowed a Mac Mini, did some iOS development, hacked together some Objective-C whilst wishing it could be more like JavaScript (which it can be now, with Swift), and pushed some apps to the App Store.

The Mac Mini was OK, but slow. So I bought a MacBook Air with an SSD. Which still is brilliant. It's brilliant right now. It's warming my thighs as I type this textual gold that you're lapping up right now.

It’s a Mid-2011 i5 4GB 11 inch laptop, covered in stickers, and I love it. It struggles a bit with compilation in Xcode but other than that, it’s fucking amazing and can happily run a big wide screen monitor if I’m set up at a desk.

I also got myself an iPhone 3GS to develop apps on, then an iPad 2 because they were just a lovely way to be lazy and have the internet at the same time.

Time went on… I didn’t convert to Mac, my old PC was still in the house running Windows and I used Windows all day at work. I was using both Mac OSX and Windows 7 side by side, and getting the benefits of both.

I was also using headless Linux servers at work, and realising that Bash scripting and working on the command line was actually better than anything else for a lot of work type things. Being able to do a lot of that in OS X on a Mac was crap tonnes better than using Windows cmd.exe…

tail -n 100 | awk '{print $9}' | sort | uniq -c | sort -n

Try doing that in Windows… there is probably a way, and I don’t know if you can do all that in OS X anyway – but the point stands, Linux is cool.

To top it all off, I also had an old PC connected to the living room TV running Ubuntu as a media center, for iPlayer and YouTube etc.

It was great… despite my initial resistance, I’d begrudgingly got a Mac, did some development and now my house is covered in all kinds of technology.

I no longer have a preference for any particular platform or OS.

I've got a 4-monitor Windows 10 custom-build PC at work and my faithful 11″ MacBook Air that follows me around and does all my freelance work.

Then to complement them, and to ensure that I have some form of internet access within 2 metres at all times, between work and home there's a couple more Macs, some iPads, some Android tablets, iPhones, an Xbox, a PlayStation, a Wii, and a variety of Ubuntu and CentOS headless cloud servers so I get my daily fix of CLI.

Oh and something more important than the badge on the computer and the OS that it runs… get an SSD. A Solid State Drive.

A windows laptop with a steampunk, mechanical, spinning disk stack storage device is worse than a Mac with an SSD.

A Mac with a motorised, multi-level miniature record player with pointy reading devices that wobble is worse than a Windows machine with an SSD.

There's so much cool hardware available now, but the important thing is the software that's available on all the platforms. Things like Dropbox, Evernote and all the Google Apps mean that whatever device I'm on, I can get access to all my files, spreadsheets, notes, pictures etc.

It’s all so good, and I’m lucky enough to have realised that by not choosing sides, I get to have the best of all of them.

To paraphrase a very, very wise man I had the pleasure of working with recently, when asked the question “Taylor Swift or Katy Perry, who would you rather date?”

His response was (roughly along the lines of )

“Come on guys! I date both of them”

And he’s right, although there is no need to stop there.

Mac Vs PC – sure, have that argument. If you’re an idiot.

Mac AND PC? Sure. Good start. You're on the right track.

Now I’ll just post this from OSX, check it’s gone live in bed from iOS, see if I’ve had any likes whilst using Windows tomorrow before using CentOS to push some websites live and then check the updates on the Android tablet. In the evening I could play on the Wii with the kids, update my Ubuntu cloud server, watch a Blu-Ray on the PS3 then play on the Xbox 360.

In all seriousness though… the PS3 is pants in comparison to the Xbox. It’s just a fact.

Oh yeah and Blackberry. About Blackberry.

No. Just, no.


Installing Coldfusion 10 on Ubuntu!

This keeps tripping me up and it's caused me problems when upgrading production servers… you need to export an installation temp directory when installing Coldfusion on a Red Hat or Ubuntu server.

# run both in the same shell session you'll launch the installer from
mkdir /root/cf_install
export IATEMPDIR=/root/cf_install

Otherwise the installer completes but fails silently on loads of stuff and Coldfusion don’t work proper!

Here’s an Adobe Forum post about the last time I made this mistake:

Mac OSX Ubuntu 12 lts VirtualBox with DropBox DocumentRoot

I’ve just switched from MAMP to VirtualBox for local development. I found Railo was clashing with Apache too much, so I’m going to run 2 different VMs for Coldfusion and PHP development separately.

Installing VirtualBox is easy enough, and creating an Ubuntu VDI (Virtual Disk Image) is just a case of following the wizard.

I downloaded the Ubuntu 12.04 LTS .iso from the Ubuntu website, then mounted that (all pretty easy stuff with VirtualBox) and booted the VM.

This launched the Ubuntu installer – again an easy wizard you just run through. At the point where you choose additional software, though, I picked LAMP, SAMBA and OpenSSH.

This installs Apache, PHP and MySQL, as well as the SAMBA file sharing server and an OpenSSH server.

When the installer has finished, you need to stop the virtual machine, go into the settings and under "Network" select "Bridged Adapter". This makes sure the VM gets an IP on your network, so you can visit the dev IP and set up your local hosts file to see your dev servers.

So once you've got your new VM's IP, on your Mac you'll need to do this:

sudo vi /etc/hosts
# add this line: <your-vm-ip> localtestserver

Then the awkward part started… I needed to install “Guest Additions”. I don’t really know what this stuff does but I know you need it to get file sharing working.

Under the "Devices" menu of the running VM, you need to select "Install Guest Additions…". This mounts an .iso, but I couldn't find it anywhere in my VM. Turns out I needed to mount the image manually. Once it was mounted, I could then install the guest additions:

sudo mount /dev/sr0 /media
cd /media
sudo ./VBoxLinuxAdditions.run

And that installed my Guest Additions! Whatever they are…?

Next, under the settings for the VM, in the Shared Folders section you can pick a folder on your actual host hard drive (including a folder from DropBox!) and give it a name that the Ubuntu system will see. Choose "Auto-mount" to make sure Ubuntu mounts it on startup.

If you restart your VM you should see in the /media/ directory a new directory that matches the name you chose in the previous step. The problem now is that it's owned by root, belongs to the group vboxsf and has the permissions 770.

pete@wm-lamp:/media$ ls -la
total 12
drwxr-xr-x  4 root root   4096 Sep 28 08:30 .
drwxr-xr-x 23 root root   4096 Sep 27 20:31 ..
drwxr-xr-x  2 root root   4096 Sep 27 20:30 cdrom
drwxrwx---  1 root vboxsf  646 Sep 22 10:22 my_new_share

This means I can’t actually use the folder without being root, which is fairly useless. To solve this, I needed to add “pete” and also “www-data” (the user apache runs as) to the group vboxsf. Then there was one more thing to do to get access to that folder… reboot the machine.
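The group change itself is just a couple of usermod calls – "pete" is obviously my username from the listing above, so swap in your own:

```shell
# -aG appends the user to the group without clobbering their other groups
sudo usermod -aG vboxsf pete
sudo usermod -aG vboxsf www-data

# membership only shows up after a fresh login (or that reboot)
groups pete
```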

In fact, if any of these steps don't work, reboot the machine between steps… always try turning it off and on again!

One more thing to do then – I wanted to make this new share (which is fixed to the /media directory) my site root.

First you need to make a symlink from your site root to the share:

cd /var/www/vhosts/my_new_site
ln -s /media/my_new_share public_html

You need to make sure that the folders are accessible by www-data; I generally set everything under the vhosts folder to be owned by my user and in the group www-data.

Then create a new apache vhost file to use that share. You need to make sure you set the vhost to follow symlinks:

cd /etc/apache2/sites-available
sudo cp default my_new_site
sudo vi my_new_site
# set up site root - mine is pointing to /var/www/vhosts/my_new_site/public_html
# make sure ServerName is set to your localtestserver
# make sure the Directory block's Options include FollowSymLinks
sudo a2ensite my_new_site
sudo service apache2 reload
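For reference, the important bits of that vhost file end up looking roughly like this – Apache 2.2 syntax, since this is Ubuntu 12.04, with the paths and ServerName from above:

```apache
<VirtualHost *:80>
    ServerName localtestserver
    DocumentRoot /var/www/vhosts/my_new_site/public_html

    <Directory /var/www/vhosts/my_new_site/public_html>
        # FollowSymLinks is the crucial bit, otherwise Apache refuses
        # to serve anything through the symlinked share
        Options Indexes FollowSymLinks
        AllowOverride All
        Order allow,deny
        Allow from all
    </Directory>
</VirtualHost>
```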

Then if you open a web browser and visit http://localtestserver you will hopefully see the sites in your DropBox folder being served via your Ubuntu virtual machine! Mega convoluted goodness.

I switched to this because I like Ubuntu, I use it on lots of other dev servers, it's a lot more similar to Red Hat which I use in production, it's better than MAMP and I don't like trying to mix my dev machine with my dev server. It feels tidier once it's all set up.

And like I said before I can have multiple dev machines and switch between them. At this point I’ll make a clone of the server and keep that as a fresh install so I don’t have to go through this whole process again later.

One more thing, this entry has been a bit scatty, I’ve written it as I’ve gone along so if it’s rubbish let me know and I’ll change it. I’ll probably add some pictures later on in life, but not until I need to use this for reference myself.

Anyway, now I’ve got a decent dev server, I need to go and do some devving.

vi commands

vi is an excellent text editor, I use it all the time when I'm logged into a server via PuTTY.

I'm more used to doing it in Windows though, and recently I've been spending more time using a Mac. The keyboard shortcuts are a bit different, or there are just fewer keys on this Mac keyboard, so I've had to look up a list of commands so I can type faster in vi!

So here’s a link to a page listing loads of vi commands:

I'll probably get around to making a page on the blog somewhere that's a bit more concise than that page, as it's quite long, but it's still very useful and it has examples as well as the actual commands.
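Until that page exists, here's the short list of vi commands I actually reach for day to day:

```
i        insert before the cursor (Esc returns to command mode)
x        delete the character under the cursor
dd       delete the current line
yy       copy ("yank") the current line
p        paste the yanked or deleted line below
/foo     search forwards for "foo" (n jumps to the next match)
:w       save
:q       quit (:q! to quit without saving)
:wq      save and quit
u        undo
```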

Simple passwordless SSH tutorial

It took me many attempts to work out password-less ssh, mainly because a lot of internet guides are not written that well and following them step by step didn’t work.

Then I found one that did work! Woohoo!

So I’ve copied and pasted it here… and regrettably forgotten the link I found it from. If anyone recognizes this then I’ll gladly reference it to the correct source.

The guide makes use of local$ and remote$; this is to indicate whether you are typing on your local or remote server.

Create the private key

local$ ssh-keygen -t rsa (accept default file locations and create a password for your private key)
local$ scp ~/.ssh/id_rsa.pub user@remote:/home/user/
local$ ssh user@remote (enter your normal password)
remote$ cat ~/id_rsa.pub >> ~/.ssh/authorized_keys
remote$ chmod 600 ~/.ssh/authorized_keys
remote$ exit
local$ ssh user@remote (at this point you should be asked to enter the password for your private key)

Add private key password

The password for the private key needs storing on your machine. It's stored encrypted.

local$ ssh-agent bash
local$ ssh-add ~/.ssh/id_rsa
local$ ssh user@remote (you should no longer be asked for a password)

Note: in the code examples above, don’t type the stuff in brackets

Note 2: I think this was the blog I got the entry from, but it doesn’t work any more so I can’t be sure… I definitely looked there before I found it anyway:
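As an aside, most modern systems ship a helper called ssh-copy-id that wraps the scp/cat/chmod steps above into one command – worth trying before doing it by hand:

```shell
# ssh-copy-id appends your public key to the remote authorized_keys
# and fixes the permissions for you. Usage (with your own user/host):
#   ssh-copy-id user@remote
#   ssh-copy-id -i ~/.ssh/id_rsa.pub user@remote   # pick a specific key
command -v ssh-copy-id >/dev/null && echo "ssh-copy-id is installed" || echo "ssh-copy-id not found"
```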

Screens on Linux

I've recently started using screen when inside command-line Linux.

I got shown it by some Rackspace tech guys when I wanted to leave a process running after I'd left my shell.

screen is brilliant: it lets you run multiple shells and switch between them with keyboard shortcuts, and you can close and reopen your SSH client (like PuTTY) and all your "screens" are still there, so you can reconnect to them… or "attach", as screen calls it.

It’s like having multiple terminal/putty windows open, except when you close the actual GUI window, the shell is still running.

What I've found this useful for is being logged into multiple Git repositories and also having a couple of extra shell "screens" for database and server admin.
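For anyone who wants to try it, the handful of screen commands I actually use boils down to this – the session name is made up:

```shell
screen -dmS work sleep 300   # start a detached session called "work" running a job
                             # (plain `screen -S work` starts one and attaches straight away)
screen -ls                   # list the sessions that are still alive
# screen -r work             # re-attach ("attach", as screen calls it)
# inside a session: Ctrl-a c opens a new window, Ctrl-a n / Ctrl-a p cycle
# between windows, and Ctrl-a d detaches, leaving everything running
```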

Anyway, this post is a little bit rushed as I'm on my lunch and I've still not finished my Super Noodles, but use screen if you don't already and you use a command-line Linux server.

And I've put up a Linux section under my code reference, with a screen page in there…

Or I will do shortly… I’ll link to the page in here once I’ve made it!