Upgrading my home network

For a number of years I’ve been using a TL-WR1043ND running DD-WRT as my home router, even going as far as replicating the same setup for friends and family, as it struck a happy midpoint: powerful enough to be useful, but simple and stable enough for the slightly less technically literate to manage. The DD-WRT setup was surprisingly simple, and I’ve been reasonably impressed by the performance and capabilities of the software, even on such a basic consumer router. That said, around 12 months ago I realised that my home network was rapidly outgrowing this basic setup, and I felt the need to lean towards something a bit more “prosumer”. There are quite a few companies targeting this market, but one in particular stood out to me – MikroTik.

So I took the plunge and bought an RB2011UiAS-2HnD-IN (that’s quite the product name!) – one of their mid-range models, which seemed reasonably priced – and got stuck into learning the intricacies of RouterOS. Their own WinBox software provides a very usable GUI for configuring their hardware, giving a view of the myriad features of the board while keeping the learning curve shallow enough that you aren’t swamped by options. I’ve gradually tweaked and enabled more and more services, to the point where the single device is providing:

  • PPPoE to my ISP
  • DHCP
  • DNS (local and external)
  • Firewall
  • WiFi (more on this below)
  • L2TP connection to a VPN provider, with certain traffic automatically routed through this
  • … and a whole lot more!
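To give a flavour of what this looks like under the hood, here’s a rough RouterOS sketch of a couple of those services. Interface names, addresses and credentials are placeholders for illustration, not my real config:

```
# PPPoE client to the ISP (user/password are examples)
/interface pppoe-client add name=pppoe-out1 interface=ether1 \
    user=me@isp.example password=secret disabled=no

# DHCP for the LAN, handing out the router as gateway and DNS
/ip pool add name=lan-pool ranges=192.168.88.10-192.168.88.254
/ip dhcp-server add name=lan-dhcp interface=bridge-local \
    address-pool=lan-pool disabled=no
/ip dhcp-server network add address=192.168.88.0/24 \
    gateway=192.168.88.1 dns-server=192.168.88.1

# Masquerade LAN traffic out of the PPPoE link
/ip firewall nat add chain=srcnat out-interface=pppoe-out1 action=masquerade
```

The same settings are all reachable through WinBox, of course – the console commands just make for a more compact write-up.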

Additionally, when I moved into my new house earlier this year, I set about removing the need for HomePlug adapters to connect devices in different rooms, as I’d found these tended to be unstable, causing slow transfer speeds and a high rate of dropped connections. I ended up running Cat6 from my study through to several other rooms in the house (I may blog about that project some time in the future!), which eliminated the need for the HomePlugs, but also highlighted how poor my WiFi setup was (having previously blamed the dodgy connections). While researching ways to improve coverage, I struck upon a feature of RouterOS that I hadn’t yet taken advantage of – CAPsMAN (Controlled Access Point System MANager). This essentially allows you to delegate control of the radios in various MikroTik devices to a central ‘manager’, which pushes out the WiFi configuration to create a seamless network across all access points. I picked up a couple of Home Access Points (hAPs) and set these up as slaves to CAPsMAN running on the main router, with the router’s own radios also delegated to CAPsMAN (not something that’s officially recommended, but it works for me), and I haven’t had any complaints about sub-par WiFi performance since!
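The CAPsMAN setup boils down to surprisingly few commands. A minimal sketch, with the SSID and passphrase obviously being placeholders:

```
# On the router acting as the CAPsMAN manager
/caps-man manager set enabled=yes
/caps-man configuration add name=home-wifi ssid=MyHomeSSID \
    security.authentication-types=wpa2-psk security.passphrase=changeme
# Automatically enable any radio that registers, using the config above
/caps-man provisioning add action=create-dynamic-enabled \
    master-configuration=home-wifi

# On each hAP: hand the radio over to the manager
/interface wireless cap set enabled=yes interfaces=wlan1 \
    discovery-interfaces=ether1
```

Once a cAP registers, any change to the `home-wifi` configuration is pushed out to every access point at once – no more logging into three boxes to change a WiFi password.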

My next step involves upgrading the heart of my network to something with a few more gigabit ports – I’ve already run out of capacity in my “rack” (a re-purposed IKEA bookshelf) – so I’m looking at getting a CRS125-24G-1S-2HnD-IN (there we go again with the brilliant product names!) to act as the core router, and demoting the current RB2011UiAS-2HnD-IN to act as a switch and access point in the living room, replacing the “dumb” switch there currently.

While I realise there are quite a few alternative offerings coming to market that simplify home networking (Google Wifi, Ubiquiti UniFi, etc.), I’m more than happy with what MikroTik have to offer, both in terms of hardware and software, and I continue to be impressed by how straightforward yet capable my home network has become. I might even start suggesting an upgrade to the parents’ network!

Puppet-ing my home network

For just over 12 months, I’ve been using Puppet to configure the various machines I have dotted around my home network. The initial motivation for moving to an automated configuration management tool was to keep tabs on the various requirements of the software I run; however, I’ve now reached the point where all deployments of new hardware and software (both my own and third-party) are managed through Puppet, and life is much simpler (most of the time!). I’ve been meaning to write this post for a while, mainly as a “getting started” point for anyone thinking about embarking on a similar journey – hopefully it will help people avoid some of the mistakes I made along the way!

Firstly, a little about my network, to give some of the points context. Over the last 5 years, I’ve evolved from having a single desktop machine, to today having the same desktop, 2 microservers (one for mass storage, the other for backup, services and acting as the Puppetmaster), a Home Theatre PC, and 7 Raspberry Pis in various guises (an alarm clock, appliance monitors and TV encoder between them). All of these machines were originally configured manually, and re-imaged from scratch every time something went wrong, leading to endless pages of scribbled notes about various requirements and processes to get things working. Now, all of them pull a catalogue from a single Puppetmaster, which I update as requirements change. Adding a new host is a matter of installing a fresh operating system, installing Puppet, and adding a few lines to the Puppetmaster detailing what I want installed – the rest is taken care of!
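To give a feel for how little that “few lines” really is, a new host’s node definition might look something like this. The hostname and class names here are made up for illustration:

```puppet
# site.pp on the Puppetmaster – everything else is in modules
node 'pi-alarmclock.home' {
  include base        # common packages, SSH keys, NTP, etc.
  include alarmclock  # my own module for the alarm clock software
}
```

Everything host-specific lives in those classes, so the node definition stays a readable two-line summary of what the machine is for.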

I’ve ended up using Puppet for much more than I originally imagined, including:

  • SSH Keys
  • MySQL hosts & databases
  • Apache site configuration
  • Tomcat configuration (auto-deploying apps is on my to-do list)
  • Sendmail
  • Automated backups
  • Media sorting
  • ZNC configuration

Plus even more besides that! Most bits of my own software run through supervisord, which is also configured through Puppet.

The first, and arguably most important, lesson I learned through my Puppet-isation process is to keep the configuration in some sort of version control system. Initially I didn’t keep track of the changes I was making to Puppet manifests, and requirements added at 3am seemed completely illogical in the cold light of day – but by self-documenting the manifests as I went along, and tracking changes through my own Subversion repository, I can look back at the changes I’ve made over time, and save myself the hassle of accidentally removing something essential. I’ve lost track of the number of times I’ve had to revert to a previous configuration, and having a system in place to do this will definitely help you in the long run!

While writing your configuration, be explicit about your requirements. A lot of the modules and class definitions I wrote early on didn’t use any of the relationships you can define between resources, and as a result I’d often be bitten by Puppet attempting to apply things in an illogical order. For the sake of adding a few arrows into your manifests, it’s worth using these just to avoid the head-bashing when a configuration run fails intermittently! I still run into problems when configuring a new host, where a package I require isn’t installed in time to start a service – using relationships from the ground up will hopefully stop this ever becoming an issue.
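As a sketch of what those arrows look like, here’s the package–config–service pattern using ZNC (one of the things I do manage with Puppet, though the file path and source here are illustrative):

```puppet
# '->' enforces ordering; '~>' additionally restarts the
# service whenever the config file changes.
package { 'znc':
  ensure => installed,
} ->
file { '/etc/znc/znc.conf':
  ensure => file,
  source => 'puppet:///modules/znc/znc.conf',
} ~>
service { 'znc':
  ensure => running,
  enable => true,
}
```

Without the arrows, Puppet is free to try starting the service on a fresh host before the package or its configuration exists – exactly the intermittent first-run failure described above.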

Arguably the second most useful tool I installed (after Puppet itself) is the Puppet Dashboard. This fantastic tool pulls in the reports that Puppet generates and presents them in a very readable format, letting you get straight to the heart of what’s causing failed runs rather than diving through the depths of the raw logs. It took me a while to get around to installing it, and I honestly regret not doing so straight away – it has saved me an incredible amount of time since. A word of warning though – the reports can take up an awful lot of space (my dashboard database is 400MB+ with only the last 3 months’ data stored) – make sure your SQL server is up to the task before starting!

While installing things like the Puppet Dashboard, Hiera and the like, I’ve taken to Puppet-ing my Puppet – by which I mean writing the configuration for these tools into the Puppet manifests themselves. Initially this seemed very strange to me; I didn’t want some almost-sentient piece of software configuring itself on my machine! However, the client will quite happily run on the same machine as the server to configure it (in a strange cyclical dogfooding loop). There are plenty of third-party modules available that will do a lot of this for you, but I ended up writing my own for several things – as long as the installation is managed somehow, you’re going to have a much better time!

Linked to that, and my last tip here, is to write your own modules wherever you can. For the first few months after I started using Puppet, I tried to cram everything imaginable into the main manifest directory, and barely used templates and source files at all. I’ve since learned to separate concerns wherever possible, which leads to much cleaner code in the module definitions, and much nicer-looking node definitions as well. Third-party modules have been a mixed blessing for me: some are coded extremely well, with multi-platform support, but others are hard-coded for platforms alien to mine, making them useless to me. I’d absolutely recommend looking for someone who’s already done the hard work first, but don’t be afraid to roll up your sleeves and write a module yourself. Maybe even post it online for other people to enjoy! (I’m being very hypocritical with that last point – none of my modules have made it online, as they’re all terrible!)
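For reference, the sort of separation I mean – a self-contained module keeping manifests, templates and static files apart (the module name is just an example):

```
modules/znc/
├── manifests/
│   └── init.pp          # class znc { package/file/service resources }
├── templates/
│   └── znc.conf.erb     # ERB template, rendered via template('znc/znc.conf.erb')
└── files/
    └── motd.txt         # static file, served via puppet:///modules/znc/motd.txt
```

With that layout, the node definitions shrink to a list of `include` statements, and each module carries everything it needs to be dropped onto another Puppetmaster.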

I hope that some of the above might help steer someone in the right direction when embarking on a journey with Puppet. Obviously I’d recommend some more official reading before converting your professional network of 10,000+ machines across, but if you’re like me and need a comparatively small network taming, then take the plunge and get started!

XBMC – My Experience

Over the Christmas holidays I got a bit bored (as you do), and decided to experiment with an old projector, turning my bedroom wall into a giant monitor!

I’d been looking for some time for a decent ‘10-foot UI’, and from initial glances at the internet all fingers were pointing towards Apple’s Front Row; however, I didn’t want to sacrifice the Mac Mini to run as a media PC (although it would do a very good job – virtually silent!). I decided to go slightly off the beaten track and try XBMC (Xbox Media Centre, as was).

I must say I was pleasantly surprised at how easy it was to set up and configure – within a few minutes I had a working interface that looked great, and it was playing nicely with my network, dragging videos off my main PC and download box (a headless machine running get_iplayer and rtorrent in a cupboard – more on that in another post). My only major gripe is that the navigation is fairly awful – whether you’re using a keyboard or mouse (or a remote, I presume), I can never seem to figure out whether I need to go forwards/backwards/up/down in the menus, and a lot of functionality seems to be duplicated.

The process of gathering TV and film information is pretty seamless – the scanner runs nicely in the background gathering data on my files, even though they’re sat on another machine. I’ve noticed a few false positives for TV shows, though, and you definitely need to do some config file editing to get TV files picked up.
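For anyone hitting the same problem, the editing in question was along these lines – an advancedsettings.xml in the XBMC userdata folder, adding an extra regular expression so the scanner recognises your episode naming scheme (this particular regexp, matching a ‘1x02’ style filename, is just an example):

```xml
<advancedsettings>
  <tvshowmatching>
    <!-- capture groups: season number, episode number, remainder -->
    <regexp>[\._ \-]([0-9]+)x([0-9]+)([^\\/]*)</regexp>
  </tvshowmatching>
</advancedsettings>
```

After a change like this you need to re-scan the sources before the newly matched episodes appear in the library.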

I unfortunately don’t have the right kit at the moment to get live TV displayed through XBMC as well (from forum browsing it seems possible), although I have installed the iPlayer and YouTube extensions, both of which seem to be quite buggy. The iPlayer extension in particular has some pretty major issues – trying to pause a video will lock up the whole program!

Overall, I’m pretty happy with this solution for the meantime. Anyone who’s happy hacking around with settings files to get things working just how they want should feel right at home, but I wouldn’t really recommend it to anyone who’s after an ‘out of the box’ solution.

Windows 7 on an Inspiron 510m

My ancient, knackered laptop (recently upgraded with a brand new 1GB stick of RAM) had been running Ubuntu for a number of years, and was getting to the point where an OS re-install was necessary. I’d been getting increasingly annoyed with the university wireless not playing nicely, so I decided to give Windows 7 a try (since I can get it nice and free!).

The install was a breeze: the OS booted fine first time, and all drivers were found and installed by Windows Update – apart from one…

It seems I was stuck with the 640×480 VGA driver – Intel don’t think their 8XXGM-series chips work well with 7. Never fear – a bit of Googling revealed this site: http://www.groundstate.net/855GMWin7.html, which contains instructions for tricking 7 into installing the old drivers. Result! Although Seb may have had a slight point on this week’s Tech 107 (episode 3) about Macs being easier, as they don’t need separate driver installs. Ah well.

So – I’m now officially off Linux and 100% Windows again. Let’s see how long it lasts this time!

Revision Dodging

Right now, I have 26 hours until my next exam (Computing CP4), yet I am sat blog posting! What a mistake-a to make-a!

Anyway, I’ve now done 5 exams, and have another 8 to go. Only 13 days left though! They do like to cram these exams into a nice small space of time!

Since I finished school, I’ve discovered PKR – a 3D poker game that you play from your computer over t’internet against other people. It’s quite addictive – and a very good waste of time! I came 112th out of 400 in a tournament last night – went all-in on a pair of Kings, and the other bastard had a pair of Aces -.- Will be back in the tournament tonight, going to beat last night’s record!

I’ve also discovered some more stuff, which I’ll post about later today.

Right, who wants a game of Monopoly?!

My life: update!

I’m annoyed at my computer right now. Are there any trojans doing the rounds at the moment? :S My PC seems to slow to a crawl one minute, then be fine the next 🙁 I think it’s dying/infected – and either way I want rid of it!!!

Managed to get the next draft of my Physics coursework done this weekend – but then the bloody printer ran out of ink, and I’ve got none spare 🙁 I’ll have to wait for a Viking order to come through now!!! At least I’ve got an excuse for not doing Computing coursework though 😀 Hehe

I’m fed up of Poynton. That’s enough said on the matter, although things are looking up slightly! School still blows though…

That’s about it for my life at the moment… interesting, isn’t it?!

Half-life 2

Just completed the latest mod that I’ve found (Half-life 2: Riot Act)… absolutely amazing!

Got about 3 hours’ gameplay out of this – simply unreal for a freebie! My hat and any other headwear goes off to the developers! I would post a link, but I can’t remember it… it’s late and I can’t be bothered Googling (yes, I’m that tired – see next post).

Saw today that HL2: Episode 2 is up for pre-order… as part of "The Orange Box".

This contains:

  • Half-life 2 (already have)
  • Half-life 2: Episode One (already have)
  • Half-life 2: Episode Two (point of buying this package)
  • Portal (No idea, looks like more puzzles and less shooting – will give it a crack before slating this one)
  • Team Fortress 2 (not bothered, was really into TFC, but they seem to have ruined it)

All for the low low price of… $44.95 (about £22.50 ish)

I don’t really agree with the bundled extras – I’d like to be able to get EP2 on its own, really – but for £22.50 I don’t mind too much. Providing it really is £22.50, that is. It’ll probably be more like £40 in this country. Sound fair to you?

Anywho, it’s out in just over a month (October 10th, I believe). I will be ordering my copy. I’m that ruddy dedicated!

Update: Just had a look on Play.com. RRP £34.99 (told you so…) but they’re doing it for £26.99… reasonable (ish).