Some folks enjoy playing Russian Roulette in the typical fashion, often with vodka and a gun with a single bullet (guns with magazine cartridges involve less of an element of chance, and more of a certain air of inevitability).
Linux sysadmins also have a way of playing this game, and whilst less fatal (mortally, at least), it can be every bit as deadly to their most prized possessions: their beloved Linux machines, and (probably) their careers.
In order to deploy StorSimple arrays (virtual or physical), there are certain requirements and prerequisites that need to be in place. These are defined on the Azure.com documentation site and mostly pertain to having sufficient external connectivity to Azure and internet endpoints.
To simplify the process of checking whether an environment is suitable for StorSimple, I have devised a little script that can be run on a Windows 8+ machine and checks network services and outbound connectivity to certain hosts/endpoints over the required ports.
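The published script targets Windows, but the core idea, probing TCP reachability to a list of endpoints on their required ports, can be sketched in bash too. The endpoints below are illustrative placeholders, not the definitive list from the Azure documentation:

```shell
#!/usr/bin/env bash
# Probe outbound TCP connectivity to a list of host:port pairs.
# Endpoints here are hypothetical examples; substitute the ones
# listed in the Azure StorSimple documentation.
endpoints=("management.azure.com:443" "login.microsoftonline.com:443")

for ep in "${endpoints[@]}"; do
  host="${ep%%:*}"   # everything before the first colon
  port="${ep##*:}"   # everything after the last colon
  # bash's /dev/tcp pseudo-device attempts a TCP connection;
  # timeout(1) stops us hanging on filtered ports.
  if timeout 5 bash -c "exec 3<>/dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "OK   ${host}:${port}"
  else
    echo "FAIL ${host}:${port}"
  fi
done
```

A FAIL line points at a firewall rule or proxy that needs attention before deployment.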
Rsync is a fantastic little tool, blessed with a whole raft of switches and options to allow data transfers to be customised to the sys admin’s heart’s desire.
It can run normally or as a daemon, push and/or pull data, and excels at synchronising filesystems either locally on the same machine or across long distances over the internet (for which we ought to pay homage to Tim Berners-Lee).
When testing various scenarios, it's always useful to have some data to use. But it may not be easy, or desirable, to use data that exists on our laptops or existing systems.
So, I thought I'd take advantage of Linux's inherent dd command and randomisation capabilities and produce this: a simple script that creates a user-specified number of directories, with a random number of files in each directory, with each file being a random size.
Kick this off, kick back, and voilà – an imperfect data landscape that mirrors the real world!
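The approach described above can be sketched as follows; the directory name and the file-count and size limits are illustrative choices, not necessarily those in the actual script:

```shell
#!/usr/bin/env bash
# Create N top-level directories (default 5), each holding a random
# number of files (1-10), each file a random size (1-100 KiB),
# filled with random bytes from /dev/urandom via dd.
num_dirs="${1:-5}"
base="testdata"

for d in $(seq 1 "$num_dirs"); do
  dir="${base}/dir${d}"
  mkdir -p "$dir"
  num_files=$(( RANDOM % 10 + 1 ))     # 1-10 files in this directory
  for f in $(seq 1 "$num_files"); do
    size_kb=$(( RANDOM % 100 + 1 ))    # 1-100 KiB for this file
    dd if=/dev/urandom of="${dir}/file${f}" \
       bs=1024 count="$size_kb" status=none
  done
done
```

Run it as `./mkdata.sh 20` (a hypothetical filename) to generate twenty directories of uneven, realistic-looking test data.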