The evil Super Admin Password

So you’ve survived a disaster, fire or other adverse event, and you need to shift staff home to work because the office is a pile of smoking rubble. Their work PCs are, by a stroke of luck, usable, and they’ve got broadband. Two thumbs up there.

But what about that printer driver you need… It requires admin rights. The domain controller? Well, it’s at the bottom of a crack in the earth, or in the IT guy’s garage.

No problem, log in as Administrator, give the local user admin rights, and you’re in business. Oh, they’re an hour’s drive away, and you didn’t have the foresight to install and test a remote-control tool.

This is about where you discover why having a single administrator password that is reused for multiple purposes in the business is considered poor practice. Or, in layman’s terms: downright silly.

To get the accounts clerk printing and the receptionist able to configure the network card, you’ve now got to give away your precious uber-password over the phone. The kitchen staff can now access Skype, but they can also access your bank accounts, the encryption keys for your VPN, the payroll system and the cleverly protected documents with the formula for your world-beating popcorn recipe.

You know they will write it on a Post-it note and stick it to the fridge, but it beats driving 50 ks across town to fix a five-second problem… Deal with the fallout later.

So, how many places do you reuse the same password? And after the last major outage, did your IT staff have to give it up to the cleaner so he could access eBay, and then not tell anyone for fear of having to change the uber-password in 300 different places?

This is part of a series of articles that have come about from my experience in shifting the IT operations for a business after the recent destructive earthquake in Christchurch, New Zealand.

Random password generator update

Firstly, a big thanks for the feedback I’ve received on the random password generator I stuck on the site a wee while ago. It’s had quite a bit of traffic, so I’m going to assume it’s been of use to more than just myself!

I’ve fixed a minor bug where occasionally it would produce a password shorter than the length selected, which caused confusion for at least one person. To be honest I noticed it quite early on when I was testing and ignored it.

The second update is slightly more interesting. Grant, from over the ditch in Australia, pointed out that with the default setting of 9 characters with upper and lower case plus numbers there was often only one number in the password, whereas he felt there should be three on average.

And he would be correct, but I didn’t take the weighting of numbers vs letters into account when I wrote the generator. The problem is that there are 26 letters, but last time I looked there were only ten digits. The original code used only one instance of each digit in the source string, so you were 2.6 times more likely to get a letter than a number, or 5.2 times if you include upper and lower case.

I’ve fixed that up with a subtle update that uses 30 numeric characters in the source string, which gives relative likelihoods for upper case, lower case and numbers of 31.7%, 31.7% and 36.6% respectively.
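
For anyone curious about the arithmetic, here’s a minimal sketch of that weighting. The variable names are mine rather than the generator’s actual code, but the character counts line up with the percentages above.

```php
<?php
// 26 upper + 26 lower + 30 digit characters = 82 in the source string,
// which is where the 31.7% / 31.7% / 36.6% split comes from.
$digits = str_repeat('0123456789', 3);   // 30 numeric characters
$total  = 26 + 26 + strlen($digits);     // 82 characters in total

printf("upper %.1f%%  lower %.1f%%  digits %.1f%%\n",
       26 / $total * 100,
       26 / $total * 100,
       strlen($digits) / $total * 100);
```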

Along with that I’ve adjusted the punctuation string component to give a more even distribution of punctuation if you select ‘Full Noise’.

Why is it so hard to think up a good password?

I’ve been working in IT for a wee while now, a shade over 20 years even, and in all this time there is one consistent thread of frustration that nibbles away at my very sanity. Trivial Passwords.

I’m sure this isn’t just me going nuts here; there must be thousands of network administrators and webmasters going quietly bonkers all over the planet right at this very moment.

We slave away with intimate pride in our collective nerdiness, building robust and secure IT systems for all to behold. Fussing and fettling over minute details to ensure the ever-important data is safe.

An unfortunate side effect of this creative journey is a necessary evil. The agent by which all things great in computing are undone. I’m not referring to the trivial password here, but that which spawns it. The user.

“Do I have to put a number in my password, eight characters, really?”

I’m getting chills just typing that sentence.

A quick Google for ‘most common passwords’ will reveal the painful truth: 123456 is actually a very common password, as is the word itself… ‘password’.

Where am I going with all this? I use complex passwords. I love them with the same fervour as tatting enthusiasts like a good yarn. (See what I did there? No? Look it up…)

I used to use an online password generator, but the owner of the website decided to put pop-up ads on the page, so that every time you refreshed it you got another ad popup. ARGH.

If you were looking for something particular in your random password it could take ages with all the popping, closing and refreshing going on. Popup ads are second only in their evil nature to trivial passwords.

Fresh from a particularly annoying bout of popping, closing and refreshing last week, I set about creating my own random password generator, which is now online for all to use.

It uses the PHP rand() function seeded with microtime(), which in lay terms means that in theory it can generate a different password every microsecond. Of course, if you are a lay person you probably don’t care, and you’re using 123456 as your password.
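
For the technically inclined, the general shape of it is something like the sketch below. This isn’t the exact code running on the site, just an illustration of the seed-and-pick approach described above, with example values for the source string and length.

```php
<?php
// Sketch of the approach: seed rand() from the current microsecond,
// then pick each character from a weighted source string.
// Note: rand() is fine for throwaway passwords but isn't
// cryptographically strong.
list($usec, $sec) = explode(' ', microtime());
srand((int) ($sec + $usec * 1000000));

$source = 'ABCDEFGHIJKLMNOPQRSTUVWXYZ'
        . 'abcdefghijklmnopqrstuvwxyz'
        . str_repeat('0123456789', 3);   // digits tripled, as per the update

$length   = 9;                           // example length
$password = '';
for ($i = 0; $i < $length; $i++) {
    $password .= $source[rand(0, strlen($source) - 1)];
}

echo $password . "\n";
```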

That in a nutshell is it. Enjoy your randomly generated passwords.

Google Location, the best of results, the worst of results

Google announced on their official blog a couple of days ago that location is the new black, enhancing search results by letting the surfer rank results that are ‘nearby’ or pick another location by name.

This is just a continuation of the direction online technologies have been moving, with social media leading the charge. Services like Foursquare give people their constant location fix, and Twitter has even gone local, allowing you to share your location in 140-character chunks.

Up until now the only real downside of this location-hungry trend has been the exact same thing touted as its benefit: namely, that the world knows where you are. Privacy concerns are rife as the mobile social media crowd go about their daily lives in a virtual fishbowl.

pleaserobme.com highlights this by aggregating public location information from various social networks and figuring out if your house is empty. How long before insurance companies wise up and use social media as a reason for not paying out on your house insurance? “But Mr Jones, you told the entire world you were away from your house; you encouraged the burglar.”

The last thing on earth I would want to do is share my location in real time with the world, but I was keen to experience the Google location search to see how it actually affects search results.

The impact of location-based search is going to be far more noticeable in the real world than the failed insurance claims of some iPod users.

The Google blog entry says that this is available to English google.com users, but we don’t have it here in New Zealand yet. We might have been first to see the new millennium, but not so much with Google changes.

To get my Google location fix I used a secure proxy based in the US and took in the view of the world from Colorado. Pretending to be within the 48 states is handy for all sorts of things.

I did some searches from a clean browser install on a fresh virtual machine, so that personal search preferences or history would not taint the results. I then set about testing some long-tail search phrases that consistently give top-five results for our website at work.

No surprise that I got essentially the same results as I do here in New Zealand, but with more ads, thanks to targeted AdWords detecting that I was in the US of A. What was disturbing was that selecting ‘nearby’ knocked our search result down past the tenth page of Google.

We sell products to the whole world and do not have a geographical target, so the location search will clearly have an impact on our organic results as it rolls out. A business targeting a local area, such as a coffee shop or restaurant, might well benefit from the location search, assuming that Google knows where your website is.

But there’s the rub. How did Google decide our website was not near Colorado? Our web server lives in Dallas, TX, our offices are in New Zealand and Thailand, and we regularly sell products to over thirty countries.

Which leads to the impact of location for web developers and the SEO community. How do you tell Google what your ‘local’ is? I messed about with location names, and putting in ‘Christchurch’, where our business is based, got our long-tail hit back up to the front page. But only a fraction of our business comes from Christchurch, despite it being where our head office is.

I suppose anti-globalisation campaigners in their hemp shirts and sandals will be rejoicing at this news, but I’m not so sure I’m going to be celebrating this development with the same enthusiasm.

A quick search for meta tags or other methods of identifying your geographical target came up dry, and even if there were one, we can only gently suggest to Google that it index and present things the way we as website owners want.

When the dust has settled and the ‘Nearby’ link is clicked, Google are the only ones who know what the best results are. It might just be that their best became your worst, if your business has a broad geographical target and weak organic placement.

Taking joy from simple news: IE6 and YouTube

Anyone who has anything even remotely to do with web development will be smiling at the news that YouTube is going to discontinue support for IE6.

Not only that, we’ve got a date. 13th of March, 2010.

While this isn’t really the end, it will certainly put that little bit more pressure on the roughly 15-20% of internet users who still cling to the nine-year-old version of Internet Explorer for various reasons I fail to fully comprehend.

You can read more about this on mashable.com or techcrunch.com as they do a much better job than me of writing about such things.

There’s even a response from Microsoft if you so wish to expand your mind.

Having worked in a corporate IT environment, I fail to see how even the most lethargic of firms could take five years to update the web browser in the modern business environment. Unless you’re talking line-of-business PCs in a secure network, but then those PCs shouldn’t be inflicting their attempts at HTML rendering on the web development community.

I thought when Facebook stopped explicitly supporting the nearly decade-old browser in 2008 that we’d seen the end of it. Then Microsoft shattered the hopes of many geeks, confirming support would continue into 2014.

With YouTube being the number three site on the web, I’m going to take a punt and say that at least some of that 15% will soon be getting the message loud and clear that it’s time to update their PC.

Security in the cloud, KISS

The idea of keeping things simple when it comes to server security is not at all radical, and cloud servers provide the ability to reach the not-so-lofty goal of keeping your servers simple and secure without breaking the bank.

The theory is simple: the fewer processes you have running on your box, the less there is to go wrong or be attacked. This is one area where Windows-based servers are immediately at a disadvantage compared to a *nix server, but I digress.
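
If you want to put a number on “how much is running”, a quick look before and after trimming the box tells the story. These are just the usual commands, nothing fancy:

```sh
# A rough audit of a box's attack surface:
ps aux | wc -l     # roughly how many processes are running (includes a header line)
netstat -tlnp      # which TCP ports are listening, and which daemons own them
```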

When I was pretending to be a hosting provider a few years ago I ran colocated discrete servers. They weren’t cheap to own or run, not by a long shot. That cost was a huge enemy of the KISS security concept.

In the process of trying to squeeze every last cent of value from the boxes I overloaded them with every obscure daemon and process I could think of. Subsequently the configuration of the servers became complex and difficult to manage, while applying patches became a cause of sleepless nights and caffeine abuse.

With the cost to deliver a virtual server down to cents per hour, and the ability to build a new server in a matter of minutes, the barrier to building complex applications with a robust security architecture has all but vanished.

The MySQL server behind this blog site is a base install of Debian Lenny with MySQL, nullmailer, knockd and an iptables firewall script. That’s it. Simple to build, simple to configure, simple to back up and simple to manage. KISS.

A little bit of searching around on hardening up a Linux box and you’ll quickly find information on changing default settings for sshd and on iptables rulesets, which you can combine with small, targeted cloud servers to reduce the sleepless nights.
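
To give a flavour of what that hardening looks like, here’s a rough sketch of the sort of iptables ruleset and sshd tweaks you’ll come across. The addresses, ports and options are examples only, not the actual configuration behind this site.

```sh
#!/bin/sh
# Illustrative only: a minimal iptables policy for a single-purpose MySQL box.
# The addresses and ports are examples, not the real config behind this site.

iptables -F                          # start from a clean slate
iptables -P INPUT DROP               # drop everything inbound by default
iptables -P FORWARD DROP
iptables -P OUTPUT ACCEPT

iptables -A INPUT -i lo -j ACCEPT    # allow loopback
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT

# Only the web server (example address) gets to talk to MySQL
iptables -A INPUT -p tcp -s 10.0.0.5 --dport 3306 -j ACCEPT

# SSH stays shut until knockd opens it. Typical sshd_config tweaks:
#   Port 2222                  # move off the default port
#   PermitRootLogin no         # no direct root logins
#   PasswordAuthentication no  # keys only
```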

I can’t help with the coffee addiction though; I’m still trying to kick that habit myself!