The National Housing Federation recently warned that proposed new accountancy standards could have a massive financial impact on England’s housing associations, reducing their book value by over £1 billion and costing £150 million in red tape. This has prompted fears that plans for thousands of new affordable homes will now be axed to cover the costs of the new system.
Once again, housing associations are up against the need to make spending more efficient. One area worth exploring is data management; not just where you are managing your data – in-house or through a managed service provider – but how you are managing it.
I was reminded recently that one of our customers reduced their data storage requirements by 35 per cent when we helped them revisit their approach to storage management. You might wonder how this was possible – were they just doing things in a particularly unconventional way before? Absolutely not; their situation was similar to that of hundreds of organisations across the country.
What I’m suggesting is that you look at your data from a new angle. Instead of viewing it as one great homogeneous mass, separate it into different types, typically:
Mission-critical – 5-10 per cent of up-to-the-minute data, such as transactional data, that is so valuable that constant and immediate access is imperative.
Important – 20-30 per cent of data that is accessed regularly and is needed to support normal business activities.
Legacy – 70 per cent of data, such as legacy files and rarely accessed emails, which is effectively dormant.
In most cases, these percentages accurately describe the data pyramid found across organisations, and I recommend you carry out this type of categorisation exercise on your own organisation’s data.
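To make the exercise concrete, here is a minimal sketch of tier categorisation using recency of access as the sorting criterion. The thresholds, tier names and file names are purely illustrative assumptions – a real exercise would also weigh business value and regulatory requirements, not just access dates:

```python
# Illustrative tier boundaries: (max days since last access, tier name).
# These cut-offs are assumptions, not a recommended standard.
TIER_LIMITS = [
    (7, "mission-critical"),     # accessed within the last week
    (90, "important"),           # accessed within the last quarter
    (float("inf"), "legacy"),    # effectively dormant
]

def classify(days_since_access: float) -> str:
    """Assign data to a tier based on days since it was last accessed."""
    for limit, tier in TIER_LIMITS:
        if days_since_access <= limit:
            return tier
    return "legacy"

# Group a hypothetical set of files (name -> days since last access) by tier.
files = {"ledger.db": 0, "board-report.docx": 30, "2009-archive.zip": 400}
by_tier = {}
for name, age in files.items():
    by_tier.setdefault(classify(age), []).append(name)
```

Running something like this over a file share’s access timestamps is usually enough to reveal the rough 5/25/70 shape of the pyramid described above.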
Once you’ve done this background analysis, it makes sense to treat each data type differently, not storing and backing up all data every hour on expensive storage media. Again, looking at our approach for our own data, we take the following steps:
Mission-critical – stored on high-availability media and backed up (or to be more technically precise, replicated) hourly.
Important – stored on high-availability media but backed up only once, at night.
Legacy – stored on low-cost media for regulatory and recovery purposes, away from our primary storage environment.
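The three steps above amount to a simple lookup from tier to storage and protection policy. The sketch below shows that mapping; the media names and schedules are placeholders for whatever your own storage platform offers, not a specific vendor’s configuration:

```python
# Hypothetical tier-to-policy table mirroring the three steps above.
POLICIES = {
    "mission-critical": {"media": "high-availability",
                         "protection": "replicated hourly"},
    "important":        {"media": "high-availability",
                         "protection": "backed up nightly"},
    "legacy":           {"media": "low-cost archive",
                         "protection": "retained off primary storage"},
}

def protection_for(tier: str) -> str:
    """Look up how data in a given tier should be protected."""
    return POLICIES[tier]["protection"]
```

The point of keeping the mapping this explicit is that the expensive treatment (hourly replication on high-availability media) is confined to the small mission-critical slice, rather than applied to everything by default.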
I’ve put the case very simply but in truth it doesn’t really get much more complicated. There are differences in the tactical choices you can make to support this approach to data management, for example keeping everything in-house, co-locating your servers in a hosted data centre or outsourcing the management of your data to a cloud services provider.
But whichever way you choose to go, the fundamental principle remains the same – getting a grip on your data by assessing its worth to your day-to-day business can deliver significant savings on its storage and management. In addition, if you choose to co-locate your servers and/or to outsource the management of your data, you’ll see further savings as you won’t be paying capital or maintenance costs for your hardware and on-premises data centre.
There’s no doubt that with the continuing squeeze on housing budgets, the segmentation of your data and flexible management of its storage can help relieve the pressure and save costs.
Stefan Haase is divisional product director for data cloud services at InTechnology.