Organisations store vast quantities of data, much of it historical ('legacy') data. They are often unaware of its provenance, its age, or its security level, and much of it is duplicated or 'dead' data left behind by people and projects no longer relevant to the organisation (some estimates put this at 20-30%). Beyond the storage inefficiency, this creates enquiry difficulties (for both internal and Freedom of Information queries), as data is 'lost' in the system and increasingly retrieved through individuals' memory of history rather than a systematic enquiry mechanism. Duplicates, often multiple copies of confidential information, are held in many places, and most of this legacy data is not security labelled. This lack of labelling not only causes potential issues with the Information Commissioner but is also a potential issue in any inter-organisational or partner communications.
The DataCube Retro-Labelling Service module handles the categorisation and retro-labelling of legacy data. It enables all legacy data to be located, indexed, and then sorted into categories that align with the security classifications agreed previously, whether by applying the organisation's current Information Classification and Handling Policy, an industry-agreed classification schema such as the LGCS, or a locally personalised schema (see the Taxonomy/Schema Creation Service).
Having categorised the documents into the agreed classification schema, which by definition dictates the classification levels and retention periods (currently Unclassified, Protect, Restrict, and Confidential, though these will soon change as the new Government Protective Marking System levels of Top Secret, Secret, and Official replace the former scheme), the module applies retro-labelling techniques, placing a security classification on the original document together with a metadata classification. The data is thereby secured, in that it is now under the security controls implemented by the organisation.
The system does not change your original data or its location, other than placing a metadata tag on it; even the last-accessed date stays the same. All processing is carried out on your premises.