The human brain is a remarkable organ, one that has evolved over millions of years into a sophisticated and powerful tool. It is capable of over 1,000 processes per second, with information transmitted at speeds of up to 268 miles per hour. The latest research suggests that the average memory capacity of a human brain is around a quadrillion bytes (that’s 10¹⁵ bytes) of data. It is an unrivalled biological achievement and gives us many of the abilities that make us uniquely human.
One such capacity is our ability to recognise, analyse and interpret patterns.
Although computers have long been used to process huge volumes of data, their ability to analyse patterns and act intelligently on the results is relatively new.
This capability, known as data discovery, builds on the same technology that originally powered IBM’s Deep Blue chess-playing computer; it has since advanced to underpin applications ranging from Tesla’s autonomous cars to IBM’s Watson supercomputer.
This powerful class of software gives businesses a highly efficient tool to intelligently interrogate huge amounts of data and extract useful patterns. Data discovery starts by aggregating data from databases and silos, then analyses and visualises it to surface trends and patterns.
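In code terms, the aggregate-then-analyse flow can be sketched roughly as follows. This is a minimal illustration using hypothetical records and field names, not the output of any particular data discovery product:

```python
from collections import Counter

# Hypothetical records pulled from two separate data silos
crm_records = [{"region": "EMEA", "product": "A"}, {"region": "EMEA", "product": "B"}]
erp_records = [{"region": "APAC", "product": "A"}, {"region": "EMEA", "product": "A"}]

# Step 1: aggregate data from multiple sources into one collection
all_records = crm_records + erp_records

# Step 2: extract a simple pattern, e.g. which region/product pairs dominate
pattern_counts = Counter((r["region"], r["product"]) for r in all_records)
most_common = pattern_counts.most_common(1)[0]  # the strongest trend found
```

A real tool would of course scan far richer sources and feed the resulting counts into visual dashboards, but the core idea of pooling data and surfacing frequent patterns is the same.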
Without data discovery, this process is undertaken by human analysts who, despite having one of the most powerful processing capabilities on the planet, have limitations.
The biggest challenge for data analysts today is the sheer volume of information being produced. Conservative estimates put daily data generation at around 2.5 billion GB, and not all of it is even useful.
According to research undertaken in 2016 by the data management company Veritas, employees spend up to 30% of their working day searching for data that is relevant and useful. That’s around 12.5 hours each week!
Those figures may seem surprising, but they make more sense when you consider that the same study found only 15% of the information stored and processed globally is up to date and in use. Veritas concluded that 33% of data is redundant, obsolete or trivial (ROT), and a staggering 52% is ‘dark’, meaning nobody knows exactly what it is for.
These figures suggest that data management has become a chaotic world in which analysts are bogged down in the increasingly difficult task of finding relevant information.
What better time to consider an alternative to manual data analysis with automated data discovery?
Combined with Information Governance (IG), or information management, where information is intelligently stored, filed and protected, data discovery makes data handling faster and more efficient, reducing time to insight and allowing larger volumes of data to be handled. The result is better business analytics at a fraction of the overhead.
There are many data discovery solutions available on the market, including those offered by FileFacets. Most share the same core features:
- Multi-environment scan capability.
- Analysis of data patterns.
- Categorisation of files into bespoke outputs.
- Exception and anomaly reporting.
- Identification of ROT data.
- Optimisation of data marked for action.
- Data management for ease of manual access and analysis.
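To make one of these features concrete, identifying ROT data often comes down to simple rules over file metadata, such as how recently a file was accessed and whether it is duplicated. The sketch below is purely illustrative; the file names, thresholds and fields are assumptions, not how FileFacets or any specific product works:

```python
# Hypothetical metadata for files found by a scan
files = [
    {"path": "report_2016.docx", "last_accessed_days": 900, "duplicates": 0},
    {"path": "budget.xlsx", "last_accessed_days": 10, "duplicates": 0},
    {"path": "old_logo.png", "last_accessed_days": 1200, "duplicates": 3},
]

STALE_DAYS = 365  # assumed threshold for treating a file as obsolete

def classify(f):
    """Flag a file as ROT if it is stale or duplicated; otherwise keep it."""
    if f["last_accessed_days"] > STALE_DAYS or f["duplicates"] > 0:
        return "ROT"
    return "keep"

report = {f["path"]: classify(f) for f in files}
```

A production tool would combine many more signals (content analysis, ownership, access policies), but even this toy rule shows how automated scanning can triage files that a human analyst would otherwise have to inspect by hand.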
Business intelligence is one of the key factors that determines whether an organisation can stay ahead of its competition. With data discovery, you can prove that two heads are better than one… even if one of them is an automated tool.