Developing a big data strategy – what you need to know and the Sherlock Holmes connection

October 2, 2014 SSP Limited

The modern-day Sherlock Holmes and big data are both generating much hype, but there is a more fundamental – or rather, elementary – connection.

Benedict Cumberbatch’s Sherlock mentally searches through his huge internal database, looking for – however improbable – the truth. The solution can be found if sufficient data is analysed, with the sleuth drawing on information from disparate and apparently unconnected sources. Holmes declares boldly that plots and conspiracies are always revealed by events that are “seemingly insignificant”. This is exactly the approach organisations need to take to generate value from big data: source and search through enough pieces of data and an answer will reveal itself. The conundrum for organisations is how to maximise that value without disappearing under a pile of information and systems.

The traditional model – building orderly databases with specific fields where accuracy is paramount – has been replaced by sourcing and combining vast quantities of information, where any lack of accuracy is compensated for by the sheer volume of data available to investigate. Data sources are also changing, moving to a combination of structured and unstructured data, with the latter forming the vast majority. The skill is bringing together as much disparate, seemingly unconnected and potentially irrelevant data as possible and analysing it until patterns and lessons emerge. It is no longer about establishing causal explanations; it is about identifying and recognising that some behaviour or result occurs. Why it occurs is no longer the important factor – seeing that it does is the key. This means organisations can make data-driven predictions.

So how is this achieved? A mind-set change is required – one that copes with ambiguity and uncertainty – along with investment in people who have the capability and mentality to dive into vast quantities of structured and unstructured data from seemingly unrelated sources until they identify a pattern or an outcome and generate a novel insight.

For example, SSP has brought to market a device that monitors different aspects of driver behaviour, revealing insights that inform car insurance premiums. The behaviours monitored include the expected information – speed, time of day and so on – but also aspects such as driver attitude: less assertive individuals turn out to be worse drivers than proactive ones.
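To make the idea concrete, a telematics-style premium adjustment might combine monitored behaviours along these lines. This is a minimal sketch only: the factor names, weights and figures are invented for illustration and are not SSP’s actual scoring model.

```python
# Hypothetical telematics risk scoring: combine monitored driving
# behaviours into a single multiplier applied to a base premium.
# Factors and weights are illustrative assumptions, not SSP's model.

def risk_score(speed_over_limit_mph, night_driving_ratio, harsh_brakes_per_100mi):
    """Return a premium multiplier (1.0 = no adjustment)."""
    score = 1.0
    score += 0.02 * speed_over_limit_mph     # habitual speeding raises risk
    score += 0.30 * night_driving_ratio      # late-night driving raises risk
    score += 0.01 * harsh_brakes_per_100mi   # frequent harsh braking raises risk
    return round(score, 3)

base_premium = 500.0
adjusted = base_premium * risk_score(speed_over_limit_mph=3,
                                     night_driving_ratio=0.2,
                                     harsh_brakes_per_100mi=5)
```

A real model would of course be calibrated against claims experience rather than hand-picked weights; the point is simply that several disparate behavioural signals collapse into one actionable number.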

Identifying anomalies in the data is also useful, and is only possible if you source substantial amounts of data to establish the expected norm. As an example, property information could help insurers and brokers identify whether an applicant’s assertion that their car is garaged is correct.
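The principle of establishing a norm from volume and then flagging departures from it can be sketched very simply. The data, threshold and use case below are invented for illustration:

```python
# Minimal anomaly-flagging sketch: establish the norm from a sample,
# then flag observations that deviate strongly from it.
from statistics import mean, stdev

def flag_anomalies(values, threshold=2.0):
    """Return values more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if abs(v - mu) > threshold * sigma]

# e.g. annual declared mileage across a book of motor policies (invented data)
mileages = [8000, 9500, 10200, 8800, 9900, 10100, 9400, 60000]
outliers = flag_anomalies(mileages)  # the 60000 record stands out for review
```

The larger the book of data, the more stable the norm becomes, and the more confidently an outlier – a mileage figure, a claims pattern, an ungaraged car – can be queued for human review.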

Access to large volumes of information enables insurers to test hypotheses efficiently, improving their underwriting or customer experience. It also enables organisations to identify anomalies or outliers – for example, fraudulent transactional activity.

But this is only beneficial if the analysis can be reviewed in real time, which requires the system and process capability to support it. For example, an insurer that takes on a customer and only discovers afterwards how the transaction should have been priced or underwritten is carrying unnecessary risk. SSP systems enable real-time sourcing and analysis, but not every system does.

Organisations need to develop the capability to pool information from disparate sources, including third parties and unstructured sources such as Facebook. They need to think across their whole value chain and get into the mind of their consumer, pulling together information that builds an understanding of customer behaviour which they can then act on.

For example, Facebook users posting that they are having a coffee may not seem interesting – until you use it to learn when your customers have regular downtime, and time your marketing communications for when they are most likely to read and act on the contact.

The big data phenomenon is changing the way organisations approach sourcing and analysing data, requiring a new mind-set that is comfortable with handling large quantities of apparently disparate information, analysing and reviewing until trends or insights emerge that inform business decisions.

About the Author

SSP Limited

As the leading global supplier of technology systems and software for the insurance industry, our role is to help insurers and brokers operate more efficient businesses. So whether you’re a global insurer or an MGA, a high street broker or a start-up with a smart new idea, we can be trusted to support you on your journey, whatever the destination.
