Developing a big data strategy – what you need to know and the Sherlock Holmes connection

October 2, 2014 SSP Worldwide

The modern-day Sherlock Holmes and Big Data are both generating much hype, but there is a more fundamental – or rather, elementary – connection between them.

Benedict Cumberbatch’s Sherlock mentally searches through his huge internal database, looking for the truth – however improbable it may be. The solution can be found if sufficient data is analysed, with the sleuth drawing on information from disparate and apparently unconnected sources. Holmes declares boldly that plots and conspiracies are always revealed by events that are “seemingly insignificant”. This is exactly the approach organisations need to take to generate value from big data: source and search through enough pieces of data and an answer will reveal itself. The conundrum for organisations is how to maximise that value without disappearing under a pile of information and systems.

The traditional model – building orderly databases with specific fields where accuracy is paramount – has been replaced by sourcing and combining vast quantities of information, where any lack of accuracy is compensated for by the sheer volume of data available to investigate. Data sources are also changing, moving to a combination of structured and unstructured data, with the latter forming the vast majority. The skill lies in bringing together as much disparate, seemingly unconnected and potentially irrelevant data as possible and analysing it until patterns and learnings emerge. It is no longer about looking for causes and explanations; it is about identifying and recognising that some behaviour or result occurs. Why it occurs is no longer the important factor – seeing that it does is the key, and it means organisations can make data-driven predictions.

So how is this achieved? A mind-set change is required – one that copes with ambiguity and uncertainty – along with investment in people who have the capability and mentality to dive into vast quantities of structured and unstructured data from seemingly unrelated sources until they identify a pattern or an outcome and generate a novel insight.

For example, SSP has brought to market a device that monitors different aspects of driver behaviour, revealing insights that inform car insurance premiums. Behaviours monitored include expected information – speed, time of day and so on – but also aspects such as driver attitude: less assertive individuals are worse drivers than proactive ones.
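As a rough illustration of how such behavioural readings might feed into a premium calculation, the toy sketch below combines a few hypothetical telematics measures into a single risk indicator. The field names, weights and thresholds are invented for the example; they are not a description of SSP’s actual scoring model.

```python
# Illustrative only: a toy score combining hypothetical telematics readings.
# The inputs, weights and thresholds are assumptions made for this example.

def driver_risk_score(avg_speed_kmh: float, night_trip_ratio: float,
                      harsh_braking_per_100km: float) -> float:
    """Combine a few monitored behaviours into a single 0-1 risk indicator."""
    speed_factor = min(avg_speed_kmh / 130.0, 1.0)            # higher sustained speed -> higher risk
    night_factor = night_trip_ratio                            # share of trips made at night
    braking_factor = min(harsh_braking_per_100km / 10.0, 1.0)  # frequent harsh braking -> higher risk
    return round(0.4 * speed_factor + 0.3 * night_factor + 0.3 * braking_factor, 3)

# Example: moderate speed, some night driving, occasional harsh braking
print(driver_risk_score(avg_speed_kmh=95, night_trip_ratio=0.2, harsh_braking_per_100km=3))
```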

Identifying anomalies in the data is also useful, and is only possible if you source substantial amounts of data to establish what is expected – the norm. As an example, property information could help insurers and brokers identify whether an applicant’s assertion that their car is garaged is correct.

Access to lots of information enables insurers to test hypotheses efficiently to improve their underwriting or customer experience. It also enables organisations to identify anomalies or outliers – for example, fraudulent transactional activity.
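A minimal sketch of the underlying idea – establish the norm from a sufficiently large history, then flag whatever falls well outside it – is shown below. The transaction amounts, threshold and output format are illustrative assumptions, not a description of SSP’s fraud tooling.

```python
# Establish a baseline from historical transactions, then flag new records
# that deviate sharply from it. Data and threshold are hypothetical.
from statistics import mean, stdev

historical_amounts = [52.0, 48.5, 61.0, 55.2, 49.9, 58.3, 50.1, 53.7]  # baseline transactions
new_amounts = [54.0, 51.2, 412.0]                                       # incoming transactions

mu, sigma = mean(historical_amounts), stdev(historical_amounts)

for amount in new_amounts:
    z = (amount - mu) / sigma               # distance from the norm in standard deviations
    status = "ANOMALY - review" if abs(z) > 3 else "ok"
    print(f"amount={amount:>7.2f}  z={z:>6.2f}  {status}")
```

In practice the baseline would be built from far larger volumes of data and richer features, but the principle is the same: the more data used to establish the norm, the more confidently the outliers can be identified.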

But this is only beneficial if the analysis can be reviewed in real time, which requires the system and process capability to support it. For example, an insurer that takes on a customer and only finds out after the event how the transaction should have been priced or underwritten is carrying unnecessary risk. SSP systems enable real-time sourcing and analysis, but not every system does.

Organisations need to develop the capability to pool information from disparate sources, including third-party sources and unstructured data such as Facebook. They need to think across their whole value chain and get into the mind of their consumer, pulling together information that builds an understanding of customer behaviour which they can then act on.

For example, Facebook users telling people they are having a coffee may not seem interesting – until you use it to learn when your customer has regular downtime and send marketing communications at the times they are most likely to read and act on them.
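The hedged sketch below shows that kind of pooling in miniature: one unstructured signal (the timestamps of a customer’s public posts) is read alongside a structured customer record to pick a likely contact time. All names and data are hypothetical, and real social data would only be used via approved APIs and with the customer’s consent.

```python
# Combine a structured customer record with an unstructured signal
# (post timestamps) to choose a contact time. Data is invented for illustration.
from collections import Counter
from datetime import datetime

customer = {"id": "C-1001", "name": "Jane Doe", "policy": "MOTOR-123"}  # structured record

post_times = [  # timestamps of the customer's public "coffee break" style posts
    "2014-09-01 10:05", "2014-09-02 10:12", "2014-09-03 15:40",
    "2014-09-04 10:08", "2014-09-05 10:15",
]

hours = Counter(datetime.strptime(t, "%Y-%m-%d %H:%M").hour for t in post_times)
best_hour, _ = hours.most_common(1)[0]  # the hour in which posts most often appear

print(f"Schedule marketing contact for {customer['name']} around {best_hour}:00")
```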

The big data phenomenon is changing the way organisations approach sourcing and analysing data, requiring a new mind-set that is comfortable with handling large quantities of apparently disparate information, analysing and reviewing until trends or insights emerge that inform business decisions.

About the Author

SSP Worldwide

SSP is a global provider of technology systems and solutions across the entire insurance industry, using our expertise to enable our customers to transform their business and increase their profitability. SSP provides core technology solutions, distribution and trading capability, advanced analytics and solution delivery. We work with 8 of the top 10 UK insurers, 4 of the top 10 global insurers and over 40% of UK Brokers. Our unique position in the market, including the largest market share of UK e-trading, enables us to provide leading data insight and unrivalled distribution. Our knowledge, talent and technology capabilities deliver innovative results that make us the partner of choice for our customers.
