Hot on the heels of this year’s new General Data Protection Regulation (GDPR), the UK government has identified business use of ‘big data’ as another potential area for legislation.

The term ‘big data’ generally refers to the way businesses target customers through the collection, processing and analysis of large and complex data sets from a variety of sources, which often contain personal information. According to the Financial Times, Business Secretary Greg Clark has asked the Competition and Markets Authority (CMA) to advise on the issue of big data misuse as part of an overhaul of business regulations.

Behaviours under the spotlight

Clark has said there are already warning signs in the way companies use personal data, such as energy groups imposing higher charges on loyal customers – especially the elderly – who fail to shop around.

A review is also underway by the Civil Aviation Authority (CAA) into whether budget airlines are using algorithms to separate groups, including families with children, unless those passengers pay extra for allocated seating.

It’s a claim that’s being denied by the airlines involved. In a Telegraph report, Ryanair said that those who did not select seats were allocated randomly, and EasyJet said that its algorithm sits families together “more than 99 per cent of the time – at no additional cost”.

However, the CAA says it “will be looking into how airlines decide where to seat passengers that have booked as part of a group and whether any airlines are proactively splitting up groups of passengers when, in fact, they could be sat together”.

At its heart, the issue is whether companies are misusing data – without customers’ permission or knowledge – to those customers’ detriment and the company’s benefit.

A big issue for risk managers

The government isn’t alone in identifying big data as an area needing further scrutiny. It was also ranked by respondents to the IRM’s Setting the Risk Agenda 2025 report as the technological development having the second greatest impact on organisations both today and by 2025.

Risk managers need to be clear on how their organisations collect, analyse and utilise data to ensure their policies and procedures meet all the regulatory requirements, and that they are acting within the expectations of their customers. No one wants to be hit by a scandal where customers find out their data is being used in unexpected ways or even being sold on without their knowledge.

The digital world changes fast, and regulators must keep up with developments in order to be effective – risk managers may face the same struggle. It’s a challenge that the CMA will address through a new dedicated team to scrutinise the use of algorithms, artificial intelligence and big data in business.

In an interview with the Financial Times, Andrea Coscelli, chief executive of the CMA, said: “If you look at our [case] portfolio now, in many ways we follow the economy and we follow technology. Compared to five years ago we do much more work in digital spaces, both on the consumer protection side and the competition side.”

Risk managers must have access to expertise in these areas to ensure their companies don’t fall foul of any regulation and, importantly, that their business is well positioned to use big data within the rules to stay ahead of competitors.

Creeped out

The use of big data isn’t automatically a bad thing – for example, each time Amazon or another online shopping portal recommends something a customer may be interested in, that’s the result of data collection coupled with the use of artificial intelligence (AI) – and it might be quite useful.

According to a report in Engineering and Technology magazine, using AI to interact with customers and predict their wants and needs will become a mainstay of consumer-driven businesses in the near future.

But Alex Loizou, co-founder of fashion company Trouva, warned that consumers can feel “creeped out” if they sense that brands have too much information about them, especially if brands show them desirable products and services “that the customer thinks they shouldn’t know about”.

So there is still some way to go in the development of big data and AI, and a lot to learn from consumers about the types of business use of their data that they will and won’t accept.

Clark seems at least to be looking at the issue from a positive point of view for the UK, and at both the risks and opportunities offered by big data: “I want to address these new challenges in a way that makes us the best place to develop these new technologies because we have a regulatory system that has already thought about the unintended consequences.”

This is of course the way all risk managers work – planning for the unexpected – and a sound understanding of the use of big data will stand everyone in good stead if legislation does come our way.