At 10 billion and counting, there are now more Tinder “matches” than people on earth. Just try reading that sentence again and turning it around in your mind for a moment.
For the masters of this great teeming empire of romantic entanglement, the foremost legal, commercial and reputational issue is data security.
Dating apps are not mere messaging tools, although by definition they encourage sharing of the most intimate billets-doux. They also provide “location-based services” – that is, they track you and tell you where you are in relation to other users – and have a growing tendency to blur, or even traverse, the boundary between private and public, by incorporating elements from social media accounts, such as Facebook and Instagram, more commonly associated with “offline” life.
While dating apps are yet to have their Ashley Madison moment, they are not without fault lines or vulnerabilities. A WIRED investigation earlier this year found that, with some fairly basic probing, some of the world’s most popular dating apps were leaking Facebook identities, location data, photographs and other personal data.
For the dating app developer, there is not only the usual salad of commercial legal issues, relating to brand protection and e-commerce, that faces all developers; there is also a tangle of international data protection and privacy laws to navigate.
As a basic legal requirement, app developers operating within the EU must put in place appropriate technical measures to ensure data security: secure hosting, secure coding and regular testing. They must also put in place “organisational” measures: policies governing employees and the policing of those policies, and careful sub-contracting with third parties.
Under the General Data Protection Regulation (GDPR), which comes into force across the EU in May 2018, whether security measures are appropriate will depend on “the state of the art, the costs of implementation and the nature, scope, context and purposes of processing [of personal data] as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons”. That is to say, the appropriateness of security measures will depend on what is available, the nature of the business and the potential impact on users.
This may afford some comfort to the start-up with unbulging purse, but it is not a free pass, and it would be folly to treat it as such. It is telling, and indicative of the enduring damage a breach can cause, long after it has faded from the news cycle, that most readers will immediately grasp what I mean by “Ashley Madison moment”. App developers would be foolish to think they can get away with basic or minimal requirements.
Once the GDPR is in force, breaches by businesses could lead to fines of up to €20 million or 4% of global annual turnover, whichever is higher. (Just try turning that around in your mind.) To put that in stark perspective, the current ceiling for the most serious data breaches in the UK is £500,000. Even for the largest tech behemoths, a haircut of 4% of global annual turnover (revenue, not profit) would be dire.
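The “whichever is higher” rule means the €20 million figure acts as a floor for large businesses, not a cap. As a rough illustration (a hypothetical calculation, not legal advice, and the function name is my own):

```python
def max_gdpr_fine(global_annual_turnover_eur: float) -> float:
    """Upper bound of a GDPR fine for the most serious breaches:
    the greater of EUR 20 million or 4% of global annual turnover."""
    return max(20_000_000.0, 0.04 * global_annual_turnover_eur)

# For a business turning over EUR 100m, 4% is only EUR 4m,
# so the flat EUR 20m figure governs the maximum.
print(max_gdpr_fine(100_000_000))    # 20000000.0

# At EUR 1bn turnover, the 4% figure takes over: EUR 40m.
print(max_gdpr_fine(1_000_000_000))  # 40000000.0
```

In other words, even a modestly sized app business faces a potential €20 million exposure, and the ceiling scales without limit as revenue grows.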
Failure to comply not only breaches the law, with its civil, regulatory or criminal penalties, but also creates the risk of data breach, loss, hacking or unauthorised access, with their often much greater commercial and reputational costs.
Phil Hartley is an associate at Schillings