Security and privacy concerns surrounding automation and automated products have been in the news a lot recently, and the issue has once again come to the fore after Colin Angle, CEO of iRobot (creator of the automated Roomba vacuum), gave an interview to Reuters that appeared to suggest the company could sell ‘maps’ of people’s homes to large corporations such as Amazon.
The company’s automated vacuum creates a detailed map of users’ homes as it cleans, and the company believes this information would be of interest to the likes of Google and Amazon too.
He’s not wrong: Google, Apple and Amazon would pay handsomely for that kind of information.
However, people would have to give their permission for this to take place, and this is where it starts to get tricky.
The company is keen to underline that any deal would operate strictly on an opt-in basis; even now, the user must give permission for the home to be mapped. Any talks for this to actually take place are at a very early stage and, by the company’s own admission, a couple of years away. But with data like that on offer to the ‘big three’, and with at least a significant portion of the population (especially younger consumers) probably having no problem sharing it, it seems likely to happen at some point.
What is key here is that legislators and watchdog groups will scrutinise how these sorts of opt-ins work. They will have to be very obvious and leave no room for users to later claim they did not give permission or did not realise the implications. How much consumers actually understand the implications of these types of services remains a big concern.
Today’s litigation culture is dominated by PPI, accidents and financial packages; will the next stage be firms targeting consumers whose privacy has been violated? It could well be.
So, what does this mean for installers and smart home professionals? Not many will be selling products like the Roomba, but such devices might need to be taken into account in automation scenarios: say, when the machine comes on, turn the music up to mask the noise.
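As a rough illustration of that kind of automation rule, the sketch below raises the music volume when a vacuum-start event arrives and restores it when cleaning finishes. The controller class and the event names are entirely hypothetical, not the API of any real product:

```python
# A minimal, hypothetical sketch of a "turn music up while the vacuum
# runs" automation rule. The class and event names are illustrative
# assumptions, not a real smart home API.

class AutomationController:
    def __init__(self, base_volume=30, boost=15):
        self.volume = base_volume   # current music volume (0-100)
        self.boost = boost          # how much to raise it while cleaning
        self._saved = None          # volume to restore afterwards

    def on_event(self, event):
        # "vacuum_started" / "vacuum_stopped" are assumed event names.
        if event == "vacuum_started":
            self._saved = self.volume
            self.volume = min(100, self.volume + self.boost)
        elif event == "vacuum_stopped" and self._saved is not None:
            self.volume = self._saved
            self._saved = None

controller = AutomationController(base_volume=40)
controller.on_event("vacuum_started")
print(controller.volume)  # raised while the vacuum runs: 55
controller.on_event("vacuum_stopped")
print(controller.volume)  # restored afterwards: 40
```

The point is less the code itself than the pattern: even a trivial rule like this means one device is reacting to another's state, which is exactly where the chain-of-trust questions discussed below begin.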
What is far more important is to start thinking about the implications, legal and otherwise, of offering products and services that could raise privacy and, of course, digital security issues for clients.
This type of feature may seem well bedded in to those of us in the industry, but in terms of creating a legal structure that can cope with the implications, many countries, and indeed companies, have not even started.
A product does not have to be the main ‘smart’ product that caused a breach; it only has to be the weakest link in the chain that enabled a problem to take place.
As an industry, we need to start addressing these issues now to make sure clients are protected, and of course ourselves too. We want to provide the best and most cutting-edge performance, but we need to do that without putting customers’ rights and security at risk.