Are we ready for a world in which our personal data is handled by artificial entities, without the least awareness of human rights?
Obviously not. As with every technological transition, big data has its dark side, which has much to do with our unpreparedness to defend ourselves from the rampant abuses of technological expansion.
This was on the minds of speakers at the conference "Uomini e Macchine. Protezione dati per un'etica del digitale" ("Men and Machines: Data Protection for Digital Ethics"), held at the Italian Chamber of Deputies on 30 January to mark the EU's Data Protection Day. Antonello Soro, President of the Italian Data Protection Authority, was there, along with a choice panel of experts, academics, journalists and consumer representatives.
"The neutrality of technology is a chimaera," Soro declared in his opening remarks. "Digital has become the story of our lives, a highly powerful agent of social transformation, both structure and superstructure, text and context. It is the framework for man's every expression, operating only under the parameters of functionality and efficiency. With the internet, technology became a dimension, an ecosystem in which we are so deeply mired that we don't realise it until it leads to the gravest consequences. The central question, as many commentators have observed, is that human tools are inadequate for understanding the consequences of the technical transition, and especially the absence of common codes, not just in the sense of rules – although they are needed – but shared meaning. Only by putting the individual back at the centre, then, can we regain the ground we've lost."
"Respect for digital identity," said Licia Califano, a member of the Data Protection Authority, "is a matter of rights, concerning not only people's freedom but their dignity. The right to protection of personal data affects every aspect of our daily lives, public and private."
Essentially, the Authority argued, if we give up the core facts of our identity to big multinationals which are interested only in profit, within a model that rewards them for selling our data on to third parties, we are reducing those rights to a mere matter of profit margins. They manage our information through opaque, unaccountable algorithms which, needless to say, have not one iota of a social conscience.
This is a problem that concerns us as individuals, but also as a community. If, say, economic or, worse, political decisions are taken on the basis of such distortions (one such distortion being the inference that Italian children learn English more slowly), the damage could be massive.
What we should fear is not technology in itself but its indiscriminate use, a risk fed both by new economic models and by our slowness to grasp them.
The millennial generation is the most exposed to these risks, partly because its members were born into a world where mass surveillance is a given, something natural and irreversible, and where certain rights are implicitly traded away in exchange for better services. This is why many experts consider the internet's youngest users to be those most at risk from it.
The book Nasci, cresci, posta ("Get born, grow up, post") by journalist Simone Cosimi and psychotherapist Alberto Rossetti asks why the big OTT platforms have set an arbitrary minimum age of 13 for children signing up to social networks, as if at that age they understood the consequences simply because they have absent-mindedly clicked "Yes" to the terms of service.
"All Facebook's strategic moves," says Cosimi, "including those that supposedly protect children, are nothing but a fig leaf to dispel the ire of national jurisdictions and the attendant risk of financial penalties. The fate and the business of these big platforms now depend on how they handle litigation and how they comply with the laws of the countries they operate in. So, when they set the age limit at 13, they do it to comply with North American law, and the effects extend to Italy. When they adopt technical solutions to stop children stumbling upon violent content, or worse, upon other users with evil intentions, they are not doing it because they feel responsible on this front. Far from it. They are doing it so they can show their hands are clean and cannot be sued for having in any way allowed adult users to harm children."
Moving on from the realm of social media to that of devices, the consequences are even worse. Exponential growth in the Internet of Things, which will link 31 billion objects by 2020 according to Eurobarometer, poses new problems every day. Connected toys are emblematic of this, like the doll Cayla, which was taken off the French and German markets because it made it easy to spy on children's conversations.
The Cayla story might raise a chuckle on the one hand, but on the other, the suggestions of Luisa Crisigiovanni, Secretary General of Altroconsumo and a member of BEUC, should be taken seriously. In her speech, she said we should "start keeping our devices clean in the same way we take care of our personal hygiene."
"After all," she concluded, "if the EU legal framework on connected things isn't satisfactory, we need to start regulating the digital safety of our electronic devices, as we already do with the mechanical safety of mass-market products."
Luckily, awareness of these problems is growing. According to Altroconsumo data, 86% of people in EU countries believe they have been the victim of a cybercrime, and that awareness is in itself a big step forward.
This growing awareness is due in no small part to the thought-provoking work of journalists like John Sudworth. In Beijing at rush hour, when the city swells to 32 million people, the BBC reporter handed the local police his identity card and challenged them to find and arrest him on the street. Using surveillance cameras, the Chinese police took just seven minutes to catch him. This story might comfort some, but it will worry many others.