Fair trade: Your soul for data?
- Retha Langa
In an increasingly data-driven world, are we just walking data sources for the benefit of giant multinational corporations?
Every single minute, there are 3.8 million search queries on Google; 4.5 million videos watched on YouTube; almost $1 million spent online; 41.6 million messages sent via WhatsApp and Facebook Messenger – and these are a fraction of the interactions that currently happen online.
As we go about our daily lives – sharing our personal experiences on social media, asking Siri to set our alarms, and counting how many steps we walk on our wearables – we are essentially becoming walking data points, where our information is collected and analysed to predict behaviour. Where will it end?
Professor Turgay Celik, Director of the National e-Science Postgraduate Teaching and Training Platform (NEPTTP) and the Wits Institute of Data Science (WIDS), predicts that in the next 10 to 15 years, humans will be “directly connected to cyberspace without using devices. Your brain will be directly connected to the internet,” he says.
Wits Biomedical engineers have already connected a human brain to the internet in real time. This Brainternet project essentially turned the brain into an Internet of Things node on the World Wide Web.
In 2019, the same team connected two computers through the human brain and transmitted words like ‘hello’ and ‘apple’, passively, without the user being aware that a message was present.
“Do we really need to have our physical bodies to experience life, or do we only need to have our own brain?” asks Celik. “We will be seeing the systems creating those virtual environments to give humans an experience of nature. You want to go and see the ocean, but do you really need to physically go there? Can I stimulate a part of my brain to give me that experience?”
Android rights and the Big Other
Dr Christopher Wareham, Senior Lecturer in the Steve Biko Centre for Bioethics at Wits, argues that we need to think about the implications of such technological developments from the perspective of artificial agents. These “digital beings” will potentially have lives – and rights – of their own.
“Traditionally the focus on this question is very much on the other side of the issue: How are we going to stop them from harming us? There is very little work that looks at it from the other side. How are we going to prevent humans from harming this being, experimenting on it? Should there be laws that protect this type of being?”
The developments in machine learning and artificial intelligence (AI) already significantly affect how we live our lives today. American academic Shoshana Zuboff coined the term ‘surveillance capitalism’ in 2014. Surveillance capitalism depends on “the global architecture of computer mediation”, which “produces a distributed and largely uncontested new expression of power”. Zuboff christens this the “Big Other”. Currently, the “Big Other” includes Facebook, Google, Microsoft and Amazon.
Surveillance capitalism
Writing in The Guardian, Zuboff explains: “The logic of surveillance capitalism begins with unilaterally claiming the private human experience as free raw material for production and sales. These experiences are translated into behavioural data. Some of this data may be applied to product or service improvements, and the rest is valued for its predictive power. These flows of predictive data are fed into computational products that predict human behaviour.”
Surveillance capitalism is a “real issue”, says Professor Brian Armstrong of the Wits Business School. “In my view, a very big concern is around the whole idea of social scoring.” This refers to the practice of developing a social rating system to establish if a person is a fit and proper member of society, in terms of their “social score”.
In China, private companies are already operating social credit systems, as are local governments in pilot projects. The plan is to develop a nationwide system that scores each citizen’s behaviour, with rewards and penalties attached to specific actions: donate to charity, for example, and you gain points; commit a traffic violation and you lose them.
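At its core, the scoring mechanism described above is a simple rule system that maps actions to point adjustments. The sketch below is a toy illustration only; the event names, point values and starting score are invented here, not drawn from any actual social credit scheme.

```python
# Toy rule-based social scoring: each event type carries a point
# adjustment, and a citizen's score is updated per recorded event.
# All names and values are hypothetical, for illustration only.

REWARDS = {"charity_donation": 5}
PENALTIES = {"traffic_violation": -10}
RULES = {**REWARDS, **PENALTIES}

def update_score(score: int, events: list) -> int:
    """Apply each event's point adjustment to a starting score."""
    for event in events:
        # Unrecognised events leave the score unchanged.
        score += RULES.get(event, 0)
    return score

print(update_score(100, ["charity_donation", "traffic_violation"]))  # 95
```

The unsettling part is not the arithmetic, which is trivial, but who chooses the rule table and what ends up in it.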
But one need not look as far as China for Big Brother-style surveillance. In Johannesburg, thousands of surveillance cameras already monitor motorists and pedestrians 24/7. In June, the Financial Mail reported that Vumacam – a subsidiary of internet fibre company Vumatel – had installed more than 1 200 surveillance cameras to combat crime. By 2020, the number of cameras will increase to over 10 000.
Local security companies can access the Vumacam live feed and, as the artificial intelligence system learns what a typical day in a neighbourhood looks like, it will flag behaviour that is out of the ordinary for that area. Dr Helen Robertson, who lectures Data Privacy and Ethics at Wits, refers to the battle between our right to safety and our right to privacy that such forms of surveillance bring to the fore.
“It strikes me as plausible that we think our claims to safety have increased weight in contrast with our claims to privacy. If the relevant algorithms are going to identify abnormalities in the footage, we need to keep in mind how good these algorithms are or aren’t.”
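The “learn what a typical day looks like, then flag the unusual” idea above is, in its simplest statistical form, outlier detection against a learned baseline. Real systems analyse video; this toy sketch (invented here, not Vumacam’s method) uses hourly activity counts and a z-score threshold to make the principle concrete, and it also illustrates Robertson’s caveat: the quality of the flags depends entirely on the baseline and threshold chosen.

```python
# Minimal anomaly flagging: learn the mean and spread of "normal"
# activity counts, then flag observations far outside that range.
import statistics

def flag_anomalies(baseline, observed, threshold=3.0):
    """Return indices of observations more than `threshold`
    standard deviations from the baseline mean."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    return [i for i, x in enumerate(observed)
            if abs(x - mean) > threshold * stdev]

# Typical hourly counts for a quiet street, then a day with one spike.
baseline = [10, 12, 9, 11, 10, 13, 12, 11]
today = [11, 10, 48, 12]
print(flag_anomalies(baseline, today))  # [2]
```

A spike gets flagged, but so would any harmless deviation from the learned routine – which is exactly the accuracy concern Robertson raises.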
Safety vs. privacy
Our views on privacy have not only been impacted by safety concerns. The pervasiveness of social media has also played a role. Robertson says that the average person is willing to share a lot more about their private lives today compared to a few decades ago. These evolving views are not necessarily problematic. “It might simply be a matter of one society’s convention in contrast with another society’s convention, and how they tend to feel with regard to how much they are willing to share.”
Celik believes that privacy will become personalised, with individuals being able to define how much privacy they want for themselves.
Our autonomy is another area influenced by the online world. Wareham argues that a lot of micro-targeted advertising and political messaging is designed specifically to degrade our autonomy. “If you do a Google search now, you’re not going to get an unbiased sample of information – you’re going to get information that Google has catered for you to get ... these sorts of micro-targeting ... want to trigger you through nudges to behave in certain non-rational ways.”
The question then becomes who decides what you read, listen to, or watch – and who makes the decisions on what content is “appropriate” for a specific digital platform, and what is not.
Towards tech that teaches
Data-driven advancements are, however, not all doom and gloom. “Data in itself is not agnostically good or bad, but it is what we do with it. It can be abused, or it can be used for very positive purposes,” argues Armstrong, adding that education is one area in which South Africa could benefit immensely.
“If we were able to use learning management systems more efficiently to see how students are learning, to see what material they are struggling with – to learn what teaching styles work best, we can individualise the learning experience.”
In China, AI-enabled education has already blossomed with tens of millions of students using some form of AI to learn. This includes tutoring platforms where algorithms curate lessons and adapt the curriculum based on an individual鈥檚 understanding of specific concepts, reports MIT Technology Review.
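The curation logic behind such tutoring platforms can be sketched very simply: estimate the student’s mastery of each concept and direct the next lesson at the weakest one. This is a hypothetical illustration of the general idea, not any particular platform’s algorithm; the concept names and mastery scores are invented.

```python
# Toy adaptive curation: mastery scores (0.0 = no grasp, 1.0 = fluent)
# per concept; the next lesson targets the lowest-scoring concept.
# Names and numbers are illustrative only.

def next_lesson(mastery: dict) -> str:
    """Return the concept with the lowest mastery score."""
    return min(mastery, key=mastery.get)

student = {"fractions": 0.9, "algebra": 0.4, "geometry": 0.7}
print(next_lesson(student))  # algebra
```

Production systems replace the hand-set scores with estimates updated from every answer a student gives – which is precisely the kind of fine-grained behavioural data collection the rest of this article worries about.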
Protecting personal data
Staggering amounts of data are generated daily, but who owns all this data? Robertson points out that there is currently no consensus among ethicists about this thorny issue.
Some argue that the data subject owns the data. Others say that the data processor who uses his/her resources to create and analyse a dataset has ownership rights, while some argue that in certain cases, such as medical research that benefits society, the public鈥檚 need for medical treatment and breakthroughs mean that data belong to the public.
These different claims to ownership “add a lot of ethical greyness”, says Robertson. “The ownership of data is particularly difficult. It is an object that can be traded, but at the same time, it has a reference to an individual, something like other artefacts do, such as photographs. The rights certainly seem to pull in different directions.”
In the near future, South Africans will have considerable legal power regarding the protection of their data. The Protection of Personal Information Act (POPIA) aims to protect the right to privacy, while enabling the social and economic benefits that result from the free flow of information. POPIA stipulates conditions under which personal information must be processed lawfully, although there are exceptions.
These conditions include that personal information “must be collected for a specific, explicitly defined and lawful purpose”. Further processing of personal information can only take place if it is in line with the purpose for which it was originally collected. Most sections of the Act have not yet commenced. The announcement of a commencement date is expected before the end of 2019, after which companies will have one year to comply.
Verine Etsebeth, a Senior Lecturer in the Wits School of Law who specialises in data protection and information security law, says the POPI Act is long overdue. “The sooner it is in practice, the sooner it can come before our courts and we can have precedents set,” says Etsebeth. “It is going to be survival of the fittest. If your competitor complies and you don’t, you won’t be able to retain your customers. Companies will realise just how much their reputations are worth.”
Digital disempowerment
Despite the excitement over technology’s potential to solve some of our most complex problems, many South Africans are still excluded from these advances. Only 40% of Africa’s population has access to the internet compared to 61% for the rest of the world. In South Africa, internet penetration currently sits at 56%.
“In today’s world, digital disempowerment is one of the most profound forms of disempowerment,” says Armstrong. “Digital disempowerment comes in three levels. The first is do you have access, secondly do you use it, and thirdly are you engaged, transacting and impacted? In South Africa, you don’t have access if the networks don’t cover where you are, or if you can’t afford the mobile device – or if you can’t afford the price of data. In all of those areas we have a challenge.”
- Dr Retha Langa is a freelance journalist.
- This article first appeared in Curiosity, a research magazine produced by Wits University.
- Read more about how our researchers are exploring not only the Fourth Industrial Revolution manifestations of code, such as big data, artificial intelligence and machine learning, but also our genetic code, cryptic codes in queer conversation, political speak and knitting, and interpreting meaning through words, animation, theatre, and graffiti. We delve into data surveillance, the 21st Century ‘Big Brothers’ and privacy, and we take a gander at how to win the Lottery by leveraging the universal code of mathematics.