You may consider the YouTube video Slaughterbots a piece of science fiction, but that would, I think, sell it short. I prefer to think of it as a thought experiment about how swarm robots coupled with face recognition software might be used as autonomous killer robots. That is, robots that can decide for themselves when to kill a human target, when the face recognised matches a ‘threat’ identified by those who own and control the deployment of the swarm. It’s easy to dismiss this as fanciful, but many serious folk are taking the possibility of autonomous killer robots very seriously. From a government’s point of view, deploying robot soldiers rather than human soldiers has many advantages, not least the lack of human casualties. At the moment robot soldiers of various kinds operate in collaboration with humans, who have the ultimate say in a ‘kill decision’. This was explored effectively in the film Eye in the Sky, in which face recognition software played a significant part in the human decision to initiate a lethal strike. So Eye in the Sky to some extent endorses the thesis of Slaughterbots that autonomous killer robots are a near reality. Using swarms of killer robots reduces research and development costs significantly: each bot is cheap, mass manufacture is relatively inexpensive, and the software guiding swarm behaviour is not that complex, as indicated in the video. Where is this issue taken seriously? Look no further than the Ban Lethal Autonomous Weapons website, which provides a call to action and links to a campaign to stop killer robots.
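To see how little software the core ‘kill decision’ might take, here is a minimal sketch of the rule the video implies: compare a detected face’s feature vector against a watchlist of ‘threat’ embeddings and trigger only above a similarity threshold. The vectors, names and threshold below are invented for illustration; real face recognition uses learned embeddings from a neural network, but the decision logic is no more than this.

```python
import math

THRESHOLD = 0.9  # illustrative cut-off, chosen by whoever controls the swarm


def cosine_similarity(a, b):
    """Similarity of two face embeddings: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)


def matches_watchlist(face, watchlist):
    """True if the detected face is close enough to any 'threat' entry."""
    return any(cosine_similarity(face, w) >= THRESHOLD for w in watchlist)


# Invented 'threat' embeddings for illustration only.
watchlist = [[0.9, 0.1, 0.3], [0.2, 0.8, 0.5]]
print(matches_watchlist([0.88, 0.12, 0.31], watchlist))  # near the first entry
```

The ethical point is plain in the code: who counts as a ‘threat’, and how close a match must be, are just a list and a number set by the swarm’s owner.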
This is an important issue facing society, and the question for those of us involved in teaching young people is to what extent such an issue should be explored in school. One of the justifications for teaching design & technology as part of a general education for all young people is that it introduces them to such issues and gives them the intellectual tools to think about them in a critical yet constructive way. I look forward to the day when such issues feature in the written examination of the recently introduced D&T GCSE. Would this be too much to ask of a GCSE introduced to reinvigorate the subject?
As always comments welcome.
- Google has Google Home, a hands-free smart speaker that can answer questions, supported by advances in translation and image recognition.
- Microsoft hopes to dominate the business space.
- Apple has the HomePod, to be launched in December, and is investing in emotion-detecting technology.
- Amazon has Alexa, which will on request provide access to goods and services, with more to come.
And according to an article in the September 2017 edition of Wired, authored by Liat Clark, Amazon is the front-runner. Whereas Google can provide information, Amazon can bring you things! Google Home is the smart friend at a party whereas Alexa is a benign butler. According to Liat Clark …
Amazon wants to introduce Alexa into every area of your life: your home, car, hospital, workplace. The ‘everything’ store is about to be everywhere. Alexa has to be human-like because it is essential that people trust her enough to let visual and audio ‘surveillance’ into their homes and lives. At the moment Alexa can try to empathise with words alone, but when she has cameras at her disposal she will be able to respond to visual cues as well as aural input. And in response Alexa is becoming more human-like. Alexa can whisper, pause, take a breath, adjust her pitch and allow key words such as ‘ahem’ and ‘yay’ to be emphasised in more engaging ways. Forging an apparently ‘emotional’ response from Alexa is the goal. An AI will need to know a person well to engage in a relationship based on emotional response. Amazon may well know more about you than your closest friends do, and so, of course, will Alexa, who will be able to use both what you say and what you do to forge, maintain and extend that relationship. The insightful film Robot and Frank asked the question, “Can an AI be your friend?” Amazon has the answer: “Of course, if you trust the AI as you might another human.” And that is Amazon’s overriding intention: to get us to trust Alexa as we might a human friend, in the knowledge that she is not in fact another human and hence will not pry into your life or betray you as a human friend might.
Of course Jeff Bezos and the CEOs of other tech giants are constructing cathedrals of capitalism where they intend consumers to come to worship and offer up their wages as sacrifice, in return for the goods and services recommended and provided by AIs they trust. But here there is a supreme irony. The very same AIs that are at the heart of this new faith are also being deployed to automate many of the jobs the worker-worshippers rely on to earn the wages they need to live out their consumerist lives. AIs may be simultaneously the engine of capitalism and its doom. What are we to make of this conundrum? Surely it is worth discussing with the young people whose lives will be most affected by this impact of technology on society, and by society’s response. And where better to do this than in design & technology lessons?
As always comments welcome.
Early in 2013, when there was considerable debate about the government’s proposed National Curriculum Programme of Study for design & technology, Dick Olver, chairman of BAE Systems, one of the UK’s biggest companies, criticised the government’s proposal on the following grounds. The draft proposals for design & technology did “not meet the needs of a technologically literate society. Instead of introducing children to new design techniques, such as biomimicry (how we can emulate nature to solve human problems), we now have a focus on cookery. Instead of developing skills in computer-aided design, we have the introduction of horticulture. Instead of electronics and control, we have an emphasis on basic mechanical maintenance tasks. In short, something has gone very wrong.” The result of such authoritative criticism was a complete revision of the proposed programme of study, such that it included the following statement under the teaching of design: to “use a variety of approaches, such as biomimicry and user-centred design, to generate creative ideas and avoid stereotypical responses”. Although biomimicry was a non-statutory example of a design strategy, it was mentioned by name.
The Design and Technology Association ran INSET sessions to help teachers understand what was for many a new idea. And many teachers have since taught pupils at both KS3 and KS4 about biomimicry, particularly how designers have used it as a creative product design tool. At its most basic this includes the development of webbed gloves and flippers to aid swimming (biomimicking a frog’s foot); more sophisticated is the use of corrugated card in a cycle helmet, based on the bone structure of a woodpecker’s skull. And of course it’s possible to view the circular economy as a systems approach based on biomimicry that can be used to move the world away from a destructive linear economy.
Underlying this appears to be the idea of biomimicry as a benign design tool; one that can only be used for good, with few if any harmful consequences. But this view misrepresents nature and the constant struggle between and within species for survival. This was made very apparent to me when I read Kill Decision by Daniel Suarez. It’s a rollicking good read, but I won’t go into too much detail as this would spoil the story for those who haven’t yet read this excellent piece of science fiction, which borders very much on science fact. A key element of the story is the use of biomimicry of weaver ants to develop swarms of lethal quadcopter drones that, once unleashed, can operate without human intervention or control. Weaver ants are able to communicate with one another by laying down and following pheromone trails which indicate the task to be accomplished, be that foraging or territory defence. In the case of territory defence, the trail leads more and more ants to the sites where defence is needed, and even large intruders are soon overcome by the multitudes of smaller weaver ants that converge on the site. The brain power of an individual weaver ant is of course very small, but the colony achieves highly effective defence by getting large numbers in the right place at the right time to attack and kill the intruders. So imagine using biomimicry to transfer this ability to a swarm of drones, each with highly limited AI and equipped with simple but effective weapons.
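The pheromone-trail mechanism described above (biologists call it stigmergy) can be sketched in a few lines of code. Below is a toy simulation, not taken from the book: agents wander a 1-D line of sites, any agent that finds the ‘intruder’ stays there and deposits pheromone, and other agents move towards whichever neighbouring site carries more pheromone. The site count, evaporation rate and deposit amount are illustrative choices; the point is that convergence on the threat emerges with no central controller and almost no intelligence per agent.

```python
import random

SITES = 20           # positions along a 1-D territory
THREAT_SITE = 13     # where the 'intruder' sits
EVAPORATION = 0.05   # fraction of pheromone lost per step
DEPOSIT = 1.0        # pheromone laid by an agent at the threat


def step(positions, pheromone, rng):
    # Pheromone evaporates a little everywhere each step.
    for i in range(SITES):
        pheromone[i] *= (1.0 - EVAPORATION)
    new_positions = []
    for pos in positions:
        if pos == THREAT_SITE:
            pheromone[pos] += DEPOSIT  # recruit others to this site
            new_positions.append(pos)
            continue
        # Move one step towards the neighbour with more pheromone;
        # wander randomly when both are equally (un)marked.
        left, right = max(pos - 1, 0), min(pos + 1, SITES - 1)
        if pheromone[left] > pheromone[right]:
            new_positions.append(left)
        elif pheromone[right] > pheromone[left]:
            new_positions.append(right)
        else:
            new_positions.append(rng.choice([left, right]))
    return new_positions, pheromone


def simulate(n_agents=30, n_steps=1000, seed=1):
    rng = random.Random(seed)
    positions = [rng.randrange(SITES) for _ in range(n_agents)]
    pheromone = [0.0] * SITES
    for _ in range(n_steps):
        positions, pheromone = step(positions, pheromone, rng)
    return positions


if __name__ == "__main__":
    final = simulate()
    at_threat = sum(1 for p in final if p == THREAT_SITE)
    print(f"{at_threat} of {len(final)} agents converged on the threat site")
```

Each agent follows a three-line rule, yet the swarm ends up massed at the intruder, which is exactly why this behaviour is so cheap to engineer and so hard to call back once unleashed.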
This led me to ponder the role of design strategies in general. In themselves they might be considered neutral, neither intrinsically good nor bad, but their use will of course depend on the intentions of the designer. So the buck clearly stops with us humans. The case of robots and the intention to use them in warfare has led Noel Sharkey, Emeritus Professor of Artificial Intelligence and Robotics & Public Engagement at the University of Sheffield, to urge extreme caution and to argue for international conventions to govern their development. So, as always with design & technology, we find ourselves in territory where values are as important as, if not more important than, knowledge, understanding and skills.
The House of Commons Science and Technology Committee is holding an inquiry into Robotics and Artificial Intelligence. David and Torben submitted written evidence which says, in a nutshell, ‘teach young people about them at school through D&T lessons that encourage them to consider the consequences of deploying technologies’. Clearly our Disruptive Technologies Project aims to help teachers do just that. And as if on cue, The Robotics Teaching Guide is now available here, later than expected, but we think you’ll find it useful. As always, comments much appreciated.