Robots but not as we know them

Every now and then New Scientist publishes a piece that almost unintentionally raises profound issues about technological possibilities. This was the case with an article published in the 12 May 2018 issue, titled Bionic beetles take to the skies. It describes the work of Hirotaka Sato at Nanyang Technological University. He and his research team implanted electrodes into the flight muscles of male Mecynorrhina torquata beetles and used electrical pulses to steer the direction and speed of their flight. The upshot is that truly bio-hybrid robots the size of insects are a real possibility. These ‘bionic beetles’ have the potential to act as the first arm of search and rescue in ways far superior to current drone technology. Bio-hybrid robotics is likely to be developed initially across a wide range of insects. And here’s the question: why should it stop at insects? It is easy to see that other, more intelligent animals (dogs and dolphins, for instance) might become part of the development trajectory of bio-hybrid robotics. And of course someone, somewhere will think about and investigate the possibility of human-hybrid robotics. Initially this is likely to be in the search for alternatives to, and improvements of, prosthetic limbs for those who have suffered accidents.

But will it stop there? Such technology will almost certainly be developed to enable enhancement, and in the future, for those who can afford it, integrating such technology into their bodies to provide improved physical performance will be possible. They will have become trans-humans. Throw in the idea that governments might use this technology to enhance military performance and we appear to be in dystopian science fiction territory; yet it might not be science fiction but technological fact. The trajectory of any technology is inevitably uncertain, but as we move along that trajectory it seems essential to ask of whatever we might be able to do, “We can, but should we?” For those who teach design & technology this surely must be a key message in our teaching.

As always comments welcome.

PS I have been unable to resist the connection between this article and the Doctor Who episodes from my childhood, my children’s childhood and their children’s childhood – Cybermen!


What does technology want?

For those of us who teach design & technology, a key question is what we should teach our students about the nature of technology, and how we get them to consider the extent to which we humans can influence the technologies that are developed and what they might be used for. Kevin Kelly has an interesting viewpoint on this. Currently he is Senior Maverick for Wired and is well known for his provocative and unconventional views on the nature of technology. More details of his extraordinary life and work can be found here.

He views technology as a conglomeration of individual technologies linked together into an overall system, which he calls the ‘technium’, that has the properties we associate with a complex living being and as such has needs and wants which it tries to meet. He explores this idea in depth in his book What Technology Wants. He identifies three interacting influences that govern the technium:

 

  • The primary driver is pre-ordained development – what technology wants.
  • The second driver is the influence of technological history, the gravity of the past, as in the way the size of a horse’s yoke determines the size of a space rocket.
  • The third force is society’s collective free will in shaping the technium, or our choices.

Kelly sees the first driver as the most significant, the second as an inevitable influence on the first, and the third driver (how humans respond in the way they contribute to the development of technology and their reactions to it) as the smallest influence on how technology plays out in the world. This seems to explain why Buzz Aldrin was able to admonish the US government with his famous quote, “You promised me Mars colonies and I got Facebook!”

So where will technology take us if it has its way? My cousin Geoff sent me this list of possibilities about how the technium might behave with regard to automobiles:

  • Auto repair shops will disappear. A gasoline engine has 20,000 individual parts; an electric motor has 20. Electric cars are sold with lifetime guarantees and are only repaired by dealers. It takes only 10 minutes to remove and replace an electric motor. Faulty electric motors are not repaired in the dealership but are sent to a regional repair shop that repairs them with robots. Essentially, if your electric “Check Motor” light comes on, you simply drive up to what looks like a car wash. Your car is towed through while you have a cup of coffee and out comes your car with a new motor.
  • Gas stations will go away. Parking meters will be replaced by meters that dispense electricity. All companies will install electrical recharging stations.
  • The first self-driving cars will appear for the public in 2018 (that’s now). Around 2020, the complete industry will start to be disrupted. People won’t want to own a car any more. A person will call a car with their phone; it will show up at their location and drive them to their destination. They will not need to park it, will pay only for the distance driven, and can be productive while travelling. The very young children of today will never get a driver’s license and will never own a car. A baby of today will only see personal cars in museums.

Are these predictions realistic? And if so what is driving them? Is it what we want or technology wants?

Finally, I think we should note a quote from the late, great Douglas Adams, from his book The Salmon of Doubt, about our reactions to technologies:

  1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.
  2. Anything that’s invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.
  3. Anything invented after you’re thirty-five is against the natural order of things.

Our students will be at stage 2, but we might want to give them pause for thought in the light of Buzz Aldrin’s disappointment in technology and the possibility that little is actually within our control.

As always comments welcome.

Teaching about new and emerging technologies in design and technology

A guest post by Harry Gowlett

With the changes that have been made to design and technology at GCSE level and the introduction of the single GCSE Design and Technology qualification, it has become a priority to modernise the secondary D&T curriculum at school level. Whilst GCSE D&T will most probably remain a priority for many departments, it is also important to modernise and update the way in which we teach D&T at key stage three. This is especially important in order to engage learners in wanting to continue studying the subject at GCSE and to revitalise the subject, leading to an increase in uptake.

I am a newly qualified teacher, having completed my training at Nottingham Trent University on the BSc Secondary Design and Technology course with QTS. I now realise just how well this course prepared me for my career as a secondary D&T teacher. The main focus of the course was to prepare me to deliver a modernised D&T curriculum, alongside learning the practical skills needed to teach the subject. Since leaving university I have started a job at Sewell Park Academy (@SewellPark), a small city academy in Norwich. Due to the size of the department I have been lucky enough to work alongside my head of department to introduce a range of new projects at key stage three, taking into account the pedagogical knowledge I learnt at university. These projects all fit into four main areas of D&T: mainly making, mainly designing, designing and making, and D&T in society.

One of the projects that I have introduced to key stage three is about new and emerging technologies. The National Curriculum for D&T at key stage three states that learners should investigate new and emerging technologies. This fitted into my long-term planning for key stage three as a D&T in society project, and as a smaller three-week project. This project also helped to make the learners realise that not every D&T project leads to producing a physical 3D product, something that I feel learners now tend to assume and expect in our subject.

During week one of the project learners are introduced to the project and its practicalities are outlined. At my current school, learners in key stage three receive two hour-long lessons of D&T a week. Lesson one involves learners watching a range of video clips and participating in small-group and whole-class discussions about various new and emerging technologies. Whilst taking part in these discussions the learners also develop their note-making skills, helping to build literacy. Lesson two involves learners deciding on their favourite new or emerging technology and then being organised into teams to investigate their technology in more detail.

Week two is the fully learner-centred set of lessons, in which each team researches its technology. This can be achieved in a number of ways; if computers are available, internet research is always a popular choice. To aid differentiation, I produced a set of resource packs for each technology using real-life articles and suitable pieces of information. For lower-ability learners I would highlight key points, while the higher-ability members of the class would synthesise the information themselves. During the second lesson of the week the class is guided to focus their research on a given set of criteria, ready to prepare a presentation.

In the third and final week of the project the learners first focus on producing a team presentation to share with the rest of the class. This should be achievable during an hour-long lesson, as the research completed during week two has been narrowed down and homework opportunities have been utilised. Communication skills are also introduced to help encourage learners to develop cross-curricular and personal skills. The second lesson of the week is an opportunity for the teams to share their presentations with the rest of the class. Learners are asked to think individually of questions to put to the other teams whilst they are presenting. This raised some interesting thoughts and really showed learner engagement and understanding.

My planned assessment points for this project are investigating and analysis/evaluation, which come from learners being able to select their favourite new and emerging technology, research it and then evaluate the impact it has on society. If more than three weeks (six lessons) are available there is a chance to explore the concept of design fiction, which is something I am hoping to do later in the year. There is also a good opportunity to use films to help introduce this extra element of the project, such as Big Hero 6 and the idea of ‘microbots’, linking to programmable matter! This also provides me with something else to share later in the year.

To access the scheme of work I have produced for this project or if you would like more information, feel free to contact me via twitter @HGowlett_DT or add your comments to this post.

 

Slaughterbots!

You may consider the YouTube video Slaughterbots a piece of science fiction, but that would, I think, sell it short. I prefer to think of it as a thought experiment about how swarm robots coupled with face recognition software might be used as autonomous killer robots; that is, robots that can decide for themselves when to kill a human target, when the face recognised matches a ‘threat’ identified by those who own and control the deployment of the swarm. It is easy to see this as fanciful, but many serious folk are taking the possibility of autonomous killer robots very seriously. From a government’s point of view, deploying robot soldiers rather than human soldiers has many advantages, not least the lack of human casualties. At the moment robot soldiers of various kinds operate in collaboration with humans, who have the ultimate say in any ‘kill decision’. This was explored effectively in the film Eye in the Sky, in which face recognition software played a significant part in the human decision to initiate a lethal strike. So Eye in the Sky to some extent endorses the thesis in Slaughterbots of the near reality of autonomous killer robots. The use of swarms of killer robots reduces research and development costs significantly: each bot is cheap, mass manufacture is relatively inexpensive, and the software guiding swarm behaviour is not that complex, as indicated in the video. Where is this issue taken seriously? Look no further than the Ban Lethal Autonomous Weapons website, which provides a call to action and links to a campaign to stop killer robots.
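The claim that the software guiding swarm behaviour is relatively simple can be made concrete with the classic ‘boids’ rules (separation, alignment and cohesion) that underpin most flocking demonstrations. The Python below is a minimal, purely illustrative flocking toy of the kind used in graphics and robotics teaching; the numbers are arbitrary assumptions and it has nothing to do with any real drone or weapons system.

```python
import random

# A toy "boid": a position and velocity in 2D.
class Boid:
    def __init__(self):
        self.x, self.y = random.uniform(0, 100), random.uniform(0, 100)
        self.vx, self.vy = random.uniform(-1, 1), random.uniform(-1, 1)

def step(boids, cohesion=0.01, alignment=0.05, separation=0.5, radius=10):
    """Apply the three classic flocking rules once to every boid."""
    for b in boids:
        neighbours = [o for o in boids
                      if o is not b and abs(o.x - b.x) + abs(o.y - b.y) < radius]
        if not neighbours:
            continue
        # Cohesion: steer towards the average position of nearby boids.
        cx = sum(o.x for o in neighbours) / len(neighbours)
        cy = sum(o.y for o in neighbours) / len(neighbours)
        b.vx += (cx - b.x) * cohesion
        b.vy += (cy - b.y) * cohesion
        # Alignment: nudge velocity towards the neighbours' average velocity.
        b.vx += (sum(o.vx for o in neighbours) / len(neighbours) - b.vx) * alignment
        b.vy += (sum(o.vy for o in neighbours) / len(neighbours) - b.vy) * alignment
        # Separation: move away from any neighbour that is too close.
        for o in neighbours:
            if abs(o.x - b.x) + abs(o.y - b.y) < 2:
                b.vx += (b.x - o.x) * separation
                b.vy += (b.y - o.y) * separation
    for b in boids:
        b.x += b.vx
        b.y += b.vy

flock = [Boid() for _ in range(50)]
for _ in range(100):
    step(flock)
```

The point is simply that convincing collective behaviour emerges from three very small local rules.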

This is an important issue facing society, and the question for those of us involved in teaching young people is to what extent such an issue should be explored in school. One of the justifications for teaching design & technology as part of a general education for all young people is that it introduces them to such issues and gives them the intellectual tools to think about them in a critical yet constructive way. I look forward to the day when such issues feature in the written examination of the recently introduced D&T GCSE. Would this be too much to ask of a GCSE introduced to reinvigorate the subject?

As always comments welcome.

Apple, Google, Microsoft or Amazon – which of these tech giants will help you live your life and spend your money? Whose AIs will you trust?

  • Google has Google Home, a hands-free smart speaker which will be able to answer questions, supported by advances in translation and image recognition.
  • Microsoft hopes to dominate the business space.
  • Apple has the HomePod, to be launched in December, and is investing in emotion-detecting technology.
  • Amazon has Alexa, which will on request provide access to goods and services, with more to come.

And according to an article in the September 2017 edition of Wired, authored by Liat Clark, Amazon is the front-runner. Whereas Google can provide information, Amazon can bring you things! Google Home is the smart friend at a party whereas Alexa is a benign butler. According to Liat Clark …

Amazon wants to introduce Alexa into every area of your life: your home, car, hospital, workplace. The ‘everything’ store is about to be everywhere. Alexa has to be human-like because it is essential that people trust her, enough to let visual and audio ‘surveillance’ into their homes and lives. Alexa can only try to empathise with words at the moment, but when she has cameras at her disposal she will be able to respond to visual cues as well as aural input. And in response Alexa is becoming more human-like. Alexa can whisper, pause, take a breath, adjust her pitch and allow key words such as ‘ahem’ and ‘yay’ to be emphasised in more engaging ways. Forging an apparently ‘emotional’ response from Alexa is the goal. An AI will need to know a person well to engage in a relationship based on emotional response. Amazon may well know more about you than your closest friends, and so, of course, will Alexa, who will be able to use both what you say and what you do to forge, maintain and extend that relationship. The insightful film Robot and Frank asked the question, “Can an AI be your friend?” Amazon has the answer: “Of course, if you trust the AI as you might another human.” And that is Amazon’s overriding intention – to get us to trust Alexa as we might a human friend, in the knowledge that she is not in fact another human and hence will not pry into our lives or betray us as a human friend might.
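For readers curious about the mechanics, expressive effects of this kind (whispering, pauses, pitch changes, emphasised interjections such as ‘yay’) are exposed to Alexa skill developers through Speech Synthesis Markup Language (SSML). The sketch below is a minimal, illustrative Alexa-style response built in Python; the wording and the shopping scenario are invented for illustration and are not taken from the Wired article.

```python
import json

# Illustrative SSML showing the expressive controls mentioned above:
# whispering, a pause, a pitch change and an emphasised interjection.
ssml = (
    "<speak>"
    "I found the item you asked about."
    '<break time="500ms"/>'
    '<amazon:effect name="whispered">It is cheaper today than it was yesterday.</amazon:effect> '
    '<prosody pitch="+10%">Shall I add it to your basket?</prosody> '
    '<say-as interpret-as="interjection">yay!</say-as>'
    "</speak>"
)

# Minimal Alexa Skills Kit response envelope carrying the SSML speech.
response = {
    "version": "1.0",
    "response": {
        "outputSpeech": {"type": "SSML", "ssml": ssml},
        "shouldEndSession": True,
    },
}

print(json.dumps(response, indent=2))
```

Seen this way, the ‘emotional’ Alexa described above is, at the developer’s end, a set of markup choices about how a reply should sound.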

Of course Jeff Bezos and the CEOs of the other tech giants are constructing cathedrals of capitalism where they intend consumers to come to worship and offer up their wages as sacrifice, in return for the goods and services recommended and provided by AIs they trust. But here there is a supreme irony. The very same AIs that are at the heart of this new faith are also being deployed to automate many of the functions the worker-worshippers utilise to earn the wages they need to live out their consumerist lives. AIs may be simultaneously the engine of capitalism and its doom. What are we to make of this conundrum? Surely it is worth discussing with the young people whose lives will be most affected by this impact of technology on society and society’s response. And where better to do this than in design & technology lessons?

As always comments welcome.

Robot butterflies – a cautionary tale

In his wonderful magical realist novel One Hundred Years of Solitude the celebrated Colombian author Gabriel García Márquez describes a scene in which a young girl is surrounded by a cloud of butterflies fluttering around her. She is unafraid and entranced. Now imagine that the beautiful robot butterflies designed and made by Festo could be programmed to behave like this, fluttering around in such a way as to transport any human they surround to what might be described as a magical place. How marvellous would that be? You might even imagine that a literary young person studying D&T via the OCR GCSE specification might conceive of this as a possible solution to OCR’s exemplar contextual challenge of enhancing users’ experiences of public spaces. What a creative response! And by making contact with Festo the student might even be able to collaborate with their engineers in producing a prototype cloud of butterflies for deployment in a public place such as a park.

But what of unintended consequences? Illah Nourbakhsh, Professor of Robotics at Carnegie Mellon University, has written a series of very engaging short stories in his book Robot Futures. They are all edifying with regard to the impact, beyond the intended benefit, of robots in our society. In the story Robot Smog, robot butterflies have been deployed in society for just this magical realist purpose, but the way the robot butterflies interact with humans is through eye contact. If you look at one or more of them they will flutter around your head making eye contact. And there is no off switch. They are powered by sunlight; when it gets dark they simply fall to the ground, and once the sun comes up they flutter off again seeking eye contact with humans. This has led to a situation where people walking in the park are afraid to look up and have taken to wearing sunglasses to avoid eye contact. I leave you to read about what else happens.

So, as with all things technological, we need to be mindful of unintended consequences and ‘be careful what we wish for’. In my view Illah’s book would make excellent reading for Year 11 and above. I wonder how often we use these sorts of science fiction short stories to engage our students with the possible downsides as well as upsides of technology?

As always comments welcome.

PS

And now this – cyborg dragonflies produced by genetic engineering to act as drones – not exactly biomimicry, more bio-combination!


Build, Use, Damage, Mend and Adapt – an approach to learning through and about drones

A guest post by Ed Charlwood

What follows describes the work I’ve been doing in school that has led me to set up a new Drones in Schools Google+ community for teachers.

A convergence of influences

As with much curriculum development, serendipity did its job at the outset of this endeavour, bringing together the opportunities offered by (1) the new GCSE and A Level specifications and their broader content requirements, (2) a growing dissatisfaction with a certain high-profile external “design/engineering” competition that really requires very little design and (3) the discovery of a very interesting little kit. Firstly, the long-awaited publication of the new GCSE and A Level specifications really was a wake-up call that we could not continue to plough the same RM/Product Design furrow at either qualification level. I felt it important to embrace the specification in its entirety, and that meant that at Latymer we would have to teach areas that were less familiar, i.e. Systems and Control, and Textiles. It also meant that we could fully embrace previously fringe areas that we had been pushing at for a few years but that had been confined by old assessment criteria, namely the use of CAD, CAM and the circular economy. Secondly, I have seen our students be equally engaged and frustrated by external engineering competitions: they promised a glimpse into the competitive world of high-level engineering but actually offered little real decision making, imposed restrictive and difficult manufacturing processes, and required a lot of luck and frivolous administration. I won’t name names. Lastly, I came across a $99/£78 kit from Flexbot, offering a 3D printable drone and the promise of an open source kit. A quick PayPal purchase later and I was the proud owner of a Flexbot quadcopter (4 rotors), cleverly packaged, with a comprehensive and appropriate information booklet and a product that worked pretty much straight out of the box and could fly via an iPhone app. Bingo.

Drones are a great ‘hook’ for learning

Drones are popular in the media, comprehensible to most people and on a steep curve of becoming demonstrably better and cheaper at the same time. Currently they have the elusive “engagement factor” and this provides a ‘hook’ making them intrinsically attractive to students. Such a hook is, in my experience, vital. It is important to note that we are not coding experts, nor are we overly interested in programming. But we are interested in using electronics to do stuff. And it is here that the Flexbot Quadcopter meets our teaching intentions.

Our approach

Under the guidance of my colleague Nick Creak we handed the kit over to our students. They assembled the drone without difficulty. Then they had a play, crashed it and naturally broke it. They took the kit apart, made some key measurements, downloaded CAD files from the Flexbot Wiki (SketchUp) and Thingiverse (.stl) and printed a replacement for the part they broke. They then began to explore the files and started to design their own drone. Initially they did this pretty much by simplifying and copying the existing design, a useful process in its own right for developing CAD techniques and collaborative skills.

A 3D printed Flexbot part

We then printed their chassis designs and used the slicing software to investigate various manufacturing options:

  • How long would the print take if it was “ultimate” or “low” quality?
  • What would happen if it had a low / medium / dense fill?
  • What were the implications of the design being aligned differently?

On average a “normal quality”, high-density print would take 2 hours. The booklet provided by Flexbot also has some interesting text comparing the economics of 3D printed manufacturing with mass production techniques like injection moulding.
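A crude back-of-the-envelope model helps explain why the slicing experiments above give such different answers: print time scales roughly with the volume of material actually deposited (shells plus infill) divided by the deposition rate, with finer layers multiplying the time. The sketch below uses made-up numbers purely for illustration; it is not slicer output and the constants are assumptions, not Flexbot or Cura figures.

```python
# Crude FDM print-time estimate: shells plus infill at an assumed deposition
# rate. Real slicers model travel moves, supports and acceleration as well.
def estimate_print_time_hours(part_volume_cm3, infill=0.5, shell_fraction=0.25,
                              deposition_rate_cm3_per_hr=10.0, quality_factor=1.0):
    """quality_factor > 1 models finer layers ('ultimate'), < 1 coarser ('low')."""
    deposited = part_volume_cm3 * (shell_fraction + (1 - shell_fraction) * infill)
    return deposited * quality_factor / deposition_rate_cm3_per_hr

# A hypothetical 30 cm3 drone chassis at different settings.
for infill in (0.2, 0.5, 0.9):
    for quality, factor in (("low", 0.7), ("normal", 1.0), ("ultimate", 1.8)):
        t = estimate_print_time_hours(30, infill=infill, quality_factor=factor)
        print(f"infill {infill:.0%}, {quality:>8} quality: ~{t:.1f} h")
```

With these assumed numbers a 30 cm3 chassis at high infill and “normal” quality comes out at a little under 3 hours, which is at least in the same ballpark as the 2-hour figure above.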

Students could then begin to design “iteratively” – a key new concept in the OCR interpretation of the new specifications.

“Iterative design is a design methodology based on a cyclic process of prototyping, testing, analysing, and refining a product or process. Based on the results of testing the most recent iteration of a design, changes and refinements are made.”

We also offered a number of design challenges: design a modular drone; alter your design to use as little filament as possible (make it cheap!) or to print as quickly as possible; design your drone to use a standard component – in our case a Lego axle.

Flexbot parts

The Flexbot circuit is robust enough to be shared between students, and the batteries, propellers and motors are cheap enough to buy in bulk. If you do not have a 3D printer, jobs can be specified, costed and outsourced to a 3D print hub. The simulator (which is available once you have started the process of uploading parts for a hub to print) shows it would cost approximately £6 for a basic chassis made from PLA by fused deposition modelling. Some hubs even offer a 25% student discount and most offer almost next-day delivery.
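The ~£6 quote is also easy to sanity-check using the same crude deposited-volume idea as in the sketch above: a hub’s price is roughly the material mass times a price per gram plus a flat service charge. Again, the figures below are illustrative assumptions, not any hub’s actual tariff.

```python
PLA_DENSITY_G_PER_CM3 = 1.24  # typical PLA density

def hub_quote_gbp(part_volume_cm3, infill=0.5, shell_fraction=0.25,
                  price_per_gram=0.10, service_charge=3.00):
    """Very rough hub quote: deposited mass x price per gram + flat service charge."""
    deposited_cm3 = part_volume_cm3 * (shell_fraction + (1 - shell_fraction) * infill)
    mass_g = deposited_cm3 * PLA_DENSITY_G_PER_CM3
    return mass_g * price_per_gram + service_charge

# A hypothetical 30 cm3 chassis at 50% infill comes out at roughly £5-6.
print(f"~£{hub_quote_gbp(30):.2f}")
```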

We additionally posed a number of extension questions to our students, each eliciting a different design outcome: What is the effect of changing the alignment of the rotors? How big or small can the drone be? How much weight can it pick up?

Reflections

Design Decisions Pentagon

David Barlex has produced a design decision pentagon to describe the decisions that students might make when they are designing and making. So I was intrigued to use this to explore the decisions that our students were making.

Clearly they weren’t making any big conceptual decisions – the sort of product had already been decided: a quadcopter drone. The technical decisions in terms of how it would work had also been made – four electric motors linked to the Flexbot circuit, controlled via the Flexbot iPhone app. But there were lots of possibilities in the constructional decision-making.

Not 90°!

One student changed the alignment of the motors so that they were no longer at 90° to one another, which made the drone faster but harder to control. And I suppose you could argue that this constructional change did in fact change the way the drone worked. A key feature of the pentagon is that the design decisions featured at each of the vertices aren’t independent of one another, hence the lines between the vertices.

Interference fit

Another student responded to the modular challenge by producing a design with four separate arms held tightly to the central node by an interference fit, taking advantage of the high degree of dimensional accuracy of additive manufacture. This required investigation and was in itself a valuable learning experience.

Clearly it’s possible to set particular design challenges around constructional decisions, e.g. making the drone more crash-worthy.

Aesthetic decisions could also be made. Indeed, changing the alignment of the motors could be seen as an aesthetic as well as a constructional decision. Devising lightweight covers that can be 3D printed, or perhaps produced from nets laser cut from thin sheet plastic, might give the drone different ‘personalities’, and this may be seen as a marketing decision: changing the appearance to appeal to different users. Marketing decisions can also be made with regard to how the drone gets to market – via a kit in a shop or online, or via digital files for home or hub manufacture in collaboration with a circuit board/electric motor supplier. Related to this, deciding whether the product is open source is also a marketing decision. And just who the drone is for will make a big difference to what it might look like and what additional features it offers. Taking a step back, how will the design decisions overall be affected by requiring drones to be part of a circular economy?

There is, of course, a “purer” engineering challenge: to design and make racing drones, for which there are already a number of competitions with related rules and constraints.

The next area for us to consider is the consequences of drone technology and its close cousin, the unmanned aerial vehicle (UAV), many of which have more sinister applications (bombing, surveillance, smuggling) as a counterbalance to the positive ones (photography, delivery, surveying and so on). Each is a rich seam for discussion, as are the wider issues of automation, disruptive technologies generally, and government regulation and control.

Far from being a proprietary endeavour, I want this to be a collaborative, open source one, so I invite you to join the Drones in Schools Google+ community to share your experiences, ideas and resources, or add your comments to this post.

Ed Charlwood

Head of Design & Director of Digital Learning at Latymer Upper School, London

I am a passionate advocate of Design education who believes in the power of learning through analysis, designing and making. I am an Apple Distinguished Educator (class of 2013), a Google Certified Teacher (class of 2015) and the DATA Outstanding Newcomer to Design and Technology Award winner (2008). A particular focus of my work is to exemplify the notion that innovative and appropriate use of technology can redefine the traditional teacher-learner relationship and transform educational designing and making experiences. My vision is to inspire and empower students to make the things they imagine.