This is the fourth in a series of posts on the GCSE D&T Curriculum Reform Consultations that David and Torben have written together. It builds on the first post, which introduced some of the thinking of the Design & Technology Association working group in justifying a single GCSE title D&T, a second post that examined the proposed D&T content in some detail, and a third post that focused on the Areas of Interest. This post discusses the proposed assessment arrangements outlined in the Ofqual consultation (which is being run independently of the DfE consultation that we have largely focussed on up to now).
The nature of the non-exam assessment (NEA)
The Design & Technology working group was taken with the idea of using open starting points to ensure that students explored the context of their design challenge before developing any sort of design brief.
(Starting points are situations or scenarios that students explore in order to establish sufficient understanding of the context to create their own design brief.)
The working group thought this was important because it would prevent students from deciding what they were going to make at the very start of the challenge – the ‘I’m going to make a coffee table for the living room at home’ syndrome. To ensure that this approach was embedded in the NEA, the working group suggested that:
- each awarding organisation will release starting points that students must use to create their own individual design brief, contextualising their own design and making project
- the starting points will be released by all awarding organisations to students and teachers on 1 June in the year prior to certification
- new starting points will be released each academic year in accordance with the timescale above
- the starting points will not directly relate to any Area of Interest
- students will be required to apply their knowledge, understanding and skills relating to the designing and making principles and the technical principles
- students must submit clear evidence of their iterative design process and produce a practical outcome in the form of a final product or prototype.
However, there is no guidance at all in the Ofqual proposals about the nature of the non-exam assessment, and this is a concern. Is the implication that awarding organisations will be free to make their own proposals here? And if so, by what criteria will these proposals be judged? The journey from open coursework projects to much more closed projects and ultimately the (tightly) controlled assessments was, we believe, a huge loss to the subject. We do understand that a move back to open projects would require a great deal of teacher support, but we think that could be provided and the subject would be richer – and more rigorous – for it.
It seems to us likely that, without any pressure to do otherwise, AOs will default to the current form of non-exam assessments and we’ll get something very like controlled assessments offered again.
If you agree that the absence of any guidance from Ofqual on the nature of the NEA is problematic, then please say so in your response.
If you agree that the NEA for D&T should be based on open starting points, then make this clear.
Minimally invasive assessment
There is nothing in the proposals about how pupils will provide evidence for non-exam assessment. We suspect that, without an external push, the AOs will default to what they know: death by portfolio.
Nick Givens (writing as long ago as 1998 in response to a consultation on Creativity and Cultural Education) clarifies the problem:
Our [the teacher’s] problem always has been, and remains, that of finding efficient painless ways of generating EVIDENCE that don’t stifle the creativity. So the ritualisation of designing, the conversion of the design folio into a product and the inflexible narrow interpretation of what constitutes design, represent a major problem. There needs to be scope for pupils to model and record their thinking in a variety of ways AND orders. We can’t carry on letting a narrow view of what constitutes EVIDENCE-of-design dictate the NATURE of design.
The work carried out by Richard Kimbell and his colleagues at TERU in the e-scape project provides an interesting approach to assessing capability in D&T. It involves using structured testing materials over two half-day blocks under examination conditions. This is a huge step forward compared to the stranglehold that the coursework portfolio has over pupil assessment in design & technology. However, despite the undoubted achievement of this project, we have significant reservations about this one-size-fits-all timed test as the best way for pupils to reveal their capability. It does not meet Nick Givens’ requirement for pupils to have the scope to record their thinking in a variety of ways AND orders.
We think it is important that any NEA should be minimally invasive, i.e. the way the pupil’s work is assessed should not distort their educative experience in tackling the designing and making task. We suggest that the way pupils can be given appropriate freedom to decide on their own designing and making pathway is for the teacher to support them in the use of ‘job bags’.
The criteria for the contents of job bags are simple. The pupil’s work, be it in the form of written notes, annotated sketches, 3D models, working drawings, patterns, recipes, plans, schedules, still photos, video recordings, audio recordings, questionnaire data or calculations, must have utility. It must be present only because at the time it was produced it was done to help move the designing and making task forward. Such a miscellany would be personal to the pupil and it is likely that there would be considerable variation in content of such job bags even amongst pupils tackling identical design and make tasks. However, the job bag would not be the primary source of assessment evidence. It would be the evidence that the pupil called upon to reveal and justify their decisions. And it is the revelation and justification of the design decisions demonstrated at three points during a designing and making task that provides the bulk of the assessment evidence for the pupil’s designerly activity.
The first point is reached when a pupil has explored the context and developed his or her first ideas for a product in response to it. The pupil will be asked to consider whether the developed proposals respond to what the context revealed and meet the requirements of the brief, and to clarify and justify the design decisions made so far. The pupil will also be required to review these decisions and consider whether what s/he is proposing is likely to be achievable in relation to resources of time, materials, equipment and personal skills.
The second point is reached when most of a pupil’s design decisions have been made through sketching, 3D modelling, and experimenting. This will be at the point where making is imminent or has just started. Again, the pupil will be asked to clarify and justify the design decisions made so far and then review these decisions and consider whether the design arrived at fully meets the requirements of the brief and whether the plans for making are achievable.
The third point will be reached when the product is complete and will include an evaluation against the brief and the specification.
The emphasis of these points of reflection will be on revealing a pupil’s response to the emerging demands of the task in terms of the decisions made and the extent to which they are realistic. Note that it is the contents of the pupil’s job bag that provide the information pupils need to make the necessary reflection, and it is the content of that reflection that provides the assessment evidence. This approach to assessment is discussed in more detail here, and we think it is worth exploring further by the Awarding Organisations as an alternative to current practice.
If you think that the Awarding Organisations should develop means of assessing pupils’ designing and making tasks that avoid death by portfolio yet allow a personal response – so that pupils can record their thinking in a variety of ways AND orders – then recommend this in your response to the consultation.
If you think that the Awarding Organisations should develop means of assessing pupils’ knowledge and understanding of the subject’s enduring ideas, as well as their procedural competence, in the NEA then recommend this in your response to the consultation.
The written paper
We note that in the DfE consultation, under Subject Content, point ‘7’ says:
Specifications must require students to study these principles in the context of one of the areas of interest defined in paragraph 13.
Our position is this. If we limit what young people should know, understand and be able to do to that which is required to successfully tackle a major designing and making task, we are selling the subject short. Not that tackling such a task is an insignificant endeavour. It is not. It requires hard investigative work to appreciate the nature of the problem the task has to address – what we might call knowledge of the problem. For an authentic task this knowledge cannot be ‘taught from the front’ or looked up in a textbook. It has to be sought out through a user-centred approach to design. Techniques for doing this can of course be taught. Then in responding to the problem there are all sorts of knowledge, understanding and skill needed – what we might call knowledge for the solution. Some of this a pupil may have been taught, but some may well be beyond what has been taught and the pupil will need to find out for herself.

But however demanding any one project might be, it cannot cover the breadth of knowledge required to appreciate a whole subject. So limiting the assessment of design & technology to procedural competence – however dependent on knowledge, understanding and skill – is, we believe, insufficient. We want to assess the extent to which pupils have really grasped the enduring ideas that are important in design & technology, in a way that is true to the nature of the subject.

Allied to this wider interpretation of what is worth knowing about and learning through a broad and balanced design & technology course is the development of a ‘technological perspective’. By this we mean giving young people insight into ‘how technology works’ such that they develop a constructively critical view of technology, do not become alienated from the technologically based society in which they live, and are able to consider how technology might be used to provide products and systems that help create the sort of society in which they wish to live.
Hence we believe that the written paper should be completely independent of the Area of Interest that candidates have chosen, and should assess understanding of the enduring ideas that are important in the subject of design & technology while probing candidates’ technological perspective.
If you think that the written paper should be completely independent of the Areas of Interest that candidates have chosen, then please recommend this in your response to the consultation.
Of course, rigorous assessment of the enduring ideas in design & technology will be difficult, as this territory is not well explored. However, here is an initial foray into thinking about the sorts of questions that might be considered.
- Some questions will need to probe specifics, others should be synoptic, requiring candidates to draw on knowledge and understanding from a range of specifics.
- Most, if not all, questions should require the use of knowledge to show understanding as opposed to simply being able to recall particular pieces of knowledge.
- We see many questions as requiring candidates to respond to/resolve different sorts of designerly dilemmas.
- Some questions can require explanatory writing; in some cases quite extended writing. How such answers can be marked will be a challenge, but English and History teachers have that expertise and we can look to them for advice and guidance.
- The majority of questions should be structured into parts that scaffold the candidate in developing a solution to a problem.
- Some questions will require quantitative as well as qualitative reasoning.
- Some questions can be multiple choice.
- One type of multiple choice question that might be interesting to explore is the ‘assertion reason’ question. The overall form is
“Statement A” because “Statement B”
The two statements can each be either true or false and, in any case, Statement B might or might not be an explanation of Statement A.
An example assertion might be “Cows eat grass because the sky is blue”. “Cows eat grass” is true, “The sky is blue” is true, but the sky being blue is NOT an explanation of why cows eat grass. The question offers candidates a fixed set of answers to select from: the permutations of the truth or falsity of the two statements and, where both are true, whether or not Statement B explains Statement A. In the classic form the options are: both statements true and B explains A; both true but B does not explain A; A true and B false; A false and B true; both false.
These questions are demanding to write but really probing in terms of knowledge and understanding.
- Some questions will be unlike any of the above, as we work out different sorts of questions for our purposes.
Developing such questions will be a challenge but it is one to which the subject must rise.
If you agree that a good range of question types will be needed within the written examinations then please say so in your response and suggest that the Awarding Organisations should develop and disseminate details of such question types.
The proposed Assessment Objectives
These are shown below and, as far as they go, we are happy with them and the proposed weightings.
| | Assessment objective | Weighting |
|---|---|---|
| AO1 | Investigate design possibilities and considerations for development. | 15% |
| AO2 | Design and make products / prototypes that meet needs and solve problems. | 35% |
| AO3 | Justify design decisions and analyse and evaluate products / prototypes made by themselves and others. | 20% |
| AO4 | Demonstrate knowledge and understanding of designing, making and technical principles. | 30% |
However, what this table doesn’t make clear is how these AOs are distributed between the written paper and the NEA. We think something like the following allocation would be appropriate:
| | Assessment objective | Exam | NEA | Total weighting |
|---|---|---|---|---|
| AO1 | Investigate design possibilities and considerations for development. | 10% | 5% | 15% |
| AO2 | Design and make products / prototypes that meet needs and solve problems. | 0% | 35% | 35% |
| AO3 | Justify design decisions and analyse and evaluate products / prototypes made by themselves and others. | 15% | 5% | 20% |
| AO4 | Demonstrate knowledge and understanding of designing, making and technical principles. | 25% | 5% | 30% |
This is based on our clear view (above) that the marks for the NEA should be awarded on the basis of candidates’ ability to design and make within their Area of Interest. The marks for the written paper should be awarded on the basis of the demonstrated knowledge and understanding of the enduring ideas pertinent to design & technology and independent of their chosen Area of Interest.
But the fine detail isn’t important here; what is important is that Ofqual should prescribe these weightings for the Awarding Bodies.
If you think that the weightings of the AOs between the Exam and the NEA should be prescribed by Ofqual then please recommend this in your response to the consultation.
The lack of tiering is definitely to be welcomed. We know from other subjects, such as Science, that tiering is proving a real problem when it comes to question setting.
If you agree that having no tiering is appropriate for D&T, then please say so explicitly in your response.
Having discussed the move to a single GCSE for D&T in the first post, considered the proposed D&T content in some detail in the second and third posts, and looked in detail at the proposed assessment arrangements in this post, we will use the fifth and final post in this series to draw together all of the Consultation Actions we have suggested, in the form of usable responses to the two consultations.