Take note!

The handwriting of someone who doesn’t use a pen that often…

I find it difficult to imagine a psychology teacher who thinks that students’ notes don’t matter. But we don’t often think about why they matter or how to help students get the most out of making them. I’ve been asked to develop some CPLD materials on the subject of notes, so I’ve been trying to distil my ideas recently. What follows may be stating the obvious, but I don’t think that’s necessarily a bad thing, since the obvious is easy to overlook.

Why do notes matter?

Working memory capacity is limited, and the encoding of stable, long-term memories is a slow and cumulative process. Effective note-making strategies are therefore important for at least two reasons. First, notes are a holding area for material that has been presented to students but which has not yet been (fully) encoded into LTM. If note-making is effective, the ideas are recorded in ways that students can make sense of and encode later. Second, note-making is a process that helps students learn. It gives them opportunities to identify what’s important, construct meaning from it and re-encode it in ways that require deep processing.

What does research tell us about note making?

There’s a fair bit of research into the features of effective note-making, and relatively little of it will surprise anyone who has a basic familiarity with cognitive psychology. What follows is based principally on Beecher (1988), Marzano et al. (2001) and Marzano (2017).

  1. Verbatim note-taking is relatively ineffective. Note-making is most effective when the student is engaged in analysing and synthesising incoming information.
  2. Notes work best when student and teacher view them as a work in progress. Note making strategies should allow for the review and updating of notes, and teachers should plan for this as a distinct activity.
  3. Notes should be used as the basis for study for tests and examinations. This sounds obvious, but a surprising number of students don’t use their own notes in this way. This may be because it does not occur to them that their notes are for them (as opposed to for their teacher) or because the notes they have made are unsuitable for exam preparation.
  4. With note-making, less is NOT more. Students are sometimes instructed to keep notes as brief as possible. This is poor advice. Students who include more information in their notes tend to learn more and learn better.

Should notes be handwritten or typed?

It is becoming more common for students to create their notes using devices rather than writing them by hand. The evidence is against this. Although the typical student types faster than they write and can therefore record more (see point 4 above), students who type are typically focused on creating a verbatim record of what the teacher said and, therefore, are not particularly focused on making sense of the teacher’s messages (see point 1). Because handwriting is slower, students are forced to think harder about what matters and how best to encode it, which gives handwriting a significant cognitive edge over typing because information is processed more deeply (cf. Craik & Lockhart, 1972). In addition, the act of writing an idea down involves encoding a unique set of pen movements which may later act as a retrieval cue for the idea being encoded (cf. Tulving & Thomson, 1973; this might be familiar to any crossword solvers who have used the strategy of quickly writing in partial candidate words to cue recall of possible solutions). All keystrokes are roughly identical, so the same unique retrieval cues are not available to those who type their notes. Clear empirical support for the idea that written note-making is superior to electronic comes from Mueller and Oppenheimer (2014), summarised usefully here by Cindi May.

How should we teach note making?

It is easy to assume that your students arrive with appropriate and effective skills and strategies already in place but, often, they don’t. Students can achieve a lot at GCSE with relatively poor note-making skills but then struggle at A Level because their skills are inadequate for the volume of material they now encounter and the type and depth of thinking they are expected to do. Note-making should therefore be taught explicitly as a skill. This means directly instructing students on how to do it, giving them opportunities to practise and giving them improvement-focused feedback on how they are doing. Although there is no automatic ‘best format’ for note-making (it depends on the subject, level and learners), I have no problem with being quite prescriptive about how students on my course are expected to make their notes. I generally set reading and note-making as advance preparation for class and I teach and expect my students to use a basic version of the Cornell system for their notes (example below).

Useful strategies for teaching note-making include:

  1. Giving examples of your own notes so that students have a clear idea of what they are expected to produce.
  2. Modelling the process of note-making for your students. Take a short text and give a copy to your students to read first. Then create notes ‘live’ either on the board or under a visualiser, ‘thinking aloud’ while you do it, so that students get access to your decision-making process. Saying things like, ‘I’m going to read the whole thing first, because I don’t know what really matters until I know what the whole text says’, or ‘This paragraph has a topic sentence, so I think this is important’, helps sensitise students to the key features of effective note-making.
  3. Scaffolding the note-making process. This could mean you modelling your note-making process for the first two paragraphs of a text and then telling students to continue on their own, while you circulate and give feedback. Or it could involve you creating a partial set of notes the students need to complete.
  4. Practising note-making as a regular task. Show you value it by setting it explicitly, checking it has been completed and offering feedback on what students have produced. This need not be time consuming. I regularly set note-making as a preparation task for class and my students are used to a routine whereby they start the class by ‘comparing notes’ while I circulate, check completion, ask the odd question about something that catches my eye and comment on features I like or dislike.
  5. Teaching explicitly the component processes of effective note-making. First and foremost of these is summarising. The capacity to identify the critical ideas in a topic and describe the relationships between them is extraordinarily influential on subject learning, and so should be taught directly. Other useful skills include generating graphic or non-linguistic representations of ideas (e.g. tables, spider diagrams, timelines), which can be used as part of note-making.

We introduced direct instruction in note-making several years ago when we rethought the design principles that underpin teaching in our department. It is part of our induction course and we insist on the Cornell system throughout the A Level course. We have met remarkably little resistance from students. The overwhelming majority simply adopt our expectations as ‘how we do things around here’ and it becomes their default approach to note-making. I believe it contributes significantly to the quality of their learning, and I also believe that, by teaching note-making well, we equip our students with a skill set that will continue to serve them long after they have forgotten all the psychology they ever knew.

References

Beecher, J. (1988). Note-taking: What do we know about the benefits? ERIC Digest, EDO-CS, (37): 88-12.

Craik, F.I.M. & Lockhart, R.S. (1972). Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 11, 671-684.

Marzano, R.J. (2017). The New Art and Science of Teaching. Alexandria, VA: Solution Tree/ASCD.

Marzano, R.J., Pickering, D. & Pollock, J. (2001). Classroom Instruction that Works. Alexandria, VA: ASCD.

Mueller, P.A. & Oppenheimer, D.M. (2014). The pen is mightier than the keyboard: Advantages of longhand over laptop note taking. Psychological Science, 25(6), 1159-1168.

Tulving, E. & Thomson, D.M. (1973). Encoding specificity and retrieval processes in episodic memory. Psychological Review, 80(5), 352-373.

Visible thinking and complex examination questions

No students were permanently harmed during this lesson.

Since the introduction of the new A Level psychology specifications, complex essay questions have become a prominent feature of the examination landscape. By ‘complex’ questions, I mean those that impose several distinct skill requirements which must be addressed simultaneously in order to attract credit. For example, the Edexcel 2018 Paper 2 contained the following question in the ‘Criminological Psychology’ section:

Kylie witnessed a crime and had to go to the police station for an interview. The crime involved a robbery of a shop in a busy shopping centre. Kylie was walking past the shop with her friends when she heard the shopkeeper shouting for help, as the thief ran out of the shop. The police carried out a cognitive interview to gather as much information as possible from Kylie about what she witnessed.

To what extent would the cognitive interview be effective in gathering accurate information from Kylie about the crime she witnessed? You must make reference to the context in your answer. (16)

In order to answer this question effectively, the candidate must evaluate cognitive interviewing in the context of the crime witnessed by Kylie. This means she has to show knowledge and understanding (AO1) of how CI might be used with Kylie (AO2) and make judgements about its probable effectiveness in that context (AO3). This is rather more demanding than the ‘Describe and evaluate cognitive interviewing’ type of question that used to prevail at A Level.

When these types of question started appearing, my initial response was that they were needlessly difficult and represented nothing more than a new set of hoops I needed to train my students to jump through. However, a couple of years into the new specifications, I’m now more inclined to welcome them as a challenge for us and our students to embrace. After all, our task as psychology educators is to support our students in attaining mastery of core psychological concepts, research, methodologies and ways of thinking. Complex questions are a more valid test of mastery than straightforward ‘describe and evaluate’ questions as they are not amenable to a brute-force ‘rote learn the facts and the criticisms without understanding’ approach. More pertinently, it seems to me that, by using these types of question as a teaching tool, we can support our students in becoming better psychological thinkers.

Because the context material that accompanies complex questions cannot be predicted in advance, they require students to construct their response on the fly, under examination conditions. In other words, they have to think. As Dan Willingham (2010) memorably points out, thinking is difficult and students are disinclined to do it even under ideal conditions. The examination situation imposes substantial psychological demands that reduce students’ capacity to think effectively (Putwain & Symes, 2018). Consequently, it is our responsibility to teach our students to think in the right ways long before the exam, and to support them in acquiring a degree of automaticity that will allow them to devote their already-stretched cognitive resources to engaging with the content of the question.

The trouble with thinking is that you can’t see it. That makes it difficult for us to explain the sorts of thinking we want our students to do. It also makes it difficult for us to access our students’ thinking processes so we can check whether they’re being directed in the right way. In recent years, I’ve drawn a great deal on Ron Ritchhart’s notion of making thinking visible to support my students in learning how to think (see Ritchhart et al., 2011). Ritchhart’s approach relies on manipulables (e.g. sticky notes) to represent concepts and the use of spatial organisation to represent relationships between them. Together with simple, repeatable dialogues and structures, these present a powerful toolbox for reducing unhelpful cognitive load, establishing transferable routines for dealing with recurring subject-specific thinking situations and getting students’ thinking out in the open, where we can see it.

A visible thinking routine for complex essay questions

Here’s how I’ve been using visible thinking to teach students how to address complex questions. I lay the groundwork by presenting a question and asking the students to consider how they should address it and what an answer should do. For example:

Joe has been convicted of criminal damage. The magistrate sentencing noted that Joe had been arrested a number of times for similar acts and had a record of disruptive behaviour going back to his school days. The magistrate accepted that most teenagers get into trouble but that most seem to ‘grow out of it’ whilst Joe had not. When asked why he had committed this crime, Joe said, ‘mostly because I’m bored…but sometimes things just wind me up. That day I was supposed to be meeting my mates but the bus didn’t come so I just lost it a bit, smashed the bus stop up a bit.’ Joe’s father and older brother both have a similar history of antisocial behaviour and offending.

Evaluate personality theory as an explanation of Joe’s offending. (16)

I encourage my students to adopt a four-question routine to set themselves up to address the demands of the problem:

  1. What do I know about this topic?
  2. What’s relevant in the context?
  3. What am I making judgements about?
  4. How can I justify those judgements?

This type of subject-specific metacognition is best taught by modelling, in my experience. The aim is for the students to understand that, in order to address the question satisfactorily, they need to form a principled judgement (AO3) of whether personality theory (AO1) is a valid explanation of Joe’s offending (AO2).

Students are then given sheets of A3 paper and three colours of sticky note (in my case, green, blue and orange). Pairs or threes work well. The visible thinking routine is as follows:

  • First, students recall as many facts as they can that represent the knowledge and understanding required to address the question. They write one fact per green sticky note. These are collected in the centre of the A3 sheet, arranging them such that more closely related ideas are grouped together on the page.
  • Second, the students read the context material carefully and look for specific things in the text that relate clearly to the facts/ideas on the green sticky notes. Each of these is noted on a blue sticky note and added to the sheet, near to the relevant facts but concentrically outward.
Different colours denote different skill elements/assessment objectives.
  • Third, the students are asked to identify material relevant to evaluating personality theory, for example, supporting or challenging research findings, conceptual strengths and weaknesses and so on. Each of these is added to an orange sticky note, again placed near to the relevant application (blue) and knowledge (green).

The students are encouraged to keep thinking and recalling more relevant facts, applications and evaluative points throughout the activity, as each point made and recorded may cue either recall of other material or provoke new links between the ideas, deepening understanding.

Lines of reasoning flow from the centre towards the edge.

The fact that all the ideas are present on the page reduces cognitive load, helps the students think more clearly and tells the teacher where they can most incisively intervene. The flexible nature of sticky notes allows the students to think and rethink by trying out different positionings and juxtapositions of ideas. The different colours allow the students to keep track of the different skill demands of the question, allowing them to spot gaps and deploy material effectively. By now the students should be in a position to trace lines of reasoning about the question by working from the middle to the edge.

  • The final step is for students to reorganise the sticky notes into a linear plan from which they could write their response. The different coloured notes help here, prompting the students to organise their writing into balanced paragraphs that address all the question requirements.
The linear plan supports thinking about the sequencing of material.

I’ve only recently started using this approach with my Year 13s in a consistent way. They have commented positively on how it is helping them keep track of task requirements and organise their ideas before writing. Of course, the long-term intention is to remove the physical placeholders and the prompts from the teacher that support the process, leaving a purely mental routine that the students can use independently and without prompting. My feeling is that the systematic withdrawal of the various elements will be a fairly straightforward thing to plan, and, at each stage, I can draw attention to what I’m removing and why (e.g. ‘last time I gave you three colours of sticky notes but this time I’m not…’) so that the students can establish a conscious rationale for their own thinking when approaching this type of problem.

It’s much more complicated to explain this approach than it is to do it in practice; I hope the accompanying photographs make this clear. I believe that it has the potential to help more of my students access the higher essay mark bands in their examinations. More importantly, I also believe that it can play a part in helping my students to become better thinkers in and about psychology.

Thanks

The concentric planning approach on which this VTR draws was developed collaboratively with Charlotte Hubble.

References

Putwain, D.W. & Symes, W. (2018). Does increased effort compensate for performance debilitating test anxiety? School Psychology Quarterly, 33(3), 482-491.

Ritchhart, R., Church, M. & Morrison, K. (2011). Making thinking visible: How to promote engagement, understanding and independence for all learners. San Francisco: Jossey-Bass.

Willingham, D.T. (2010). Why don’t students like school? A cognitive scientist answers questions about how the mind works and what it means for the classroom. San Francisco: Jossey-Bass.

Teaching effective revision strategies

I have declared a personal war on exam technique.

Actually, I haven’t. Familiarity with the format of an assessment is a significant influence on students’ performance. What I’ve declared war on is the use of ‘poor exam technique’ as an excuse for under-performance that is actually caused by students’ failure to learn the material on which they will be examined.

‘Exam Technique’ attributions

Confronted with evidence of failure, many students find the ‘exam technique’ attribution attractive because it allows them to sustain the belief that they are ‘bright’ and ‘a good student’. Most of the students I teach invest considerable time and effort in learning and preparing for tests/exams. Cognitive dissonance theory (Festinger, 1957) suggests that the thought, ‘I have done badly’ is incompatible with the thought, ‘I worked hard for this’. This gives rise to psychological discomfort. Consequently, the student is motivated to reduce the dissonance. This can be done by making a suitable attribution.

Three possible dissonance-reducing attributions are: (1) ‘I am not capable of learning’; (2) ‘I did the wrong things whilst learning’; and (3) ‘I had poor exam technique’. My suspicion is that (1) is unattractive because of its implications for self-image and (2) is unattractive because it implies the need to change longstanding beliefs and habits around learning and revision. That leaves (3), which preserves both positive self-image and entrenched learning habits by allowing the student to think, ‘It’s OK, I know this stuff really, it’s just my exam technique that let me down’.

I suspect this may also be true of some teachers, at least some of the time. Knowledge of a student’s failure is dissonant with our beliefs about our own teaching (most of us believe we are above average; Hoorens, 1993) and ‘exam technique’ usefully deflects doubts about whether the things we spend time and effort doing are actually working, especially since most of us (I believe) are apt to avoid attributing students’ failure to stupidity (cf. Dweck, 1999).

Like most teachers, I test my students fairly regularly, for a variety of reasons. I see relatively few examples of students’ performance being affected significantly by what I would characterise as exam technique (e.g. gross errors of time management, inappropriate application of material or misapprehension of question requirements). I wish it were otherwise, as problems of exam technique are, in my experience, relatively easy to fix. But, ultimately, problems of exam technique are reserved for students who actually know their stuff and, in the majority of cases, the core problem is that they don’t.

It’s students’ learning that needs fixing, not their exam technique.

Retrieval practice

There is now fairly unequivocal evidence that the learning strategy most likely to result in retention of material is retrieval practice, that is, the reconstruction, without prompts, of information previously learned and stored in long-term memory. Students who practise retrieving material from long-term memory forget less than those who do not (see this chapter by Karpicke, 2017, for a comprehensive review). Karpicke identifies several reasons why retrieval practice enhances learning and recall. First, retrieval practice is transfer-appropriate processing. That is, there is a large overlap between recall practice during learning and the way students will need to use the material in their exams. Second, the effort involved in retrieval strengthens memory traces. Third, retrieval practice incorporates retrieval cues into memory traces in helpful ways (semantic elaboration).

Although theoretical accounts of why retrieval practice works are still under development, the empirical support for its use is unarguable. A study by Roediger and Karpicke (2006) is fairly representative. Student participants were given unfamiliar material to learn across four study sessions. One group was told to study (i.e. read and reread) the material in all four sessions (SSSS). A second group studied the material in the first three sessions and, in the fourth, tested themselves instead by writing down as much of the material as they could remember in free recall (SSST). A third group was allowed to study the material only in the first session and then completed three free-recall tests (STTT). All the participants were then given a recall test, 5 minutes after the end of the final session and again after an interval of 1 week. After 5 minutes, students who had studied and restudied the material (SSSS) had higher recall than the other two groups. However, after 1 week, the STTT group had the highest recall, followed by SSST, with the SSSS group showing the lowest level of recall.

The problem of spontaneous adoption

This study, and the many confirmatory findings, demonstrates the superiority of retrieval-based learning over restudying for retention of material over the longer term. It also hints at why many of our students may fail to adopt retrieval-based revision methods even when advised to do so: immediate recall in Roediger and Karpicke’s study was better when the students ‘crammed’. Since a typical student probably doesn’t retest themselves over longer intervals in any systematic way, they remain unaware of how quickly they forget information that has been learned that way.

Ariel and Karpicke (2018) highlight a number of unhelpful beliefs that students (and teachers) often hold that militate against the adoption of retrieval-based study strategies. First, there is the belief that restudying is the most effective way of learning material. Second, there is the belief that, whilst retrieval is a suitable way of monitoring learning, it does not, in itself, benefit recall. Third, even when students do use retrieval-based methods, they tend to rely on a ‘one and done’ strategy, whereas the evidence is that repeated retrieval has the most significant impact on retention.

Ariel and Karpicke’s paper describes a study showing that a straightforward intervention increased the spontaneous adoption of retrieval practice in a group of student participants. They were given the task of learning English-Lithuanian word translations, using software that allowed them to choose between ‘studying’ (i.e. reading and rereading) and ‘practising’ (i.e. being tested). Participants were randomly assigned either to a control group, who were simply told to learn as many of the words as possible in preparation for a final test, or to a retrieval practice instructions group, who were given (1) information about the superiority of retrieval over restudying; (2) a graph supporting this information; and (3) the advice that the best way of learning for the recall test was to ensure that each translation had been recalled at least three times before dropping it from study.

Source: Ariel & Karpicke (2018)

Students who received the retrieval practice instructions made more spontaneous use of retrieval practice during learning and performed better on the Lithuanian translations than the controls. Importantly, in a transfer test given 1 week later, those who had received the retrieval instructions made significantly more use of self-testing on a task involving learning English-Swahili translations.

A card-based revision strategy

I was sufficiently impressed by these results to use them as the basis of an attempt to improve my students’ use of effective learning and revision strategies. I used ‘statistical test choice’ as the focus since it is a small and discrete body of material, it is straightforward to test both recall and transfer of learning, and it is something my Year 12 students had not encountered before. I taught the content in a conventional way. Then, after explaining and justifying the revision strategy I wanted them to use, I gave each student a set of revision cards for statistical test choice. These are set up so that, when photocopied back-to-back, there is a question on one side of each card and the relevant answer on the other.

I explained that revision with these cards should be done as follows (the strategy is closely based on the one designed by Ariel and Karpicke):

  1. Create space on your desk for three piles of cards: STUDY, PRACTICE and DONE.
  2. Start by testing yourself on every card.
  3. If you can answer a question fully and accurately, put it on the PRACTICE pile. If you cannot, put it on the STUDY pile.
  4. Alternate between STUDY and PRACTICE. Any card you have studied should be put on the PRACTICE pile. Any card you have successfully retrieved should be returned to the PRACTICE pile. Any card you have been unable to retrieve should be returned to the STUDY pile.
  5. If you have successfully retrieved a card three times, put it on the DONE pile.

During the ensuing study session, cards should gradually work their way across from the STUDY pile to the DONE pile.
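The mechanics of the routine can be sketched in code. The following is a minimal simulation and not part of the original materials: the `revise` function, the `knows` callback and the card names are all hypothetical, and the act of ‘studying’ a card is collapsed into simply returning it to the PRACTICE pile.

```python
from collections import deque

def revise(cards, knows):
    """Simulate the three-pile revision routine.

    cards: list of card identifiers.
    knows: callable(card, attempt_number) -> bool, a stand-in for the
           real act of self-testing; True means the student retrieved
           the answer on this attempt.
    Returns (total retrieval attempts, cards in the DONE pile).
    """
    practice = deque(cards)            # step 2: start by testing every card
    study = deque()                    # cards to re-read before retrying
    successes = {card: 0 for card in cards}
    attempts = 0
    done = []
    while practice or study:
        # Step 4: any card that has been studied goes back to PRACTICE.
        while study:
            practice.append(study.popleft())
        card = practice.popleft()
        attempts += 1
        if knows(card, attempts):
            successes[card] += 1
            if successes[card] >= 3:
                done.append(card)      # step 5: three successes -> DONE
            else:
                practice.append(card)  # retrieved: back to PRACTICE
        else:
            study.append(card)         # step 3: failed: onto STUDY
    return attempts, done
```

With a student who retrieves every card correctly, each card takes exactly three attempts to reach DONE, which matches the ‘recalled at least three times’ criterion from Ariel and Karpicke’s instructions.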

I demonstrated this process, and then got the students to try it. I circulated and watched how they went about it, coaching where necessary. Over the course of the lesson I gave them opportunities to use the revision strategy. In subsequent lessons, I tested their recall using this Socrative quiz, which tests recall of statistical decision rules and has no applied element. I asked the students to use the revision cards for 20 minutes before their next lesson.

Here are the quiz results at the start of the next lesson (the following day):

The majority of students had 100% recall, although some students either had not acquired the material or had forgotten it very quickly. At the end of the lessons, the quiz was repeated:

Recall was higher; student 10 went from 13% to 100% correct. The quiz was repeated after a four-day interval:

Interestingly, whilst the majority of the students retained 100% recall, student 10’s recall had fallen to 38%. It is interesting to speculate whether this was due to individual differences in memory or to differences in strategy adoption. At the end of the lesson, recall looked like this:

Student 10’s recall had recovered, and, overall, recall was very high (3 incorrect responses in 120 recall trials).

What have I learned?

My informal investigations with my Year 12s suggest that the card-based revision strategy using retrieval practice is at least as effective as what the students were already doing. Their reactions to the Socrative assessment feedback suggested that they appreciated the impact the strategy was having on their retention. They also found the card-based strategy acceptable and even fun, particularly if they added a social element.

This is all quite encouraging, so I have now started investigating whether the strategy transfers well to less well-structured material. Studies in this area typically use very well-structured material as it’s easy to test recall unambiguously, so it is somewhat open to question whether this card-based strategy requires adapting for use with less well-structured content. I have created a set of revision cards for learning the classic study by Baddeley (1966), which is a requirement of the Edexcel specification and one on which my students performed poorly in their recent end-of-year examination. It will be interesting to see whether it has a similar impact, and whether the students find it as acceptable for this sort of content.

Assuming that it works, my intention is to develop the card-based revision strategy with my Year 12s over the remainder of their course. The aim will be to shift the students from relying on me to make the revision cards to spontaneously creating and using their own as part of their ongoing preparations for their final exams. Depending on how this works out, I would consider adding the card-based strategy to our induction programme at the start of Year 12, alongside the other elements we currently promote as essential, including reciprocal teaching and the Cornell note-making system.

Thanks

Many of the ideas for this post came out of conversations with Andy Bailey.

References

Ariel, R. & Karpicke, J.D. (2018). Improving self-regulated learning with a retrieval practice intervention. Journal of Experimental Psychology: Applied, 24(1), 43-56.

Baddeley, A. D. (1966). The influence of acoustic and semantic similarity on long-term memory for word sequences. The Quarterly Journal of Experimental Psychology, 18(4), 302–309.

Dweck, C.S. (1999). Self Theories: Their Role in Motivation, Personality and Development. Hove: Psychology Press.

Festinger, L. (1957). A Theory of Cognitive Dissonance. Evanston, IL: Row Peterson.

Hoorens, V. (1993). Self-enhancement and superiority biases in social comparison. European Review of Social Psychology, 4(1), 113–139.

Karpicke, J.D. (2017). Retrieval-based learning: A decade of progress. In J.H. Byrne (Ed.), Learning and Memory: A Comprehensive Reference (2nd ed., pp. 487-514). Oxford: Elsevier.

Enhance learning and revision with mixed retrieval practice

Image: Eli the Bearded

When given the task of learning or revising material for examination purposes, the majority of our students adopt methods that are sub-optimal. Casual observation suggests they typically use a cramming strategy based on reading and re-reading textbooks or class notes. This strategy does not produce much learning, but repeated exposure to the material does create a sense of familiarity. Consequently, whilst it does little to actually prepare the student, it creates an illusion of learning whereby they think they know the material much better than they actually do. Needless to say, this results in wasted effort, poor performance and disillusionment.

However, while it’s easy enough to tell our students to revise differently, it’s much harder to actually get them to do it. There’s a lot to fight against: our students may be drawing on study habits built up over many years, and they are often inattentive to advice about studying until they hit some sort of crisis point. In addition, they may have been given relatively little specific guidance in the past on what to do when they want to learn something, and the advice they have received may be inconsistent and not evidence-informed.

Cognitive psychology gives us some evidence-informed guidance we can pass on to our students. One principle is that revision must involve retrieval practice, that is, recreating from memory, without prompts, information that was previously learned (see this blog post from The Learning Scientists for more). A second principle is that practice should be mixed. In other words, the questions used for retrieval practice should be drawn from a variety of contrasting areas rather than all coming from the same area (which would be ‘blocked practice’).

It is probably a good idea for us to inform our students of the benefits of mixed retrieval practice and supply them with resources that support it. Here are some resources intended to do this. Each is a PowerPoint slideshow consisting of 250+ slides, each containing a single question about Psychology. The order of the questions has been randomised, so the student never knows what’s coming next. The idea is that the student picks a starting point and then works through the questions in series, producing either an oral or written answer before moving on. Where they can’t answer, the student should make a note of the problem area for further study. The question sets cover Paper 1 and Paper 2 of the Edexcel Psychology specification; they would need some adapting for other courses. There is also an instruction sheet for students.

If you want to make your own mixed practice sets, this PowerPoint template contains a macro to randomise the order of the slides. Add as many questions as you want and then:

  • Press ALT+F8 to bring up the macro dialogue box.
  • Select sort_rand.
  • Click Run.

PowerPoint will then reorganise your slides randomly. I generally run the macro a few times, as I’m not convinced the first pass produces a thorough shuffle. With large question sets it will take a minute or so to work, so don’t panic if PowerPoint stops responding for a bit.
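If you'd rather prepare question decks outside PowerPoint, the randomisation itself is simple to sketch in Python. The questions below are invented placeholders, not items from the actual question sets. (For what it's worth, Python's `random.shuffle` uses the Fisher-Yates algorithm, so a single pass already yields a uniformly random order.)

```python
import random

def make_mixed_deck(questions, seed=None):
    """Return a randomly ordered copy of a question list.

    random.shuffle performs a Fisher-Yates shuffle, so one pass is
    enough to produce a uniformly random ordering.
    """
    rng = random.Random(seed)  # a seed makes the deck reproducible
    deck = list(questions)     # copy so the source list is untouched
    rng.shuffle(deck)
    return deck

# Illustrative questions only (not taken from the real question sets)
questions = [
    "Define 'retrieval practice'.",
    "Name one limitation of Baddeley (1966).",
    "What is a Type I error?",
    "Outline the multi-store model of memory.",
]
deck = make_mixed_deck(questions, seed=42)
```

Printing `deck` gives the same four questions in a shuffled order; omitting the seed gives a fresh order on every run, which is closer in spirit to mixed practice.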

Two notes of caution. First, the majority of studies demonstrating the superiority of mixed practice have used relatively basic tasks involving recall of discrete, concrete concepts in well-structured domains (e.g. arithmetic, vocabulary learning), so there is relatively little direct evidence of its efficacy with more complex and abstract material. The contextual interference effect, which underlies the advantage of mixed over blocked practice, appears to diminish as material becomes more complex (Magill & Hall, 1990). It is therefore probably wisest to use mixed retrieval practice primarily as a way of boosting students’ factual recall of fairly discrete ideas. That said, Blasiman (2017) provides experimental support for mixed retrieval practice with introductory Psychology concepts in university students.

Second, mixed practice is more difficult than blocked practice and results in more errors. Consequently, students using it may feel that they are learning less with this approach, which may cause them to shift back to blocked practice and cramming. They need to be warned about this, and you’ll need to keep encouraging them. It might be an idea to organise a classroom demo experiment so they can see the benefits for themselves.

Brown, Roediger & McDaniel (2014) give a comprehensive but very accessible account of what cognitive psychology can tell us about learning in education (including mixed practice) in ‘Make It Stick’, which I recommend if you haven’t read it.

Blasiman, R. (2017). Distributed concept reviews improve exam performance. Teaching of Psychology, 44(1), 46-50.

Brown, P.C., Roediger, H.L., & McDaniel, M.A. (2014). Make it stick: The science of successful learning. Cambridge, MA: Harvard University Press.

Magill, R. A., & Hall, K. G. (1990). A review of the contextual interference effect in motor skill acquisition. Human Movement Science, 9, 241-289.

Better evaluation with spectacles

One way of developing students’ evaluation of research studies is to use the ‘spectacles’ activity. I got it from Geoff Petty’s (2009) ‘Evidence-Based Teaching: A Practical Approach’, which I recommend. It’s a variant of the jigsaw approach, except that in ‘spectacles’ students are already familiar with the material they are working with (unlike in jigsaw, where they are typically encountering material for the first time). Students start in small groups, each thinking about the material in a different way. Each way is presented as a different pair of spectacles that brings a different aspect of the material into focus. The students are then rearranged into mixed groups, where they share their insights with each other in a co-constructive manner.

I most commonly use it when students are developing evaluation of research studies, particularly the key ‘classic’/‘contemporary’ studies required by Edexcel’s Psychology specification. Students are required to read about the studies in advance. The five ‘spectacles’ groups correspond to the GROVE evaluation criteria I use (Generalisability, Reliability, Objectivity, Validity and Ethics). Generally, 10 minutes in ‘spectacles’ groups followed by 15 minutes in ‘sharing’ groups seems to work well for my students but, obviously, YMMV. As a follow-up I often give out an A3 summary sheet where students can compile an overview of the whole study for revision purposes. Here are a couple of these, for Raine et al. (1997) and Howells et al. (2005).

Provided that students remain directed towards developing a shared understanding rather than simply dictating and copying ideas, it’s an approach with few downsides. See my previous post on Jigsaw for more background.

Petty, G. (2009). Evidence-based teaching: A practical approach. Cheltenham: Nelson-Thornes.

Teaching evolutionary concepts using a bowl of sweets

This lesson is how we lost our ‘Healthy Schools’ accreditation.

Evolutionary theory is an area where students often come to us with misunderstandings. The ‘bowl of sweets’ analogy is a handy way of providing a memorable concrete model of natural selection. The approach described here is based on an original lesson on natural selection by Carol Tang. I’ve adapted it in ways that make it quicker and more to the point: I’m rarely trying to teach the concepts of natural selection ex nihilo, and more usually trying to assess how well my students acquired the relevant ideas earlier in their education and fix things if necessary, so it’s more of an ‘entry check’.

Prepare by putting some sweets in a bowl. There needs to be a variety; I usually include Lindors, Starburst, Skittles and licorice. The licorice is important because most students don’t like it. Whatever you use, check beforehand that it’s medically (e.g. nut allergies) and culturally (e.g. kosher/halal) safe for your students. Put an equal number of each type of sweet in the bowl, making sure there are enough that about half the sweets will be left after every student has taken one (i.e. roughly twice as many sweets as students).

This activity assumes either that you know your students have learned about natural selection previously or you have set some advance study on it.  At the start of the class, pass the bowl round, inviting the students to take one each. I teach this class near the end of Autumn term so I say it’s because Christmas is coming up. When you want to start discussing evolution, invite the students to gather round while you explain that you put a known number of each type of sweet in the bowl and you’re interested to see what’s left. Tip the bowl out onto a sheet of paper and separate them out, counting how many of each are left (typically for my students, all the Lindors are gone, about half the chews remain and all the licorice is left).

At this point, tell them that the bowl of sweets can be used as an analogy for the process of natural selection and ask them to consider why. From there you can develop a discussion of evolutionary concepts. Questions I generally find useful (obviously, it depends what they come up with) include:

  • How could the bowl of sweets represent an evolutionary process?
  • Which is the fittest sweet? (Answer usually Lindor.) What makes you say that?  What about if we look at it from the sweet’s point of view?
  • What do the sweets represent?
  • What do you (the class) represent?
  • What are the traits that help a sweet survive in this environment?

In my experience, the analogy of the bowl of sweets provides a useful bridge between the abstract ideas underlying natural selection and the usual exemplifications, most of which seem to involve moths. It’s a rewarding activity as it almost always provides lots of those ‘penny drops’ moments when students suddenly get what it’s all about.

Here are some resources for a lesson on evolutionary explanations of aggression with this demo as an element.  It starts with some definitional stuff around defining and classifying aggression.  Then comes the bowl of sweets demo.  Subsequently there is a transfer activity and a Socrative quiz on evolutionary misconceptions.  There is a slideshow to support the activities.

 

A demonstration practical: correlation between digit ratio and aggression

Source: wikimedia.org

It’s blindingly obvious that students will learn things better if we model them first (see Rosenshine, 2012), and most of us are in the habit of modelling all sorts of things, including the sorts of thinking and writing skills that Psychology requires. However, with the recently increased emphasis on practical skills at A-Level (in Edexcel’s specification, anyway) I’ve found myself planning for lots of practical work, and it occurred to me that I’ve never modelled the whole process of a practical investigation for my students. Bits of it, yes, but not the whole thing. On reflection, that strikes me as a bit of an oversight. Here is an attempt to put that right. The aims are twofold: (1) to show, all in one go, the steps involved in carrying out a practical investigation, so that students have an overview of what they will need to do and how it all fits together; and (2) to model good research practices and set appropriate expectations about ethical conduct during research. It is based around a practical investigation that can be done in 45-60 minutes, depending on the size of the group: a correlational study of the relationship between D2:D4 digit ratio and aggression. There’s a lesson plan, a slideshow, a PBAQ-SF questionnaire for measuring aggression, an Excel spreadsheet for analysing the results and a sheet for students to record their observations during the demo. I’ve also written an example report, which is pitched for students studying the Edexcel specification (users of other specifications, YMMV).
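For teachers who want to show students what the spreadsheet is doing under the hood, the Pearson correlation at the heart of the analysis can be sketched in a few lines of Python. The digit ratios and aggression scores below are invented for illustration, not data from the actual demo.

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient:
    covariance of x and y divided by the product of their
    standard deviations."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: D2:D4 digit ratios and aggression questionnaire scores
ratios = [0.93, 0.95, 0.96, 0.98, 1.00, 1.01]
scores = [78, 74, 70, 65, 60, 58]
r = pearson_r(ratios, scores)
```

With this made-up data, `r` comes out strongly negative (higher ratios paired with lower aggression scores), which is handy for prompting students to interpret the direction as well as the strength of a correlation.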

Rosenshine, B. (2012). Principles of instruction: research-based strategies that all teachers should know. American Educator, Spring 2012.  

Improving assessment with a single-point rubric

Source: www.cultofpedagogy.com

I’ve started using single-point rubrics for assessing and feeding back on essays since coming across them on www.cultofpedagogy.com. This post has a nice summary of the benefits, which I won’t repeat here.

Here are a couple of essay questions and single-point rubrics designed to develop and assess critical thinking and writing skills in line with Edexcel’s Psychology specification. They are both ‘context’ questions requiring a combination of analysis/application, critical thinking, and knowledge and understanding. I’ve tried to construct them to facilitate the sort of structure that works with Edexcel (but which is also consistent with good academic writing). There is one on different types of brain scanning/imaging and another on eyewitness testimony (weapons effect, post-event information). These are RTFs, so you can hack them about to make your own. If you do, please share in the comments.

Action potential GIFs

Soon, action potential memes will be everywhere.

I needed to use this animation, which I made in PowerPoint, but I wanted to embed it as a GIF in a Google Slides deck, because I use Google Suite for pretty much everything (what I lose on the bells and whistles I make back on portability; I’m currently running my classroom off my phone). It turns out this is possible, but it’s a bit involved. In case you want to do it: I recorded the animation off the screen using Bandicam to create an .avi file, which I edited in Microsoft Movie Maker and exported as a .wmv file. I then uploaded this to Ezgif to create an animated GIF.

In principle, this should embed pretty much anywhere. However, I discovered, in the course of an hour-long experiment, that animated GIFs don’t actually animate in a Google Slides deck if the source image is stored in Google Drive. I have no idea why. I therefore had to upload these GIFs to my own server and then use the URLs to embed them in the Google Slides. So this post is primarily for the benefit of those who run into the same problem as me and are frustratedly Googling for an answer. In any case, the GIFs ended up on the Psychlotron server, so I thought I might as well share. Here’s a slowed-down version, too.

Right click to save them.  If you want to embed them in your own Google Slides then use the image URL.

 

Teaching eyewitness testimony (and many other things) using the jigsaw approach

Image by Jared Tarbell; used under Creative Commons license.
An oblique approach to image choice would add subtlety but, frankly, it’s been a long week.

I’m a big fan of the jigsaw classroom (Aronson et al., 1978), to the point where I probably overuse it. If you’re not familiar with it, it’s a cooperative learning format in which students learn part of a topic so they can teach it to others and, in turn, are taught the other parts by them. The aim is that all the students end up learning the whole topic. The students are organised into ‘jigsaw’ groups. Each jigsaw group is then split up and rearranged into ‘expert’ groups, each of which is given responsibility for mastering one part of the topic. The expert groups then return to their jigsaw groups, where the students teach each other. There’s a good guide to the jigsaw technique here.

When it’s done well, jigsaw promotes a high degree of interdependence amongst learners and exposes all the students to the material to be learned, both of which contribute to its effectiveness as a psychology teaching strategy (Tomcho & Foels, 2012). Compared to non-cooperative methods (i.e. those that do not require interdependence), techniques like jigsaw provide more effective learning of conceptual knowledge, a greater sense of competence and more enjoyment of learning. This is particularly so when the activity is highly structured, with assigned roles, prompts for self-reflection, and both individual and group feedback on performance (Supanc et al., 2017).

When I use it I like to keep group sizes to a maximum of four. If you have 16 or 32 students in a class that’s great because you can divide the material into four and have four students in each jigsaw/expert group. A group of 25 also works well, with the material divided into five parts. It can be a headache to assign groups when you have inconvenient numbers of students so you need to plan ahead and think about how you will ensure that every student learns all the content.
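The group arithmetic above can be sketched as a short helper: assign student i to jigsaw group i // k and expert topic i % k, where k is the number of parts. The topic names and student labels below are hypothetical placeholders.

```python
from collections import defaultdict

def assign_jigsaw(students, parts):
    """Assign each student a jigsaw group and an expert topic.

    When len(students) is an exact multiple of len(parts), every
    jigsaw group contains exactly one expert per topic. Otherwise
    the final group is short of some topics and the teacher will
    need to patch the gaps by hand.
    """
    k = len(parts)
    jigsaw = defaultdict(list)   # jigsaw group -> [(student, topic), ...]
    experts = defaultdict(list)  # topic -> [students in that expert group]
    for i, student in enumerate(students):
        group, topic = i // k, parts[i % k]
        jigsaw[group].append((student, topic))
        experts[topic].append(student)
    return dict(jigsaw), dict(experts)

# Hypothetical class of 16 with four expert topics: 4 jigsaw groups
# of 4, each containing one expert per topic
parts = ["topic A", "topic B", "topic C", "topic D"]
students = [f"S{i}" for i in range(16)]
jigsaw, experts = assign_jigsaw(students, parts)
```

Running this with 25 students and five parts gives the other convenient case from the text: five jigsaw groups of five, each with a full set of experts.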

In my experience, the jigsaw approach works best when:

  • You stress that the activity is all about understanding what they are learning, and remind students throughout of their responsibility for both teaching and learning the material. The danger is that it can easily become an ‘information transfer’ exercise, with students copying down material verbatim and dictating to each other without understanding. It is sometimes useful to impose rules to prevent this (e.g. limiting the number of words students are allowed to use when making notes in their expert groups, or only allowing them to draw pictures).
  • The learning material is tailored to the students. This means adjusting the difficulty/complexity level of the material to be just difficult enough so that the students need to engage with it and each other to co-construct an understanding. Too difficult and they can’t do it; too easy and it becomes trivial; either way, they lose interest.
  • The learning material is tailored to the timescale. Again, we want the students to create meaning from the materials and this takes time. If too little time is given then either some of the material won’t get taught, or students will resort to ‘information transfer’ and there will be no co-construction.
  • You actively monitor what’s going on in the groups, particularly the expert groups. This is how we moderate the difficulty of the materials. We don’t want the students teaching each other things that are wrong. At the same time, it’s important not to just charge in and instruct the learners directly. Doing that undermines the point of the approach. In any case, I wouldn’t use jigsaw to teach fundamental concepts for the first time; it’s just too risky. I prefer to use it to elaborate on and deepen understanding of ideas.
  • You have an accountability mechanism (i.e. a test). Multiple choice/online assessment is quick and effective if the test items are well written. Plickers and Socrative are useful tools for this. One approach that can work here is to tell the students that everyone will do the test but that each student will receive the average mark for their jigsaw group. This creates an incentive for students to ensure that everyone in the group does well (although it also creates an incentive to blame people if the group does badly, so YMMV).

Here’s a set of materials for teaching some of the factors that moderate the misinformation effect on eyewitness testimony using the jigsaw method. This is for a one-hour lesson with a 10-15 minute expert groups phase and a 15-20 minute jigsaw groups phase. There is a slideshow that structures the lesson and a set of learning materials covering the moderating effects of time, source reliability, centrality and awareness of misinformation. You can extend the activity by prompting students to evaluate the evidence offered.  If you are a Socrative user (free account with paid upgrades) you can get the multiple choice quiz using this link. As with all these approaches, there is no guarantee that it’s superior to the alternatives but the available evidence suggests it is worth trying.  And, like everything, its effectiveness is likely to grow when both teacher and students are practised in the technique.

Aronson, E., Blaney, N., Stephan, C., Sikes, J., & Snapp, M. (1978). The Jigsaw Classroom. Beverly Hills, CA: Sage.

Supanc, M., Völlinger, V.A. & Brunstein, J.C. (2017). High-structure versus low-structure cooperative learning in introductory psychology classes for student teachers: Effects on conceptual knowledge, self-perceived competence, and subjective task values. Learning and Instruction, 50, 75-84.

Tomcho, T.J. & Foels, R. (2012). Meta-analysis of group learning activities: Empirically based teaching recommendations. Teaching of Psychology, 39(3), 159-169.