I’ve got a couple of new posts up on the [UTS Futures blog on making learning visible]1. The core idea is that we need ways to make visible, to both students and teachers, whether students have grasped what they’re learning. The aim isn’t to assess everything, but to foreground key concepts and skills, and to give students opportunities to practice applying them and to receive formative feedback. This goes hand-in-hand with supporting students in being accountable for their own learning: they know what they should be doing, they take responsibility for doing it, and they reflect on how their learning is going so they can apply feedback and seek additional support where needed.

To give one example, this kind of approach – common in school classrooms now, but perhaps less so in higher education – suggests that rather than asking for individual volunteers to answer questions, we use ‘whole-class’ response systems, whether via a quizzing tool or analogue approaches (mini-whiteboards, voting with hands, etc.). These push all students to engage (they’re accountable), and give both student and instructor feedback (learning is visible). (NB: this assumes well-designed questions, and isn’t to say that questioning individual students and modelling discussion with them isn’t useful, even in large lectures and classes.)

# Part one: Making formative quizzes matter

You’ve designed some tasks to support learning the key concepts and skills in your subject, and talked students through what they need to know. But how do you, and your students, check that they’ve understood? Last semester, in the undergraduate elective [Arguments, Evidence, and Intuition]2, I trialled an approach to make this learning more visible, in which we:

* Ran (almost) weekly quizzes through Google Forms, where the content of each quiz was (a) directly related to the topics to be learnt that week, and (b) tied explicitly to a written assignment due mid-semester.
* Based the quizzes on working with real data (more on this in a future post) and authentic examples – for example, looking at NSW housing data, just as students would need to do in their mid-semester submission.
* Ended each quiz with three important (optional) ‘minute paper’ style questions:

Q1: Imagine you were writing your report based on this quiz dataset. Write a short paragraph that highlights some of the key claims you’d make to create a data story based on the analysis above.

Q2: Have you learnt anything new this week? Tell us about it!

Q3: Is there anything you’re still unsure about or would like us to discuss more? Or any other feedback on this week? Let us know!

These responses were reviewed each week and used to calibrate future activity, and in some cases to contact individual students.

# Q1: Imagine you were writing your report based on this quiz dataset…

Analysis of responses to this question allowed me to draw out examples of student writing and provide whole-cohort feedback on them. The examples were also used as reference exemplars when students were asked to peer assess the preliminary analyses for their own assignments, drawing attention to three types of response: (1) responses that describe the data but don’t interpret it; (2) responses that provide commentary without reference to the data; and (3) responses that effectively integrate data into critical interpretation.
We made use of these as part of a peer review guidance sheet, which provided the examples alongside feedback, to support students in thinking about their own data stories and to prompt whole-class and peer discussion. You can download the guidance sheet here: [Writing with and about numbers]3.

# Q2: Have you learnt anything new this week? and Q3: Is there anything you’re still unsure about?

These two questions in particular gave feedback on what was working to support learning, and on where there might be misconceptions or gaps in knowledge that needed addressing. For each of the topics covered, far more students flagged that they had learned the content than that they required further support. Some students made general comments indicating that material had been learned or that further support was required, and in a couple of cases students gave useful feedback on the accessibility of resources (small URLs on a projector, poor colour choices for accessibility).

The responses also led to interesting follow-up. Some students noted the importance of ethical use of data and the need to scrutinise data – great stuff! Others said things like “how easy it is to manipulate [data] to depict what you want”, or “how to trick people with statistics”. These kinds of response prompted useful discussion in class about the need to be critical of statistics without becoming so sceptical that all uses of statistics are assumed to be equally untrustworthy.
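If you run the quizzes through Google Forms, the responses can be exported as a spreadsheet/CSV, which makes the weekly review quick to script. Here’s a minimal sketch of that triage in Python, assuming a hypothetical export with columns named `email`, `q2_learned`, and `q3_unsure` – match these to whatever your own form produces:

```python
import pandas as pd

# Load the weekly quiz responses exported from Google Forms.
# Column names here are illustrative; rename to match your own form.
responses = pd.read_csv("week5_quiz_responses.csv")

# The minute-paper questions were optional, so drop empty answers.
learned = responses["q2_learned"].dropna()
unsure = responses["q3_unsure"].dropna()

print(f"{len(learned)} students flagged something they learned this week")
print(f"{len(unsure)} students flagged something they're unsure about")

# Read the 'unsure' comments in full -- these drive next week's session,
# and flag students who may need individual follow-up.
for _, row in responses.loc[unsure.index].iterrows():
    print(f"- {row['email']}: {row['q3_unsure']}")
```

The counts give an at-a-glance sense of whether a topic landed; the free-text comments are what actually drive the follow-up.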

# Other tools…

If you’re interested in taking a similar approach but want something really simple, one method lots of academics use is an [interactive poll (‘feedback in a flash’)]4. You might also like to explore the previous post [Are weekly interactive quizzes the answer?]5, in which Natalie Krikowa talks through her approach to entry quizzes with questions that:

  1. Review – what did we do last week?
  2. Raise – what are we doing this week?
  3. Reinforce – what should I be remembering?

# Part two: Structuring opportunities for practice

In my previous post I talked about how formative quizzes can be used to foreground misconceptions and provide opportunities to practice towards a much bigger written assignment. But how do we tie week-to-week learning and formative feedback together to support students in completing their assignments?

# Approaches

In the last post I talked about how quizzes with some ‘minute paper’ style questions could help to make learning visible to students and academics. The other part of this approach was to:

  1. Introduce an assignment template with an appendix component. In the appendix we asked the students to use the same statistical procedures they were learning week-to-week, but on a dataset that they chose themselves. Each section of the appendix indicated what they should include, when they could do it by (tied to the weekly quizzes), and what kind of data story each analysis helps us to write (a sketch of the kind of analysis one appendix section called for appears after this list).
  2. Two weeks before the deadline, students completed a peer assessment of the written assignment that involved reviewing the completion of the appendix. This encouraged students to get their analysis done in plenty of time, leaving two weeks to write the report, and gave them an opportunity to discuss the data story they intended to write, supported by the example data-story paragraphs extracted from the quizzes.
  3. The students were strongly incentivised to complete the appendix, because their mark was capped if they didn’t (happily, a penalty that never had to be applied).
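To make the appendix idea concrete, here’s a minimal sketch of the kind of analysis a single appendix section might call for (frequencies/proportions, plus a scatterplot), using pandas and matplotlib. The dataset, file name, and column names are hypothetical stand-ins for whatever data a student chooses:

```python
import pandas as pd
import matplotlib.pyplot as plt

# A hypothetical student-chosen dataset, e.g. suburb-level housing data.
df = pd.read_csv("my_chosen_dataset.csv")

# Appendix section A: frequencies and proportions for a categorical variable.
counts = df["dwelling_type"].value_counts()
proportions = df["dwelling_type"].value_counts(normalize=True)
print(pd.concat([counts, proportions], axis=1, keys=["n", "proportion"]))

# Appendix section B: a scatterplot relating two continuous variables --
# only appropriate when both variables really are continuous.
df.plot.scatter(x="median_income", y="median_rent")
plt.title("Median rent vs median income by suburb")
plt.savefig("appendix_scatter.png")
```

Because every student runs the same procedures, just on different data, inappropriate uses (e.g. a scatterplot of a variable that isn’t really continuous) become easy to spot across the cohort.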

The idea of this structure was to:

  1. Make students accountable for using their skills and knowledge to apply a particular set of analyses.
  2. Make learning visible by having these analyses in common across assignments, thus highlighting any misconceptions.
  3. Make really explicit the opportunity that the quizzes and peer assessment provided to practice towards the written assignment and receive formative feedback.

I’m particularly excited by the second point. It certainly isn’t the case that all students suddenly understand(!), but because they had to make an attempt, we have a much clearer idea of where the gaps in understanding are. For example, I recently analysed a random 25% sample of the assignments from Summer 2017 and Autumn 2018. Across both sessions the vast majority of assignments included frequencies, proportions, and the like. However, to give one example, only 19% of Summer assignments included scatterplots, compared with 86% in Autumn; and of those, only 17% and 36% respectively used them appropriately. That suggests more students learnt the skill, and the common analyses helped us spot a widespread misconception.
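If you’d like to run a similar audit on your own cohort, here’s a minimal sketch, assuming you’ve hand-coded a random sample of assignments into a spreadsheet with hypothetical columns `session`, `has_scatterplot` (0/1), and `scatterplot_appropriate` (0/1):

```python
import pandas as pd

# Hand-coded sample: one row per assignment in the random sample.
# Column names are illustrative; code whatever analyses you care about.
coding = pd.read_csv("assignment_sample_coding.csv")

for session, group in coding.groupby("session"):
    used = group[group["has_scatterplot"] == 1]
    pct_included = 100 * len(used) / len(group)
    # Judge appropriateness only among assignments that used the analysis.
    pct_appropriate = 100 * used["scatterplot_appropriate"].mean() if len(used) else 0
    print(f"{session}: {pct_included:.0f}% included a scatterplot; "
          f"{pct_appropriate:.0f}% of those used it appropriately")
```

The hand-coding is still the slow part, but scripting the tallies makes it easy to compare sessions, or to add further analyses to the audit later.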

# Try for yourself

Here are some documents you can use if you’d like to try this approach in your class.

Download the guide here, and the peer discussion worksheet here.

# Footnotes

  1. https://futures.uts.edu.au/blog/author/simon/

  2. http://handbook.uts.edu.au/subjects/36201.html

  3. https://s3-ap-southeast-2.amazonaws.com/wordpress-futures-prod/static/2018/10/17130436/Writing-with2c-and-about2c-numbers.docx

  4. https://futures.uts.edu.au/blog/2018/09/12/power-polling-feedback-flash/

  5. https://futures.uts.edu.au/blog/2018/02/05/weekly-interactive-quizzes-answer/