
The UM Assessment Seminar: a Teach-meet special

On Friday 13 April 2018, EDLAB organized an Assessment Seminar for teachers of Maastricht University. While the Tapijn area was under construction, teachers were having equally constructive discussions about assessment practices in the PBL context. About 60 teachers joined this special edition of the UM Teach-meet to deepen their knowledge of several assessment-related topics.

The seminar kicks off with an insightful keynote speech by Professor of Education Cees van der Vleuten (FHML/SHE) about current assessment practices in a PBL context. Van der Vleuten addresses a notable lack of alignment between learning objectives and the educational philosophy. He explains that in the current higher education system, learning objectives are defined as competencies or 21st-century skills. In practice, however, 21st-century skills such as team functioning, communication and professionalism are sparsely assessed; the dominant assessment method remains summative end-of-block assessment. Van der Vleuten highlights the negative effects this method can have on student behavior: “We see a lot of poor learning styles, grade hunters, competitiveness between students, and grade inflation”. He also addresses the issue of reductionism, which results from, among other things, a lack of feedback, poor alignment with curricular goals, missing longitudinal elements, and tick-box exercises. Another phenomenon linked to summative assessment is the forgetting curve: “If students pass their assessment, they are not inclined to learn from the provided feedback. As a result, they quickly forget a substantial part of the acquired knowledge.” These issues leave us with one main question: how should we deal with individual assessment?

The assessment of competencies requires a real-life setting and involves the judgement of a professional. But if performance is inconsistent across different contexts, how can one guarantee the validity and reliability of more subjective assessment methods? Van der Vleuten argues that objectivity is not the same as reliability: many subjective judgements are in fact well reproducible and reliable. Good assessment practice requires a mixture of standardized and unstandardized assessment, combined with proper longitudinal feedback. He illustrates this with the example of the Physician-Clinical Investigator Master’s programme, where teachers and students work with Programmatic Assessment.

This assessment method allows students to develop their skills in a professional setting in a constructive way. Throughout the programme, students build an individual portfolio (E-Pass) and are guided by a coach whom they meet on a regular basis. Essential built-in elements are the regular midterm tests, in which students are assessed on their progress, combined with personal feedback sessions with a coach or counsellor. Van der Vleuten emphasizes that providing feedback alone is not sufficient: feedback should be a dialogue and requires the involvement of both the student and the tutor. The strength of this formula is its attention to the individual learning journey; it puts assessment in a whole new perspective. Van der Vleuten concludes by stating that the often-heard mantra ‘Assessment drives learning’ is in fact reversed: learning drives assessment.

Parallel Sessions

For this special edition of the UM Teach-meet, EDLAB invited three assessment experts to give workshops on different assessment-related topics.

Dr. Carlos Collares (FHML/SHE) shared his knowledge about the psychometrics behind computerized adaptive testing (CAT). He explained how CAT algorithms work and how CAT is applied to progress testing at UM’s FHML as a strategic tool for assessment of, for and as learning, with a high degree of validity and reliability.

The second parallel workshop was facilitated by Elissaveta Radulova (FASoS). She explained the basics of Constructive Alignment, a method applied in instructional design to optimize the coherence between curricular goals and assessment goals. She focused on the use of well-formulated learning outcomes, a requirement for Constructive Alignment.

The third parallel workshop was given by Dr. Donna Carroll (EDLAB). She explained the selection of assessment criteria in a creative and engaging manner. Teachers were encouraged to think about what matters when selecting criteria for assessment.

A few doors down the hall, participants attended two demonstration sessions about assessment in the digital environment. Annette Peet (SURF) gave a demonstration of the Secure Assessment Workbook, released by SURFnet in April. Higher education institutions can use this workbook to make their assessment processes secure, and it applies to both digital and paper assessments. She emphasized the importance of involving people in securing the assessment process and creating a safe testing environment. Teachers had a lively discussion about who is responsible for secure assessment, concluding that even though everyone involved in the assessment process has a role, the dean of education bears the overall responsibility. Paul Adriaans (LAW) continued the session with a demonstration of Remindo, a digital assessment tool that will be tested at the Faculty of Law among 25 students.

Harvest

Assessment expert Joost Dijkstra (MUO/AA) closed the seminar with a harvest of the Assessment Seminar. He summarized the lessons learned in a brief overview:

  • Quality assurance alone will not make students behave in the right way; it is about finding the right approach to learning, one in which education provides a lot of rich information;
  • Assessment requires feedback in the form of a dialogue between a teacher and a student;
  • Instead of focusing on the assessment tool, we should focus on the user of the tool and how the tool is being used;
  • As teachers, we should avoid reductionism and make information rich again;
  • Coherence between learning outcomes and assessment requires teamwork and a shared motivation;
  • When organizing assessment, we must pay close attention to the personal student journey. Assessment should be about the individual. There is no one-size-fits-all solution.

