Embedding Inclusive Assessment: lessons from large-scale assessment change

In this blog post, we introduce and discuss a recently completed QAA-funded Collaborative Enhancement Project that aimed to explore and understand the relationship(s) between assessment outcomes and inclusive assessment designs for different groups of students during the pandemic-affected academic years 2019-20 and 2020-21. Despite changes in assessment practices attracting significant scrutiny and evaluation throughout the pandemic, few large-scale empirical studies of this kind have been conducted and shared with the sector. The project brought together eight institutions from across the University Alliance mission group and comprised a three-phase approach:

  • an analysis of assessment outcomes for specific cohorts across each partner institution, capturing the range of design/policy changes alongside those courses/programmes displaying the largest percentage reduction in attainment/awarding gaps (for 2019-20) and improved student continuation rates (for 2020-21);
  • interviews with academic staff and focus groups with students from the courses identified by each partner, with the focus groups facilitated by a cadre of student researchers employed by each institution to garner student feedback on the inclusivity of assessment arrangements;
  • thematic analysis of the staff interview and student focus group data to capture key themes and sub-themes at a course/programme level.

This collaborative project work culminated in the production of a series of outputs developed as practical resources, with the aim of supporting leaders, academics, and students in higher education to review, plan for, and evaluate enhancement-led inclusive assessment policies, initiatives, and interventions. Each resource is framed by an overarching position statement we developed for the project, which offers the lens through which we now invite universities and practitioners to critically consider their own assessment policies and practices. We believe inclusive assessment:

‘… is realised through holistic and flexible approaches that recognise, value and reflect student diversity, facilitating choice and enabling every individual to demonstrate their achievement with respect to academic/professional standards and empowering them to take ownership of their learning journey. To achieve this, assessment needs to be strategically designed as an embedded element of the curriculum to proactively consider students’ needs and to remove systemic barriers in institutional policies, processes, and practices.’

A set of inclusive assessment attributes was collectively developed to reflect the insights generated through the research work undertaken. These attributes formed the basis for an associated toolkit and suite of case studies as a way of illustrating the types of approaches that were deployed, alongside their impact on student learning and performance. Together these resources provide a framework to assist universities and practitioners in reflecting upon their current institutional policies and practices.

The project has produced a series of practical, evidence-based insights into the impact of alternative assessment arrangements on student outcomes, highlighting areas of good practice and creative implementation. Project findings and outputs are illustrative of how clear, positive outcomes can develop from adversity, and how agile thinking and responses to change enabled institutions to put creative solutions and inclusive practices in place within a short period of time, with the cumulative effect of positively impacting student outcomes.


Sam Elkington is Professor of Learning and Teaching at Teesside University, a National Teaching Fellow and Principal Fellow of the HEA
Twitter: @sd_elkington

Finding Your Purpose in HE

Like many of us, I came to higher education because I believed that it was a profession that could allow me to foster community and be part of social change. But I have often struggled with the feeling that both my work and institutions fell short of the goal of making the world a better place through research and teaching.

“Finding Your Purpose” is a workbook that I developed for justice-oriented scholars who are struggling to align their work with their values. Justice-oriented scholars can be instructors who try to counter misinformation or bigotry in their classrooms. They can be librarians who work to create networks of care. They can be students who lift one another up, or researchers whose writing challenges systems of violence and oppression.

As Anne Helen Petersen has described really clearly, I think a lot of us in and beyond HE feel disillusioned with work, especially as we enter the third year of the coronavirus pandemic. The goal of “Finding Your Purpose” is to help us navigate that uncomfortable space so we can feel like our lives, our work, and our values are more closely aligned.

“Finding Your Purpose” is influenced by the writing of adrienne maree brown, Mariame Kaba, and others who talk about hope and change as a daily practice. For me, the idea of a justice-oriented scholar is aspirational. It is for everyone whose work turns towards justice.

How can we find our purpose?

The “Finding Your Purpose” workbook offers one approach to finding purpose at work by thinking about the people who inspire you, the communities you belong to, the values that guide you, and the work that gives you pleasure.

“Finding Your Purpose” is a workbook for individuals, but I’ve found it can be really powerful to do the work with others. This project has driven home to me just how much we need a sense of collective purpose as we face the many overlapping crises facing our profession and our communities, from climate change to racism and transphobia.

What if we find our motivations are at odds with the professional context we find ourselves in? 

I don’t know a lot of people who feel like their motivations are perfectly matched with their professional context. So we all navigate that in different ways: by adjusting how we do our work; by creating change in our workplace; or by seeking new professional opportunities elsewhere. “Working your wage,” as people like @saraisthreads call it, is one way to do this. So is organizing with a union. So is applying for new jobs or even exploring a new career. 

One of my favourite outcomes from this project is hearing people say that it helped them set better boundaries at work, so that they were able to focus more closely on the aspects of their jobs and lives that are meaningful or satisfying or joyful.

How can educational developers help those we work with find their purpose?

When I worked with the Visionary Futures Collective, a pandemic advocacy group, we followed a three-part model of social change: transparency, vulnerability, and collective action. 

Transparency means making the reality of our working lives more visible. Surfacing the fact that many of us struggle to find purpose at work, even in a values-driven profession like higher education, is really powerful. I wish I had understood that especially early in my career. 

Vulnerability is about creating a space where people feel safe working through difficult feelings and experiences. This requires building reciprocal and trusting relationships. The Collective Responsibility Labor Advocacy Toolkit has really useful resources for thinking about how we prepare to talk honestly with one another, especially under conditions of unequal power. I think this is an essential precondition for talking about what purpose-oriented work can look like.

Finally, collective action is what happens when we get together and try to create change. We all have to figure out how to navigate these broken systems for ourselves. But I hope that in doing so, we can find a way to ease the harm these systems cause and create more space for justice-oriented scholarship to flourish.


Hannah Alpert-Abrams is a writer, speaker, and justice-oriented scholar who has published broadly on technology, labour, education, and the humanities. She is a founding member of the Visionary Futures Collective, the Academic Job Market Support Network, and the Postdoctoral Laborers. Her newest project, “Finding Your Purpose: a Higher Calling workbook for justice-oriented scholars in an unjust world,” is freely available online. Find her on Twitter @hralperta

From Knowledge Acquisition to Knowledge Management: A Developmental Framework for Staff and Students

Disciplinary knowledge and understanding are at the heart of HE. The sense of belonging that a shared enthusiasm for the subject brings to a learning community underpins student success (Davis, 2019), builds graduate identity (Jensen, 2019) and, in our experience, can transcend difference amongst a diverse learning community.

The current pace of knowledge creation is almost overwhelming and will increase as the Fourth Industrial Age progresses (United Nations Educational, Scientific and Cultural Organization (UNESCO), 2015; World Economic Forum (WEF), 2018). To ensure students can keep their disciplinary knowledge current beyond graduation (Coonan and Pratt-Adams, 2019) we must develop their critical, independent thinking alongside capabilities to ethically source, select, and safeguard information. These are skills the WEF recognises as crucial to the future workforce (WEF, 2018), and UNESCO deems integral to education for the common good (UNESCO, 2015). They are also inclusive of students from diverse backgrounds and with diverse characteristics.

As such, the University of Hull’s competence-based higher education stresses the importance of knowledge management, rather than acquisition (Lawrence, 2020), which we define as the ability to source, understand and communicate knowledge. It consists of theoretical and practical skills for effective academic practice and success in the workplace – a rudimentary LinkedIn job search demonstrates that knowledge management is creeping from essential skills into actual job titles.

To support staff and students to manage this pace of change and transition to managing rather than acquiring knowledge, we developed a Knowledge Management Framework (KMF) in consultation with students, recent graduates, Hull University Students’ Union, and colleagues involved in teaching and supporting learning. Recent graduates indicated they often lacked clear expectations or purpose in relation to knowledge management, leaving them confused and feeling disadvantaged. Staff were keen for a steer on supporting their students and their own academic practice.

In response, we presented the Knowledge Management Framework as an open access online resource within the Library’s suite of Skills Guides. It:

  • Guides academics in embedding knowledge management in curricula to ensure clarity and move away from assumptions about students’ previous experiences, opportunities or characteristics;
  • Is sufficiently flexible to support different disciplinary needs;
  • Explicitly explains what students might expect in their course of study.

The Elements of Knowledge Management page suggests how knowledge management can be written into curricula and includes links to further guidance. We suggest:

  • Identify and critically assess appropriate sources; 
  • Understand, question and clearly communicate knowledge to a diverse audience; 
  • Practise effective, ethical information management; 
  • Communicate with a diverse audience in person and through written, digital & media technologies professionally and confidently. 

The Knowledge Management Framework page offers ten considerations for designing courses, and ten for study or research. Presenting these side-by-side makes explicit their interrelationship, fosters understanding of the purpose of knowledge management and the Framework, and encourages reflection.

Data show that the Framework is being used daily, and user feedback suggests its utility for both staff and students in building their capability in Knowledge Management is transformational. Computer Science has embedded the Framework into level 4. Dr Neil Gordon, Senior Lecturer, said: “As we have moved to a competency-based assessment, a key aspect of our degree and the module is to develop the students’ ability ‘to conduct guided investigation across’ various topics. The Framework helps to provide a way to consider that, with the focus on how to identify and use different information sources. This is critical for students as they embark on their study from level 4 (typically year 1) with us.”

We hope the Framework is useful to others across the sector working with their learning communities in making the move from building capabilities in knowledge acquisition to management.


Maggie Sarjantson, FHEA, is Collections Development Manager at the University of Hull Library. Her interests include how libraries can provide equitable access to resources, and how they are used in learning and teaching.  Maggie led the development of the Knowledge Management Framework and the relevant supporting materials. M.Sarjantson@hull.ac.uk

Dr Jenny Lawrence, AFSEDA, PFHEA, NTF is Director of the Oxford Brookes Centre for Academic Enhancement and Development. She worked with Maggie in developing the Framework, lending expertise in competence-based HE, inclusive pedagogic practice and academic development. @jennywahwah

References

Coonan, E. and Pratt-Adams, S. (2019) Building Higher Education Curricula Fit for The Future: How Higher Education institutions are responding to the Industrial Strategy. York: Advance HE.

Davis, S. N., & Wagner, S. E. (2019). Research Motivations and Undergraduate Researchers’ Disciplinary Identity. SAGE Open.

Jensen, T.L. (2019) ‘Back to Bildung’: A Holistic Competence-Based Approach to Student Engagement in Innovation Learning Process in HE. In Lund, B. and Arndt, S. (Eds) (2019) The Creative University: Contemporary Responses to the Changing Role of the University. Leiden: Brill Sense.

Lawrence, J. (2020) Assessing competencies could equip graduates for an uncertain post-Covid future. Wonkhe blog.

World Economic Forum (2018) The Future of Jobs Report 2018.

A triple filter test for educational development activities

The classic triple filter test attributed to Socrates asks us to speak about a person only if what we’re going to say is true, kind and necessary. It’s a great way to limit salacious comment and gossip. In this blog I consider what a triple filter test might look like for educational developers to apply before embarking on projects and taking forward requests.

Why might educational developers need a triple filter test?

Well, first, there’s an awful lot we could get involved with. Over the last ten years educational development work has taken on more strategic prominence and become more aligned to cross-institutional projects that are focused directly on student success. This is in addition to our more established foci of supporting individual teaching staff and teams with their own professional development, and the enhancement of courses and programmes. This expansion of the sphere of influence and action of educational developers has been particularly pronounced in England, where ‘new’ work is aligned to OfS priorities like access and participation and student outcomes.

Second, educational development teams are periodically subject to the critical gaze of senior managers. When senior managers with the portfolio responsibility for education and/or student experience change, the new incumbent often looks closely at the educational development centre and seeks to find ways to ensure it can best deliver on their ambitions. Cleaver and Cracknell (2022), Jones and Wisker (2012) and Gosling (2008) all attest to the precarious nature of educational development centres.

Third, educational development centres are usually quite small units with staffing costs that are relatively high per FTE compared to other university services. This in part reflects the need for peer esteem from the academic community (consider the number of institutions that provide educational developers with academic contracts). It also reflects the complex expertise of educational developers, including foundational characteristics, diverse skills, abilities, competencies and knowledge (see, for example, McDonald et al., 2016), that overlaps extensively with academic work. These small, expert teams need to make sure they invest their time in the right ways.

What would the educational developer’s triple filter test be?

I propose that the first question would be, ‘will this activity demonstrably improve teaching and/or student learning?’ It’s a broad first test of the proposed action. The inclusion of the word ‘demonstrably’ suggests we need to have some reasonable evidence that the action will work (research elsewhere, an action that will be research-informed) or that we will monitor our action (we will adopt a scholarly or research-based approach to the activity). The test separates teaching and student learning. Is that really necessary? I think so, for this first test, which is intended to filter out the oddball requests and possibilities. The form of words also allows us to consider developing teaching as an activity in its own right, not always linked, by evidence, to improving student learning.

For the second question I propose, ‘does the activity align to the mission and purposes of the educational development centre?’ Obvious, perhaps, but it speaks to how crucial it is to have a centre mission that describes the ultimate goal of all our educational development actions. The mission can also articulate the ways that goal will be achieved and the values of the centre. Centre missions can and do change, but they should be able to withstand changes to senior leaders’ priorities and the development of university strategies to which the educational development centre contributes.

Finally, for the third question I propose, ‘will the educational development centre monitor the effectiveness and impact of the activity and report on that to stakeholders?’ This is the final test to establish whether the work has value and strategic importance within the institution, and it establishes the accountability of the educational development centre for the work. It verifies that the work will be visible within the institution when it is reported forward to someone – a client, sponsor or other audience for the work. By its explicit description of what is to be monitored (effectiveness and impact, in relation to teaching and/or student learning) the question states the ways the importance of the activity will be evaluated.

So, in summary, the proposed triple filter test for educational developers to use to determine whether they should invest their time and expertise into an institutional activity is:

  • Will this activity demonstrably improve teaching and/ or student learning?
  • Does the activity align to the mission and purposes of the educational development centre?
  • Will the educational development centre monitor the effectiveness and impact of the activity and report on that to stakeholders?

It would be great to hear what you use already to determine work priorities and what you think of this proposal. The test is intended to allow educational developers to make a priori decisions about the activities they spend time on and to ensure the greatest positive impact of educational development work. Do please share your thoughts by replying to this blog or contacting me directly.


Jackie Potter is Dean of Academic Innovation at the University of Chester and Professor of Higher Education Learning and Development. She is the current Chair of the Heads of Educational Development Group (HEDG) and a member of the Staff and Educational Development Association. @Jac_Potter @uochester @HEDG_UK @SEDA_UK_
She can be contacted at jackie.potter@chester.ac.uk

References

Cleaver, L. & Cracknell, L. (2022). Highs and lows, ebbs and flows: buckle up for the educational development rollercoaster ride. SEDA Blog, 5 May 2022.

Gosling, D. (2008). Educational Development in the UK. Report for the Heads of Educational Development Group. HEDG.

Jones, J. & Wisker, G. (2012). Educational Developments in the UK. Report for the Heads of Educational Development Group. HEDG.

McDonald, J., Kenny, N., Kustra, E., Dawson, D., Iqbal, I., Borin, P., & Chan, J. (2016). Educational Development Guide Series: No. 1. The Educational Developer’s Portfolio. Ottawa, Canada: Educational Developers Caucus.

Can we use multiple-choice questions for assessment in any subject?

Can multiple-choice questions be used effectively for assessment in any academic subject? Having worked mainly in arts and humanities, I admit I’d never seriously considered this. But what with one thing and another over the last couple of years (!) I found myself grappling with this exact question about … errr… questions.

Recent moves by universities towards more blended and online learning contexts have necessitated more consideration of online assessment and the trickle-down consequences this might entail. For the managerially oriented, this offers the intoxicating waft of increased efficiency and even automated marking in the virtual air. You can almost sense the technology vendors circling, pitching Martin Weller’s Silicon Valley narrative on how education is broken and only private sector solutions can fix it. Check out Weller’s ed tech pitch generator to get inside knowledge on the next big ed tech tornado coming our way!

So it was with some chastened surprise that I learned about the nuance, challenge, and even validity (insert shocked emoji if required) that MCQs can offer, according to a range of very credible advocates. I started with Phil Race’s Lecturer’s Toolkit, which lists a host of advantages, many perhaps unsurprising:

  • Reliability
  • Time efficiency
  • Availability of multiple-choice and multiple response options

Other strengths of MCQs were, to me, less obvious. As mentioned above, Race contends that good question items can be meaningful and valid – they test what you want them to – whilst also covering a greater extent of the syllabus (Race, 2015, p. 61). Having close correspondence to much real-world decision-making was a further benefit listed. If well-constructed MCQs can be combined with other forms of assessment, it seems even greater validity, syllabus coverage and efficiency might be possible.

At this point, a request for advice on the SEDA mailing list produced the usual very generous and well-informed response. Colleagues in the fields of science, medicine, agriculture and educational development (among others*) have been producing great work in this area for some time. Of course, the community produced the same answer to my question as to virtually any in HE: ‘it depends … [how deep you want to go?]’.

Dr David Smith at Sheffield Hallam devises unGoogleable exam questions and gives tips and resources for teasing out those higher-order thinking skills, including “the ability to apply, analyse and evaluate information”. Rebecca McCarter and Dr Janette Myers note that single best answer questions (select the ‘most correct’ option) can test application of concepts over simple recall, whilst Peter Grossman uses degree of difficulty estimation (“think scoring of Olympic diving”). His idea of setting in-class tasks for students to construct their own questions is also valuable for a range of educational reasons. Colleagues Linda Sullivan (MTU Cork) and Ruth Brown (SEDA consultant) added further nuance by highlighting how we can ask students to rank or confidence-weight their responses. They also link to psychological research by Butler (2018) outlining five best practices in MCQ construction. It might, however, be challenging to balance Butler’s overall recommendation for simple item formats with the more nuanced demands of confidence weightings, rankings and explanations of answers discussed above.
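To make the confidence-weighting idea a little more concrete, here is a minimal sketch of one possible marking scheme. The confidence labels and mark values below are my own illustrative assumptions, not a scheme recommended by the colleagues above: students tag each answer with a confidence level, and confidently wrong answers cost more than tentative ones.

```python
# A minimal sketch of confidence-weighted MCQ scoring.
# The confidence labels and mark values are illustrative assumptions only,
# not a published or recommended scheme.
CONFIDENCE_MARKS = {
    "low":    {"correct": 1, "incorrect": 0},
    "medium": {"correct": 2, "incorrect": -1},
    "high":   {"correct": 3, "incorrect": -2},
}

def score_responses(responses):
    """responses: a list of (is_correct, confidence) pairs, one per item."""
    total = 0
    for is_correct, confidence in responses:
        outcome = "correct" if is_correct else "incorrect"
        total += CONFIDENCE_MARKS[confidence][outcome]
    return total

# Confidently right is rewarded; confidently wrong is penalised: 3 - 2 + 1 = 2
print(score_responses([(True, "high"), (False, "high"), (True, "low")]))
```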

This tension between simplicity and effectiveness gets to the heart of the issue of writing MCQs, and to the conclusion of my brief research into this area. Doing this well will take substantial time, expertise and input from a range of stakeholders. The usual suspects will be required: lecturers, educational developers, learning technologists, student voice, quality and standards, and more I’m sure. Good items take time to produce; banks of items much longer. Quality control and piloting are essential – I was amazed how many ways there are to get this wrong, in ways which aren’t obvious during question construction. Colleagues from Harper Adams highlighted the analysis techniques needed to assess both the difficulty of items and the extent to which each one discriminates student level, correlating student scores for each question with the overall assessment mark. Fascinating, but tricky and time-consuming. So, MCQs – better than I thought as a tool, even more challenging to make than I realised. Reassuringly, Butler (2018, p. 323) notes that aside from simply measuring things, MCQs can “cause learning”. Phew. I’ll try that line next time someone asks what I do: “I cause learning”. I wonder if the OfS will accept that as evidence when they next come knocking?
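For readers curious about what that item analysis involves in practice, the sketch below is my own minimal illustration (invented data, not the Harper Adams colleagues’ actual method). It computes two classical statistics from a matrix of marked responses: item difficulty (facility) as the proportion of students answering each item correctly, and item discrimination as the correlation between scores on an item and students’ totals on the rest of the test.

```python
# A minimal sketch of classical MCQ item analysis: difficulty and discrimination.
# The response data are invented for illustration.
import numpy as np

# 5 students x 4 items; 1 = correct, 0 = incorrect
responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
])

# Difficulty (facility): proportion of students answering each item correctly.
difficulty = responses.mean(axis=0)

# Discrimination: correlate each item with the total score on the *other* items,
# so an item does not inflate its own correlation.
discrimination = []
for i in range(responses.shape[1]):
    rest_total = responses.sum(axis=1) - responses[:, i]
    discrimination.append(np.corrcoef(responses[:, i], rest_total)[0, 1])

for i, (p, r) in enumerate(zip(difficulty, discrimination), start=1):
    print(f"Item {i}: facility = {p:.2f}, discrimination = {r:.2f}")
```

Items with very high or very low facility, or with low or negative discrimination, would typically be flagged for review or rewriting.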


Steve White has been lurking in teaching, learning and research-related third spaces in HE for about 20 years. Most relevant for this article, he’s dabbled in online materials and test item writing for Oxford University Press. He worked in intriguingly ill-defined roles while developing online MA courses and MOOCs for the University of Southampton, leading him to complete PhD research on the third space in HE. More recent roles have straddled Learning Development and Educational Development at Arts University Bournemouth and Southampton.

*Many thanks to contributors to the discussion on MCQs: Ruth Brown, Dr David Smith, Peter Grossman, Dr Janette Myers, Clara Davies, and colleagues at Harper Adams. My apologies if I’ve missed anyone out – I’ve recently changed employer so lost access to some emails.

References

Here’s a list of resources and references I received from the SEDA community, including a number of items suggested by Clara Davies:

Burton, R. F. (2005) Multiple-choice and true/false tests: myths and misapprehensions. Assessment & Evaluation in Higher Education, 30(1), 65-72.

Butler, A. C. (2018). Multiple-Choice Testing in Education: Are the Best Practices for Assessment Also Good for Learning? Journal of Applied Research in Memory and Cognition, 7(3), 323–331.

Case, S.M. & Swanson, D.B. (2002) Constructing Written Test Questions for the Basic and Clinical Sciences.

Gronlund, N.E. (1991) How to Construct Achievement Tests (4th ed.). Allyn and Bacon.

Race, P. (2020). The Lecturer’s Toolkit (5th ed.). Routledge. (Ch. 2)

Race, P. (n.d.) Designing Multiple-Choice Questions. Phil Race: Assessment, learning and teaching in higher education.

‘Vision? What vision?’ An un-occluded view of the importance of vision in educational development

Vision is defined in the Oxford Dictionary as ‘the ability to see’. Yet the vision statements we see adorning the covers of university manifestos are typically less about what can be seen and more about what is hoped to be seen in the future. Such ‘visions of hope’ can sometimes appear distant or unrelated to our day-to-day work as educational developers. They may also become occluded from view due to the day-to-day complexities of working in educational development (EdDev) functions. So, just how important is vision for an EdDev function, and what are the steps a leader should consider when developing a vision to guide function practice?

Quinlan (2020, p. 79) states that no matter where leaders may be found in an institution, ‘having a passion, a purpose, or a vision is foundational’. Developing and actioning a vision, however, is not nearly as simple as putting pen to paper then holding colleagues to account. For a leader of learning, developing and actioning a vision requires patience and consensus; a lack thereof can have a significant impact on attainment of function cohesiveness, internal utility, and ultimately institutional reputation (Light, 2020). Important also is the sharing of vision as a means to drive purpose and provide meaning for all it serves. Thus, as Wilson (n.d.) writes, not only is it important to have a vision, it is equally important to have it articulated (through conversation), displayed (through content), and embodied (through leadership).

Vision is only as strong as those empowered to enact it. Thus, a core requirement of vision development is the involvement of EdDev function colleagues. It is also important that a vision statement represents more than just goal achievement. It is about the creation of culture and the rewarding of behaviours to uphold and drive said culture. As Cook (2020, p. 223) reflects, ‘throughout my working life the nature of the culture and the journey have come to assume far more importance to me as a leader than just striving for outcomes and goals’. So, what steps should be considered when developing and actioning a vision for your EdDev function?

  • Step 1: Consider vision as a ‘what we want’ statement and engage each function member to help develop it.
  • Step 2: To support a culture of behaviours geared towards pursuit of an EdDev function’s vision, an appreciation of what the function is responsible for is required. Thus, consider next the co-development of a mission statement or a ‘what we do’ statement (Wilson, n.d.).
  • Step 3: Finally, consideration should be given to how the vision and mission of the function will be actioned. One or more action statements, or ‘how we do it’ statements, are required.

Steps in Action

As a new member of staff within the EdDev function at the University of Greenwich, I am responsible for leading a new team of university learning technologists. The steps outlined above were used in May this year to develop a sense of direction and identity for the team. The following statements were co-developed by the team at an away day and have since helped us plan our day-to-day practice as well as our most effective ways of working:

Vision statement: To make teaching with technology stimulating and rewarding for all

Mission statement: To develop competent and confident users of digital pedagogies

Action statement: Lead on innovative digital pedagogy projects to explore, evaluate and recommend appropriate learning and teaching technologies to enhance student success


Kendall Jarrett is an Associate Professor of Learning and Teaching (HE) at University of Greenwich where he has strategic responsibility for curriculum design and digital pedagogy.

References

Cook, C. (2020). Leaving leadership. In K. Jarrett & S. Newton (eds). The practice of leadership in higher education: Real-world perspectives on becoming, being, and leaving. Abingdon: Routledge. Chapter 15a (pp. 222-227).

Light, R. with Razak, Md. S. (2020). The influence of experience and culture on leadership. In K. Jarrett & S. Newton (eds). The practice of leadership in higher education: Real-world perspectives on becoming, being, and leaving. Abingdon: Routledge. Chapter 4 (pp. 61-70).

Quinlan, K. (2020). Leading for learning: Building on values and teaching expertise to effect change. In K. Jarrett & S. Newton (eds). The practice of leadership in higher education: Real-world perspectives on becoming, being, and leaving. Abingdon: Routledge. Chapter 5 (pp. 71-86).

Wilson, L. (n.d.). Crafting a vision: Who we are, what we do, where we’re going.

Further reading on the importance of vision in HE leadership can be found in the book The practice of leadership in higher education: Real-world perspectives on becoming, being, and leaving available from Routledge.

Towards Expertise: Fostering a Culture of Professional Learning Embedded in the Everyday

In a 2019 SEDA blogpost, I outlined a model of expertise for teaching in higher education which arose from engagement with the extensive literature on expertise, discussions with colleagues, and SEDA-funded research with nine National Teaching Fellows. This model has been refined and further explored through an international symposium and subsequent Routledge / SEDA series book (King, 2022). To summarise: I categorised the generic characteristics of expertise into three overlapping features and translated these into the context of teaching in higher education:

Figure: three inter-related elements of expertise – pedagogical content knowledge, artistry of teaching, and professional learning

Two of the blogposts in this series, by Rich Bale and Erika Corradini, explored facets of the Artistry of Teaching and Pedagogical Content Knowledge respectively; Deanne Gannaway’s post considered a whole-university approach to developing expertise; and, in this post, I will look at Professional Learning.

Professional learning, development or practice is essential for the development and maintenance of expertise (e.g. Ericsson et al., 1993). There is no shared definition for professional development in higher education but the literature commonly refers to processes and activities “that, through strengthening and extending the knowledge, skills and conceptions of academics, lead to an improvement in their teaching and consequently to an enhanced learning experience for students” (Inamorato et al., 2019, p. 4). These might include attending workshops or conferences, reading relevant literature, conversations with colleagues, peer observations and so on (e.g. Ferman, 2002; King, 2004). As well as enhancing student learning, engaging in professional development can also have a positive impact on career progression. But, despite evidence for the benefits, the engagement of teachers with these activities is often variable and unsystematic. Barriers to engagement include resistance to change, lack of formal requirements or incentives, and lack or perceived lack of time (Inamorato et al., 2019; King, 2019).

The term ‘professional learning’ has been advocated as being more appropriate for the higher education context than ‘professional development’ (e.g. van Schalkwyk et al., 2015; Trowler & Knight, 1999) as models focus on the intrinsic actions and goals of practitioners themselves. This intrinsic motivation is more likely to engender expertise development (Ericsson et al., 1993) and engagement with good teaching practices (Stupnisky et al., 2018). As the expert practitioner progresses in their career, their professional learning activities become more autonomous and self-determined (e.g. Schön, 1982; Dreyfus & Dreyfus, 1982; Eraut, 1994), and it is this progressive problem solving (Bereiter & Scardamalia, 1993) or proactive competence (Perkins, 2008) that distinguishes the expert from the experienced non-expert. This is the approach to expertise development that I saw in my research with the National Teaching Fellows and that led me to suggest a way of conceptualising professional learning in higher education as a “self-determined and purposeful process of evolution of teaching and research practices, informed by evidence gathered from a range of activities” (King, 2019). It’s a definition I use in workshops with early career and experienced lecturers to help them consider and plan for their own development.

But, “excellence in higher education is commonly assessed through outputs, in this case measures such as student satisfaction or graduate outcomes…This effectively ignores [the] critical feature that distinguishes those with expertise from those with experience: a commitment to professional learning. If higher education institutions are to achieve their missions of excellence in education, then they must also foster and enable a culture of professional learning for teaching that is integrated into everyday practice rather than being seen as an add-on. Without this active institutional-level commitment, expertise in teaching will only ever be a subculture of the few.” (King, 2022, p. 10)

The importance of learning and development has been promoted and supported through SEDA’s work over the last 29 years and the work of educational developers across the UK and internationally. It’s heartening to see it recognised in policy through the Office for Students in England in the recent TEF Guidance, which suggests two examples of evidence for the quality of the student experience might be:

“e. Evidence about how the professional development of staff enhances academic practice.

f. Staff feedback or other evidence about how recognition and reward schemes are effective in delivering excellent teaching” (OfS, 2022, p. 32)

Research and anecdotal evidence suggest that the lived experience of those who teach and/or support learning in many institutions does not include an embedded conception of professional learning, and that workload models do not always support this essential component of expertise development and maintenance. Will the new TEF guidance help to shift institutional cultures, at least in England? What else might change the current thinking about professional development, and what examples already exist of a successful culture of embedded professional learning in higher education? Examples on a postcard (or in the comments below)…


Helen King is Professor of Academic Practice at the University of the West of England and Co-Chair of SEDA. She holds an NTF, SFSEDA and PFHEA.
E: helen5.king@uwe.ac.uk
T: @drhelenking

References

Bereiter, C. & M. Scardamalia (1993) Surpassing ourselves: an enquiry into the nature and implications of expertise. Open Court, Illinois

Dreyfus, H. & S. Dreyfus (1982) Mind over machine, Free Press, New York

Eraut, M. (1994) Developing professional knowledge and competence, The Falmer Press, Basingstoke

Ericsson, K.A., Krampe, R.T. & Tesch-Römer, C. (1993) The role of deliberate practice in the acquisition of expert performance. Psychological Review, 100(3), 363-406

Ferman, T. (2002) Academic professional development practice: What lecturers find valuable, International Journal for Academic Development, 7(2), 146-158

Inamorato, A., Gaušas, S., Mackevičiūtė, R., Jotautytė, A. & Martinaitis, Z. (2019) Innovating Professional Development in Higher Education: an analysis of practices. EUR 29676 EN, Publications Office of the European Union, Luxembourg

King, H. (Ed: 2022) Developing Expertise for Teaching in Higher Education: practical ideas for professional learning and development. Routledge / SEDA

King, H. (2019) Continuing Professional Development: what do award-winning academics do? Educational Developments, 20(2), 1-5

King, H. (2004) Continuing Professional Development in Higher Education: what do academics do? Educational Developments, 5(4), 1-5

Office for Students (OfS) (2022) Regulatory advice 22: Guidance on the Teaching Excellence Framework (TEF) 2023.

Perkins, D. (2008) Beyond Understanding. In: R. Land, J.H.F.Meyer & J.Smith (Eds.) Threshold Concepts within the Disciplines. Sense Publishers, Rotterdam

Schön, D. (1982) The Reflective Practitioner: how professionals think in action. Routledge, Abingdon

Stupnisky, R.H., BrckaLorenz, A., Yuhasb, B. & Guay, F. (2018) Faculty members’ motivation for teaching and best practices: Testing a model based on self-determination theory across institution types. Contemporary Educational Psychology 53, 15–26

Trowler, P., & Knight, P. (1999) Organizational socialization and induction in universities: Reconceptualizing theory and practice. Higher Education, 37, 177–195

van Schalkwyk, S., Herman, N., Leibowitz, B., & Farmer, J. (2015). Reflections on professional learning: Choices, context and culture. Studies in Educational Evaluation, 46, 4-10

Developing pedagogical content knowledge through the integration of education research and practice in higher education

The third blog in our four-part series of blogposts drawing on chapters from the Routledge SEDA Series book “Developing Expertise for Teaching in Higher Education: Practical Ideas for Professional Learning and Development”.

The interplay between theory and practice underpins the development of expertise. In the field of education, as in other professions, expertise is the ability to continue to develop competence as opposed to plateauing on established routines. Is it possible to develop expertise in education without applying scholarly practices to our teaching? A study exploring lecturers’ attitudes to adopting scholarly approaches to teaching in higher education (Corradini, 2022) has shed (some) light on the impact of this approach on the development of teaching practices vis-à-vis a meaningful student experience. Arguably, the adoption of scholarly practices in teaching can influence both students and teachers: for example, in driving forward curriculum enhancement and innovation, in engaging educators in the continuous development of their practice, and in creating and sharing expertise with positive outcomes for the student learning experience. Specifically, the benefits for academics emerging from the above-mentioned study are: an ability to transfer research processes to their teaching, an ability to use data to improve student outcomes, and an awareness of the effects a scholarly approach to teaching can have on the development of learning experiences.

The identification of these areas of development provides grounds for exploring new approaches to the support educators receive in their teaching jobs and for reflecting on the importance of building intra- and inter-institutional networks of support. The creation of support structures can influence the teaching behaviours and values of individual lecturers (Healey et al., 2019, pp. 32-34), leading to increased uptake of practices which would encourage academic staff to measure and monitor the quality of their teaching accurately, responsively and responsibly.

Engaging lecturers with education research regularly proves to be a demanding task, however. This is especially true for early career academics, who find themselves under the pressures of disciplinary research and are often unsupported in their long-term development beyond attending teacher development programmes. Finding the time and space to develop teaching practices is often overlooked or simply a low priority. While the data sets analysed broadly indicate that a positive attitude and a sense of confidence derive from questioning and making sense of teaching practice (Webster-Wright, 2017), they also reveal a reluctance to take risks in areas such as evaluation of teaching, which are not part of the disciplinary identity of most academics.

How can faculty and education developers cultivate the ability and sustain the capacity to integrate evaluation into teaching and learning design, and subsequently into practice? Placing particular emphasis on the teacher/educator in context has revealed areas that educators find difficult to navigate: time pressures, limited support from mentors and senior management, and difficult access to networks of support. Furthermore, HEIs, especially research-intensive universities, have a research remit. In these institutions, academics research in their own discipline; how can they be supported to transfer the same curiosity to their teaching? Similarly, there is an expectation that academics teach students in research-rich environments to encourage the development of research skills and inquisitiveness (Kreber, 2002); educators should do the same by modelling a scholarly approach to teaching, which they often have an opportunity to develop in academic development programmes such as postgraduate certificates, whose reach is, however, limited.

Some of the areas key to further supporting HE teachers are: the development and integration of evaluation methods into teaching practice, coordinated support in obtaining ethics approval, pedagogical content knowledge and transfer, the formation of institutional support/engagement networks, and the integration of sustained evaluation practices into curriculum design.

If acquiring pedagogical content knowledge is important for the development of expertise for teachers in higher education, then scholarly educators will need to interface with the above dimensions in order to engage with and sustain research-integrated, evidence-based practices in their teaching routines (Chi, Glaser & Farr, 2009). An ability to navigate institutional dynamics and access institutional support networks seems, therefore, worth reflecting on within the community of practitioners. Creating support networks, protecting spaces for experimenting with new methods, and encouraging academics to think outside their comfort zones and practise outside established routines would support a culture in which knowledge and expertise are not only developed but also sustained along career trajectories. Educators need to be supported to acquire pedagogical content knowledge and to integrate SoTL into their practice long term, as, owing to time and other priorities, this cannot happen naturally. When spaces are created for doing so and time is protected, there are numerous benefits to the quality of teaching and of student learning.


Erika Corradini is Principal Teaching Fellow in Higher Education at the Centre for Higher Education Practice, University of Southampton. Her activity is centred on supporting academic colleagues in developing their teaching and the academic profession. Erika is active on Twitter @eriCorradini

References

Corradini, E. (2022) Developing Pedagogical Content Knowledge through the integration of education research and practice in higher education. In King, H. (ed.) Developing Expertise for Teaching in Higher Education: Practical Ideas for Professional Learning and Development. Routledge, pp. 142-154.

Chi, M., Glaser, R. & Farr, M. (eds) (2009) The Nature of Expertise. Lawrence Erlbaum Associates, New York.

Healey, M., Matthews, K.E. & Cook-Sather, A. (2019) Writing scholarship of teaching and learning articles for peer-reviewed journals. Teaching & Learning Inquiry, 7(2), 28–50.

Kreber, C. (2002) Teaching excellence, teaching expertise, and the scholarship of teaching. Innovative Higher Education, 27 (1), 5–23

Just make it up as you go along? Improvisation and adaptive expertise for teachers

The second blog in our four-part series of blogposts drawing on chapters from the Routledge SEDA Series book “Developing Expertise for Teaching in Higher Education: Practical Ideas for Professional Learning and Development”.

Is teaching an art or a science? Do teachers engage in routine, structured activities, or do they adapt creatively to what unfolds during learning and teaching encounters? Do teachers facilitate, instruct, or perform? The answers to these questions are complex and almost certainly do not lie at one end of a binary. While I would not conceptualise teaching as a performance, as an act, I would argue that teaching has elements of artistry, creativity, and adaptability – even performance. An interesting artistic, creative, and adaptive area of performance is improvisation which, in the theatre, is a type of live, unscripted performance, where the performers and audience members co-create a piece of live theatre. For a long time, I have found the element of co-creation in improv theatre to be an interesting parallel with learning and teaching, particularly in active learning contexts where students are active participants in their education rather than passive recipients of learning.

From here, I started thinking about the expertise that higher education teachers need to develop. I explored some of the literature on expertise, drawing in particular on the concepts of routine and adaptive expertise (Hatano & Inagaki, 1986). In the context of teaching, routine expertise might include a whole host of procedural knowledge and skills, such as aspects of curriculum and course design, session planning, and assessment and feedback methods. But what about the aspects of teaching and learning that cannot be fully planned or predicted? Learning and teaching are complex, messy, social, sometimes frustrating endeavours, which require adaptive expertise: the ability to work in and react to unpredictable situations, creating new knowledge and experiences in the process (Siklander & Impiö, 2019). Adaptive experts need not only to be able to bring their existing knowledge, skills and experiences to bear in new, unplanned situations; they also need to be able to observe and notice what others are doing and how events are unfolding. This calls for a highly social, collaborative and team-working mindset, which is where adaptive expertise meets improvisation.

The basic principle of improv – “Yes, and…” – helps to create a collaborative environment, where ideas can be generated and exchanged non-judgementally (A Mind Apart Blog, 5 January 2019)

At the first Expertise Symposium in 2020, I presented some work that I had been doing with graduate teaching assistants (GTAs) on the use of performing arts skills in teaching. In a workshop as part of institutional training for GTAs, we had been exploring improvisation as a way of helping GTAs to develop adaptive expertise in their teaching. The workshop gave GTAs opportunities to discuss any issues or anxieties they had about teaching, and a space to explore some improv games and activities, with the aim of improving confidence, spontaneity, and skills of observation. For me, in the context of teaching and learning, the real reason for developing adaptive expertise and improvisation skills lies in the fact that it makes for an inherently student-centred approach to teaching.

Viewing teaching as a performance activity can attract criticism that there is too much of a focus on what the teacher is doing. But this, I would argue, could not be further from the truth when it comes to improvisation. Roger Kneebone puts this better than I can in his book on expertise and mastery: “In addition to practising, learning to listen, getting things wrong and putting them right, improvisers have to have made the transition ‘from you to them’” (Kneebone 2020, p. 221). An adaptive, improvising teacher is likely to have developed increased awareness of self and others, increased empathy, increased skills of interaction, and an increased ability to facilitate dialogue. But perhaps the most compelling aspect, for me, is this shift in mindset ‘from you to them’: the collaborative, empathetic, student-centred approach that is inherent in an adaptive, improvising teacher’s practice.


Richard Bale is Senior Teaching Fellow in Educational Development in the Centre for Higher Education Research and Scholarship, Imperial College London. He has interests in performance aspects of teaching, expertise development, and intercultural feedback literacy. He is the author of Teaching with Confidence in Higher Education: Applying Strategies from the Performing Arts.

Twitter: @RichBale Email: r.bale@imperial.ac.uk

References

Hatano, G., & Inagaki, K. (1986) Two courses of expertise. In H. Stevenson, H. Azuma & K. Hakuta (eds.) Child development and education in Japan. New York: Freeman, 262–272.

Kneebone, R. (2020) Expert: understanding the path to mastery. London: Viking Penguin.

Siklander, P. & Impiö, N. (2019) Common features of expertise in working life: implications for higher education. Journal of Further and Higher Education, 43(9): 1239-1254.

Developing teaching expertise is a contextualised journey

This four-part series of blogposts draws on chapters from the Routledge SEDA Series book “Developing Expertise for Teaching in Higher Education: Practical Ideas for Professional Learning and Development” (March 2022, Ed: Helen King). The book was the outcome of the popular international Expertise Symposium held online in October 2020, and features contributions from over 30 authors (videos of all presentations are available on YouTube). The second Expertise Symposium takes place live online on Friday 14th October with watch parties the following week. It is hoped that a second SEDA Series book will also be published from this event. Take a look at the Routledge website for details of all 32 books currently available in the SEDA Series.

The first post in the series explores a whole-university embedded approach to professional learning and developing expertise:


Why do university human resources divisions continue to ignore the fact that professional learning and the development of professional expertise – such as teaching – is not something that can be ‘delivered’ in short-term, atomistic activities? And that, if you really want organisational change, didactic, generic “training” is not the way to go?

While expertise in teaching may be a process that is accessible to all, professional development activities can be inaccessible to the very people needing to develop expertise. Often, professional development activities for university teaching staff are offered through central teaching and learning units, meeting governance requirements rather than the individual’s needs. These programmes can seem too generic, irrelevant to those that they seek to engage, and perceived as a voice for senior management alienated from the trials of the classroom or the culture of the discipline (Trowler & Bamber, 2005; Roxå & Mårtensson, 2009). They can be limited to induction programmes, ignoring the learning needs of experienced teaching staff seeking to expand horizons. They can ignore that learning to be an expert is a lived experience, embedded and constructed in practice (Webster-Wright, 2010). Most importantly, these types of programmes can fail to develop the expertise needed to meet long-term, systemic change or the immediate adjustments in teaching practices prompted by the COVID-19 pandemic. Perhaps one of the reasons for this situation is that, when designing professional learning programmes, the fundamental principles of curriculum design appear to be ignored and the notion of didactic training remains.

We wanted to change the status quo when we came to redesigning our professional learning programmes at an Australian research-intensive university. Instead, we drew on seminal curriculum work (Laurillard, 2010; Fung, 2017) to develop a professional learning curriculum for all university teachers. We wanted a programme that would work for all teachers: from tutors to programme conveners, learning designers to clinical educators. We focused on recognising university teachers’ existing expertise and personalising participants’ professional learning (Keppell, 2014). A centrepiece of this work was the development of a Teaching Expertise Framework that foregrounds teaching expertise as a continuum, where it is possible for different people to be at different stages of development. The framework now outlines the learning outcomes for our professional learning programmes as well as our recognition programmes. It has also allowed us to model personalised and learner-centred approaches that offer relevant and authentic professional learning experiences – core aspects of our University’s vision for our students’ experience.


Associate Professor Deanne Gannaway is the Academic Lead for Professional Learning in the Institute for Teaching and Learning Innovation, University of Queensland, Australia

References

Fung, D. (2017) Connected Curriculum for Higher Education. UCL Press, London. https://doi.org/10.2307/j.ctt1qnw8nf

Keppell, M. (2014) “Personalised Learning Strategies for Higher Education”, The Future of Learning and Teaching in Next Generation Learning Spaces (International Perspectives on Higher Education Research, Vol. 12), Emerald Group Publishing Limited, Bingley, pp. 3-21.

Laurillard, D. (2010) An Approach to Curriculum Design. Institute of Education, London.

Roxå, T., & Mårtensson, K. (2009) Significant conversations and significant networks – exploring the backstage of the teaching arena. Studies in Higher Education, 34(5), 547–559.

Webster-Wright, A. (2010) Authentic professional learning. In Authentic Professional Learning. Springer Netherlands, Dordrecht, 107–142.