Virna Rossi – Ravensbourne University London
In collaboration with:
John Baird – Reykjavik University – Iceland
Maha Bali – American University in Cairo – Egypt
Mari Cruz García Vallejo – Digital Learning Consultant – Spain
Shalini Dukhan – University of the Witwatersrand, Johannesburg – South Africa
Olivia Yiqun Sun – Xi’an Jiaotong-Liverpool University – China
With thanks to the more than 50 colleagues who have contributed to the Padlet, and to the wider SEDA community.
Following a few days of intense discussion about the uses and abuses of ChatGPT on the SEDA JISCMail list, on Monday 30th January about 50 colleagues in educational development or similar roles, from every continent, met online to share ideas about the use of ChatGPT in the specific context of staff development courses such as the PGCert (in the UK).
Ideas were shared synchronously during the live session and asynchronously after the live session had ended via a Padlet – a digital, real-time collaborative platform in which users can upload and respond to posts on a virtual bulletin board. This event was part of a series called ‘Padlet Forum’ initiated by Virna Rossi and hosted on the Educational Developers Thinking Allowed (EDTA) website.
The synchronous Padlet discussion was structured around nine questions focusing on the affordances and limitations of ChatGPT for teaching, learning and assessment in PGCert courses. The quality and quantity of responses shared on the Padlet were impressive, with each question receiving about 15 responses during the live event. The resource remains live and, we hope, will continue to be added to.
During the live event, Virna intentionally posted on the Padlet the answers that ChatGPT gave to each of the nine questions, but only after colleagues had contributed about 10 replies of their own. She did this firstly to highlight ChatGPT's text generation capabilities relative to human responses: we observed that the ChatGPT responses were standard and generic, while the human contributions to the Padlet were much more context-driven and included aspects that ChatGPT did not. Secondly, she wanted to provide an example of how ChatGPT can be used to support teaching and learning, as this type of Padlet activity can be run in class and can lead to interesting discussions about ChatGPT's potential and limitations.
This blog post, in collaboration with colleagues from Iceland, Egypt, China, Spain and South Africa, summarises the ideas shared on the Padlet and points to further ways in which this conversation could continue.
The general feedback shared by participants during the live event was that ChatGPT and other AI tools and platforms are not something that the HE academic and professional community should fear or restrict per se, but an opportunity to engage with our students in new educational paradigms that promote a more equal and creative relationship between educator and learner. AI can prompt us to treat our students as equal partners, not only in content creation but also in assessment literacy, by transforming the ways in which we assess them. As AI continues to evolve and proliferate, and since it will probably be very difficult to restrict student access to this technology, we need to consider how student learning could benefit from the integration of AI into teaching approaches and practice, in ways that maintain academic integrity and the quality of student learning development. It is also necessary to extend the conversation to the potential inequalities that could emerge if applications such as ChatGPT become paid resources.
1. What do you think are the affordances and limitations of ChatGPT (in relation to PGCert courses)?
On the one hand, ChatGPT can act as a 24/7 assistant, providing immediate feedback to natural language queries. It can also help students get an overview of the main themes and troublesome concepts in their courses. On the other hand, ChatGPT lacks emotional intelligence and empathy, and has limited understanding of context and nuances in language. It also has the potential to generate biased or inaccurate information and has difficulty in evaluating subjective or complex information.
It can be a useful learning and writing assistant for staff-students who are not familiar with reflective writing or have difficulty writing in English. However, it is important to keep in mind its limitations when it comes to critical analysis and source credibility.
PGCert programmes should also include discussion of creative learning to draw out the human dimensions of assessment and invite participants to look at assessment from a human perspective, to counterbalance the limitations of ChatGPT and its lack of ‘voice’.
2. What are your concerns about the use of ChatGPT within PGCert courses?
The use of ChatGPT within PGCert courses raises a number of concerns. Firstly, issues with attribution arise because work produced by ChatGPT cannot be reliably detected by existing anti-plagiarism software: the model's outputs are ‘original’, simply not human-generated. Secondly, students’ digital literacy levels vary greatly, and it may be difficult to accommodate everyone’s needs and comfort levels with the technology. There is also the potential for students to become overly reliant on the technology and/or to use it inappropriately, to the detriment of their own critical thinking and problem-solving skills. Furthermore, several issues need to be addressed: ethical concerns around privacy and data security; the use of AI tools to grade and provide feedback on student work; and academic integrity policies that specifically cover AI-generated content. While the advent of ChatGPT has prompted welcome renewed scrutiny of current practices, the ongoing twin issues of content and assessment overload may in fact push students into ChatGPT’s arms in search of ‘shortcuts’ and a means of cheating.
Overall, what came to the fore is the difficulty in reconciling the exciting teaching, learning and assessment possibilities of ChatGPT with its technical and ethical limitations and shortcomings.
3. How could ChatGPT enhance learning on PGCert courses? How can these tools allow us to achieve our PGCert intended outcomes differently and perhaps better?
ChatGPT can provide students with a more interactive and personalised learning experience, freeing up time for instructors to focus on other tasks. Additionally, it can spark discussions about the nature of academic integrity and the role of AI in assessment design. ChatGPT can be used to support students in thinking creatively and critically in assessment, encouraging a move away from content-driven learning to activity-driven learning. It can also facilitate discussions about the relationships between learning and the wider world, such as citizenship and critical thinking. There is potential for ChatGPT to be used in co-authoring academic papers and for exploring the accuracy of subject-specific knowledge generated by the technology. However, it is important to consider the potential for AI to reproduce bias and reinforce stereotypes, which may limit innovation. The use of ChatGPT in PGCert programmes could prompt reflection on the nature of teaching and assessment, and provide a catalyst for positive change in education.
4. Inclusivity: What’s the added value of ChatGPT for staff-students with learning difficulties or disabilities?
Note: ChatGPT is currently blocked in a number of countries, e.g. Egypt and Saudi Arabia. Additionally, ChatGPT is reportedly moving to a paid subscription/licensing model, which may limit accessibility and/or the functionality of any free version that remains available.
ChatGPT has the potential to provide a number of benefits for staff-students who are working in English as a second or additional language or who have learning difficulties or disabilities. ChatGPT is browser-based and so acts as an “always on” assistant that requires only an internet connection to access. Once accessed, the interface is simple and intuitive to use. ChatGPT can provide real-time, personalised learning support in a range of ways, e.g. answering questions, summarising and/or paraphrasing course content, and generating samples of work for comparative or review purposes. Used appropriately, the model has the potential to build confidence and stimulate thinking in all students.
5. Critical AI: what ethical considerations matter in the use of ChatGPT within PGCert and beyond? How can we include critical AI literacy on PGCert courses?
In the use of ChatGPT within PGCert and beyond, several ethical considerations must be taken into account. These include bias and fairness in its training data; privacy and security of sensitive information; attribution and ownership of generated content; and responsibility and accountability for its actions and decisions. To address these considerations, it is important to include critical AI literacy in PGCert courses. This can be achieved by encouraging students to discuss the ethical implications of AI, to analyse AI systems and their design, data and algorithms, and to develop the ethical decision-making skills needed to evaluate AI systems against principles of fairness, privacy and accountability. It is also important to acknowledge that ChatGPT is multilingual but monocultural, which can shape its responses and raise questions about its limitations; ‘algorithmic justice’ can be used to address tech bias and promote fairness.
Plagiarism is a significant issue. New knowledge invariably builds on the knowledge of others; attribution and referencing are cornerstones of our existing knowledge-creation process, and it is vital that our students understand and respect this. It is also important to note the reported exploitation of Global South labour in training ChatGPT not to produce offensive and/or unlawful content.
Overall, the posts pointed to the need to consider the extent to which PGCert programmes are teaching critical digital pedagogy more broadly, beyond just ChatGPT.
6. Assuming ChatGPT is here to stay, are new rubrics and assignment descriptions needed on PGCert? Should we create an AI writing code of conduct for PGCert participants?
The integration of tools such as ChatGPT into PGCert courses may require the development of new rubrics and assignment descriptions that reflect the capabilities and limitations of the technology. A code of conduct for AI writing could also be created to set clear guidelines for ethical, responsible and appropriate use. This could include principles such as fairness, transparency and privacy, and would help students to understand their responsibilities when using AI tools. Of course, such guidance should be in place for all courses, not simply for PGCerts.
The curriculum should encourage students to use ChatGPT to enhance their own practice, rather than simply repeating or replicating the work of others. For instance, PGCerts could emphasise the learning process rather than the outcome, and students’ ability to evaluate the technology effectively rather than just use it towards their submission. Students should be encouraged to critically evaluate AI-generated text and use their own knowledge and voice to improve it. Assessment rubrics could also incorporate aspects of interpretation and the role of the “reflective practitioner”, with students reflecting on their own practice.
7. How can we plan for more human-centric assessment forms to counter the excessive/unethical use of ChatGPT on PGCert? How can we (re)focus the PGCert course/assessment on the learning process rather than the (summative) performance?
Planning for more human-centric assessment forms to counter excessive/unethical use of ChatGPT on PGCert requires a shift in focus from performance to the learning process. The following ideas were shared on the Padlet:
- Encourage collaboration and active learning pedagogies
- Emphasise formative assessments such as in-class discussions and presentations
- Provide opportunities for students to reflect on their learning process and their use of technology including ChatGPT
- Incorporate discussions and activities promoting ethical and responsible use of technology
- Foster authenticity in assessment by incorporating real-world and personal experiences
- Promote anonymous assessments to build trust between students and staff
- Offer choice in media to demonstrate learning
- Focus on skills-based rather than knowledge-based assessments
- Consider implementing ‘Live+’ exams for offshore and online students or increased use of pen and paper exams.
In short: we are challenged to design inclusive and authentic assessments that are ‘ChatGPT proof’.
8. Students as partners: How can we invite/involve PGCert participants in the decision-making process regarding the use of ChatGPT on the PGCert course?
Students can be involved in the design of the PGCert curriculum at various points and in various ways, and this should include shared decisions about the potential uses of ChatGPT.
To gather feedback and perspectives, student surveys can be conducted and the results used to inform decisions. Discussions and debates facilitated by students themselves can also provide a platform for them to share their opinions. Student representatives can be appointed to committees or task forces responsible for decisions about ChatGPT and other classroom technology. Students can also be involved in co-creating a code of conduct and acceptable use policy that considers the ethical use of ChatGPT in different subject areas. This approach not only helps to create a more inclusive and collaborative educational environment but also values students as partners in their own learning journey.
During the PGCert, introducing moral dilemmas and case studies about the use of AI resources and AI-generated content in assessments can encourage students to think critically about their use. Encouraging students to create projects and initiatives that explore ChatGPT and other AI technologies gives them the opportunity to shape their own learning experiences and contribute to the overall direction of the PGCert programme.
9. Any other relevant questions?
Some further fascinating ideas and challenges that colleagues contributed to the Padlet and that could be further explored by the sector are:
- The broader historical context of technological innovation and the unease that often accompanies new technology in education.
- The possibility of fuelling a moral panic around the use of AI in education.
- The potential for over-emphasising the use of ChatGPT at the expense of more traditional learning processes and the development of critical thinking skills.
- The potential double standards in the use of AI in education, as compared to their use in other areas such as digital transactions.
- The need to consider the role of race, social class, and AI code and bias, especially in the context of countries with low social mobility.
- The uncertainty about when AI will become ‘too good’, and how this will impact education, coupled with the concern that the cost of ChatGPT (once it is no longer free) may widen the digital divide and perpetuate inequality.
We hope that the Padlet and this blog post can be valuable resources that colleagues can use to further the ChatGPT/AI conversation.
A note about the way this blog post was written: Virna wanted to experiment with using ChatGPT to support academic writing. She used ChatGPT to collate the responses to each of the 9 questions, but then the six of us collaboratively reviewed and improved the text through a shared Google document. We credited ChatGPT as second author of this blog post.
Virna Rossi is the Course Leader of the PGCert for Creative Courses at Ravensbourne University London. A passionate teacher since 1998, she has worked in all educational sectors: primary, secondary, college (FE), adult education and higher education. She is the author of the forthcoming book ‘Inclusive Learning Design in Higher Education’ (Routledge, May 2023).