ChatGPT in the Classroom
Paul Maher, Educational Developer – Digital Learning
The Studio for Teaching and Learning
What is ChatGPT?
ChatGPT is an Artificial Intelligence (AI) chatbot released in late 2022. Like other chatbots, ChatGPT is a text-based interface where an AI responds to queries or prompts from a human. What makes ChatGPT notable is the sophistication of its responses. Originally taught by human trainers, ChatGPT can apply procedural knowledge gained from solving problems to guide its approach to solving future ones (transfer learning). The ChatGPT interface is relatively easy for anyone to use. While it may take a few attempts to get the phrasing right for a specific request, the interface works with plain language, meaning you don’t need to know how to code to use it.
ChatGPT remembers what it has stated earlier in a dialogue and can build upon those answers, meaning the AI is able to respond to complex and process-oriented tasks. A human user can ask ChatGPT to help with anything text-based. Examples of tasks include:
develop a presentation outline on quantum mechanics
write a poem on love in the style of Emily Dickinson
write a 1,000-word essay on how archaeological records from Pompeii explain the impact of the eruption of Mt Vesuvius
compare the business models adopted by tech giants Google and Amazon
Is ChatGPT a good thing?
There are divergent perspectives on this topic. ChatGPT is the culmination of extensive machine learning, the result of technology and human training working in tandem to produce the interface. The ability to access, filter and synthesise volumes of information makes ChatGPT a powerful research tool.
Many are concerned that ChatGPT will result in a surge in instances of plagiarism. This concern is legitimate, as the text ChatGPT provides is sophisticated enough to appear human-written. The writing produced is clear, coherent and personable, and it can present an argument and incorporate multiple viewpoints. While some strategies exist to identify text generated by AI, it is possible for a student to ask the AI to generate an ‘original’ essay, poem, or report, which they then submit as their own work.
The issue may not be the existence of the AI, but rather the manner in which students and researchers interact with and use this tool. For example, a human user can ethically use the information generated by the AI to scaffold and support the gathering and synthesis of research when preparing an assignment for submission. The dialogue can also serve as a diagnostic to ensure an appropriate scope of research material has been considered.
While ChatGPT draws on an array of information sources and follows a rigorous self-review process, there are substantial limits to the information it provides. The AI draws on existing information and is limited to knowledge from before 2021. The sources the AI draws on are also likely to contain bias, reflective of broader systemic inequities and power imbalances.
How prevalent is its use?
There is already evidence of widespread use of ChatGPT in academia. Users are often locked out of the interface for extended periods, indicating how dramatically activity on OpenAI has increased since its release in late 2022. Developers post humorous comments indicating that the AI is overwhelmed by requests, asking users to be patient as they look at ways to scale access and meet the rapid increase in activity via the online interface.
ChatGPT is a tool that is likely to improve its performance quickly, meaning activity is expected to continue increasing at an exponential rate. It is safe to assume that use of AI has occurred throughout university communities, either as a preparatory tool to support students and faculty, or to fully produce text which is then presented as original research, without crediting the source.
What are people saying about potential impacts for higher education?
There are early indications that ChatGPT will significantly impact learning and assessment in higher education. The following are some examples of the range of expert commentary on the topic; some of the suggestions may run counter to considerations of accessible and inclusive education:
Calling for a return to handwritten invigilated in-person exams.
Concerns that this tool may jeopardise the practice of open-book tests and take-home exams, practices which enable students to demonstrate their understanding of how to apply the knowledge they have acquired in class, and which provide a viable alternative assessment for students who experience anxiety with timed in-person tests and exams.
Advocating the use of oral tests or on-the-spot demonstrations of competence. While this could have negative accessibility impacts, when done well public displays of competence are a high-impact practice.
Higher education needs to adopt this as a tool to augment learning, because this is the way of the future. Humans and machines are increasingly going to enter into dialogic relationships and future generations need to understand the nature of these relationships.
Practices of assessment will need to evolve, and new conceptions of ethical collaboration will need to be developed.
Heightened expectations for critical thinking and digital literacy, both in relation to identifying the limits of information and in understanding what it means to belong to an intellectual community.
Is there a way to identify ChatGPT?
Yes and no. Plagiarism-detection software providers, such as Turnitin, are looking to address the issue. There are also free alternative AI detectors, such as https://openai-openai-detector.hf.space/. However, it should be assumed that students can make modifications to the text which bypass the detection software.
Which courses and assessments are less likely to be impacted?
Courses and assessment practices with the following attributes may be less significantly affected (or not affected at all):
Reflective and metacognitive activities which require course content to be integrated and assessed in a personal way and through lived experiences.
Any learning or course activity which is personally meaningful to students in that it connects to their sense of purpose or specific values. Examples include achievement, service learning, experiential learning, research, and self-directed learning.
Courses which have established strong cultural norms and expectations around academic integrity and ethical work practices. This could be related to discipline-based professional standards.
Many of these qualities are already associated with experiential learning, high-impact practices, or learner-centred pedagogies.
Potential solutions
1. Use digital diagnostic tools to identify instances of plagiarism.
2. Require a summary or synthesis of larger assignments, through which students demonstrate their understanding of the topic:
a 3-minute thesis, an abstract, a short oral presentation, or a GIST statement
3. Ask students to incorporate context-specific or unique content into their assignment, like a demonstration, a graphic element or analogue processes.
4. Incorporate meta-cognitive processes into your assessment.
Ask students to explain and evaluate their process,
and consider the implications if they had followed an alternative process.
5. Leverage reflection. At best, AI can only scaffold students in preparing a thorough reflective response. The deeply personal and holistic characteristics of reflection, when done well, offer faculty a very useful measure of how thoroughly a student has learned something.
6. Establish cultural norms on academic integrity, for example
Students prepare a meme, video or statement on why cheating is unethical and will negatively impact their learning.
Students agree to a pledge that confirms they will conform to specific ethical academic behaviours.
7. Review your course and the context it operates in to identify any conditions which make cheating or unethical behaviours an attractive option for your students. Look at ways to remove or mitigate these factors to reduce the attractiveness of cheating.
8. Incorporate the AI into the course and potentially invite students to take leadership on how it can be ethically used in completing course requirements.
9. Ask students to generate material with ChatGPT and then critically reflect on the information. One example could be to use Socratic questions, e.g.:
Do I agree or disagree with the perspective provided by ChatGPT?
What alternative perspectives could be considered?
What are the limits to the information it has generated?
10. Design assessments that require students to draw on or use course materials not available to the AI.