Day 1 – Using prompts

Welcome to Day 1 of the 12 Days of AI! We are starting off with the first AI tool that really caught the HE sector’s interest when it went on general release in November 2022.

ChatGPT!

What is it?

ChatGPT is a conversational AI chatbot that uses language processing techniques to generate humanlike interactions in dialogue form. It can answer queries and produce a wide array of textual content, including articles, social media posts, essays, programming code, and emails.

Essentially, you input text prompts and receive AI-generated content in response – content that can include text, images, or video and that exhibits humanlike qualities. It is also worth reading ChatGPT’s privacy policy before you sign up.

How do I access it?

First you need to create an account at the ChatGPT website. You can also download an app to your phone or mobile device via your usual app store. (We’ve also listed some alternatives below.)

Prompt Engineering

Getting good responses from ChatGPT is an art in itself and is referred to as ‘Prompt Engineering’. There are many introductory guides, and Coursera even offers a dedicated online course, Prompt Engineering for ChatGPT, that is worth checking out.
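
If you would like to experiment with prompts beyond the web interface, the same idea can also be tried programmatically. The snippet below is only a minimal sketch, assuming you have the official openai Python package installed and an API key stored in the OPENAI_API_KEY environment variable; the model name shown is just an example and not something this course requires.

    from openai import OpenAI

    client = OpenAI()  # reads your API key from the OPENAI_API_KEY environment variable

    # A structured prompt: say who you are, what you want, and for which audience
    prompt = (
        "I am a Photography tutor working in a UK art college. "
        "Create a lesson plan for an induction session for first year "
        "students on a BA Digital Photography course."
    )

    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # example model name; use whichever model you have access to
        messages=[{"role": "user", "content": prompt}],
    )

    print(response.choices[0].message.content)

As with the web interface, small changes to the wording of the prompt can make a big difference to the output you get back.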

Image created by Adobe Firefly

Your (20 min) task:

Today, and for the rest of this programme, we will set you a short task that aims to get you familiar with the AI tool we are using. Obviously, a 20-minute task is not going to be enough time to fully explore its functionality, but it aims to get you to play with the AI tool and think about its potential (and its limitations).

Your task is to try out at least one of the following prompts:

  • Ask ChatGPT to write you a lesson plan, so for example, “I am a Photography tutor working in a UK art college, create a lesson plan for an induction session for first year students on a BA Digital Photography course”. Maybe adapt this prompt for a subject that you actually teach. How useful was ChatGPT’s response?
  • Ask ChatGPT to create some mini case studies for you, so for example, “create 5 mini case studies of 200 words each for discussion on how educational videos can be used with art students”. Again, you might want to adapt this prompt to create case studies related to topics that you teach or are interested in. How could you use these case studies in your teaching?
  • Ask ChatGPT to brainstorm some ideas for you, so for example, “Name the 5 most influential artists of the twentieth century”. There is no limit to what you can ask it, but its responses need to be looked at critically. Were there any biases in the responses to your prompts?
  • Ask ChatGPT to create some quiz questions, so for example, “Create 5 multiple choice questions based on the topic ‘contemporary issues in UK politics’. Include 4 options for each question and include the answers”. You might want to see how ChatGPT responds to the prompt when you ask it to create similar questions on topics you are interested in. How did it respond? Could you actually use the questions it generated?

Alternatives to ChatGPT

Additional resources

Chris made this Padlet for a staff development session we offer UAL staff about using text-based AI tools to support student learning. It shares different ChatGPT outputs similar to your Daily Task. What differences do you notice?

(UAL staff – we’re offering this session and more during staff development week in December. Learn more on Canvas.)

Join the discussion

The questions below are important, but don’t feel limited by them – there may be other issues or questions that you want to explore. The aim of the discussion area is to get a variety of different opinions about AI. Many of our views about AI are still evolving and might actually change as we try out these new tools – you might just want to use this discussion area to test out some of your thoughts.

Leave a comment here or on X using #12doAI in response.

  • How did you get on with the task? – did you find it useful?
  • Did you notice any biases in its responses?
  • Could ChatGPT enable you to stay focused?
  • Could you see any inaccuracies or limitations in the responses you got?
  • Were there any surprises?
  • Did it give you any other ideas about how you could use ChatGPT in your teaching? If so, how else could you use it?

Join the competition

In case you hadn’t seen, we’re also running a competition during this course where you can win actual prizes! Learn more about how to enter the competition.

70 Comments on “Day 1 – Using prompts”

  1. I asked it to do prompt 4 (quiz questions) and it was pretty easy to use, and it was interesting to see what questions it created (my exciting prompt was quiz questions for copyright in UK HE). It could give me a basic structure for what I could include in some training so that the questions were relevant. It did make a mistake in one of the answers and when I queried it, it did say it only had information up to 2022, even though the correct answer precedes this date by quite a while. I think this could be because it was looking at US information rather than UK.

    • Hi Andrew, I think this is a good example of how ChatGPT can be used, and also of realising that someone using it has to have the expertise to review its output – especially when it makes mistakes or hallucinates.

  2. I tried the photography example. Initially I thought there was too much content and it would be tricky to cover that much ground. I wondered if ChatGPT would adapt for different levels. So I changed the BA to Level 3. I could see there was a difference in the content and the tone. Expectations were simpler and less detailed.

  3. The lesson plans created, even with some additional prompting, feel quite generic and don’t provide anything extra compared to the type of sessions that I could design. They could be quite useful as a starting point for inspiration, particularly for colleagues looking for more inspiration about how to include more active learning within their sessions. The quiz questions were well formatted but had factual inaccuracies within the questions and the answers, so they were not useable in their output form but, again, could be used for inspiration to think about different types of quiz questions to include.

  4. Chat GPT has a tendency to compile a top ten list of responses / answers.
    I found that I had to prompt it to answer in a conversational tone.
    The power of its use is definitely in the prompting.

  5. Re the above prompts, I would also encourage people to go deep into their subject area. My PhD (long time ago) was on very early cinema (actuality films) and I’m amazed at the level 7-8 conversation I can have with ChatGPT-4 (the ‘4’ matters) on, for example, Tom Gunning’s ‘cinema of attractions’ thesis. My point here is to begin to think about ChatGPT as an ‘exoskeleton for the mind’ (to quote Jules White, who runs the excellent prompt engineering course noted above). ChatGPT isn’t like regular software someone makes for us which we like or not. Rather, it’s a cognitive tool (albeit an early model) that we co-think with and develop with. All a bit mind-boggling – an exciting time!

    • Hi John, it’s interesting that you’re able to have conversations at that level with ChatGPT. I’ve only tended towards using it for basic tasks and not explored in much more depth. Is ChatGPT-4 significantly different in terms of the levels of response it provides to 3.5?

      • Hi Joseph, with the caveat that current LLMs (like ChatGPT) are experimental and they make mistakes (I think the bias critique is silly though), they provide a glimpse of where AI is heading, very soon. Nobody knows how LLMs work, so I’d say why not experiment! Version 4 is much more powerful and still being developed. Here’s the conclusion to a conversation comparing my PhD thesis with the main ‘name’ in the field. It was part of an assignment while doing the Coursera prompt engineering course. I don’t think the forum admin would appreciate the whole conversation given its length. The point that needs emphasising is that this is a conversation with a machine:

        User
        That is very impressive. I don’t think I could have written that better myself – and it is my thesis! One criticism is regarding ‘credibility’. While Gunning is a highly respected academic, I didn’t pursue an academic career and so my thesis stands alone as a non peer-reviewed piece tucked away in a University library. Academic culture and tribalism aside and purely on the basis of the texts themselves, though, your account is excellent.

        ChatGPT
        Thank you for your kind words and feedback. The matter of credibility is an important point, and I genuinely appreciate you raising it. Peer review and consistent academic contribution indeed add a layer of validation to scholarly work. However, every thesis, especially a PhD one, contributes knowledge to its field, and it’s valuable to treat such works with the respect and consideration they merit. Your work provides a perspective, and understanding that perspective can be enlightening regardless of its wider academic reception.

        When analyzing and comparing texts, I strive to be as objective and comprehensive as possible. But, as you’ve rightly highlighted, context—including the academic “weight” of the authors—can play a significant role in how works are traditionally evaluated and compared.

  6. I asked ChatGPT to create a lesson plan and was amazed at how quickly it returned a well structured lesson plan! It was a fairly generic plan (but I was vague in my prompt!) but it would be a super starting point regardless. The potential for this technology is insane!

  7. I’m at the starting point of a bit of practice research looking at ways I can use AI to assist with the creative practice of dance choreography. I’m working with some simple motion tracking equipment and was able to get ChatGPT to generate some positioning data after giving it the coordinate boundaries – potentially the start of some movement phrases (or not if they end up being rubbish!). It even formatted them for me in a way that I can use with my existing software. Really interesting stuff.

  8. I’ve used ChatGPT to do these things quite a bit and they are pretty good. I find them most useful as a starting point to build on and tweak from what is generated, particularly from the lesson plan perspective (but perhaps that’s just my need to be in control!).

    With questions and knowledge, as you would likely expect (and as I see from the answers above), it does sometimes get things wrong, particularly when we’re digging deep into specialist subject knowledge or when it comes to the nuances of people and real life. That is something I’m very mindful of, particularly when it’s learners using AI.

  9. I asked ChatGPT to create a lesson plan on assessment and feedback for new academics on a PGCilt. The plan was quite good, but too vague and there was too much content. Having said that, it offers a good starting point for further refinements. The case studies generated were really good!

  10. After a few prompts to revise, I did get a usable lesson plan for the Accessibility training session I am delivering this afternoon, but it didn’t really suggest anything I wouldn’t have thought of anyway. But the rather retro Materials needed section made me smile – Projector and screen, Whiteboard or flip chart, Markers, Handouts on accessibility guidelines, Access to a computer with internet access…

  11. I asked ChatGPT to create a quiz on the topic of imposter syndrome with details as suggested in the task. I thought the results were excellent and could be used. (However, I am not an academic/teacher.)

  12. I used Chat GPT to design a lesson plan for an induction session for first year students on a BA tabla course (north Indian percussion). It did a really good job of getting the basics right. As I teach this instrument, I only had to make a few tweaks to it. It does lack a little bit of cultural understanding when it speaks about the Tabla. Those are the elements I have to fill in. Overall, it did a great job.

  13. Brief for Chat GPT: I am a librarian and I want to teach foundation students about information literacy and using the library catalogue. Please make a 45 minute lesson plan for me.

    The lesson plan was generally well structured in terms of timings, but I found that some teaching points were unnecessary (i.e. explaining the term ‘information literacy’ and its purpose in today’s society, as well as talking about library catalogues existing in printed format). I also found that some topics of discussion were only given a very small amount of time and I don’t know if I could do that in real life (i.e. I couldn’t talk about credibility and reliability in under 2-3 minutes).

    The activity proposed was good, but giving students only 5 minutes to engage with the catalogue, and then expecting them to perform a 5-minute reflection task afterwards, is unrealistic. Again, I also feel that this activity is not suitable, given that I was allotted 10 minutes to demonstrate the catalogue to students, which can be done but would be really rushed.

    I didn’t notice any biases immediately, but I found it interesting how Chat GPT’s tone spoke more to library students than Foundation students in its use of language, and also its focus on ethics and organizing resources, which wouldn’t be appropriate for Foundation students.

    Overall I feel that the pitching was not correct for Foundation level. I feel that time could be utilised better on other, more relevant topics such as understanding how the library search page works, building confidence in using it etc., and I also take issue with how crammed the areas are in each section, as it will contribute to information overload and I fear that I’d lose the students. Ultimately I feel that this isn’t a lesson plan I can use, and it didn’t provide me with any new ideas. So I’d say the lesson plan was good, but too ambitious and not pitched at the right level. Perhaps I need more experience in prompt engineering to get something more relevant.

  14. I really enjoyed trying these different examples; I used some related to my teaching and also my research.
    There is an idea-generation component of a module I teach for which I think this could be really useful to show students how to use, and to open the discussion about ethical use of AI.

  15. I created a lesson plan for delivering a session on ‘using the online digital library’. I found that the response covered pretty much everything I traditionally deliver with the exception of referencing, which I cover in a separate session. I was surprised it outlined a 90 minute session, which is quite long; we are expected to deliver in a 45-60 minute window. I also tried the same prompt aimed at secondary school and found this more appropriate (60 mins) but again would not include referencing at this stage. It can be useful for someone starting out in the role as a template/guide, not prescriptive. Regardless of what level your institution is, not all learners are at that academic level, so that needs to be taken into account when prompting ChatGPT.

  16. To focus my response I have just adapted the ‘induction’ session prompt for ChatGPT and will reflect on its output below. Here was my prompt: “I am a History tutor working in a UK university, create a lesson plan for an induction session for first year students on a BA History course”. You can see the output through this link: https://chat.openai.com/share/815153e6-e033-4fbf-813a-da817468542c

    On the surface, there are a lot of useful suggestions in the lesson plan but there are too many activities, and the inclusion of ‘homework’ indicates ChatGPT is not aware of the HE context or what ‘induction’ implies in the scope of termly study. The focus of an introductory / welcome session should be on building belonging and community, and this is only given 5 minutes (although an icebreaker quiz is good, encouraging students to share their discussions with a wider audience and to mingle would be better!). The suggested activity ideas are also light on detail / vague. In general, the timings suggested for each activity are impossible to accomplish in the context and, given the expected prior knowledge and background of those in the room, likely to leave learners confused. The format of the lesson plan itself is not appropriate for ensuring constructive alignment. I’m sure through further refinement of the prompts ChatGPT would arrive at a more suitable session plan, but it would still demand a trained eye to review.

    The exercise confirmed suspicions that the true usefulness of gen AI is its power to quickly and easily generate text to edit and pull apart, and that’s possibly more helpful than starting with a blank page.

  17. I requested names of artists, followed by graphic designers, producing work about HIV/AIDS (as it is World AIDS Day tomorrow). The artists listed were all North American except for one South African, and all predictable but accurate. It struggled more with ‘graphic designers’, coming up with a couple of the same artist names and a couple whose work in that area is very minimal (which could be misleading if you had no prior knowledge). But the generalised statements about this area of work did not have any errors as such.

  18. I am amazed at how quickly ChatGPT works! I got it to create quizzes and write a lesson plan, and got everything answered within seconds. And the answers are quite good and relatable.

  19. I asked ChatGPT to write a lesson plan, adjusting the prompt to my needs, and it did a pretty good job of coming up with a detailed lesson plan.

  20. I asked ChatGPT to create an itinerary for an event involving artists – it was really quick and came up with things I wouldn’t have thought about.

    I also asked it more general questions about social mobility and outreach in communities which it seemed to answer pretty well!

  21. I tried using Bing Chat to create mini case studies for discussion on how to implement AI in creating engaging eLearning courses; unfortunately it said the task was beyond its capabilities and instead suggested how I should phrase my question, e.g. what are the benefits of using AI in eLearning, how can AI personalize the learning experience?, etc. For ChatGPT, I prompted it with the same task to create mini case studies and it gave me some generic answers such as personalized learning experiences and adaptive learning paths. I then asked ChatGPT whether I could have some real-life case studies naming the AI applications used for the same task, but the answer was still quite generic and it didn’t explain how the applications were implemented or give solid examples. I would say I will use ChatGPT in future if I know how to create the right prompts that are specific and ask the right questions.

  22. Definitely it won’t help you with decision making. This was my Q: if I had one choice, should I study or spend time with my family over the Christmas break?

    ChatGPT
    The choice between studying and spending time with your family during the Christmas break ultimately depends on your individual circumstances, priorities, and the specific demands of your situation. Here are some considerations to help you make an informed decision:

    Urgency and Importance of Study:

    Evaluate the urgency and importance of the coursework or tasks you have during the Christmas break. If there are critical deadlines or exams immediately after the break, you might need to dedicate some time to study.
    Communication with Family:

    Have open and honest communication with your family about your study commitments. Explain the importance of your academic responsibilities and work together to find a balance that suits everyone.
    Balance and Time Management:

  23. Lesson plan activity was OK – despite the old school suggestion of whiteboard and markers. It did provide a template to adapt for real world usage. Looking forward to becoming more proficient at prompting to obtain more nuanced results. The case study activity was better as I was more specific and asked the programme to suggest brand identity reports for luxury, sustainable, FF, vintage and sportswear brands for BA Fashion Students. The real world examples given were authentic and credible. The multiple choice quiz I chose was based on Harvard referencing and paraphrasing. It included 10 questions and was largely accurate, although a couple of questions were repetitive and one answer was missing some details. Overall, it’s definitely a promising start.

  24. I used Chat GPT to create a lesson plan, 5 case studies and the educational videos prompt, but for a non-academic subject – dressmaking for beginners. I was really impressed with the speed and the output, and it was a really great starting point for someone teaching night/day classes who does not have a lot of time to spend on this sort of thing.

  25. I adapted the questions for maths at high school level as I am about to start tutoring at that level. The lesson plan was quite good and even used words that I had to double check the meanings of. The multiple choice questions weren’t great but to be fair I shouldn’t be writing maths multiple choice questions anyway.

  26. I used ChatGPT to create a lesson plan for a digital skills session I run frequently and was very impressed with the output. It missed out one topic I’d consider essential (I expect I could have got it to include it by modifying the prompt), but otherwise the plan was very detailed & relevant, and included additional information I didn’t expect such as objectives, materials and notes.

  27. I used ChatGPT to generate a teaching plan for a workshop on reflective writing. In my prompt, I narrowed the scope to reflective writing on design practice, limited my target audience to BA Interior and Spatial Design and specified Graham Gibbs’ reflective cycle as the reflective model to be introduced. The ChatGPT teaching plan is basic and seems easy to follow. The tool efficiently gives me a basic structure for the teaching, with great potential for adaptability. More importantly, it covered ‘teaching material’, which I don’t always take into consideration while planning a lesson. The content of the teaching plan, however, is generic; that is to say, even though it has BA Interior and Spatial Design students at the top of the plan, the content is not discipline specific and could be applicable across the board. One thing I do not agree with in the plan is the order of showing some examples of reflective writing in design practice and then introducing Gibbs’ reflective cycle. Depending on the time available, I would either (1) introduce Gibbs’ model first, show some examples based on it, and guide students to do reflections in accordance with the model [deductive], OR (2) show some examples based on Gibbs’ model, invite students to analyse the examples to find the rhetorical structure and then introduce the model [inductive].
    As ChatGPT suggested that I show students examples, I then prompted it to generate reflective writing on architectural or spatial design based on Gibbs’ cycle. A very short piece of reflective writing was generated, with about two sentences for each step of the cycle. I then prompted ChatGPT to expand the analysis and evaluation stages and integrate some references. The degree of criticality seemed to improve. However, the writing does not look authentic. The names of the 6 steps in Gibbs’ cycle are used repeatedly (in different grammatical categories, like noun and verb). The writing reads as if someone wrote it for the sake of the reflective cycle, rather than genuinely reflecting on their experience. Three pieces of literature were included; however, one may be fabricated and two are not properly referenced.
    Despite limitations, I do find ChatGPT very useful and powerful!

  28. I entered the prompt:
    Prepare a task for students to design an icebreaker. This is for teaching in higher education.
    The outcome is impressive! I copied the first part below:
    Task Title: Creative Icebreaker Design Challenge
    Objective: Develop an engaging and inclusive icebreaker activity that fosters a positive learning environment, encourages interaction, and helps students connect with each other in a higher education setting.
    Instructions:
    Understanding the Audience:
    Consider the diverse backgrounds, experiences, and interests of the students in your class.
    Keep in mind the size of the class and the available time for the icebreaker (approximately 15-20 minutes).

    Here is the problem: now that I see what ChatGPT has produced, I’m feeling less confident about designing this task myself. I’m asking myself why I feel this way. Is it because the prompt was just an idea and ChatGPT turned it into something concrete with substance, and this made me realise I hadn’t really thought about how to do this myself?

  29. I agree with others’ comments about there being too much content for a short session, but it was interesting to see that the content was similar to what I would have used. I also created a case study about keeping safe online which was sensible, but I should have added more detail in my instructions as the example was regarding a child at school and our adults would not have found that interesting. The red flags were useful prompts for a session. The multiple choice questions were OK and when asked for correct answers, they were correct 🙂
    I have had an experience of using AI where it made up some incorrect questions and answers!

    I find creating multiple choice questions irritating as I usually spend time thinking up wrong answers, which is not the best use of my time.

    The input is very important so will keep trying this and other options more in the future.

  30. Hi!
    I have used it to provide me with a roadmap to create a branching scenario on ThingLink.
    Some of the options given after re-prompting remained quite superficial, and this called for exploring those in a parallel conversation with GPT.
    It was good to get the ball rolling and assess my degree of confidence on the topic – I can actually flag gaps in GPT’s responses.

    It did not give any inaccuracies, but rather superficial content at times.

  31. I asked GPT-4 to create a one-hour lesson plan for undergraduate art students on colour theory, covering key principles and learning activities. The result was quite generic, but a reasonable starting point, listing topics and activities. Asked to expand upon a section (Colour Wheel and Colour Relationships), it did so by really just ticking off a list of concepts. If a lecturer could add a narrative path through the topics to make it engaging, that would help.
    The time allocated for the class activities and discussion also felt way off and over optimistic.

    Completely missing were any references to key theory or texts. When asked to provide those, GPT did, starting with Albers and a diverse mix of seminal and practical readings. Asked for PhD level readings and it gave what appears to be an interesting but short and somewhat random list, but I’m no expert. It also provided a reasonable but to my mind somewhat uninspiring list of art to discuss – Kusama, Richter, Hirst etc.

    https://chat.openai.com/share/cb31c376-1bb1-429d-8dc1-a1835afc97db

  32. I used a lesson plan prompt, but changed it to a subject where I need to create an online course for staff. I’ve used ChatGPT before to do something similar, and I’ve found it to be fairly useful in putting together a structured, coherent lesson plan.
    For this particular prompt, I couldn’t see any false information or biases – as expected.
    I was pleasantly surprised to see sections covering the importance of the subject, as well as the content / information that you give to learners.
    I’m certainly going to use it as a basis for my lesson plan.

  33. I tried the ‘influential artists’ prompt and was a little dubious about the answers I got. But I’ve found with these AI tools that if you keep giving it more prompts to be more and more clear about what you want, it gets much better. In this case I defined more precisely what I meant by ‘influential’ and was more convinced by the answers I got. I then started arguing with it about whether some of the artists were really as influential as it claimed, and to be fair it has persuaded me!

  34. I have played with Chat GPT already and, from a personal perspective, I like the challenge of getting the prompt “right”; having the knowledge and expertise makes it a fun challenge honing it. I know what I am looking for. Someone else has mentioned that it is really important to be aware of the varying degrees of skill levels for students. The idea that students/young people are digital natives is an unhelpful notion. We need not only to upskill ourselves, but support the skill development of students. Many of those skills are the common skills needed for academic literacy in general – critical thinking, evaluation, learning to learn, etc.

  35. I asked Chat GPT 3.5 to “Create 5 multiple choice questions based on the topic ‘19th century issues in UK politics.’ Include 4 options for each question and include the answers”…. They were pretty guessable basic quiz style questions but the distractors were at least plausible. I then asked Chat GPT to “provide some feedback to be revealed with the answer that explains the correct answer and indicates why the other options are not correct answers.”… this effort was decent but could have been fleshed out more – especially when explaining why the red herrings were not appropriate answers.

  36. I am a Media Design and Photography Tutor. Chat GPT’s answers were helpful, obviously giving a solution in a very short amount of time, nothing compared to the time I would have needed…. The answers were not surprising or individual, but broad. To create something unique I probably need to ask more detailed questions and learn how to prompt right. Getting an outline and filling in the details definitely helps but is not enough.

  37. I like the fact that you can continue to get ChatGPT to refine the content it originally generated to get a better more focussed result.
    For me the real benefit of this type of tool is it helps reduce some of the tedious work involved with course planning. However, it doesn’t diminish the need to engage your ‘Teacher’s Brain’ to assess and evaluate the results.

  38. I asked it to create a lesson plan on introducing reflection to educators developing a teaching portfolio. The content and sequence was pretty good and touched on the main aspects which I would expect to see in such a lesson plan. The timing was interesting – it squashed quite a bit into 90 minutes! I don’t think there would have been much space to reflect!

  39. Focus: that’s an interesting thing. I find myself going on tangents much more when using these LLM tools. Maybe that’s not a bad thing… it is difficult to resist iterating prompts again and again and again, ‘just to see….’. Sometimes it is like doomscrolling and at the end I’m no further ahead.
    Does it help me focus on a topic? Maybe. On occasion I can get in and out of the tool with a quick insight or way forward.

  40. It seems like a great tool for sounding out ideas and dealing with more mundane tasks, simple letters, or outlines of plans, etc., however I certainly wouldn’t rely on it for anything more in depth, or anything which needed to be factually correct. I also got it to write a poem around the theme of Loss which, although adequate, and certainly better than anything you’d find in a Hallmark card, was noticeably lacking in depth, feeling, nuance or individual style. In saying that, however, I’ve really only played with images, music and video so far, so have not as yet “built a relationship” with the chat AI and am assuming that, as in the other areas, the more I interact with it, the more I will start to understand how the AI reacts to my prompts, and how to “engineer” those prompts to elicit results that are more in keeping with my own style and taste.

  41. I asked Chat GPT to create an introductory pattern cutting lesson plan for first and third year BA Fashion Design and Technology. Although generic, it covered the main points expected if I were to deliver the lessons, and the tone and language felt appropriate for the different years. I also asked it to create learning objectives for final year fashion students and it touched on most of the points. All in all I think it is great for making a start and a real time saver with multiple choice questions.

  42. I asked ChatGPT ‘How can we incorporate AI into our university writing assessment while still testing the students’ writing level and ability?’ The response was as follows:

    Automated Feedback:
    Use AI tools to provide automated feedback on grammar, spelling, and syntax errors. This allows students to receive instant feedback on technical aspects of their writing.
    AI systems can also highlight areas where the student can improve clarity, coherence, and overall writing style.

    Plagiarism Detection:
    Implement AI-powered plagiarism detection tools to ensure the authenticity of the students’ work.
    These tools can compare students’ submissions against a vast database of academic and non-academic content to identify potential instances of plagiarism.

    Content Analysis:
    Employ natural language processing (NLP) algorithms to analyze the content of students’ essays.
    This can include assessing the depth of analysis, argumentation, and the use of supporting evidence.

    Scoring Rubrics:
    Develop AI-based scoring rubrics that align with the specific writing criteria you want to evaluate.
    Train the AI system to assign scores based on predefined criteria, allowing for consistency and objectivity in the assessment process.

    Revision Assistance:
    Offer AI-powered revision suggestions to guide students in improving their drafts.
    These suggestions can focus on structure, organization, and the development of ideas to help students refine their writing skills.

    Adaptive Learning Platforms:
    Explore adaptive learning platforms that use AI to personalize learning experiences based on individual student needs.
    These platforms can provide targeted exercises and resources to help students strengthen specific writing skills.

    Peer Review with AI Support:
    Implement peer review processes augmented by AI tools.
    AI can assist students in providing constructive feedback to their peers by flagging potential areas for improvement and offering suggestions.

    Essay Scoring Models:
    Develop AI models for essay scoring, considering various writing dimensions such as content, organization, language use, and mechanics.
    Train the model on a diverse set of essays to ensure robust and fair evaluation.

    Continuous Improvement:
    Regularly update and improve the AI models based on feedback from teachers, students, and ongoing assessments.
    Continuous improvement ensures that the AI system adapts to evolving writing standards and pedagogical goals.

    Ethical Considerations:
    Clearly communicate the use of AI in writing assessments to students and address any concerns regarding privacy and data security.
    Ensure that AI is used as a tool to support learning and not as a replacement for human judgment in critical evaluation.

    Most of the answers are suggestions that my department had already considered but the problem appears to be that if the students use AI to improve their own or their peers’ work the tutor is still not really getting a clear idea of their writing level – just their ability to improve a paper by using a tool in a more focussed way. They just need to know what questions to ask rather than having to make decisions about language use. However, I do like the idea of natural language processing (NLP) algorithms to analyze the content of students’ essays.

  43. I agree with what most people said – ChatGPT is good for creating a structure, but lacks details. I asked it to create a lesson plan for an induction session to immersive technologies, in particular how VR and AR could be used in the illustration discipline for first year students on a BA Illustration course. Although it provided the skeleton of a structure, it definitely lacked depth. I went further and asked:
    Applications in Illustration (30 minutes):

    Discuss potential applications of VR and AR in the illustration discipline.
    Showcase examples of VR/AR illustrations and how artists have used these technologies to enhance storytelling or create interactive experiences.
    Highlight the potential for portfolio enhancement and industry relevance.

    This came back with useful examples. It just shows that you really need to have some excellent knowledge for what you produce with ChatGPT to be useful. It definitely helps with the structure of a lesson plan, not so much with the content in detail.

  44. I think that most people have the same opinion about GenAI such as ChatGPT as I have. Enter in a rubbish prompt and get a rubbish output (Rubbish in = rubbish out). However, regardless of how much time I spend designing my prompts I always make sure that the output is heavily scrutinised for errors, bias etc.

    The VERY BEST part of ChatGPT is it can always be asked to go one step further and it really comes down to how creative and innovative you are. Every time I have finished asking it to do something for me, I stop and ask myself… What else could I do based on the last output it gave? For example, Generate a lesson plan that includes some activities THEN expand on the activities to create an exemplar set of answers THEN Use the answers to create a rubric THEN use the rubric to create some case studies…. You can go on FOREVER!

    If anyone wants to see the prompts and outputs I created use the link below:

    https://chat.openai.com/share/6c84a0ba-43d5-4717-8205-ca2523f554e8

    Merry Christmas everyone!

  45. I asked ChatGPT to create a quiz and then rewrite the questions to make them more challenging. I then asked it to write a lesson plan where it came up with quite an old fashioned structure. I asked it to change it to task-based learning and it did a good job. As everyone says, it’s a good start, rather than a fully fledged outcome, but it’s still impressive.

  46. I’m checking out the Prompt Engineering course signposted to in the task information above. The videos are a bit waffly and repetitive, but I am learning some useful basic stuff about writing more powerful prompts. I haven’t done a huge amount with ChatGPT so finding this a useful way to fast track some knowledge and skills development.
    This was a quote from today’s video “To really use it powerfully, you have to use your own creativity and thought to get specific about details aspects of what you want discussed. And think about what are the right patterns to put into your prompt in order to trigger the behavior that you want”. We probably ‘know’ this already, but it did give me a better appreciation of the knowledge and skills to be developed to be able to use these tools well, and to generate content of a higher degree of sophistication that gets us to our end point quicker.
    I just need to complete the 18 hour course in a week before I get charged a subscription fee!

  47. I provided Chat GPT with the following prompt: “Create a lesson plan for an active learning session tailored for Level 4 Graphic Design students who are working on a 1500-word essay.”
    The response was extremely fast, and the lesson plan was well organised and considered. Although it was balanced toward the generic and predictable, I felt it holds the potential to accelerate the session design process, allowing academics additional time to generate meaningful activities and identify a number of subject-specific examples that could enhance the students’ learning experience.

  48. Hi, trying to catch up now. Isabel here, from Queen Mary University of London. I have asked ChatGPT to create a lecture on organoids for 2nd year biomedical students. It did very well! But ChatGPT said
    ‘I can’t browse the internet directly to fetch real-time images or diagrams’…any comments on this?

  49. I chose the mini case studies prompt. It was so quick – 5 case studies were done in a flash. They read well, but as I am not a subject matter expert I can’t identify inaccuracies or limitations. They do seem quite technical/utilitarian in approach – all about improvement; is this a positive bias? One case study was on education and training and it made sense, but I’m not sure how well it would work. I redid the prompt to focus more clearly on education and this response was much more suitable for me. I could see the difference the prompt makes. The case studies prompted thought and I will use them for discussion with my current students on the LMS. Makes me want to explore it some more.

  50. I used ChatGPT to create a lesson plan, which was very good, but I did keep going back to refine the lesson (more interactive, less time), so for me the real benefit was this ability to really hone in.

  51. I did all the tasks for an imaginary course in my discipline (intercultural communication) and it came up with an okay but pretty superficial lesson plan. It demonstrated a tendency to use nationality as a proxy for culture in the scenarios (e.g. Malaysian and German businessmen), which is very problematic as it’s something I would want my students to be critical about. Also, the tasks didn’t really feed into or build on each other – just discrete tasks. It is very obviously an AI generated lesson plan.

  52. I asked it to create a lesson plan for a session to help staff check their documents for accessibility issues, including a short practical activity. The result was similar to a session I had already led without AI help, but covered all parts of a standard lesson plan (so good for creating paperwork!). The session was teacher-led, including a pairwork activity and homework.
    The second iteration, without changing the prompt, was slightly more technical but used a similar design, with the activity slightly tweaked, suggesting a checklist with a group activity.
    Then I asked it to plan a session in the format of a web quest and it created a more interesting plan, inquiry-based, with more active learning tasks. However, I think the activities would take much longer than the allotted time of 90 mins, so I specified breaking it up into 3 sessions, and it planned three 45-min sessions.
    I also asked it to create 5 personas for testing accessibility, each with an impairment. The personas were realistic, describing background, challenges and goals for each, but lacked diversity, e.g. all had ‘English-sounding’ names, apart from one slightly ethnic surname: ‘Patel’, and no cultural details, but I didn’t specify any of this, so not surprising.

  53. I asked ChatGPT to create five multiple-choice questions on one of my favourite TV shows. I had to ask it to change two of the questions, one because it was essentially a repeat of another question and another because it was a big spoiler that I didn’t want the quiz to reveal. I would double-check the answers before using it as a quiz because I noticed a post in the Padlet where the author had asked ChatGPT to do a similar task, and one of the questions/answers was inaccurate. I appreciate the message at the bottom of the screen though: “ChatGPT can make mistakes. Consider checking important information.” I liked the variety of the questions it was able to produce so quickly, both for the initial task and on asking for more options.

  54. Hi, I am now testing ChatGPT to see if it can create a multiple choice questionnaire…

    My prompt was “Wondering if you could help me create a multiple choice questionnaire with 10 questions about important points when using generative AI for learning. I would like to trial this with my MA students. Thank you.”

    And this is what I got. I had to wait a little while, but it was generated much, much faster than I would ever be able to. Not checked it yet.

    Certainly! I’d be happy to help you create a multiple-choice questionnaire on important points when using generative AI for learning. Here are 10 questions for your MA students:

    **1. What is the primary goal of generative AI in the context of learning?**
    a. Replicating human intelligence
    b. Generating random outputs
    c. Mimicking pre-existing content
    d. Enhancing computational speed

    **2. Which of the following is a potential advantage of using generative AI in education?**
    a. Limited creativity
    b. Personalized learning experiences
    c. Slow data processing
    d. Static content generation

    **3. What is a key ethical consideration when deploying generative AI in education?**
    a. Lack of privacy concerns
    b. Bias in AI algorithms
    c. Overemphasis on traditional teaching methods
    d. Slow adaptation to new technologies

    **4. How does generative AI contribute to adaptive learning?**
    a. One-size-fits-all approach
    b. Tailored content based on individual progress
    c. Static content delivery
    d. Minimal interaction with learners

    **5. What role does data quality play in the effectiveness of generative AI for learning?**
    a. Irrelevant to the performance
    b. Crucial for accurate outputs
    c. Secondary consideration
    d. Does not impact learning outcomes

    **6. Which of the following is a potential limitation of generative AI in educational settings?**
    a. Limited scalability
    b. Homogeneous content generation
    c. Slow adaptability to new concepts
    d. Low computational power

    **7. How can generative AI be used to foster creativity in education?**
    a. Restricting diverse content creation
    b. Encouraging rote memorization
    c. Enabling dynamic and novel content generation
    d. Ignoring individual learning styles

    **8. In the context of generative AI, what is fine-tuning?**
    a. Generating random content
    b. Adjusting model parameters for specific tasks
    c. Ignoring the learning context
    d. Copying pre-existing materials

    **9. What is the importance of transparency in generative AI systems used for learning?**
    a. Irrelevant to user experience
    b. Fosters trust and understanding
    c. Hinders the learning process
    d. Increases computational load

    **10. How can educators mitigate the potential biases in generative AI models?**
    a. Ignore biases for efficiency
    b. Regularly update software without reviewing biases
    c. Implement diverse datasets and conduct bias assessments
    d. Rely solely on traditional teaching methods

    Feel free to adjust the wording or content of the questions to better fit the specific focus or context of your course. If you have any specific topics you’d like to include or modify, let me know!

    What does everybody else think?

  55. Also asked for the correct answers…

    Certainly! Here are my suggested correct answers for the questionnaire:

    1. **What is the primary goal of generative AI in the context of learning?**
    – **Correct answer: a. Replicating human intelligence**

    2. **Which of the following is a potential advantage of using generative AI in education?**
    – **Correct answer: b. Personalized learning experiences**

    3. **What is a key ethical consideration when deploying generative AI in education?**
    – **Correct answer: b. Bias in AI algorithms**

    4. **How does generative AI contribute to adaptive learning?**
    – **Correct answer: b. Tailored content based on individual progress**

    5. **What role does data quality play in the effectiveness of generative AI for learning?**
    – **Correct answer: b. Crucial for accurate outputs**

    6. **Which of the following is a potential limitation of generative AI in educational settings?**
    – **Correct answer: b. Homogeneous content generation**

    7. **How can generative AI be used to foster creativity in education?**
    – **Correct answer: c. Enabling dynamic and novel content generation**

    8. **In the context of generative AI, what is fine-tuning?**
    – **Correct answer: b. Adjusting model parameters for specific tasks**

    9. **What is the importance of transparency in generative AI systems used for learning?**
    – **Correct answer: b. Fosters trust and understanding**

    10. **How can educators mitigate the potential biases in generative AI models?**
    – **Correct answer: c. Implement diverse datasets and conduct bias assessments**

    These answers are based on general considerations and best practices related to generative AI in education. Depending on specific course content or emphasis, you may want to adjust them accordingly.

    I then asked “Do you really think that you will be able to replicate human intelligence?”

    ChatGPT reply: “While I can perform certain tasks and generate human-like text, it’s important to clarify that I am not capable of replicating human intelligence in its entirety. Replicating human intelligence involves not only understanding and generating language but also comprehending the world in a holistic way, having consciousness, emotions, and a deep understanding of context.

    I am a machine learning model developed by OpenAI called GPT-3.5, and my abilities are limited to the patterns and information present in the data on which I was trained. I lack true understanding, consciousness, and self-awareness. The idea of fully replicating human intelligence is a complex and ongoing challenge in the field of artificial intelligence, and it involves addressing various aspects of cognition, emotion, and consciousness that current AI models do not possess.

    It’s crucial to manage expectations and recognize the current limitations of AI technologies while also appreciating the progress being made in the field. If you have more questions or if there’s anything specific you’d like to discuss, feel free to let me know!”

  56. I asked ChatGPT to do the quiz questions, on twentieth-century Australian history. The questions were either too superficial for university students, or too complicated to be answered correctly by a quiz question.

  57. To make it relevant to my role (student voice) I asked Chat GPT to brainstorm ideas to help close the feedback loop, and all the ideas it came up with initially were very basic, activities that I’ve been taking part in since much earlier in my career. However when I specified that I work with distance learning students, it suggested gamified content, and when I asked for more details it made some very interesting suggestions. Not all of them are practical (“Build interactive simulations or scenarios that allow students to see the consequences of different feedback scenarios”) and one of them was completely awful (“Organize a virtual scavenger hunt where students search for information related to feedback outcomes across different digital platforms” – making them search is the last thing we want!) but there were a couple of ideas I’ll definitely explore.

    I think it will be useful to help speedily brainstorm ideas when I’m in a hurry, or to fill in gaps after I’ve already tried brainstorming myself.

  58. I found the response really good. Bing Chat suggested titles for case studies for discussion within my lesson plan, so I asked it to write 4 case studies to add to the lesson plan and everything flowed well. There were a few adjustments, but overall it could save such a lot of time. I am concerned about the environmental impact of using this type of technology though. Hopefully we will discuss that later.

  59. I asked it to create an MCQ quiz that I wanted to use to test staff on digital accessibility. It was OK, but I think I needed to give it more specific information about exactly which topics I wanted the quiz to include. I also had to then ask it to show me the correct answers.

  60. I tried all the tasks and was impressed with the responses.
    How did you get on with the task? – did you find it useful? Yes, very impressive.
    Did you notice any biases in its responses? I wanted Canadian content and it was not able to generate that for me.
    Could ChatGPT enable you to stay focused? Not really. It enables me to jump from place to place, i.e. surfing, but not really deeply focusing.
    Could you see any inaccuracies or limitations in the responses you got? Generic content; specialized and country-specific it is not.
    Were there any surprises? A pretty good mini-case study with numbers and solutions.
    Did it give you any other ideas about how you could use ChatGPT in your teaching? If so, how else could you use it? I thought if it could do this for me, imagine what my students could do!!!
