Guidance for AI Best Practices in FWS

AI-Assisted & Generated Writing in FWS Classrooms

Artificial Intelligence (AI) programs like ChatGPT are an emerging technology with the ability to generate a variety of texts, and such technologies are constantly evolving in their capabilities and products. Although their iterative nature makes it difficult to state definitively what AI can or cannot do at any given moment, we recognize that many people are already using them to assist the writing process and/or fully generate written texts. As such, we in the Knight Institute for Writing in the Disciplines want to offer some advice to help FWS teachers better understand and address the possible role(s) of AI in the classroom.

While this new technology understandably elicits a range of reactions, from excitement to confusion to fear, as teachers of writing we see AI as a resource more akin to widely accepted ones like spell- and grammar-check, predictive text features, translation apps, and citation and summary generators than as a complete break or a fundamentally new way for students to produce written work. The reassuring truth is that many pedagogical strategies you are likely already using to support active learning in your classrooms continue to be best practices for responding to AI technologies.

Here we outline some of these strategies and approaches in a Q&A format. Our aim is to show how existing pedagogies can be adapted to address emerging concerns about AI’s place in the writing classroom.

Key Questions

  • How might I broach the topic of AI tools like ChatGPT with my FWS students?
  • What is the minimal response I should have to ChatGPT as an FWS instructor?
  • Can I completely ban students from using ChatGPT or other AI-assisted writing programs in my FWS? 
  • How can I use AI programs pedagogically in my class with students? 
  • How should students cite the use of AI tools? 
  • Are there any limits to how I can use AI writing tools in the FWS classroom? 
  • What should I do if I suspect a student has turned in an entire AI-generated paper as their own work?

How might I broach the topic of AI tools like ChatGPT with my FWS students?

First, you might reflect without judgment on how FWS students could use ChatGPT or other AI tools at specific stages of the writing process in your course. That is, you might think expansively about the ways students could use AI programs to brainstorm ideas, to help them locate sources, to develop outlines, to check grammar and assist with editing, or to generate writing for them. Students, and indeed most writers in and out of the academy, are already using various AI tools to write. You might consider, perhaps with students, which AI tools are most and least useful, in what ways, and the extent to which they undermine or interfere with, or support and promote, course learning goals. While AI tools’ writing capabilities (strengths and weaknesses) are always changing, it is useful to talk with students (and even to practice doing so together) about the potential limits of AI writing.

Second, once you have determined exactly when and how you feel comfortable allowing students to use AI programs, we strongly recommend that you develop your own statement on the use of AI-generated writing and AI support for the writing process (examples are included below). Such statements might be included on the course syllabus in a section on Academic Integrity (as with Dr. Carrick’s example below), or they could be stand-alone statements in the syllabus or on the course Canvas site. To follow through on your statement, we recommend reminding students about your policy at least once or twice during the semester. If you do not have a statement for this semester, it’s never too late to have a conversation with students and establish boundaries and rules around the use of AI writing tools.

What is the minimal response I should have to ChatGPT as an FWS instructor?

Ultimately, how you choose to respond to generative AI is up to you. Will you disallow its use completely? Will you accept some uses and applications but not others, and if so, where will you draw the line between acceptable and unacceptable uses of AI tools? At minimum, the Knight Institute recommends prohibiting fully AI-generated writing for final drafts. Just as with a paper plagiarized from any source, submitting one fully written by AI is academically dishonest.

Can I completely ban students from using ChatGPT or other AI-assisted writing tools and resources in my FWS?

Yes. As discussed above, FWS instructors should design policies that best suit their teaching and learning goals. That said, we urge instructors to carefully consider the possible implications of a total ban on AI-assisted tools and resources, given that:

  • A complete ban on AI-assisted writing tools and resources could disproportionately affect our most vulnerable FWS students. 
  • A complete ban on AI-assisted writing tools and resources could disrupt efforts to create inclusive learning environments.
  • A complete ban on AI-assisted writing tools and resources could undermine a FWS classroom culture built on the kind of trust that fosters exploration, co-inquiry, and risk-taking. 
  • It is not possible to enforce a complete ban on AI-assisted writing tools and resources. 

For further consideration, see the KNIGHTLYnews post “Can FWS Instructors Ban ChatGPT?”

How can I use AI programs pedagogically in my class with students?

In addition to the suggestions below, the Knight Institute offers guidance on using ChatGPT in classroom activities throughout the research process, and the Cornell University report on AI has a specific section on writing classrooms that ends with a list of particularly useful activities. What you may notice about all of these suggestions is that students are typically asked to try doing something on their own first (e.g., locating sources, building an outline, writing a paragraph) and then to bring in generative AI to reflect on differences, get feedback, or discover new perspectives and insights.

To start the conversation 

  • Have an honest conversation with students about AI to better understand their attitudes towards the technology, rather than starting from the assumption that they are pro-AI. How might advances in AI technology affect their career prospects, either positively or negatively?

As part of a research project 

  • Using AI at earlier stages for brainstorming, such as asking it to aggregate background information or “introduce” a student to a topic as an entry point to research.
  • Asking AI for search terms for a research project after the student has spent some time searching on their own. 

During the revision process for an early essay

  • Having students read and critique an AI-generated version of an essay, maybe alongside examples of actual student writing on the same topic. What does the AI-generated text do well and what does it fail to do well, or at all? Can students tell the difference between the version written by a human and the one written by a machine, and if so, how? 
  • Having students ask AI for feedback on their own drafts, then write a reflection on the feedback they received, whether or not it was useful in their revision process, and if so, how.

How should students cite the use of AI tools?

If you are allowing and/or encouraging students to use generative AI tools in their writing process, we strongly suggest discussing how to cite them, too. MLA and APA offer basic guidelines for citing AI. Anuj Gupta, a mixed methods researcher at the University of Arizona, suggests writing a reflective paragraph on exactly how the writer used AI. We have also seen suggestions that students include the exact prompt they used with AI, or download a transcript of the conversation and include it as an appendix.

Are there any limits to how I can use AI writing tools in the FWS classroom?

Yes! Cornell has laid out the following rules: 

AI tools should not be used to assess student writing. 

Never upload student writing, Cornell information, or other private information into an AI tool. In a Sept. 27, 2023 email, Cornell Vice President and Chief Global Information Officer Curtis L. Cole offered the following statement about confidentiality and privacy: “If you are using public generative AI tools, you cannot enter any Cornell information, or another person’s information, that is confidential, proprietary, subject to federal or state regulations or otherwise considered sensitive or restricted. Any information you provide to public generative AI tools is considered public and may be stored and used by anyone else.”

Uploading copyrighted materials (including any published texts behind a paywall) is questionable at best. While the statement above does not explicitly prohibit uploading copyrighted materials, other copyright guidance (e.g., limits on how much of a book you can post to Canvas for educational purposes) suggests it is safer not to input an entire scholarly article or published essay into any generative AI tool.

What should I do if I suspect a student has turned in an entire AI-generated paper as their own work?

You should never upload student work into any AI tool, including AI-detection applications. Despite their promises, these detectors are not trustworthy: they are often unreliable, and research shows that they disproportionately flag writers from certain disciplinary, cultural, and language backgrounds. Furthermore, as stated above, uploading student work violates student privacy.

MULTIPLE STUDENTS: If you notice that several (or all) of your students are using the exact same phrase or producing writing that seems strikingly similar across the group, it’s useful to acknowledge and discuss this observation with everyone in the class. For example, you might copy all instances of the statement or phrase from students’ essays (removing student names) and ask the class to reflect on what they see. This observational approach avoids calling out individual students while creating a learning experience and a conversation around the potential use of AI.

You might also reflect on whether your assignment prompt itself made it easy to generate a response with AI, or whether students were struggling to work with a particularly difficult text (and were thus tempted to turn to AI for help).

INDIVIDUAL STUDENT: If you suspect an individual student has turned in AI-generated writing for a final paper and is claiming it as their own, the process is the same as it is for any plagiarism case. We recommend you do the following:

  • Invite the student to a conference and ask about their writing and writing process. That is, you might ask them: Can you describe your writing process for this essay? What steps did you take to get started and develop this draft? Can you talk me through your main ideas and organizational structure? What evidence helped you get to these ideas?
  • Ask the student directly if they have used AI to generate their paper and to what extent. 
  • Talk to the student about your policy around academic integrity and what your expectations are for writing in the course.
  • Create an appropriate response given the conversation, which might involve asking the student to rewrite the paper and turn in a reflection about their process; failing the student on this particular assignment; or learning about the academic integrity hearing process and requiring the student to go through that more formal conversation. As you make these choices, remember that the goal is for students to learn from their decision and move forward in your class.
  • Read more about the process of requiring a student to go through an academic integrity hearing here. While the explanation of an academic integrity hearing is extensive and may feel overwhelming, we believe these hearings can function pedagogically. That is, an academic integrity hearing can be an opportunity for a serious conversation with a student about their writing process and approach to assignments, one that can yield fruitful growth and engagement.

Resources

Academic Integrity (from Tracy Carrick’s FWS syllabus)

All the work you submit in this course must be written for this course and not another and must originate with you in form and content with all contributory sources fully and specifically acknowledged. Make yourself familiar with Cornell’s Code of Academic Integrity. In some courses, you are not permitted to work with others to complete assignments. In this course, however, collaboration is essential. Here, collaborative work of the following kinds is not just authorized, but encouraged and sometimes required: peer review, shared note taking, and co-writing in pairs or small groups of students.

Special note: If you submit assignments that include AI-generated text, you may be violating Cornell’s Academic Integrity Code. You may also be undermining the most fundamental learning goal of this course: to use writing to clarify and deepen the ways you think and make meaning. AI text generators, like ChatGPT, Magic Write, QuillBot, and WordTune, cannot think, and the ways that they make meaning are risky. This technology captures and integrates material from the internet that may or may not be accurate, logically connected, or interesting, and then inserts this selected material into basic templates that are unlikely to reflect the nuance of your voice, values, and ideas. Or those of your readers. That said, as a learning community of writers, we should be rightly curious, and so we will consider the impact of this emerging technology on how we communicate and its ethical and practical uses by trying it out, together and in class.

Academic Integrity & AI (from Kate Navickas’s syllabus)

All the work you submit in this course must be written for this course and not another and must originate with you in form and content with all contributory sources fully and specifically acknowledged. Become familiar with Cornell’s Code of Academic Integrity.

In this class, you will learn how to ethically use A.I. programs to help you throughout the writing process. When you do use A.I. programs for support (like ChatGPT, Magic Write, QuillBot, and WordTune), you will be responsible for including a paragraph at the end of the essay that explains how you used A.I. to help you write your paper (specifically, at what stages in your writing process and in what sections of your paper). However, submitting writing assignments that have been fully generated by an A.I. program without acknowledgement violates Cornell’s Academic Integrity Code. Since writing is a form of thinking and learning, when an A.I. program generates all of the text, you lose out on the learning experience and opportunities for growth.
