Principles for GenAI Use in First-Year Composition (FYC) at the University of Central Florida (UCF) were created January-December 2025 by the Composition Committee, a group of twenty teacher-scholars in the Department of Writing and Rhetoric (DWR).
These principles outline how the First-Year Composition Program at the University of Central Florida approaches GenAI as part of writing instruction. The document supports instructors and students in engaging GenAI as a site for critical inquiry and informed practice, and it offers external audiences a research-based account of our program's commitments regarding writing, language, literacy, and rhetoric. Recognizing the range of perspectives on GenAI, this document articulates how GenAI can be approached to support student learning and align with core values and beliefs in First-Year Composition at UCF.
AI Policy Statement [Recommended]
In this First-Year Composition course, writing is treated as a human, social, and reflective process. You will develop ideas through drafting, feedback, revision, and reflection. Your writing should represent your thinking, experiences, and voice.
GenAI tools (such as ChatGPT) may be used in limited and purposeful ways when explicitly allowed for a specific writing activity or assignment. These tools cannot replace the work of thinking, drafting, revising, or reflecting, and they should not be used to produce final drafts or complete writing tasks for you. Because GenAI often reflects standardized language and dominant norms, any use should include critically examining and acknowledging its limits, biases, and impacts on language and learning. When GenAI is permitted, you may be required to write an "AI Use Statement" that discloses how and why you used the tool. The AI Use Statement will explain the affordances and limitations of using AI to assist you with writing and learning.
This AI Policy Statement is designed to center your voice as a writer. Your thinking, agency, literacies, and experiences are assets that shouldn’t be outsourced. If you are ever unsure about whether or how GenAI can be used for a specific activity or assignment, ask before using it. The “Principles for GenAI Use in First-Year Composition at UCF” document is an additional resource to help navigate conversations around GenAI and writing.
Definition of Generative Artificial Intelligence (GenAI) and Large Language Models (LLMs)
Generative artificial intelligence (GenAI) refers to AI systems designed to generate content such as text, images, audio, video, or code. These systems are trained on large datasets and produce outputs by identifying and predicting patterns in existing information rather than creating knowledge or meaning. GenAI requires human prompting and guidance; it responds by predicting likely sequences of words, images, or symbols based on statistical probability.
Large language models (LLMs), such as ChatGPT, Copilot, and Claude, are a subset of GenAI that generate text by predicting word and character sequences. LLMs are not search engines, and what they generate is not neutral. While these technologies may appear conversational or creative, they do not have lived experience, intention, critical judgment, or emotion. Their outputs reflect the data they are trained on and the choices made by human users, as well as the values, assumptions, power relations, and linguistic hierarchies embedded in their training data and design. As a result, the quality, direction, and limitations of GenAI-generated content are shaped by human input and the sociocultural contexts embedded in the systems themselves.
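To make "predicting likely sequences of words based on statistical probability" concrete, the following minimal sketch generates text from a hand-written table of next-word probabilities. It is a toy illustration only; real LLMs learn probabilities over vast datasets with neural networks rather than a small lookup table, and the words and probabilities below are invented for the example.

```python
import random

# Hypothetical next-word probabilities, standing in for patterns an LLM
# would learn from its training data.
next_word_probs = {
    "writing": {"is": 0.6, "helps": 0.3, "matters": 0.1},
    "is": {"a": 0.5, "social": 0.3, "recursive": 0.2},
    "a": {"process": 0.7, "practice": 0.3},
    "social": {"activity": 0.8, "process": 0.2},
}

def generate(start_word: str, length: int = 4, seed: int = 0) -> str:
    """Build text by repeatedly sampling a statistically likely next word."""
    random.seed(seed)
    words = [start_word]
    for _ in range(length):
        options = next_word_probs.get(words[-1])
        if not options:  # no learned continuation, so stop
            break
        choices, weights = zip(*options.items())
        words.append(random.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate("writing"))  # e.g., "writing is a process"
```

As the sketch suggests, the system has no sense of audience, purpose, or truth; it only continues a pattern, which is why the outputs carry whatever values and biases the underlying data contain.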
1. First-Year Composition Instructors
1.1. Emphasize writing as a human-centered, socially and culturally situated activity.
Writing is something people do with and for other people. It occurs in specific social, cultural, institutional, and disciplinary contexts, and always anticipates a reader. Writers draw on prior knowledge, experiences, literacies, and community practices to make rhetorical choices about language, purpose, and audience. Writing involves planning, reflection, drafting, feedback, and revision. GenAI may assist parts of this process, but it cannot replace the relational, reflective, and meaning-making aspects of writing. Instructors should guide students to notice how their own social and cultural identities, experiences, and communities shape what they write and how they write it. Encouraging revision and reflection helps students integrate authentic voice and perspective into their work. AI cannot replicate the nuance, memory, or personal insight writers bring to their writing.
1.2. Critically examine the affordances and limitations of GenAI.
Writing is a technology, a transformative tool that shapes how knowledge is produced and understood. Instructors should help students evaluate what GenAI can and cannot do while reinforcing core disciplinary values: writing as process, reflection, and revision; language variation as the norm; and students’ right to their own linguistic and rhetorical choices. Instructors can draw on research in writing studies, multimodality, and digital literacies to help students understand how tools shape writing, meaning, and learning. Faculty should scaffold exercises that show how AI can support certain writing tasks and activities but may obscure nuance, context, or critical reasoning. Discussing AI’s limitations in detecting audience, social and cultural context, or purpose helps students use tools thoughtfully rather than as a shortcut to a complex, recursive process.
1.3. Emphasize writing as a recursive process and inquiry-based practice.
Writing is a practice of exploration, meaning-making, reflection, and revision. Instructors should guide students to see writing as a space to discover ideas, refine questions, and revise understanding. AI can interrupt these recursive processes by emphasizing efficiency and final text production. Helping students critique AI in this context supports engagement with writing as inquiry. Faculty should model reflective practices such as iterative drafting, peer feedback, and revision. Emphasizing process over product and inquiry over speed helps students resist the temptation to rely on AI. Connecting recursion to critical thinking reinforces writing as a learning process rather than a product.
1.4. Emphasize writing as contextual, flexible, and rhetorical.
Writing shifts across audiences, purposes, genres, cultures, and power structures. Instructors should guide students in making deliberate rhetorical choices that fit each context and their purposes for writing. AI often produces standardized outputs that may not match specific rhetorical situations, so teaching universal rules or fixed formulas risks flattening rhetorical and linguistic dexterity and students' voices. Writing instruction should focus on helping students adapt and transfer knowledge while considering purpose, audience, context, and genre. Building students' metacognitive awareness requires that they make deliberate decisions and view writing as contextual and dynamic, not algorithmic. Discussing writing variation across contexts helps students understand writing as adaptive instead of formulaic. Encouraging context-specific choices develops rhetorical awareness and agency.
1.5. Recognize GenAI concerns and frame GenAI as ideological rather than neutral.
AI reflects the assumptions, values, and biases of its creators and training data. Treating it as neutral hides how it privileges certain languages, genres, and perspectives while marginalizing others. Instructors can use AI as a site for critical inquiry, asking whose voices are amplified or erased and how power shapes meaning. This also includes ethical issues such as privacy, labor, environmental impact, and accuracy. Addressing these concerns explicitly in the classroom promotes critical engagement and digital literacies. Faculty should discuss real-world consequences of uncritical AI use, encouraging students to interrogate power, equity, and access in technology-mediated writing. Exploring bias and ideology in AI outputs helps students make informed, ethical decisions in their own writing.
1.6. Reaffirm language variation as the norm and critique GenAI linguistic bias.
Large language models are often trained on dominant language practices, reinforcing standardized English and marginalizing other dialects, languages, and rhetorics. Instructors should help students critically examine how AI reflects, reproduces, or distorts linguistic diversity, and challenge hierarchical assumptions about language. This practice supports equity and linguistic justice in the classroom. Centering students’ rich cultural knowledge and diverse linguistic practices reaffirms their agency as writers. Faculty can create assignments that encourage exploration of multilingual and multimodal practices, emphasizing that linguistic pluralism and multiliteracies are valued in the writing classroom. Discussing bias in AI outputs fosters critical awareness of language, power, and inclusion. Engaging students in analysis of AI linguistic norms helps them critique linguistic hegemony and increases critical language awareness.
1.7. Establish clear course and assignment-level GenAI policies.
Clear, transparent policies support student learning and clarify expectations for when and how AI can be used. Course-level guidance creates shared understanding, while assignment-level directions explain what is permitted in specific tasks. Policies also create opportunities to discuss authorship, integrity, and responsible decision-making. Faculty should explain how AI use connects to student learning outcomes, highlighting when it is a tool for invention or organization versus when human thinking is required. Rules help students understand boundaries while supporting exploration and ethical engagement. Discussing policies proactively reduces misuse or overreliance and reinforces critical thinking.
1.8. Model appropriate, responsible, and transparent GenAI use.
Instructors shape students' understanding of ethical and effective writing practices. When using AI in teaching, faculty should disclose its use and demonstrate responsible practices. AI should not be used for grading, evaluation, or providing feedback on student writing, which is relational and context-dependent. Faculty modeling includes showing how AI can assist with invention, organization, or language exploration without replacing human judgment. Additionally, modeling AI use that involves critical thinking and reflection resists product-oriented approaches to writing instruction. Explaining decisions transparently helps students see the reasoning behind tool use and reaffirms writing as a process and a meaning-making activity. Engaging in conversations around trust, reliability, and authorship models responsible use.
1.9. Respect varying student perspectives on GenAI.
Students bring different experiences, perspectives, comfort levels, and access to AI tools. Some students may choose to resist using AI due to privacy, labor, or ethical concerns. Providing GenAI outputs can be an accessible way for students to intentionally and critically engage with GenAI without requiring use. Instructors should consider alternative points of access while still meeting assignment expectations and objectives. Designing assignments that promote student agency and accessibility helps foster critical engagement. Recognizing this diversity of approaches encourages students to make intentional choices aligned with their goals and values as writers. Discussing these choices helps students articulate reasoning and consider ethical implications.
1.10. Know that feedback and assessment are human-centered practices.
Teacher response relies on understanding each individual student, including their goals, contexts, and development. AI cannot provide relational, rhetorical, or developmental judgment. In addition, AI tools do not have disciplinary expertise. Faculty feedback includes subject area knowledge that is timely, accurate, and grounded in programmatic values; this knowledge cannot be replicated by an AI tool. Faculty should emphasize the importance of human feedback in guiding reflection, revision, and growth. Instructors should model how to provide thoughtful, specific, and constructive feedback. AI cannot replace the interpretive, responsive work of readers, nor can it substitute for student agency and negotiation during revision. Writing assessment should align with pedagogical values. Reinforcing human-centered assessment underscores writing as a dialogic, reflective process.
2. First-Year Composition Students
2.1. Recognize writing as a human-centered, socially and culturally situated activity.
Writing is shaped by writers’ identities, histories, communities, and experiences. GenAI does not have lived experience, intention, originality, or cultural insight. While AI may assist with limited tasks when permitted, students are responsible for making rhetorical decisions that reflect their own voice, purpose, and goals as writers. Writing remains fundamentally human, relational, and reflective. Students should consider how their background, culture, and knowledge influence the ideas they communicate and the choices they make in composing. Reflection strengthens voice and rhetorical decision-making. AI cannot substitute for the insight gained from social and cultural knowledge and lived experiences. Students should understand that their social and cultural contexts and lived experiences are assets to the writing classroom.
2.2. Critically examine the affordances and limitations of GenAI.
AI tools can support certain parts of the writing process but cannot think critically, evaluate social and cultural context, or engage in inquiry on a student's behalf. Students should reflect on what writing is, how it works, and what AI can or cannot do. Effective use of AI means making conscious, intentional choices so that learning, not automation, remains central to the writing process. Students should experiment with AI cautiously, noting when it helps or hinders their understanding of rhetorical decisions. Critical engagement and awareness of AI limitations ensure that writers do not rely on AI as a substitute for thinking, planning, or reflecting.
2.3. Recognize that GenAI may disrupt the recursive and rhetorical nature of writing.
Writing develops through planning, drafting, reflection, feedback, and revision, shaped by audience, purpose, and context. AI may encourage shortcuts or formulaic, product-based approaches that bypass innovation, inquiry, discovery, and revision. Students should understand how writing works in specific rhetorical situations and evaluate AI outputs carefully, preserving reflection, inquiry, and revision as central practices to writing development. Students should reflect on AI outputs and check whether those outputs support their purpose and aims as writers. Reflection and revision are central to critical AI engagement. Practicing iteration and reflection reinforces writing as a process of discovery and learning. Inquiry should guide AI use, not the other way around.
2.4. Understand that writing is contextual, flexible, and rhetorical.
Writing changes depending on audience, purpose, genre, culture, and power structures. AI-generated outputs may be standardized and not fit the specific context. Students should make intentional choices, adjusting language, organization, and style to meet the needs of readers and rhetorical goals. Rigid formulas can limit creativity, voice, and the effectiveness of writing. Likewise, deficit-based orientations to writing and language should be avoided and critiqued. Students should reflect on how context shapes meaning, experimenting with tone, structure, and style to achieve their purposes. Awareness of context supports flexible, thoughtful, and effective communication. AI cannot replace the judgment required to adapt writing to context nor can it be a substitute for student agency.
2.5. Recognize GenAI concerns and understand that GenAI outputs are not neutral.
AI reflects the assumptions, values, and biases embedded in its design and training. Treating AI as neutral hides how some voices, languages, and perspectives are amplified while others are marginalized. Students should critically examine AI outputs, reflecting on ethical, social, and rhetorical consequences, including accuracy, bias, and inclusion. This also includes ethical issues such as privacy, labor, and environmental impact. Students should also question how AI outputs represent or distort cultural, social, or linguistic norms. Awareness of bias empowers them to choose what to adopt, revise, or reject, reinforcing ethical and critical writing practices.
2.6. Understand language variation as the norm and critique GenAI linguistic bias.
AI-generated content may seem well-crafted but can reproduce dominant language norms, marginalizing dialects, languages, rhetorics, or rhetorical traditions. Students should evaluate AI outputs critically, considering accuracy, fairness, and cultural perspective. Students should experiment with multiple linguistic or rhetorical choices, recognizing the value of diverse expressions and multiple ways of writing. Awareness of AI bias supports critical thinking about voice, audience, and context. Evaluating AI outputs for fairness and inclusion reinforces responsible and reflective writing practices.
2.7. Follow course and assignment-level guidelines on GenAI use.
Acceptable AI use depends on instructor expectations, course goals, and assignment requirements. Students should reflect on when and why to use AI, prioritizing learning over efficiency. Students should apply AI tools only in ways that align with assignment instructions, learning goals, and ethical standards. Understanding context-specific expectations supports responsible, intentional use. Following guidelines helps students maintain agency while engaging with AI critically.
2.8. Engage in appropriate, responsible, and transparent GenAI use.
AI should not replace critical thinking, collaboration, or human reflection. Students should disclose AI use when allowed and avoid sharing others' writing without consent. Students should use AI as a tool to support, not replace, thinking, planning, drafting, and revision. Reflecting on AI use reinforces learning and ethical decision-making. Transparency and responsible engagement maintain ownership and integrity in student writing. Disclosing AI use is essential to establishing trust with instructors and peers.
2.9. Know that writers have agency even when GenAI use is suggested.
Students should think critically about AI and recognize issues related to privacy, ethics, or access. Since writing classes encourage student agency and choice, students should raise any concerns with an instructor directly regarding suggested AI use on a specific assignment or activity. Instructors may provide an alternative means of critical inquiry and engagement, such as sharing GenAI outputs, without requiring AI use. Exploring opportunities for meaningful analysis of technologies increases digital literacies and knowledge about writing and rhetoric across contexts. Respecting different perspectives and comfort levels on AI supports engagement and equity. Discussing reasoning with peers and instructors fosters critical reflection and ethical decision-making.
2.10. Know that feedback and assessment are human-centered practices.
Effective response depends on understanding context, purpose, and writer development. AI cannot provide relational or rhetorical judgment. In addition, AI tools do not have disciplinary expertise or deep knowledge of a particular class or curriculum. Faculty feedback includes subject area knowledge that is timely, accurate, and grounded in programmatic values, and peer feedback includes knowledge of instructor preferences, class discussions, and other socially constructed knowledge that cannot be replicated by an AI tool. Students should rely on instructors, peers, and other human readers for reflection, guidance, and revision. Engaging with human feedback helps students develop voice, agency, and critical evaluation skills. Writing is a process that requires feedback and revision. Writers should have agency to negotiate feedback and make decisions that support their aims for writing, while being aware of purpose, audience, genre, and context. Recognizing the human role in feedback reinforces the iterative and relational nature of writing.