At Saint Mary's University, we encourage faculty, students, and staff to engage with generative AI tools responsibly and productively in teaching, learning, and research. As AI continues to evolve, institutional guidelines will support the SMU community in using AI ethically and effectively.

If you are using generative AI as part of your academic work, there are several key practical and ethical considerations you must follow. These guidelines will be routinely updated to reflect the latest developments in AI technologies.

SMU faculty and staff are invited to join the SMU Faculty & Staff PLC 2025 | General | Microsoft Teams, where you can find the latest generative AI news and discussions.

To explore each section of this guide, use the quick links below:

Introduction to Generative AI

Overview of Generative AI

Artificial Intelligence (AI) refers to computer systems that analyze data, recognize patterns, and generate outputs such as predictions, content, or recommendations. A subset of AI, Generative AI (GenAI), can create text, code, images, and videos, significantly impacting teaching, learning, and assessment.

In higher education around the world, AI is being used in various ways, including:

  • Academic advising through chatbots and AI-powered platforms that provide personalized guidance
  • Course design and instruction where AI-driven tools help educators create customized learning experiences
  • Assessment and feedback by automating grading, detecting plagiarism, and offering real-time insights


As AI continues to evolve, Saint Mary’s University will remain proactive in balancing innovation with ethical considerations. By implementing thoughtful policies, promoting AI literacy, and ensuring equitable access, we will strive to use AI to improve education while upholding academic standards.


The Regional AI Working Group first met in December 2024 and aims to support collaboration among Maritime universities to learn, share, and leverage best practices for using AI in university education. This includes teaching, learning, and preparing graduates for the increasing role of AI in the workforce.

The group consists of representatives from all public universities in the Maritime provinces and is coordinated by the MPHEC, which handles meeting coordination and note-taking. Regular reports are presented to the MPHEC Board through the CEO's report.

The group has been discussing AI principles published by organizations in various countries (Canada, Australia, the UK, the US) and identified common themes. The importance of supporting students' learning and the necessity of AI literacy were among the key themes identified.

Julian L’Enfant from The Studio for Teaching and Learning participated in a subcommittee that drafted a set of guiding principles, which are currently being finalized by the Working Group.

The international policies discussed include those from the Russell Group (UK), Australia's Group of Eight universities, McMaster University, UNB, the Generative AI in Higher Education Whitepaper – Current practices and ways forward (Liu and Bates, 2025), and others.

To keep up to date on the latest AI news, join the SMU Faculty & Staff PLC 2025 | GAI DISCUSSIONS – NEWS | Microsoft Teams.


Responsible Use of AI at SMU


📄 Download the Draft SMU AI Guidelines (PDF)

To provide feedback on these draft guidelines, please email Academic.Regulations@smu.ca.

SMU has established a set of guidelines to support responsible AI use. These guidelines aim to help students and faculty integrate AI ethically, securely, and transparently while maintaining academic integrity.

Key Considerations:

  • AI tools should be used to enhance rather than replace critical thinking and learning.
  • Faculty members determine AI use policies for their courses and must communicate them clearly in course outlines.
  • Students must acknowledge and cite AI-generated content appropriately.
  • Any misuse of AI in assessments or coursework will be subject to SMU’s academic integrity policies.

The Joint Subcommittee of Academic Integrity and Learning and Teaching has created draft AI guidelines for the SMU community. The committee is currently gathering feedback on these guidelines and hopes to make them available soon for further comment.


AI tools offer valuable opportunities for learning and efficiency, but they also come with ethical responsibilities.

When using AI, consider the following principles:

  • Transparency – Clearly disclose when and how AI was used in academic work.
  • Integrity – Do not use AI to fabricate data, plagiarize content, or bypass critical thinking.
  • Fairness – Be aware of bias in AI outputs and verify information before relying on it.
  • Respect for Others – Avoid using AI tools in ways that infringe on privacy, intellectual property, or consent.

Learn more: Ethics of AI Tools (resource coming soon)



When using AI tools, do not input:

  • Personal data (e.g., names, addresses, biometric or medical information).
  • Unpublished research or confidential material.
  • Commercially sensitive data or intellectual property.
  • Culturally significant Indigenous knowledge or restricted materials.

Data Protection Best Practices:

  • Opt out of data collection when possible (e.g., turning off chat history in ChatGPT).
  • Use university-approved AI tools with enterprise security protections.

Patrick Power Library – Privacy & Data Security Guidelines


Proper citation and attribution of AI-generated content is essential.

  • Course-specific AI use policies: Faculty will outline expectations for AI use in coursework.
  • Referencing AI in research: Follow publisher and university guidelines for acknowledging AI-assisted writing or analysis.

Patrick Power Library – Citing AI Resources


AI-generated content is not always reliable and may produce inaccurate, biased, or misleading information. Before using AI-generated material:

  • Verify sources: AI tools may provide hallucinated citations or outdated data.
  • Cross-check with academic resources: Always consult peer-reviewed research, official data sources, and faculty guidance.
  • Recognize bias: AI models reflect the biases in their training data; critically evaluate responses.

Patrick Power Library Content


  • SMU discourages reliance on AI detection tools due to privacy concerns, bias, and inconsistent accuracy.
  • Faculty are encouraged to design AI-resilient assessments such as:
    • In-class exams & discussions
    • Process-based assignments with drafts & reflections
    • Authentic assessments requiring human reasoning

HESA Recommendations

This external resource is not affiliated with SMU policy. It is provided for reference and broader context.


SMU supports the responsible and ethical use of AI as a tool for enhancing learning, creativity, and efficiency. However, it is each student’s responsibility to use AI transparently, ethically, and in line with academic policies.

Faculty Support: SMU provides training and resources to help navigate AI integration into courses. Contact: Julian L’Enfant, Educational Development & Technologies.

Student Support: Workshops and guides are available to ensure AI is used responsibly in assessments.

Visit SMU’s AI Learning Hub – Coming Soon


Teaching with Generative AI

AI enhances learning and institutional efficiency by providing personalized learning experiences, automating administrative tasks, and transforming assessment methods.

It has the potential to adapt coursework to meet individual student needs, boosting engagement and outcomes (Liu & Bates, 2025). AI can also streamline administrative functions such as grading and scheduling, freeing educators to focus on teaching (Pratschke, 2024).

Moreover, AI supports competency-based assessments that measure deeper learning and move beyond traditional exams (TEQSA, 2024).

This external resource is not affiliated with SMU policy. It is provided for reference and broader context.


Generative AI tools—like ChatGPT, Microsoft Copilot, Gemini, or Claude—can help make your work more efficient and creative. Here are some practical ways faculty and staff can benefit:

For Faculty:

  • Save time on planning: AI can help draft lesson plans, create quizzes, or suggest learning activities.
  • Give better feedback: AI tools can help you provide personalized comments on student work.
  • Design better courses: Use AI to brainstorm new teaching ideas, examples, or ways to explain difficult topics.
  • Support diverse learners: AI can help create accessible materials for students with different needs or learning styles.

For Staff:

  • Speed up admin tasks: Draft emails, reports, or meeting summaries quickly.
  • Help with data: Use AI to analyze data for student support, scheduling, or operations.
  • Improve communication: Create clearer messages and FAQs for students or staff.
  • Brainstorm solutions: Use AI as a creative partner to solve everyday challenges.

Why Use It?

  • It saves time and reduces repetitive work.
  • It opens up space for more creative and meaningful tasks.
  • It can improve the quality of teaching and support services.

Getting started can be as simple as trying out a tool or exploring how it could support your daily work.

Remember: Always use generative AI tools ethically and responsibly, and in line with SMU’s guidelines on academic integrity and data privacy.


Universities are implementing various strategies to integrate AI while upholding academic integrity.

Many are launching AI literacy programs to educate students and faculty on ethical and effective AI usage (Russell Group, 2024). Institutions are also developing clear policies on AI use in coursework and research to establish guidelines and best practices (U15 Canada, 2024).

Furthermore, universities are rethinking traditional assessments by introducing innovative evaluation methods that emphasize critical thinking and problem-solving over rote memorization (TEQSA, 2024).

This external resource is not affiliated with SMU policy. It is provided for reference and broader context.


The following draft guidelines were created by the Joint Subcommittee of Academic Integrity, which consists of student representatives, faculty, and staff. They are meant to help faculty make informed decisions about the permitted use of generative artificial intelligence in their courses and to communicate their policies to students.

These guidelines are a living document. Feedback is welcomed and encouraged.

SMU Faculty Guidelines on the Use of Generative AI in Courses (Draft)

To provide feedback on these draft guidelines, please email Academic.Regulations@smu.ca.


Use of generative AI in your teaching and assessment design should be ethical, intentional, and support student learning while maintaining academic integrity. Below are some key questions to consider:

  1. Are your AI usage policies clear and aligned with institutional guidelines?
    • Have you communicated specific rules on AI use for your course, unit, or assessments?
    • Do students understand the permitted and prohibited ways they can engage with AI?
  2. How should students document their use of generative AI?
    • Are you encouraging students to track and reflect on how they use AI in their learning process?
    • Should students include an AI usage statement in their assignments?
  3. How can you ensure that student work remains authentic and not overly reliant on AI?
    • Have you designed assignments that require original thought, critical analysis, and process-based engagement?
    • Are there clear expectations on how AI-generated content should be used, cited, or modified?
  4. How can you help students develop critical thinking when working with AI-generated content?
    • Are students being taught to evaluate AI outputs using disciplinary knowledge and academic rigor?
    • Do students understand AI’s limitations and the risks of misinformation?
  5. How can you mitigate biases in generative AI tools?
    • Are students aware of how AI models can reinforce societal biases in their responses?
    • Have you provided strategies for cross-checking AI-generated information with reliable academic sources?

This resource was created by Lance Eaton to share the range of syllabus policies other educators have developed for generative AI tools (such as ChatGPT, Midjourney, DALL-E), helping instructors craft their own.

View Document: Syllabi Policies for AI Generative Tools

This external resource is not affiliated with SMU policy. It is provided for reference and broader context.


Despite its benefits, AI in education presents several challenges. Bias and fairness concerns arise as AI models may reflect societal biases from their training data, potentially affecting equitable outcomes (Russell Group, 2024).

Academic integrity is another issue, with AI-generated content raising concerns about plagiarism and the authenticity of student work (U15 Canada, 2024).

While AI has been described as a way of democratizing education, the digital divide remains a significant challenge, as unequal access to AI tools may widen gaps in learning opportunities (Liu & Bates, 2025).

Privacy and ethical concerns also persist, requiring institutions to ensure responsible data use and compliance with ethical guidelines (Russell Group, 2024).

This external resource is not affiliated with SMU policy. It is provided for reference and broader context.


Why We Don’t Recommend AI Detectors for Student Work

AI detection tools may seem like a simple way to check if a student used generative AI—but they don’t actually work very well. Here’s what you need to know:

  1. They’re unreliable. AI detectors frequently make mistakes in both directions. They may flag original student work as AI-generated, and they can fail to catch work that was heavily AI-assisted. Even small rewordings or paraphrasing can fool these tools.
  2. They’re biased. Studies have shown that AI detectors are more likely to falsely accuse students who are non-native English speakers, have disabilities, or come from under-resourced backgrounds. This raises serious equity and inclusion concerns.
  3. They damage trust. Using AI detectors creates an environment of suspicion. Instead of fostering open conversations about learning and digital literacy, they can lead to punitive responses based on flawed technology.
  4. There are better approaches. Research suggests moving away from policing and toward thoughtful assessment design, open dialogue, and teaching students how to use AI responsibly and ethically.

Bottom line: We advise faculty not to rely on AI detection software. Instead, let’s focus on meaningful learning, transparent communication, and inclusive academic integrity practices.

A Note on Privacy and AI Detectors

Faculty are strongly discouraged from uploading student work to AI detection tools. Here's why:

  • Privacy Risks: Uploading student assignments to third-party AI detectors may violate privacy policies or laws, especially if the platforms collect, store, or reuse the data for training AI models. This could breach your institution’s academic and data governance rules.
  • Consent Concerns: Students are often unaware their work is being used in this way. Using their intellectual property without consent can undermine trust and contravene academic ethics.
  • Legal Uncertainty: Many AI detection tools operate in legal grey areas when it comes to how they handle uploaded content. Faculty are advised to consult their university’s Privacy Office or legal counsel before using these tools.
  • Better Alternatives Exist: If academic misconduct is suspected, a more educational and fair approach involves open conversations with students, transparent course policies, and thoughtfully designed assessments.

Remember: AI detection tools are not only unreliable—they can also raise serious ethical and legal issues when used carelessly.


Student Guide to Generative AI

Generative AI tools—like ChatGPT, Copilot, or Claude—can help you learn more efficiently, stay organized, and express your ideas more clearly. Here are some practical ways students can benefit:

For Learning:

  • Understand difficult topics: Ask AI to explain concepts in simple terms or give examples.
  • Get study help: Use AI to summarize readings, quiz you on key points, or create flashcards.
  • Improve writing: Get help brainstorming ideas, checking grammar, or refining your drafts.

For Organizing Your Work:

  • Plan assignments: Break big tasks into steps and create timelines.
  • Draft emails or resumes: Write more clearly and professionally with AI suggestions.
  • Take better notes: Use AI to summarize class materials or meeting notes.

For Creative Projects:

  • Explore new ideas: Brainstorm stories, presentations, or project themes.
  • Get feedback: Use AI as a sounding board to improve your work or spot errors.

Why Use It?

  • It helps you study smarter, not harder.
  • It can boost your confidence in writing and communication.
  • It’s like having a digital assistant that’s available 24/7.

Please remember: Always use generative AI tools ethically and responsibly, and follow SMU’s guidelines on academic integrity and fair use. If you're unsure, ask your instructor or refer to the course outline.


This flowchart provides guidance for students at SMU on how to properly acknowledge and cite the use of generative AI in coursework, based on institutional expectations.

Usage Instructions:

  1. Begin at the top of the chart and follow the arrows according to the relevant use case.

Key Decision Points Include:

  • Whether generative AI use is permitted in the course (as outlined in course materials).
  • The type of AI tools being used (standalone vs. embedded).
  • The nature of the AI use (e.g., content generation, editing, or support).

Additional Considerations:

  • Course-specific policies may vary; always refer to the course outline.
  • When unsure, consult with the instructor or default to acknowledgment and citation.
  • Citation formats and additional resources are available in the Patrick Power Library’s AI Guide.

General Principles:

  • AI-generated content used directly in assignments generally requires citation.
  • Use of AI for learning or editing support may also require acknowledgment.
  • Embedded tools (e.g., grammar checkers) may fall under different citation policies.
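Purely as an illustration, the decision points and general principles above can be sketched as a small decision function. The outcomes and category names below are assumptions for illustration only; they are not official SMU policy, and the course outline and the Patrick Power Library's AI Guide remain the authoritative sources.

```python
def citation_guidance(ai_permitted: bool, tool_type: str, use_type: str) -> str:
    """Illustrative sketch of the flowchart's decision points (not official policy).

    tool_type: "standalone" (e.g., a chatbot) or "embedded" (e.g., a grammar checker)
    use_type:  "generation", "editing", or "support"
    """
    # First decision point: is generative AI permitted in this course at all?
    if not ai_permitted:
        return "Do not use AI; check the course outline or ask the instructor."
    # Embedded tools (e.g., grammar checkers) may fall under different policies.
    if tool_type == "embedded":
        return "May fall under a different citation policy; acknowledgment is the safe default."
    # AI-generated content used directly in assignments generally requires citation.
    if use_type == "generation":
        return "Cite the AI-generated content in the assignment."
    # Editing or learning support with a standalone tool still warrants acknowledgment.
    return "Acknowledge the AI use; when unsure, default to acknowledgment and citation."
```

As with the flowchart itself, the safe default when a case is ambiguous is to acknowledge and cite.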

These guidelines aim to promote transparency and uphold academic integrity in the use of AI technologies in academic work.

Visual Flowchart:

Acknowledging the use of AI flowchart

(Created with LucidChart)


Need Help or Have Questions?

We’re here to support you! Here’s who to contact: 

  • For questions about teaching, learning, or assessment (including classroom-based courses): 

Contact: Julian L’Enfant, Educational Development & Technologies

  • For questions related to online courses (design, delivery, or support): 

Contact: Jen Tupper, Manager, Learning and Development

  • For student support: 

Contact: Amanda Saoud, Educational Developer, Academic and Digital Communication

Not sure who to ask? Just email studio@smu.ca—we’ll make sure you get connected to the right person!