
Guidelines for using AI at Marshall University

Employees, affiliates, and third-party agents of Marshall University should be mindful and use only approved tools and services when storing, processing, and/or transmitting Institutional Data. Technology tools and services, even those at no cost to the University, must be reviewed according to ITP-3: Technology Governance and Procurement Review. This includes personal productivity technologies, including artificial intelligence (AI) tools, that process and retain data (e.g., meeting recording and transcription, large language models (LLMs), small language models (SLMs), image processors).
Microsoft 365 Copilot operates within a secure environment that aligns with enterprise-grade compliance standards to protect Marshall University’s institutional data. It is built on Microsoft’s core principles of privacy, compliance, and security. Data processed by Copilot remains within the organization’s Microsoft 365 environment and adheres to strict security protocols, including role-based access controls and encryption. Institutional data is not used to train Copilot’s underlying large language models, ensuring the confidentiality of sensitive information.
Additionally, Microsoft 365’s existing security features, such as Multi-Factor Authentication (MFA) and Conditional Access policies, extend to Copilot Chat. These safeguards help prevent unauthorized access and minimize the risk of data breaches. Within M365, MUIT can implement governance policies that restrict access to AI features for sensitive or private workloads. By combining Copilot Chat’s capabilities with Microsoft’s robust security framework, Marshall University can harness the power of AI while protecting sensitive data.

Plagiarism has long been a concern in higher education, but the advent of AI tools has added new dimensions to this issue. AI-powered writing assistants, like ChatGPT, can generate text that students might use to complete assignments without proper attribution. Marshall University Libraries offers guidance on appropriate AI use in research and on avoiding plagiarism.
At Marshall University, we believe in the responsible and ethical use of AI tools to enhance learning and research while upholding the highest standards of integrity and respect. As students, you play a crucial role in ensuring that AI technologies are used ethically.
- Academic Integrity: Use AI tools to support your learning and research but always ensure that your work remains your own. Do not use AI to complete assignments, exams, or any academic tasks dishonestly.
- Privacy and Security: Protect any data you handle by using AI tools only while logged in with your university credentials, which keeps your activity within the University's secured environment.
- Fairness and Inclusivity: Be mindful of the potential biases in AI tools. Do not rely solely on the responses you receive; fact-check and confirm the information before using it.
- Transparency: Be open about your use of AI tools in your academic work. Clearly acknowledge when AI has been used to assist with research, writing, or other tasks.
- Human Oversight: Remember that AI is a tool to assist you, not replace human judgment. Always apply critical thinking and human oversight when using AI in your studies.
- Ethical Development and Use: If you are involved in developing AI tools, adhere to ethical standards and consider the impact of your work.
- Continuous Learning: Stay informed about the ethical implications of AI and seek to understand the potential risks and benefits. Engage in discussions and educational opportunities to deepen your knowledge of AI ethics.