Using AI: Good vs Bad

Jul 25, 2025 | Culture Vulture

 

Artificial intelligence (AI) is a common tool for university students and lecturers today. Yet confusion and limited awareness of what counts as ethical use of AI remain.

Academic Integrity teams are working alongside Subject Coordinators, Academic Program Advisors and Deans of Schools to respond to this issue. All students are required to complete Academic Integrity Modules before starting their degree.  

Still, students and staff have different views on acceptable AI use in assessments. Tools like Quillbot and Grammarly are often used for paraphrasing or grammar correction, but many students don’t realise they may need to acknowledge them.  

So, when is AI considered acceptable – and when does it cross the line?  

When is AI use appropriate? 

If used ethically, AI can be a valuable tool for students' learning. In a course like Data, Communication and Power, students are guided to use AI as a source for information seeking, but they must always check with their Subject Coordinator before using any tools.

Assignments in practical subjects or field education placements may not be submitted through Turnitin, creating space for flexible and nuanced use of AI. For international students or those with English as a second language, AI can be a game changer. It helps with grammar, brainstorming, writing support, and even solving complex problems, giving them a better starting point for research and planning.

AI can also assist with preparing assignments, improving writing quality, and managing tasks.  

However, students must follow academic rules. Some didn't realise they had broken the rules, or that their work would be flagged or escalated. If you use AI, you need to show that it was approved, or clearly cite your sources according to your subject's requirements and university policies.

Make sure you always check your subject learning guide and ask your lecturer if you are unsure.  

Why Educators Raise Concerns about AI  

Lecturers are cautious about how students use AI – especially in fields that rely on ethics, judgment, and human connection. School of Social Science Academic Advisor Dr Ben Joseph said students need to develop genuine reflective and critical thinking skills for their future careers.

‘If AI is a limitation to learning and skill development, then it only increases risks for the emerging social worker (and their clients)’, he said, emphasising the importance of ethical and intentional use.  

In disciplines like social work, AI may be unsuitable for tasks like case note writing or handling sensitive client information. It can breach privacy, harm confidentiality, and damage the trust between clients and social workers.  

‘The University and educators are struggling to keep up with rapidly advancing technology’, Joseph said. ‘We need to support students in developing core critical skills that AI cannot replace’. 

He believes regulating AI use is a shared responsibility and should be handled openly and constructively: ‘It’s about building confidence in students’ true skills’, he added.  

Joseph also noted that students compare their work to AI-generated content and feel insecure about their writing and analysis. ‘This insecurity should not hold them back,’ he said. ‘AI is meant to assist students – not replace the core skills and competencies they must develop themselves.’ 

Educators, he suggests, should create a supportive environment where students feel comfortable asking questions and getting guidance. ‘They should be encouraged to research, reflect, and evaluate their own opinions based on findings’, Joseph said. ‘No AI can replace the direct client support social workers provide, and relying on AI to complete assessments may not prepare students for real-world practice.’ 

(Available resources on WSU Study Smart, 2025) 

Support Exists – But Students Don't Always Know Where to Find It

Western’s Library offers academic integrity workshops focused on reading, writing, and research to help students build confidence in completing their assessments. These workshops also introduce students to ethical Generative AI use. Study Smart workshops further support academic literacy and research skills, with resources available both in person and online.  

At the start of each semester, the Library runs Academic Literacy sessions, showing students how to navigate Study Smart tools. The Academic Integrity team also trains frontline staff in the Student Services Hub and Western Success so they can better guide students on where to find help and how to use AI responsibly.

However, many students are unaware of these resources. The University’s website can be overwhelming, and unless lecturers mention and discuss these services in class, students often rely on word-of-mouth or peer support. As AI technology evolves, it will take time for university policies and teaching practices to fully adapt. To support students, the university needs to make these services more visible and accessible – helping students become independent and responsible users of AI in their learning.     

 
