Integrity in the Age of AI: Ethical Considerations of ChatGPT Use Among De La Salle University Senior High School Students
Main Authors:
Format: text
Published: Animo Repository, 2024
Subjects:
Online Access: https://animorepository.dlsu.edu.ph/conf_shsrescon/2024/paper_csr/3
https://animorepository.dlsu.edu.ph/context/conf_shsrescon/article/2382/viewcontent/PP_CSR_Santos_Roldan_Pepito_Lim_Cheng___Mark_Janssen_Santos.docx.pdf
Institution: De La Salle University
Summary: The rise and integration of AI tools such as ChatGPT in education raise significant ethical concerns, particularly regarding academic integrity. This study investigates the use and ethical implications of ChatGPT among senior high school students at De La Salle University Manila through a descriptive cross-sectional design. Responses from 226 students to a structured survey revealed how these tools affect views on academic integrity and actual academic practices. Findings indicate that while ChatGPT boosts academic productivity in various tasks, such as research, comprehension, and content creation, it also introduces risks concerning dependency, originality, and fairness. Moreover, many students believe that these risks can be mitigated through responsible usage, yet they also express concern that ChatGPT may undermine academic integrity. Accordingly, this study highlights the urgent need for clear guidelines and robust educational programs to harness AI's benefits while maintaining academic standards. The results underscore the need for a balanced approach to integrating AI, ensuring that ethical considerations are addressed alongside technological advancements.