GenAI can be a helpful tool for brainstorming topics, generating keywords for searching, and conducting an initial search for scholarly sources within a field. It can also be tempting to ask a GenAI tool for help writing a paper, summarizing someone else's research, or articulating the link between someone else's ideas and your own. Where do we draw the line between acceptable use of GenAI and plagiarism or academic dishonesty?
"At Cal State L. A., plagiarism is defined as the act of using ideas, words, or work of another person or persons as if they were one's own, without giving proper credit to the original sources" (Academic Honesty).
Cheating is defined as:
Copying the work of other persons in whole or in part and claiming authorship
Submitting a paper obtained from any source that provides research/term papers
Using a ghostwriter to compose a paper and claiming authorship (Academic Honesty)
If you have questions about whether your use of AI is considered plagiarism, you can meet with a Writing Tutor or schedule an appointment with a librarian.
Like all information sources (such as Wikipedia), information from GenAI should be critically evaluated for accuracy and bias. GenAI tools create content based on patterns in the human-authored texts they were trained on. So, GenAI responses will reflect the biases and perspectives that appear most often in their training data. These tools also present information in an authoritative voice, which discourages users from evaluating responses for bias, incorrect information, and missing information.
Here are some questions to consider when evaluating your prompt responses.
Technology companies take billions of works of human creativity from the internet, compile them into datasets, and use those datasets to train generative AI tools. These practices raise legal concerns, such as copyright infringement, and call into question whether artists consented to having their work included in training models. What impact can this have on artists' livelihoods? Here are a few cases and news stories about AI and copyright issues.
Prompting ChatGPT for information requires four to five times the energy of conducting a Google search (Crawford, 2024).
Generative AI data centers, the facilities that house the computing infrastructure behind GenAI, consume enormous amounts of water to cool the computer servers and other hardware.
A Times investigation found that OpenAI, one of the world's most valuable AI companies, used outsourced, underpaid Kenyan laborers, earning less than $2 per hour, to detect and filter toxic content and hate speech.
These findings invite reflection on the invisible, and often traumatic, labor that goes into creating and maintaining these technologies.