Last updated on December 3, 2024 at 4:06 PM
This topic evolves rapidly. What was true and accurate yesterday may be out of date today.
Artificial Intelligence (AI) describes a machine that appears to perform the cognitive functions we usually associate with human minds. It does this by finding patterns in data sets, which it does far better and faster than humans can. These tools offer the potential to improve the way humans interact with computers.
You have already been using AI for years. Autocomplete. Spelling & grammar check. Language translation. Alexa & Siri. Audio transcripts. The way Amazon suggests products or Netflix suggests a new show for you to watch. All of these tools are AI-driven.
The type of AI that teachers and students are most interested in is likely “generative AI” (GenAI). These are tools that can create new, original content in response to the conversational requests (prompts) we provide.
Lilach Mollick explains it well:
AI is UBIQUITOUS, UNDETECTABLE, and TRANSFORMATIVE.
Ubiquitous – AI is everywhere. Every person has access to it. There is no way to filter or block students from access. AI features are already found within everyday tools, and increasingly so. Even if it were possible to block AI from LPS devices (which it is not), students would still be able to access it from home devices and/or their phones.
Undetectable – Students are already using it in all kinds of ways. This includes cheating, and we can’t tell if they’re doing so. Common AI “detectors” have high false positive rates, are easily fooled, and may contain implicit bias.
Transformative – AI is our first technology since the creation of the Internet that touches everything we do. It has the potential to transform how we live, how we work, and how we teach.
It may be worth your time to watch the following video, in which Ethan and Lilach Mollick offer an overview of how large language models work and how this latest generation of models has impacted how we work (as educators) and how we learn (as students).
The LPS Artificial Intelligence Advisory Group (AIAG) is a committee that meets regularly to consider the opportunities and risks of AI use by LPS staff and students, and makes recommendations to leadership about appropriate positions.
Similar to the ITT Committee that has reviewed instructional technology tools over the past decade, this cross-departmental team includes staff from a number of departments including Curriculum, Equity, Diversity and Inclusion, Library Services, and Computing Services.
The committee is chaired by Kristi Peters, Director of Educational Technology.
Artificial Intelligence (AI) is when a machine performs the cognitive functions we usually associate with human minds. The term was coined by scientists in 1955.
Algorithm – precise steps (rules) for a computer program to take, much like a recipe in baking. While AI systems may contain algorithms, much of their output is the result of learning from data.
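To make the “recipe” analogy concrete, here is a minimal illustrative sketch (not from LPS materials) of an algorithm in Python: explicit, human-written steps that always produce the same result for the same input, in contrast to an AI system whose behavior is learned from data.

```python
# A simple algorithm: explicit, repeatable steps for computing an average.
# Every step here was written by a human, and the same input always
# produces the same output -- unlike a model that learns from data.
def average(numbers):
    total = 0
    for n in numbers:            # step 1: add up every value
        total += n
    return total / len(numbers)  # step 2: divide by how many there are

print(average([80, 90, 100]))  # prints 90.0
```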
Generative AI (GenAI) – the class of tools that can be used to create new material (novel synthetic content), including audio, code, images, text, simulations, and videos. (E.g. ChatGPT, Gemini, Claude, etc.)
Hallucination – Inaccurate or misleading output from an AI tool.
Large Language Models (LLM) – a type of AI that has been trained on vast amounts of text to understand existing content, find patterns, and generate original content in response to human prompts. Examples include GPT from OpenAI and Gemini from Google.
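The core idea behind a language model (predicting a likely next word from patterns found in text) can be sketched at toy scale. The snippet below is purely illustrative and enormously simplified; real LLMs use neural networks trained on vast corpora, but the underlying task of next-word prediction is the same.

```python
import random
from collections import defaultdict

# Toy "language model": count which word follows which in a tiny sample
# of text, then generate new text by repeatedly picking a likely next word.
text = "the cat sat on the mat and the cat slept"
words = text.split()

# Learn the patterns: for each word, record the words seen after it.
next_words = defaultdict(list)
for current, following in zip(words, words[1:]):
    next_words[current].append(following)

# Generate a short continuation starting from a prompt word.
random.seed(0)  # fixed seed so the sketch is repeatable
word = "the"
output = [word]
for _ in range(5):
    candidates = next_words.get(word)
    if not candidates:         # no known continuation; stop generating
        break
    word = random.choice(candidates)
    output.append(word)

print(" ".join(output))
```

Every generated word really did follow the previous one somewhere in the training text, which is why the output sounds plausible even though the program has no understanding of cats or mats.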
Prompt – A request or question used to interact with an AI tool. Prompts are usually text-based and conversational in nature.
Spam filters in email, sentence suggestions in word processing, voice assistants on our phones, or fitness tracking in our watches… there are countless machine learning tools quietly existing in our everyday lives in ways you may not have considered. (March 2024)
As technology evolves and becomes more sophisticated, it’s understandable that we’re uneasy about actual and anticipated challenges. Here are some thoughts and strategies to consider as we all move forward.
Generative AI has been available to the public for a few years now and can be accessed through a number of different avenues. Instead of publishing a list of them here, LPS staff may refer to this resource that lists common examples.
Google offers a short (~2 hour) course where you can learn the necessary concepts behind generative AI, identify ways it might assist your professional practice, and explore the process of writing prompts. It is a great introduction to this new world of GenAI, with classroom specific examples.
Many Federal Regulations are relevant to the use of AI in an educational setting. Teachers and staff should be mindful of this, and adhere to these commitments we have to students and families before using an AI tool in our work, or with students.
FERPA – Any AI systems used by LPS employees must protect the privacy of student education records. Most publicly available AI tools do not have the level of security and privacy protections in place to ensure that student data remains protected. If LPS staff choose to use AI in support of their role and practices, they should ensure that no personally identifiable information is ever included in AI prompts.
COPPA – Like all websites that require a login or collect personal information, AI tools require parental consent when used with students under the age of 13. Tools that are approved for use in LPS have gained this consent. Check the Matrix to see if a tool is approved for use with students.
IDEA – AI must not be used in a way that denies disabled students equal access to education opportunities.
CIPA – Schools must ensure that AI use aligns with CIPA protections against harmful content.
Section 504 (of the Rehabilitation Act of 1973) applies to both physical and digital environments. Schools must ensure that their digital content and technologies are accessible to students with disabilities.
Guided Browsing and Freeze Tabs
Hāpara is a critical component of our systems for managing behaviors in the classroom. It offers some powerful features that teachers can use to focus student browsing activity while keeping them from accessing unintended resources during LPS classroom work time. (NOTE: This does not stop a student from accessing tools from their personal cell phones).
You can use the built-in Originality Reports in Google Classroom when creating assignments. Our LPS license includes unlimited reports for our staff. These reports do not detect AI generated text, but may help students understand that you are conscious of the potential for academic dishonesty.
If you are a teacher of science, history, social studies, or any other area that depends upon facts, the most appropriate way to address AI generated text may be to start an assignment by using AI generated text!
Start with an assignment or prompt about your subject that you would otherwise have assigned to your students. Instead, take that to Gemini, ChatGPT, or another generative text tool. Have it generate a version or two of a response.
For your assignment, have students respond to the text you generated. Show them how you got the text. Have the students evaluate the response. Fact check it, cite their sources. Evaluate it for potential bias. Ask them to add one thing they learned about the topic that the AI missed. Have them argue an alternative viewpoint. This is the sort of high level thinking you may have been hoping for when assigning the prompt originally.
Getting an up close look at the hallucinations and errors that can appear in generated text may be the best way to make students wary of using it in a copy/paste fashion. At the very least it will model an appropriate way to approach AI generated text in their future endeavors.
If the work was done in a Google Doc, Sheet, or Slide document, check the version history. As an editor, you should be able to see the progression of the text in the document from word-to-word, and any editing that happens along the way. If the text suddenly appears all at once in the history of the document, it may be a clue that it was pasted from another source.
Consider assigning work that requires students to iterate on an assignment (edit, revise, remix). These built-in steps might make it more difficult to copy and paste a completed work from another source.
LPS does not recommend any specific AI detection tools. These tools have been shown to be biased, inaccurate, or incomplete, and many oversell their abilities. They occasionally make false accusations, and even their creators admit that no 100% accurate tool exists. As the models improve and clever students learn to get better results with more sophisticated prompts, it is hard to imagine any reliable way to “detect” AI text.
As stated earlier, be careful of implicit bias, which is prevalent in most AI tools.
TO BE CLEAR: No tool can tell you with 100% accuracy whether or not a student has used AI to generate text used in an assignment.
Long before a problem occurs, make it clear to students that you are aware of the potential for AI to be used inappropriately. Include references to District Policy on plagiarism and Academic Integrity statements in your course syllabus and in class discussions.
If your curricular area has provided guidance on Artificial Intelligence and Academic Integrity processes, please refer to those resources.
If you believe that a student may have violated these expectations by using AI to generate the work they submitted, following are some steps you can take to address it.
If your investigation leads you to believe that a student may have violated expectations, prepare any relevant documentation and share it with the student and family to inquire about the discrepancy and your concerns. Consider asking the student to resubmit an alternate or revised version of the work, or summarize their main points with pencil & paper in your presence. You will want to contact the student’s parent(s)/guardian(s), a building administrator, and potentially a counselor before taking these steps.
The potential benefits of modern AI for students and teachers are many. This US Department of Education resource outlines a number of areas in which education may see benefits when using technology-enhanced approaches to existing priorities such as supporting students with disabilities, addressing variability in student learning, enabling English language learners in the traditional classroom, or adaptive feedback loops that increase the quality and quantity of feedback provided to students and teachers.

AI will never replace good teachers. Educators can use AI-enabled tools to amplify their role in the learning process, enabling them to engage and support their students at even higher levels. It can also help reduce the burden of “off-stage” activities like grading, analyzing data, crafting professional communications, and more.
In these early days of the Large Language Model era, AI-generated text is known for being flawed when it comes to matters of fact. If students are to use it ethically, they will need to develop the research skills that allow them to determine whether the text is factually correct, accurate in its representations, or contains explicit or implicit bias. Students will surely try to use AI to cheat on assignments, as they have with every new technology, and teachers will need to rely upon existing policies and codes of conduct around plagiarism and academic integrity. The bigger question for educators when considering cheating (in any form) has always been “what are we really assessing?” There is no question that over time AI will challenge how we do assessment and determine understanding, just as assessment was challenged when the pocket calculator was invented and when the internet arrived on student devices and cell phones.