Artificial Intelligence in LPS

Artificial Intelligence (AI) describes a machine that appears to perform the cognitive functions we usually associate with human minds. The type of AI that people are most interested in lately is “generative AI”: tools that create new, original content in response to the natural language requests (prompts) we provide.

Education may see long-term benefits in a number of areas when AI-enhanced approaches are applied to existing priorities. A few examples include supporting students with disabilities, addressing variability in student learning, supporting English language learners in the traditional classroom, and adaptive feedback loops that increase the quality and quantity of feedback provided to students and teachers.

AI will never replace good teachers. Educators have the opportunity to use AI-enabled tools to amplify their role in the learning process, enabling them to engage and support their students at even higher levels. It can also help reduce the burden of “off-stage” activities like grading, analyzing data, crafting professional communications, and more.

These days there is a lot of AI hype out there, with companies pushing “magical thinking machines” and overpromising time savings and results. Every month our existing tools update with more AI enabled features, whether we ask for them or not. 

At LPS, we want to take a measured and responsible approach to the integration of AI tools into our work. Any tool used by LPS staff must meet our requirements for security, data governance, manageability and integration with our current workflows.

We’ll be doing controlled deployments of these tools over time and you can choose to use them if you think they’ll help you with your work. Please share your feedback on the ways they positively impact your work, and what might make them more useful for you.

This topic evolves rapidly. What was true and accurate yesterday may be out of date today.

LPS AI Resources

  • ITR000 – Artificial Intelligence
    This data governance rule defines and offers parameters for AI use within LPS.
  • Guiding Principles of AI Use in LPS
    Offers LPS positions on appropriate and responsible use of AI in classroom instruction, school leadership, and districtwide operations.
  • Artificial Intelligence Advisory Group (AIAG)
    This cross-departmental group meets regularly to consider the opportunities and risks of AI use by LPS staff and students, and makes recommendations to leadership about appropriate positions. This committee is chaired by Kristi Peters, Director of Educational Technology.

Common Vocabulary

Generative AI (GenAI) – the class of tools that can be used to create new material (novel synthetic content), including audio, code, images, text, simulations, and videos.

Large Language Model (LLM) – a type of AI that has been trained on vast amounts of text to understand existing content, find patterns, and generate original content in response to human prompts.

Prompt – the method of interacting with an AI tool in the form of a request or question. Prompts are usually text based and conversational in nature.

Google Gemini

NOTE: Until Fall 2025, access to LPS Gemini is restricted to administrators, teachers and staff who are part of an identified pilot group. Questions may be directed to Kristi Peters, Director of Educational Technology Services.

Gemini Chat is a generative AI tool available for use by LPS staff members. It exists within our LPS Google domain, and you should access it via your @class.lps.org Google account.

Please consider all LPS guidance and policy around the use of Artificial Intelligence (found elsewhere on this page) before using Gemini. Other important considerations are shared throughout this page.

Unlike other generative AI tools available on the internet, the Gemini Chat tool exists within the legally protected environment of our LPS Google domain (@class.lps.org). Prompts and responses generated here are covered by the same privacy guarantees as Gmail, Drive, Docs, and the other Google Workspace tools used by LPS, in compliance with FERPA, and BoE Policy on data privacy. It is important to understand that these privacy practices are NOT in place when using Gemini within a personal Google account.

Can Students Use Gemini?

No. At this time students DO NOT have access to Gemini Chat in their LPS accounts, nor on LPS Chromebooks. Teachers should not assign any work that requires the use of generative AI at this time. 

However, teachers should assume that enterprising students have ways to access generative AI tools outside of LPS when completing assignments. Please follow existing Board Policy and Student Code of Conduct expectations around academic integrity. 

Gems

Gems are pre-prompts that can be customized to meet your needs, then saved for re-use. They are a big time saver for tasks you use Gemini for repeatedly. You can connect them to helpful support materials, or draft long, prescribed prompts that tell Gemini everything you want it to know for a better reply.

Learn how to get started with Gems.

Notebook LM

The NotebookLM application is enabled for your LPS Google account. It is a tool that uses the Gemini AI engine to better understand and work with your own documents (Google Docs & Slides, PDFs, text files, YouTube videos, web URLs, and more).

Here are a few initial ideas for use cases:

  • Enhance your understanding of a specific, curated topic. You can ask questions that are answered specifically based on your documents.
  • Summarize long and complex information, and make connections between different resources.
  • Create summaries, FAQs, outlines, study guides, and mind maps.

As we commonly see with generative AI tools, NotebookLM is not immune to making false statements, so continue being a careful consumer of the content within these tools.

As a “Core Service” of Google Workspace, it includes the same data protections as all of our Google Drive, Docs, and Gmail files.

Learn more by visiting Google’s NotebookLM resource page.


Gemini Jump Start

If you are interested in how to use Gemini, you might begin with this short course from Google where you can learn about the concepts, identify ways it might assist your professional practice, and explore the process of writing education specific prompts.

When is it appropriate to use AI?

  • Do you have knowledge of the topic?
    This is the only safe time to use AI. You are responsible for knowing if the output is accurate.
  • Are the stakes high or low?
    AI output is unique BY DESIGN. This makes it unpredictable BY DEFINITION. Avoid using it in high stakes situations.
  • Is high accuracy required?
    Are you looking for AN answer, or THE answer? GenAI is designed to offer human-sounding answers and excels at generating unique text on divergent topics. However, you are likely to find that it misstates facts with confidence.
  • Is effort the point?
    As an educator you will understand that learning cannot happen unless there is effort involved. AI should be used in ways that still require thought and reflection.

VERY Important Things to Know...

  • When using free AI tools found on the internet, your prompts and text responses are NOT private or secure. They are seen and used by third parties. Federal regulations (FERPA) demand that we protect certain types of data, which should make us pause and be thoughtful about when, where, and how we interact with generative AI. Using LPS approved AI tools (Gemini within your LPS Google account, for example) will help you keep student data private and protected.
  • Generative AI tools may confidently provide you with text that sounds true but is not accurate. They are unable to determine whether the words they present to you are true or false. You accept full responsibility for evaluating any response with professional judgment before using it.
  • AI tools are not human! They are statistical reasoning engines that generate sentences that sound human.
  • When you interact with Gen AI you are not retrieving existing information the way a search engine does. Instead, you are “prompting” it for novel (new, unique) content.
  • Generative AI tools are not search engines. If you ask a Gen AI tool the same question five times, you will get five unique answers that may contradict one another.
  • Any generative AI tool is likely to be generating responses based upon biased source data.
  • Generative AI content should never replace the guaranteed and viable curriculum provided by LPS.

Deeper Dives

How does Generative AI work?

Generative AI can’t “think” for itself. The AI tools are really just math, not magic. To better understand how they work, we need to learn about the language generative AI uses (tokens), and how the algorithms create novel text based on our prompts.
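For the technically curious, here is a tiny, illustrative sketch in Python of that idea. The vocabulary of “tokens” and their probabilities below are entirely made up for illustration (real models learn billions of such relationships from their training text, and this is not how Gemini is actually implemented), but it shows two essentials: text is handled as tokens, and each next token is sampled from a probability distribution rather than looked up.

    import random

    # A toy "model": for each previous token, the probability of each possible next token.
    # Real LLMs learn these relationships from vast amounts of text; these are invented.
    next_token_probs = {
        "The":      {"students": 0.6, "teacher": 0.4},
        "students": {"learn": 0.5, "write": 0.3, "ask": 0.2},
        "teacher":  {"explains": 0.7, "grades": 0.3},
        "learn":    {"quickly.": 0.5, "together.": 0.5},
        "write":    {"essays.": 0.6, "code.": 0.4},
        "ask":      {"questions.": 1.0},
        "explains": {"patiently.": 1.0},
        "grades":   {"essays.": 1.0},
    }

    def generate(prompt_token: str) -> str:
        """Build a sentence one token at a time by sampling a likely next token."""
        tokens = [prompt_token]
        while tokens[-1] in next_token_probs:
            options = next_token_probs[tokens[-1]]
            # Sample the next token according to its probability (not a lookup).
            next_token = random.choices(list(options), weights=list(options.values()))[0]
            tokens.append(next_token)
        return " ".join(tokens)

    # The same "prompt" can produce a different response every time, by design.
    for _ in range(5):
        print(generate("The"))

Because each next token is drawn from probabilities rather than retrieved, running this loop prints several different sentences from the identical starting prompt. That is the same reason a real generative AI tool can give you different answers to the same question.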

What can educators do?

As technology evolves and becomes more sophisticated, it’s understandable that we’re uneasy about actual and anticipated challenges. Here are some thoughts and strategies to consider as we all move forward. 

What should teachers know?

Lilach Mollick explains it well: AI is UBIQUITOUS, UNDETECTABLE, and TRANSFORMATIVE.

Ubiquitous – AI is everywhere. Every person has access to it. There is no way to filter or block students from access. It is already found within everyday tools, and increasingly so. Even if it were possible to block AI on LPS devices (which it is not), students would still be able to access it from home devices and/or their phones.

Undetectable – Students are already using it in all kinds of ways. This includes cheating, and we can’t tell if they’re doing so. Common AI “detectors” have high false positive rates, are easily fooled, and may contain implicit bias. 

Transformative – AI is our first technology since the creation of the Internet that touches everything we do. It has the potential to transform how we live, how we work, and how we teach.

We need to balance the risks and opportunities of AI,
because there are plenty of both.

Before using AI tools in an instructional setting...

Many Federal Regulations are relevant to the use of AI in an educational setting. Teachers and staff should be mindful of this, and adhere to these commitments we have to students and families before using an AI tool in our work, or with students.

FERPA – Any AI systems used by LPS employees must protect the privacy of student education records. Most publicly available AI tools do not have the level of security and privacy policy in place to assure that student data remains protected. If LPS staff choose to use AI in support of their role and practices, they should ensure that no personally identifiable information is ever included in AI prompts.
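As one small illustration of that habit, below is a hypothetical Python sketch of “scrubbing” a prompt before it is ever sent to an AI tool. The pattern list and the redact_prompt helper are invented for this example and would not catch every form of personally identifiable information (names, addresses, IEP details, and more), so the dependable rule remains simple: do not put student data into a prompt in the first place.

    import re

    # Hypothetical patterns for a few obvious identifiers. Treat this as a
    # reminder of the habit, not a guarantee that all PII has been removed.
    PII_PATTERNS = [
        (re.compile(r"\b\d{6,}\b"), "[STUDENT ID]"),               # long numeric IDs
        (re.compile(r"[\w.+-]+@[\w-]+\.\w+"), "[EMAIL]"),          # email addresses
        (re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"), "[PHONE]"),   # phone numbers
    ]

    def redact_prompt(prompt: str) -> str:
        """Replace obvious identifiers before sending text to any AI tool."""
        for pattern, placeholder in PII_PATTERNS:
            prompt = pattern.sub(placeholder, prompt)
        return prompt

    draft = "Draft a progress update for student 12345678; contact the parent at jo@example.com."
    print(redact_prompt(draft))
    # Draft a progress update for student [STUDENT ID]; contact the parent at [EMAIL].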

COPPA – Like all websites that require a login or collect personal information, AI tools require parental consent when used with students under the age of 13. Tools that are approved for use in LPS have gained this consent. Check the Matrix to see if a tool is approved for use with students.

IDEA – AI must not be used in a way that denies students with disabilities equal access to educational opportunities.

CIPA – Schools must ensure that AI use aligns with CIPA protections against harmful content.

Section 504 (of the Rehabilitation Act of 1973) applies to both physical and digital environments. Schools must ensure that their digital content and technologies are accessible to students with disabilities.

How can I proactively monitor student work?

Use Hāpara

Guided Browsing and Freeze Tabs
Hāpara is a critical component of our systems for managing behaviors in the classroom. It offers some powerful features that teachers can use to focus student browsing activity and keep students from accessing unintended resources during LPS classroom work time. (NOTE: This does not stop a student from accessing tools on a personal cell phone.)

  • Focused Browsing: “Lock” your student(s) to specific tabs you send out to Chromebooks.
  • Filtered Browsing: Add websites to the “blocked” list for a specified amount of time (ex – Google, Wikipedia, known AI tools, curriculum resources, etc.).
  • Freeze Tabs: “Lock” your student(s) to specific tabs already open on their Chromebook for a set amount of time.

Watch this overview of Hāpara Highlights to learn more.

Originality Reports

You can use the built-in Originality Reports in Google Classroom when creating assignments. Our LPS license includes unlimited reports for our staff. These reports do not detect AI generated text, but may help students understand that you are conscious of the potential for academic dishonesty.

Have students work *with* AI

If you are a teacher of science, history, social studies, or any other area that depends upon facts, the most appropriate way to address AI generated text may be to start an assignment by using AI generated text!

Start with an assignment or prompt about your subject that you would otherwise have assigned to your students. Instead, take that to Gemini, ChatGPT, or another generative text tool. Have it generate a version or two of a response.

For your assignment, have students respond to the text you generated. Show them how you got the text. Have the students evaluate the response: fact check it, cite their sources, and examine it for potential bias. Ask them to add one thing they learned about the topic that the AI missed. Have them argue an alternative viewpoint. This is the sort of high-level thinking you may have been hoping for when assigning the prompt originally.

Getting an up close look at the hallucinations and errors that can appear in generated text may be the best way to make students wary of using it in a copy/paste fashion. At the very least it will model an appropriate way to approach AI generated text in their future endeavors.

Version History

If the work was done in a Google Doc, Sheet, or Slide document, check the version history. As an editor, you should be able to see the progression of the text in the document word by word, and any editing that happens along the way. If the text suddenly appears all at once in the history of the document, it may be a clue that it was pasted from another source.
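For staff comfortable with a little scripting, the same revision trail can also be read programmatically. The sketch below is an optional illustration, not a required workflow: it assumes you already have authorized Google Drive API credentials (a setup this page does not cover) and simply prints when each saved revision of a Doc was made and by whom, so a long document whose text arrives in a single revision stands out.

    from googleapiclient.discovery import build

    def list_revision_times(creds, file_id):
        """Print when each saved revision of a Google Doc was made, and by whom."""
        # Assumes `creds` is an already-authorized Google credential object with
        # permission to read the file identified by `file_id`.
        drive = build("drive", "v3", credentials=creds)
        response = drive.revisions().list(
            fileId=file_id,
            fields="revisions(id, modifiedTime, lastModifyingUser(displayName))",
        ).execute()
        for revision in response.get("revisions", []):
            who = revision.get("lastModifyingUser", {}).get("displayName", "unknown")
            print(revision["modifiedTime"], who)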

Consider assigning work that requires students to iterate on an assignment (edit, revise, remix). These built-in steps might make it more difficult to copy and paste a completed work from another source.

NOTE: Working “offline” is not a good strategy for stopping AI.

  • Unless the work is started and finished in your presence, students can access AI tools as soon as they leave your classroom.
  • If students had tabs open in Chrome prior to turning off wi-fi, those tabs remain available and their contents can be copied/pasted into documents.
  • Manually selecting “Offline mode” can disrupt settings needed by the students who most need to access work on their Chromebooks outside of school, where they have no internet access.

How can you tell if a student used AI in an assignment?

About “Detection” Tools

AI detection tools have been shown to be biased, inaccurate, and incomplete, and their vendors oversell their abilities. They occasionally make false accusations, and even their creators admit there is no 100% accurate tool available. As the models improve and clever students figure out how to get better results with more sophisticated prompts, it is hard to imagine any reliable way to “detect” AI text.

As stated earlier, be careful of implicit bias, which is prevalent in most AI tools.

TO BE CLEAR: No tool can tell you with 100% accuracy whether or not a student has used AI to generate text used in an assignment.

Be Proactive

Long before a problem occurs, make it clear to students that you are aware of the potential for AI to be used inappropriately. Include references to District Policy on plagiarism and Academic Integrity statements in your course syllabus and in class discussions.

  • Board Policy 6442: Plagiarism
  • Building Student Code of Conduct RE: Academic Integrity

Investigating Questions of Academic Integrity

If your curricular area has provided guidance on Artificial Intelligence and Academic Integrity processes, please refer to those resources.

If you believe that a student may have violated these expectations by using AI to generate the work they submitted, here are some steps you can take to address it.

  • Consider whether the submitted work is consistent with the student’s previous work. Refer to previous artifacts of their learning for comparison.
  • If the work was done in a Google Doc, Sheet, or Slide document, check the version history. You should be able to see the progression of the text word by word, and any editing that happens along the way. If the text suddenly appears all at once in the history of the document, it may have been pasted from another source.
  • If you choose to use “AI detectors,” use multiple. No single tool can tell you with 100% accuracy whether or not a student has used AI to generate text used in an assignment. Be aware that current AI detectors are known to produce false positives at a high rate and have shown bias. With that in mind, consulting a few of them may help you form an informed opinion about whether text is more likely to be authentic or more likely to have been AI generated.
  • Also note that AI text generators can easily hallucinate a well-formatted citation that does not actually connect to anything in the real world. Having students produce the cited document(s) could help them understand this.

If your investigation leads you to believe that a student may have violated expectations, prepare any relevant documentation and share it with the student and family to inquire about the discrepancy and your concerns. Consider asking the student to resubmit an alternate or revised version of the work, or summarize their main points with pencil & paper in your presence. You will want to contact the student’s parent(s)/guardian(s), a building administrator, and potentially a counselor before taking these steps.