For Parents

Parents are legally and practically responsible for their children’s wellbeing.

However, when children interact with digital systems, particularly on school-issued devices, parents often lack:

  • Visibility into interactions

  • Ability to reconstruct what occurred

  • Ability to intervene in a timely and informed manner

What Has Changed

Historically, tools used by children were:

  • Observable (books, assignments, software outputs)

  • Reviewable after use

  • Controlled by adults at the point of interaction

With generative and interactive systems:

  • Outputs are generated in real time

  • Interactions are not consistently logged, retained or accessible

  • Experiences may differ between users and sessions

This reduces the ability of parents to verify or understand what occurred.

Structural Condition

This creates a structural separation:

  • Responsibility remains with parents

  • Control over interactions and outputs is partially delegated to systems and institutions

The framework identifies this as a misalignment between responsibility and authority.

What the Framework Evaluates

From a parent perspective, the framework asks:

  • Whether interactions can be observed or reconstructed

  • Whether there is clear accountability for system behavior

  • Whether defined boundaries exist for when and how systems are used

  • Whether institutional commitments have been fulfilled

These are governance questions, not content questions.

Questions Parents Need to Ask Their Schools Today

Send your school administrator the IAF along with the four sample questions below, and ask for answers in writing.

1. Insurance Review

For each defined student-facing generative or conversational AI use case, has the district’s insurer explicitly reviewed and confirmed coverage?


2. Logging / Reconstruction

Are all student prompts and corresponding AI outputs fully reconstructable by authorized administrators when generative AI tools are used on district-issued devices or district-managed student accounts, whether used at school or off campus?


3. Vendor Liability Allocation

If a vendor disclaims responsibility for AI-generated outputs, how is liability contractually allocated to protect the district, students, and taxpayers?


4. Default Device Posture

What is the default configuration of generative AI tools on district-issued student devices?

Are they (a) disabled by default, (b) disabled by default but enabled only during supervised instructional use, or (c) enabled for general student access?

If configurations differ by grade level, please indicate those differences.

Keep copies of all correspondence. Responses from policy or IT staff typically do not answer governance questions; these questions require review and a written response from legal counsel and risk management.