AI & Human Review

ARKTIK uses AI as a drafting and organizing tool, not as a decision-maker about families, students, or cases. Every education plan, Truancy Shield framework, and report we deliver is owned and signed off by a human being who is accountable for the result. This page explains where AI fits—and where it does not.

How We Use AI

We use AI to make our work faster and clearer, not to replace judgment.

Examples of appropriate use:

  • Turning rough notes into clean drafts of reports or summaries

  • Suggesting alternative phrasings that are easier for non-lawyers to read

  • Helping generate checklists, schedules, and templates from our existing standards

  • Supporting translation or plain-language explanations of our own policies

In all cases, AI is a drafting assistant. It is not a source of truth, and it is not a substitute for staff review.

What AI Never Does at ARKTIK

We do not use AI to:

  • Make decisions about a family’s suitability, risk, or “worthiness”

  • Label or rank students or households (for example, as “high-risk” or “low-potential”)

  • Issue any clinical, legal, or eligibility determinations

  • Generate “evidence” or facts about a case

We do not train AI models on identifiable family records in a way that would expose case details outside our governance boundaries.

Human Review and Sign-Off

Every AI-assisted draft:

  • Is reviewed line by line by ARKTIK staff

  • Is corrected for accuracy, completeness, and tone

  • Is approved by a human who is responsible for the final content

If a draft cannot be verified or does not align with our standards, it is discarded.

You can assume that any document you receive from ARKTIK—whether you are a family, a court, a district, or an agency—has been human-reviewed, corrected, and approved.
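
As a concrete illustration of what this sign-off trail can look like, here is a minimal sketch of a review record for a single AI-assisted draft. The record type, field names, and approval rule are hypothetical examples chosen for clarity; they do not describe ARKTIK's internal tooling.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class DraftReview:
    """Hypothetical sign-off record for one AI-assisted draft."""
    document_id: str
    reviewed_line_by_line: bool = False   # staff have read the full draft
    corrections_applied: bool = False     # accuracy, completeness, and tone fixed
    approved_by: Optional[str] = None     # the accountable human
    approved_on: Optional[date] = None

    def approve(self, reviewer: str) -> None:
        # A draft is only approved after full review and correction;
        # anything that cannot be verified is discarded, not released.
        if not (self.reviewed_line_by_line and self.corrections_applied):
            raise ValueError("Draft not verified; discard instead of releasing.")
        self.approved_by = reviewer
        self.approved_on = date.today()

# Example: a reviewed and corrected draft is signed off by a named person.
review = DraftReview(document_id="education-plan-draft-001")
review.reviewed_line_by_line = True
review.corrections_applied = True
review.approve(reviewer="ARKTIK staff member")
```

The point of the sketch is the ordering: review and correction come first, and approval is always tied to a named, accountable person.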

Data and Privacy Constraints

When we use AI tools:

  • We apply the same Data Residency & Privacy standards described elsewhere in Governance & Safeguards.

  • We avoid sending unnecessary identifying details into AI systems (illustrated in the sketch at the end of this section).

  • Health-related information is framed in functional, plain language, not as full clinical records.

Our priority is to reduce manual busywork, not to expand how much data we hold or share.
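
To make the minimization point concrete, the sketch below shows one way to strip direct identifiers from case notes before any text is shared with a drafting tool. The field names, the redaction list, and the helper function are assumptions for illustration only; what actually counts as necessary detail is decided per engagement under the standards described above.

```python
# Hypothetical illustration of data minimization before AI-assisted drafting.
# Direct identifiers are removed; only functional, plain-language detail remains.
IDENTIFYING_FIELDS = {"student_name", "date_of_birth", "address", "case_number"}

def minimize_for_drafting(case_notes: dict) -> dict:
    """Return a copy of the notes with direct identifiers removed."""
    return {key: value for key, value in case_notes.items()
            if key not in IDENTIFYING_FIELDS}

notes = {
    "student_name": "(placeholder)",
    "case_number": "(placeholder)",
    "attendance_summary": "Missed 6 of the last 30 school days.",
    "family_goals": "Re-establish a consistent morning routine.",
}

# Only the non-identifying, functional fields would ever be sent for drafting help.
draft_input = minimize_for_drafting(notes)
```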

Transparency with Families and Partners

We are transparent that:

  • AI is used to support drafting and organization.

  • Humans make the actual decisions, recommendations, and attestations.

  • Families, courts, and partners can ask whether AI was used in preparing any given document.

If a court, district, or agency prohibits AI-assisted drafting for a particular matter, ARKTIK will comply and will handle that engagement using fully manual workflows.
