Safety boundaries

AI can help with thinking. It cannot replace care, authority, or judgment.

This site teaches practical AI literacy to teens ages 13-17, with adult awareness. It is not a companion bot, a homework-cheating service, a therapy tool, or an unsupervised agent platform.

Allowed practice

  • Writing safer prompts.
  • Checking claims and sources.
  • Summarizing notes with personal details removed.
  • Mapping workflows and decisions.

Not allowed here

  • Medical, mental-health, legal, crisis, or safety advice.
  • Self-harm, abuse, sexual, violent, illegal, or age-restricted content.
  • Automated tool execution, accounts, payments, or social features.
  • Entering private data about the learner or other people.

Pause and escalate

When content feels private, risky, frightening, or beyond a school task, stop the exercise and involve a trusted adult or an appropriate local support service.

AI output can be wrong

AI systems can invent facts, miss context, reproduce bias, or sound confident while being wrong. Learners should verify important claims, keep their own voice, and make final decisions themselves.