CSAE Standards
Last revised: August 19, 2025
Contact: report@360io.com
1. Our Commitment
360io maintains a zero-tolerance policy for any form of child sexual abuse material (CSAM), child exploitation, or the sexual abuse of minors. We design our products, policies, and enforcement practices to prevent, detect, remove, and report such content and behavior.
2. Scope and Definitions
Minor: Any individual under 18 years of age.
CSAM: Any content that depicts, represents, or purports to depict the sexual abuse or exploitation of a minor, including self-generated imagery and AI-generated or synthetic content involving minors.
Exploitation: Grooming, sexual solicitation, trafficking, sexualized commentary about minors, or any attempt to harm or endanger a minor.
3. What Is Prohibited
The following are strictly prohibited on 360io and all affiliated services:
Uploading, creating, sharing, linking to, or storing CSAM.
Sexualization of minors in any form (including drawn, edited, or synthetic content).
Attempts to contact, solicit, or groom minors, including through private messages.
Any activity that facilitates trafficking or exploitation of minors.
Attempts to evade our safeguards, verification, or moderation.
Violations result in immediate account termination and reporting to the appropriate authorities.
4. Platform Safeguards
Age Protections: Our platform is not for minors. Features that may involve medical imagery require documented, verified, and revocable consent from adults only. We do not permit any sexual content involving minors, real or simulated.
Consent Controls: For adult educational/medical materials, we require explicit, recorded consent and removal rights. Content that cannot meet our verification standards is rejected or removed.
Search/Discovery Limits: We restrict terms, tags, and features that could be used to find or distribute exploitative content.
5. Detection and Enforcement
Proactive Detection: We use a combination of automated signals (including industry-standard hash matching) and trained human review to detect suspected CSAM and grooming behaviors (an illustrative sketch of hash matching follows this list).
Immediate Removal: Confirmed violations are removed without notice. Related accounts, devices, and payment methods may be banned.
Reporting: We promptly report suspected CSAM to the National Center for Missing & Exploited Children (NCMEC) and cooperate with law enforcement investigations, consistent with applicable laws.
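To illustrate the hash-matching signal mentioned above, here is a minimal sketch in Python. It assumes a denylist of digests of known prohibited files; the `KNOWN_HASHES` set and the `matches_known_hash` function are hypothetical names for this example, not 360io's actual implementation. Production systems typically rely on perceptual hashes (such as PhotoDNA) that tolerate re-encoding and cropping, rather than the plain cryptographic digests shown here.

```python
import hashlib

# Hypothetical denylist of SHA-256 digests of known prohibited files,
# as would be sourced from an industry hash-sharing program
# (illustrative only; left empty here).
KNOWN_HASHES: set[str] = set()

def matches_known_hash(file_bytes: bytes) -> bool:
    """Return True if the file's digest appears in the denylist.

    A cryptographic hash only catches byte-identical copies; real
    deployments use perceptual hashing to catch altered re-uploads.
    """
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_HASHES

# Matched uploads are blocked and queued for trained human review;
# they are never published or auto-dismissed.
```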
6. Reporting to 360io
If you encounter content or behavior that may involve the exploitation of a minor:
Email: report@360io.com
Provide links, usernames, timestamps, and any relevant details. We prioritize and investigate these reports 24/7.
7. Support for Victims and Caregivers
We remove known content, preserve evidence as required by law, and cooperate with authorities. We will also honor verified takedown requests from victims, guardians, or authorized representatives when permitted by law.
8. Partner and Staff Responsibilities
Training: Moderators, support personnel, and other relevant staff receive ongoing CSAE training covering recognition, triage, and escalation.
Vendors and Clients: All partners must meet these standards. Violations by partners or clients will result in suspension or termination of access.
9. Data Handling
When investigating CSAE, we minimize access to sensitive material, restrict it to trained personnel, and retain data only as required for safety, legal, and compliance obligations.
10. Legal Cooperation
We comply with applicable laws globally, including mandatory reporting requirements, lawful preservation requests, and valid legal process.
11. Updates to This Standard
We review and update these standards regularly to reflect evolving best practices and legal requirements. Material changes will be posted on this page with an updated “last revised” date.