
Our Zero-Tolerance Policy for CSAE

Strings Social prohibits any content or behavior involving the sexual exploitation or abuse of minors, including grooming, solicitation, the sharing or creation of CSAM, and instructions that facilitate such harm. We act swiftly on reports, remove violating content, preserve evidence, and cooperate with authorities.

Last Updated: August 2025

Scope & Definitions

These standards apply to all users, content, messages, profiles, media, and interactions on Strings Social. A “child” or “minor” refers to any person under the age defined by applicable law in their jurisdiction. “CSAM” (Child Sexual Abuse Material) includes any sexualized depiction of a minor, real or synthetic, and material that facilitates or normalizes sexual exploitation of children.

Prohibited Content & Behavior

  • Sharing, requesting, generating, or directing others to CSAM (including AI-generated or manipulated content).
  • Grooming, sexual solicitation, or romantic or sexual propositions involving a minor.
  • Sexualized comments about minors; fetishization or normalization of sexual activity with minors.
  • Instructions to find, trade, conceal, or produce CSAM; links to such material; or use of codewords to evade detection.
  • Impersonation of a minor to facilitate exploitation; coercion, extortion, or threats (including “sextortion”).

Violations can lead to immediate content removal, account restriction or termination, reporting to authorities, and legal action where applicable. (Aligned with Google Play’s Child Safety Standards for social or dating apps.)

Safety by Design

  • Access & Use: Strings Social is not intended for children below the legal minimum age in their jurisdiction, and we do not knowingly permit accounts for users under that age. Minors who meet the minimum age may use the app only with the consent and supervision required by applicable law.
  • Discovery Controls: We restrict sexualized content in discovery and recommendation surfaces and deter romantic or sexual interactions involving minors.
  • Detection & Review: We may use a combination of automated signals and human review to identify and address potential CSAE content or behavior. Automated detection complements, but does not replace, user reporting.
  • Rate-limiting & Abuse Controls: We deploy anti-spam and anti-harassment measures to reduce unsolicited contact and abuse.

How to Report & What We Do

  1. In-App Reporting: Use “Report” on profiles, chats, or posts to flag CSAE concerns. Provide as much detail as possible (who, what, where, when, links or screenshots), but do not attach or forward suspected CSAM itself.
  2. Email Escalation: For urgent concerns, contact our designated point of contact: info@exentrea.com.
  3. Internal Review: We promptly assess reports, remove illegal content, restrict accounts as needed, and preserve evidence as required by law.
  4. Law-Enforcement Cooperation: We cooperate with competent authorities. If the matter concerns the United States or U.S. persons, we may report to the NCMEC CyberTipline. In India, users can also report via the National Cybercrime Reporting Portal.

References: Google Play Child Safety Standards; NCMEC CyberTipline guidance; India’s National Cybercrime Reporting Portal and the Protection of Children from Sexual Offences (POCSO) Act framework.

Guidance for Parents, Guardians & Users

  • Do not share or forward suspected CSAM—report it immediately using in-app tools or the email above.
  • Discuss online safety with minors: avoid sharing personal images or locations, and never meet strangers alone.
  • If you suspect immediate danger, contact local law enforcement first, then report in-app.

Designated Contact

Email: info@exentrea.com

This contact is prepared to discuss our prevention practices and compliance processes regarding CSAE.

Updates to These Standards

We may update this page to reflect legal, safety, or operational changes. Material updates will be communicated in-app or via the website.
