Scope & Definitions
These standards apply to all users, content, messages, profiles, media, and interactions on Strings Social. A “child” or “minor” refers to any person under the age defined by applicable law in their jurisdiction. “CSAM” (Child Sexual Abuse Material) includes any sexualized depiction of a minor, real or synthetic, and material that facilitates or normalizes sexual exploitation of children. “CSAE” (Child Sexual Abuse and Exploitation) refers more broadly to CSAM and to conduct that exploits or endangers minors, such as grooming, sexualization, and sextortion.
Prohibited Content & Behavior
- Sharing, requesting, generating, or directing others to CSAM (including AI‑generated or manipulated content).
- Grooming, sexual solicitation, or romantic/sexual propositions involving a minor.
- Sexualized comments about minors; fetishization or normalization of sexual activity with minors.
- Instructions to find, trade, conceal, or produce CSAM; links to such material; or use of codewords to evade detection.
- Impersonation of a minor to facilitate exploitation; coercion, extortion, or threats (including “sextortion”).
Violations can lead to immediate content removal, account restriction or termination, reporting to authorities, and legal action where applicable. (Aligned with Google Play’s Child Safety Standards for social/dating apps.)
Safety by Design
- Access & Use: Strings Social is not intended for children below the legal minimum age in their jurisdiction. We do not knowingly permit accounts for users under that age. Minors must use the app only with appropriate consent and supervision as required by law.
- Discovery Controls: We restrict sexualized content in discovery and recommendation surfaces and prohibit romantic or sexual interactions involving minors.
- Detection & Review: We may use a combination of automated signals and human review to identify and address potential CSAE content or behavior. Automated systems supplement, but do not replace, user reporting.
- Rate‑limiting & Abuse Controls: We deploy anti‑spam and anti‑harassment measures to reduce unsolicited contact and abuse.
How to Report & What We Do
- In‑App Reporting: Use “Report” on profiles, chats, or posts to flag CSAE concerns. Provide as much detail as possible (who, what, where, when, links/screenshots).
- Email Escalation: For urgent concerns, contact our designated point of contact: info@exentrea.com.
- Internal Review: We promptly assess reports, remove illegal content, restrict accounts as needed, and preserve evidence as required by law.
- Law‑Enforcement Cooperation: We cooperate with competent authorities. If the matter concerns the United States or U.S. persons, we may report to the NCMEC CyberTipline. In India, users can also report via the National Cybercrime Reporting Portal.
References: Google Play Child Safety Standards; NCMEC CyberTipline guidance; India’s National Cybercrime Reporting Portal and POCSO framework.
Legal & Compliance
- Zero‑Tolerance & Reporting: We remove CSAE content and may report incidents to authorities as required by law and platform obligations (e.g., NCMEC in the U.S.).
- India: We recognize the Protection of Children from Sexual Offences Act, 2012 (POCSO) and related obligations. Users can also file complaints on the government’s cybercrime portal.
- Data Handling: We preserve and share limited data with authorities when legally required to investigate CSAE. We minimize, secure, and retain data only as necessary to meet legal obligations and platform safety needs.
- Jurisdiction: We comply with applicable laws based on user location and the nature of the report.
Guidance for Parents, Guardians & Users
- Do not share or forward suspected CSAM—report it immediately using in‑app tools or the email above.
- Discuss online safety with minors: avoid sharing personal images or locations, and never meet strangers alone.
- If you suspect immediate danger, contact local law enforcement first, then report in‑app.
Designated Contact
Email: info@exentrea.com
This contact is prepared to discuss our prevention practices and compliance processes regarding CSAE.
Updates to These Standards
We may update this page to reflect legal, safety, or operational changes. Material updates will be communicated in‑app or via the website.
Last updated: August 2025