Child Safety Standards

Last Updated: 2025-08-09

Our Commitment to Child Safety

Coffee on Mars is committed to maintaining a safe environment for all users, with particular attention to protecting minors from child sexual abuse and exploitation (CSAE). We have implemented comprehensive policies, procedures, and technologies to prevent, detect, and respond to any content or behavior that could endanger children.

Age Requirements and Verification

Minimum Age

Coffee on Mars requires users to be at least 16 years old
Users aged 16 or 17 must have parental consent
We do not knowingly collect personal information from children under 16

Age Verification

Users must confirm their age during account registration
We employ detection systems to identify potentially underage users
Suspected underage accounts are immediately suspended pending verification
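
For illustration, the age gate described above can be reduced to a date comparison at registration time. The sketch below is a minimal example under that assumption; the function names, fields, and return values are hypothetical and do not reflect the actual Coffee on Mars implementation or its underage-detection systems.

```python
# Minimal sketch of the registration age gate: 16+ required, parental
# consent required for 16-17. Names and return values are illustrative only.
from datetime import date

MINIMUM_AGE = 16
ADULT_AGE = 18

def age_on(birth_date: date, today: date) -> int:
    """Completed years between birth_date and today."""
    years = today.year - birth_date.year
    # Subtract a year if this year's birthday has not happened yet.
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def registration_decision(birth_date: date, has_parental_consent: bool) -> str:
    age = age_on(birth_date, date.today())
    if age < MINIMUM_AGE:
        return "reject"                    # under 16: registration refused
    if age < ADULT_AGE and not has_parental_consent:
        return "pending_parental_consent"  # 16-17: consent collected first
    return "accept"
```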

Prohibited Content and Behavior

Zero Tolerance Policy

Coffee on Mars maintains a zero-tolerance policy regarding:
Child sexual abuse material (CSAM)
Child exploitation content in any form
Grooming behavior or attempts to exploit minors
Inappropriate communication directed at minors
Content that sexualizes, endangers, or exploits children

Specific Prohibitions

Users are strictly prohibited from:
Posting, sharing, or transmitting any content depicting minors in sexual or suggestive contexts
Engaging in conversations of a sexual nature with users known or suspected to be minors
Attempting to solicit personal information from minors for inappropriate purposes
Using the platform to arrange inappropriate meetings with minors
Sharing content that could be used to identify, locate, or contact minors inappropriately

Detection and Prevention Systems

Automated Detection

Advanced content scanning technology to detect potential CSAM
Behavioral analysis systems to identify grooming patterns
Machine learning algorithms trained to recognize suspicious activity
Regular updates to detection capabilities based on emerging threats
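
As a rough illustration of how the automated layer hands off to human moderation, the sketch below routes each upload by a risk score produced by scanning systems like those described above. The score thresholds, function names, and queue structure are hypothetical assumptions, not the platform's actual pipeline.

```python
# Illustrative moderation pipeline: automated scan first, trained moderators
# review anything the scan flags. Thresholds and names are placeholders.
from dataclasses import dataclass, field

@dataclass
class ReviewQueue:
    items: list[dict] = field(default_factory=list)

    def enqueue(self, item: dict) -> None:
        self.items.append(item)

def handle_upload(content_id: str, risk_score: float, queue: ReviewQueue) -> str:
    """Route an upload by its automated risk score (0.0 benign .. 1.0 certain violation)."""
    if risk_score >= 0.9:
        return "remove_and_escalate"   # obvious violations: removed immediately
    if risk_score >= 0.5:
        queue.enqueue({"content_id": content_id, "score": risk_score})
        return "pending_human_review"  # borderline content waits for a moderator
    return "published"
```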

Human Moderation

Trained content moderation team available 24/7
Specialized training in identifying child exploitation risks
Regular review of flagged content and user reports
Escalation procedures for serious safety concerns

Proactive Measures

Regular audits of user-generated content
Monitoring of private messaging for safety violations
Analysis of user interaction patterns for suspicious behavior
Collaboration with child safety organizations and law enforcement

Reporting and Response

User Reporting

Users can report concerning content or behavior through:
In-app reporting buttons on all posts and profiles
Direct messaging to our safety team
Email to our dedicated child safety address: woosung.app@gmail.com
Emergency hotline for immediate threats: +821027547570

Response Timeline

Immediate: Automated systems flag and remove obvious violations
Within 1 hour: Human review of all child safety reports
Within 24 hours: Investigation completed and appropriate action taken
Ongoing: Continuous monitoring of flagged accounts
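
The response timeline above amounts to two deadlines measured from the moment a report arrives. The sketch below checks those deadlines; it assumes UTC timestamps and uses hypothetical names, so treat it as an illustration of the policy rather than the platform's actual tooling.

```python
# Sketch of the review deadlines listed above (1 hour to human review,
# 24 hours to resolution). All names are illustrative.
from datetime import datetime, timedelta, timezone

HUMAN_REVIEW_SLA = timedelta(hours=1)
RESOLUTION_SLA = timedelta(hours=24)

def overdue_stages(received_at: datetime, now: datetime) -> list[str]:
    """Return the deadlines that have already passed for a report."""
    elapsed = now - received_at
    overdue = []
    if elapsed > HUMAN_REVIEW_SLA:
        overdue.append("human_review")
    if elapsed > RESOLUTION_SLA:
        overdue.append("resolution")
    return overdue

# Example: a report received two hours ago has missed the one-hour review window.
received = datetime.now(timezone.utc) - timedelta(hours=2)
print(overdue_stages(received, datetime.now(timezone.utc)))  # ['human_review']
```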

Actions Taken

Depending on the severity of violations, we may:
Immediately remove violating content
Suspend or permanently ban user accounts
Report incidents to the National Center for Missing & Exploited Children (NCMEC)
Cooperate fully with law enforcement investigations
Preserve evidence as required by law

Collaboration with Authorities

Law Enforcement Cooperation

Immediate reporting of suspected CSAM to appropriate authorities
Full cooperation with law enforcement investigations
Preservation of evidence as legally required
Participation in industry-wide child safety initiatives

Reporting to NCMEC

All instances of suspected CSAM are reported to NCMEC within 24 hours
Detailed reports include all available evidence and user information
Follow-up cooperation with NCMEC investigations
Regular communication regarding case status

International Cooperation

Collaboration with international child safety organizations
Compliance with child protection laws in all operating jurisdictions
Participation in global efforts to combat online child exploitation

Technology and Tools

PhotoDNA Integration

Microsoft PhotoDNA technology to detect known CSAM images
Regular updates to hash databases from law enforcement agencies
Automated blocking and reporting of matched content
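
PhotoDNA itself is a proprietary perceptual-hashing service, so the sketch below substitutes an ordinary cryptographic hash and an in-memory blocklist purely to illustrate the match, block, and report flow described above. It is not how PhotoDNA computes or compares hashes, and every name in it is a hypothetical placeholder.

```python
# Simplified stand-in for hash-based matching of known material.
import hashlib

# Hypothetical blocklist of digests supplied through authorised hash-sharing programmes.
KNOWN_HASHES: set[str] = set()

def scan_upload(image_bytes: bytes) -> str:
    """Block and escalate any upload whose digest matches the blocklist."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in KNOWN_HASHES:
        return "block_and_report"   # matched content is never shown to users
    return "allow"
```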

Advanced Analytics

Machine learning models to identify potential grooming behavior
Natural language processing to detect inappropriate communications
User behavior analysis to identify suspicious patterns
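
The analytics described above rely on trained models; as a heavily simplified illustration of the idea, the sketch below scores a message against a short list of risk phrases and decides whether it needs human review. The phrase list, thresholds, and return values are hypothetical placeholders, not the platform's actual classifiers.

```python
# Heavily simplified illustration of flagging messages for human review.
# Real systems use trained models; these phrases and thresholds are placeholders.
RISK_PHRASES = ("keep this a secret", "don't tell your parents", "how old are you")

def review_priority(message: str) -> str:
    text = message.lower()
    hits = sum(phrase in text for phrase in RISK_PHRASES)
    if hits >= 2:
        return "escalate_to_moderator"  # high-priority human review
    if hits == 1:
        return "queue_for_review"       # routine human review
    return "no_action"
```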

Regular Updates

Continuous improvement of detection capabilities
Integration of new child safety technologies as they become available
Regular security audits and penetration testing

User Education and Awareness

Safety Resources

In-app safety tips and guidelines for users
Educational content about recognizing and reporting suspicious behavior
Resources for parents about online safety
Regular safety reminders and updates

Community Guidelines

Clear communication of acceptable behavior standards
Regular updates to community guidelines based on emerging threats
User education about the importance of child safety

Staff Training and Protocols

Specialized Training

All staff receive comprehensive child safety training
Regular updates on new threats and detection methods
Trauma-informed training for content moderators
Legal compliance training regarding reporting requirements

Internal Protocols

Clear escalation procedures for child safety concerns
Regular team meetings to discuss safety improvements
Coordination between technical, legal, and safety teams
Documentation and record-keeping protocols

Privacy and Data Protection

Minimal Data Collection

Limited collection of personal information from users
Enhanced privacy protections for users under 18
Secure storage and handling of all user data
Regular deletion of unnecessary data

Data Security

Encryption of all user communications and data
Secure servers with restricted access
Regular security audits and vulnerability assessments
Incident response procedures for data breaches
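
For encryption at rest, one common approach is authenticated symmetric encryption applied before a record is written to storage. The sketch below uses the Fernet recipe from the Python cryptography package as an assumed example; Coffee on Mars's actual encryption stack and key management may differ.

```python
# Minimal sketch of encrypting user records at rest with Fernet
# (an assumed library choice; keys would live in a managed key store).
from cryptography.fernet import Fernet

def encrypt_record(plaintext: bytes, key: bytes) -> bytes:
    """Encrypt a record before writing it to storage."""
    return Fernet(key).encrypt(plaintext)

def decrypt_record(token: bytes, key: bytes) -> bytes:
    """Decrypt a stored record for an authorised, audited access."""
    return Fernet(key).decrypt(token)

key = Fernet.generate_key()
stored = encrypt_record(b"direct message body", key)
assert decrypt_record(stored, key) == b"direct message body"
```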

Transparency and Accountability

Regular Reporting

Annual transparency reports on child safety efforts
Statistics on content removed and accounts suspended
Information about cooperation with law enforcement
Updates on new safety initiatives and improvements

Third-Party Audits

Regular independent audits of safety procedures
Collaboration with child safety experts and organizations
Public accountability for child protection efforts

Contact Information

Child Safety Team

Email: woosung.app@gmail.com
Emergency Hotline: +821027547570
Mailing Address: 240, Olympic-ro, Songpa-gu, Seoul, Republic of Korea (05554)

Law Enforcement

24/7 Law Enforcement Contact: woosung.app@gmail.com
NCMEC Reports: Filed through CyberTipline as required

General Safety Concerns

Safety & Legal Team: woosung.app@gmail.com

Continuous Improvement

We are committed to continuously improving our child safety measures through:
Regular review and updates of our policies
Integration of new safety technologies
Collaboration with industry experts and organizations
Responsiveness to feedback from users and authorities
Adaptation to emerging threats and challenges

Legal Compliance

These safety standards comply with:
US federal laws including FOSTA-SESTA
International child protection regulations
Platform-specific safety requirements
Industry best practices and standards

If you suspect child exploitation or abuse, please report it immediately through our in-app reporting system or contact our child safety team directly. If someone is in immediate danger, contact local emergency services.

Coffee on Mars is committed to maintaining the highest standards of child safety and will continue to evolve our policies and procedures to protect all users, especially minors.