Effective Date: December 30, 2024
Last Updated: December 30, 2024
Introduction
At ViewMe Global, we are committed to ensuring the safety and well-being of children who use our application. This Child Safety Standards Policy outlines our commitment to comply with regulations such as the Children's Online Privacy Protection Act (COPPA) in the United States and similar laws globally, as well as Google Play's Child Safety Standards. This policy applies to all aspects of our app, from design and development to operation and maintenance.
1. Age Appropriateness
- Target Audience: Our app clearly specifies its target audience in its Google Play Store and iOS App Store descriptions. If the app is designed for children or may appeal to them, we adhere to the Google Play Families policies.
- Content: All content within the app is appropriate for its target age group. We ensure no adult themes, including excessive violence, gore, or encouragement of harmful activities, are included.
2. Data Privacy and Protection
- COPPA Compliance: For users under 13 in the U.S., we adhere to COPPA by obtaining verifiable parental consent before collecting, using, or disclosing personal information from children.
- Information Collection: We collect only what is necessary for the app's functionality, such as basic user interaction data for improving the user experience. Sensitive data such as location or contact details is collected only with explicit parental consent.
- Security Measures: We implement robust security measures, such as encryption, to protect children's data from unauthorized access or breaches.
3. User Interaction and Safety Features
- Parental Controls: Features allowing parents to monitor and manage their child's activity within the app.
- Communication Safety: For apps with chat or social features, we ensure:
  - No direct image sharing; only pre-approved, safe communication options such as stickers or emojis are available.
  - Monitoring and moderation of group chats to ensure safety.
  - Features for users to report inappropriate behavior directly within the app.
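As a simplified illustration of the pre-approved-communication rule above, a message filter can reject everything except content drawn from an allow-list. This is a hedged sketch, not ViewMe Global's actual implementation; the sticker identifiers and function names are hypothetical.

```python
# Hypothetical allow-list filter: children's chats may only contain
# pre-approved stickers and emojis; images, links, and other content
# kinds are rejected outright. All identifiers are illustrative.
APPROVED_STICKERS = {"sticker:wave", "sticker:thumbs_up", "sticker:star"}
APPROVED_EMOJIS = {"😀", "🎉", "👍"}

def is_allowed_message(kind: str, payload: str) -> bool:
    """Return True only for pre-approved sticker or emoji content."""
    if kind == "sticker":
        return payload in APPROVED_STICKERS
    if kind == "emoji":
        return payload in APPROVED_EMOJIS
    return False  # anything else (images, files, links) is blocked
```

The deny-by-default structure matters: new message kinds are blocked until explicitly reviewed and added to an allow-list.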
4. Safety Against Exploitation
- Child Sexual Abuse Material (CSAM): We maintain strict policies against the creation, distribution, or storage of any CSAM. We report any detected CSAM to the appropriate authorities, such as the National Center for Missing & Exploited Children (NCMEC).
- In-App Reporting: Users can report any concerns through an in-app mechanism without leaving the app, ensuring immediate action can be taken.
- Safety Standards Compliance: We self-certify compliance with Google Play's Child Safety Standards policy through the Play Console before app publication, ensuring all required standards are met.
5. Transparency and Education
- Educational Content: We provide resources or links to educational materials on internet safety for both children and parents, within the app or through our website.
- User Guidance: Instructions on how to use parental controls, manage privacy settings, and understand the app's safety features are clearly accessible.
6. Continuous Improvement and Accountability
- Feedback Mechanism: We welcome feedback from users, especially parents, to continually improve our safety measures.
- Regular Audits: We conduct internal and external audits to ensure compliance with this policy and with evolving legal standards.
- Policy Updates: This policy will be reviewed and updated periodically, or as necessary in response to new legislation or safety findings.
Conclusion
This Child Safety Standards Policy reflects ViewMe Global's dedication to child safety in the digital space. By following these guidelines, we aim to provide a safe, enjoyable, and educational environment for our youngest users, and peace of mind for their guardians.
Effective Date: This policy is effective from December 30, 2024 and will be updated as needed to reflect changes in our practices, technology, or regulatory requirements.
ViewMe Global Child Sexual Abuse and Exploitation (CSAE) Policy
Introduction: At ViewMe Global, we are deeply committed to protecting children from sexual abuse and exploitation. Our platform is designed to be a safe space for all users, with special emphasis on safeguarding minors. This policy outlines our stringent measures against Child Sexual Abuse and Exploitation (CSAE).
1. Zero Tolerance for CSAE Content
- Prohibition: Any form of child sexual abuse material (CSAM), including images, videos, or text that depicts, promotes, or solicits child sexual abuse, is strictly prohibited on ViewMe Global.
- Definition: CSAE includes, but is not limited to:
  - Child pornography or any visual depiction of minors in sexually explicit conduct.
  - Grooming or soliciting minors for sexual purposes.
  - Live-streaming of abuse or exploitation.
  - Sharing or distribution of self-generated explicit content by minors under coercion.
2. Detection and Prevention
- Automated Tools: We employ advanced machine learning and AI technologies to detect and flag potentially abusive content or behavior, including hash-matching systems that identify known CSAM.
- Human Moderation: In addition to automated systems, a team of trained content moderators reviews flagged content and user reports.
- User Reporting: Users can report suspected CSAE content or behavior directly through our in-app reporting feature, and such reports are prioritized for urgent review.
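Production hash-matching relies on industry tools such as PhotoDNA, whose perceptual hashes survive resizing and re-encoding. As a simplified, hypothetical sketch of the general approach (not ViewMe Global's actual pipeline), an upload screen can compare a file's digest against a curated set of known-bad hashes supplied by a vetted clearinghouse:

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Hex SHA-256 digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_csam(data: bytes, known_hashes: set[str]) -> bool:
    """Exact-match check against a curated set of known-bad digests.

    Note: an exact cryptographic hash only catches byte-identical
    copies; real deployments use perceptual hashes for robustness.
    """
    return sha256_digest(data) in known_hashes

def screen_upload(data: bytes, known_hashes: set[str]) -> str:
    """Route an upload: block and report known CSAM, otherwise allow."""
    if is_known_csam(data, known_hashes):
        # Trigger removal, account suspension, and a report to NCMEC.
        return "blocked_and_reported"
    return "allowed"
```

Matching happens before the content is ever served to other users, so known material never appears on the platform.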
3. Response to CSAE
- Immediate Action: Upon detection or report of CSAE:
  - The content is immediately removed from ViewMe Global.
  - The account associated with the content is suspended or banned.
  - Reports are forwarded to the National Center for Missing & Exploited Children (NCMEC) via the CyberTipline, or to equivalent international bodies outside the U.S.
- Preservation of Evidence: We ensure that any CSAE content detected is preserved for law enforcement investigation, while maintaining user privacy for non-CSAE-related data.
4. User Safety Features
- Age Verification: We implement age verification mechanisms to restrict access for users under the age of 13 without parental consent, in line with COPPA regulations.
- Privacy Controls: Enhanced privacy settings for minors, limiting their visibility to and interaction with unknown users.
- Parental Controls: Features allowing parents to monitor and control their child's activity on ViewMe Global, including friend requests, content sharing, and message visibility.
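The age-verification gate described above can be sketched as follows. This is a minimal illustration under stated assumptions (function names are hypothetical, and how parental consent is verified is out of scope), not ViewMe Global's actual implementation:

```python
from datetime import date

COPPA_AGE = 13  # U.S. COPPA threshold for verifiable parental consent

def age_on(birthdate: date, today: date) -> int:
    """Whole years elapsed between birthdate and today."""
    years = today.year - birthdate.year
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1  # birthday hasn't occurred yet this year
    return years

def may_register(birthdate: date, today: date,
                 has_parental_consent: bool) -> bool:
    """Under-13 users may only proceed with verifiable parental consent."""
    if age_on(birthdate, today) < COPPA_AGE:
        return has_parental_consent
    return True
```

Self-reported birthdates are easy to falsify, which is why this gate is one layer among several rather than a safeguard on its own.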
5. Education and Awareness
- User Education: We provide resources on digital safety, how to recognize grooming behaviors, and how to report inappropriate activities.
- Community Guidelines: Clear communication of our stance against CSAE within our community guidelines, which all users must agree to upon registration.
6. Collaboration and Compliance
- Partnerships: We actively collaborate with:
  - Law enforcement agencies, to facilitate investigations.
  - NGOs and organizations such as the WePROTECT Global Alliance, for ongoing education and policy refinement.
  - Other tech companies, to share best practices and tools for combating CSAE.
- Legal Compliance: We ensure compliance with all applicable laws, including:
  - The Children’s Online Privacy Protection Act (COPPA) for users in the U.S.
  - Regional laws concerning child safety and data protection.
7. Continuous Improvement
- Feedback and Adaptation: We regularly seek feedback from users, child safety organizations, and law enforcement to update and improve our policies and technologies.
- Policy Review: This policy will be reviewed at least annually, or in response to significant changes in technology, law, or community feedback.
8. Contact Information
- Child Safety Officer: Jonathan Alerte, Director of Operations, at jonathan@velocegallery.com, serves as the point of contact for all CSAE-related queries or reports from external bodies.
Conclusion: At ViewMe Global, our priority is to create a secure digital environment for all users, especially children. We encourage all users to report content or behavior that violates our CSAE policy. Together, we can work towards a safer online community.