1. Purpose and Scope
OceanMingle Pty Ltd (ACN 689 562 909) (we, us or our) is committed to maintaining standards of child safety on our platform. This policy sets out the steps we have taken or intend to take to prevent, detect, and respond to child sexual abuse and exploitation and child sexual abuse material (CSAM) on our platform.
This Child Safety Standards Policy (Policy) applies to all our staff, contractors, and agents, and all users of our platform.
For the purposes of this Policy, a child means a person under the age of 18 years (and children has the corresponding meaning).
2. Our Commitment to Child Safety
We are committed to:
- restricting our platform to users aged 18 years and over;
- responding promptly and appropriately to any suspected or actual incidents of child abuse or exploitation;
- implementing processes to prevent, detect and remove CSAM from our platform;
- complying with all applicable child safety laws and regulations, including the Online Safety Act 2021 (Cth), as relevant to our operations; and
- cooperating fully with law enforcement agencies in relation to child safety matters.
3. Child Sexual Abuse and Exploitation and CSAM
We have zero tolerance for child sexual abuse and exploitation and CSAM on our platform. We prohibit:
- uploading, storing, or sharing CSAM;
- providing links to or encouraging users to access CSAM hosted elsewhere;
- any conduct that sexualises children; and
- any content depicting or promoting child sexual abuse.
4. Age Restrictions and Verification
Our platform is restricted to users aged 18 years and over. We require users to confirm they are 18 years of age or older during account set up by pressing a confirmation button and entering their date of birth. Users who provide a date of birth indicating they are under 18 years of age will be prevented from creating an account.
We reserve the right to request additional age verification, including photo ID, from any user at any time if:
- we suspect the user may be under 18 years of age;
- the user's account has been flagged through our monitoring systems; or
- we receive reports suggesting the user may be a child.
Where additional verification is requested, we may:
- suspend the user's account until verification is provided; or
- suspend or terminate the user's account if they fail to provide satisfactory identification.
If we verify that a user is a child, we will terminate their account, or suspend their account until they turn 18 years of age.
Any identity documents provided for verification purposes will be used solely for the purpose of age verification, deleted after verification is complete, and handled in accordance with privacy laws.
5. Detection and Removal
We employ a range of technological solutions to identify and remove:
- underage users;
- CSAM;
- grooming behaviour or other conduct that may indicate child exploitation; and
- any other conduct that poses a risk to child safety.
Our system incorporates the following tools:
- age confirmation at account creation, restricting platform access to users who are 18 years of age or older;
- accessible in-app reporting mechanisms, including reporting functionality on each user profile; and
- automatic flagging and content removal for reports pending manual review.
We reserve the right to suspend users or remove content that we determine to be in contravention of this policy or applicable child safety laws. We will notify affected users through our internal moderator account, explaining the reason for the action.
6. Community Reports
We recognise that our community plays a vital role in maintaining child safety. We encourage users to report any concerns about potential underage users or child exploitation through our in-app reporting mechanism.
We provide an accessible in-app mechanism that allows users to:
- report or block users or profiles;
- report suspected underage users;
- report CSAM or suspected CSAM;
- report grooming behaviour or suspected child exploitation;
- provide feedback on child safety concerns; and
- provide any other feedback in relation to the platform.
When a user or profile is reported, the profile is automatically flagged and their user content is hidden from public view. Both the Child Safety Point of Contact and our support team (support@oceanmingle.com) will receive an alert with the relevant details. We will then:
- promptly review the report;
- assess the report for severity and appropriate action; and
- document and track our resolution process.
7. Response and Investigation Procedures
Where we discover or receive a report of a suspected underage user or a child safety incident, we will take the following steps:
- Suspension: suspend the relevant account(s) pending investigation;
- Removal: remove or block any CSAM or other harmful content immediately;
- Investigation: conduct an internal investigation to assess whether the user is underage, or the content or conduct constitutes a child safety incident (as applicable);
- Determination: following our investigation, we will determine appropriate further action, including legal review; and
- Record Keeping: we will maintain records of all child safety incidents, investigations, actions taken and reports made to authorities.
Where appropriate, we will escalate the matter to our legal team to:
- assess the nature and severity of the incident;
- determine our obligations under applicable laws; and
- identify whether the matter must be reported to any applicable authorities, including:
- the relevant state or territory police body;
- the Australian Centre To Counter Child Exploitation; and
- the Office of the eSafety Commissioner.
Where we report a matter to authorities, we will provide our assistance and cooperate fully with any investigation or proceedings, including by providing relevant information, evidence and documentation as required by law.
9. Roles and Responsibilities
The management team will:
- ensure effective child safety and wellbeing governance, policies, procedures, codes, and practices are in place and followed;
- promote regular open discussion on child safety issues within the organisation, including at management meetings and staff meetings;
- facilitate regular professional learning for staff, contractors, and agents to build deeper understanding of child safety, cultural safety, child wellbeing, and abuse prevention;
- create an environment where child safety complaints and concerns are readily raised, and no one is discouraged from reporting an allegation of child abuse to relevant authorities; and
- oversee the development and implementation of robust online safety features and AI moderation tools to protect children using our platform.
All staff, contractors, and agents must:
- always comply with this Policy and any of our other policies and procedures relating to child safety;
- understand and comply with their child safety and legal obligations;
- immediately report any suspected child safety incidents to their direct supervisor or the Child Safety Point of Contact below;
- participate in child safety training, as directed by us;
- contribute to the ongoing development and improvement of our online safety features and moderation tools; and
- cooperate with internal and external investigations.
The Child Safety Point of Contact will:
- act as the primary point of contact for all child safety concerns or inquiries;
- coordinate responses to child safety incidents;
- ensure all staff are aware of their reporting obligations and support them in making reports when necessary;
- maintain accurate records of child safety concerns and incidents;
- liaise with relevant authorities when required;
- provide regular updates to the management team on child safety matters; and
- lead the review of this Policy and any other child safety policies and procedures.
12. Child Safety Point of Contact
Our designated Child Safety Point of Contact is:
Name: Xavier Avogniko
Policy Last Updated: December 2025
Policy Last Reviewed: December 2025
Policy Due For Review: December 2026