Reporting Incidents

How to report online safety issues


Written by Cyber Expert:

Taryn Wren, ICT Teacher

If your child is experiencing abuse or unwanted contact online, there are a number of in-platform controls available to help manage the situation. 

It is essential to report embarrassing, inappropriate, or hurtful posts or comments so they can be removed from social media as soon as possible. Reporting to the platform is generally the first step, but if you have ongoing concerns, or if the incident is serious (such as severe cyberbullying, image-based abuse, or material involving child exploitation), consider reporting it to the Office of the eSafety Commissioner (Australia) or Netsafe (New Zealand). In some cases, it may also be appropriate to seek assistance from your child's school or local law enforcement.

Reporting Incidents by Platform

Use the links below to report online safety incidents directly to the respective platforms.


Report content on TikTok

TikTok allows users to report people for cyberbullying, impersonation, and sharing inappropriate or illegal content. Users can block people from interacting with them through the platform and delete unwanted comments. Privacy controls can also be used to restrict who can view, comment on, and Duet a user’s posts.


Report content on Instagram

Instagram has a range of tools available to help users manage cyberbullying and other forms of unwanted contact or abuse. If your child is the target of unwanted contact on Instagram, try looking through the management tools available together, and discussing which management option your child is most comfortable with.


Report content on Snapchat

Snapchat allows users to report abuse, including harassment, bullying or any other safety concerns. Every report is reviewed by someone at Snapchat, usually within 24 hours.


Report content on Facebook

Facebook allows users to report comments, people, groups, advertisements, and more. Facebook will review and take action where an item breaches the platform’s community guidelines. A user can also block certain people from interacting with, or viewing their page or content.


Report content on Messenger

Messenger allows users to block people and delete messages or conversations. Users can also report people for abusive or inappropriate behaviour, or for spamming.


Report content on Skype

Skype allows users to block contacts and to report other users directly to Skype.


WhatsApp's Safety Guide

WhatsApp allows users to block and report other accounts. If a user violates the platform’s Terms of Service, then WhatsApp may ban their account.


Report content on YouTube

YouTube allows users to report inappropriate content, problematic search predictions, abuse, or other content that breaches their community guidelines. YouTube does not allow users to make comments on videos featuring children.


Report content on Fortnite

Fortnite allows users to report players for bad behavior. Fortnite reviews reports and takes action against players who have breached its code of conduct.


Minecraft Multiplayer Server Safety

Minecraft offers lots of options for controlling who can play and communicate with your child. If your child is being targeted on the platform, you can mute, block, and report the problematic player.


Report content on Roblox

Roblox allows users to block and report abuse and other rule violations. There are also in-platform controls that allow you to control who can contact your child (see our Roblox guide for more information).
