Roblox, Fortnite, Minecraft and Steam have received notices from the eSafety Commissioner requiring them to explain how they are identifying, preventing and responding to serious online harms.
Concerns have been raised about these types of platforms being used as a point of first contact by sexual predators to groom children, or by extremists to spread violent propaganda and radicalise them.
"Gaming platforms are amongst the online spaces most heavily used by Australian children, functioning not only as places to play, but also as places to socialise and communicate," eSafety Commissioner Julie Inman Grant said.
"Predatory adults know this and target children through grooming or embedding terrorist and violent extremist narratives in gameplay, increasing the risks of contact offending, radicalisation and other off-platform harms."
The video game platforms face fines of up to $825,000 per day should they fail to comply with the commissioner's notice.
About nine in 10 Australian children between the ages of eight and 17 have played games online, according to the commissioner's research.
Roblox and Fortnite are among the most popular games for younger children, but each has been embroiled in various controversies.
Neo-Nazi, anti-Semitic and violent content has been found on Fortnite, according to the Global Project Against Hate and Extremism, including a map based on the Jasenovac concentration camp, where 100,000 people were killed during World War II.
Meanwhile, terrorist attacks and mass shootings have reportedly been recreated on Roblox.
Online services are required to implement processes to protect Australians from illegal and restricted material, including measures to address risks of grooming.
Roblox has pledged to make accounts belonging to children under 16 private by default, and to introduce tools preventing adults from contacting them without parental consent.
Steam is primarily a storefront for buying digital video games, but it also offers some social networking features.