In the United States, gaming is a $35.5 billion industry, and millions of people play online multiplayer games. Current, gaming-specific survey data show harassment is widespread in these spaces: about 83% of U.S. adult online multiplayer gamers report harassment in the past six months; roughly 60% encountered identity-based harassment, and 27% experienced severe forms (e.g., physical threats, stalking, doxxing, or swatting). Platform transparency corroborates the scale of enforcement: Xbox’s semiannual Transparency Reports detail millions of actions per period with harassment/abuse among top categories, and Discord’s Transparency Hub similarly shows harassment as a significant share of user reports and enforcements. Ofcom’s Online Nation highlights trolling/harassment as a persistent interactive harm internationally.
In addition to the general allure of game play and competition, online gaming platforms also establish connections between people, allowing gamers to strike up conversations with friends or complete strangers and to build what can turn out to be long-lasting relationships. However, the ease of connecting socially over a simple internet connection also creates an avenue for negative social interactions, which can escalate into online harassment. Independent research shows that not only adults but also teens are at risk: ADL’s latest waves indicate a majority of teen players report harassment in online games, underscoring the need for strong youth safeguards.
Online harassment is as old as online gaming itself. When things get competitive, it is easy for people to lash out at others while hiding behind their screens. The figures above come from the ADL’s annual surveys, which also find that targets frequently report attacks tied to gender, race/ethnicity, religion, or sexual orientation, and that competitive, team-based titles such as first-person shooters and MOBAs have historically been among the most reported for abusive behavior.
What online harassment looks like
Online harassment can include a variety of damaging behaviors such as threats, hateful messages, defamation, and distributed denial-of-service (DDoS) attacks. Other tactics include dogpiling/brigading, doxxing (publishing personal information), and image-based abuse. The latest gaming-specific research indicates these experiences are common and consequential: a sizeable minority of players face severe harassment (e.g., sustained abuse, physical threats, stalking, doxxing, or swatting), prompting many to self-censor or avoid voice chat and specific titles.
According to the ADL study, severe harassment in games encompasses behaviors like doxxing and swatting. Doxxing entails divulging personal information about someone without their consent, such as a home address, telephone number or a gamer’s real name. This exposure can enable further targeting by others who come across that information and is often paired with sustained harassment across platforms.
Once this information is online, the harassment can then escalate to swatting. Swatting occurs when an ill-intentioned gamer reaches out to emergency services and law enforcement agencies and asks that they respond to a fake emergency at the victim’s house. When law enforcement enters what they believe to be a dangerous situation, such as the fallout of a shooting or a hostage situation, people can get hurt in the confusion.
When harassment transcends the boundaries of the internet, there are far-reaching consequences. To prevent online harassment from escalating, and to keep online gaming fun for the entire community, it is important for gamers to report instances of harassment early to the game’s publisher or the platform provider. Companies need to know about an incident before they can step in and remedy the situation. Evidence-led reporting and block/mute tools are now core parts of safety design; regulators such as Ofcom outline best practices for low-friction, in-product reporting and comprehensive user controls in their Online Safety codes. Platforms also publish transparency reports detailing time-to-action and policy enforcement volumes (see Xbox and Discord), and many use graduated penalties (e.g., strikes, communication restrictions, and bans) to reduce recidivism.
How companies are addressing harassment
Most large tech and gaming companies are aware of the possibility (if not prevalence) of online harassment happening on their platforms. And many of these companies have put measures in place to help players report online harassment or digital abuse. Across the industry, anti-harassment measures increasingly include AI-assisted voice safety, streamlined report/block/mute flows with evidence capture, tiered enforcement with user feedback, and regular transparency reporting. Examples include Xbox’s platform-level voice reporting (short clip capture for moderation), Call of Duty’s global rollout of AI-enabled voice chat moderation, and VALORANT’s continuing voice evaluation for disruptive behavior.
Console gaming giants Xbox (owned by Microsoft) and PlayStation (Sony) each have a clearly defined code of conduct outlining the boundaries of behavior accepted on their platforms. Microsoft’s Community Standards and Sony’s Community Code of Conduct leave very little room for misunderstanding where the line between acceptable and unacceptable online behavior lies. Both companies now also publish regular transparency reports with safety metrics so players can see how policies are enforced (Xbox; PlayStation).
On top of these overarching guidelines, Xbox and PlayStation have created clear, step-by-step procedures for how to block players, file complaints, and report incidents of online harassment. Many ecosystems also support evidence-led reporting (e.g., voice clip capture on Xbox) and use structured enforcement (e.g., strikes, temporary communication restrictions, and account actions) documented in their transparency reports.
Of course, the options of blocking other players or filing reports of online harassment are not limited to the world of console gaming. Just about every online gaming platform has similar options available to players. For example, the online gaming platform Steam has posted rules and guidelines and a Steam Online Conduct guide, both of which clearly outline what is considered inappropriate behavior that will not be tolerated. Steam users are also encouraged to file a complaint should they experience harassment or come across inappropriate content. Transparency reporting is increasingly standard across large networks (for example, Roblox), helping players understand how harassment and hate policies are enforced.
Harassment and defamation are not the only things gamers need to report. Gaming platforms encourage people to report a wide range of incidents. Some of these include (but are not limited to) cheating, posting spam, posting inappropriate or offensive content, or using inappropriate profile pictures and player names. Many services now also surface dedicated categories for identity-based abuse, doxxing or sharing private information, and threats, improving triage and time-to-action in moderation pipelines (see Discord’s Transparency Hub and Xbox Transparency Report).
What to do when targeted
In most cases, dealing with online harassment isn’t a matter of toughing it out; being proactive makes it more likely the behavior will stop. Best-practice guidance from regulators and safety authorities emphasizes in-product reporting, comprehensive block/mute controls, and timely moderation feedback (Ofcom; Australia’s eSafety: Safety by Design – Gaming), and transparency reports from major platforms track report-to-action and time-to-action metrics, underscoring why reporting with evidence matters (Discord; Twitch). Luckily, there are several steps you or your child can take to deal with online harassment. Let’s look at a few:
- Step 1: Document and gather evidence. Gathering evidence in the form of digital records is incredibly important if you are being harassed; it is crucial if you later need to take criminal or legal action against your aggressor. Since online harassment can cross into real life, you need proof of everything that has happened in the digital world. Document all of your interactions with the person harassing you: take screenshots and capture timestamps whenever you can. (Tip: Screenshots often save with a file name that doubles as a timestamp.) Where available, use platform features that preserve context; for example, Xbox’s voice reporting allows submitting a short clip of offending voice chat with your report.
- Step 2: Tell them to stop. As a one-time action, tell the person they need to stop harassing you, then disengage. Don’t initiate a dialogue with the harasser; engaging can lead to escalation.
- Step 3: Try not to take it personally. Regardless of the level of harassment, remember that someone else is behaving badly, and their words and actions are not a reflection on you. Try to emotionally distance yourself from the scenario. In a nutshell, don’t let them push your buttons. By refusing to engage you are taking away their power over you.
- Step 4: Block and report them. Most games provide players with the option to block and/or report other players. By blocking a player, you can ensure you won’t be paired up with them again, and hopefully the harassment will end there. If the harassment is more serious in nature, you should report the player as well. Filing a formal complaint might mean the aggressor is completely banned from the game as a consequence. High-quality implementations apply blocks across DMs, invites, voice, and text, and pair them with triaged reporting that speeds review for threats and sustained abuse (Ofcom; eSafety). Transparency reports show that while many enforcements are proactive, user reports remain pivotal for context-heavy harassment (Discord; Twitch).
- Step 5: Speak to the other players. It takes a village. Speak to the other players in your game about the player targeting you; chances are some of them have had similar experiences. The chances of having an aggressor banned from the game are much higher if multiple complaints are filed against them.
- Step 6: Reach out to the game moderators. Game moderators can act as both the police and peacekeepers. If the game you are playing has a moderator, you can reach out to them and make them aware that another player is harassing you.
- Step 7: Know your legal rights. Stay prepared by knowing your rights if you encounter online harassment. Carla Franklin, a cyber abuse expert, survivor and advocate for victims, offers this tip on her website: “Even if you have a lawyer or the police involved in your situation, you are your own best advocate and know your case better than anyone.” We’ve provided a list of resources below. Depending on where you live, additional avenues may be available: the EU’s Digital Services Act requires notice-and-action, statements of reasons, and appeals; the UK brought new intimate-image abuse offences (including deepfakes and cyberflashing) into force; Australia’s eSafety scheme provides a fast administrative pathway for removal of seriously harmful content via the Adult Cyber Abuse program. In the U.S., platform liability remains largely shielded by Section 230; remedies typically rely on state criminal/civil laws (e.g., Massachusetts criminalized non-consensual explicit images, sextortion, and sexual deepfakes, adding civil remedies: state law update; state deepfake laws tracked by NCSL).
- Step 8: Contact the police. If the person harassing you makes threats or does something that is illegal in the real world, you should contact the police. If another player threatens your life or threatens to harm you, it is time to get law enforcement involved.
- Step 9: Step up your internet security. Some people may be more tech-savvy than others in gaining information on you and your online activities. They may go so far as to try to access your email or social media accounts. But you can make it harder for others to engage in doxxing by tightening up your online security. Enable two-factor authentication and use a strong, unique password for each account. Have a good look at how much information you have posted about yourself and your whereabouts online. It may not seem important to you at the time you post something, but for an aggressor who is gathering sensitive and personal information about you, your social media accounts can be a goldmine. If intimate images are involved and you are a minor or young adult, consider hash-based takedown tools such as NCMEC’s Take It Down.
- Step 10: Reach out for support. As with any other traumatic experience in life, you should reach out to your support network and possibly even a licensed mental health professional for help. Being the target of online harassment is something you shouldn’t have to deal with by yourself. Samantha Silverberg, co-founder of the Online SOS Network, told Psychology Today, “People don’t realize how big of an issue it is from a mental health perspective. It’s really hard to quantify what’s happening.” Online harassment is associated with higher symptoms of depression and anxiety and sleep disturbance, especially for adolescents, according to the U.S. Surgeon General and the American Psychological Association; nationally, about 41% of U.S. adults report experiencing online harassment, with roughly a quarter reporting severe forms. If you are in acute emotional crisis in the U.S., you can contact the 988 Suicide & Crisis Lifeline (phone, chat, or text) for 24/7 support.
- Step 11: Look out for others. Anyone is at risk of becoming the victim of online harassment. Don’t be a bystander if you see someone harassing another player. The gaming community is meant to be a safe space for everyone, and identifying and dealing with a troll often needs to be a collective effort. Emily May, the co-founder and executive director of Hollaback! and HeartMob, which supports people who are being harassed online, reported to Vice, “We have to depend on one another and work together to change the culture that makes online harassment acceptable.”
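For technically inclined readers, the evidence-gathering advice above can be partly automated. The short Python sketch below is a minimal illustration, not a forensic tool (the file and log names are placeholders): it records a SHA-256 hash and a UTC timestamp for each screenshot, which can help you show later that the files were not altered after capture.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def record_evidence(paths, log_file="evidence_log.json"):
    """Record a SHA-256 hash and UTC timestamp for each file in `paths`,
    appending the results to a JSON log you can keep with your screenshots."""
    entries = []
    for p in map(Path, paths):
        # Hash the file contents so any later modification is detectable.
        digest = hashlib.sha256(p.read_bytes()).hexdigest()
        entries.append({
            "file": p.name,
            "sha256": digest,
            "recorded_utc": datetime.now(timezone.utc).isoformat(),
        })
    Path(log_file).write_text(json.dumps(entries, indent=2))
    return entries
```

Keeping the log alongside the original files (and a backup copy elsewhere) preserves a simple chain of evidence you can hand to a platform, a lawyer, or law enforcement.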
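Likewise, the advice above about strong, unique passwords can be put into practice programmatically. This is a minimal sketch using Python’s standard `secrets` module (a cryptographically secure random source); in everyday use, a reputable password manager does the same job for you.

```python
import secrets
import string

def strong_password(length=16):
    """Generate a random password of the given length from letters,
    digits, and punctuation, using a cryptographically secure source."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))
```

Because `secrets` draws from the operating system’s secure random source, the result is suitable for account credentials, unlike the general-purpose `random` module.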
Resources
Because online harassment is such a big issue, there has been an increase in people and groups advocating for the victims of online harassment. These individuals, support groups and nonprofit organizations aim to end all online harassment, and have made many resources on the topic readily available. For immediate emotional crisis support in the U.S., contact the 988 Suicide & Crisis Lifeline. If sexual assault or sexual harassment is involved, RAINN’s National Sexual Assault Hotline is available 24/7. For intimate partner harassment or stalking, see The National Domestic Violence Hotline or the StrongHearts Native Helpline. For workplace issues, the EEOC explains your rights and how to file. If you need confidential navigation and referrals, the VictimConnect Resource Center can help. For non-consensual intimate images involving minors/young adults, consider NCMEC’s Take It Down.
Here are a few resources you can turn to if you, or someone you know, is being harassed online:
- HeartMob: A nonprofit platform powered by activists who aim to end online harassment. It provides real-time support for anyone who experiences online harassment and offers a long list of resources, including helpful guides and more information on the topic of harassment.
- Crash Override Network: A crisis helpline, advocacy group and resource center for people who are experiencing online abuse. The group has made various resources available, including educational materials, referrals, guides and interactive tools, which people can use to inform and educate themselves on online harassment.
- Cybersmile Foundation: A nonprofit organization working toward ending online harassment and cyberbullying, and promoting kindness, diversity and inclusion online. It offers several articles and guides aimed specifically at the gaming community.
- International Game Developers Association (IGDA): A nonprofit organization with extensive resources covering how to respond to online harassment.
- Online SOS: A platform providing information and tools to help empower people dealing with online harassment. The nonprofit also offers services such as crisis coaching and referrals to experts.
- Feminist Frequency: A nonprofit educational organization that has put together an extensive guide on how to protect yourself from online harassment.
- Data & Society Research Institute: This nonprofit research organization has compiled a detailed report on online harassment, digital abuse and cyberstalking in America.
The bottom line
The majority of people looking to online gaming as a source of entertainment want to join a community of people who have one thing in common: a love for games. Unfortunately, not everyone is doing their part to create a safe and positive online gaming culture.
The vast majority of online gamers in the U.S. (about 83% of adult multiplayer players in recent surveys) have experienced some form of online harassment, and often the harassment does not simply end when a player logs off. In recent waves, roughly 60% encountered identity-based abuse and 27% faced severe forms that include threats, stalking, doxxing, and swatting. These experiences are linked with stress and mental health impacts for many targets (Surgeon General; APA; ADL). Luckily, there are many steps a victim of online harassment can take to empower and protect themselves, and many resources are available to help.
Take control of your gaming experience and do your part to spot and report online harassment early, to help ensure a safe and fun gaming community for everyone.