Cases of homophobia, sexism, and racism in League of Legends matches "have fallen to a combined 2 percent of all games," says Riot Games social designer Jeffrey Lin.

Riot has been working on curbing toxic behaviour in its immensely popular online game for years.

The developer implemented a "Tribunal" system wherein players could report abusive behaviour, and other players could judge each case and vote on its toxicity.
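Riot has never published the Tribunal's internals, so the following is only an illustrative sketch of the report-and-peer-vote loop described above; the class name, the five-vote quorum, and the simple majority rule are all hypothetical, not Riot's actual design.

```python
from dataclasses import dataclass, field

# Illustrative only: Riot has not published the Tribunal's internals,
# so every name and threshold below is hypothetical.

@dataclass
class TribunalCase:
    """A reported chat log awaiting peer review."""
    case_id: int
    chat_log: list[str]
    votes: list[bool] = field(default_factory=list)  # True means "punish"

    def cast_vote(self, punish: bool) -> None:
        """Record one reviewer's judgement on this case."""
        self.votes.append(punish)

    def verdict(self, quorum: int = 5) -> str | None:
        """Return a verdict once enough peers have voted, else None."""
        if len(self.votes) < quorum:
            return None
        punish_share = sum(self.votes) / len(self.votes)
        return "punish" if punish_share > 0.5 else "pardon"


# Example: five reviewers judge the same reported case.
case = TribunalCase(case_id=1, chat_log=["<reported chat lines>"])
for vote in (True, True, False, True, True):
    case.cast_vote(vote)
print(case.verdict())  # -> "punish"
```

The sketch only captures the loop Lin describes: reports accumulate into a case, other players judge it, and a verdict emerges from the aggregate vote.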

"The vast majority of online citizens were against hate speech of all kinds," Lin said.

One hundred million reports later, Riot used the Tribunal data to build a system that detects abusive behaviour and tailors punishments or incentives based on prior, similar cases.
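Lin does not explain how the detection or case-matching works, so the sketch below is only one hypothetical way "punishments based on prior, similar cases" could be approached: represent each reported chat log as bag-of-words counts and reuse the penalty from the most similar case players have already judged. The feature scheme and every function name are assumptions, not Riot's method.

```python
from collections import Counter
import math

# Hypothetical sketch only: Riot's real detection system is not public.
# Bag-of-words features and a nearest-case lookup are assumptions used
# to illustrate "punishments based on prior, similar cases".

def features(chat_log: list[str]) -> Counter:
    """Bag-of-words counts over a match's chat log."""
    return Counter(word.lower() for line in chat_log for word in line.split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(count * b[word] for word, count in a.items())
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def suggest_penalty(new_log: list[str], prior_cases: list[tuple[list[str], str]]) -> str:
    """Reuse the penalty from the most similar previously judged case."""
    new_vec = features(new_log)
    best_penalty, best_score = "no action", 0.0
    for old_log, penalty in prior_cases:
        score = cosine(new_vec, features(old_log))
        if score > best_score:
            best_penalty, best_score = penalty, score
    return best_penalty


# Example: a new report is matched against two prior Tribunal outcomes.
prior = [
    (["gg wp", "nice game"], "no action"),
    (["uninstall", "you are trash", "trash team"], "chat restriction"),
]
print(suggest_penalty(["trash team uninstall"], prior))  # -> "chat restriction"
```

A case-based lookup of this kind keeps each penalty tied to a precedent that human reviewers already judged, which matches the spirit of Lin's description without claiming to be Riot's implementation.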

As a result, "verbal abuse has dropped by more than 40 percent, and 91.6 percent of negative players change their act and never commit another offence after just one reported penalty."

Lin concluded, "Is it our responsibility to make online society a better place? Of course it is, for all of us. [...] We are at a pivotal point in the timeline of online platforms and societies, and it is time to make a difference."