DEX Screener To Review Moderation Policy Amid Flurry of Racist Memecoins

The platform said it won’t be a gatekeeper, but that it’s also not here to spread hate.

By: Pedro Solimano


A proliferation of racist tokens has pushed DEX Screener, a real-time DeFi token analytics platform, to review its token profile moderation policy.

“We'll be reviewing our token profile moderation policy in the coming days,” said the platform’s X account, adding that it, “won't be the gatekeepers of what happens on-chain, but we're definitely not here to spread hate.”

The announcement, made late Friday, March 22, came as a number of unsavory token profiles began popping up on the platform, most of them surfacing on Solana.

Some of these include derogatory terms and symbols based on race and religion, as well as other hateful language.

Permissionless for Good or Bad

Racist memecoins highlight an odd situation in the cryptocurrency space. For one, they expose an ugly underbelly of the ecosystem: people interested in creating and purchasing these types of tokens.

At the same time, they demonstrate the permissionless nature of open blockchains, and the fact that the technology gives developers and users the freedom to create anything they want, for good or bad.

Memecoins have taken the cryptocurrency ecosystem by storm in recent months.

Some, like DogWifHat and Book of Meme, have climbed the market cap ladder extremely quickly. The former trades at $2.21 with a $2.2 billion market cap, while BOME, which reached a $600 million market cap in less than two days, now sits at $800 million, trading at $0.01.