Hi, I'm here to share a Toxicity Limiter Module I made.
You simply give it some text and it returns a percentage of how toxic it is.
To help reduce toxicity in the Roblox community, please share this post so more people are aware of the module and can use it.
Suggestion: try the game to see how you like it: https://www.roblox.com/games/6904733868/Module-Tester
Note: you need to enable HTTP requests (HttpService) in your game settings and have this at the top of your code:
local HttpService = game:GetService("HttpService")
Sample Code
local HttpService = game:GetService("HttpService")
local module = require(6904706913) -- require the module by its asset ID
print("Toxicity of the Word Module: " .. module.analyze("Module"))
Running that sample code prints a low toxicity percentage for the word "Module" (around 7.65, as shown in the full analysis below).
To use it, require the module:
local HttpService = game:GetService("HttpService")
local module = require(6904706913)
And to analyze your text, use:
module.analyze("Text To Analyze Here")
Analyzing text returns a percentage value out of 100, and you can decide what to do based on it. Say a message comes back 83% toxic: I might delete the message and warn the person.
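For example, here is a minimal sketch of acting on the score from a server Script. It assumes analyze returns a number from 0 to 100; the threshold of 80 and the warn-only handling are arbitrary choices for illustration.

local HttpService = game:GetService("HttpService") -- HTTP requests must be enabled
local Players = game:GetService("Players")
local module = require(6904706913)

local TOXICITY_THRESHOLD = 80 -- arbitrary cutoff chosen for this example

Players.PlayerAdded:Connect(function(player)
    player.Chatted:Connect(function(message)
        local toxicity = module.analyze(message)
        if toxicity >= TOXICITY_THRESHOLD then
            -- Handle it however you like; here we only warn in the output
            warn(player.Name .. " sent a message that is " .. toxicity .. "% toxic")
        end
    end)
end)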
If I were to analyze something more toxic, like "You're Trash", you can see that it returns a toxicity percentage of 88%.
I had it print the value out, but you could store it instead.
Note: just because a message has a toxicity percentage doesn't mean it is toxic. Treat it as toxic only when the value gets close to 100.
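If you'd rather store the values than print them, a minimal sketch might look like this (the scores table and recordScore helper are hypothetical names used only for illustration):

local module = require(6904706913)

local scores = {} -- keeps every analyzed message alongside its score
local function recordScore(text)
    local toxicity = module.analyze(text)
    scores[#scores + 1] = { text = text, toxicity = toxicity }
    return toxicity
end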
To find the test game, see the link at the top.
Thanks for reading! If you find any bugs, or if you want to show what you made with it, write it in the comments!
Any errors accessing the module should be fixed now.
New Feature
Running:
local result = module.fullanalyze("Free You Know What at This Spam Site")
print(result.toxic)
^ Running this code with the word "module" as the text instead would print 7.652474.
fullanalyze returns the spamminess, identity_attack, insult, and toxicity levels of the message.
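To see every score at once, you can loop over the returned table. Only result.toxic is shown above, so this sketch just prints whatever fields the module actually returns rather than guessing the other field names:

local module = require(6904706913)

local result = module.fullanalyze("Free You Know What at This Spam Site")
-- Print every score the module returned (toxic, plus the spam,
-- identity_attack, and insult levels mentioned above)
for name, score in pairs(result) do
    print(name .. ": " .. score)
end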
Edit: added identity_attack and insult levels.