Hello! I’m using https://github.com/Kironte/Roblox-Neural-Network-Library/ to try to create a neural network, for practice. What I’m trying to do right now is have the NN “guess” the average of two numbers. It works if I limit the inputs to numbers between 1 and 10, but anything beyond that and things get messed up.
I’m dividing the numbers by 10 so they fall into the 0-1 range the network expects. The average is also stored as a 0-1 number, and I just multiply it by 10 to get the actual average back. Now I want to remove the 1-10 limit, but I’m not sure how, since the input/output nodes are restricted to a range of 0 to 1.
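Here’s roughly what that scaling looks like (just an illustration; MAX_VALUE and the helper names are mine, not something from the library):

-- Illustration of the 0-1 scaling described above. MAX_VALUE is an assumed cap,
-- not something the library defines.
local MAX_VALUE = 10

local function scaleDown(n)
	return n / MAX_VALUE -- e.g. 7 -> 0.7, so it fits the 0-1 input range
end

local function scaleUp(n)
	return n * MAX_VALUE -- e.g. 0.45 -> 4.5, recovering the real average
end

-- Training pairs would be built as {startnumber = scaleDown(a), endnumber = scaleDown(b)}
-- with the target {out = scaleDown((a + b) / 2)}, and the guess read back via scaleUp(netout.out).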
I tried to highlight where I believe I should make changes, but simply changing the 10s to 100s (to allow inputs up to 100, for example) doesn’t work. I think I’m overlooking something really simple.
No, it is very possible. What I’m talking about is the way you are getting that 0-1 range: the math.clamp() is standing in for what we would call an activation function. When we forward propagate, we call this activation function to squash the weighted sum into a uniform 0 to 1. There are many ways of writing activation functions, but math.clamp is definitely not one of them. They are usually built around some sort of infinite series; I just used Euler’s constant because it’s easy to program.
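Roughly, the difference looks like this in Lua (just an illustration, not the library’s actual code; the function names are made up):

-- math.clamp just hard-cuts values at the edges of a range, while a real activation
-- function reshapes the weighted sum smoothly.
local function clampOnly(x)
	return math.clamp(x, 0, 1) -- flat (zero gradient) outside 0-1, so learning stalls there
end

local function sigmoid(x)
	return 1 / (1 + math.exp(-x)) -- built on Euler's number, squashes any input into (0, 1)
end

local function leakyReLU(x)
	return x > 0 and x or 0.01 * x -- unbounded above, so outputs aren't stuck inside 0-1
end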
Changing the Hidden and Output activation functions to LeakyReLU fixed things. Here is the final script. Thank you @g_captain!
math.randomseed(os.clock() + os.time())

local repStorage = game:GetService("ReplicatedStorage")

--Try to keep the variable names equal to the class's name.
local Package = repStorage:WaitForChild("NNLibrary")
local Base = require(Package.BaseRedirect)
local FeedforwardNetwork = require(Package.NeuralNetwork.FeedforwardNetwork)
local ParamEvo = require(Package.GeneticAlgorithm.ParamEvo)
local Momentum = require(Package.Optimizer.Momentum)

local clock = os.clock()

--LeakyReLU on the hidden and output layers is what lifts the 0-1 output limit.
local Settings = {
	Optimizer = Momentum.new();
	HiddenActivationName = "LeakyReLU";
	OutputActivationName = "LeakyReLU";
	LearningRate = 0.3;
}

local gens = 50000
local numGensBeforeLearning = 1

--Build the network (named inputs and output) and grab its backpropagation trainer.
local Net = FeedforwardNetwork.new({"startnumber", "endnumber"}, 2, 3, {"out"}, Settings)
local BackProp = Net:GetBackPropagator()

--Returns the true average for a given pair of inputs (the training target).
local function analyze(info)
	local average = (info.startnumber + info.endnumber) / 2
	print(info.startnumber, info.endnumber, average)
	return average
end

--Training loop: feed in random pairs, accumulate the cost, and learn every numGensBeforeLearning iterations.
for i = 1, gens do
	local random = {startnumber = math.random(1, 10), endnumber = math.random(1, 10)}
	local correct = {out = analyze(random)}

	BackProp:CalculateCost(random, correct)

	if i % numGensBeforeLearning == 0 then
		BackProp:Learn()
	end
end

--Runs the trained network on two numbers and returns its guess at the average.
local function GetAverage(startnum, endnum)
	local netout = Net({startnumber = startnum, endnumber = endnum})
	return netout.out
end

local Player = game.Players.LocalPlayer
local PGui = Player:WaitForChild("PlayerGui")
local ScreenGui = PGui:WaitForChild("ScreenGui")

ScreenGui.TextButton.MouseButton1Down:Connect(function()
	ScreenGui.Guess.Text = tostring(GetAverage(tonumber(ScreenGui.Num1.Text), tonumber(ScreenGui.Num2.Text)))
end)
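As an optional sanity check you can print a few guesses right after training (the test pairs below are arbitrary values from the 1-10 training range, not part of the original script):

--Quick sanity check: compare the network's guess to the true average for a few example pairs.
for _, pair in ipairs({{2, 8}, {3, 9}, {1, 7}}) do
	local guess = GetAverage(pair[1], pair[2])
	print(("avg(%d, %d): guessed %.2f, actual %.1f"):format(pair[1], pair[2], guess, (pair[1] + pair[2]) / 2))
end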