Neural Network Averages Help

Hello! I’m using https://github.com/Kironte/Roblox-Neural-Network-Library/ to try to create a neural network, for practice. What I’m trying to do now is have the NN “guess” the average of two numbers. It works if I limit the inputs to numbers between 1 and 10, but beyond that, things get messed up.

I’m dividing the numbers by 10 so they fall into the 0-1 range the neural network expects. The average is also stored as a 0-1 number, and I just multiply it by 10 to get the actual average. Now I want to remove the 1-10 limit, but I’m not sure how, since the input/output nodes are limited to a range of 0 to 1.
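One way to generalize the divide-by-10 trick is min-max normalization over whatever range you expect the data to cover. A minimal sketch in plain Lua (the function names are my own, not from the library):

```lua
-- Min-max normalization: map a value from [min, max] into [0, 1],
-- and map a 0..1 network output back into [min, max].
-- "min" and "max" are whatever range you choose for your data (10, 100, ...).
local function normalize(x, min, max)
	return (x - min) / (max - min)
end

local function denormalize(y, min, max)
	return y * (max - min) + min
end

-- e.g. inputs in 0..100:
local a, b = 30, 70
local na, nb = normalize(a, 0, 100), normalize(b, 0, 100)
-- feed na and nb to the network; scale its 0..1 output back up:
print(denormalize((na + nb) / 2, 0, 100)) -- close to 50
```

The key point is that both the inputs and the training target must be scaled with the same min/max, and the network's output must be scaled back with it too.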

I tried to highlight where I believe I should make changes, but simply changing the 10s to 100s (to allow input up to 100, for example) doesn’t work. I think I’m overlooking something really simple.

That isn’t how activation functions are computed in ML. The actual equation is the sigmoid, where z is the input:

local function activationFunction(z)
	return 1 / (1 + 2.71828 ^ -z)
end

I’m sorry, but I’m not really sure what to do with this. Are you saying that what I’m trying to do is not possible in ML?

Edit: I’m using Tanh for input and output nodes

No, what @CoderHusk posted is a sigmoid function. It squashes any number into the range 0 to 1:


[image: plot of the sigmoid curve] (from https://analyticsindiamag.com/activation-functions-in-neural-network/)

You can use this as the activation function:

local function sigmoid(x)
	return 1 / (1 + math.exp(-x))
end
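A quick comparison of the two output ranges, in plain Lua (math.tanh is unavailable in newer Lua versions, so it's defined from exp here):

```lua
-- Sigmoid squashes into (0, 1); tanh squashes into (-1, 1).
local function sigmoid(x)
	return 1 / (1 + math.exp(-x))
end

local function tanh(x)
	local e2 = math.exp(2 * x)
	return (e2 - 1) / (e2 + 1)
end

for _, x in ipairs({ -5, 0, 5 }) do
	print(x, sigmoid(x), tanh(x))
end
-- sigmoid(0) = 0.5 and tanh(0) = 0; large negative inputs approach
-- the lower bound (0 for sigmoid, -1 for tanh), large positive ones
-- approach the upper bound of 1 for both.
```

This is why a Tanh or Sigmoid output node can never directly produce an average greater than 1, no matter how the inputs are scaled.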

No, it is very possible. I’m talking about how you are getting that 0-1 value: the math.clamp() is sitting where an activation function should be. During forward propagation we call the activation function, which squashes the weighted sum of the inputs into a uniform 0 to 1 range. There are many ways of writing activation functions, but math.clamp is definitely not one of them. They all come from some sort of infinite series; I just used Euler’s constant because it’s easy to program.

Make sure the base is derived from an infinite series.
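To illustrate the “infinite series” point: Euler’s number in the sigmoid comes from the Taylor series e^x = Σ xⁿ/n!, which a truncated sum can approximate. A sketch in plain Lua (my own illustration, not part of the library):

```lua
-- Approximate e^x with a truncated Taylor series: sum of x^n / n!.
-- Each term is built incrementally from the previous one to avoid
-- computing powers and factorials separately.
local function expSeries(x, terms)
	local sum, term = 1, 1
	for n = 1, terms do
		term = term * x / n  -- x^n / n!
		sum = sum + term
	end
	return sum
end

print(expSeries(1, 20)) -- close to 2.71828...
print(math.exp(1))      -- library value, for comparison
```

In practice you would just call math.exp, which is both faster and more accurate than hand-rolling the series.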

So I shouldn’t be dividing by any numbers, I shouldn’t be using math.clamp, and I should switch my activation functions from Tanh to Sigmoid?

Your data model is supervised, right? Or are you generating the neural structure from a GA?

I’m going to assume supervised. I’m new to neural networks. The training data is randomly generated numbers, it starts on line 36.

GA = genetic algorithm; it’s an algorithm that can determine how many neurons your network should have. Take a look at my AI for example (it uses your method; I’ve seen the library, but this is from scratch):
https://github.com/EppersonJoshua/machineLearningXOR/blob/main/main.py

Changing the Hidden and Output activation functions to LeakyReLU fixed things. Here is the final script. Thank you @g_captain!

math.randomseed(os.clock()+os.time())

local repStorage = game:GetService("ReplicatedStorage")

--Try to keep the variable names equal to the class's name.
local Package = repStorage:WaitForChild("NNLibrary");
local Base = require(Package.BaseRedirect)
local FeedforwardNetwork = require(Package.NeuralNetwork.FeedforwardNetwork)
local ParamEvo = require(Package.GeneticAlgorithm.ParamEvo)
local Momentum = require(Package.Optimizer.Momentum)

local clock = os.clock();

local Settings = {
	Optimizer = Momentum.new();
	HiddenActivationName = "LeakyReLU";
	OutputActivationName = "LeakyReLU";
	LearningRate = 0.3;
}

local gens = 50000; -- number of training examples

local numGensBeforeLearning = 1; -- apply the accumulated gradients after this many examples

local Net = FeedforwardNetwork.new({'startnumber', 'endnumber'}, 2, 3, {'out'}, Settings);
local BackProp = Net:GetBackPropagator();


function analyze(info)
	local average = (info.startnumber + info.endnumber) / 2;
	print(info.startnumber, info.endnumber, average);
	return average;
end


-- Training loop: random input pairs, cost measured against the true average.
for i = 1, gens do
	local random = {startnumber = math.random(1,10), endnumber = math.random(1,10)};
	local correct = {out = analyze(random)}
	BackProp:CalculateCost(random, correct);

	if i % numGensBeforeLearning == 0 then
		BackProp:Learn();
	end
end

local function GetAverage(startnum, endnum)
	local netout = Net({startnumber = startnum, endnumber = endnum});
	return netout.out;
end

local Player = game.Players.LocalPlayer;
local PGui = Player:WaitForChild("PlayerGui");
local ScreenGui = PGui:WaitForChild("ScreenGui");

ScreenGui.TextButton.MouseButton1Down:Connect(function()
	ScreenGui.Guess.Text = tostring(GetAverage(tonumber(ScreenGui.Num1.Text), tonumber(ScreenGui.Num2.Text)));
end)
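For reference, here is why LeakyReLU fixes the range problem: unlike Sigmoid or Tanh, its output is unbounded above, so the output node can emit averages larger than 1 directly. A common definition in plain Lua (the 0.01 negative slope is the usual default; the library’s may differ):

```lua
-- LeakyReLU: identity for positive inputs, a small fixed slope for
-- negative ones. Its output is not squashed into 0..1, so no
-- normalization of the targets is needed.
local function leakyReLU(x, slope)
	slope = slope or 0.01
	return x > 0 and x or slope * x
end

print(leakyReLU(5))  -- 5
print(leakyReLU(-5)) -- -0.05
```

Note that the script above still only trains on inputs from math.random(1,10); to get good guesses for larger numbers, the training data would need to cover that range too.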