
That isn't how activation functions work in ML. The actual equation is the following, where z is the input number:

local function activationFunction(z)
	-- 2.71828 approximates Euler's number e
	return 1 / (1 + 2.71828 ^ -z)
end


No, what @CoderHusk posted is a sigmoid function. It squeezes any number between 0 and 1:

(sigmoid curve plot, from https://analyticsindiamag.com/activation-functions-in-neural-network/)

You can use this as the activation function:

local function sigmoid(x)
	return 1 / (1 + math.exp(-x))
end

No, it is very possible. I'm talking about the way you are getting that 0-to-1 value: the math.clamp() is not what we would call an activation function. During forward propagation, we call the activation function, which maps the weighted sum of the inputs to a uniform 0-to-1 range. There are many ways of writing activation functions, but math.clamp is definitely not one of them. They are all built around some sort of infinite series; I just used Euler's constant because it's easy to program.

Make sure the base is derived from an infinite series.
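To make the forward-propagation point concrete, here is a minimal sketch of a single neuron's forward step in Python (matching the linked main.py's language); the weights, inputs, and bias are made-up example values, not anything from the thread:

```python
import math

def sigmoid(z):
    # squashes any real number into the open interval (0, 1)
    return 1 / (1 + math.exp(-z))

def forward(inputs, weights, bias):
    # weighted sum of the inputs, then the activation function
    z = sum(i * w for i, w in zip(inputs, weights)) + bias
    return sigmoid(z)

# example: two inputs, two weights, zero bias
activation = forward([1.0, 0.0], [0.5, -0.5], 0.0)
```

Note that math.clamp would just cut values off at 0 and 1, while sigmoid smoothly compresses them, which is what lets gradients flow during training.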


Is your model supervised? Or are you generating the neural structure with a GA?


GA = genetic algorithm; it's an algorithm that can determine how many neurons your network should have. Take a look at my AI, for example (it uses your method; I've seen the library, but this is from scratch):
https://github.com/EppersonJoshua/machineLearningXOR/blob/main/main.py
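For anyone unfamiliar with the idea, here is a toy sketch of how a GA could search for a hidden-layer size. This is not code from the linked repo; the fitness function is a stand-in (a real one would train the network and return its accuracy), and all the names and parameters are invented for illustration:

```python
import random

random.seed(0)

def fitness(hidden_neurons):
    # stand-in fitness: in a real GA you would train a network with this
    # many hidden neurons and return its score; here we just pretend the
    # ideal size is 4
    return -abs(hidden_neurons - 4)

def evolve(generations=20, pop_size=8):
    # population of candidate hidden-layer sizes
    population = [random.randint(1, 16) for _ in range(pop_size)]
    for _ in range(generations):
        # keep the fitter half, then mutate survivors to refill the population
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        children = [max(1, p + random.choice([-1, 1])) for p in survivors]
        population = survivors + children
    return max(population, key=fitness)

best_size = evolve()
```

The selection-plus-mutation loop is the core of it; real implementations usually add crossover and evolve the weights as well (as in NEAT-style approaches), not just the neuron count.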

