Writing this equation in code?

Hello, I am unsure how to write an equation that uses the summation symbol in code. This is the equation:

image

Can someone help me with this?

It's just a sum from i = 1 to i = n. y_i represents the i-th element of a set y.

local s = 0
for i = 1, n do
	s = s + (y[i] - f(x[i])) ^ 2 -- square the whole residual, not just f(x[i])
end

f is your predictor function that approximates y[i].
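To make that concrete, here is the same loop with a hypothetical linear predictor f(x) = m*x + b and some made-up sample data (m, b, and the data values are just placeholders, not anything from your problem):

```lua
-- Hypothetical linear predictor; m and b are placeholder coefficients
local m, b = 2, 1
local function f(xi)
	return m * xi + b
end

-- Placeholder sample data
local x = {1, 2, 3}
local y = {3.1, 4.9, 7.2}
local n = #x

-- Residual sum of squares: sum over i of (y_i - f(x_i))^2
local s = 0
for i = 1, n do
	s = s + (y[i] - f(x[i])) ^ 2
end
print(s) -- small (about 0.06 here) when f fits the data well
```

The smaller the sum, the better f predicts the data, which is why this quantity gets minimized when fitting a model.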


If I only have one input then “y” would not be an array, or would not have to be. What then? Also, in my case I do not know what n is.

If you only have 1 data point in the dataset y then n = 1 and the summation isn't necessary.
It would simplify to:
(y - f(x))^2
Where f(x) is the value of y predicted by the linear model

I don't see why you would only have one data point, though; it's kind of strange to linearly model a single point. I'm probably just misinterpreting what you're asking.
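In code, the n = 1 case collapses to a single line. Here f, x, and y are placeholders standing in for whatever predictor and data point you actually have:

```lua
-- Placeholder predictor
local function f(xi)
	return 2 * xi + 1
end

-- The single (x, y) pair; placeholder values
local x, y = 3, 7.5

-- With one data point there is no loop: the sum has one term
local rss = (y - f(x)) ^ 2
print(rss) -- (7.5 - 7)^2 = 0.25
```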

I am researching neural networks and I want to create a multi-layer perceptron to build a self-driving car. But all the tutorials I can find are done in Python. The ones that are in Lua are modules like Kironte's, which have everything you need but nothing is explained. Right now I am following a tutorial on towardsdatascience, but all the code is removed and I am left with stuff like:

Which is only helpful to someone who already understands the symbols and the math behind them.

Not to mention the equations used on this website are not the ones used on other websites. This one uses a sigmoid function inside a sigmoid function when others do not.
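For reference, a sigmoid inside a sigmoid usually just means the network has a hidden layer: each hidden neuron applies a sigmoid, and the output neuron applies a sigmoid again to the hidden outputs. A minimal sketch of that structure (all weights here are arbitrary placeholders, not values from any tutorial):

```lua
local function S(x)
	return 1 / (1 + math.exp(-x))
end

-- Placeholder weights for a 2-input, 2-hidden, 1-output network
local w1, w2, w3, w4 = 0.11, 0.21, 0.12, 0.08 -- input -> hidden
local w5, w6 = 0.14, 0.15                     -- hidden -> output

local function feedForward(x1, x2)
	local h1 = S(w1 * x1 + w2 * x2) -- hidden neuron 1
	local h2 = S(w3 * x1 + w4 * x2) -- hidden neuron 2
	return S(w5 * h1 + w6 * h2)     -- a sigmoid of sigmoids
end

print(feedForward(1, 0)) -- always between 0 and 1
```

Tutorials that skip the hidden layer only have one sigmoid, which would explain the discrepancy between websites.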

Would you mind linking the tutorial you're using? I can't seem to find it through some basic searching.

It's on towardsdatascience but is only viewable with a membership. I have pasted the article into docs so you can view it.

From my reading, this is the best neural net I have been able to come up with:

local NeuralNetwork = {
	w1 = 0.11, 
	w2 = 0.21, 
	w3 = 0.12, 
	w4 = 0.08, 
	w5 = 0.14, 
	w6 = 0.15,
	r = 1
}

S = function(x)
	return 1 / (1 + math.exp(-x))
end

function NeuralNetwork:feedForward(x)
	return S(self.w1 * x[1] + self.w2 * x[2])
end

function NeuralNetwork:train(inputs, y)
	local h1 = S(self.w1 * inputs[1] + self.w2 * inputs[2])
	
	-- Squared error for this single training example (no loop needed)
	local RSS = (y - h1) ^ 2
	
	self.w1 -= self.r * (RSS / self.w1)
	self.w2 -= self.r * (RSS / self.w2)
end

for Index = 1, 100 do
	NeuralNetwork:train({0, 0}, 0)
	NeuralNetwork:train({1, 0}, .5)
	NeuralNetwork:train({0, 1}, .5)
	NeuralNetwork:train({1, 1}, 1)
end

local Output = NeuralNetwork:feedForward({0, 0})
print(Output)
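One note on the update rule: dividing RSS by a weight is not the gradient of the loss. For a single sigmoid neuron with RSS = (y - h)^2 and h = S(w1*x1 + w2*x2), the chain rule gives dRSS/dw1 = -2 * (y - h) * h * (1 - h) * x1. A minimal sketch of gradient descent using that derivative (weights and the learning rate are placeholders; r = 1 tends to overshoot, so a smaller value is used here):

```lua
local function S(x)
	return 1 / (1 + math.exp(-x))
end

local w1, w2 = 0.11, 0.21
local r = 0.5 -- learning rate (placeholder; smaller than 1 for stability)

local function train(x1, x2, y)
	local h = S(w1 * x1 + w2 * x2)
	-- Chain rule: dRSS/dw = -2 * (y - h) * h * (1 - h) * x
	local d = -2 * (y - h) * h * (1 - h)
	w1 = w1 - r * d * x1
	w2 = w2 - r * d * x2
end

-- Repeatedly fit the single example (1, 1) -> 1
for i = 1, 1000 do
	train(1, 1, 1)
end

print(S(w1 * 1 + w2 * 1)) -- moves toward the target of 1
```

This only trains a single neuron; fitting all four XOR-style examples at once needs the hidden layer discussed earlier, since one sigmoid neuron is a linear classifier.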