Increasing weight of an ore in a table given a depth value and its min/max depth

Confusing math question incoming!
Let’s say I have a table for ores generated in a mine. The generation system generates six ores around the player, checks for occupied spots, the usual. The function to generate an ore takes two vars:

CreateOre(OreType, CFrameAttempt), where OreType is the ore to generate and CFrameAttempt is the CFrame of the mined block.

I have a ModuleScript in ServerScriptService that stores information about all of the ores: here’s the information part for an example ore:

["ExampleOre"] = 
{
Tier = "Common",
Value = 1,
Weight = 1,
MinDepth = 100,
DepthApex = 600,
MaxDepth = 750
}

Let me explain the vars (or what I'd like their intended function to be):
Value is obviously the value of the ore. Duh.
Weight is the rarity of the ore relative to the other ores, where a higher number = higher generation chance
MinDepth is the minimum depth where an ore can generate
MaxDepth is the maximum depth where an ore can generate

And the complicated one… DepthApex. DepthApex is the depth where the ore has the highest probability of generating.

For example, let's take the ore above. If you dig at depth 600, the chance for the ore to generate is at its highest, while at depths 100 and 750 the chance is at its lowest. Basically, mining at the depth apex of a given ore guarantees you the highest chance of finding that ore. I hope I explained this well enough.

Basically, I’m describing a function that returns a higher number the closer the inputted value is to the DepthApex:

f(x) = some math func involving DepthApex, MinDepth, and MaxDepth
f(MinDepth) = 0
f(MinDepth + 1 ... DepthApex - 1) = 0.001 .. 0.999
f(DepthApex) = 1
f(DepthApex + 1 ... MaxDepth - 1) = 0.999 .. 0.001
f(MaxDepth) = 0

The actual weight variable should in theory control the chance said ore has of spawning relative to the other ores chosen, e.g. an ore with a weight of 1 is ten times more likely to be picked than an ore with a weight of 0.1.

I have a function I've borrowed from another project I did a while ago; it selects an ore based on the weight value established above:

local function SelectOre(Depth)
	local Total = 0

	-- Sum the weights of every ore so we know the size of the selection range
	for _, c in pairs(Ores) do -- Ores is the table with the ore information
		Total = Total + c["Weight"]
	end

	-- Pick a random point in [0, Total) and find which ore's slice it lands in
	local Chosen = RNG:NextNumber() * Total
	local Current = 0

	for i, d in pairs(Ores) do
		if Chosen <= Current + d["Weight"] then
			return i -- i is the ore's name (the table key)
		else
			Current = Current + d["Weight"]
		end
	end
end

This function works fine. But my question is: how would I alter it to account for the DepthApex variable I explained, while keeping the actual weight variable accounted for? I'm assuming the math involves math.abs() at the very least. Any help will be appreciated.


I would suggest creating a “score” value to represent how likely an ore is to spawn during a generation.

First, get an initial score for each ore (you could multiply a random number by your weight value for each ore).

Second, get the absolute distance from the current depth to the ore's apex (math.abs(Depth - DepthApex)) and subtract it from the score (maybe scaled by a global weight so you can tweak how aggressive the falloff is).

Finally, pick the ore with the highest score.
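
A rough sketch of that idea in code (this assumes the Ores table and RNG object from the original post; ApexFalloff is a made-up tuning constant, not something from the post):

local ApexFalloff = 0.01 -- hypothetical global weight for how harshly distance from the apex is punished

local function SelectOreByScore(Depth)
	local BestName, BestScore

	for Name, Info in pairs(Ores) do
		-- Initial score: a random number scaled by the ore's Weight
		local Score = RNG:NextNumber() * Info.Weight
		-- Subtract the (scaled) distance from the ore's preferred depth
		Score = Score - math.abs(Depth - Info.DepthApex) * ApexFalloff

		if BestScore == nil or Score > BestScore then
			BestName, BestScore = Name, Score
		end
	end

	return BestName
end

Raising ApexFalloff makes ores cluster tightly around their apexes; lowering it makes depth matter less.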


I agree that this is a good way of doing it, but with one change. Instead of finding the absolute distance between the depth and the DepthApex, a function should be created for finding a "DepthScore". The reasoning behind this is that the DepthApex is not guaranteed to be the midpoint between the min and max depth.

function Ore:getDepthScore(Depth)
    if Depth > self.MaxDepth or Depth < self.MinDepth then
        return 0
    elseif Depth < self.DepthApex then
        return (Depth - self.MinDepth)/(self.DepthApex - self.MinDepth) -- returns a value linearly distributed between 0 and 1
    if Depth > self.DepthApex then
        return (Depth)/(self.DepthApex - self.MaxDepth) + 1 -- same linear distribution
    end
end
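
If you go this route, one way to keep the original weighted-pick structure is to multiply each ore's Weight by its depth score and use the result as an effective weight. This is only a sketch: it assumes a plain function GetDepthScore(OreInfo, Depth) that returns a value between 0 and 1 as described above, plus the Ores table and RNG object from the original post.

-- Sketch: weighted random pick where each weight is scaled by the depth score
local function SelectOre(Depth)
	local Total = 0
	local Effective = {}

	for Name, Info in pairs(Ores) do
		local Chance = Info.Weight * GetDepthScore(Info, Depth)
		Effective[Name] = Chance
		Total = Total + Chance
	end

	if Total <= 0 then
		return nil -- no ore can spawn at this depth
	end

	local Chosen = RNG:NextNumber() * Total
	local Current = 0

	for Name, Chance in pairs(Effective) do
		Current = Current + Chance
		if Chosen <= Current then
			return Name
		end
	end
end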
   

Quick question: I've noticed you're using the 'self' thing I see a lot (what I presume to be metatables?). Would you mind explaining it in ELI5 terms? I'm not very experienced with metatables.

EDIT: This would be a function I put into the table, right?

When using : to define functions there is a secret first argument in there, which is what self is. When you call a function using :, the object the function is a part of is passed as the first argument.

Here’s a neat trick using this:
If you store an instance's method in a variable (e.g. game.GetChildren) and call it with a different instance as the argument, it will act like you called GetChildren on that instance.

Essentially: table:function(a, b, c) is equal to table.function(table, a, b, c) when calling or defining functions.
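
For example (the Ore table here is just a throwaway illustration):

local Ore = { Weight = 3 }

-- Defined with ':', so 'self' is an implicit first parameter
function Ore:GetWeight()
	return self.Weight
end

print(Ore:GetWeight())    --> 3
print(Ore.GetWeight(Ore)) --> 3, same call with the table passed explicitly

-- The GetChildren trick mentioned above
local getChildren = game.GetChildren
print(#getChildren(workspace)) -- behaves like workspace:GetChildren()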


So with Ore:getDepthScore(Depth), it's basically Ore.getDepthScore(Ore, Depth), correct?

Yes… Also I made a slight mistake in my explanation… When defining the function using : the name of the first argument is self…

You can define a function like this: table.function(self) and it will be the same as table:function()


Thanks! Appreciate the explanation.

Unsure if this is true, but I think there may be an error in the code you've provided me.
Here's the code (I changed the if to elseif; I'm assuming that's a typo. If it's not, please tell me):

local function GetDepthScore(Ore, Depth)
    if Depth > Ore.MaxDepth or Depth < Ore.MinDepth then
        return 0
    elseif Depth < Ore.DepthApex then
        return (Depth - Ore.MinDepth)/(Ore.DepthApex - Ore.MinDepth)
    elseif Depth > Ore.DepthApex then
        return (Depth)/(Ore.DepthApex - Ore.MaxDepth) + 1
    end
end

["Dirt"] =
{
    Weight = 3, -- Chance relative to other ores this spawns
    MinDepth = 0, -- Minimum depth to spawn
    DepthApex = 75, -- Depth where ore is the most common
    MaxDepth = 100 -- Maximum depth to spawn
},

Here is my sample table.

Using the values above, the code works as intended until Depth reaches 75, at which point the code returns nil. (I'm assuming this is because there is no branch for when Depth is exactly equal to the DepthApex, which should return 1.)

Depth 76 returns -2.04, 77 returns -2.08 … 99 returns -2.96, and 100 returns -3, and then 101 returns 0 as intended. Is this the intended behaviour, or is the code not supposed to return these values?

Is this my error? Sorry for the bump, I'm not sure if this is against the rules or not. I figured I'd ask you.
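
For what it's worth, the negative numbers come from that second branch not being anchored at the apex. Here is a sketch of what the descending branch could look like, assuming the intent is a straight line from 1 at DepthApex down to 0 at MaxDepth:

local function GetDepthScore(Ore, Depth)
	if Depth < Ore.MinDepth or Depth > Ore.MaxDepth then
		return 0
	elseif Depth <= Ore.DepthApex then
		return (Depth - Ore.MinDepth) / (Ore.DepthApex - Ore.MinDepth) -- 0 at MinDepth, 1 at DepthApex
	else
		return (Ore.MaxDepth - Depth) / (Ore.MaxDepth - Ore.DepthApex) -- 1 at DepthApex, 0 at MaxDepth
	end
end

With the Dirt values above, this returns 1 at depth 75, 0.96 at 76, and 0 at 100, instead of nil and negative numbers.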