Confusing math question incoming!
Let’s say I have a table for ores generated in a mine. The generation system generates six ores around the player, checks for occupied spots, the usual. The function to generate an ore takes two vars:
CreateOre(OreType, CFrameAttempt)
, where OreType is the ore to generate and CFrameAttempt is the CFrame of the mined block.
I have a ModuleScript in ServerScriptService that stores information about all of the ores. Here's the entry for an example ore:
["ExampleOre"] =
{
Tier = "Common",
Value = 1,
Weight = 1,
MinDepth = 100,
DepthApex = 600,
MaxDepth = 750
}
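(For context, the module just returns a table of those entries, and the generation script requires it. Roughly like this, with the module name "OreData" being a placeholder I made up:)

-- ModuleScript in ServerScriptService, calling it "OreData" here:
local OreData = {
    ["ExampleOre"] = { Tier = "Common", Value = 1, Weight = 1, MinDepth = 100, DepthApex = 600, MaxDepth = 750 },
    -- ...the rest of the ores...
}
return OreData

-- And in the generation script:
local Ores = require(game:GetService("ServerScriptService").OreData)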
Let me explain the vars (or rather, what I'd like them to do):
Value is obviously the value of the ore. Duh.
Weight is the rarity of the ore relative to the other ores, where a higher number = a higher generation chance
MinDepth is the minimum depth where an ore can generate
MaxDepth is the maximum depth where an ore can generate
And the complicated one… DepthApex. DepthApex is the depth where the ore has the highest probability of generating.
Take the ore above as an example: if you dig at depth 600, the chance for the ore to generate is at its highest, while at depths 100 and 750 the chance is at its lowest. Basically, mining at a given ore's depth apex gives you the highest chance of finding that ore. I hope I explained this well enough.
Basically, I'm describing a function that returns a higher number the closer the input depth is to the DepthApex:
f(x) = some math func involving DepthApex, MinDepth, and MaxDepth
f(MinDepth) = 0
f(MinDepth + 1 ... DepthApex - 1) = 0.001 .. 0.999
f(DepthApex) = 1
f(DepthApex + 1 ... MaxDepth - 1) = 0.999 .. 0.001
f(MaxDepth) = 0
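In Lua, something like this is what I'm picturing: two linear ramps meeting at the apex. Untested sketch, and the function name is just a placeholder (I'm assuming plain linear interpolation on both sides):

local function DepthFactor(Ore, Depth)
    if Depth <= Ore.MinDepth or Depth >= Ore.MaxDepth then
        -- Outside the ore's depth band, it can't generate at all
        return 0
    elseif Depth <= Ore.DepthApex then
        -- Ramps up from 0 at MinDepth to 1 at DepthApex
        return (Depth - Ore.MinDepth) / (Ore.DepthApex - Ore.MinDepth)
    else
        -- Ramps back down from 1 at DepthApex to 0 at MaxDepth
        return (Ore.MaxDepth - Depth) / (Ore.MaxDepth - Ore.DepthApex)
    end
end

With ExampleOre's values, DepthFactor at depth 350 would be (350 - 100) / (600 - 100) = 0.5, and at depth 600 it would be exactly 1.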
The actual Weight variable should, in theory, control the chance a given ore has of spawning relative to the other candidate ores, e.g. an ore with a weight of 1 should be picked ten times as often as an ore with a weight of 0.1.
I have a function I've borrowed from another project I did a while ago; it picks an ore based on the Weight value established above:
local function SelectOre(Depth)
    -- Add up the total weight of every ore
    local Total = 0
    for _, c in pairs(Ores) do -- Ores is the table with the ore information
        Total = Total + c["Weight"]
    end
    -- RNG is a Random object created elsewhere, e.g. local RNG = Random.new()
    local Chosen = RNG:NextNumber() * Total
    local Current = 0
    -- Walk through the ores with a running weight total; the ore whose
    -- slice of the total contains Chosen is the one selected
    for i, d in pairs(Ores) do
        if Chosen <= Current + d["Weight"] then
            return i -- the ore's name (the table key)
        else
            Current = Current + d["Weight"]
        end
    end
end
This function works fine on its own. But my question is: how would I alter or modify it to account for the DepthApex behaviour explained above, while still keeping the actual Weight variable accounted for? I'm assuming the math involves the math.abs() function at the very least.
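Here's roughly the direction I'm picturing: scale each ore's Weight by its depth factor before doing the weighted roll, so an ore far from its apex (or outside its depth band) gets little to no chance. Untested sketch, reusing the DepthFactor placeholder from above and the same Ores/RNG as before:

local function SelectOre(Depth)
    -- Build depth-adjusted weights for this particular depth
    local Scaled = {}
    local Total = 0
    for Name, Info in pairs(Ores) do
        local EffectiveWeight = Info["Weight"] * DepthFactor(Info, Depth)
        if EffectiveWeight > 0 then
            Scaled[Name] = EffectiveWeight
            Total = Total + EffectiveWeight
        end
    end

    if Total <= 0 then
        return nil -- no ore can generate at this depth
    end

    -- Same weighted roll as before, just on the scaled weights
    local Chosen = RNG:NextNumber() * Total
    local Current = 0
    for Name, EffectiveWeight in pairs(Scaled) do
        Current = Current + EffectiveWeight
        if Chosen <= Current then
            return Name
        end
    end
end

Is scaling the weights like that the right way to combine the two, or is there a better approach? Any help will be appreciated.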