The values returned by math.noise actually range from -0.60881221294403 to 0.75304806232452.

Repro: run this in the command bar

local biggest = 0
local smallest = 0
for i = 1, 5000000 do
	local noise = math.noise(i / 100000, i / 100000, i / 100000)
	biggest = math.max(biggest, noise)
	smallest = math.min(smallest, noise)
end
print(biggest, smallest)

It seems that someone on the Developer Hub took a bit too much liberty with their assumptions and simply stated on the page that the result should be between -0.5 and 0.5. As far as I know, this guarantee was never given on the old wiki.

See this article (it also explains why your test may not be representative of the true range):

Perlin noise is not guaranteed to stay between -0.5 and 0.5. Its distribution is roughly Gaussian, so values with |x| > 0.5 are simply much less likely than values closer to 0.

I’m going to move this over from Engine Bugs to Documentation Requests.

iirc it was listed as -0.5 to 0.5, somewhere at least. i brought this up with urist in the roblox IRC towards the end of 2015

i think it’s fairly arbitrary based on the implementation of the noise function, although i really don’t think that a range of -0.608… to 0.753… was the intended behavior when implementing the function

all in all, i really think this should stay in bug reports… updating the documentation on the wiki is a decent bandaid fix, but i think having the noise function throw out a weird value range is not intended behavior, nor is it desirable for development

Could you run another test with random samples in the full 3D range? Right now you’re taking a diagonal through the space (i.e. x = y = z) at a regular interval, which might occlude some of the range due to the nature of Perlin noise.

local biggest = 0
local smallest = 0
local noise = math.noise
local rand = math.random
for i = 1, 10000000 do
	local n = noise(rand(0, 1000000) / 10000, rand(0, 1000000) / 10000, rand(0, 1000000) / 10000)
	biggest = math.max(biggest, n)
	smallest = math.min(smallest, n)
end
print(biggest, smallest)

You can actually sometimes get values slightly outside of the range [-1,1], so if the interval is critical to you, you should use math.clamp(noise, -1, 1) on the output. Try this variant, and see what you get:

local biggest = -math.huge
local smallest = math.huge
local rng = Random.new(0)
for i = 1, 10000000 do
	local noise = math.noise(rng:NextNumber() * 1000000, rng:NextNumber() * 1000000, rng:NextNumber() * 10000000)
	biggest = math.max(biggest, noise)
	smallest = math.min(smallest, noise)
end
print(biggest, smallest)
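A minimal sketch of that clamping advice, assuming you want the result pinned to [-1, 1] (uses Roblox's math.clamp and math.noise, so it only runs inside Roblox; the function name and rescaling step are illustrative, not from the original post):

```lua
-- Wrap math.noise so the result is guaranteed to stay in [-1, 1].
-- math.clamp is Roblox-specific; in stock Lua you could use
-- math.max(-1, math.min(1, n)) instead.
local function boundedNoise(x, y, z)
	local n = math.noise(x, y, z)
	return math.clamp(n, -1, 1)
end

-- Example: rescale to [0, 1] for use as, say, a terrain height fraction.
local height = (boundedNoise(10.5, 3.2, 7.9) + 1) / 2
print(height)
```

Clamping after the call is cheap, and it means a rare out-of-range sample can never push derived values (array indices, color channels, etc.) outside what the rest of your code expects.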

The documentation for math.noise states that it will return a number between -0.5 and 0.5. However, this is not true. Sometimes, the result will be slightly outside of those bounds. For instance, math.noise(9.9, 9.6) will return -0.50271731615067. Clearly, that’s out of bounds.

This is important to document, because developers need to know to explicitly clamp the noise value between -0.5 and 0.5 to properly use the value for clean data.

1D noise is bounded by [-0.5, 0.5].
2D noise is bounded by (-1, 1).
3D noise is bounded by ???

I have probably done over a hundred million random samples for each dimension of noise. The 3D bounds are unknown without the exact implementation, but the 1D and 2D results are significant enough imo to warrant documentation.

Methodology

1D

for i = 1, 1e6 do
	local a = math.random() * 1e4
	if math.abs(math.noise(a)) >= 0.5 then
		print(math.noise(a), a)
	end
end
print("DONE")

2D (might take a few tries)

for i = 1, 1e6 do
	local a, b = math.random() * 1e4, math.random() * 1e4
	if math.abs(math.noise(a, b)) > 0.98 then
		print(math.noise(a, b), a, b)
	end
end
print("DONE")

3D (might take a few tries)

for i = 1, 10e6 do
	local a, b, c = math.random() * 1e4, math.random() * 1e4, math.random() * 1e4
	if math.abs(math.noise(a, b, c)) > 1 then
		print(math.noise(a, b, c), a, b, c)
	end
end
print("DONE")

Did you scroll up in the thread? None of your findings are correct, this has been tested before. It is possible for values to occur (at very low frequency) even outside those ranges.

The range of the noise function is different depending on how many inputs you give it. The posts above focus on 3D whereas Iâ€™ve included bounds for 1D and 2D. You can try my tests for yourself.

EDIT: Well sleitnick did mention 2D but his bounds don’t contradict mine.

The point is that it’s like a bell curve. Values outside of the “regular range” are totally possible, just infrequent. It doesn’t make sense to document an unbounded probability distribution of values as “bounded” between X and Y. The DevHub should describe what is actually returned, so that developers know to clamp and don’t hit obscure, hard-to-debug bugs just because the DevHub never told them the value could go out of bounds.
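To see the bell-curve shape for yourself, here is a quick histogram sketch (assumes Roblox's math.noise, Random.new, and math.clamp, so command-bar only; the bucket count, sample size, and scaling are arbitrary choices, not from the thread):

```lua
-- Bucket 1e6 noise samples into 21 bins across [-1, 1] and print a
-- crude bar chart. Most samples should land in the middle bins, with
-- the tails thinning out fast -- extreme values are rare, not impossible.
local BINS = 21
local buckets = {}
for i = 1, BINS do buckets[i] = 0 end

local rng = Random.new(0)
for i = 1, 1e6 do
	local n = math.noise(rng:NextNumber() * 1e4, rng:NextNumber() * 1e4, rng:NextNumber() * 1e4)
	-- Map [-1, 1] onto bin index 1..BINS; out-of-range samples clamp to the end bins.
	local idx = math.clamp(math.floor((n + 1) / 2 * BINS) + 1, 1, BINS)
	buckets[idx] = buckets[idx] + 1
end

for i = 1, BINS do
	local lo = -1 + (i - 1) * (2 / BINS)
	print(string.format("%+.2f  %s", lo, string.rep("#", math.floor(buckets[i] / 5000))))
end
```

The lopsided, center-heavy shape of the output is exactly why documenting a hard [-0.5, 0.5] bound is misleading.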

This article gives a proof for the range of Perlin noise being [-sqrt(N)/2, sqrt(N)/2] with N being the number of dimensions. That is,

3D: ~0.866
2D: ~0.707
1D: 0.5
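The theoretical bounds from that proof can be computed directly (plain Lua, nothing Roblox-specific):

```lua
-- Theoretical Perlin noise bound per the linked proof: sqrt(N) / 2,
-- where N is the number of input dimensions.
for n = 1, 3 do
	local bound = math.sqrt(n) / 2
	print(string.format("%dD: +/- %.4f", n, bound))
end
-- prints:
-- 1D: +/- 0.5000
-- 2D: +/- 0.7071
-- 3D: +/- 0.8660
```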

Roblox’s Perlin noise function doesn’t adhere to the ranges for 3D and 2D, and I don’t know why.

I set up this game today to distribute calculations among multiple clients. At one point we had about 20 people each doing a million random samples every 10 frames. The largest values we calculated are

3D: 1.0336
2D: 1
1D: 0.5

Obviously this isn’t definitive proof of the range, but I believe it warrants looking into the algorithm to actually calculate it. Without the algorithm, the best I can do is numerical estimates.

Looks like I misread your dimensions, I skimmed and thought you came to the same conclusions as someone earlier in the thread, but your ranges are higher than what the others mentioned.

Going back to the original point I was trying to make: the DevHub should describe the nature of the distribution, not just say “it’s between X and Y”, because that isn’t necessarily all the information an uninformed reader needs to keep the noise in a range suitable for their use case (they might want to clamp more or less, or otherwise transform the data, depending on the distribution and range they want). And as you mention, for the actual bounds themselves, doing massive amounts of experiments is an indicator, not an actual proof about the implementation.