math.random() (the call with no args) returns a number in the range [0, 1). Includes 0, excludes 1. But why does it exclude 1?
Not that I depend on it; the chances of ever getting exactly 0 are vanishingly small anyway.
The same holds for Math.random(), and in C# Random().Next() also excludes its upper bound. Is there a reason for it then?
It could be for any number of reasons, and I don’t know if there’s a definitive one, but I can think of the following off the top of my head:
- It makes it easy to scale your numbers by simply multiplying and/or rounding: math.random() * n is essentially a random number in the range [0, n) (see the first sketch after this list).
- In languages where arrays start at 0, it makes it easy to choose a random element: my_arr_of_five[Floor(Random() * 5)] never goes out of bounds. Since Random() never returns 1, we never hit an out-of-range index (also sketched below).
- Maybe it’s more intuitive: with Floor(Random() * 5) you get exactly 5 possible values, whereas with a closed [0, 1] range it would be six.
- Maybe someone very early on decided it would be standard, and so it became that.
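
To make the scaling point concrete, here is a minimal sketch in TypeScript, assuming the JavaScript-style Math.random() that returns a value in [0, 1); the helper names are just illustrative:

```typescript
// Random float in [0, n): 0 is possible, n is not,
// because Math.random() is always strictly less than 1.
function randomFloat(n: number): number {
  return Math.random() * n;
}

// Random integer in [0, n): flooring can never produce n
// since Math.random() * n < n.
function randomInt(n: number): number {
  return Math.floor(Math.random() * n);
}

console.log(randomFloat(10)); // e.g. 3.7219...
console.log(randomInt(6));    // one of 0, 1, 2, 3, 4, 5
```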
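
And the array-indexing point, under the same assumption about Math.random():

```typescript
// Because Math.random() never returns 1,
// Math.floor(Math.random() * arr.length) is always a valid index
// for a non-empty array.
function randomElement<T>(arr: T[]): T {
  return arr[Math.floor(Math.random() * arr.length)];
}

const myArrOfFive = ["a", "b", "c", "d", "e"];
console.log(randomElement(myArrOfFive)); // never out of bounds

// If the range were the closed interval [0, 1], then Math.floor(1 * 5) === 5
// would occasionally index one past the end of the array.
```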
These are just guesses, but regardless I personally like the arbitrary decision.
Some more reasons I found with a quick search: https://www.quora.com/Why-does-the-Math-random-function-return-a-double-in-the-range-0-1-instead-of-other-intervals-in-Java