Mathematical issue in a script

I made a script where a random number between 100 and 500 is picked, so the random number can later be converted into a smaller value (between 0 and 1). To do so, I can either divide the number by 1000 or multiply it by 0.001.

Here is the script:

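The original script was posted as an image and didn't survive; here is a minimal Luau sketch of what the description above amounts to (the variable name n is mine, not from the original):

-- Pick a random whole number between 100 and 500.
-- math.random with two integer arguments only returns integers.
local n = math.random(100, 500)

-- Convert it into a value between 0 and 1, both ways:
print(n / 1000)   -- division by 1000
print(n * 0.001)  -- multiplication by 0.001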
It seems like a really simple and easy script, which is true. However, in some cases, the multiplication result comes out wrong by 0.00000000000000003 (a really low and insignificant amount), like in this example:
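
The screenshot is gone; based on the description in the next paragraph, the printed output would have looked roughly like this:

0.288                 -- from 288 / 1000
0.28800000000000003   -- from 288 * 0.001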

The selected number was 288. Why is the result off from 0.288 by such a tiny value in the multiplication? Great question, I have no idea. The only logical answer would be that the random number selected was actually 288.00000000000003, but that is not the case: the division showed the value correctly (0.288), and the random number picker doesn't allow decimals anyway.

You’ve posted in the wrong category! The correct category is #help-and-feedback:scripting-support. Use the pencil button to change it.

It’s like the 0.1 + 0.2 = 0.30000000000000004 case. It happens because computers store numbers in binary floating point, and decimal fractions like 0.1, 0.2, and 0.001 have no exact binary representation, so tiny rounding errors creep into the results.
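
You can see the same thing directly in Luau by forcing full precision with string.format (plain print may round the display):

print(0.1 + 0.2 == 0.3)                   --> false
print(string.format("%.17g", 0.1 + 0.2))  --> 0.30000000000000004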


Anyway, the issue is due to floating-point error. It boils down to this: computers can’t represent most decimal numbers exactly, so the result of any operation is simply the computer’s best approximation of the true value.

As an example:

u0_a301@localhost ~> node
Welcome to Node.js v21.6.2.
Type ".help" for more information.
> 0.1 + 0.2 === 0.3
false

This is because, according to Node, 0.1 + 0.2 = 0.30000000000000004.
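
The same rounding explains the 288 case. Neither 288 / 1000 nor 288 * 0.001 can produce exactly 0.288, because 0.288 itself has no exact binary representation: the division rounds to the closest double just below 0.288 (which displays as 0.288), while multiplying by the already slightly-too-large stored value of 0.001 rounds to the double just above it. A quick Luau check, assuming standard double-precision floats:

print(string.format("%.17g", 288 / 1000))   --> 0.28799999999999998
print(string.format("%.17g", 288 * 0.001))  --> 0.28800000000000003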

If you let us know what specifically you’re doing, I can help you work around this issue 🙂
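
For example, if the goal is just a clean three-decimal value, one common workaround is to round the result back to three decimal places; a sketch, assuming the scale factor of 1000 from the original post:

-- Round x to 3 decimal places to strip floating-point noise.
local function round3(x)
    return math.floor(x * 1000 + 0.5) / 1000
end

print(round3(288 * 0.001))  --> 0.288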
