Here is my simple code:
let time = 0;
setInterval(function() {
time += 0.1;
console.log(time);
}, 100);
The expected output is:
0.1
0.2
0.3
0.4
etc...
What I actually get is:
0.1
0.2
0.30000000000000004
0.4
0.5
0.6
0.7
0.7999999999999999
0.8999999999999999
0.9999999999999999
1.0999999999999999
1.2
1.3
1.4000000000000001
1.5000000000000002
1.6000000000000003
...
2.2000000000000006
2.3000000000000007
2.400000000000001
2.500000000000001
...
5.599999999999996
5.699999999999996
5.799999999999995
5.899999999999995
5.999999999999995
6.099999999999994
6.199999999999994
6.299999999999994
...
7.19999999999999
7.29999999999999
...
7.899999999999988
7.999999999999988
8.099999999999987
8.199999999999987
...
9.099999999999984
9.199999999999983
9.299999999999983
...
9.89999999999998
9.99999999999998
etc...
Why is this happening? Does it have something to do with how binary arithmetic works? I don't know; it makes no sense to me.
(I removed some of the output so this post wouldn't get too long.)
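
For reference, the same effect shows up with a single addition in the console, which seems to point at binary floating-point rounding rather than anything in the timer. Below is a minimal sketch of a workaround, assuming that is the cause: count integer ticks and divide on output, so the rounding error never accumulates. (The variable name ticks is just for illustration.)

// Same effect with one addition, no setInterval involved:
console.log(0.1 + 0.2);          // prints 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);  // prints false

// Hypothetical workaround sketch: keep the counter as an integer,
// since small integers are represented exactly by doubles.
let ticks = 0;
setInterval(function() {
  ticks += 1;                    // exact integer arithmetic
  console.log(ticks / 10);       // prints 0.1, 0.2, 0.3, ...
}, 100);

Each printed value here is a single division rather than a running sum, so the error does not build up the way it does when repeatedly adding 0.1.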