Why does this use of setTimeout print a result roughly once per second?
The code is as follows:
for (let i = 0; i < 5; i++) {
  setTimeout(function() {
    console.log(i);
  }, i * 1000);
}
Printed result: 0, 1, 2, 3, 4, one number per second.
Question: the setTimeout delay is set dynamically here, so why is the effect that one result is printed about every second? And if instead you want the gaps between prints to be 0s, 1s, 2s, 3s, 4s, how should the code be modified?
Here come the answers.
// sum(i) = i + (i - 1) + ... + 1 + 0, the cumulative delay in seconds
function sum(i) {
  if (i) { return sum(i - 1) + i; }
  else return 0;
}
for (let i = 0; i < 5; i++) {
  setTimeout(function() {
    console.log(i);
  }, sum(i) * 1000); // fires at 0s, 1s, 3s, 6s, 10s
}
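To check the arithmetic: sum(0) = 0, sum(1) = 1, sum(2) = 3, sum(3) = 6, sum(4) = 10, so the five prints land at 0s, 1s, 3s, 6s, 10s. That means the gaps between consecutive prints are 0s (the first fires immediately), then 1s, 2s, 3s, 4s, which is exactly what the question asks for.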
In the original loop, every iteration schedules its print i * 1000 ms from the same starting moment, so consecutive prints are always exactly 1000 ms apart.
Modifying it is then very simple. First, declare a function that computes the delay you actually want:
// getDelay(i) = i + (i - 1) + ... + 1, the same value as sum(i) above
function getDelay(i) {
  var result = 0;
  while (i > 0) {
    result += i;
    i--;
  }
  return result;
}
Then, in the original code, replace
i * 1000
with
getDelay(i) * 1000
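Putting the two pieces together (my own assembly; the answer itself only describes the substitution), the loop becomes:

for (let i = 0; i < 5; i++) {
  setTimeout(function() {
    console.log(i);
  }, getDelay(i) * 1000); // fires at 0s, 1s, 3s, 6s, 10s
}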
Another approach: accumulate the total delay in a second loop variable t:

for (let i = 0, t = 0; i < 5; i++, t += i * 1000) {
  setTimeout(function() {
    console.log(i);
  }, t); // t takes the values 0, 1000, 3000, 6000, 10000
}
Or compute the delay directly with the closed-form triangular number:

for (let i = 0; i < 5; i++) {
  setTimeout(function() {
    console.log(i);
  }, i * (i + 1) / 2 * 1000);
}
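Note that i * (i + 1) / 2 is just the closed form of sum(i) and getDelay(i) above: for i = 0..4 it gives 0, 1, 3, 6, 10 seconds.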
The way I understand this question: the JS interpreter runs the whole loop first, and setTimeout is asynchronous, so all five timers start counting from the same moment. With delays of 0s, 1s, 2s, 3s, 4s they therefore fire one second apart; the callbacks do not run synchronously, each waiting 1s, then 2s, then 3s after the previous one.
To get the growing-gap effect you want, you either accumulate the delay in a separate variable, or turn the asynchronous waiting into synchronous (sequential) waiting.
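A minimal sketch of the "make the waiting sequential" idea, assuming an environment that supports async/await; the sleep helper is a name I introduce here, not one from the answers above:

// sleep(ms) resolves a promise after ms milliseconds (hypothetical helper)
function sleep(ms) {
  return new Promise(function (resolve) {
    setTimeout(resolve, ms);
  });
}

(async function () {
  for (let i = 0; i < 5; i++) {
    await sleep(i * 1000); // wait 0s, 1s, 2s, 3s, 4s before each print
    console.log(i);
  }
})();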
This question is really about JavaScript closures. It is worth reading up on them first; once you understand closures, the answer follows naturally.
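As an illustration of that closure point (my own example, not from the answer above): with var all five callbacks close over the same variable, while let gives each iteration its own binding.

// With var, the callbacks share one i, which is already 5 when they run,
// so this prints 5 five times:
for (var i = 0; i < 5; i++) {
  setTimeout(function () {
    console.log(i);
  }, i * 1000);
}

// With let, each iteration captures its own binding, so this prints 0 to 4:
for (let j = 0; j < 5; j++) {
  setTimeout(function () {
    console.log(j);
  }, j * 1000);
}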