There is a usage scenario: a filter page.
Users can click through the filter conditions quickly, and the data API is requested as soon as the conditions change. But the API server is overseas, so each request is quite slow.
As a result, this endpoint often has several requests in flight at the same time, and the page data flickers as the responses come back.
So what I want is: for the same URL, only the latest request counts;
older requests should be abandoned no matter whether they have already succeeded or are still pending. Most articles online only cover canceling duplicate requests, which is slightly different from what I want. Also, I'm not familiar with the front end (I'm a backend developer writing a front-end project for the first time) and haven't been able to work this out after thinking about it for a long time, so I'd really appreciate it if you could point me in the right direction!
The interceptors I have now are as follows (extraneous code has been removed):
import axios from "axios"
const service = axios.create({
  baseURL: "a.com",
})

service.interceptors.request.use(config => {
  return config
}, error => {
  console.log(error)
  // return the rejection so callers can actually handle the error
  return Promise.reject(error)
})

service.interceptors.response.use(
  response => response,
  error => {
    console.log("err" + error)
    return Promise.reject(error)
  }
)
export default service
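
One way to get "latest request wins" behavior is to track the in-flight request for each URL inside the interceptors and abort the older one whenever a new request to the same URL starts (the opposite of deduplication, which cancels the newer one). Below is a minimal sketch, assuming axios >= 0.22 so that AbortController can be passed via config.signal; the pending map and the key built from method + URL are my own naming, and you may want to include the query params in the key as well.

import axios from "axios"

const service = axios.create({
  baseURL: "a.com",
})

// One AbortController per "method + url" currently in flight.
const pending = new Map()

service.interceptors.request.use(config => {
  const key = `${config.method} ${config.url}`
  // A newer request to the same URL is starting: abort the older one if it
  // is still pending, so only the latest request can ever deliver data.
  if (pending.has(key)) {
    pending.get(key).abort()
  }
  const controller = new AbortController()
  config.signal = controller.signal
  pending.set(key, controller)
  return config
}, error => {
  console.log(error)
  return Promise.reject(error)
})

service.interceptors.response.use(
  response => {
    // The request finished normally, so stop tracking it.
    pending.delete(`${response.config.method} ${response.config.url}`)
    return response
  },
  error => {
    if (axios.isCancel(error)) {
      // An older request that we aborted on purpose; pass it on so the
      // caller can recognize and ignore it with axios.isCancel.
      return Promise.reject(error)
    }
    if (error.config) {
      pending.delete(`${error.config.method} ${error.config.url}`)
    }
    console.log("err" + error)
    return Promise.reject(error)
  }
)

export default service

Usage at the call site stays the same; the only extra step is ignoring the aborted requests in the catch block, e.g. checking axios.isCancel(err) and only showing an error message when it returns false. That way a stale response never reaches the page, and a deliberate abort never shows up as an error.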