Httr2 error when downloading weekly patterns

Hello, I am using R and the download_files1() function to download a month of weekly patterns from ADVAN and I keep getting this error.

Error during httr2::req_perform.
HTTP 429 Too Many Requests.


How do I either increase the time to download the data I need or resolve this error?

Hi Gabriela,

Apologies for the inconvenience here. We made some adjustments to the rate limits, which should now be fixed. Feel free to reply in this thread if you are still experiencing this.

Thanks!

Best,
Nolan

I keep getting the same error. I can download 8 files and then get the error again.

Hi @gabriella.palomo,

Sorry about that. Are you able to try using the .download_files() method?

If you continue to see the error, try adding some time between your API hits to avoid reaching the limit. Here is the underlying library on GitHub to try it with: GitHub - amplifydata/amplifydata-public. All of your same credentials will work.
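For anyone doing this from Python, a minimal sketch of spacing out API hits is below. The function and argument names here are placeholders for illustration, not part of the Dewey/amplifydata API:

```python
import time

def download_with_pauses(download_fn, file_urls, pause_seconds=1.0):
    """Call download_fn for each URL, sleeping between calls to space out API hits."""
    results = []
    for url in file_urls:
        results.append(download_fn(url))
        time.sleep(pause_seconds)  # pause before the next hit
    return results
```

You would pass your actual download call as `download_fn` and tune `pause_seconds` to stay under the limit.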

I’m also running into this issue. Can you post where to find the updated rate limits?

Hi @jake, I am getting the same / a similar error as others –

Error in req_perform():
! HTTP 429 Too Many Requests.

– where the req_perform() function is from the httr2 package.

Hopefully I have not missed the follow-up for this. Thank you!

@jbayham @jgellman Apologies for the inconvenience here. Can you clarify the package and code you are currently using when receiving this error?

Thanks, @jake. I am using the deweydatar package in R. My solution is to build in a 15-minute pause every time I get the 429 error. I added %>% req_retry(max_tries = 10, backoff = ~1000) to line 461 here:

I would like to know the download rate limits, so that I can be respectful of Amplify rules/policies.
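For Python users, a rough equivalent of that retry-with-backoff approach is sketched below. This is a generic pattern, not the library's built-in behavior; `make_request` stands in for whatever call returns a response with a `status_code`:

```python
import time

def request_with_retry(make_request, max_tries=10, base_delay=1.0):
    """Retry a request on HTTP 429, doubling the wait each attempt (exponential backoff)."""
    for attempt in range(max_tries):
        response = make_request()
        if response.status_code != 429:
            return response
        wait = base_delay * (2 ** attempt)  # 1s, 2s, 4s, ...
        time.sleep(wait)
    return response  # still 429 after max_tries; caller can inspect it
```

A larger `base_delay` (e.g. hundreds of seconds) approximates the 15-minute pause described above.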

Thank you @jake and @jbayham.

Previously, using the deweydatar package version 0.2.1, I simply ran the following code:

download_files1(apikey_, pp_advan_wp, "my/file/path", start_date = "2018-01-01", end_date = "2023-12-31")

– and ended up getting the rate limiting error noted above.

I am currently running Jude’s modified version of the code, but it is not yet clear whether a 15-minute wait time is overkill.

@jake, is it possible to share the rate limits so that we can stay below them?

@jgellman @jbayham Thank you for your communication and patience on this. The rate limit is 2 API hits per second.

Thank you @jake. Is this assessed over some time window (e.g., average over an hour)? Through trial and error, I had been pausing for 5 seconds and would sometimes still get a 429. Does the get_file_list() query count toward this rate limit?

@jbayham @jgellman Yes, any hit to the API counts towards the rate limit, not just file downloads, so getting the file list counts.

To clarify on the rate limit topic, the rate limit is just per second, not averaged over an hour.

Hope this helps! Feel free to follow up here with any other questions.
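Given the stated limit of 2 API hits per second, one way to stay under it on the client side is to enforce a minimum interval between calls. A minimal Python sketch (a generic throttle, not part of the Dewey/amplifydata API):

```python
import time

class Throttle:
    """Enforce a minimum interval between calls, e.g. 0.5 s for 2 requests/second."""
    def __init__(self, max_per_second=2):
        self.min_interval = 1.0 / max_per_second
        self.last_call = 0.0

    def wait(self):
        """Block until at least min_interval has passed since the previous call."""
        elapsed = time.monotonic() - self.last_call
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self.last_call = time.monotonic()
```

Calling `throttle.wait()` immediately before each API hit (including the file-list query, since that counts too) keeps the overall request rate at or below the limit.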

@nolan I am using the Python version and I get the 429 error very often, even if I put time.sleep(10) before each API call. This never happened to me before with exactly the same code. Could you look into it and put the limits similar to what we had before? It already took a long time to download the data before getting this error, but now this error is making it impossible to download several years of data…

Hey @jake, can you tell us again what the current rate limits are? I have code that works correctly for several hours and then all of a sudden starts giving me the 429 error. Then if I wait a bit (some hours), it works again. This seems to indicate that there is some other limit besides the per-second rate limit.

@jake, I’d echo what @jbayham and @ariadnaaz13 are saying here. Even when I include Sys.sleep breaks in the API calls to avoid making more than 2 API calls per second, I still end up rate limited.

@max_kagan The rate limit seems a bit unpredictable at times. I think it is easiest to just build in a catch, so when you get the 429, wait for some time (I did 15 min). See my code above for an example in R.

The issue causing the frequent rate limit error has been resolved! Please try your download again if you were experiencing interruptions.

Thank you all for your patience and communication on this!

If you are still experiencing occasional rate limit errors, you can try implementing a helper function similar to what @jbayham proposed: Python Code Examples

Thank you again!