0.6.7 get_timeline: rate limit error #266
Comments
I have figured out what your problem was. There is a rate limit on the number of times you can call rate_limit() itself, which is 180. If you call rate_limit() and look at the 11th row you will see it. This is what it looks like for me:
   query                         limit remaining reset            reset_at            timestamp           app
11 application/rate_limit_status   180       178 14.8827226996422 2018-07-10 00:09:04 2018-07-09 23:54:12 NetworkVoyager2
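That row can be pulled out directly. This is a sketch against rtweet 0.6.x, assuming rate_limit() returns a data frame with the column names shown above (query, limit, remaining, reset); check names(rl) on your version before relying on them:

```r
library(rtweet)

# Fetch the full rate-limit table and keep only the row that governs
# rate_limit() itself (assumed column name "query", as in 0.6.x output)
rl <- rate_limit()
rl[rl$query == "application/rate_limit_status",
   c("query", "limit", "remaining", "reset")]
```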
Having the same issue here, and I am pretty confident that the answer is not simply the limit on rate-limit checks. This limit can be addressed, for example, by adding some sleep time when a call hits the limit; I put together a very rough fix for this. It seems like, in some cases where the limit is exceeded, the call errors out instead of waiting.
New to rtweet and am now encountering the issue discussed in this thread. To summarize: the Twitter API rate limit for get_timeline() is 900, but the get_timeline() function itself makes a call to rate_limit(), and the Twitter API rate limit on rate_limit() is 180. Thus, in practice the limit on get_timeline() is 180 per 15 minutes when it could be as high as 900. Since Twitter itself allows many more calls, there should be a way around this. Are the original authors considering a modified function, or has anyone else found a way around this? Thank you.
My apologies, I now figured out that this is controlled by an argument of the function: setting check = FALSE will resolve this. But the documentation could be clearer on this.
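Based on that comment, the workaround looks like this. A sketch only: it assumes the check argument of get_timeline() in rtweet 0.6.x skips the internal rate_limit() call, so that only Twitter's own 900-call statuses/user_timeline limit applies:

```r
library(rtweet)

# Skip rtweet's internal rate-limit check (assumed behavior of check = FALSE),
# so requests count only against the statuses/user_timeline limit of 900/15 min
tmls <- get_timeline(users, n = 200, check = FALSE)
```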
Hey guys, I am trying to download tweets with a specific hashtag (#PRIDE). I used the tips above, but every time I run the code to download the tweets, the loading bar reaches only 10%-20%. Any ideas? Thank you.
I was able to resolve this issue by wrapping the
This worked very well for me. Thanks for posting.
Problem
Hi, first of all, thank you for developing such a powerful package for R. Currently, I am struggling a bit with the get_timeline() function. Instead of getting the tweets of 900 users (the Twitter API limit), I get a rate limit error ("Warning: Rate limit exceeded - 88 Error in if (n%/%200 < n.times) { : argument is of length zero") when trying to scrape the data of only 250 users.
What I already did:
a) Update rtweet to 0.6.7.9000
b) Filter out protected Twitter users
c) Filter out users without tweets
d) Check whether the account still exists
e) Print out the rate limit as part of a loop to see where the function stops working. I didn't observe any pattern. Sometimes it stopped after 64 users, 120 users, 153 users, etc.
f) Use the following loop:
data <- vector("list", length(users))  # pre-allocate the result list
for (i in seq_along(users)) {
  data[[i]] <- get_timeline(users[i], n = 10)
  rl <- rate_limit(token, "statuses/user_timeline")
  if (rl$remaining == 0L) {
    Sys.sleep(as.numeric(rl$reset, "secs"))
  }
}
The loop is not working because get_timeline() stops with the error above before rl$remaining reaches 0L.
It would be great if someone has any ideas to solve the problem.
Thank you!
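Combining the suggestions in this thread, one possible workaround for the loop above is the following sketch. It is untested against the live API and assumes that check = FALSE is available on your rtweet version (so get_timeline() does not burn the 180-call rate_limit() budget internally) and that rate_limit() returns a reset column as a difftime:

```r
library(rtweet)

data <- vector("list", length(users))
for (i in seq_along(users)) {
  # check = FALSE (assumed) avoids the 180-call limit on rate_limit() itself
  data[[i]] <- tryCatch(
    get_timeline(users[i], n = 10, check = FALSE),
    error = function(e) NULL  # skip accounts that error out, keep looping
  )
  # Consult the user-timeline limit manually and sleep it off near exhaustion
  rl <- rate_limit(token, "statuses/user_timeline")
  if (NROW(rl) > 0 && rl$remaining <= 1L) {
    Sys.sleep(as.numeric(rl$reset, "secs"))
  }
}
```

Sleeping at remaining <= 1L rather than 0L leaves a small buffer, since the loop's own rate_limit() check and get_timeline() call are not atomic.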
Error
Warning: Rate limit exceeded - 88
Error in if (n%/%200 < n.times) { : argument is of length zero
Code
fds <- get_followers("potus", n = 1000)
fds_data <- lookup_users(fds$user_id)
users <- fds_data$screen_name[ order(fds_data$followers_count, decreasing = TRUE)][1:250]
tmls <- get_timeline(users, n = 10)
rtweet version
0.6.7.9000
Session info
Matrix products: default
locale:
[1] LC_COLLATE=German_Germany.1252 LC_CTYPE=German_Germany.1252 LC_MONETARY=German_Germany.1252
[4] LC_NUMERIC=C LC_TIME=German_Germany.1252
attached base packages:
[1] stats graphics grDevices utils datasets methods base
other attached packages:
[1] httpuv_1.4.4.1 rtweet_0.6.7.9000
loaded via a namespace (and not attached):
[1] httr_1.3.1 compiler_3.5.0 magrittr_1.5 R6_2.2.2 promises_1.0.1 later_0.7.3 tools_3.5.0
[8] pillar_1.2.3 tibble_1.4.2 curl_3.2 Rcpp_0.12.17 jsonlite_1.5 rlang_0.2.1 openssl_1.0.1
Token
request: https://api.twitter.com/oauth/request_token
authorize: https://api.twitter.com/oauth/authenticate
access: https://api.twitter.com/oauth/access_token
rtweet_token_bk
key: eSp0********************
secret: oauth_token, oauth_token_secret, user_id, screen_name