Update roadmap #471
First batch (50 issues, ending on issue #317): 33/50 were closed.
Second batch: 53/100 reviewed are closed in total; from the last 50, only 20 issues were closed. At least 3 questions are about authentication on servers (workaround: copy a token created on the local computer). The package has functions following this pattern:

```r
A <- function(x, y, z) {
  do.call(A_, list(x = x, y = y, z = z))
}

A_ <- function(x, y, z) {
  # Lots of stuff
}
```

This could be updated to:

```r
A <- function(x, y, z) {
  # Do stuff
}
```

This would be slightly faster and wouldn't pollute the namespace so much. It also seems that the package assumes that only one token is in use. As an issue mentions, support for multiple tokens would be great; this would solve problems with bearer tokens and others.
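As a hedged sketch of the refactor described above (toy function names and a toy body, not rtweet's actual code), the wrapper-plus-internal pattern and its inlined replacement return the same result:

```r
# Toy illustration of the pattern discussed above; A_old, A_impl and
# A_new are placeholder names, not rtweet functions.

# Before: exported wrapper forwards all arguments to an internal helper.
A_old <- function(x, y, z) {
  do.call(A_impl, list(x = x, y = y, z = z))
}
A_impl <- function(x, y, z) {
  x + y * z
}

# After: the body is inlined, avoiding the extra call and the extra
# internal object in the package namespace.
A_new <- function(x, y, z) {
  x + y * z
}

A_old(1, 2, 3) == A_new(1, 2, 3)  # TRUE
```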
Update: 79/167 closed (79 of the 150 reviewed so far); from the last 50, 26 were closed.
Finished reviewing them: 85/167 closed, so about half of the issues remain.
Following some discussions, the next release will have breaking changes and thus be a major version change?
Note to self: document that multiple images with multiple alt texts work, but multiple images with a single alt_text doesn't.
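A minimal sketch of that behavior, assuming `post_tweet()` accepts `media` and `media_alt_text` arguments (file names here are hypothetical, and the actual calls need an authenticated token, so they are left commented out):

```r
# Hypothetical media files; posting requires rtweet authentication.
media <- c("plot1.png", "plot2.png")
alt_text <- c("First plot", "Second plot")

# Works: one alt text per image, lengths match.
# post_tweet("Two plots", media = media, media_alt_text = alt_text)

# Doesn't work: two images but a single alt text.
# post_tweet("Two plots", media = media, media_alt_text = "One alt text")

# The supported case pairs alt texts with images one-to-one:
length(alt_text) == length(media)  # TRUE
```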
Seems that this is no longer the plan.
Roadmap to update rtweet and clean up issues and PR listed below (on date 2021/02/15):
Code to return this list of issues/PRs (ID, title, comments, last update date):

```r
library("tidyverse")
remotes::install_github("llrs/socialGH")
socialGH::get_issues("ropensci/rtweet") %>%
  filter(state != "closed") %>%
  arrange(id) %>%
  mutate(fix = " - [ ]") %>%
  select(fix, id, title, n_comments, updated) %>%
  knitr::kable() %>%
  gsub("\\|", ",", .) %>%
  gsub("^,\\s*", "", .) %>%
  gsub(",\\s*$", "", .) %>%
  gsub(",\\s*", ", ", .) %>%
  gsub("\\s*,", ",", .) %>%
  gsub("\\], ", "] #", .) %>%
  # gsub("\\] ([0-9]+)", "] #\\2", .) %>%
  cat(file = "status.txt", sep = "\n")
```

- [ ] #164, `get_thread()` useful?, 6, 2018-02-08 01:07:52
- [ ] #338, Export `home_user()`, 0, 2019-07-05 14:49:44
- [ ] #369, Accessing `counts` endpoint, 6, 2020-02-18 21:40:36
- [ ] #388, Recovering tweet contents from temp files after using `search_30day()`, 1, 2020-02-24 14:47:21
- [ ] #422, fixes reference to old and missing vignette in `get_token`, `get_tokens`, 0, 2020-05-15 01:18:54