Avoid Rate limit with rtweet get_timeline()


Is there any way to stop my loop from being interrupted by the rate limit? If possible, I would like my code to wait until the rate-limit window has passed before continuing.



A side question: I thought about parallelizing the for loop. Do you think that would be a good idea? I was not sure whether there would be a chance of data being written to the wrong file.



library(rtweet)

create_token(app = "Arconic Influential Followers", consumer_key, consumer_secret)

flw  <- get_followers("arconic")
fds  <- get_friends("arconic")
usrs <- lookup_users(c(flw$user_id, fds$user_id))

for (i in seq_along(usrs$user_id)) {
  # fetch the user's timeline; on error, report it and keep going
  a <- tryCatch(get_timeline(usrs$user_id[i]),
                error = function(e) message(e))
  # write each user's tweets to a CSV named after the user ID
  tryCatch(save_as_csv(a, usrs$user_id[i]),
           error = function(e) message(e))
}
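
On the side question: because each iteration writes only to its own per-user file, parallel workers would not collide on output files, though they would share (and exhaust faster) the same rate limit. A minimal sketch of that idea with the base parallel package, assuming the rtweet token is available to each worker; fetch_and_save is a hypothetical helper, not part of the original post:

library(parallel)

# hypothetical helper: fetch one user's timeline and save it to its own CSV
fetch_and_save <- function(user_id) {
  a <- tryCatch(get_timeline(user_id), error = function(e) NULL)
  if (!is.null(a)) tryCatch(save_as_csv(a, user_id), error = function(e) NULL)
  invisible(user_id)
}

cl <- makeCluster(4)
clusterEvalQ(cl, library(rtweet))                    # load rtweet on every worker
res <- parLapply(cl, usrs$user_id, fetch_and_save)   # one file per user, no clashes
stopCluster(cl)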

Tags: r, twitter, parallel-processing






asked Feb 3 '17 at 13:59 by Brent Ferrier

2 Answers

What I ended up doing was creating a while loop that checked the number of records left in my Users vector, ran my for loop, and then put the system to sleep for 15 minutes. This approach works, but there are some things to account for. I have the while loop break at 200 in case there were users with no data to save to a CSV. That turned out to be a good move because, if you notice, the for loop starts iterating at 80: as you move across your vector of users, the good users are removed iteratively, which leaves only the users that cause errors. An improvement for someone up to the task would be to handle those error users programmatically (see the sketch after the code below).



Users <- usrs$user_id
# recover the IDs already saved to disk by stripping the 11-character
# ".tweets.csv" suffix from the existing file names
goodUsers <- substring(list.files(), 1, nchar(list.files()) - 11)
Users <- setdiff(Users, goodUsers)

while (length(Users) > 200) {
  for (i in 80:length(Users)) {
    a <- tryCatch(get_timeline(Users[i], usr = FALSE),
                  error = function(e) message(e))
    tryCatch({
      save_as_csv(a, Users[i])
      goodUsers <- append(goodUsers, Users[i])
    }, error = function(e) message(e))
  }
  # drop the successfully saved users, then wait out the 15-minute window
  Users <- setdiff(Users, goodUsers)
  Sys.sleep(900)
}

length(Users)
length(goodUsers)
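
The programmatic improvement hinted at above could track the users that keep failing and drop them explicitly, instead of relying on the hard-coded start-at-80 and stop-at-200 cutoffs. A rough sketch of that idea, reusing the Users and goodUsers vectors from the answer (badUsers is a new name introduced here):

badUsers <- character(0)

for (u in setdiff(Users, c(goodUsers, badUsers))) {
  a <- tryCatch(get_timeline(u, usr = FALSE), error = function(e) NULL)
  if (is.null(a)) {
    badUsers <- c(badUsers, u)     # remember the failure; skip u on later passes
  } else {
    save_as_csv(a, u)
    goodUsers <- append(goodUsers, u)
  }
}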





answered Feb 8 '17 at 14:11 by Brent Ferrier

• From http://rtweet.info/: "Twitter rate limits cap the number of search results returned to 18,000 every 15 minutes. To request more than that, simply set retryonratelimit = TRUE and rtweet will wait for rate limit resets for you."
  – csmontt, Mar 28 '18 at 18:44
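
That flag lets rtweet do the waiting itself. A one-line sketch of what the comment describes; retryonratelimit is documented for search_tweets(), and whether your version of get_timeline() also accepts it depends on the rtweet release:

# rtweet sleeps through rate-limit resets until n results are collected
tweets <- search_tweets("arconic", n = 50000, retryonratelimit = TRUE)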

I was able to resolve it by wrapping the get_timeline() call in the code below. The function get_timeline_unlimited calls itself recursively after waiting the time required for the rate limit to reset. So far it has worked well for me with no issues.



library(rtweet)
library(glue)    # string interpolation for the progress messages
library(dplyr)   # bind_rows()

get_timeline_unlimited <- function(users, n) {

  if (length(users) == 0) {
    return(NULL)
  }

  rl <- rate_limit(query = "get_timeline")

  if (length(users) <= rl$remaining) {
    print(glue("Getting data for {length(users)} users"))
    tweets <- get_timeline(users, n, check = FALSE)
  } else {
    if (rl$remaining > 0) {
      # use up whatever headroom is left before waiting
      users_first <- users[1:rl$remaining]
      users_rest  <- users[-(1:rl$remaining)]
      print(glue("Getting data for {length(users_first)} users"))
      tweets_first <- get_timeline(users_first, n, check = FALSE)
      rl <- rate_limit(query = "get_timeline")
    } else {
      tweets_first <- NULL
      users_rest   <- users
    }
    # rl$reset is the time until the window resets, in minutes
    wait <- rl$reset + 0.1
    print(glue("Waiting for {round(wait, 2)} minutes"))
    Sys.sleep(wait * 60)

    tweets_rest <- get_timeline_unlimited(users_rest, n)
    tweets <- bind_rows(tweets_first, tweets_rest)
  }
  return(tweets)
}
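
Called in place of the raw get_timeline(), the wrapper can then take the whole user vector at once. A hypothetical call with the question's data (the n value and file name below are illustrative):

all_tweets <- get_timeline_unlimited(usrs$user_id, n = 200)
save_as_csv(all_tweets, "arconic_timelines")   # illustrative file name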






answered Mar 8 at 22:13 by Sasha