I am trying to find the top followers of the Maybelline brand on Twitter, ranked by how many tweets they have posted about the brand. The brand has about 600K followers, and when I try to retrieve them the code keeps running for hours. Is there a more efficient way to do this? I am using the code below, after setting up Twitter authentication. I want the top 50 followers who tweeted the most about Maybelline.
library(twitteR)
# assumes setup_twitter_oauth() has already been run
user <- getUser('Maybelline')
user$toDataFrame()
followers <- user$getFollowers()
Thanks
While working with the Twitter API, it's useful to familiarize yourself with its limits. You have two main limits for a GET request: a rate limit (how many requests you can make in a 15-minute window) and a limit on how many results a given call returns.
In your scenario, you are using the GET followers/list endpoint from their API. You can read the docs for that here. That endpoint returns a list of followers and is limited to 20 followers per request and 15 requests per 15 minutes, meaning that in a 15-minute window you can only retrieve 15 * 20 = 300 users. So retrieving 600K followers would take a very long time (30K minutes = 500 hours = ~21 days).
It would be more efficient to use GET followers/ids, which returns up to 5K user IDs per request with the same 15-requests-per-15-minutes rate limit. Twitter API reference here. You can use this in conjunction with GET users/lookup, which returns up to 100 users per request and has a rate limit of 900 requests per 15 minutes. This means it would take 2 hours (at 75K IDs per 15-minute window) to get the 600K follower IDs, and less than 2 hours to get the corresponding user objects (at 90K users per 15 minutes).
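For example, with the twitteR package used in the question, a rough sketch of that two-step approach might look like the following. This is only a sketch: it assumes you are already authenticated, and it does not handle the rate-limit pauses you would still need between batches.

library(twitteR)

user <- getUser('Maybelline')

# followers/ids: up to 5,000 IDs per request, 15 requests per 15 minutes
follower_ids <- user$getFollowerIDs()

# users/lookup: up to 100 full user objects per request,
# so look the IDs up in batches of 100
id_batches <- split(follower_ids, ceiling(seq_along(follower_ids) / 100))
followers <- do.call(rbind, lapply(id_batches, function(batch) {
  twListToDF(lookupUsers(batch))
}))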
The rate limits can change depending on how the package you are using handles authentication. If you are logging in as a Twitter user with your own credentials, then the above rate limits are correct. If you are using application-only credentials, then getting the full user objects will take 3x longer, as users/lookup has a rate limit of 300 requests per 15 minutes in that case, or 30K users per 15 minutes. This answer has some good information on rate limits.
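You can also check which limits apply to your session directly; twitteR can report the current rate-limit status. A quick check might look like this (assuming you are already authenticated; the exact resource names may vary slightly between API versions):

library(twitteR)

# Remaining calls and reset times for the follower and user endpoint families
rate_info <- getCurRateLimitInfo(c('followers', 'users'))
subset(rate_info, resource %in% c('/followers/ids', '/users/lookup'))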
Firebase Analytics reports data w.r.t. Daily/Weekly/Monthly Active Users.
A few questions:
(1) Dashboard:
Projecting the Daily Active Users out to a month does not match the value shown in the Firebase dashboard.
For example, if Daily Active Users is 30K, then Firebase shows the corresponding Monthly Active Users as 150K.
Does it imply that there were 30K users in the last 7 days, and 120K in the preceding 21 days?
Not sure why it isn't 30 days x 30K = 900K.
(2) On selecting Firebase > Events > Select_Content > App version
Last 7 days: shows approx 100K
Last 30 days: shows approx 140K
Does it imply that in the remaining 21-day period only 40K user sessions occurred, while app usage went up drastically in the last 7 days?
Please help clarify.
Thanks in advance.
The Active Users report in the Firebase dashboard shows counts of users over the past 30, 7 and 1 days. The values are not projected; they are based on user engagement measured over those periods. The other thing to keep in mind is that, for each of those periods, it's the count of unique users over the entire period.
So, for example, if you're seeing 150K Monthly Active Users (which is defined here as 30-day active users), that tells you you've had 150K unique users engage with your app in the last 30-day period. If you're seeing 30K Daily Active Users, that tells you you had 30K unique users yesterday, and 120K different unique users across the 29 days before yesterday.
If the same user engages with your app more than once in the period, they only count as one. Out of your 30K users from yesterday, a number of those would presumably also have engaged in the 29 days before that, so it's expected that your Monthly Active Users will be less than your Daily Active Users x 30 days. How much lower depends on the specifics of your app, but the closer those numbers are, the more frequently the same users are returning to your app over the 30 days, which is positive in terms of user engagement and retention.
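To make the unique-count behaviour concrete, here is a toy illustration in R with made-up engagement data, showing why summing the daily counts overstates the 30-day figure:

# Toy data: one row per (user, day) engagement over a 30-day window
set.seed(1)
events <- data.frame(
  user = sample(1:150000, 900000, replace = TRUE),  # the same users come back on several days
  day  = sample(1:30, 900000, replace = TRUE)
)

# "Daily active" counts each user once per day...
daily_active <- tapply(events$user, events$day, function(u) length(unique(u)))

# ...but "30-day active" counts each user only once for the whole period
monthly_active <- length(unique(events$user))

sum(daily_active)  # far larger than monthly_active, because returning users are re-counted each day
monthly_active     # unique users across the whole 30 days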
I'm using Azure Text Analytics for Sentiment Analysis. I was wondering if the API is limited to 100 requests per minute or if there is any request limit for the service.
When I tried to request more than 100 times within a minute, the API returned an empty document.
As mentioned at https://learn.microsoft.com/en-us/azure/cognitive-services/Text-Analytics/overview:
Limits
Maximum size of a single document: 5,000 characters as measured by String.Length.
Maximum size of entire request: 1 MB
Maximum number of documents in a request: 1,000 documents
The rate limit is 100 calls per minute. Note that you can submit a large quantity of documents in a single call (up to 1000 documents).
This poster had a similar issue and developed their own rate limiter for Cognitive Services.
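Given those limits, the usual approach is to batch documents (up to 1,000 per call) and pace the calls on the client side. Below is a rough sketch with httr; the endpoint, region and key are placeholders, and it assumes the v2.0 sentiment API described above:

library(httr)
library(jsonlite)

endpoint <- "https://<your-region>.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment"
api_key  <- "<your-key>"

score_sentiment <- function(texts) {
  # Split into batches of at most 1,000 documents per request
  batches <- split(texts, ceiling(seq_along(texts) / 1000))
  results <- list()
  for (i in seq_along(batches)) {
    docs <- lapply(seq_along(batches[[i]]), function(j) {
      list(id = as.character(j), language = "en", text = batches[[i]][[j]])
    })
    resp <- POST(endpoint,
                 add_headers("Ocp-Apim-Subscription-Key" = api_key),
                 body = toJSON(list(documents = docs), auto_unbox = TRUE),
                 content_type_json())
    results[[i]] <- content(resp)
    # Crude client-side rate limiter: stay under 100 calls per minute
    Sys.sleep(0.7)
  }
  results
}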
I was wondering if anyone knew the maximum number of LinkedIn connections a user can add per day?
Thanks!
The maximum number of LinkedIn connection requests per day is 50. Previously it was 300, and it has been reduced year by year. If you are a premium user there is no restriction on connection requests, only on mails; if you are logged in as a normal user, it's 50.
https://www.linkedin.com/pulse/how-get-around-new-linkedin-connection-request-limit-2021-alex-gray/ says it's probably ~100 per week.
https://evaboot.com/blog/how-many-connection-requests-send-on-linkedin and https://expandi.io/blog/linkedin-connections-limit/ and https://www.paulgreensmspmarketing.com/the-new-linkedin-connection-request-and-what-it-means-for-msps/ seem to agree.
And the last one says there is an "ultimate cap of 30,000 connections".
I'm testing out the HERE API for geocoding purposes. Currently in the evaluation period, some of my tests include geocoding as many as 400 addresses at a time (later I may rarely hit 1000). When I tried this with Google Maps, they would give me an error indicating I'd gone over the rate limit, but I have not gotten such an error from the HERE API despite not limiting the rate of my requests (beyond waiting for one to finish before sending the next).
But in the Developer FAQ the Requests Per Second limit is given as:
Plan       Public Plans   Business Plans
Basic      1              N/A
Starter    1              1
Standard   2              2
Pro        3              3
Which seems ridiculously slow. 1 request per second? 3 per second on the highest plan? Is this chart a typo? If so, what are the actual limits? If not, what kind of error should I expect if I exceed that limit?
Their documentation states that the RPS means "for each Application the number of Requests per second to HERE Services calculated as an average (number of Requests during a period of 5 minutes) to all of the APIs used to access the features listed for each subscription plan".*
They say later in the documentation that quota is calculated monthly: "When a usage record is loaded into our billing system that results in a plan crossing its monthly quota, the price applied to that usage record is pro-rated to account for the portion that is included in your monthly quota for free and the portion that is billable. Subsequent usage records above your monthly quota will show at the per transaction prices listed on this website."*
Overages are billed at 200 requests per $1 USD for Business plans or 2,000 requests per $1 USD for Public plans. So for the Pro plan, you will hit your limit if you use more than 7.779 million API requests in any given month; any usage beyond that would be billed at the rates above.
* Excerpts taken from the Developer FAQ linked above.
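In practice, that averaging means you can stay within the published RPS by pacing requests on the client side rather than waiting for an error. A minimal sketch in R is below; the endpoint and the app_id/app_code credentials are placeholders based on the 6.2 geocoder, so adjust them to whatever your plan and API version use:

library(httr)

geocode_url <- "https://geocoder.api.here.com/6.2/geocode.json"
app_id   <- "<your-app-id>"
app_code <- "<your-app-code>"

geocode_addresses <- function(addresses, rps = 1) {
  results <- vector("list", length(addresses))
  for (i in seq_along(addresses)) {
    resp <- GET(geocode_url, query = list(
      app_id     = app_id,
      app_code   = app_code,
      searchtext = addresses[[i]]
    ))
    results[[i]] <- content(resp)
    # Pace the calls so the 5-minute average stays at or below the plan's RPS
    Sys.sleep(1 / rps)
  }
  results
}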
When using https://www.linkedin.com/countserv/count/share?format=json&url= to access an article's share count, is there a daily API limit?
We noticed that retrieving the count data was taking as much as 20 seconds on our production server. We added logic to cache the counts, and the 20-second delay stopped the next day. We are left wondering, though, what the limit might be (we can't seem to find it in your documentation).