Using R: Get a user's followers from a specific location - r

I started to learn R, but now I am stuck.
I want to analyse the followers of a specific Twitter account. The problem is that the profile has a lot of followers, so fetching all of them would take a long time, and I am only interested in the followers from Switzerland.
So I wonder: is it possible to load only the data of followers who are located in Switzerland?
This is what I already have:
library("twitteR")
consumer_key <- "my_key"
consumer_secret <- "my_secret"
access_token <- "my_token"
access_secret <- "my_secret"
options(httr_oauth_cache = TRUE) # cache OAuth access credentials in a local file between R sessions
setup_twitter_oauth(consumer_key,
                    consumer_secret,
                    access_token,
                    access_secret)
[1] "Using direct authentication"
trump <- getUser("RealDonaldTrump")
follower <- trump$getFollowers(retryOnRateLimit=180)
The last line of code would obviously take hours to run, which is why I need a better solution. Thanks :)

Could you elaborate on what information you want about each follower? Do you want a count of how many followers list "Switzerland" as their home country? Or do you want more information about each user?
My understanding is that the API doesn't permit filtering of the followers' output on a field such as country. Thus, it seems to me that one would need to collect all users' information and then filter on country after the fact.
I collected the user ID numbers for all of Donald Trump's followers in June 2016 (when he had fewer than 10 million followers, I think). It took some time, but with the smappR package and its smappR::getFriends function it was easy to do. I'm sure it will take longer now that he has many more followers, but the procedure with smappR::getFriends should work.
It will, however, require some additional time to download the user information for each user ID. I think you'll need to make a separate query to the Twitter API to get the user information, as smappR::getFriends returns only the user IDs (and maybe the user names). You would then need to query the API with a function like smappR::getUsers to get their user information, including country of residence. I admit that my understanding of the Twitter APIs is incomplete, but I hope this response helps.
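
A minimal sketch of that collect-then-filter step, reusing the follower list from the question. Note that Twitter profiles only expose a free-text location field (not a country code), so the filter below just matches common spellings of Switzerland; treat it as an illustration rather than a definitive solution.

# 'follower' is the list of user objects from the question (slow to download)
followers_df <- twListToDF(follower)               # one row per follower
swiss <- followers_df[grepl("switzerland|schweiz|suisse",
                            followers_df$location, ignore.case = TRUE), ]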

Related

LinkedIn API to get a list of a user's connections

I am working on a research project based on the six degrees of separation theory (based on the work of Stanley Milgram, 1967, and later Duncan Watts, 2001). I want to plot the connections of every user taking part in the study.
I wanted to show a user's list of connections from LinkedIn. I read that there used to be something like a People API that could be used, but I think it is deprecated now. Is there any way of getting a user's list of connections, even with the user's permission?
I also read that there is something like a LinkedIn Partner Program, but I cannot find much information about how to apply for it.

Using LinkedIn API to retrieve advertising reports

I'm working on a simple app to programmatically retrieve ads performance data from LinkedIn. I have general API experience, but this is the first time I'm getting my feet wet with the LinkedIn API.
One example from the LinkedIn API documentation suggests something that would get me started:
GET https://api.linkedin.com/v2/adAnalyticsV2?q=analytics&dateRange.start.month=1&dateRange.start.day=1&dateRange.start.year=2016&timeGranularity=MONTHLY&pivot=CREATIVE&campaigns=urn:li:sponsoredCampaign:112466001
I am encountering two problems:
First, this example implies that you already know the campaign ID. However, I am unable to find a way to retrieve a list of campaign IDs for a given account.
Second, if I manually pull a campaign ID, I receive an error: "{"serviceErrorCode":2,"message":"Too many fields requested. Maximum possible fields to request: 20","status":400}". A pretty clear error.
A little research tells me that by adding the "&fields=" parameter I will be able to limit my query to fewer than 20 fields (I really only need about a dozen anyway), but I can't find any documentation regarding the names of the fields available.
Any help or pointers will be appreciated.
Please refer to the link below and scroll down; you will see the field names listed as metrics. These are the fields you can request.
https://learn.microsoft.com/en-us/linkedin/marketing/integrations/ads-reporting/ads-reporting?tabs=http#analytics-finder
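
For illustration, here is a rough sketch in R (httr) of the request from the question with the fields parameter applied. The metric names below are only examples, not a definitive list, so check the metrics table in the linked page, and substitute a real access token for the LINKEDIN_TOKEN environment variable.

library(httr)
resp <- GET("https://api.linkedin.com/v2/adAnalyticsV2",
            query = list(q = "analytics",
                         dateRange.start.day = 1,
                         dateRange.start.month = 1,
                         dateRange.start.year = 2016,
                         timeGranularity = "MONTHLY",
                         pivot = "CREATIVE",
                         campaigns = "urn:li:sponsoredCampaign:112466001",
                         # example metric names only; keep the list under the 20-field cap
                         fields = "impressions,clicks,costInLocalCurrency"),
            add_headers(Authorization = paste("Bearer", Sys.getenv("LINKEDIN_TOKEN"))))
content(resp)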

Find out when a user followed an account - rtweet

I'm curious to see when accounts started following me on Twitter (and when I started following accounts). It'd be interesting to see my user activity related to the types of accounts I follow, as well as maps of my followers/followings over time + season.
I've tried getting followers and lookup users in the following manner:
followers <- get_followers("twitterhandlehere", n = 50)
followers_data <- lookup_users(followers$user_id)
followers_data is a data frame with user info including profile picture, bio, and when the user's account was created, but nowhere in there does it indicate when the follow relationship started, as far as I can tell.
Nor does this function seem to indicate the date on which the follow relationship started:
lookup_friendship("BarackObama", "MyUsername")
It appears the API didn't support this functionality in the past, and I understand I can stream this data going forward - but is there any way to recover this information for past data?
No, this is not available in the API. You would have to have been regularly polling the friends and followers endpoints to record those changes. You cannot discover it from the API at a specific point in time; you would have to build the record of follower-list changes yourself.
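
As a rough illustration of that polling approach (nothing rtweet provides out of the box), one could take dated snapshots of the follower list and diff them later. The file naming is an assumption, and the user_id column name may differ across rtweet versions.

library(rtweet)
snapshot_followers <- function(user) {
  ids <- get_followers(user, n = 5000)$user_id                     # follower IDs at this moment
  saveRDS(ids, sprintf("followers_%s_%s.rds", user, Sys.Date()))   # dated snapshot on disk
  invisible(ids)
}
# Run on a schedule (e.g. daily via cron). New follows since the last snapshot:
# setdiff(readRDS("followers_me_2020-01-02.rds"), readRDS("followers_me_2020-01-01.rds"))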

Downloading tweets with twitteR does not work as hoped

I'm fairly new to R; I use it for a course on network analysis at my university.
As part of a research project, I want to analyse tweets by Donald Trump and Hillary Clinton. I successfully managed to grant RStudio access to my Twitter account, but every time I try to download tweets I get a fairly meager selection, ranging from about 1,100 tweets at best to just 800-900 at worst. I do not understand this, as I do not get any error message either. Am I missing something? I thought the limit on downloading tweets was 3,200?
This is my code:
#load twitteR package and necessary tool for login
library(twitteR)
library(ROAuth)
#load login data
api_key <- "blah"
api_secret <- "blah"
access_token <- "blah"
access_token_secret <- "blah"
#login
setup_twitter_oauth(api_key,api_secret,access_token,access_token_secret)
#retrieve tweets by Donald Trump, maximum number is 3200
tweetsTrump <- userTimeline("realDonaldTrump", n=3200)
#convert those tweets to a dataframe
Trump.df <- twListToDF(tweetsTrump)
I am eternally grateful for every useful tip!
Have a look at the Twitter API documentation where it says:
The Twitter Search API searches against a sampling of recent Tweets published in the past 7 days.
Before getting involved, it's important to know that the Search API is focused on relevance and not completeness. This means that some Tweets and users may be missing from search results. If you want to match for completeness you should consider using a Streaming API instead.
Thus, the results from the API are inherently limited. If you want more, use the Streaming API or a service like Gnip.
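
If you go the streaming route suggested above, a minimal streamR sketch looks roughly like the following. Here my_oauth is assumed to be a previously saved OAuth credential object (streamR needs its own credentials; see its documentation), and the numeric follow ID is an assumption you should verify for the account you want. Keep in mind that streaming only captures tweets posted while the connection is open, not historical ones.

library(streamR)
# collect live tweets from one account for an hour into a JSON file
filterStream(file.name = "trump_stream.json",
             follow = "25073877",   # assumed numeric ID for @realDonaldTrump; verify before use
             timeout = 3600,
             oauth = my_oauth)
tweets <- parseTweets("trump_stream.json")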

twitteR R package: How to get as many tweets as possible per account within API limits

I'm a novice R and twitteR package user but I wasn't able to find a strong recommendation on how to accomplish the following.
I'd like to mine a small number of Twitter accounts to identify keyword usage in their output (i.e. I don't know what the keywords are yet).
Assumptions:
I have a small number of Twitter accounts (<6) I want to mine, with a maximum of about 7,000 tweets if you aggregate the various accounts' statuses
Those accounts are not generating new tweets at a fast rate (a few a day)
The accounts all have fewer than 3,200 tweets according to the profile data returned by lookupUsers()
When I use the twitteR function userTimeline("accountname", n=3200) I get between 40 and 600 observations returned, i.e. nowhere near 3,200. I know there are API limits, but if it were a limits issue I would expect to get the same number of observations back, or a notice that I need to wait 15 minutes.
How do I get all the text I need while still playing nice?
By using a combination of CRAN and GitHub packages it was possible to get all the tweets for a user.
The packages used were streamR, available on CRAN, and smappR (https://github.com/SMAPPNYU/smappR/) to help with getting and analysing the tweets.
The basic steps are:
Authenticate to Twitter using OAuth and your Twitter keys, tokens and secrets
Use the smappR function getTimeline(), which saves the tweets to a JSON file you specify
Use parseTweets(jsonfile) to read the JSON contents into a data frame
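
A rough sketch of those steps, assuming an OAuth token folder at "~/credentials"; the exact getTimeline() argument names are taken from memory of the smappR README and may differ slightly, so check the package documentation.

library(streamR)
library(smappR)
getTimeline(filename = "account_tweets.json",   # tweets are appended to this JSON file
            screen_name = "accountname",
            n = 3200,
            oauth_folder = "~/credentials")
tweets_df <- parseTweets("account_tweets.json") # read the JSON file into a data frame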
This can also be accomplished with the rtweet package, which is still supported. First you need to be approved as a developer and create an app. (As a note, Twitter has now changed its policies and approval can take a while; it took me almost a week.)
After that, just use get_timeline() to get all of the tweets from a timeline, up to 3,200:
djt <- get_timeline("adamgreatkind", n = 3200)
