I need to analyze tweets on a specific topic, but my application for Twitter API access was not approved. Instead, I tried to do it manually using Twitter Advanced Search. However, besides being more burdensome than the easy-to-use Twitter API, advanced search doesn't seem to retrieve all the relevant tweets containing a particular keyword, based on the few cases I tested.
So, there are two questions. Firstly, am I right that advanced search returns incomplete results? Secondly, is there another way (or workaround) to use the API without needing approval?
Specifically, is there a limit on the results advanced search returns, or does it provide all possible results, just like the API?
I am trying to use R to scrape the whole conversation thread of a Twitter status. I am exploring rtweet, which is the latest package for extracting Twitter data in R, but I could not find a way to do this. I wonder if anybody could help.
Thanks
I want to use the advanced search options from the link https://twitter.com/search-advanced?lang=en to extract Twitter data.
I want to do this in R. How can this be achieved?
The twitteR package only offers the searchTwitter function, which doesn't support advanced search.
Any pointers will be helpful
Is there a way to search for Twitter users who have a certain keyword in their 'description' field? Right now my best thought is to write a loop which will sequentially run through every user ID, search the 'description' field, and then only save the users which have that keyword.
Looping through every Twitter ID out there seems excessive! Is there a better way or method?
Sub-question: are there packages beyond twitteR and streamR for Twitter analysis in R?
P.S. as this is an entirely conceptual question, it was judged that no reproducible code was necessary... some can be provided if the question is unclear.
Thanks!
As you mention this is an entirely conceptual question:
The Twitter API offers searching users by profile description keywords via the 'q' parameter of the users/search endpoint: https://dev.twitter.com/rest/reference/get/users/search
You can even OAuth in the link above if you have the credentials and test your query with 'curl'. If you simply don't want to build the query, just for the sake of checking feasibility, I found this site where you can search by keywords in users' profiles: https://moz.com/followerwonk/bio/ (I'm guessing they use Twitter's official API).
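If you would rather test the query from R instead of curl, a rough sketch using the httr package could look like this (the consumer key/secret and the search term are placeholders for your own credentials and keyword):

library(httr)

# Placeholder credentials from your registered Twitter app
app   <- oauth_app("twitter", key = "CONSUMER_KEY", secret = "CONSUMER_SECRET")
token <- oauth1.0_token(oauth_endpoints("twitter"), app)

# Query the users/search endpoint; 'q' also matches profile descriptions
resp  <- GET("https://api.twitter.com/1.1/users/search.json",
             query = list(q = "data journalism", count = 20),
             config(token = token))

users <- content(resp)                        # parsed JSON as a list
descs <- sapply(users, `[[`, "description")   # the users' profile descriptions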
As for the R subquestion, I'm afraid I only know the ones you mentioned :-S
I'm using R for data analysis, and I'm sharing some data with collaborators via Google Docs. Is there a simple interface I can use to move an R data.frame object to and from a Google Docs spreadsheet? If not, is there a similar API in other languages?
There are two packages:
RGoogleDocs on Omegahat: the package allows you to get a list of the documents and details about each of them, download the contents of a document, remove a document, and upload a document, even binary files.
RGoogleData on RForge: provides R access to Google services through the Google supported Java API. Currently the R interface only supports Google Docs and Spreadsheets.
As of 2015, there is now the googlesheets package. It is the best option out there for analyzing and editing Google Sheets data in R. Not only can it pull data from Google Sheets, but you can edit the data in Google Sheets, create new sheets, etc.
The GitHub link above has a readme with usage details; there's also a vignette for getting started, or you can find the official documentation on CRAN.
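For example, a minimal sketch, assuming you already have a sheet to work with (the sheet titles below are placeholders):

library(googlesheets)

gs_auth()                                  # authorize with your Google account
ss <- gs_title("my-collaboration-sheet")   # register a sheet by its (placeholder) title
df <- gs_read(ss)                          # read the sheet into a data frame
gs_new("my-copy", input = df)              # write a data frame out to a new sheet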
This may partially answer the question, or help others who want to begin by only downloading FROM public google spreadsheets: http://blog.revolutionanalytics.com/2009/09/how-to-use-a-google-spreadsheet-as-data-in-r.html#
I had a problem with certificates, and instead of figuring that out, I used the option ssl.verifypeer=FALSE. E.g.:
getURL("https://<googledocs URL for sharing CSV>", ssl.verifypeer = FALSE)
I put up a Github project to demonstrate how to use RGoogleDocs to read from a Google Spreadsheet. I have not yet been able to write to cells, but the read path works great.
Check out the README at https://github.com/hammer/google-spreadsheets-to-r-dataframe
I just wrote another package to download Google Docs spreadsheets. It's much simpler than the alternatives, since it just requires the URL (and that 'share by link' is enabled).
Try it:
install.packages('gsheet')
library(gsheet)
gsheet2tbl('docs.google.com/spreadsheets/d/1I9mJsS5QnXF2TNNntTy-HrcdHmIF9wJ8ONYvEJTXSNo')
More detail is here: https://github.com/maxconway/gsheet
Since R itself is relatively limited when it comes to execution flow control, I suggest using one of the APIs Google provides for high-level programming languages.
There you can pick whichever you are most familiar with.
I for one always use Python templates to give R a little more flexibility, so that would be a good combination.
For the task of exporting data from R to Google Docs, the first thing that comes to my mind would be to save it to CSV, then parse it and talk to Google Docs with one of those languages.
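On the R side, the export step could be as simple as the sketch below; the upload would then be handled by a script in whichever of those languages you pick (the data frame and file names are placeholders):

write.csv(my_data_frame, "shared_data.csv", row.names = FALSE)  # hand this file to the external upload script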