Obtaining Twitter API timestamp from string ID - R

I recently downloaded a small (230k) dataset of tweets using the streamR package in R. I saved the workspace, quit R, and today began trying to use the data, but the timestamp in ALL of the tweets (the created_at column of the data frame that streamR creates) shows the time when I restarted R and loaded the workspace... How can this be? Is the timestamp dynamic, or dependent on when the file was saved?
Given where I am now, is there any way to look up a specific string_id and get back its timestamp using streamR? I could write a loop and fix the issue that way, since this information is VERY time sensitive.
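Even without calling the API again, the creation time can be recovered from the ID itself: tweet IDs are Twitter "snowflake" IDs, whose upper bits encode the creation time in milliseconds since the Twitter epoch (1288834974657 ms after 1970-01-01 UTC). A minimal sketch in R, assuming the IDs are kept as strings (the bit64 package is needed because tweet IDs overflow R's doubles; the example ID below is made up):

library(bit64)

tweet_ts <- function(id_str) {
  id <- as.integer64(id_str)
  # drop the low 22 bits (worker/sequence), then add the Twitter epoch
  ms <- id %/% as.integer64(4194304) + as.integer64(1288834974657)  # 4194304 = 2^22
  as.POSIXct(as.numeric(ms) / 1000, origin = "1970-01-01", tz = "UTC")
}

tweet_ts("1100718959571492864")  # hypothetical ID; returns a POSIXct in UTC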

Related

How to extract data in table format using Python

I want to build a report from my raw CSV data file using Python, so I can run it every day instead of working through it manually.
I am new to coding, so I tried YouTube videos and copy-paste coding, but did not get the results I expected.

Naming sheet versions in Google Sheets using googlesheets4 or any other R package

Is there any way to save the current version of a sheet and name the version?
For example, I want to automate the updating of COVID data in a spreadsheet. The script runs once per day and updates a specific sheet. After updating the spreadsheet, I want to name the current version based on the day (just pasting the strings together).
I can't find a function or a parameter for this in the googlesheets4 or googledrive package documentation.
Where do I look? Is this possible by inserting a Google Apps Script command in the R script? (A possible workaround is sketched below.)
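Named versions appear to be a Sheets UI feature that the public Drive API does not expose, so one hedged workaround is to snapshot the file as a dated copy instead of naming a version. A minimal sketch using googledrive (drive_cp() and as_id() are real googledrive functions; the file ID and name pattern below are hypothetical):

library(googledrive)

snapshot_sheet <- function(sheet_id) {
  # copies the whole spreadsheet under a dated name, e.g. "covid-data-2021-05-04"
  drive_cp(as_id(sheet_id), name = paste0("covid-data-", Sys.Date()))
}

snapshot_sheet("1aBcD...")  # hypothetical spreadsheet ID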

Customizing URLs in R

I am a day trader based in India, and I use R to do my research. I want to download the end-of-day (EOD) stock prices for different stocks. I was using Quandl and quantmod but was not satisfied with them (they are OK for historical data but not for EOD quotes). After much research I found out that the EOD data for the NSE (National Stock Exchange of India) can be found in the so-called "bhav copy" that can be downloaded daily from its website. The URL for 30th April 2018 is:
https://www.nseindia.com/content/historical/EQUITIES/2018/APR/cm30APR2018bhav.csv.zip
I have two questions:
1) If I type this into the address bar of Google Chrome and execute it, a pop-up window asks where to store the CSV file. How do I automate this in R? If I just pass the URL as an argument to read.csv, will that suffice?
2) The bhav copy is updated daily, so I want to write an R function that automates the download each day. But the URL changes daily (the URL above is only for 30th April 2018). The function will take the current date as an argument. How can I create a one-to-one map from a date to the URL for that particular date? In other words, the URL for date dt is:
https://www.nseindia.com/content/historical/EQUITIES/2018/APR/cmdtAPR2018bhav.csv.zip
The R function f(dt) should create the URL for that particular date and download the CSV file.
Very many thanks for your time and effort.
download.file(url, destfile) should be what you need to download the data from the URL in R. Then you can use read.csv. Judging by the URL you provided, you may need to unzip() the download before processing it.
If you feel like it, you can use fread from the data.table package and pass the URL directly, but since it's a zip file, the first option is probably better for you.
As for building the URL from dates, the lubridate package is handy for parsing them.
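Putting that together, here is a minimal sketch of f(dt) built from the URL pattern in the question (untested against the live NSE site, which may require browser-like request headers; the month abbreviation assumes an English locale):

f <- function(dt = Sys.Date()) {
  mon <- toupper(format(dt, "%b"))  # e.g. "APR"; locale-dependent
  url <- sprintf(
    "https://www.nseindia.com/content/historical/EQUITIES/%s/%s/cm%s%s%sbhav.csv.zip",
    format(dt, "%Y"), mon, format(dt, "%d"), mon, format(dt, "%Y")
  )
  zipfile <- tempfile(fileext = ".zip")
  download.file(url, zipfile, mode = "wb")
  # the CSV inside the zip shares the zip's base name
  read.csv(unz(zipfile, sub("\\.zip$", "", basename(url))))
}

bhav <- f(as.Date("2018-04-30"))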
The nser package solves your problem.
To download and read today's bhavcopy, use bhavtoday:
library(nser)
bhavtoday
To download and read a historical bhavcopy of the Equity segment:
bhav("30042018")
And a bhavcopy of the F&O segment:
fobhav("30042018")
You can also use RSelenium to download the bhavcopy zip file with the function bhavs.
Package link: https://cloud.r-project.org/web/packages/nser/index.html

How to fetch data in R by executing a command in Excel?

I have to fetch data from Reuters, and they provide an Excel plugin for that. The problem is that my Excel crashes if I try to fetch too many variables at the same time. I was wondering whether I can do it from R via some Excel connection.
In general, I want to send a command to Excel (from R), have Excel fetch the data, and get the data back into R for analysis. The process has to be repeated a number of times. The formula in question is:
=TR("SCREEN(U(IN(Equity(active,public,primary))/*UNV:Public*/))",A1:K10,"PERIOD:FY2015 NULL:-- curn=USD RH=In CH=Fd",A6)
I get the variable names from the range A1:K10, and the output is then stored from cell A6 onwards.
The answer at https://stackoverflow.com/a/43222477/1389469 points to the RDCOMClient library, but I am not able to run the macro from R as described there.
Another guide at https://cran.r-project.org/web/packages/excel.link/excel.link.pdf covers how to read and write data in an open Excel workbook, but does not explain how to execute commands in Excel from R.
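For what it's worth, driving Excel over COM from R usually looks something like the sketch below (Windows only; the workbook path, sheet name, range, and wait time are hypothetical, and the Eikon add-in must be loaded in the Excel instance that RDCOMClient starts for the TR() formulas to recalculate):

library(RDCOMClient)

xl <- COMCreate("Excel.Application")
xl[["Visible"]] <- TRUE
wb <- xl$Workbooks()$Open("C:/data/reuters_screen.xlsx")  # hypothetical path
wb$RefreshAll()   # ask Excel to refresh external data and recalculate
Sys.sleep(30)     # crude wait for the add-in to return results
vals <- wb$Worksheets("Sheet1")$Range("A6:K100")[["Value"]]  # read results back into R
wb$Close(FALSE)
xl$Quit()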

Time series data homogenization using climatol package in R - .dah file missing

I am trying to homogenize rainfall time series data for 12 stations in R (RStudio) using the homogen tool in the climatol package. I used the monthly total series computed with the dd2m tool. The homogen command runs well and generates results, including the .rda and .pdf files, but I can't see the .dah (homogenized data with missing data filled) and .esh files being created in the working folder as expected.
Any help on what might have happened, and how I can get this output, would be appreciated.
Cheers
I just figured out that the would-be content of the .dah file can be exported by loading the .rda output into R and then writing it out to a text file, i.e.
load('rTest_1950-2000.rda')
write.csv(dah, "C:/Test/Test-dah.csv")
