I'm currently learning R and I'm just trying to pull in some price data using getSymbols in the quantmod package.
I have a dataframe that has tickers and dates of annual results releases for a sector on the Australian Stock Exchange. What I'd like to do is merge in an adjusted price for the day of the release as well as the price 5 days before and 5 days after.
Ticker  Ann_Rep_Date  Ad.Price  +5d.Ad.Price  -5d.Ad.Price
AGI.AX  14/10/16
ALL.AX  22/12/16
CWN.AX  19/09/16
TAH.AX  4/08/16
TAH.AX  4/08/17
TTS.AX  17/08/17
Note that there can be multiple prices required for a single ticker as there are multiple annual results releases.
I managed to do this using tq_get() from tidyquant. It stores all my prices in one big happy data frame, which makes it easy to merge with the table above.
library(tidyquant)
prices <- c('TBH.AX', 'CWN.AX', 'SGR.AX', 'AQS.AX', 'ALL.AX',
            'TAH.AX', 'JIN.AX', 'TTS.AX', 'AGI.AX') %>%
  tq_get(get = 'stock.prices')

Ann_Reps <- merge(prices, Ann_Reps,
                  by.x = c('symbol', 'date'),
                  by.y = c('ticker', 'Ann_Rep_Date'))
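For the +5-day and -5-day prices, one option is to add lead/lag columns to prices so the merge carries all three prices at once. This is only a rough sketch, not tested against your data: the column names adj_plus_5d / adj_minus_5d are made up, and "5 days" is read as 5 trading rows rather than calendar days.

library(dplyr)

prices <- prices %>%
  group_by(symbol) %>%
  arrange(date, .by_group = TRUE) %>%
  mutate(adj_plus_5d  = lead(adjusted, 5),    # adjusted price 5 trading days after each date
         adj_minus_5d = lag(adjusted, 5)) %>% # adjusted price 5 trading days before each date
  ungroup()

Run this before the merge above and the extra columns come along with the day-of adjusted price.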
I'm very new to R, so I'm sorry if some of the terminology I use is incorrect. I have a large .csv file of daily visits, with the date (in D/M/Y format) and the number of visits that day in separate columns. The dates run from 05/01/20 to 06/11/20. I've plotted a daily time series of this data, and now I'm trying to plot a weekly time series, with the daily visits summed into a weekly total, with weeks starting Monday and ending Sunday. I have looked through other similar questions on this site and came across this code:
Week <- as.Date(cut(DF$Date, "week"))
aggregate(Frequency ~ Week, DF, sum)
However, I can't seem to get it to work. I would prefer to keep it as simple as possible. I also have the forecast package and zoo package installed if that helps.
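The usual cause is that the date column is still a character vector, so cut() has nothing date-like to bin. Here is a minimal sketch, assuming the columns are called Date and Visits (swap in your actual column names, and use %Y instead of %y if your year has four digits):

# parse the D/M/Y strings into Date objects first
DF$Date <- as.Date(DF$Date, format = "%d/%m/%y")

# bin each date into its week; cut(..., "week") starts weeks on Monday
DF$Week <- as.Date(cut(DF$Date, breaks = "week"))

# sum the daily visits within each week
weekly <- aggregate(Visits ~ Week, data = DF, FUN = sum)

plot(weekly$Week, weekly$Visits, type = "l",
     xlab = "Week starting", ylab = "Weekly visits")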
I am having trouble with the R assignment I am working on this semester.
Here is the part of the task that I am confused about:
iv. Download the 3-month T-Bill rate from FRED for the same sample period, 01/01/1993 to 12/31/2013.
Useful Hints: You may have to chop the data to match the sample period.
v. Construct a matrix of return series combining Stock, S&P500, and TBill for the sample period.
Useful Hints:
Note that the row names for the T-Bill may not match those of the other two return series, as the dates do not match, although the month and year do.
You have to construct the row names for each of the series in Year-Month format (e.g. 1993-01), or delete the row names from the T-Bill, before you can combine all three series into one Return matrix.
You have to convert the Return matrix to a dataframe before you use the lm() function.
I tried the call below, the same way I have used getSymbols before for SPY and AAPL, but it pulls the entire data set rather than the specific date range. How can I chop the data so it fits the desired date range?
getSymbols('TB3MS', src = 'FRED', from = "1993-01-01", to = "2013-12-31")
Next, how would I go about constructing the matrix of return series combining all of the stocks? Can anyone point me in the right direction?
Filtering an xts object: see examples in the xts documentation ?xts.
# filter 1993 until 2013
TB3MS["1993/2013"]
But these dates are off: the T-Bill observations are dated the first day of the month, while the stock observations are dated the last day of the month. With coredata() you can extract the T-Bill values and stick them into the other time series, as long as the rows match up.
Taking the data example from your previous question, you could do something like this (and I'm creating more steps than needed, you could combine a few statements into one):
# create monthly returns of the spy data and give the column a better name than monthly.returns
spy_returns <- monthlyReturn(SPY)
colnames(spy_returns) <- "SPY_returns"
# filter the tbill data
TB3MS_1993_2013 <- TB3MS["1993/2013"]
# add tbill data to spy data
spy_returns$TB3MS <- coredata(TB3MS_1993_2013)
Merging xts objects can just be done with merge. They will be merged on the dates.
merge(spy_returns, aapl_returns) would combine these two. If you have a lot of tickers, use Reduce (check the help and SO on how to use Reduce with merge), but it would be better to use the tidyquant package if that is allowed.
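For example, a quick sketch of the Reduce approach; aapl_returns and xom_returns here are hypothetical monthly-return series built the same way as spy_returns above:

# each element is an xts object of monthly returns for one ticker
return_list <- list(spy_returns, aapl_returns, xom_returns)

# merge them pairwise on their shared date index
all_returns <- Reduce(merge, return_list)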
I am working on the stock markets of two different nations, i.e. China and the US. I used the quantmod library in R to import daily historical prices from Yahoo Finance. My sample runs from 01 Jan 2010 to 31 March 2015, but because these nations have holidays on different dates, their stock markets are closed on different days. Hence, I have a different number of rows of data for each and cannot apply the GARCH model to these values. For example, the Chinese market has 1267 rows (one column) and the US market has 1303 rows (one column).
Now my question is: how can I make a data frame with matching dates and delete/skip the values whose dates differ?
My code and the error in R are given below:
library("rugarch")
library("rmgarch")
library("quantmod")
startdate<-as.Date("2010-01-01")
enddate<-as.Date("2015-03-31")
getSymbols("^SSEC", from=startdate, to=enddate)
getSymbols("^GSPC", from=startdate, to=enddate)
rsse<-dailyReturn(SSEC$SSEC.Close) # *calculate returns*
rgspc<-dailyReturn(GSPC$GSPC.Close)# *calculate returns*
returns<-data.frame(rsse, rgspc) # *making data frame with both market returns*
The error:
Error in data.frame(rsse, rgspc) :
  arguments imply differing number of rows: 1267, 1303
You should do an inner join on the two data frames. Each data frame needs to have a date column and the price for that day. I don't know the structure of your data frames, but something like:
dplyr::inner_join(SSEC, GSPC, by='my.date.variable')
or if the two dataframes have different names for their date variables, for example SSEC_date and GSPC_date:
dplyr::inner_join(SSEC, GSPC, by=c('SSEC_date' = 'GSPC_date'))
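As a side note (an alternative, not required for the approach above): dailyReturn() actually returns xts objects, so you could also align the two series directly with the xts merge and an inner join, without converting to data frames first:

# keep only the dates present in both return series
returns <- merge(rsse, rgspc, join = "inner")
colnames(returns) <- c("SSEC", "GSPC")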
I have a data frame with the following column names.
"week" "demand" "product-id"
The problem is to convert it into a time series object.
week is a number (3, 4, 5, 6, 7, 8, 9, etc.), demand is in units, and product-id is unique.
I want to convert the week column into time series, so as to prepare for modeling.
I want to predict weeks 10 and 11 demand by using an ARIMA model. How do I do that?
myTS <- ts(mydataframe[-1], frequency = 52)
will convert your demand and product-id columns to a time series with 52 observations per year. For more elaborate time series, check the xts package. Also compare this post on weekly data with ts.
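To get the week 10 and 11 forecasts themselves, here is a minimal sketch using the forecast package (an assumption on my part; it models only the demand column, with start = 3 taken from your description that the weeks begin at 3):

library(forecast)

# weekly series of demand only, starting at week 3
demand_ts <- ts(mydataframe$demand, start = 3, frequency = 52)

fit <- auto.arima(demand_ts)  # pick an ARIMA specification automatically
forecast(fit, h = 2)          # forecast the next 2 weeks, i.e. weeks 10 and 11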
I am trying to calculate calendar-year GDP growth for the GDPC96 time series from FRED (i.e. for an xts object). I am looking for a simple function without loops which calculates the calendar-year growth, where the variables are the data object (here GDPC96), the frequency (here quarterly), and whether incomplete periods (such as 2013) shall be shown or not.
For example:
library(quantmod)
getSymbols("GDPC96",src="FRED")
a <- annualReturn(GDPC96,leading=FALSE)
tail(a)
I would like it to be such that the changes are per calendar year, i.e. it should calculate from 01.01.1947 to 01.01.1948 and so on. Then, for 2012, where data is only available through Oct, it should be omitted.
As far as I have seen, none of the functions in PerformanceAnalytics and the related packages can do this properly.
It seems you want something like a year-over-year return calculation. I'm not aware of a function that does this automatically, but it's easy to do with the ROC function in the TTR package.
library(quantmod)
getSymbols("GDPC96",src="FRED")
ROC(GDPC96, 4) # 4-period returns for quarterly data
getSymbols("SPY")
spy <- to.monthly(SPY)
ROC(spy, 12) # 12-period returns for monthly data
Update based on comments:
# split GDPC96 into one xts object per calendar year, then keep
# the first and last observation of each year
first.obs.by.year <- lapply(split(GDPC96, "years"), first)
last.obs.by.year  <- lapply(split(GDPC96, "years"), last)

# rebind into annual series and take one-period rates of change
ROC(do.call(rbind, first.obs.by.year))
ROC(do.call(rbind, last.obs.by.year))
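One caveat to add (not part of the original answer): ROC() defaults to continuous (log) returns in TTR, so if you want a simple percentage growth rate per calendar year, pass type = "discrete":

# simple (discrete) year-over-year growth rather than log growth
ROC(do.call(rbind, last.obs.by.year), type = "discrete")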