I am looking for a way to get the company description, key statistics, and chairman name from Yahoo Finance (or another financial website) using R, for example with the quantmod package.
There is plenty of information on how to get current and historical prices, but that is not what I want.
best,
This R package does not support queries for Asian bourses. The problem appears to be with the underlying Yahoo APIs.
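For the key-statistics part, quantmod can request specific fields from Yahoo's quote endpoint via getQuote() and yahooQF(). A minimal sketch, assuming the Yahoo backend still answers for your tickers; the exact field names vary by quantmod version, and this does not return the company description or chairman:
library(quantmod)
# Pick a few key-statistics fields; calling yahooQF() with no arguments
# opens an interactive menu listing the available field names.
fields <- yahooQF(c("Name", "Market Capitalization", "P/E Ratio"))
# One row per ticker, with the requested fields as columns
getQuote(c("AAPL", "MSFT"), what = fields)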
You can get that using Intrinio's API. Their data tag directory lets you look up the tags you need; in your case, "long_description" and "ceo" will return the data you want:
#Install httr, which you need to request data via API
install.packages("httr")
require("httr")
#Create variables for your username and password, get those at intrinio.com/login
username <- "Your_API_Username"
password <- "Your_API_Password"
#Build the API calls for the company description and CEO. This puts together the different parts of each call
base <- "https://api.intrinio.com/"
endpoint <- "data_point"
stock <- "T"
item1 <- "long_description"
item2 <- "ceo"
#Pasting them together to make the API call
call1 <- paste(base,endpoint,"?","ticker","=", stock, "&","item","=",item1, sep="")
call2 <- paste(base,endpoint,"?","ticker","=", stock, "&","item","=",item2, sep="")
#Now we use the API call to request the data from Intrinio's database
ATT_description <- GET(call1, authenticate(username,password, type = "basic"))
ATT_CEO <- GET(call2, authenticate(username,password, type = "basic"))
#That gives us the description and CEO, but not in a convenient format, so we parse the responses
test1 <- unlist(content(ATT_description,"parsed"))
test2 <- unlist(content(ATT_CEO,"parsed"))
#Then make your data frame:
df1 <- data.frame(test1)
df2 <- data.frame(test2)
#From here you can rbind or cbind, and create loops to get the same data for many tickers
You can get your API keys and the full API documentation on Intrinio's site.
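Following the last comment in the code, a minimal sketch of such a loop over several tickers, reusing base, endpoint, item2, username and password from above (the ticker list is just an example):
#Sketch: fetch the CEO for several tickers and stack the results
tickers <- c("T", "VZ", "IBM")
ceo_list <- lapply(tickers, function(tk) {
  call <- paste0(base, endpoint, "?ticker=", tk, "&item=", item2)
  resp <- GET(call, authenticate(username, password, type = "basic"))
  #t() turns the parsed named vector into a one-row matrix so the fields become columns
  data.frame(t(unlist(content(resp, "parsed"))))
})
ceo_df <- do.call(rbind, ceo_list)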
I am unable to run even the example code given in the botrnot documentation, and I'm unsure what's happening.
# libraries
library(rtweet)
library(tweetbotornot)
# authentication for twitter API
auth <- rtweet_app()
auth_setup_default()
users <- c("kearneymw", "geoffjentry", "p_barbera",
"tidyversetweets", "rstatsbot1234", "RStatsStExBot")
## get most recent 10 tweets from each user
tmls <- get_timeline(users, n = 10)
## pass the returned data to botornot()
data <- botornot(tmls)
I expect a data frame called data to be created, with an additional column giving the probability that each user is a bot. Instead I get this error:
Error in botornot.data.frame(tmls) : "user_id" %in% names(x) is not TRUE
The table at the bottom of the documentation is what I'm hoping to achieve.
https://www.rdocumentation.org/packages/botrnot/versions/0.0.2
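Not a full fix, but a quick way to see what botornot() is objecting to: it checks for a user_id column, which newer rtweet releases no longer include in get_timeline() output. A minimal check, with an alternative call based on the package README (an assumption, not tested here):
# botornot() expects the older rtweet column layout, including "user_id"
"user_id" %in% names(tmls)   # FALSE with current rtweet output
names(tmls)                  # inspect what get_timeline() actually returned
# Alternative sketch: pass screen names and let the package fetch the
# timelines itself (per the package README; requires working authentication)
# data <- botornot(users)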
I'm using R to download data from an API that uses a key. I've downloaded the data for AK into a data frame called officials, and I would like to download the data for the remaining states, using rbind to append each state to officials. But the format of the API call requires the state abbreviation without quotation marks, that is, stateId=AK, not stateId="AK". Is there a way to do this? I tried the code below and then realized my error in the GET command specifying stateId: my code inserts "AL", not AL.
states <- c("AL","AR","AZ","CA","CO","CT")
for (i in 1:length(states)) {
  temp_raw <- GET("http://api.votesmart.org/Officials.getByOfficeTypeState?key=xxx&officeTypeId=L&stateId=states[i]&o=JSON")
  my_content <- content(temp_raw, as = 'text')
  my_content2 <- fromJSON(my_content)
  temp_officials <- my_content2$candidate$candidate
  officials2022 <- rbind(officials2022, temp_officials)
}
Try this variation, which uses paste0 to combine the strings into the URL.
Also notice the simplified way to loop over states, where i takes each state abbreviation directly.
states <- c("AL","AR","AZ","CA","CO","CT")
for (i in states) {
  temp_raw <- GET(paste0("http://api.votesmart.org/Officials.getByOfficeTypeState?key=xxx&officeTypeId=L&stateId=", i, "&o=JSON"))
  ...
}
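Filling in the rest of the loop with the parsing steps from the question, a complete sketch (the key=xxx placeholder and the JSON structure are taken from the original post):
library(httr)
library(jsonlite)
states <- c("AL","AR","AZ","CA","CO","CT")
officials2022 <- data.frame()   # start empty and grow it state by state
for (i in states) {
  url <- paste0("http://api.votesmart.org/Officials.getByOfficeTypeState",
                "?key=xxx&officeTypeId=L&stateId=", i, "&o=JSON")
  temp_raw <- GET(url)
  my_content <- content(temp_raw, as = "text")
  my_content2 <- fromJSON(my_content)
  temp_officials <- my_content2$candidate$candidate
  officials2022 <- rbind(officials2022, temp_officials)
}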
I have set up an API access key with a data provider of stock market data. With this key I am able to extract stock market data based on ticker code (e.g. APPL: Apple, FB: Facebook, etc.).
I am able to extract stock data on an individual ticker basis using R, but I want to write a piece of code that extracts data for multiple stock tickers and puts it all in one data frame (the structure is the same for all stocks). I'm not sure how to create a loop that updates the data frame each time stock data is extracted. I get a message saying 'No encoding supplied: defaulting to UTF-8', which does not tell me much. A point in the right direction would be helpful.
I have the following code:
if (!require("httr")) {
install.packages("httr")
library(httr)
}
if (!require("jsonlite")) {
install.packages("jsonlite")
library(jsonlite)
}
stocks <- c("FB","APPL") #Example stocks; actual stocks removed
len <- length(stocks)
url <- "URL" #Actual url removed
access_key <- "MY ACCESS KEY" #Actual access key removed
extraction <- lapply(stocks[1:len], function(i) {
  call1 <- paste(url, "?access_key=", access_key, "&", "symbols", "=", stocks[i], sep = "")
  get_prices <- GET(call1)
  get_prices_text <- content(get_prices, "text")
  get_prices_json <- fromJSON(get_prices_text, flatten = TRUE)
  get_prices_df <- as.data.frame(get_prices_json)
  return(get_prices_df)
})
file <- do.call(rbind,extraction)
I realised that this is not the most efficient way of doing this. A better way is to update the URL to include multiple stocks rather than using an lapply function. I am therefore closing the question.
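For anyone landing here, a sketch of that single-call approach; it assumes the API accepts a comma-separated symbols parameter, and the URL and key are placeholders as in the question. Note also that in the lapply version, i was already the ticker string, so stocks[i] indexed by name and returned NA; using i directly would fix the per-ticker version.
library(httr)
library(jsonlite)
stocks <- c("FB", "APPL")        # example tickers, as in the question
url <- "URL"                     # actual endpoint removed
access_key <- "MY ACCESS KEY"    # actual access key removed
# Assumption: the API accepts several tickers in one request,
# e.g. ...&symbols=FB,APPL
call1 <- paste0(url, "?access_key=", access_key,
                "&symbols=", paste(stocks, collapse = ","))
resp <- GET(call1)
# Supplying the encoding explicitly silences the
# "No encoding supplied: defaulting to UTF-8" message
prices_json <- fromJSON(content(resp, as = "text", encoding = "UTF-8"),
                        flatten = TRUE)
prices_df <- as.data.frame(prices_json)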
I'm sure everyone is aware of the limitations of Twitter's API and the inability to get the number of replies a given tweet has.
I'm after this information, so I am trying to loop over all the followers of a given account, look back at all the times they've tweeted, and check whether those tweets were directed at the original account.
I believe this will get around the limitation of the search function, which only returns the last week's worth of tweets.
The code below doesn't work; any help would be very useful! I know Python can do this, but I'd like an R solution if possible.
library(rtweet)
library(plyr)
library(dplyr)
##set name of tweeter to look at (this can be changed)
targettwittername <- "realDonaldTrump"
##get this tweeter's timeline
tmls <- get_timeline(targettwittername, n=3200, retryonratelimit=TRUE)
##get their user id
targettwitteruserid <- as.numeric(select(lookup_users(targettwittername), user_id))
##get ids of their tweets
tweetids <- select(tmls, status_id)
tweetids <- transform(tweetids, status_id_num=as.numeric(status_id))
##get list of followers (who are most likely to reply)
targetfollowers <- get_followers(targettwittername, retryonratelimit=TRUE)
master_reply_counts <- data.frame(reply_to_status_id_num = integer(), n = integer())

getfollowersreplies <- function(follower){
  follower <- as.numeric(filter(targetfollowers, user_id == follower))
  followertl <- get_timeline(user = follower, n = 3200, retryonratelimit = TRUE)
  followertl <- filter(followertl, reply_to_user_id == targettwitteruserid)
  followertl <- transform(followertl, reply_to_status_id_num = as.numeric(reply_to_status_id))
  join <- inner_join(followertl, tweetids, by = c("reply_to_status_id_num" = "status_id_num"))
  replycounts <- join %>%
    group_by(reply_to_status_id_num) %>%
    summarise(n = n())
  master_reply_counts <- rbind(master_reply_counts, replycounts)
  return(follower)
}
data <- ldply(targetfollowers$user_id, getfollowersreplies)
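For what it's worth, one likely culprit is that rbind-ing to master_reply_counts inside the function only changes a local copy, and the function returns follower rather than the counts. A sketch of a version that returns the counts instead, so ldply() can stack them (column names assume the pre-1.0 rtweet layout used above):
getfollowersreplies <- function(follower_id) {
  followertl <- get_timeline(user = follower_id, n = 3200,
                             retryonratelimit = TRUE)
  followertl <- filter(followertl, reply_to_user_id == targettwitteruserid)
  followertl <- transform(followertl,
                          reply_to_status_id_num = as.numeric(reply_to_status_id))
  join <- inner_join(followertl, tweetids,
                     by = c("reply_to_status_id_num" = "status_id_num"))
  # return the per-tweet reply counts; ldply() stacks them afterwards
  join %>%
    group_by(reply_to_status_id_num) %>%
    summarise(n = n())
}
reply_counts <- ldply(targetfollowers$user_id, getfollowersreplies)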
Is there some place where I can programmatically download the ROIC and other data typically reported in a company's quarterly report?
I know that I can access the daily price data of a stock from http://chart.yahoo.com/table.csv, but I can't find anything about the financial performance.
Thanks!
Intrinio provides that data in R using the httr package. You can follow their API instructions; I'll adapt them here to get ROIC:
#Install httr, which you need to request data via API
install.packages("httr")
require("httr")
#Create variables for your username and password, get those at intrinio.com/login
username <- "Your_API_Username"
password <- "Your_API_Password"
#Making an api call for roic. This puts together the different parts of the API call
base <- "https://api.intrinio.com/"
endpoint <- "data_point"
stock <- "T"
item1 <- "roic"
call1 <- paste(base,endpoint,"?","ticker","=", stock, "&","item","=",item1, sep="")
#Now we use the API call to request the data from Intrinio's database
ATT_roic <- GET(call1, authenticate(username,password, type = "basic"))
#That gives us the ROIC value, but it isn't in a good format so we parse it
test1 <- unlist(content(ATT_roic,"parsed"))
df <- data.frame(test1)
You can modify that code for any US ticker, and you can swap roic out for hundreds of other financial metrics. If you want to pull historical ROIC, or ROIC for a specific date range, see the API instructions mentioned at the start of this answer.
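As a sketch of swapping tags, the item can be varied in a small loop; the tag names other than roic below are only illustrative, so check Intrinio's data tag directory for the real ones. It reuses base, endpoint, stock, username and password from above:
#Tag names other than "roic" are illustrative placeholders
items <- c("roic", "pricetoearnings", "dividendyield")
metrics <- lapply(items, function(item) {
  call <- paste0(base, endpoint, "?ticker=", stock, "&item=", item)
  resp <- GET(call, authenticate(username, password, type = "basic"))
  #t() turns the parsed named vector into a one-row data frame
  data.frame(t(unlist(content(resp, "parsed"))))
})
df_all <- do.call(rbind, metrics)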