Accessing Reuters data from R/VBA

I use the MarketQA database and I would like to be able to access the data directly from R or VBA, or even another language if needed, to pursue my studies. I couldn't find any API; has anyone here done this before?
Thanks

The 'tm' package has some worked examples. You may need to be more specific about what sort of data you are targeting.


Get portfolio returns and attributions from Bloomberg

I have a mostly general question.
I have a Bloomberg Terminal, and I have different portfolios I created in it, so I know their names.
I want to get from Bloomberg PORT information like:
Total return for different periods.
Attributions for different periods.
I'm trying to use the 'Rblpapi' package for this, but I can't work out how to get the information I need through this package.
So, is it possible to get this information through 'Rblpapi' or not? If not, what package should I use for this?
Thank you.

Download Bloomberg Terminal's information using R

I've been looking for some clear examples of this approach. I know it requires an API in some cases. I've found the libraries Rblpapi and RblDataLicense, but I haven't been able to find a clear example to build on.
I need to download data from the DDIS function in the Bloomberg Terminal for a credit risk model I'm currently developing.
I'd appreciate it a lot if anyone could help me out.
There are examples in the vignette and manual, which you can find at https://CRAN.R-project.org/package=Rblpapi.
I have never tried to do this, but I don't think you can just download the DDIS data as is. I suspect you'd have to recreate it by finding all the bonds for the company(ies) you're interested in and then downloading the info you want for each one. Looks as though you'd need to explore the bsrch() function.
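The approach described above could look something like the following. This is a sketch only: it assumes a running Bloomberg Terminal session on the local machine, and the saved-search name, the result column, and the field mnemonics are all illustrative placeholders, not verified DDIS equivalents.

```r
library(Rblpapi)
con <- blpConnect()  # connect to the local Terminal (localhost:8194)

# Step 1: find the issuer's bonds with a fixed-income search.
# "FI:mySavedSearch" is a hypothetical saved SRCH query name.
bonds <- bsrch("FI:mySavedSearch")

# Step 2: pull the fields you need for each security found.
# Field mnemonics and the "id" column name are assumptions for illustration.
fields <- c("MATURITY", "CPN", "AMT_OUTSTANDING", "CRNCY")
info <- bdp(as.character(bonds$id), fields)
```

You would then aggregate `info` yourself to approximate the DDIS maturity-distribution view.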

Effective dating PL/SQL packages

We make extensive use of PL/SQL packages for reporting purposes, and we need to change these report-generating packages at the beginning of each year. I am looking for a way to deliver the 2014 changes before they are needed, for acceptance testing (and to keep things flowing rather than delivering several changes all at once).
We would like to have both the 2013 and 2014 packages installed on the database at the same time and use effective dating to determine which one is called, if possible. Is this possible? Is there another way to approach this? For various reasons it would be difficult to use a solution that requires storing these packages under different names or APIs.
Maybe you can work around name restrictions with synonyms.
CREATE PACKAGE report_2013 AS...
CREATE PACKAGE report_2014 AS...
then use just
DROP SYNONYM report_package;
CREATE SYNONYM report_package FOR report_2013;
and
DROP SYNONYM report_package;
CREATE SYNONYM report_package FOR report_2014;
to switch between them.
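If you want true effective dating rather than a manual switch, a thin dispatcher package could pick the version at run time based on SYSDATE. A sketch, assuming both yearly packages expose the same procedure (all names here are illustrative):

```sql
-- Dispatcher: callers always invoke report_package; it routes to the
-- package that is effective for the current date.
CREATE OR REPLACE PACKAGE report_package AS
  PROCEDURE run_report;
END report_package;
/
CREATE OR REPLACE PACKAGE BODY report_package AS
  PROCEDURE run_report IS
  BEGIN
    IF SYSDATE >= DATE '2014-01-01' THEN
      report_2014.run_report;   -- new rules become effective automatically
    ELSE
      report_2013.run_report;
    END IF;
  END run_report;
END report_package;
/
```

This keeps a single public API while letting you install the 2014 package well ahead of its effective date.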

Is there a good R API for accessing Google Docs?

I'm using R for data analysis, and I'm sharing some data with collaborators via Google Docs. Is there a simple interface that I can use to move an R data.frame object to and from a Google Docs spreadsheet? If not, is there a similar API in other languages?
There are two packages:
RGoogleDocs on Omegahat: the package allows you to get a list of the documents and details about each of them, download the contents of a document, remove a document, and upload a document, even binary files.
RGoogleData on RForge: provides R access to Google services through the Google supported Java API. Currently the R interface only supports Google Docs and Spreadsheets.
As of 2015, there is now the googlesheets package. It is the best option out there for analyzing and editing Google Sheets data in R. Not only can it pull data from Google Sheets, but you can edit the data in Google Sheets, create new sheets, etc.
The GitHub link above has a readme with usage details; there's also a vignette for getting started, or you can find the official documentation on CRAN.
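A minimal googlesheets workflow might look like this. The sheet title is a placeholder, and the example assumes you have a Google account with an existing sheet of that name:

```r
library(googlesheets)
gs_auth()                                    # opens a browser for OAuth
sheet <- gs_title("my-collaboration-sheet")  # register a sheet by its title
df <- gs_read(sheet)                         # read a worksheet into a data frame
gs_edit_cells(sheet, input = head(iris), anchor = "A1")  # write data back
```

The round trip (read into a data frame, edit, write back) is what sets googlesheets apart from the download-only options mentioned elsewhere in this thread.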
This may partially answer the question, or help others who want to begin by only downloading from public Google spreadsheets: http://blog.revolutionanalytics.com/2009/09/how-to-use-a-google-spreadsheet-as-data-in-r.html
I had a problem with certificates, and instead of figuring that out, I used the option ssl.verifypeer=FALSE. E.g.:
getURL("https://<googledocs URL for sharing CSV>", ssl.verifypeer=FALSE)
I put up a Github project to demonstrate how to use RGoogleDocs to read from a Google Spreadsheet. I have not yet been able to write to cells, but the read path works great.
Check out the README at https://github.com/hammer/google-spreadsheets-to-r-dataframe
I just wrote another package to download Google Docs spreadsheets. It's much simpler than the alternatives, since it requires only the URL (and that 'share by link' is enabled).
Try it:
install.packages('gsheet')
library(gsheet)
gsheet2tbl('docs.google.com/spreadsheets/d/1I9mJsS5QnXF2TNNntTy-HrcdHmIF9wJ8ONYvEJTXSNo')
More detail is here: https://github.com/maxconway/gsheet
Since R itself is relatively limited when it comes to execution flow control, I suggest using an API to a high-level programming language provided by Google: link text.
There you can pick whichever you are most familiar with.
I for one always use Python templates to give R a little more flexibility, so that would be a good combination.
For the task of exporting data from R to Google Docs, the first thing that comes to my mind would be to save it to CSV, then parse it and talk to Google Docs with one of the given languages.
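The export step above is straightforward in base R; the file name is arbitrary, and the Google-side upload would then be handled in whichever language you picked:

```r
# Write a data frame to CSV, ready to upload to Google Docs.
write.csv(mtcars, file = "for_gdocs.csv", row.names = FALSE)
```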

What are some good online sources for data sets?

Some time ago I came across a site online whose sole purpose was the collection of various data sets: location data, district census data, or whatever sets community members were interested in maintaining.
My question is, do you know the site that I'm thinking of, or can you suggest any other sites that perform a similar service?
I'll suggest GeoNames, a great source for zip/postal codes, lat/long and lots of other geographical information.
Take a look at these websites:
IBM Many Eyes
Swivel
Data 360
Here are some others out of my bookmarks:
DatabaseAnswers.org
Discogs
Freebase
Another that was just shown to me is sig.ma, a search tool for retrieving ontologies. If you're building web 2.0 services, this would be a great tool for bootstrapping.
I use the Common Data Hub a lot:
http://www.commondatahub.com/home
especially for tables of standardized data.
buzzdata: http://buzzdata.com/ seems like a pretty good way to publish and share large data sets to me.
What about sports? The baseball archive has stats about everything baseball from 1871 onwards:
http://baseball1.com/statistics - a great source of data if one wants to train some statistics.
The most comprehensive dataset for music is MusicBrainz http://musicbrainz.org/
Search
Edit: I don't mean search generally; I mean to say that Amazon is hosting a few selected public domain data sets for all to use. You can find out what data sets they have available by searching.
