I am looking for an R package which can run "Spatial Vector Autoregression".
tandfonline.com/doi/full/10.1080/17421770701346689
According to Chen and Conley (2001), this is a "vector autoregression (VAR) whose coefficient matrix and shock covariance matrix are functions of economic distances between agents. The impact of other agents’ variables on the conditional mean of a given agent’s variable is a function of their economic distances from this agent. Similarly, covariances of VAR shocks are functions of distances between agents in the previous period, a property we refer to as being isotropic."
(Chen, X. & Conley, T.G. (2001) A new semiparametric spatial model for panel time series, Journal of Econometrics, 105, 59–83)
Surprisingly, however, the closest thing I could find was "Spatial Autoregression", which is still not what I need for my purpose. May I get help finding a package for this, please? Otherwise, is there an established way to fit this Spatial Vector Autoregression model in R?
I think I've found what you're looking for: devtools::install_github("James-Thorson/VAST"). VAST stands for "Vector-Autoregressive Spatio-Temporal." The package is a wrapper around an underlying spatial-modelling package, essentially extending it.
You can see coding examples here. If you want to look at help, use ?VAST::VAST and select one of the three hyperlinks at the bottom of the short description and details (make_settings, fit_model, and plot_results).
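As a very rough sketch of the workflow those three functions imply (the data frame dat and its column names below are placeholders, and the exact argument names and allowed values differ between VAST versions, so check the help pages and the package's examples before relying on this):

    library(VAST)

    # 'dat' is a placeholder data frame with one row per sampling event,
    # containing Lat, Lon, Year and Catch_KG columns (names are illustrative)
    settings <- make_settings(n_x = 100,           # number of spatial knots
                              Region = "other",    # or a predefined region name
                              purpose = "index2")  # older versions use "index"

    fit <- fit_model(settings = settings,
                     Lat_i = dat$Lat,
                     Lon_i = dat$Lon,
                     t_i   = dat$Year,
                     b_i   = dat$Catch_KG,
                     a_i   = rep(1, nrow(dat)))    # area offset; 1 as a placeholder

    plot_results(fit = fit)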
Please note:
When I installed this package to check out what it included, it came back with a conflict: the package TMB required an earlier version of the Matrix package. I had not had TMB installed before installing this package. I had no issues installing TMB independently (without any conflict over the Matrix version). However, when I then loaded VAST it still gave me that error. When I loaded TMB first and then VAST, I didn't receive the warning and both libraries loaded.
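In other words, the load order that worked for me was simply:

    library(TMB)    # loading TMB first avoided the Matrix-version warning for me
    library(VAST)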
I'm trying to replicate this paper, but using a different time period:
https://www.dropbox.com/s/edwdpgwsbli93f1/SM35%282%29-09-modelling.pdf?dl=0.
This paper is about detecting regime shifts in the Malaysian currency, i.e. the ringgit. From what I understand, it uses the Markov Switching Autoregressive (MS-AR) method. I've been trying to replicate this method in R, but with no success. There has been a similar question about it recently, which can be found here:
Error when using msmFit in R
Basically I'm having the same problem: when I tried to fit the MS-AR model, that error came up. I'm not sure exactly what msmFit calculates, but in some examples online it is used to fit MS-AR models. So my question is: is it actually possible to fit an MS-AR(p) model in R? Is there any other software besides R or EViews 8 (which I don't have at the moment) that can do this?
Thank you. Really appreciate your insight.
link msmFit: http://cran.r-project.org/web/packages/msm/msm.pdf
There is a MATLAB package called MS_Regress; it should do the job:
https://sites.google.com/site/marceloperlin/matlab-code/ms_regress---a-package-for-markov-regime-switching-models-in-matlab
I was trying to fit the MS-AR model in R, but I get the same error message. Could you provide a link to the examples you found of getting the fit in R?
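For reference, msmFit actually lives in the MSwM package (not msm), and a typical MS-AR call looks roughly like the sketch below; the simulated series and the choice of two regimes with one AR lag are just placeholders:

    library(MSwM)

    # placeholder series standing in for the exchange-rate returns in the paper
    set.seed(1)
    y  <- as.numeric(arima.sim(model = list(ar = 0.4), n = 200))
    df <- data.frame(y = y)

    # fit a plain mean model first, then let msmFit add the Markov switching:
    # k = 2 regimes, p = 1 autoregressive lag, and sw flags which parameters
    # switch across regimes (intercept, AR(1) coefficient, residual variance)
    base_mod <- lm(y ~ 1, data = df)
    ms_mod   <- msmFit(base_mod, k = 2, p = 1, sw = c(TRUE, TRUE, TRUE))
    summary(ms_mod)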
I found an R package, 'bivpois', which fits a model for two related Poisson processes (for example, the number of goals scored by the home and the away team in a soccer game). However, this package no longer seems to be usable in newer versions of R.
Is there a reasonable way to modify the glm() function to do something similar, or to run this older package on my new version of R? I have found very little literature on these sorts of models, and very little in the way of easy implementations in other statistical packages like Stata.
Any suggestions would be much appreciated.
While CRAN does not host a current binary of bivpois, you can build the package from the archived source code (see http://cran.r-project.org/doc/manuals/R-exts.html#Checking-and-building-packages ). Building bivpois 0.50-3.1 from source (available at http://cran.r-project.org/src/contrib/Archive/bivpois/) works for me on R 2.15.0 Windows x64. The zipped Windows binary I built is available here: http://commondatastorage.googleapis.com/jthetzel-public/bivpois_0.50-3.1.zip .
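If you prefer to build it yourself rather than use that binary, something along these lines should work, assuming the archived tarball is named after the version mentioned above (and, if the package contains compiled code, that Rtools is installed on Windows):

    # download the archived source tarball and install it from the local file
    url <- "http://cran.r-project.org/src/contrib/Archive/bivpois/bivpois_0.50-3.1.tar.gz"
    download.file(url, destfile = "bivpois_0.50-3.1.tar.gz")
    install.packages("bivpois_0.50-3.1.tar.gz", repos = NULL, type = "source")

    library(bivpois)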
Feel free to refer to odds modelling and testing inefficiency of sports bookmakers, as I have modified the relevant functions inside the bivpois package.
Instead of starting to code in Matlab, I recently started learning R, mainly because it is open source. I am currently working in the data mining and machine learning field. I have found many machine learning algorithms implemented in R, and I am still exploring the different packages.
I have a quick question: how does R compare to Matlab for data mining applications, in terms of popularity, pros and cons, industry and academic acceptance, etc.? Which one would you choose, and why?
I have gone through various comparisons of Matlab vs. R on various metrics, but I am specifically interested in their applicability to data mining and ML.
Since both languages are pretty new to me, I was just wondering whether R would be a good choice or not.
I appreciate any kind of suggestions.
For the past three years or so, I have used R daily, and the largest portion of that daily use is spent on machine learning/data mining problems.
I was an exclusive Matlab user while at university; at the time I thought it was an excellent platform and set of tools. I am sure it is today as well.
The Neural Network Toolbox, the Optimization Toolbox, the Statistics Toolbox, and the Curve Fitting Toolbox are each highly desirable (if not essential) for someone using MATLAB for ML/data mining work, yet they are all separate from the base MATLAB environment--in other words, they have to be purchased separately.
My Top 5 list for Learning ML/Data Mining in R:
Mining Association Rules in R
This refers to a couple of things. First, a group of R packages whose names all begin with arules (available from CRAN); you can find the complete list (arules, arulesViz, etc.) on the project homepage. Second, all of these packages are based on a data-mining technique known as Market Basket Analysis, or alternatively Association Rules. In many respects, this family of algorithms is the essence of data mining--exhaustively traverse large transaction databases and find above-average associations or correlations among the fields (variables or features) in those databases. In practice, you connect them to a data source and let them run overnight. The central R package in the set mentioned above is called arules; on the CRAN package page for arules, you will find links to a couple of excellent secondary sources (vignettes, in R's lexicon) on the arules package and on the Association Rules technique in general.
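For a flavour of the API, a minimal arules sketch using the Groceries transaction data that ships with the package (the support and confidence thresholds are just illustrative):

    library(arules)
    data("Groceries")   # example transaction data shipped with arules

    # mine association rules above minimum support and confidence thresholds
    rules <- apriori(Groceries, parameter = list(supp = 0.01, conf = 0.5))

    # look at the five rules with the highest lift
    inspect(head(sort(rules, by = "lift"), 5))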
The standard reference, The Elements of Statistical Learning by Hastie et al.
The most current edition of this book is available in digital form for free. Likewise, all data sets used in ESL are available for free download at the book's website (linked to just above). (As an aside, I have the free digital version; I also purchased the hardback version from BN.com; all of the color plots in the digital version are reproduced in the hardbound version.) ESL contains thorough introductions to at least one exemplar from most of the major ML rubrics--e.g., neural networks, SVM, KNN; unsupervised techniques (LDA, PCA, MDS, SOM, clustering); numerous flavors of regression; CART; Bayesian techniques; as well as model aggregation techniques (boosting, bagging) and model tuning (regularization). Finally, get the R package that accompanies the book from CRAN, which will save you the trouble of having to download and enter the data sets.
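For what it's worth, the companion package was published on CRAN under the name ElemStatLearn (it may since have been archived, in which case you would install from the CRAN archive instead); a minimal sketch of loading one of the book's data sets:

    # the book's companion data package; install from the CRAN archive if it
    # is no longer on CRAN proper
    install.packages("ElemStatLearn")
    library(ElemStatLearn)

    data(prostate)   # prostate-cancer data used in the book's regression chapters
    str(prostate)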
CRAN Task View: Machine Learning
The 3,500+ packages available for R are divided up by domain into about 30 package families or 'Task Views'; Machine Learning is one of these families. The Machine Learning Task View contains about 50 or so packages. Some of these packages are practically standard equipment for ML work in R, including e1071 (a sprawling ML package that includes working code for quite a few of the usual ML categories).
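If you want to grab the whole Task View in one go, or try e1071 on a toy problem, a couple of minimal sketches (the ctv helper package and the built-in iris data are just convenient examples):

    # install everything in the Machine Learning task view in one shot
    # (this pulls in a large number of packages)
    install.packages("ctv")
    library(ctv)
    install.views("MachineLearning")

    # e1071 on its own: a support vector machine on the built-in iris data
    library(e1071)
    model <- svm(Species ~ ., data = iris)
    table(predicted = predict(model, iris), actual = iris$Species)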
Revolution Analytics Blog
With particular focus on the posts tagged with Predictive Analytics
An ML-in-R tutorial consisting of a slide deck and R code, by Josh Reich
A thorough study of the code would, by itself, be an excellent introduction to ML in R.
And one final resource that I think is excellent, but which didn't make it into the top 5:
A Guide to Getting Started in Machine Learning [in R]
posted at the blog A Beautiful WWW
Please look at the CRAN Task Views, and in particular at the CRAN Task View on Machine Learning and Statistical Learning, which summarises this nicely.
Both Matlab and R are good if you are doing matrix-heavy operations, because they can use highly optimized low-level code (BLAS libraries and the like) for this.
However, there is more to data mining than just crunching matrices. A lot of people totally neglect the whole data-organization aspect of data mining (as opposed to, say, plain machine learning).
And once you get to data organization, R and Matlab are a pain. Try implementing an R*-tree in R or Matlab to take an O(n^2) algorithm down to O(n log n) runtime. First of all, it totally goes against the way R and Matlab are designed (use bulk math operations wherever possible); secondly, it will kill your performance. Interpreted R code, for example, seems to run at around 50% of the speed of C code (try R's built-in k-means vs. flexclust's k-means), and the BLAS libraries are optimized to an insane level, exploiting cache sizes, data alignment, and advanced CPU features. If you are adventurous, try implementing a manual matrix multiplication in R or Matlab and benchmark it against the native one, as in the sketch below.
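If you want to try that benchmark in R, here is a rough sketch (the 200x200 size is arbitrary; the gap grows quickly with matrix size):

    # naive triple-loop matrix multiplication in interpreted R
    naive_matmul <- function(A, B) {
      C <- matrix(0, nrow(A), ncol(B))
      for (i in seq_len(nrow(A)))
        for (j in seq_len(ncol(B)))
          for (l in seq_len(ncol(A)))
            C[i, j] <- C[i, j] + A[i, l] * B[l, j]
      C
    }

    set.seed(1)
    A <- matrix(rnorm(200 * 200), 200)
    B <- matrix(rnorm(200 * 200), 200)

    system.time(C1 <- naive_matmul(A, B))  # interpreted loops
    system.time(C2 <- A %*% B)             # BLAS-backed native operator
    all.equal(C1, C2)                      # same result, wildly different timings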
Don't get me wrong. There is a lot of stuff where R and Matlab are just elegant and excellent for prototyping. You can solve a lot of things in just 10 lines of code and get decent performance out of it. Writing the same thing by hand would take hundreds of lines and probably be 10x slower. But sometimes you can optimize by an order of complexity, which for large data sets does beat the optimized matrix operations of R and Matlab.
If you want to scale up to "Hadoop size" in the long run, you will have to think about data layout and organization too, unless all you need is a linear scan over the data. But then again, you could just be sampling, too!
Yesterday I found two new books about data mining. This series of books, entitled 'Data Mining', addresses that need by presenting in-depth descriptions of novel mining algorithms and many useful applications. In addition to helping the reader understand each section deeply, the two books present useful hints and strategies for solving problems in the following chapters. The progress of data mining technology and its large public popularity establish a need for a comprehensive text on the subject. The books are "New Fundamental Technologies in Data Mining" (http://www.intechopen.com/books/show/title/new-fundamental-technologies-in-data-mining) and "Knowledge-Oriented Applications in Data Mining" (http://www.intechopen.com/books/show/title/knowledge-oriented-applications-in-data-mining). They are open-access books, so you can download them for free or read them on an online reading platform, as I do. Cheers!
We should not forget the origins of these two pieces of software: scientific computing and signal processing led to Matlab, while statistics led to R.
I used Matlab a lot at university, since we had it installed on Unix and open to all students. However, the price of Matlab is too high, especially compared to free R. If your main focus is not matrix computation and signal processing, R should work well for your needs.
I think it also depends on which field of study you are in. I know of people in coastal research who use Matlab a lot. Using R in such a group would make your life more difficult: if a colleague has solved a problem, you can't reuse the solution, because it was written in Matlab.
I would also look at the capabilities of each when you are dealing with large amounts of data. I know that R can have problems with this, and it might be restrictive if you are used to an iterative data mining process, for example looking at multiple models concurrently. I don't know whether MATLAB has a data limitation.
I admit to favoring MATLAB for data mining problems, and I give some of my reasoning here:
Why MATLAB for Data Mining?
I will admit to only a passing familiarity with R/S-Plus, but I'll make the following observations:
R definitely has more of a statistical focus than MATLAB. I prefer building my own tools in MATLAB, so that I know exactly what they're doing, and I can customize them, but this is more of a necessity in MATLAB than it would be in R.
Code for new statistical techniques (spatial statistics, robust statistics, etc.) often appears early in S-Plus (I assume this carries over to R, at least to some extent).
Some years ago, I found S-Plus (the commercial relative of R) to have an extremely limited capacity for data. I cannot say what the state of R/S-Plus is today, but you may want to check whether your data will fit into such tools comfortably.
I have been searching around the internet and Stack Overflow, but haven't been able to find any information on libraries for machine learning in S-PLUS or R. Does anyone know of any, or could you perhaps point me in the right direction? Thank you!
You should probably start at the CRAN Task View on Machine Learning & Statistical Learning which covers the R side.
This one may be helpful: machine learning open source software (filter by programming language).
If your interest in ML in R includes classification and regression algorithms, then you will want to explore the caret package. Be advised that, in a recent evaluation, I found the R offerings quite underwhelming compared to the other ML OSS.
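For a flavour of caret, a minimal sketch using the built-in iris data and a random-forest learner (the method and resampling choices here are just illustrative):

    library(caret)

    # 10-fold cross-validated random forest on the built-in iris data
    # (caret will prompt for the randomForest package if it is not installed)
    set.seed(42)
    fit <- train(Species ~ ., data = iris,
                 method = "rf",
                 trControl = trainControl(method = "cv", number = 10))
    fit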