I am new to Fortran and am trying to learn how to do simple plots. I already have a program that writes the values I want to plot to a file, but every example I've seen so far uses gnuplot. As the computer I'm using is not a personal computer, installing or downloading gnuplot is not really an easy option at first glance.
Would it be correct to assume that without gnuplot, plotting using Fortran 90 is very difficult?
Fortran is a general-purpose programming language. It is designed to work on any type of computer, even one without a screen or an operating system (with some newer language features for interacting with an OS where one is present).
Languages of this kind, such as Fortran, C, or C++, cannot do any graphical output or plotting directly. They require external libraries, written in a system-specific way, to interact with the graphical interface. Such libraries are available for Fortran, but using them is not trivial. It is much (MUCH!) harder than installing gnuplot, assuming you already know how to use gnuplot.
I will not recommend any such libraries as it is off-topic here.
You can use gtk-fortran. It is a GTK / Fortran binding, and it also offers an interface to PLplot:
https://github.com/vmagnin/gtk-fortran/wiki
But you need a Fortran 2003 compliant compiler (which is the case for all recent compilers).
Plotting with Fortran is generally not easy, because you need to install such libraries and learn how they work.
I'm an R newbie. Is there a way I can calculate
(x + x^2 + x^3)^2
in R, so that I get the result
x^6 + 2x^5 + 3x^4 + 2x^3 + x^2?
When I try, I get "Error: object 'x' not found".
Thanks!
R isn't well suited for this. Some interface packages to languages and libraries that are better at this do exist, such as rSymPy, which allows you to access the SymPy Python library for symbolic mathematics (you'll need to install both). In a similar vein, Ryacas links to the yacas algebra system.
Those interfaces are useful if you need symbolic manipulation as part of an R workflow. Otherwise, consider using the original tools directly. The ones above are open source and freely available; other free-to-use alternatives also exist, such as the proprietary, web-based Wolfram Alpha (for limited use).
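For the expansion above, here is a minimal sketch using Ryacas; it assumes the yac_str() interface of Ryacas 1.0 or later, and the exact output formatting may differ. The base-R alternative at the end only recovers the numeric coefficients, not a symbolic expression.

# Symbolic expansion via Ryacas (install.packages("Ryacas") first).
library(Ryacas)
yac_str("Expand((x + x^2 + x^3)^2)")
# should return something like "x^6+2*x^5+3*x^4+2*x^3+x^2"

# If only the numeric coefficients are needed, base R can get them by
# convolving the coefficient vectors (convolve() uses the FFT, hence round()):
p <- c(0, 1, 1, 1)                          # x + x^2 + x^3, coefficients of x^0..x^3
round(convolve(p, rev(p), type = "open"))   # 0 0 1 2 3 2 1, i.e. x^2 + ... + x^6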
Does one need to have SAS/IML installed to use the SAS/R interface? Or should/could one use the SAS x command to run R and feed data to it?
If you want to actually use the SAS/R interface, then yes, you must license and install SAS/IML, as the interface is specifically a feature of SAS/IML (which makes sense, as SAS/IML is SAS's matrix programming language, and R is a matrix programming language).
However, you're welcome to use R the way you describe (by submitting R programs via xcmd); you will need to use a CSV file or similar to exchange data between the two programs. There are several ways to do it, so look at the different options available and see what's easiest for you.
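To make the xcmd route concrete, here is a minimal sketch of the R side of such an exchange. The file names and the analysis are made up for illustration; on the SAS side you would export in.csv (e.g. with PROC EXPORT or a DATA step) and re-import results.csv afterwards.

# analyze.R -- read what SAS exported, do the work, write results back
dat <- read.csv("in.csv")                       # data exported from SAS
fit <- lm(y ~ x, data = dat)                    # whatever analysis you need
write.csv(coef(summary(fit)), "results.csv")    # results for SAS to re-import
# SAS would invoke this with something like:  x 'Rscript analyze.R';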
If you're choosing between the two approaches, here is a list of the advantages of using IML that serves as a nice comparison. It is perhaps a biased one (Rick is the lead developer of SAS/IML), but it is sufficiently detailed about what you won't have available when running R as a separate program that it should be helpful in making the decision.
I'm looking for references that explain the pros and cons of using Rcpp compared to using rdyncall.
Can someone who has used both explain the basic differences, from the perspective of an R package developer who is interested in providing R wrappers to C++ code?
Thanks.
I think we mention rdyncall in the (brief) comparison to other approaches in the intro vignette / JSS paper. It is a neat package, but it aims for a much lower-level connection. As I understand it, it gives you C-level APIs with the least amount of fuss, as motivated by, say, the rgl package. There is a very good and detailed paper about rdyncall in a recent R Journal issue.
And unless I am missing something, it does nothing for you on the C++ side. Rcpp, on the other hand, uses .Call() to pass complete R objects back and forth, and it maps a wide variety of R and C++ types automatically for you, with the possibility of adding your own mappers.
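To illustrate the kind of automatic mapping meant here, a minimal sketch using Rcpp's cppFunction(), which compiles a C++ snippet and exposes it to R in one step (requires the Rcpp package and a working C++ toolchain; the function itself is made up for illustration):

library(Rcpp)
cppFunction('
  double sumSquares(NumericVector x) {
    double total = 0;
    for (int i = 0; i < x.size(); ++i) total += x[i] * x[i];
    return total;                  // R numeric vector in, R scalar out
  }
')
sumSquares(c(1, 2, 3))             # returns 14

Note that the R-to-C++ and C++-to-R conversions (NumericVector, double) happen without any manual marshalling, which is exactly what rdyncall's lower-level approach does not give you.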
Over at http://scicomp.stackexchange.com I asked this question about parallel matrix algorithms in IDL. The answers suggest using a multi-threaded LAPACK implementation and suggest some hacks to get IDL to use a specific LAPACK library. I haven't been able to get this to work.
I would ideally like the existing LAPACK DLM to simply be able to use a multi-threaded LAPACK library and it feels like this should be possible but I have not had any success. Alternatively I guess the next simplest step would be to create a new DLM to wrap a matrix inversion call in some C code and ensure this DLM points to the desired implementation. The documentation for creating DLMs is making me cross-eyed though, so any pointers to doing this (if it is required) would also be appreciated.
What platform are you targeting?
Looking at idl_lapack.so with nm on my platform (Mac OS X, IDL 8.2.1) seems to indicate that the LAPACK routines are compiled directly into the .so, so my (albeit limited) understanding is that it would not be simple to swap in another implementation (e.g., by setting LD_LIBRARY_PATH).
$ nm idl_lapack.so
...
000000000023d5bb t _dgemm_
000000000023dfcb t _dgemv_
000000000009d9be t _dgeqp3_
000000000009e204 t _dgeqr2_
000000000009e41d t _dgeqrf_
000000000023e714 t _dger_
000000000009e9ad t _dgerfs_
000000000009f4ba t _dgerq2_
000000000009f6e1 t _dgerqf_
Some other possibilities...
My personal library has a directory src/dist_tools/bindings containing routines for automatically creating bindings for a library, given "simple" (i.e., not using typedefs) function prototypes. LAPACK would be fairly easy to create bindings for (the hardest part would probably be building the implementation you want to use: ATLAS, PLAPACK, ScaLAPACK, etc.). The library is free to use; a small consulting contract could be arranged if you would like the work done for you.
The next version of GPULib will contain a GPU implementation of LAPACK, using the MAGMA library. This is effectively a highly parallel option, but it only works on CUDA graphics cards. It would also work best if other operations besides the matrix inversion could be done on the GPU, to minimize memory transfer. This option costs money.
I would like to convert an ARIMA model developed in R using the forecast library to Java code. Note that I need to implement only the forecasting part. The fitting can be done in R itself. I am going to look at the predict function and translate it to Java code. I was just wondering if anyone else had been in a similar situation before and managed to successfully use a Java library for the same.
Along similar lines, and perhaps this is a more general question without a concrete answer: what is the best way to deal with situations where model building can be done in Matlab/R, but the prediction/forecasting needs to be done in Java/C++? Increasingly, I have been encountering this situation. I guess you have to bite the bullet and write the code yourself, and this is generally not as hard as writing the fitting/estimation yourself. Any advice on the topic would be helpful.
You write about 'R or Matlab' to 'C++ or Java'. This gives 2 x 2 choices, which is too many degrees of freedom for my taste. So allow me to concentrate on C++ as the target.
Let's consider a simpler case: Prototyping in R, and deploying in C++. If and when the R package you use is actually implemented in C or C++, this becomes pretty easy. You "merely" need to disentangle the routine you are after from its other dependencies (header files, defines, data structures, ...) and provide it with the data and parameters needed. I have done that in the past for production systems.
Here, you talk about the forecast package. It happens to depend on the RcppArmadillo package, which itself brings the nice Armadillo C++ library to R. So chances are you can in fact re-write this as a self-contained unit.
Armadillo is also interesting when you want to port Matlab to C++, as it was written with exactly that task in mind. I have ported some relatively extensive Matlab code to C++ and reaped a substantial speed gain.
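Whichever target language you pick, it helps to first pull the fitted model apart in R, since the ARMA orders and coefficients fully determine the point-forecast recursion you would re-implement. A minimal sketch, assuming the forecast package and a built-in example series:

library(forecast)
fit <- auto.arima(AirPassengers)   # fit in R, as planned
fit$arma                           # model orders: (p, q, P, Q, period, d, D)
coef(fit)                          # coefficients to carry over to the port
forecast(fit, h = 12)$mean         # reference forecasts to validate the port against

The last line gives you known-good values to test your Java/C++ reimplementation against, which is usually the hardest part of such a port to get right.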
I'm not sure whether this is possible in R, but in Matlab you can interact with your Matlab code from Java - see http://www.cs.virginia.edu/~whitehouse/matlab/JavaMatlab.html. This would enable you to leave all the forecasting code in Matlab and have e.g. an interface written in Java.
Alternatively, you might want to have the predictive code written in Java so that you can produce a model and then distribute a program that uses the model without a dependency on Matlab. The Matlab compiler may be useful here, but I've never used it.
A final, messier way of interacting between Matlab and Java would be (on Linux) using pseudoterminals, where you would have a pty/tty pair to interface Java and Matlab. In this case you would send data from Java to Matlab, and have Matlab return the forecasting results. I expect this would also work in R, but I don't know the syntax.
In general, though, reimplementing the code is a decent solution and probably quicker than learning how to interface Java and Matlab or how to create Matlab libraries.
Some further information on the answer given by Richante: Matlab has some really nice capabilities for interop with compiled languages such as C/C++, C#, and Java. In your particular case you might find the Matlab Builder JA toolbox particularly relevant. It allows you to export your Matlab code directly to Java, meaning you can call the code you constructed during your model-building phase in Matlab directly from Java.
More information from the Mathworks here.
I am also concerned with converting "R to Java", so I will speak to that part.
As Vincent Zooneykind said in his comment, the PMML library in R makes sense for model export in general, but the forecast package is not supported as of yet.
An alternative is to use something like https://www.opencpu.org/ to call R from your Java program. It exposes the R code over an HTTP server; you can then call it with parameters, as with a normal HTTP call, and read back what is needed using java.net.HttpURLConnection or one of the HTTP client libraries available for Java. A minimal sketch of the R side follows the pros and cons below.
Pros: separation of concerns, and no need to rewrite the R code.
Cons: you are invoking an R server in your live process, so you need to make sure that is handled robustly.
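As promised above, a minimal sketch of the R side you would expose through OpenCPU. The package name (myforecasts) and function (forecast_sales) are made up for illustration; the endpoint shown follows OpenCPU's /ocpu/library/.../R/... URL scheme.

# In a hypothetical R package "myforecasts", define:
forecast_sales <- function(history, horizon = 12) {
  fit <- forecast::auto.arima(as.numeric(history))    # refit or load a saved model
  as.numeric(forecast::forecast(fit, h = horizon)$mean)
}
# Java would then POST to the OpenCPU endpoint, e.g.
#   POST /ocpu/library/myforecasts/R/forecast_sales/json
# with fields history=[...] and horizon=12, and parse the JSON array returned.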