R programming spectrum analysis

Hello, I am new to R programming in RStudio. I will be analyzing Raman spectral data in the future.
Which package would be useful for spectral data analysis? I would like to learn that package. I have attached an image of how I want to analyze the data. Please give me suggestions on how to plot the graph shown in the figure in RStudio.
Thanks in advance.

There is a free package called hyperSpec that was specifically designed to handle spectral data together with associated extra data (e.g. experimental parameters). The package also provides an interface for common operations, like baseline correction, selection of spectral ranges, normalization, PCA, etc. Moreover, there is a host of plotting functions.
You can install it from CRAN with install.packages("hyperSpec"); however, as of today the CRAN version is outdated. I would recommend fetching the most recent build from GitHub and installing it via RStudio (look for Packages -> Install -> From Package Archive File).
hyperSpec comes with an extensive documentation and example datasets. To browse through tutorials, run
browseVignettes("hyperSpec")
Plotting is as easy as
plot(chondro) # left plot
qplotspc(chondro) + ggtitle("Example dataset") # right plot
To import your own data, look for functions inside hyperSpec whose names start with read. Just start typing hyperSpec::read and a pop-up will appear. A lot of device-specific data formats are supported. See vignette("fileio") for details.
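For example, a minimal import-and-plot sketch might look like the following (the file name and the Renishaw text format are only assumptions; pick the read.* function that matches your instrument's export format):
library(hyperSpec)
# import Raman spectra exported as a Renishaw ASCII file (hypothetical file name)
spc <- read.txt.Renishaw("my_raman_map.txt", data = "xyspc")
# restrict to a spectral range of interest (in wavenumbers) and plot
plot(spc[, , 600 ~ 1800])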

Related

Eliminating the need for packages in base R?

I know one of the reasons R is so popular is its amazing packages. But for data-security reasons, I can't install packages on my work computer. So it got me thinking whether I could still make R do what I would typically make it do using packages, with just base R, since packages are, after all, a compiled list of functions. I am wondering if it is possible to run regression models and make charts in base R (without using, say, ggplot2, caret, etc.). Is it possible to copy the functions in these packages into base R to get the same functionality out of base R as one would using the packages? Is the list of functions that are published as part of these packages available somewhere publicly, by chance?
I am wondering if it is possible to run regression models and make charts in base R (without using, say, ggplot2, caret, etc.).
Yes, before ggplot2 was invented, R was generally praised for publication-ready graphics. R comes with great plotting capabilities even without ggplot2, although the latter is definitely an improvement.
Obviously, people used R for regression decades before caret was invented. A base R installation comes with a solid set of linear and nonlinear regression methods, but obviously all those packages (well, most of them) have a reason to exist. It will mainly depend on what you plan to do. Many things are implemented in a base installation; many are not.
You can find lists of packages included with all binary distributions of R here: https://cran.r-project.org/doc/manuals/r-release/R-FAQ.html#Add_002don-packages-in-R
You will find that this not only includes the stats package but also lots of useful modelling packages like MASS, splines, boot, mgcv, nlme, cluster, rpart, spatial and survival, so a large number of even quite specialized models is at hand without downloading additional packages.
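As a small illustration, a regression model and a plot need nothing beyond a base installation (this sketch uses the built-in mtcars dataset):
# linear regression with base R only
fit <- lm(mpg ~ wt + hp, data = mtcars)
summary(fit)
# base graphics scatter plot with the fitted line for weight alone
plot(mpg ~ wt, data = mtcars, xlab = "Weight (1000 lbs)", ylab = "Miles per gallon")
abline(lm(mpg ~ wt, data = mtcars), col = "red")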
Is it possible to copy the functions in these packages into base R to get the same functionality out of base R as one would using the packages?
Many packages contain just plain R code; others contain code in other languages, mostly C and C++, which needs a compiler to be translated on your system. However, where the use of foreign code / packages is considered a security breach, you should refrain from that and talk to your employer.
If it is not considered a problem but they do not want to make exceptions for you and your installation: I was in the same place for quite some time, and I just ran R from a USB stick. If that is allowed and feasible on your system, you can download packages to that USB-stick installation.

Need example data for the PVF package (Photo Voltaic and Solar Forecasting)

I am working on PV forecasting (predicting the AC power that can be generated by a solar power plant). I am trying to use the PVF package for that.
I tried to work out how to use the package, but there is no sample data provided with it. The package is available at https://github.com/iesiee/PVF
There is nothing in the Readme.md file either.
It would be of great help if someone could point me to an example dataset to work with the PVF package.
I am seriously struggling with how to start working on it, as I don't have any data or any idea of the flow of functions to use.
Suggestions on how to contact the contributor would also be welcome.
The meteorological data was retrieved from Meteogalicia using the meteoForecast package. The output power was obtained from actual measurements of private PV plants. We are not allowed to publish these datasets, but the package is designed to work with almost any file.
Both meteorological data and power measurements were combined to be used as input to the prediction functions of the PVF package, as described in this paper and in the help pages of the package and its functions.

R package, size of dataset vis-a-vis code

I am designing an R package (http://github.com/bquast/decompr) to run the Wang-Wei-Zhu export decomposition (http://www.nber.org/papers/w19677).
The complete package is only about 79 kilobytes.
I want to supply an example dataset, especially because the input objects are somewhat complex. A relevant real-world dataset is available from http://www.wiod.org; however, the total size of the .Rdata object would come to about 1 megabyte.
My question therefore is, would it be a good idea to include the relevant dataset that is so much larger than the package itself?
It is not usual for code to be significantly smaller than data. However, I will not be the only one to suggest the following (especially if you want to submit to CRAN):
Consult the Writing R Extensions manual. In particular, make sure that the data file is in a compressed format and use LazyData when applicable.
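As a minimal sketch of that advice (the object and file names below are placeholders, not part of the decompr package):
# save the example dataset with xz compression into the package's data/ directory
save(wwz_example, file = "data/wwz_example.rda", compress = "xz")
# or re-compress existing .rda files in place
tools::resaveRdaFiles("data", compress = "xz")
Then add LazyData: true to the DESCRIPTION file so the dataset is only loaded when it is actually used.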
The CRAN Repository Policies also have a thing or two to say about data files. There is a hard maximum of 5MB for documentation and data. If the code is likely to change and the data are not, consider creating a separate data package.
PDF documentation can also be distributed, so it is possible to write a "vignette" that is not built by running code when the package is bundled, but instead illustrates usage with static code snippets that show how to download the data. Downloading in the vignette itself is prohibited, as the manual states that all files necessary to build it must be available on the local file system.
I would also ask whether including a subset of the data would be sufficient to illustrate the use of the package.
Finally, if you don't intend to submit to a package repository, I can't imagine a megabyte download being a breach of etiquette.

Using Protovis with R

As the title says, I am satisfied with what R and ggplot2 can do for static graphs, but what about interactive graphs? How can I combine R and Protovis to make such graphs?
There is something called rwebvis, but it seems it is no longer active.
Any suggestions? Thanks.
Well, first you need a web server. Ooh, R has one of those now. Then you need some way of generating output on the web from R code - ooh, R has one of those too:
http://jeffreybreen.wordpress.com/2011/04/25/4-lines-of-r-to-get-you-started-using-the-rook-web-server-interface/
So you can then write R server pages that return JSON-encoded data that you can feed to Protovis - or if you want to get right up to date, to D3, which is Protovis++ and made of win.
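A rough sketch of that idea (the endpoint name and the use of the jsonlite package are assumptions; any JSON encoder will do) is to serve a JSON endpoint from R with Rook and point your Protovis or D3 script at it:
library(Rook)      # R web server interface
library(jsonlite)  # JSON encoding (assumption; rjson would work as well)
s <- Rhttpd$new()
s$add(name = "cardata",
      app = function(env) {
        res <- Rook::Response$new()
        res$header("Content-Type", "application/json")
        res$write(toJSON(cars))  # built-in dataset, standing in for your own results
        res$finish()
      })
s$start()
# the JSON is now available at http://127.0.0.1:<port>/custom/cardata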
iplots is a fairly useful package that allows interactive graphing (by this I mean selection linking between graphs, color linking, etc.). It has some limitations and is not really made for producing plots as much as for exploring data trends.
Acinonyx, which is supposed to be an updated version of iplots, was also recently updated, but from what I can tell it still has some work to do.
I am not familiar with Protovis or rwebvis.
There is a package called googleVis, built on the Google Charts API, that enables some interactivity. It produces plots that are embeddable online. If you like Protovis, the same author has another library called D3.
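For instance, a minimal googleVis sketch (assuming the package is installed) might be:
library(googleVis)
# interactive scatter chart of a built-in dataset; plot() opens it in the browser
chart <- gvisScatterChart(cars, options = list(width = 500, height = 400))
plot(chart)
# chart$html$chart contains the embeddable HTML/JavaScript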
For running R on a web server, I have been experimenting with RApache, which enables you to link your R installation to an Apache server.
If the interactivity does not need to be online, RStudio has a package called manipulate which may also be of interest.
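A tiny manipulate sketch (it only runs inside the RStudio IDE):
library(manipulate)
# vary the number of histogram bins with an interactive slider
manipulate(
  hist(faithful$eruptions, breaks = bins, col = "grey"),
  bins = slider(5, 50, initial = 20)
)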

Creating interactive applets from R output

Currently, I generate results from statistical analyses (e.g., a three-dimensional plot) and then "manually" move them to Processing (a graphics programming language), where I can, with some simple coding, export an interactive Java applet (e.g., allowing the person viewing the plot to move in, out, and around the data points). Can I keep this whole process within R? Specifically, I want to create an applet (it doesn't have to be Java, but it would need to be web-embeddable, interactive (so not a movie), and not require the user to work in R or download anything) that can be passed on.
Thanks.
I'm not totally clear on your requirements: can you be sure that the user will have R installed (e.g. can you run a script on their desktops to install everything first)? Does it have to run over the web?
The animation package (http://cran.r-project.org/web/packages/animation/) isn't interactive, but it can create moving images.
The iplots package is useful, although it requires R: http://rosuda.org/iPlots/iplots.html
Similarly, rggobi is extremely useful for interactive graphics, but it also requires R. You can read more at http://www.jstatsoft.org/v30/b07/paper and http://www.ggobi.org/rggobi/.
A last example is BiplotGUI: http://r-forge.r-project.org/projects/biplotgui/
I heard that there's a project in development to create Flash output from R, but I can't find anything about it.
Can I keep this whole process within R?
Check out GGobi:
GGobi is an open source visualization program for exploring high-dimensional data. It provides highly dynamic and interactive graphics such as tours, as well as familiar graphics such as the scatterplot, barchart and parallel coordinates plots. Plots are interactive and linked with brushing and identification.
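A minimal sketch of driving GGobi from R via the rggobi package (assuming GGobi itself is installed alongside the R package):
library(rggobi)
# launch an interactive GGobi session on a built-in dataset;
# the resulting plots are linked, and points can be brushed and identified
g <- ggobi(iris)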
