Package for Converting Time Series to be Stationary in R [closed] - r

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.
Closed 4 years ago.
Improve this question
Are there any packages in R that will do the work of transforming a univariate or bivariate time series to make it stationary?
Thanks; any help would be greatly appreciated!

Is there a one-for-all package with a bunch of different functions to convert a non-stationary time series to a stationary one? Not as far as I know.
It's all about the data and figuring out which method would work.
To check whether your time series is stationary, you can try Box.test() (base R's stats package) or adf.test() and kpss.test() (both in the tseries package).
Did you try diff()? diff() calculates the differences between all consecutive values of a vector.
"One way to make a non-stationary time series stationary is to compute the differences between consecutive observations. This is known as differencing." - from link
Another option is a log() transformation, which is often used together with diff().
Other methods include squaring, log-differencing, and lagging. You could try different combinations of those techniques, for example a log squared difference, or try other things like Box-Cox transformations.
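A minimal sketch of the workflow above, assuming the tseries package is installed: difference (and log-difference) a simulated random walk, then run the stationarity tests mentioned.

```r
# Difference a non-stationary series, then test the result.
library(tseries)   # provides adf.test() and kpss.test()

set.seed(1)
x <- cumsum(rnorm(200)) + 50   # a random walk: non-stationary by construction

dx     <- diff(x)        # first differences
log_dx <- diff(log(x))   # log-differences (series must be positive)

adf.test(dx)                       # small p-value suggests stationarity
kpss.test(dx)                      # here the NULL is stationarity, so a small p-value suggests non-stationarity
Box.test(dx, type = "Ljung-Box")   # tests for remaining autocorrelation
```

Note that the ADF and KPSS tests have opposite null hypotheses, so it is worth running both and checking that they agree.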

Related

Compute first few principal components of a large data set, quickly [closed]

I'm working with large data sets (matrices of dimension 6000 x 3072) and using the prcomp() function for my principal component calculations. However, the function is extremely slow. Even with the rank. argument, which can limit the number of components to compute, it still takes 7-8 minutes. I need to calculate principal components 45 times, plus some other intermediate calculations that take a few minutes on their own, so I don't want to sit staring at my computer screen for 8-9 hours for this simple analysis.
What are the fastest principal component analysis packages I can use to speed up the process? I only need the first 20 components, so the majority of the computation can be skipped.
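A hedged sketch of the situation: prcomp()'s rank. argument truncates the output but still computes a full SVD internally, so a truncated-SVD package such as irlba (assuming it is installed) is usually much faster when only the leading components are needed. The matrix below is a smaller stand-in for the 6000 x 3072 case.

```r
# Compare full-SVD PCA against a truncated SVD of only 20 components.
library(irlba)   # provides prcomp_irlba()

set.seed(1)
X <- matrix(rnorm(600 * 300), 600, 300)   # smaller stand-in for the real data

p_full  <- prcomp(X, rank. = 20)    # full SVD, output truncated to 20 PCs
p_trunc <- prcomp_irlba(X, n = 20)  # computes only the 20 leading components

dim(p_trunc$x)   # 600 x 20 score matrix
```

On matrices of the size described, the truncated computation typically finishes in seconds rather than minutes.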

How to create a mathematical function from data plots [closed]

I am by no means a math person, but I am really trying to figure out how to create a graphable function from some data points I measured from a chemical titration. I have been trying to learn R, and I would like to know if anyone can explain to me, or point me to a guide on, how to create a mathematical function of the titration graph below.
Thanks in advance.
What you are looking for is interpolation. I'm not an R programmer, but I'll try to answer anyway.
One of the more common ways to achieve the function you want is polynomial interpolation, which usually gives back an Nth-degree polynomial function, where N is the number of data points minus one (1 point gives a constant, 2 points make a line, 3 make a*x^2 + b*x + c, and so on).
Other common alternatives, which I've learned are used in computer graphics, are splines, B-splines, Bézier curves, and Hermite interpolation. Those make the curve smoother and better looking (I've been told they originated in the car industry, so they are less true to the data points).
TL;DR: there is an implementation of splines in R; the question Interpolation in R may lead you to your solution.
Hope you get to know your tool better and do great work.
In computer science this kind of work is called numerical methods (at least at my university). I've done some classwork and homework in this area while attending a numerical methods course (it can be found on GitHub), but it's nothing worth noting.
I would add a lot of links to Wikipedia, but Stack Overflow doesn't allow it.
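A minimal sketch of spline interpolation in base R. The titration-style points below are made up for illustration; splinefun() returns an actual R function that can be evaluated and plotted.

```r
# Hypothetical titration data: pH measured at several titrant volumes.
vol <- c(0, 5, 10, 15, 20, 25)             # titrant volume (mL), made up
pH  <- c(3.0, 3.4, 4.1, 7.0, 10.2, 11.0)   # measured pH, made up

f <- splinefun(vol, pH)   # cubic spline passing through every point

f(12.5)                          # evaluate the function anywhere in range
curve(f(x), from = 0, to = 25)   # plot the fitted curve
```

Because it interpolates, the fitted function reproduces each measured point exactly; for noisy data, a smoothing fit such as smooth.spline() may be more appropriate.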

spline approximation with specified number of intervals [closed]

Edited, because some of us thought that this question was off-topic.
I need to build a spline (approximation) on 100 points in one of the environments listed in the tags, but with an exact number of intervals (a maximum of 6 intervals, i.e. separate equations, over the whole domain). The packages/libraries in R and Maxima that I know let me build a spline on these points, but with 25-30 intervals (separate equations). Does anyone know how to build a spline with a set number of intervals without coding the whole algorithm from scratch?
What you're looking for might be described as "local regression" or "localized regression"; searching for those terms might turn up some hits.
I don't know if you can find exactly what you've described. But implementing it doesn't seem too complicated: (1) Split the domain into N intervals (say N=10). For each interval, (2) make a list of the data in the interval, (3) fit a low-order polynomial (e.g. cubic) to the data in the interval using least squares.
If that sounds interesting to you, I can go into details, or maybe you can work it out yourself.
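The steps above can be sketched in a few lines of R; all names here are made up, and the data are simulated for illustration.

```r
# Split the domain into N intervals and fit a cubic to each by least squares.
set.seed(1)
x <- sort(runif(100, 0, 10))
y <- sin(x) + rnorm(100, sd = 0.1)   # simulated stand-in for the 100 points

N      <- 6
breaks <- seq(min(x), max(x), length.out = N + 1)   # step (1): N intervals
bin    <- cut(x, breaks, include.lowest = TRUE)      # step (2): assign data

fits <- lapply(split(data.frame(x, y), bin), function(d)
  lm(y ~ poly(x, 3), data = d))   # step (3): one cubic per interval

length(fits)      # 6 separate fitted equations
coef(fits[[1]])   # coefficients of the first piece
</test>
```

This piecewise fit is not constrained to be continuous at the breakpoints; if continuity matters, a regression spline with explicit knots (e.g. lm() with splines::bs()) is the usual alternative.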

How can I calculate predictor coefficients in linear prediction model in R? [closed]

I'm quite new to R and I have the following problem:
I have a time series/signal and I want to build a linear prediction model. Something like Matlab's lpc seems ideal, but I can't find a corresponding function in R. Which package should I use?
It seems like you're talking about an autoregressive (AR) model; the Yule-Walker equations seem to be at the heart of what you linked to. In that case, the ar() function in the base R installation may suffice, or, for more complicated models, the arima() function, also in the base installation.
You should also look at the Time Series Task View on CRAN for additional information on suitable packages; I recommend you consult it for further options.
The kind of analysis you are trying to do can be done using packages in the Time Series Task View. Most likely you want some kind of autoregressive (AR) model, for which I refer you to the "Forecasting and Univariate Modeling" section of that task view. The linear filtering method you mentioned is probably implemented in packages like robfilter; more information can be found in the "Decomposition and Filtering" section of the task view.
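A minimal sketch of the ar() approach, on a simulated series: ar() solves the Yule-Walker equations by default, which is close in spirit to Matlab's lpc.

```r
# Fit an autoregressive model with base R's ar() (Yule-Walker by default).
set.seed(1)
x <- arima.sim(model = list(ar = c(0.6, -0.3)), n = 500)   # simulated AR(2)

fit <- ar(x, order.max = 10)   # order selected by AIC up to 10
fit$order                      # chosen model order
fit$ar                         # estimated predictor coefficients

# The same kind of model via arima(), fixing the order explicitly:
arima(x, order = c(2, 0, 0))
```

The fit$ar vector is the set of linear predictor coefficients you would feed into a one-step-ahead prediction.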

How to calculate Total least squares in R? (Orthogonal regression) [closed]

I didn't find a function to calculate orthogonal regression (TLS - total least squares).
Is there a package with this kind of function?
Update: I mean measuring the distance of each point symmetrically, not asymmetrically as lm() does.
You might want to consider the Deming() function in package MethComp [function info]. The package also contains a detailed derivation of the theory behind Deming regression.
The following searches of the R Archives also provide plenty of options:
Total Least Squares
Deming regression
Your multiple questions on CrossValidated, here and R-Help imply that you need to do a bit more work to describe exactly what you want to do, as the terms "Total least squares" and "orthogonal regression" carry some degree of ambiguity about the actual technique wanted.
Two answers:
gx.rma in the rgr package appears to do this.
Brian Ripley has given a succinct answer on this thread. Basically, you're looking for PCA, and he suggests princomp. I do, too.
I got the following solution from this url:
https://www.inkling.com/read/r-cookbook-paul-teetor-1st/chapter-13/recipe-13-5
r <- prcomp(~ x + y)                          # PCA on the two variables
slope <- r$rotation[2, 1] / r$rotation[1, 1]  # slope of the first principal component
intercept <- r$center[2] - slope * r$center[1]
Basically, you perform a PCA that fits a line through x and y while minimizing the orthogonal residuals. You can then retrieve the intercept and slope from the first component.
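A self-contained version of that recipe, with simulated data (made up for illustration) so the orthogonal fit can be compared against ordinary lm():

```r
# Total least squares (orthogonal regression) via PCA, vs. ordinary OLS.
set.seed(1)
x <- rnorm(200)
y <- 2 * x + rnorm(200, sd = 0.5)   # true slope of 2, with noise

r <- prcomp(~ x + y)                # PCA minimizes orthogonal residuals
slope     <- r$rotation[2, 1] / r$rotation[1, 1]
intercept <- r$center[2] - slope * r$center[1]

c(tls = slope, ols = unname(coef(lm(y ~ x))[2]))   # compare the two slopes
```

The TLS slope treats errors in x and y symmetrically, so it generally differs from (and is at least as steep as) the OLS slope on the same data.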
For anyone coming across this question again: there now exists a dedicated package, onls, for that purpose. It is handled similarly to the nls() function (which implements ordinary nonlinear least squares).
