Regression in Maple - finding the best possible curve - plot

Given some data points plotted in a graph by:
plot(<<a>|<b>>,style=point);
How do I do regression through the points? That is, how do I find the best possible curve, straight line, etc.?

You could look at the help pages for the CurveFitting package, or those for Statistics,Regression. (Note that the latter should appear if you enter regression in the Maple 16 Help browser's search box.)
a:=<1,2,3,4,5>:
b:=<2,5,10,17,26>:
P1:=plot(<a|b>,style=point,color=red):
C1:=Statistics:-Fit(s+t*x,a,b,x);
C2:=Statistics:-Fit(s+t*x+u*x^2,a,b,x);
plots:-display(
P1,
plot(C1, x=min(a)..max(a), color=green),
plot(C2, x=min(a)..max(a), color=blue)
);
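For readers outside Maple, the same two least-squares fits can be sketched in Python with NumPy. This is an analogy, not part of the Maple answer; the data are the points from the example above.

```python
import numpy as np

# Data from the Maple example above.
a = np.array([1, 2, 3, 4, 5], dtype=float)
b = np.array([2, 5, 10, 17, 26], dtype=float)

# Least-squares fits of degree 1 (line) and degree 2 (parabola),
# analogous to Statistics:-Fit with s+t*x and s+t*x+u*x^2.
line = np.polyfit(a, b, deg=1)   # [slope, intercept]
quad = np.polyfit(a, b, deg=2)   # [quadratic, linear, constant]

print(line)  # approximately [6., -6.]
print(quad)  # approximately [1., 0., 1.]  (the data lie on x^2 + 1)
```

The quadratic fit recovers the exact generating curve here because the five points lie on x^2 + 1, while the straight line is the best compromise in the least-squares sense.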

Related

Does this curve represent non-linearity in my residuals vs fitted plot? (simple linear regression)

Hi,
I am running a simple linear regression model in R at the moment and wanted to check my assumptions. As seen in the plot, my red line does not appear to be flat and is instead curved in places.
I am having a little difficulty interpreting this: does it imply non-linearity? And if so, what does this say about my data?
Thank you.
The observation marked 19 on your graph (bottom right corner) seems to have significant influence and is pulling the line down more than the other points are pulling it up. All in all, the relationship looks linear. You can neutralize that outlier either by increasing the sample size (law of large numbers) or by removing it; either should fix your problem without compromising the story your data is trying to tell you, and give you the nice graph you're looking for.
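The "significant influence" the answer points to can be made quantitative with Cook's distance. A minimal Python sketch on hypothetical data (the question's actual data is not available), using only NumPy:

```python
import numpy as np

# Hypothetical data: a roughly linear trend plus one low outlier,
# standing in for the influential observation in the question.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 20)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)
y[-1] -= 8.0  # pull the last point far below the line

# Simple linear regression via the design matrix [1, x].
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Leverage (diagonal of the hat matrix) and Cook's distance.
H = X @ np.linalg.inv(X.T @ X) @ X.T
h = np.diag(H)
p = X.shape[1]
s2 = resid @ resid / (x.size - p)
cooks_d = resid**2 / (p * s2) * h / (1 - h) ** 2

print(np.argmax(cooks_d))  # index of the most influential point
```

A point with Cook's distance far above the rest, like the outlier here, is exactly the kind of observation that drags the fitted line as described above.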

LOESS smoothing - geom_smooth() vs loess()

I have some data which I would like to fit with a model. For this example we have been using LOESS smoothing (fewer than 1,000 observations). We applied LOESS smoothing using the geom_smooth() function from the ggplot2 package. So far, so good.
The next step was to obtain the first derivative of the smoothed curve, and as far as we know this cannot be extracted from geom_smooth(). Thus, we sought to build the model manually with loess() and extract the first derivative from that.
Strangely, however, we observed that the plotted geom_smooth() curve differs from the manually constructed loess() curve. This can be seen in the figure below: the geom_smooth() curve in red and the loess() curve in orange.
If somebody would be interested, a minimal working reproducible example can be found here.
Would somebody be able to pinpoint why the curves are different? Is it because of the settings of the two fits? To obtain a meaningful derivative we need to ensure that the curves are identical.
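Whatever smoother is ultimately used, once the fitted curve can be evaluated on a regular grid its first derivative can be approximated numerically with central differences. A minimal Python sketch (the question's R/ggplot2 setup is replaced here by a synthetic curve purely for illustration):

```python
import numpy as np

# Hypothetical grid; any smoother's output evaluated on a regular
# grid (LOESS, spline, ...) can be differentiated the same way.
x = np.linspace(0, 2 * np.pi, 200)
y_smooth = np.sin(x)  # pretend this is the smoother's fitted curve

# First derivative of the fitted curve via central differences.
dy_dx = np.gradient(y_smooth, x)

# Sanity check: for sin(x) the derivative should be close to cos(x).
print(np.max(np.abs(dy_dx - np.cos(x))))  # small discretization error
```

This sidesteps the need to extract a derivative from the fitting function itself, though it still requires the two curves in the question to agree before the derivative is meaningful.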

Confidence ellipse formula in JS or R

What I have: a scatter chart (plot) of PCA results, plotted in JS. I have Rtools that I've used to push the PCA data to the client side.
What I'm trying to do: Plot a confidence ellipse formula.
I can't seem to find a straightforward formula for the confidence ellipse. I came across a lot of theory and a lot of examples in R which give you the end result, an ellipse (one can use ggplot or a CRAN package to plot it).
But I'm looking for a formula that I could use on the client side to plug in my scatter-chart points and calculate the ellipse, or better yet a function in R that would give me a formula for the ellipse.
I have the covariance matrix and eigenvectors as well (calculated in R).
All suggestions much appreciated.
I haven't found a formula, but after using the internal Momocs:::conf_ell function I managed to get the vertices and the x,y points of an ellipse.
I will update this answer once I find the second part of my answer: a straightforward formula.
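Since the asker already has the covariance matrix and eigenvectors, the ellipse can be computed directly from them: the semi-axes point along the eigenvectors, with lengths equal to the square roots of the eigenvalues scaled by a chi-square quantile. A Python sketch of that arithmetic (the function name and data are illustrative; the same math ports directly to JS or R):

```python
import numpy as np
from scipy.stats import chi2

def confidence_ellipse(points, level=0.95, n=100):
    """Vertices of a confidence ellipse for 2-D points.

    Axes come from the eigendecomposition of the sample covariance;
    the radius scaling uses the chi-square quantile with 2 df.
    """
    mean = points.mean(axis=0)
    cov = np.cov(points, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)            # eigenvalues ascending
    r = np.sqrt(chi2.ppf(level, df=2) * vals)   # semi-axis lengths
    t = np.linspace(0, 2 * np.pi, n)
    circle = np.column_stack([np.cos(t), np.sin(t)])
    return mean + (circle * r) @ vecs.T

# Hypothetical correlated data, e.g. two PCA scores.
rng = np.random.default_rng(1)
pts = rng.multivariate_normal([0, 0], [[3, 1], [1, 2]], size=500)
ellipse = confidence_ellipse(pts)
print(ellipse.shape)  # (100, 2): x,y vertices ready to plot
```

Every vertex satisfies (p - mean)' S^-1 (p - mean) = chi2.ppf(level, 2), which is the defining equation of the confidence ellipse and can be checked point by point.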

Why does my linear regression fit line look wrong?

I have plotted a 2-D histogram in a way that I can add to the plot with lines, points etc.
Now I seek to apply a linear regression fit to the region of dense points; however, my linear regression line seems totally off from where it should be. To demonstrate, here is my plot on the left with both a lowess regression fit and a linear fit.
lines(lowess(na.omit(a), na.omit(b), iter = 10), col = 'gray', lwd = 3)
abline(lm(b[cc] ~ a[cc]), lwd = 3)
Here a and b are my values and cc indexes the points within the densest parts (i.e. where most points lie): red + yellow + blue.
Why doesn't my regression line look more like the one on the right (a hand-drawn fit)? If I were plotting a line of best fit, wouldn't it be there?
I have numerous plots similar to this, but I still get the same results...
Are there any alternative linear regression fits that could prove to be better for me?
A linear regression is a method to fit a linear function to a set of points (observations) by minimizing the least-squares error.
Now imagine your heatmap indicating a shape where you would assume a vertical line fits best. Just turn your heatmap 10 degrees counter-clockwise and you have it.
Now how would a linear function that is vertical be defined? Exactly: it is not possible.
The result of this little thought experiment is that you are confusing the purpose of linear regression; what you most likely want, as already indicated by Gavin Simpson, is the first principal component vector.
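The first-principal-component fit suggested in the answer minimizes orthogonal rather than vertical distances, which matters exactly when the point cloud is steep. A minimal Python sketch on hypothetical data, contrasting the PCA slope with the ordinary least-squares slope:

```python
import numpy as np

# Hypothetical point cloud elongated along a steep direction (true
# slope 10), where ordinary least squares looks visibly "off".
rng = np.random.default_rng(2)
t = rng.normal(size=500)
pts = np.column_stack([0.3 * t, 3.0 * t]) + rng.normal(scale=0.2, size=(500, 2))

# First principal component: direction of maximum variance.
cov = np.cov(pts, rowvar=False)
vals, vecs = np.linalg.eigh(cov)
direction = vecs[:, -1]  # eigenvector with the largest eigenvalue

# The orthogonal-fit line passes through the mean along `direction`.
pca_slope = direction[1] / direction[0]
ols_slope = np.polyfit(pts[:, 0], pts[:, 1], 1)[0]
print(pca_slope, ols_slope)  # PCA slope near 10; OLS slope noticeably flatter
```

The OLS slope is attenuated because noise in x pulls the vertical-error fit toward horizontal, while the principal-component direction tracks the cloud's actual axis.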

Measuring the limit of a point on a smooth.spline in R

I'm not sure if that's the right terminology.
I've entered some data into R, and I've put a smooth.spline through it using the following commands.
smoothingSpline = smooth.spline(year, rate, spar = 0.35)
plot(year, rate)
lines(smoothingSpline)
Now I'd like to measure some limits (i.e. where the curve is at a given x value), and maybe do some predictive analysis on points that extend beyond the graph.
Are there commands in R for measuring (or predicting) the points along a curve?
Is ?predict.smooth.spline what you are looking for?
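For comparison, the same idea in Python: SciPy's UnivariateSpline, a smoothing spline roughly analogous to smooth.spline, can be evaluated at arbitrary x values, including outside the observed range (extrapolation, like predict() on new data; use with caution). The data below are made up for illustration:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Hypothetical yearly rates, standing in for the question's (year, rate).
year = np.arange(2000, 2011)
rate = np.array([5.1, 5.0, 4.7, 4.4, 4.5, 4.1, 3.8, 3.9, 3.5, 3.2, 3.0])

# A smoothing spline; `s` controls smoothness, playing a role
# loosely similar to spar in R's smooth.spline.
spline = UnivariateSpline(year, rate, s=0.5)

# Evaluate the fitted curve at any x.
print(spline(2005.5))  # value on the curve between observations
print(spline(2012.0))  # extrapolated value beyond the data
```

As with predict.smooth.spline, values inside the data range are reliable, while extrapolated values depend heavily on the spline's end behavior.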
