pass value from R (run in SPSS) to open SPSS dataset

Completely new to R here. I ran R in SPSS to solve some complex polynomials from SPSS datasets. I managed to get the result from R back into SPSS, but it was a very inelegant process:
begin program R.
# Polynomial coefficients, read from case 1 of the active SPSS dataset.
z <- polyroot(unlist(spssdata.GetDataFromSPSS(variables=c("qE","qD","qC","qB","qA"), cases=1), use.names=FALSE))

# Remaining constants needed for the selection criterion, also from case 1.
otherVals <- spssdata.GetDataFromSPSS(variables=c("b0","b1","Lc","tInv","sR","c0","c1","N2","xBar","DVxSq"), cases=1)
b0    <- unlist(otherVals["b0"],    use.names=FALSE)
b1    <- unlist(otherVals["b1"],    use.names=FALSE)
Lc    <- unlist(otherVals["Lc"],    use.names=FALSE)
tInv  <- unlist(otherVals["tInv"],  use.names=FALSE)
sR    <- unlist(otherVals["sR"],    use.names=FALSE)
c0    <- unlist(otherVals["c0"],    use.names=FALSE)
c1    <- unlist(otherVals["c1"],    use.names=FALSE)
N2    <- unlist(otherVals["N2"],    use.names=FALSE)
xBar  <- unlist(otherVals["xBar"],  use.names=FALSE)
DVxSq <- unlist(otherVals["DVxSq"], use.names=FALSE)

# Select the root whose criterion value is closest to Lc, and keep its real part.
crit <- abs(abs(b0 + b1*Re(z) - tInv*sR*sqrt(1/(c0 + c1*Re(z))^2 + 1/N2 + (Re(z) - xBar)^2/DVxSq)) - Lc)
z2 <- Re(z[crit == min(crit)])

# Create a new one-variable SPSS dataset ("results") holding the answer.
varSpec1 <- c("Xd","Xd",0,"F8","scale")
dict <- spssdictionary.CreateSPSSDictionary(varSpec1)
spssdictionary.SetDictionaryToSPSS("results", dict)
new <- data.frame(z2)
spssdata.SetDataToSPSS("results", new)
spssdictionary.EndDataStep()
end program.
Honestly, it was mostly pieced together from somewhat-related examples and seems more complicated than it should be. I had to take the new dataset created by R and run MATCH FILES with my original dataset. All I want to do is a) pull numbers from SPSS into R, b) manipulate them (in this case, finding a polynomial root that meets certain criteria), and c) put the results right back into the SPSS dataset without disturbing any of the existing data.
Am I missing something that would make this simpler? Keep in mind that I have zero R experience outside of this attempt, but I have decent experience programming in SPSS and MATLAB.
Thanks in advance for any help you give!

R in SPSS can create new SPSS datasets, but it can't modify an existing one. There are many situations where the data coming from R would be dimensionally inconsistent with the active SPSS dataset. So you need to create a dictionary and data frame using the APIs above and then do whatever is appropriate on the SPSS side if you need to match back. You might want to submit an enhancement request for SPSS at suggest@us.ibm.com
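For what it's worth, the push-back half of the question's code is already close to the minimum those APIs require. A condensed sketch of the same pattern (the variable spec and the "results" dataset name are taken straight from the question; the computation is just a stand-in):
begin program R.
# Read the polynomial coefficients from case 1 of the active dataset.
coefs <- unlist(spssdata.GetDataFromSPSS(variables = c("qE","qD","qC","qB","qA"), cases = 1),
                use.names = FALSE)

# Any R computation goes here; keeping the real part of the first root as a stand-in.
z2 <- Re(polyroot(coefs))[1]

# Push the result back out as a new one-variable dataset named "results".
dict <- spssdictionary.CreateSPSSDictionary(c("Xd", "Xd", 0, "F8", "scale"))
spssdictionary.SetDictionaryToSPSS("results", dict)
spssdata.SetDataToSPSS("results", data.frame(Xd = z2))
spssdictionary.EndDataStep()
end program.
The merge back onto the original cases then has to happen with SPSS syntax after the program block ends (for example with MATCH FILES, as the question already does).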

Related

Hampel Matlab vs R

I am trying to rewrite Matlab code in R and am failing at the Hampel filter, which seems to behave differently.
The vector I am applying the Hampel filter to is identical in Matlab and R:
absmeasuredAccelerations <- c(9.817899, 9.923724, 9.915009, 9.414430, 9.912013, 9.822199, 9.662423, 9.809928, 9.812976, 9.883809)
Matlab Source:
Filtered = hampel(absmeasuredAccelerations,30,0.05);
In R:
Filtered<- hampel(absmeasuredAccelerations, k=15,t0=0.05)$y
Comparing the output between Matlab and R shows a significant difference.
Even assuming the two implementations differ, it must somehow be possible to reproduce the results. Does anyone have ideas, or am I doing something wrong?
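For reference, a self-contained version of the R side, assuming the R hampel() comes from the pracma package (k is shrunk here only so the snippet runs on the ten-value excerpt; the question uses k = 15 on the full series):
# install.packages("pracma")   # if not already installed
library(pracma)

absmeasuredAccelerations <- c(9.817899, 9.923724, 9.915009, 9.414430, 9.912013,
                              9.822199, 9.662423, 9.809928, 9.812976, 9.883809)

# pracma::hampel(x, k, t0): k is the number of neighbours on each side
# (window length 2*k + 1), and t0 is the outlier threshold.
filtered <- hampel(absmeasuredAccelerations, k = 2, t0 = 0.05)$y
print(filtered)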

RStudio - computing a crosstab and creating a table

I am new to programming and R. My R experience thus far consists of Udemy courses, specifically the Beginner and Intermediate R courses.
My data analysis background is heavily Excel and SPSS, so I am trying to carry over those skills and find applicable analysis strategies in R.
I am attempting to compute a crosstab, which will output the frequencies for the sets of 'character' data I am analyzing.
Below is the piece of code I used to create a crosstab:
crosstab(Survey, row.vars = c("testcode","outcome"), col.vars = "svy1", type = "j")
I can see the output in the Console, but I am unable to move it into its own matrix-like table in the Environment; the purpose is to create matrix-like tables for reporting. I am sure there is an easy fix I am overlooking, but any help is appreciated.
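One way to keep the result as an object, sketched with base R only (the crosstab() above appears to come from an external script or package, so the variable names below are assumptions carried over from that call):
# Cross-tabulate frequencies and keep the result as an object in the Environment.
tab <- xtabs(~ testcode + outcome + svy1, data = Survey)

# Flatten to a data frame (one row per combination) for reporting.
tab_df <- as.data.frame(tab)
head(tab_df)

# ftable() gives a flat, printable layout closer to a classic crosstab.
ftable(tab, row.vars = c("testcode", "outcome"), col.vars = "svy1")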

R loop from a txt or csv file

I am learning to use R and am working with the for loop.
Here is an example:
for (loopvalues in c(1, 5, 8, 10, 19)) {
  print(paste("The number is", loopvalues))
}
I was wondering what can be done if the list contains as many as 100 or 1,000 different values that follow no pattern.
I imagine I could save the values in a CSV or TXT file beforehand, but how can I tell the loop to read the values from that file?
I am sure the question is very basic, so I thank you beforehand for your help!
For loops can be used for extremely long lists; however, you will often find that they become slow, and you will want to use other tools such as the apply family.
You do not have to name all the values in the loop. One way to accomplish this is to loop over a vector with in. Here is an example using the mtcars data set that comes preloaded with R.
for (carb in unique(mtcars$carb)) {
  print(carb)
}
By using the unique function, I don't even have to know what all the possible values of mtcars$carb are, but I can still loop through them.
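The same idea extends to values kept in a file: read them into a vector first, then loop over that vector. A minimal sketch (the file names and the column name are assumptions):
# One value per line in a plain text file:
loopvalues <- scan("values.txt")

# Or, for a CSV with a header and a column called "value":
# loopvalues <- read.csv("values.csv")$value

for (val in loopvalues) {
  print(paste("The number is", val))
}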
Additionally, you probably want to practice your googling skills instead of asking StackOverflow. Most of the questions you're going to ask when learning R are already out there.

How to export multivariate forecast results from R to Excel

I'm terribly new to R, so I apologize if there's a way to do this with a slight variation of existing code or an existing package.
I've created yearly forecasts of a variable (student enrollment) for 129 countries using the predict command and then bound them together; I've done this because I'm forecasting with a multivariate regression.
Here's what I'm doing (if this helps):
# Fit the log-linear model and get the residual standard error.
fm1 <- lm(log(y + 1) ~ Var.Ind)
XNew <- data.frame(Var.Ind)
rse <- summary(fm1)$sigma

# Back-transform the fitted values with a lognormal bias correction.
yhat1 <- exp(predict(fm1, XNew) + rse * rse / 2) - 1

# Build the predictor matrix for 2015 from the 2014 rows.
pos2014 <- which(Var.Ind[, 1] == 2014)
Var.Ind.2015 <- model.matrix(~ as.matrix(Imp.Data4[pos2014, -2]) - 1)
head(Var.Ind.2015)
Var.Ind.2015 <- data.frame(Var.Ind.2015)
Var.Ind.2015.Ord <- as.data.frame(Var.Ind.2015[order(Var.Ind.2015[, 3], Var.Ind.2015[, 1]), ])
head(Var.Ind.2015.Ord)
X.New.New <- data.frame(cbind(model.matrix(~ as.matrix(Var.Ind.2015.Ord))))
head(X.New.New)
ColNames.N <- ColNames[-2]
colnames(X.New.New) <- c("Int", ColNames.N, "Lag1", "Lag2")
head(X.New.New)

# Apply the estimated coefficients to get the 2015 forecasts.
Beta.Coef <- matrix(as.numeric(fm1$coefficients), ncol = 1)
Beta.Coef
Pred2015 <- as.data.frame(cbind(X.New.New[, 3], exp(as.matrix(X.New.New) %*% Beta.Coef + rse * rse / 2) - 1))
dim(Pred2015)
colnames(Pred2015) <- c("country", "Yhat")
# (And so on for subsequent years until 2030.)
cbind(Pred2015, Pred2016, Pred2017, Pred2018, Pred2019)
I need to figure out how to make sense of these results:
a) how to export the forecast results to Excel, or
b) alternatively, how to put these results into a table using R.
Also, these results do not appear in the Global Environment, only in the console output, which is why I am not asking how to export a dataset in general, but rather these specific results.
As previously mentioned, my coding knowledge is limited to my one week of experience with R (I usually work with Stata).
Any help would be greatly appreciated!
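A hedged sketch of one way to get both, once the Pred20xx data frames from the question exist (the file names here are assumptions):
# (b) Keep the combined forecasts as a named object so it shows up in the
#     Global Environment and can be inspected like any other data frame.
all_forecasts <- cbind(Pred2015, Pred2016, Pred2017, Pred2018, Pred2019)
head(all_forecasts)

# (a) Export to a file Excel can open directly.
write.csv(all_forecasts, "enrollment_forecasts.csv", row.names = FALSE)

# Or, if a true .xlsx file is preferred and the writexl package is installed:
# writexl::write_xlsx(all_forecasts, "enrollment_forecasts.xlsx")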

reaching max.print on R

I just found a bunch of weather data that I would like to play around with in glmnet in R. First I've been reading and organizing the data in R, and right now I am just trying to look at the raw data of each variable. Unfortunately, each variable has a lot of data and R isn't able to print it all. Is there a way I can view all the raw data, either in R or just in the file itself? I've tried opening the file in Excel without success. Thanks!
Try using frequency tables; you can group by segments.
str(), summary(), table(), pairs(), plot(), etc. There are several libraries (such as descr) that facilitate analyzing numeric and factor variables. Let me know if you need help with any of these.
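If the goal really is to see everything, two hedged options (the object and file names here are assumptions):
# Raise R's print limit (the default max.print is 99999 elements).
options(max.print = 1e6)
print(weather_data$temperature)

# Often more practical: write the data out and inspect the file instead.
write.csv(weather_data, "weather_raw.csv", row.names = FALSE)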
