I am trying to process NanoBRET assay data in R to analyze competition between ternary complex (TC) formation and binary binding between a chimeric targeted molecule and a weaker-affinity interacting species. I could not locate a library function that performs a biphasic dose-response curve fit of the kind shown below. Can someone direct me to an appropriate R library, if one exists?
Concn CompoundX CompoundX.2
0.00001 0.309967 0.28848
0.000004 0.239756 0.386004
0.0000015 0.924346 0.924336
0.00000075 1.409483 1.310479
0.00000025 2.128796 2.007222
0.0000001 2.407227 2.371517
3.75E-08 2.300768 2.203162
1.63E-08 1.826203 1.654133
6.25E-09 0.978104 1.06907
2.5E-09 0.483403 0.473238
1.06E-09 0.235191 0.251971
4.06E-10 0.115721 0.114867
1.56E-10 0.06902 0.053681
6.25E-11 0.031384 0.054416
2.66E-11 0.023007 0.028945
1.09E-11 0.003956 0.020866
Plot generated in GraphPad PRISM using biphasic dose-response equation.
I was able to answer my own question by following further links in the article suggested by @I_O. Apparently the bell-shaped response curve, which I thought looked more like the "bell-shaped" model described in the sparse Prism documentation, is precisely what the referenced article calls "biphasic". See: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4660423/pdf/srep17200.pdf
The R code to do the fitting is in the supplemental material referenced at https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4660423/#S1
dput(dat)
structure(list(Concn = c(1e-05, 4e-06, 1.5e-06, 7.5e-07, 2.5e-07,
1e-07, 3.75e-08, 1.63e-08, 6.25e-09, 2.5e-09, 1.06e-09, 4.06e-10,
1.56e-10, 6.25e-11, 2.66e-11, 1.09e-11), CompoundX = c(0.309967,
0.239756, 0.924346, 1.409483, 2.128796, 2.407227, 2.300768, 1.826203,
0.978104, 0.483403, 0.235191, 0.115721, 0.06902, 0.031384, 0.023007,
0.003956), CompoundX.2 = c(0.28848, 0.386004, 0.924336, 1.310479,
2.007222, 2.371517, 2.203162, 1.654133, 1.06907, 0.473238, 0.251971,
0.114867, 0.053681, 0.054416, 0.028945, 0.020866)), class = "data.frame", row.names = c(NA,
-16L))
> library(drc)
> m0 <- drm(CompoundX ~ log(Concn), data = dat, fct = gaussian())
> summary(m0)
Model fitted: Gaussian (5 parms)
Parameter estimates:
Estimate Std. Error t-value p-value
b:(Intercept) 2.031259 0.086190 23.567 9.128e-11 ***
c:(Intercept) 0.012121 0.040945 0.296 0.7727
d:(Intercept) 2.447918 0.067136 36.462 7.954e-13 ***
e:(Intercept) -16.271552 0.045899 -354.509 < 2.2e-16 ***
f:(Intercept) 2.095870 0.195703 10.709 3.712e-07 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Residual standard error:
0.07894641 (11 degrees of freedom)
> plot(m0, type = "all", col= "black", log = "")
Warning message:
In min(dose[dose > 0]) : no non-missing arguments to min; returning Inf
That supplement goes on to compare a variety of model variations and constraints, so it should be read, digested, and followed more closely than space permits here.
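For readers without drc, here is a rough base-R sketch of the same bell-shaped ("biphasic") fit using nls(). The parameterization (height d, center e, width b on the log10 concentration axis) and the starting values are my own guesses from eyeballing the data above, not the paper's model, so treat this as a starting point only:

```r
# dat: the data frame from the dput() output above
dat$logC <- log10(dat$Concn)

# Simple Gaussian in log10(concentration): height d, center e, width b
fit <- nls(CompoundX ~ d * exp(-0.5 * ((logC - e) / b)^2),
           data = dat,
           start = list(d = 2.4, e = -7, b = 0.7))
summary(fit)
```

The starting values come straight from the data (peak response about 2.4 near Concn = 1e-7, i.e. logC = -7); with a poor choice of starts, nls() may fail to converge.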
The ARDL package in R builds on dynlm, which stargazer accepts as input, as per this question and answer.
However, I am unable to get a stargazer table from ardl or auto_ardl output; it throws an "Unrecognized object type" error. Is there a way around this?
Here's a reproducible example:
set.seed(10)
library(ARDL)
library(stargazer)
x=rnorm(100,mean = 5,sd=2)
y=rnorm(100,mean = 7,sd=3)
df=cbind(x,y)
model1=auto_ardl(y~x,data = df,max_order = 4)
class(model1)
[1] "list"
stargazer(model1)
% Error: Unrecognized object type.
class(model1$best_model)
[1] "dynlm" "lm" "ardl"
stargazer(model1$best_model)
% Error: Unrecognized object type.
I'm sorry, I don't know how to do this in stargazer, but this model type is supported out of the box by the latest version of the modelsummary package (disclaimer: I am the maintainer).
set.seed(10)
library(ARDL)
library(modelsummary)
x=rnorm(100,mean = 5,sd=2)
y=rnorm(100,mean = 7,sd=3)
df=cbind(x,y)
model1=auto_ardl(y~x,data = df,max_order = 4)
modelsummary(model1$best_model)
                Model 1
(Intercept)       6.849
                 (1.705)
L(y, 1)           0.061
                 (0.106)
x                -0.103
                 (0.166)
L(x, 1)          -0.027
                 (0.167)
L(x, 2)          -0.075
                 (0.166)
L(x, 3)           0.043
                 (0.167)
L(x, 4)           0.048
                 (0.169)
Num.Obs.         96
R2                0.013
R2 Adj.          -0.054
AIC             492.8
BIC             513.3
Log.Lik.       -238.398
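If the end goal is a table in a document rather than the console, modelsummary's output argument can write the table to a file directly; a sketch assuming model1 from above (the file name is my own choice):

```r
library(ARDL)
library(modelsummary)
# model1 as fitted above with auto_ardl()
modelsummary(model1$best_model, output = "ardl_table.tex")  # also .html, .docx, .md
```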
I have a ts object containing one column of weekly data (frequency = 52) for the period 2016-2019 (only one week from 2019).
#>TEST_1
#>Time Series:
#>Start = c(2016, 1)
#>End = c(2019, 1)
#>Frequency = 52
So I am forecasting with this ts object using the forecast() function from the forecast package. This function selects the best ETS (exponential smoothing) model for my series.
Forecast method: STL + ETS(M,A,N)
Model Information:
ETS(M,A,N)
Call:
ets(y = x, model = etsmodel, allow.multiplicative.trend = allow.multiplicative.trend)
Smoothing parameters:
alpha = 0.0044
beta = 0.0044
Initial states:
l = 496.0001
b = -0.7495
sigma: 0.2538
AIC AICc BIC
2328.009 2328.406 2343.290
But here a new problem arises for me: when I try to perform residual diagnostics on this model's residuals with the checkresiduals() function, I receive this message.
#> Warning message:
#> In modeldf.default(object) :
#> Could not find appropriate degrees of freedom for this model.
So can anybody help me find the appropriate degrees of freedom for this model to use with the checkresiduals() function? Below are the residuals.
residuals<-structure(c(103.861587225712, 232.922530738897, -177.501044573567,
-32.3310448885088, 51.8658720663952, -127.669525632371, -21.3736988850188,
31.8283388622758, 134.388167819753, -202.279672375648, -150.211885150427,
59.7872220312138, 7.21928088178879, -31.0067512774922, 240.664063232754,
-259.693899860492, 51.2068097649542, 133.051059120384, 153.754774108432,
-245.448120335887, -41.7151580882252, 329.736089553496, -176.574681226445,
-5.49877539363433, -57.9440644242901, -141.920372666123, 59.631632197218,
30.3566233456523, -19.5674149569647, 49.8299466802158, 8.08039437858747,
-179.219757481181, 61.6262480548803, 14.2886335749734, 147.521659709062,
-203.114556948222, 232.39658682842, 17.0359701527633, 122.671792930753,
1.17404214154658, -21.3604900851155, 43.6067134825538, 56.6694972222097,
-74.206099457236, 22.2154797604099, -42.6209506582884, -69.0881062270763,
44.9935627424999, -65.4843011281191, 45.9859871219855, 38.48475732006,
217.607886572158, -81.752879329815, -62.3165846738133, 91.3280029935076,
13.8065979268541, -27.5160607993942, -2.45614326754531, 8.82428074173083,
-21.9816546447523, 58.6350169306539, 2.99591624137327, 25.4548944489055,
-7.80971451574547, -33.741824891111, 148.727324165574, -103.887619405031,
13.6976122890256, -6.22642628362576, -89.0151943344358, 151.68500527824,
113.373271376477, 165.103295852743, -295.039665234726, 213.698114407198,
-76.4034402042766, -9.34573346398901, -71.4103830503603, 122.800589573655,
-55.724016585403, 63.7939569095491, 44.9784699409192, 151.519180259845,
-58.4408170188741, -74.3037359893916, -47.7713298497972, 163.367074626196,
-249.379445021869, -112.112655284116, -43.5458433646284, -53.5666005867634,
281.491207440336, -121.212142480196, -33.9138735682901, -31.1438180301793,
-31.2555698825003, 20.3181357200996, -46.2564548372715, 19.2769399131227,
82.0903051423776, -53.9874588993755, -81.7381076026692, -109.42037514781,
-128.567530337503, 239.606771386708, -163.928615298084, 88.3650587021525,
22.3840519205474, -19.7936259061341, 133.392615761316, 14.8789465334592,
-7.35384302392632, -193.309220279654, 199.807229000058, 124.081926626315,
-52.3795507957004, 26.248230162833, -123.352126375918, -136.687848362162,
242.06397333675, -49.2896526387001, -47.0413692896267, -315.639803224046,
122.111855110991, -135.453045844048, -34.9514109509343, -51.0671430546247,
75.2304903204274, 58.5168476811577, 205.900859581612, -195.231017102347,
17.0666471041718, -55.7835085816988, -105.931678098968, -173.52733115843,
229.313605012801, 4.76417288414814, 24.9291766474627, -324.904858037879,
449.500524512662, -126.709163220759, 18.7291455153395, -76.1328146141673,
-298.217791616455, 137.973841964018, -16.2916958267025, -31.8650948708939,
99.4876416447454, -49.4760819558044, 84.1071094148195, 44.155870901787,
-133.53348599245, 117.30321085781, 35.0222913102854, 71.5981819455558,
-87.2032279610021, -272.900607282635), .Tsp = c(2016, 2019, 52
), class = "ts")
The degrees of freedom are: DF = (# lags to be tested) - (# parameters that you estimate).
You can manually set the degrees of freedom using:
checkresiduals(..., df = DF)
Hope it works.
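As a concrete sketch with the residuals pasted in the question: the Ljung-Box test behind checkresiduals() is base R's Box.test(), where fitdf plays the role of the estimated-parameter count. I assume 4 parameters here (alpha, beta, and the two initial states l and b of ETS(M,A,N)); this is an assumption, so adjust it to your own model:

```r
# residuals: the ts object from the question (frequency 52, 157 points)
Box.test(residuals, lag = 2 * frequency(residuals),
         type = "Ljung-Box", fitdf = 4)
```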
I know that Zelig is a wrapper... But still, it provides nice simulation capabilities (which I wouldn't be able to do on my own).
Let's say I have this data:
set.seed(123)
x1 = rnorm(5)
x2 = rnorm(5)
z = 1 + 2*x1 + 3*x2
pr = 1/(1+exp(-z))
y = rbinom(5,1,pr)
df = data.frame(y=y,x1=x1,x2=x2)
Now, we estimate the model,
library(Zelig)
relogit <- zelig(y ~ x1 + x2, model = "relogit", data = df)
And now, we (try to) make the table
library(texreg)
texreg(relogit)
... only to get this error.
Error in (function (classes, fdef, stable):
unable to find an inherited method for function ‘extract’ for
signature ‘"Zelig-relogit"’
I am aware of the $getvcov() and $getcoef() functions. But I wonder how I could make a straightforward table using texreg. Any advice will be greatly appreciated. Thanks!
texreg uses a generic function called extract to pull the relevant data from a model object and then processes the resulting texreg object to create a regression table. In order to extend the range of models texreg is applicable to, you can write your own methods for the extract function.
Zelig-relogit objects apparently store a glm object with the relevant data somewhere inside the object and attach a different class name to it. So it should be relatively straightforward to create a copy of this sub-object, fix its class name, and apply the existing extract.glm method to this object to extract the data. More specifically:
# extension for Zelig-relogit objects (Zelig package >= 5.0)
extract.Zeligrelogit <- function(model, include.aic = TRUE, include.bic = TRUE,
include.loglik = TRUE, include.deviance = TRUE, include.nobs = TRUE, ...) {
g <- model$zelig.out$z.out[[1]]
class(g) <- "glm"
e <- extract(g, include.aic = include.aic, include.bic = include.bic,
include.loglik = include.loglik, include.deviance = include.deviance,
include.nobs = include.nobs, ...)
return(e)
}
setMethod("extract", signature = className("Zelig-relogit", "Zelig"),
definition = extract.Zeligrelogit)
This code creates a Zelig-relogit method for the extract function. You can use it by typing something like screenreg(relogit), where relogit is the name of your Zelig-relogit object. The result should look like this:
==================================
Model 1
----------------------------------
(Intercept) -9446502571.59 ***
(62615.78)
x1 19409089045.70 ***
(141084.20)
x2 856836055.47 ***
(98175.65)
----------------------------------
AIC 6.00
BIC 4.83
Log Likelihood -0.00
Deviance 0.00
Num. obs. 5
==================================
*** p < 0.001, ** p < 0.01, * p < 0.05
More generally, if you want to make any Zelig model work with texreg, you should look at model$zelig.out$z.out[[1]] to find the relevant information. I will include the Zelig-relogit extract method in the next texreg release.
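Putting the pieces together, a minimal usage sketch (assuming the relogit fit from the question and the extract method registered above; the output file name is my own choice):

```r
library(texreg)
# relogit: the Zelig-relogit object from the question;
# extract.Zeligrelogit must already be registered via setMethod()
screenreg(relogit)                       # plain-text table in the console
texreg(relogit, file = "relogit.tex")    # LaTeX table written to a file
```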
I'm using R for a pharmacodynamic analysis and I'm fairly new to programming.
The thing is, I'm carrying out linear regression analyses and will use more advanced methods in the future. Because I'm performing a large number of analyses (and I'm too lazy to manually copy-paste every time I run the script), I would like to save the summaries of the analyses to a file. I've tried different methods, but nothing seems to work.
What I'm looking for is the following as (preferably) a text file:
X_Y <- lm(X ~ Y)
sum1 <- summary(X_Y)
> sum1
Call:
lm(formula = AUC_cumulative ~ LVEF)
Residuals:
Min 1Q Median 3Q Max
-910.59 -434.11 -89.17 349.39 2836.81
Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) 1496.4215 396.5186 3.774 0.000268 ***
LVEF 0.8243 7.3265 0.113 0.910640
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Residual standard error: 619.9 on 104 degrees of freedom
(32 observations deleted due to missingness)
Multiple R-squared: 0.0001217, Adjusted R-squared: -0.009493
F-statistic: 0.01266 on 1 and 104 DF, p-value: 0.9106
I've searched for methods to save summary output to a .csv or .txt file, but those files don't represent the data in a way I can understand.
Things I've tried:
fileConn <- file("output.txt")
writeLines(sum1, fileConn)
close(fileConn)
This returns:
Error in writeLines(sum1, fileConn) : invalid 'text' argument
An attempt using the write.table command gave:
> write.table(Sum1, 'output.csv', sep=",", row.names=FALSE, col.names=TRUE, quote=FALSE)
Error in as.data.frame.default(x[[i]], optional = TRUE, stringsAsFactors = stringsAsFactors) : cannot coerce class ""summary.lm"" to a data.frame
Using the write command:
> write(sum1, 'output.txt')
Error in cat(list(...), file, sep, fill, labels, append) : argument 1 (type 'list') cannot be handled by 'cat'
Then I was getting closer with the following:
> write.table(sum1, 'output.csv', sep=",", row.names=FALSE, col.names=TRUE, quote=FALSE)
But this file did not have the same readable information as the printed summary
I hope someone can help, because this is way too advanced programming for me.
I think one option could be sink(), which diverts output to a text file rather than the console. In the absence of your dataset I've used cars as an example:
sink("lm.txt")
print(summary(lm(cars$speed ~ cars$dist)))
sink() # returns output to the console
lm.txt now looks like this:
Call:
lm(formula = cars$speed ~ cars$dist)
Residuals:
Min 1Q Median 3Q Max
-7.5293 -2.1550 0.3615 2.4377 6.4179
Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) 8.28391 0.87438 9.474 1.44e-12 ***
cars$dist 0.16557 0.01749 9.464 1.49e-12 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Residual standard error: 3.156 on 48 degrees of freedom
Multiple R-squared: 0.6511, Adjusted R-squared: 0.6438
F-statistic: 89.57 on 1 and 48 DF, p-value: 1.49e-12
@Roland's suggestion of knitr is a bit more involved, but could be worth it because you can knit input, text output, and figures into one report or HTML file easily.
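A closely related base-R option is capture.output(), which prints and redirects in one call (again using cars in place of your data; the file name is arbitrary):

```r
# Write the printed summary straight to a text file
capture.output(summary(lm(speed ~ dist, data = cars)),
               file = "lm_summary.txt")
```

Unlike sink(), there is no console state to restore afterwards, so a forgotten sink() can't silently swallow later output.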
The suggestions above work great. Depending on what you need, you can use broom's tidy() function for the coefficients and glance() for the model-level statistics.
library(broom)
a <- lm(cars$speed ~ cars$dist)
write.csv(tidy(a), "coefs.csv")
write.csv(glance(a), "an.csv")
Should you want to re-import the data into R but still want to have it in a text file, there is also dput, e.g.,
dput(summary(lm(cars$speed~cars$dist)),file="summary_lm.txt",control="all")
This allows you to re-import the summary object via
res=dget("summary_lm.txt")
Let's check the class of res
class(res)
[1] "summary.lm"
Try the apaStyle package:
library(apaStyle)
apa.regression(reg1, variables = NULL, number = "1", title = " title ",
               filename = "APA Table1 regression.docx", note = NULL,
               landscape = FALSE, save = TRUE, type = "wide")
I love apsrtable(), and have found it fairly simple to extend to other classes (in particular, I have adapted it for mlogit objects). But for some reason, the apsrtableSummary.sarlm() function doesn't work quite like the other hacks I have written.
Typically, we need to redefine the coefficients matrix so that apsrtable() knows where to find it. The code for this is
"apsrtableSummary.sarlm" <- function (x){
s <- summary(x)
s$coefficients <- s$Coef
return(s)
}
We also need to redefine the modelInfo for the new class, like this:
setMethod("modelInfo", "summary.sarlm", function(x){
env <- sys.parent()
digits <- evalq(digits, envir=env)
model.info <- list(
"$\\rho$" = formatC(x$rho, format="f", digits=digits),
"$p(\\rho)$" = formatC(x$LR1$p.value, format="f", digits=digits),
"$N$" = length(x$fitted.values),
"AIC" = formatC(AIC(x), format="f", digits=digits),
"\\mathcal{L}" = formatC(x$LL, format="f", digits=digits)
)
class(model.info) <- "model.info"
return(model.info)
})
After defining these two functions however, a call to apsrtable() doesn't print the coefficients (MWE using example from lagsarlm in spdep package).
library(spdep)
library(apsrtable)
data(oldcol)
COL.lag.eig <- lagsarlm(CRIME ~ INC + HOVAL, data=COL.OLD,
nb2listw(COL.nb, style="W"), method="eigen")
summary(COL.lag.eig)
# Load functions above
apsrtable(COL.lag.eig)
## OUTPUT ##
\begin{table}[!ht]
\caption{}
\label{}
\begin{tabular}{ l D{.}{.}{2} }
\hline
& \multicolumn{ 1 }{ c }{ Model 1 } \\ \hline
% & Model 1 \\
$\rho$.rho & 0.43 \\
$p(\rho)$.Likelihood ratio & 0.00 \\
$N$ & 49 \\
AIC & 374.78 \\
\mathcal{L} & -182.39 \\ \hline
\multicolumn{2}{l}{\footnotesize{Standard errors in parentheses}}\\
\multicolumn{2}{l}{\footnotesize{$^*$ indicates significance at $p< 0.05 $}}
\end{tabular}
\end{table}
As you can see, everything works out great except that the coefficients and standard errors are not there. It's clear that the summary redefinition works, because
apsrtableSummary(COL.lag.eig)$coefficients
Estimate Std. Error z value Pr(>|z|)
(Intercept) 45.0792505 7.17734654 6.280768 3.369041e-10
INC -1.0316157 0.30514297 -3.380762 7.228517e-04
HOVAL -0.2659263 0.08849862 -3.004863 2.657002e-03
I've pulled my hair out for several days trying to find a way out of this. Any tips?
Well, I think I may be the only person on earth who uses both of these packages together, but I figured out a way to work through this problem.
It turns out that the source of the error is the coef method for summary.sarlm objects. Typically this method returns the full coefficients table as a matrix, but for this class it returns only the coefficient vector. The following code fixes that problem.
setMethod("coef", "apsrtableSummary.sarlm", function(object) object$coefficients)
I also found it useful to include the rho term as a model coefficient (the methods are not consistent on this).
apsrtableSummary.sarlm <- function (x){
s <- summary(x)
s$rholine<- c(unname(s$rho), s$rho.se, unname(s$rho/s$rho.se),
unname(2 * (1 - pnorm(abs(s$rho/s$rho.se)))))
s$Coef <- rbind(s$rholine, s$Coef)
rownames(s$Coef)[1] <- "$\\rho$"
s$coefficients <- s$Coef
return(s)
}