How can I visualize single trees from a random survival forest? - r

I have trained a random survival forest using the R package randomForestSRC. For publication, I would like to visualize some selected trees, preferably using the ggraph package, much like here: https://shiring.github.io/machine_learning/2017/03/16/rf_plot_ggraph
The randomForest package has a convenient function randomForest::getTree, but so far I have not found an analogous function in randomForestSRC.
How are the trees stored in the random survival forest, and how can I access them? I'd be grateful for any hints!
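For reference, a minimal sketch of where the trees appear to live inside a fitted rfsrc object. The nativeArray column names and the get.tree() helper are assumptions based on recent randomForestSRC versions and its internals, so check your installed version:

    library(randomForestSRC)

    # Fit a small random survival forest on the bundled veteran data
    data(veteran, package = "randomForestSRC")
    fit <- rfsrc(Surv(time, status) ~ ., data = veteran, ntree = 100)

    # The forest is stored as one flat table of nodes, with a treeID column
    # marking which tree each node belongs to (column names are assumptions)
    native <- fit$forest$nativeArray
    str(native)

    tree5 <- native[native[, "treeID"] == 5, ]  # all nodes of tree number 5
    fit$xvar.names                              # maps split-variable IDs to names

    # Recent package versions also appear to provide get.tree() for pulling out
    # and plotting a single tree; if your version has it, that may be the
    # shortest route:
    # plot(get.tree(fit, tree.id = 5))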

Related

random forest for imputation with hyperparameter optimization

I would like to impute my data using rfImpute() from the randomForest CRAN package in R. However, I was wondering whether it is also possible to optimize the hyperparameters 'niter' and 'ntree' and use the optimal values for imputation on my data?
I saw that there is hyperparameter optimization for prediction and classification using randomForest, but is it also possible to do so for rfImpute()? :)
Thanks in advance for any help!
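There is no built-in tuning for rfImpute(), but one way to sketch it is a small grid search, judging each setting by the OOB error of a forest fit on the imputed data; that evaluation criterion is an assumption, not something the package provides. Note that rfImpute()'s iteration argument is named iter:

    library(randomForest)

    set.seed(1)
    iris.na <- iris
    iris.na[sample(nrow(iris), 30), "Sepal.Width"] <- NA  # toy missing values

    grid <- expand.grid(iter = c(3, 5, 10), ntree = c(100, 300, 500))

    grid$oob <- apply(grid, 1, function(p) {
      imputed <- rfImpute(Species ~ ., data = iris.na,
                          iter = p["iter"], ntree = p["ntree"])
      rf <- randomForest(Species ~ ., data = imputed)
      mean(rf$err.rate[, "OOB"])   # OOB classification error after imputation
    })

    grid[which.min(grid$oob), ]    # settings with the lowest OOB error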

Is it possible to visualize an individual tree from a random forest obtained via tidymodels?

Good day,
for presentation purposes I would like to plot a couple of decision trees from a random forest (with about 100 trees). I found a post from last year where it's clear this is not really possible, or at least that there is no function for it in tidymodels: R: Tidymodels: Is it possible to plot the trees for a random forest model in tidy models?
I'm wondering if somebody has found a way! I remember I could easily do this using the caret package, but tidymodels makes everything so convenient that I was hoping someone had a solution.
Many thanks!
Summarizing which trees can be plotted with tidymodels, based on the comments and other Stack Overflow posts:
Decision trees: there are some options, but the function rpart.plot() seems to be the most popular.
Individual tree from a random forest: it doesn't seem to be possible to plot one (yet) using the tidymodels environment. See this post: here
XGBoost models: see Julia's comment (sketched below): "You should be able to use a function like xgb.plot.tree() with a trained tidymodels workflow or parsnip model by extracting out the underlying object created with the xgboost engine. You can do this with extract_fit_engine()."
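A minimal sketch of that approach, assuming a workflow fitted with the xgboost engine (the data and model settings here are purely illustrative):

    library(tidymodels)
    library(xgboost)

    spec <- boost_tree(trees = 50) %>%
      set_engine("xgboost") %>%
      set_mode("classification")

    wf_fit <- workflow() %>%
      add_formula(Species ~ .) %>%
      add_model(spec) %>%
      fit(data = iris)

    booster <- extract_fit_engine(wf_fit)       # the underlying xgb.Booster
    xgb.plot.tree(model = booster, trees = 0)   # plot the first boosted tree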

Combining multiple cox models to present in a forest plot in R

I have 6 different multivariate Cox models, each run on a different subgroup of a dataset, with the same other covariates that I am adjusting for in the model. I would like to plot the hazard ratios of the subgroups from the 6 different models in one forest plot.
I am using R. I can't find a suitable solution for this anywhere. Please help. I have looked into the metafor package and it doesn't seem to help, and the ggplot2 package doesn't seem to be able to combine different multivariate models.
Willing to try solutions outside of R also.
Check out my forestplot package. This is exactly what it was designed for. In particular, you want the multiple confidence bands section in the vignette.
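For example, a minimal sketch with coxph fits on illustrative lung-data subgroups standing in for the six subgroup models, pulling the hazard ratio for one shared covariate (ph.ecog) out of each model:

    library(survival)
    library(forestplot)

    # Two subgroup fits on the lung data, standing in for your six models
    lung2  <- na.omit(lung[, c("time", "status", "age", "sex", "ph.ecog")])
    models <- lapply(split(lung2, lung2$sex),
                     function(d) coxph(Surv(time, status) ~ age + ph.ecog, data = d))

    # One row per model: HR and 95% CI for the covariate of interest
    hr_tab <- t(sapply(models, function(m)
      summary(m)$conf.int["ph.ecog", c("exp(coef)", "lower .95", "upper .95")]))

    forestplot(labeltext = paste("Subgroup: sex =", rownames(hr_tab)),
               mean  = hr_tab[, 1],
               lower = hr_tab[, 2],
               upper = hr_tab[, 3],
               zero  = 1,     # reference line at HR = 1
               xlog  = TRUE)  # hazard ratios on a log scale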

How can I plot a tree selected from the random forest created using "caret" package in R

I am a newbie in R and I need to know how to plot a tree selected from a random forest trained using the train() function in the caret package.
First and foremost, I used a training dataset to fit a random forest model using the train() function. The resulting random forest contains about 500 trees. Is there any way to plot a selected tree?
Thank you.
The CRAN package party offers a method called prettyTree.
Look here
As far as I know, the randomForest package does not have any built-in functionality to plot individual trees. You can extract a tree using the getTree() function, but nothing is provided to plot or visualize it. This question may be a duplicate, as a quick search turned up approaches other people have used to extract trees from a random forest:
here and here and here
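A minimal sketch of the extraction step, assuming a caret model trained with method = "rf" so that fit$finalModel is a plain randomForest object:

    library(caret)
    library(randomForest)

    fit <- train(Species ~ ., data = iris, method = "rf",
                 trControl = trainControl(method = "cv", number = 5))

    # One tree as a data frame of split variables, split points and predictions;
    # plotting it still needs a helper such as the ggraph approach linked above
    tree1 <- randomForest::getTree(fit$finalModel, k = 1, labelVar = TRUE)
    head(tree1)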

prediction intervals with caret

I've been using the caret package in R to run some boosted regression tree and random forest models and am hoping to generate prediction intervals for a set of new cases using the inbuilt cross-validation routine.
The trainControl function allows you to save the hold-out predictions at each of the n folds, but I'm wondering whether unknown cases can also be predicted at each fold using the built-in functions, or whether I need to use a separate loop to build the models n times.
Any advice much appreciated
Check the R package quantregForest, available on CRAN. It can easily calculate prediction intervals for random forest models. There's a nice paper by the author of the package explaining the background of the method. (Sorry, I can't say anything about prediction intervals for BRT models; I'm looking for those myself...)
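A minimal sketch with quantregForest, using the 2.5% and 97.5% quantiles as an approximate 95% prediction interval (the mtcars split is purely illustrative, and the name of the quantile argument may differ in older package versions):

    library(quantregForest)

    set.seed(1)
    idx     <- sample(nrow(mtcars), 22)
    x_train <- mtcars[idx, -1]
    y_train <- mtcars$mpg[idx]
    x_new   <- mtcars[-idx, -1]

    qrf <- quantregForest(x = x_train, y = y_train, ntree = 500)

    # One row per new case: lower bound, median prediction, upper bound
    pred_int <- predict(qrf, newdata = x_new, what = c(0.025, 0.5, 0.975))
    head(pred_int)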
