I had an odd experience with warnings in R 3.5.0: the warnings only came out after I had already finished the data manipulation. About 10 minutes later, the warnings for one data set appeared while I was manipulating another data set. At first I thought I had made some mistakes I was not aware of, and re-did everything. However, this kept happening.
So I re-did everything again, and this time typed warnings() right after the part that the warnings pointed to, and got nothing, meaning there were no warnings. Sure enough, a while later, when I was manipulating some other data, the exact same warnings came out!
Has anybody else encountered this?
Thanks a lot!
Under ?options you see:
warn:
sets the handling of warning messages. If warn is negative all warnings are ignored. If warn is zero (the default) warnings are stored until the top-level function returns. If 10 or fewer warnings were signalled they will be printed otherwise a message saying how many were signalled. An object called last.warning is created and can be printed through the function warnings. If warn is one, warnings are printed as they occur. If warn is two or larger all warnings are turned into errors.
So basically, by default, warnings don't print until the top-level function returns. If R thinks that a function is not done yet (a plotting function, maybe), then it will wait to issue the warnings. Try entering options(warn=1) and re-running the code to see where the issue is.
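As a minimal sketch of the difference (the function f below is just a made-up example):

f <- function() {
  warning("something looked off")   # signalled here ...
  Sys.sleep(5)                      # ... but with warn = 0 it only prints after f() returns
  invisible(TRUE)
}

options(warn = 0)   # default: warnings are held until the top-level call returns
f()

options(warn = 1)   # warnings are printed as they occur
f()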
This should be simple based on posts like this, but somehow I cannot get it to work. What is wrong with this example?
x<-1
y<-0
if(x>y){warning("careful, one is greater than zero!")}
It works with stop():
if(x>y){stop("careful, one is greater than zero!")}
So either I'm making a simple syntax mistake or warning is not supposed to be used outside of functions?
Your code works fine for me. I'm using R 3.3.2.
I think a possible solution to your problem is to check whether warning messages are enabled in your session.
If you read ?options, you'll notice that among the values returned by the function there is the warn value.
From the reference:
warn:
sets the handling of warning messages. If warn is negative all warnings are ignored. If warn is zero (the default) warnings are stored until the top-level function returns. If 10 or fewer warnings were signalled they will be printed otherwise a message saying how many were signalled. An object called last.warning is created and can be printed through the function warnings. If warn is one, warnings are printed as they occur. If warn is two or larger all warnings are turned into errors.
So, if you have a negative value for warn, you won't see warning messages.
You can enable warning messages in the following way:
options(warn=1)
Try changing this and re-running your code.
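For example, a quick sketch to check the current value and restore it afterwards:

getOption("warn")        # a negative value means warnings are being suppressed

old <- options(warn = 1) # options() returns the previous settings
warning("this should now print immediately")
options(old)             # restore the original setting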
I have been using some R libraries to analyze some large data sets recently, and I find myself frustrated by waiting several hours from the start of an analysis, only to reach the end and get some trivial error, like not having installed a prerequisite library, or one of my parameters being wrong. So then I have to start all over, run the exact same analysis, regenerate the same variables it had when it died, and wait a long time again. Please note that these are not handled exceptions; they are fatal errors from R.
This is just a thought (and perhaps it is too good to be true, so please at least explain why it wouldn't work), but is there any way to cause R to execute browser() in the current environment whenever it hits a fatal error? For example, say it is executing a script and encounters require(notInstalledYet). Instead of just dying and losing all the variables in memory, it would be great if it gave me a browser() at the place it died, so that I could at least save the variables, and at best fix the problem (e.g. install the library) and try again.
You can change the error option so that a browser opens on error:
options(error=browser)
the default is
options(error=NULL)
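A quick sketch of both patterns (the function f and the dump name "last_run" are just placeholders): in an interactive session you can drop into the browser at the point of failure, and in a batch run you can dump the call frames to disk instead so nothing is lost.

options(error = browser)               # interactive: open browser() where the error occurred
f <- function(x) stop("fatal error half-way through")
# f(1)                                 # uncomment to try it

# For non-interactive (batch) runs, dump the frames to a file instead:
options(error = quote(dump.frames("last_run", to.file = TRUE)))
# later, in a fresh session:
# load("last_run.rda"); debugger(last_run)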
One of my R scripts produced a message that there were some warnings during processing. However, since this was not an interactive session, I can't use warnings() to access them. Is there a standard location, if any, for the most recent R session's warnings log file, so that I could review them? Thank you!
From ?warnings:
It is undocumented where last.warning is stored nor that it is visible, and this is subject to change.
However, you can call the function warnings in your script and specify a file where the last warnings should be saved.
warnings(file = "C:/Rwarnings.txt")
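Another option, sketched below, is to catch each warning as it is signalled and append it to a log file yourself ("analysis.R" and "warnings.log" are placeholder names):

withCallingHandlers(
  source("analysis.R"),
  warning = function(w) {
    cat(conditionMessage(w), "\n", file = "warnings.log", append = TRUE)
    invokeRestart("muffleWarning")   # prevent the warning from being reported again at the end
  }
)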
I have a long, complex script (>7000 lines) with many nested functions, each of them enclosed in a separate tryCatch. The code works perfectly except for a "pseudo-error":
Error in doWithOneRestart(return(expr), restart): no function to return from, jumping to top level
doWithOneRestart() is internal to R, called as part of tryCatch. I call it a "pseudo-error" because the tryCatch should lead to stop() if an error occurs and write the error message to a log file. Instead, this "error" does not stop the program (it actually does not influence it at all); it is shown only on the console and is not written to the log file. The usual debugging procedures did not help, because the error is not reproducible (!): it may occur at different processing stages of the program. Changing the warning options to 0 or -1 did not help either.
Since the program does the job, this error is not critical. But I would like to understand what is happening. Maybe someone has already experienced the same problem, or could come up with an original debugging strategy ...
Update (28.10.2013):
I found out where the problem came from. It is linked to a Java heap overflow (I was using the xlsx package to read Excel files). Among many other problems: although the connection to the Excel file is closed (definitely!), the system considers it an unused connection (shown in traceback()), tries to close it, and finds out it is already closed: you get the "pseudo-error" described above, and never at exactly the same moment (not reproducible). Calling the garbage collector gc() at the right place solved the problem. The script has now been running stably for several days.
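In outline, the pattern that helped was something like the following sketch (file names are placeholders; the key point is the gc() call after each read):

library(xlsx)

files <- c("data1.xlsx", "data2.xlsx")
results <- lapply(files, function(f) {
  dat <- read.xlsx(f, sheetIndex = 1)
  gc()     # release the Java-backed objects before reading the next file
  dat
})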
Advice from Peter Dalgaard on R-help.
The easiest way to get that message is to execute return() from the top level:
return(1)
You might be trying to return() from a source()d file. Or maybe you are source()ing something that was intended to be inside a function body (extraneous '}' characters can do that).
The usual debugging strategies should work: calling traceback() after the error, or setting options(error = recover).
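A small sketch of both, for the case where the error is not reproducible:

options(error = recover)  # set this before the long run, so R stops in the frame stack whenever it happens

traceback()               # or call this right after an error has been printed at the console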
Is it possible to detect if there was a problem when running roxygenize (package roxygen2)?
I want to automate the process of documenting, checking and building a package, and would like to stop when documenting goes wrong.
The roxygenize help says the return value is NULL, and I searched Stack Overflow without success. Currently, I have to look through the output and search for a line starting with "Error".
Any hint appreciated!
When roxygenize finds an error, for example because you included stop("Raise an error") in your code, then roxygenize will itself raise an error.
The other scenario (which is what you are getting at) is that roxygenise is able to finish, but some aspects of the documentation process are incorrect. In that case, these problems are reported as warnings. So one solution is to turn warnings into errors.
For example, suppose you had a file containing the line:
#' @XXX
This would cause:
roxygenise("pkg/")
to raise a warning
Warning: XXX is an unknown key in block AllGenerics.R:5
If we changed warnings to errors:
##All warnings are now errors
options(warn=2)
Then
roxygenise("pkg/")
would raise the error:
Error: (converted from warning) XXX is an unknown key in block AllGenerics.R:5
You can then use the standard tryCatch technique for dealing with errors.
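Putting it together, a sketch of a strict documentation step (document_strictly is just a made-up helper name):

library(roxygen2)

document_strictly <- function(pkg = "pkg/") {
  old <- options(warn = 2)          # promote roxygen2 warnings to errors
  on.exit(options(old), add = TRUE) # restore the previous setting on exit
  tryCatch(
    roxygenise(pkg),
    error = function(e) stop("documentation failed: ", conditionMessage(e))
  )
}

document_strictly("pkg/")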