I am trying to execute an Airflow script and get an error when checking the logs of the Task_id in the Graph View:
Hi,
I am getting a "log file isn't local" error when running an Airflow script. Given below is the error message I get from the Graph View.
I am using an SQLite DB locally, and the function I am trying to execute connects to an Amazon Redshift DB.
Could anyone assist? Thanks.
The URL looks strange: http://:8793/log... - the hostname is missing.
It seems to me that the base_url or web_server_host parameter is not configured correctly in airflow.cfg.
If that is all set up correctly, then the settings for log storage might be off.
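One way to check what the webserver is actually picking up is to query the configuration programmatically. A minimal sketch, assuming a standard installation where the airflow Python package can see your airflow.cfg; an empty or wrong host here would explain the missing hostname in the log URL:

    from airflow.configuration import conf

    # Both values live in the [webserver] section of airflow.cfg.
    print(conf.get("webserver", "base_url"))         # e.g. http://localhost:8080
    print(conf.get("webserver", "web_server_host"))  # e.g. 0.0.0.0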
Well, after running my script a few times, I forgot that Airflow has a limit on how much data it can load, and now I can't access the XComs to delete my data. Is there a way to do that from Ubuntu, given that I'm running my Airflow with the LocalExecutor?
Here is the screenshot of the error
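For reference, one way to clear XComs is to delete them from the metadata database through Airflow's own models. A sketch, assuming an Airflow 1.10-style import (in Airflow 2.x, provide_session lives in airflow.utils.session) and a placeholder dag_id of "my_dag":

    from airflow.models import XCom
    from airflow.utils.db import provide_session

    @provide_session
    def cleanup_xcom(session=None):
        # "my_dag" is a placeholder; filter on whichever dag_id holds
        # the oversized entries. This deletes rows from the xcom table.
        session.query(XCom).filter(XCom.dag_id == "my_dag").delete()

    cleanup_xcom()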
I have installed Apache Airflow locally on a Windows system, on Ubuntu WSL. I have created a DAG consisting of two tasks. The first task fetches the data from the data source and stores it in a data frame. The second task sends the data from the data frame to a database. My first task executed successfully and the second task pulled the data using XCom. Even with correct connection credentials I am getting this error ("Error while connecting to mysql"). I have used hooks too, but I still get the same error. I don't know what other configuration I need to do, or why I am getting this error. Please suggest something.
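For reference, a minimal sketch of the hook-based pattern described above, assuming an Airflow 1.10-style import (in Airflow 2.x the hook lives in the MySQL provider package) and a connection id of my_mysql defined under Admin > Connections; the task id, table, and column count are placeholders:

    from airflow.hooks.mysql_hook import MySqlHook

    def load_to_db(**context):
        # Pull the rows produced by the first task.
        rows = context["ti"].xcom_pull(task_ids="fetch_data")
        hook = MySqlHook(mysql_conn_id="my_mysql")
        conn = hook.get_conn()  # "Error while connecting to mysql" points here
        cursor = conn.cursor()
        cursor.executemany("INSERT INTO my_table VALUES (%s, %s)", rows)
        conn.commit()

On WSL, it is also worth double-checking the host in the connection: depending on the WSL version, a MySQL server running on the Windows side is not necessarily reachable as localhost from inside WSL.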
I have been using Boxfuse to deploy my app (PROD and TEST) without issues for well over a year, but now when trying to deploy to TEST (using the same command I've always used: boxfuse run app-name -env=test), I am getting this error:
"Running app/image failed!"
and that is it. It shows up just after "Waiting for AWS to create an encrypted AMI for app/image in us-west-2 (this may take up to 90 seconds)..." and the stack trace is:
at com.boxfuse.client.core.Boxfuse.run(Boxfuse.java:653)
at com.boxfuse.client.commandline.Main.run(Main.java:325)
at com.boxfuse.client.commandline.Main.main(Main.java:133)
I am not sure where to begin debugging this, as the error message is not very descriptive and nothing has changed in my AWS account/settings or app configuration/settings, etc. Any help or suggested places to start would be greatly appreciated. Thanks!
When I start my jobs using FastExport, they sometimes end with an error:
TDWALLETERROR(543): Teradata Wallet error. The helper process is already being traced
When I restart them, they work.
I'm using the saved-key protection scheme.
Can someone explain to me why this error occurs and how to fix it?
It looks like you have tracing activated in one of the scripts run on the system.
Teradata has sniffer code that attempts to detect whether tracing is running during the wallet invocation - which is what triggers this error.
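The Teradata check itself is proprietary, but as an illustration of the mechanism: on Linux, a process can tell whether a tracer is attached by reading TracerPid from /proc/self/status. A minimal sketch:

    def tracer_pid():
        # TracerPid is 0 when no process is tracing us.
        with open("/proc/self/status") as status:
            for line in status:
                if line.startswith("TracerPid:"):
                    return int(line.split()[1])
        return 0

    if tracer_pid() != 0:
        print("a tracer is attached; a check like Teradata's would fail here")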
I cannot seem to figure out this issue. I have a QlikView document that pulls in a bunch of data and aggregates/joins it up. Typical QlikView stuff. At the end of my process I have an Oracle stored procedure call. I am not retrieving anything back; this is a simple call to the database to trigger a process. I have set up my ODBC connection and User DSN on my local machine for the connection. When I run my qvw file on my local machine, everything works just fine. The proc call is made and the script executes without any errors.
However, when I put the document on our reload server and set up a reload task for it, the process throws a general script error when the SQL proc is called. What could cause this? The user running the document has execute permissions. Do I need to set up a DSN on the reload server?
I'm really not sure at all here. Hopefully someone here can help me out. Thanks.
Unfortunately, QlikView's SQL error messages are not that helpful for debugging purposes. In this case you can try turning on ODBC logging (http://support2.microsoft.com/kb/274551) and then reloading the script to try to capture the cause of the error.
Finally, if your script refers to a "local" DSN, then this DSN also needs to be present on the machine that performs the reload - in this case, the QlikView server. Note also that a User DSN is only visible to the user that created it, so a service such as the reload engine usually needs a System DSN.
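To verify that, here is a quick check you can run on the reload server, sketched with pyodbc; "MyOracleDSN" and the credentials are placeholders:

    import pyodbc

    # List the DSNs visible on this machine; the one the script
    # references must appear here.
    print(pyodbc.dataSources())

    # Then confirm the connection itself works under the reload account.
    conn = pyodbc.connect("DSN=MyOracleDSN;UID=qv_user;PWD=secret")
    conn.close()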