Why are the logs not updating? - airflow

I have a simple PythonOperator:
import time

def simple():
    print("start simple")
    for i in range(10):
        time.sleep(0.7)
        print("hello from simple")
    print("end simple")
When looking at the logs window, I need to refresh the page in order to see new logs.
Is it possible for the logs to be updated automatically?

For Airflow>=2.4.0:
Support for log auto-refresh was added to the Grid View (see PR).
You can configure the refresh interval by setting auto_refresh_interval.
(Note this affects other views as well.)
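For example, in airflow.cfg (the default is 3 seconds; the same option can also be set through the AIRFLOW__WEBSERVER__AUTO_REFRESH_INTERVAL environment variable):

[webserver]
# seconds between UI auto-refreshes (Grid view and other auto-refreshing views)
auto_refresh_interval = 3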
For Airflow<2.4.0:
There is no option for that.
There is an open feature request to add this ability for future Airflow versions: Auto-refresh of logs
You can use browser plugins to get this functionality. Chrome has Easy Auto Refresh (and probably many other plugins in that area)

Should you use next/link (prefetched client side transitions) for pages with any dynamic content?

From: next/link
You can see that the <Link> component from next/link enables client-side transitions and link prefetching, which are great features, but maybe not for all cases.
Please see the caveat I've run into. Let's say I have the following pages:
Home - Some landing page with a nav bar
Latest - Here I can see my latest posts
Admin - Here I can add more posts
The Latest page from the example above uses getStaticProps with revalidate. Something like:
export const getStaticProps: GetStaticProps<HomeRoute> = async () => {
  const preloadedState = await getPreloadedState();
  return {
    revalidate: 1,
    props: {
      preloadedState
    }
  };
};
In theory, after 1 second, it should serve the last stale response to the next request and trigger a new static regeneration to be served to subsequent requests. After 1 second, the process repeats, so you get data that is at most about a second stale, which is pretty much immediate.
Now, see the caveat I've run into with next/link:
User lands on the Home page. There is a Link on the nav bar pointing to Latest. That link will be prefetched by next/link.
In some other browser, an admin goes to the Admin page and adds one more post (which should appear on the Latest page at some point).
Now user clicks on the Latest page link. The new post is not there.
Clicks on Home again. And clicks again on Latest. New post is still not there and never will be.
The transitions in this case are blazing fast, which is nice. But from my experience so far, I think that user is locked inside a version of my website where the new post will never be available, because that first prefetch happened at a time when the new post didn't exist.
The only way that user will ever see the new post is if he/she presses F5 to do a full website reload. And it might be necessary to refresh twice, because the 1st one might return the previous stale version while triggering the regeneration for the next one.
I mean, what is the workaround to this issue? Should I not use next/link for pages that contain any dynamic data? Should I just use normal <a> tags?
UPDATE
From: https://nextjs.org/docs/basic-features/data-fetching#statically-generates-both-html-and-json
From the documentation linked above, we can see that, indeed, client-side transitions will not trigger a page regeneration, because they don't call getStaticProps; they only fetch the pre-built JSON object for the page to use as props.
AFAIK, it means that you'll be locked to the version of the page that existed when you first visited the website. You can go back and forth and nothing in the pages will change, because the JSON data is probably cached on the client anyway.
PS: I've tested this (like I've mentioned in the question above) and this is exactly what happens.
So how to workaround that? I would like for users that keep an open tab of my website to be able to get updates for the pages when they navigate from one page to the other.
A POSSIBLE SOLUTION
Set up some kind of idle-time counter, and if the user accumulates, say, 10 minutes of idle time (meaning they left the tab open), then whenever they come back and perform some action, I could refresh the whole website to make sure they get the new version of the pages (sketched below).
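A minimal sketch of that idle-reload idea in plain browser JavaScript (the threshold and the event list are arbitrary choices, not any Next.js API):

// Reload the page when the user comes back after being idle for too long.
var IDLE_LIMIT_MS = 10 * 60 * 1000; // 10 minutes, as suggested above
var lastActivity = Date.now();

function touch() {
  if (Date.now() - lastActivity > IDLE_LIMIT_MS) {
    window.location.reload(); // full reload picks up revalidated pages
  }
  lastActivity = Date.now();
}

['mousemove', 'keydown', 'click'].forEach(function (evt) {
  window.addEventListener(evt, touch);
});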
Has anyone faced that problem before?
I've posted this very same question in several forums and this is the response I've got:
It seems what you described is true. next/link caches results on the client side, and your visitor will not fetch a revalidated result out of the box unless there is a full-page reload.
Depending on the likelihood of content changes, you might want to use <a> instead, or you can look at some client-side content reload strategy that kicks in after mount and queries the data source for updated content.
Given that fact, I'll stick to using next/link and client-side transitions. But I'll also use something like a setInterval() to do a full website reload from time to time, so I'm sure my users will keep getting revalidated pages eventually, along the lines of the sketch below.
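A trivial version of that periodic reload (the 30-minute interval is an arbitrary choice, not anything Next.js prescribes):

// Force a full reload periodically so ISR-revalidated pages are picked up.
setInterval(function () {
  window.location.reload();
}, 30 * 60 * 1000);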

Setting eVars in DTM

I'm trying to set an eVar in DTM, but nothing is actually firing. I am trying to capture the user id that is set in a data layer using eVar5 and pass it in a page load rule, but I don't see anything in the debugger tool or the Adobe report suite.
Attached is a screenshot showing how I am setting the eVar; any advice on why this may not be working?
A few things you should try:
Check whether the page load rule in which you have defined the above is executing. Either type _satellite.setDebug(true); in the browser console before you load the page, or use the DTM Switch browser plugin.
If the rule is executing, check whether the user id data element is receiving any value: type _satellite.getVar('user id'); in the console and inspect the result.
If the rule is not executing, debug the page load rule condition in the browser console after putting breakpoints in the satellite JS rendered on the page.
These steps should bring you closer to the root cause of the issue; the console sequence for the first two checks is shown below.
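For reference (the data element name 'user id' comes from the question's setup):

_satellite.setDebug(true);    // enable DTM debug logging, then reload the page
_satellite.getVar('user id'); // inspect the data element's current value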
If you haven't published the rule, make sure the page you are on is loading the staging library, or tell the page to load it:
localStorage.setItem('sdsat_stagingLibrary',true);
Then refresh the page.

Applying read only permission to Kibana dashboard?

Is there a way to set some sort of permissions when sharing a Kibana dashboard with others? I'm worried that someone would either delete it or make changes and save it. I googled but didn't find anything.
A lot has happened since the question was asked. Role-based access control has been available in the community edition since May 2019:
https://www.elastic.co/blog/security-for-elasticsearch-is-now-free
Without Shield, you can try to lock the .kibana index in read-only mode:
curl -XPUT 'localhost:9200/.kibana/_settings' -d '{ "index.blocks.read_only" : true }'
It works well: nobody can save/delete a dashboard, search, or visualization. If a user resizes or moves something on a dashboard, they can easily reset it by bookmarking the dashboard without parameters in the URL (or loading it directly).
You can check more index options here: https://www.elastic.co/guide/en/elasticsearch/reference/1.4/indices-update-settings.html
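To undo the lock later, the same settings API accepts the flag set back to false:

curl -XPUT 'localhost:9200/.kibana/_settings' -d '{ "index.blocks.read_only" : false }'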
When you go to the configuration items of widgets and the dashboard (top right of the screen), you can set the editable flag to false. Beware though: then you cannot change it yourself anymore either. Other options are available: you can provide the dashboard as a script, or export it. But there is nothing like "I am user x and I am the only one who can make changes."
Well, as jettro said, "there is not something like, I am user x and I am the only one who can make changes."
That being said, others can still overwrite your dashboard: they just need to save it and then open it up in the editor. Once that is done, they can start editing it, unless they don't know anything about it.

H2 console throws away current work if the connection times out: is there no way to recover the entered queries?

I had been composing a series of SQL statements inside the H2 Console. When I pressed execute, the page refreshed and I saw the initial logon screen, shown below. Using the browser history does not help: the H2 Console page insists on returning to the state shown below.
This is a serious usability issue - I will not always remember to do all of my work in a separate editor. Has anyone come up with a workaround?
Lost form data is also a problem for other web pages (bug tracking for example).
For Google Chrome, there is an extension Lazarus: Form Recovery. It "Autosaves everything you type so you can easily recover from form-killing timeouts, crashes and network errors."

How to use Chrome's network debugger with redirects

The Chrome network debugger gives me a great view of all the HTTP resources loaded for a page. But it clears the list whenever a new top-level HTML page is loaded. This makes it very difficult to debug pages that automatically reload for one reason or another (running script or 3xx redirect responses).
Can I tell Chrome not to clear the network debugger when a new top-level page is loaded? Or can I go back and look at the previous page's network resources?
Or can I somehow force Chrome to pause before loading a new page when I don't control the page I'm trying to debug that's doing the redirecting? It's part of an OpenID dance that's going awry, so the combination of SSL and credentials makes it extremely difficult to debug with command-line tools.
This has changed as of v32; thanks to @Daniel Alexiuc and @Thanatos for their comments.
Current (≥ v32)
At the top of the "Network" tab of DevTools, there's a checkbox to switch on the "Preserve log" functionality. If it is checked, the network log is preserved on page load.
The little red dot on the left now serves to switch network logging on and off completely.
Older versions
In older versions of Chrome (v21 here), there's a little clickable red dot in the footer of the "Network" tab.
If you hover over it, it will tell you that it will "Preserve Log Upon Navigation" when it is activated. It keeps that promise.
I don't know of a way to force Chrome to not clear the Network debugger, but this might accomplish what you're looking for:
Open the JS console and run:
window.addEventListener("beforeunload", function() { debugger; }, false)
This will pause Chrome before loading the new page by hitting a breakpoint.
Another great solution for debugging network calls before a redirect to another page is to select the beforeunload event breakpoint.
This way you are sure to break the flow right before it redirects to another page, so all network calls, network data, and console logs are still there.
This solution is best when you want to check the responses of the calls.
P.S.:
You can also use XHR breakpoints if you want to stop right before a specific call, or before any call (see image example).
Just an update to @bfncs' answer.
I think around Chrome 43 the behavior was changed a little. You still need to enable "Preserve log" to see it, but now the redirect is shown under the "Other" tab, while the loaded document is shown under "Doc".
This always confused me, because I have a lot of network requests and filter them by type: XHR, Doc, JS, etc. But in the case of a redirect the "Doc" tab is empty, so I have to guess.
