The Redux DevTools shows a certain number of dispatched actions for the time-travel feature - is there a way to increase that default amount?
You can use the maxAge option:
number (>1) - maximum allowed actions to be stored in the history tree. The oldest actions are removed once maxAge is reached. It's critical for performance. Default is 50.
E.g.
const composeEnhancers = window.__REDUX_DEVTOOLS_EXTENSION_COMPOSE__({ maxAge: 5 });
Switch to the Redux DevTools tab: the history will now keep only the five most recent actions.
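For reference, here's how that composer wires into the store - a minimal sketch, assuming a rootReducer and a middleware list (both placeholders):

import { createStore, applyMiddleware, compose } from 'redux';
import rootReducer from './reducers'; // placeholder reducer

// fall back to plain compose when the extension isn't installed
const composeEnhancers = window.__REDUX_DEVTOOLS_EXTENSION_COMPOSE__
  ? window.__REDUX_DEVTOOLS_EXTENSION_COMPOSE__({ maxAge: 5 })
  : compose;

const store = createStore(rootReducer, composeEnhancers(applyMiddleware()));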
Alternatively, you can set this option in the Redux DevTools browser extension settings.
The option is: Limit the action history to xx items.
My data is stock news, so the data on my page changes once every day. To keep the page up to date when the data changes, I used revalidate: 1 in getStaticProps. It's working fine, but is this comparatively better than using getServerSideProps?
Note: my data updates only once a day (every morning).
We have to consider whether the data fetched is personalised per user. If it is personalised to the currently logged-in user, it's better to use getServerSideProps, as it fetches the data on each request and pre-renders the page every time, without any caching involved.
ISR, on the other hand, is better suited for rendering a page with content common to all users. Background regeneration ensures traffic is served uninterruptedly, always from static storage, and the newly built page is swapped in only after it has finished generating. Even when revalidating, the visitor first receives the cached version, and only then the updated version. This caching strategy is commonly known as "stale-while-revalidate".
A revalidate interval of 1s is low; knowing that you get the update only once a day, it would be better to revalidate once every 60s or 120s.
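As an illustration, a minimal getStaticProps sketch with a longer window (the endpoint and the prop shape are placeholders):

// pages/news.js
export async function getStaticProps() {
  // hypothetical endpoint - replace with your actual data source
  const res = await fetch('https://example.com/api/stock-news');
  const news = await res.json();

  return {
    props: { news },
    // regenerate the page at most once every 60 seconds
    revalidate: 60,
  };
}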
Server-side rendering would also be slower, since the page is rendered on every request, and would decrease performance compared with ISR.
I've set up a Zapier automation to fire an event every time a new deal is made in a 3rd-party CRM. The automation triggers fine and retrieves the GA Client ID stored in the CRM. The goal of this automation is to add the value of the deal to the client's session history. This works completely fine on a new test GA View I made, as well as on the original one (the one left without any filters).
However, there's one GA View which has both the anti-bot/spider setting and 3 filters set up. I tried disabling all four of them, yet the event still wasn't showing up - not in Real-Time, nor in User Explorer. I'm wondering what could be the cause of this. All views are, of course, under the same property. Are there any other view-specific filters or options (besides the anti-bot/spider setting and view filters) that I may have missed that would cause events sent by Zapier not to appear in just this one view?
Any help is appreciated!
Updates to settings, in the specific case of filters, may not take effect immediately. If you leave the filters disabled, check after midnight (or a few hours after midnight) whether you see that data in the reports.
This happens because the data is reprocessed after midnight, so for that day (which has by then become the previous one), if you have removed the filters, you should find all the data.
I have a Meteor application which displays a calendar (using fullcalendar.io), and subscribes to bookings within a given date range. The app uses FlowRouter and grabs the date from the URL, and then uses this to subscribe to the bookings (URL date through to URL date + 14 days). This all works fine and I can skip through the days in the calendar, loading events for each day with no refresh as they are coming from minimongo. What I would like to do is to refresh this subscription in the background when the user switches date. This is possible using flow router e.g.:
FlowRouter.go('/diary/2017-04-11')
or by setting the subscription date in a Session / Reactive variable.
This will load the events from 2017-04-11 to 2017-04-25. The issue is that as the entire subscription is recreated there is a slight delay whilst it is loading. What I am trying to achieve is a 'moving window' - for example, if I am subscribed to events from 2017-04-10 and I change the publication to 2017-04-11, then only the 1 extra day gets loaded, rather than all data getting removed and replaced. This would ensure that I am able to skip through the days of the calendar without any load times. If the user selected a date > 14 days in the future manually then they would see the load time, this is perfectly acceptable.
It sounds like your subscriptions are tied to the template that's loaded with each route change. When you switch routes, the template is reloaded, and the subscription along with it.
There are a couple of options for cache managers, which would allow you to keep a sub active across templates.
E.g. https://github.com/kadirahq/subs-manager
Note that, while this will allow your client to keep subs active as described, it will probably work in an "additive" fashion. So it won't by itself solve your moving-window problem, but it will pick up new items from the publisher as you navigate.
Second note: with this package, you're not limited to a single manager. I've found that it works best if you keep one manager per sub; once I started loading multiple subs into one manager, it started behaving strangely.
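To sketch the idea, a minimal example with one manager dedicated to the bookings sub - the 'bookings' publication name, its date argument, and the cache settings here are assumptions:

// one manager per subscription; cacheLimit / expireIn values are illustrative
const bookingSubs = new SubsManager({ cacheLimit: 15, expireIn: 5 });

Tracker.autorun(() => {
  // assume the route sets this reactive value from the URL date param
  const startDate = Session.get('diaryDate');
  if (startDate) {
    // previously-cached date ranges stay alive, so revisiting them is instant
    bookingSubs.subscribe('bookings', startDate);
  }
});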
I am logging custom metrics using TrackMetric:
var telemetry = new TelemetryClient();
telemetry.TrackMetric("Cache Size", cache.Count());
But nothing appears in the portal.
The output window when debugging shows the metrics being sent. I'm not sure how else to debug this.
There can be several things that make new items show up later than you'd like:
latency in the AI pipeline itself, which is usually just a couple of minutes or less (you can always check http://aka.ms/aistatus to see if there's any abnormal latency going on)
if you added a new custom property or new custom metric, it might take time for that new field to show up in the metadata that the charts/etc use to build themselves. Depending on timing, especially if this is a brand-new app, it can take up to ~15 minutes for a new property to show up in metadata if the stars are all unaligned... but normally much less than that.
once it is available in metadata, you might need to refresh in the portal if you've already opened Metrics Explorer for that AI resource, so that it re-requests the metadata and sees your field (normally the "refresh" command on Metrics Explorer or an Overview blade is enough, but a full refresh in the browser works as a last resort)
I have a question regarding the "Add Calendar By URL" function in Google Calendar:
How often is it updated? (Most sources I've found say once every 24h.) Does the caladress.ics?noCache workaround still work?
How is it updated? If I have a large calendar (e.g. 2008-2016) and add a single event, does Calendar re-upload the whole calendar or check for a diff? If it checks for a diff, are there any limitations?
Is there any limit to how long events can be? E.g. is it possible to set a 5-year event?
1. How often is it updated? (Most sources I've found say once every 24h.) Does the caladress.ics?noCache workaround still work?
Based on the Google thread, it may take a few hours for new information to be parsed and become viewable by your users.
Note: It might take up to 12 hours for changes to show in your Google Calendar.
You can use no-cache to indicate that the returned response cannot be used to satisfy a subsequent request to the same URL without first checking with the server if the response has changed. Here is the documentation and example.
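To illustrate the directive, a sketch of serving an .ics feed with no-cache from an Express server (the route and file path are placeholders):

const express = require('express');
const app = express();

app.get('/calendar.ics', (req, res) => {
  // tell clients to revalidate with the server before reusing a cached copy
  res.set('Cache-Control', 'no-cache');
  res.sendFile('/path/to/calendar.ics'); // placeholder absolute path
});

app.listen(3000);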
2. How is it updated? If I have a large calendar (e.g. 2008-2016) and add a single event, does Calendar re-upload the whole calendar or check for a diff? If it checks for a diff, are there any limitations?
Calendar data is updated based on how you implement the "incremental synchronization" of calendar data: an initial full sync followed by incremental syncs.
Initial full sync is performed once at the very beginning in order to fully synchronize the client’s state with the server’s state. You can optionally restrict the list request using request parameters if you only want to synchronize a specific subset of resources.
Incremental sync allows you to retrieve all the resources that have been modified since the last sync request. To do this, you perform a list request with your most recent sync token specified in the syncToken field. Keep in mind that the result will always contain deleted entries, so that clients get the chance to remove them from storage.
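A minimal sketch of that flow using the Node.js googleapis client - the auth object and token persistence are assumed, and pagination is elided:

const { google } = require('googleapis');

// 'auth' is an authorized OAuth2 client; storing the token is up to the caller
async function syncEvents(auth, savedSyncToken) {
  const calendar = google.calendar({ version: 'v3', auth });

  // With a syncToken this is an incremental sync; without one it is
  // the initial full sync. (nextSyncToken arrives on the last results page.)
  const res = await calendar.events.list({
    calendarId: 'primary',
    syncToken: savedSyncToken,
  });

  for (const event of res.data.items) {
    if (event.status === 'cancelled') {
      // deleted entry: remove it from local storage (placeholder)
    } else {
      // new or modified entry: upsert it locally (placeholder)
    }
  }

  // persist this for the next incremental sync
  return res.data.nextSyncToken;
}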
3. Is there any limit to how long events can be? E.g. is it possible to set a 5-year event?
As for limitations, the Google Calendar API has a courtesy limit of 1,000,000 queries per day; you can see the calendar usage limits here. It is possible to set such an event, as long as you haven't reached the limit on the number of events you can create.