Classic ASP code using more resources (asp-classic)

I am using classic ASP for a web application, which I run in Internet Explorer.
I have developed a few reports on sales data. All the sales reports are linked from a Sales Dashboard, and every report has selection criteria such as customer, date period, product group, and a few others.
Now, the problem I am facing:
I open a total sales report for the entire year, which takes almost 15 minutes to load. While that report is executing, if I try to open any other report from the Sales Dashboard, its selection-criteria page only appears after the first report has finished. But if I copy the link for the second report and open it in a new Internet Explorer window, it opens normally.
I cannot trace the problem. Has anyone faced the same issue?

First, I agree with this comment posted under the question:
IIS/ASP only allows one concurrent request per session. This is why the second request does not happen until after the first one completes. If you open a new browser instance or a different browser then this is treated as a different session.
Second, if all that is being asked here is whether other people have similar issues or not, then the answer is yes, due to what johna said in the comment.
If you're looking for a way to get around that for yourself, the way described in the comment (open a new browser instance or a different browser) will work.
However, if you're after a way to bypass the 15-minute wait entirely, give some thought to preparing the data before the report is called. By that I mean either scheduling the report to run after close of business each day and storing the resulting HTML or data separately, and/or providing a button that prepares the report from current data whenever the user wants.
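If you go the prepared-data route, a small scheduled job can run the heavy query after hours and write the finished HTML to disk, so the ASP page only has to serve a static file. Here is a minimal C# sketch of the idea; the connection string, query, and output path are placeholders, not from the question:

    using System.Data.SqlClient;
    using System.IO;
    using System.Text;

    // Nightly report pre-generation sketch: schedule with Windows Task
    // Scheduler after close of business. Connection string, query, and
    // output path are placeholders.
    class ReportPrerenderer
    {
        static void Main()
        {
            var html = new StringBuilder("<table>");
            using (var conn = new SqlConnection(
                "Server=.;Database=Sales;Integrated Security=true"))
            using (var cmd = new SqlCommand(
                "SELECT Customer, SUM(Amount) AS Total FROM Sales GROUP BY Customer",
                conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                    while (reader.Read())
                        html.AppendFormat("<tr><td>{0}</td><td>{1}</td></tr>",
                                          reader["Customer"], reader["Total"]);
            }
            html.Append("</table>");

            // The classic ASP page can then include or stream this file
            // instantly instead of running the 15-minute query.
            File.WriteAllText(@"C:\inetpub\wwwroot\reports\total_sales.html",
                              html.ToString());
        }
    }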

Related

Metrics and dimensions are not showing up for hits of type 'event'

I have fully functional, working code (written in C#) whose purpose is to track hits to a specified GA property. This code has been tested and still works successfully. It adds some predefined dimensions (like App Version) to each hit, and several custom metrics to certain hit types (like events with a certain Event Action).
So far so good: everything works flawlessly for the first property these hits are sent to. Everything is also fine when I set up a brand-new GA property and track my hits to it; I can see the events in the Realtime reports, and after a while they show up in custom reports, where I can see my custom metrics.
The issue is that when I send exactly the same hits to an existing property, created and configured ages ago, neither the dimensions (even the predefined ones) nor the custom metrics appear in my custom reports. I can see the events in the Realtime and Behavior reports, and I can even build a custom report against the event count, but that's it. I can use, for example, Day of the Month as a primary dimension, but when I try App Version as a dimension, or my custom metrics as report metrics, it says "There is no data for this view."
I've already tried everything I could think of, read almost every post about custom definitions I could find on Google, and viewed almost every related SO question and answer, still with no luck.
- We use the Measurement Protocol.
- A correct User Agent is sent with each hit.
- 17 days have passed since my first attempt to track these hits with the existing property.
- There are no filters or segments at all.
- There is only one view.
To me it looks like a property misconfiguration, but I've inspected every configuration page (I have all possible rights granted) and have not found anything relevant.
I'd appreciate any help with this issue.
UPD: The hit itself (with the sensitive data replaced):
t=event&ec=session&ea=connection_end&el=b225d53a-6bb2-8021-f7b5-ae7004ae0a00&cm1=174960&cm2=1751494&cm3=479033&tid=UA-XXXXXXX-X&cid=4119e77f-be87-4530-04d3-33882f8eea77&v=1&av=XX.XX.99.555&an=my-awesome-app&aid=app.awesome.my
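For context, a hit like the one above is just a form-encoded POST to the standard /collect endpoint. Below is a minimal C# sketch of such a sender; this is not the actual tracking code from the question, the payload simply mirrors the hit shown above:

    using System.Net.Http;
    using System.Threading.Tasks;

    // Minimal Measurement Protocol sender sketch; not the actual tracking
    // code from the question. The payload mirrors the hit shown above.
    class MpSender
    {
        static async Task Main()
        {
            const string payload =
                "v=1&tid=UA-XXXXXXX-X&cid=4119e77f-be87-4530-04d3-33882f8eea77" +
                "&t=event&ec=session&ea=connection_end" +
                "&el=b225d53a-6bb2-8021-f7b5-ae7004ae0a00" +
                "&cm1=174960&cm2=1751494&cm3=479033" +
                "&an=my-awesome-app&av=XX.XX.99.555&aid=app.awesome.my";

            using (var http = new HttpClient())
            {
                // GA expects the hit parameters in the POST body.
                var response = await http.PostAsync(
                    "https://www.google-analytics.com/collect",
                    new StringContent(payload));
                response.EnsureSuccessStatusCode();
            }
        }
    }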
UPD: Here is what I'm trying to achieve (the screenshot was taken on the test property, where everything works like a charm):

GA Measurement Protocol does not fire event for just one of the views

I've set up a Zapier automation to fire an event every time a new deal is made in a third-party CRM. The automation triggers fine and retrieves the GA Client ID stored in the CRM. The goal of the automation is to add the value of the deal to the client's session history. This works completely fine on a new test GA view I made, as well as on the original one (the one left without any filters).
However, there is one GA view which has both the bot/spider exclusion setting and 3 filters set up. I tried disabling all four of them, yet the event still isn't recorded, neither in Realtime nor in User Explorer. I'm wondering what could be the cause of this. All views are, of course, on the same property. Are there any other view-specific filters (besides the bot/spider setting and view filters) or options I may have missed that would cause events sent by Zapier not to appear in just this one view?
Any help is appreciated!
Settings updates, in this specific case relating to filters, may not take effect immediately. If you leave the filters disabled, check whether you see that data in the reports after midnight (or a few hours after midnight).
This happens because the data is reprocessed after midnight, so for that day (which has by then become the previous day), if you have removed the filters, you should find all the data.

Google Analytics - have active user but missing info

Environment: I injected Google Analytics tracking into my SharePoint Online site; all good.
I have now been clicking pages for more than an hour and wanted to check the results. I can see that tracking is working (see screenshot): at that moment there was one active user (me) and more than 30 page views in a 30-minute window.
Problem: the reports (user and page view counts) seem to be empty, but I assume there should be at least one user and multiple page views. Is that correct, or am I misusing Google Analytics?
Update:
Pressed "Refresh Report" and Google fetched new data. Unfortunately, nothing changed and data still is empty.
This report was generated on 12/04/2019 at 16:17:25
The time when the report was generated does not reflect how "fresh" the data is. For the free version of GA there is no SLA; processing can take up to 2 days, though it is generally under 24 hours. https://support.google.com/analytics/answer/7084038?hl=en
Seeing the user in Realtime doesn't mean the data in the standard reports has been updated. You need to be patient and wait.
If you're seeing data in the Realtime reports, then the standard reports should populate. This can take time, though in my experience the latency is usually under 1 hour. Are you looking at the standard reports in an entirely unfiltered view? It might be worth checking whether any filters are affecting your data, though they should affect the Realtime reports as well.

Google Maps - Caching - Methods

OK, so I have spoken to a Google representative about this issue; however, since I am not at the enterprise level, he couldn't escalate me to tech support and suggested that I ask on SO. Here is the question...
The Google Maps Terms of Service state the following:
(b) No Pre-Fetching, Caching, or Storage of Content. You must not pre-fetch, cache, or store any Content, except that you may store: (i) limited amounts of Content for the purpose of improving the performance of your Maps API Implementation if you do so temporarily (and in no event for more than 30 calendar days), securely, and in a manner that does not permit use of the Content outside of the Service; and (ii) any content identifier or key that the Maps APIs Documentation specifically permits you to store. For example, you must not use the Content to create an independent database of "places" or other local listings information.
This originally led me to believe that Google would not allow caching of any kind. However, I then read the following:
When to Use Client-Side Geocoding
The basic answer is "almost always." As geocoding limits are per user session, there is no risk that your application will reach a global limit as your userbase grows. Client-side geocoding will not face a quota limit unless you perform a batch of geocoding requests within a user session. Therefore, running client-side geocoding, you generally don't have to worry about your quota.
Two basic architectures for client-side geocoding exist.
Run the geocoding and display entirely in the browser. For instance, the user enters an address on your page. Your application geocodes it. Then your page uses the geocode to create a marker on the map. Or your app does some simple analysis using the geocode. No data is sent to your server. This reduces load on your server, but doesn't give you any sense of what your users are doing.
Run the geocode in the browser and then send it to the server. For instance, the user enters an address. Your application geocodes it in the browser. The app then sends the data to your server. The server responds with some data, such as nearby points of interest. This allows you to customize a response based on your own data, and also to cache the geocode if you want. This cache allows you to optimize even more. You can even query the server with the address, see if you have a recently cached geocode for it, and if you do, use that. If you don't, return no result to the browser, and let it geocode the address and send the result back to the server for caching.
So one side says you cannot cache, and the other says you should. Another suggestion is to always use client-side geocoding when you can, but this becomes a grey area as well, because both examples assume the user inputs the data. What if jQuery read the data from a div or span and then geocoded that information? The user wouldn't have actually initiated the geocode, but it would still be done client-side. I'm trying to create a site with lots of user-generated events, and it could get pretty loaded, so I'm trying to determine the best practice for doing this. Google suggested I post here, so before you say this is "off-topic", please note that this is where they told me to post.
Any feedback would be greatly appreciated.
The first quote does not explicitly forbid caching data at all. It is ambiguous as to how much you can cache (exactly how much is "limited amounts"?), but it does not forbid caching.
You are allowed to cache the data if it helps improve the performance of your site, as long as you retain it for no longer than 30 days and do not make it available to any service other than the one that originally retrieved it.
Regarding user interaction: if your user explicitly opens a page expecting to be shown geocoded information, I would assume that fulfills "user interaction".
As an example, in a project I worked on last year I set things up to do the following:
- Show markers on the map.
- If the user clicked a marker, they were shown a popup with data from the cache if available; otherwise a geocode was performed and the returned information was cached along with the date/time of caching.
Another page of the site showed a history of these markers at 5-minute intervals throughout the day. If cached data was present (from clicking a map marker as in the previous part) it was shown; otherwise a geocode was performed and the data cached as before. The user clicking to run the report was (in my opinion) enough "user interaction" not to count as pre-fetching, since the user had to manually select a timeframe before the report would be displayed.
A cron job then ran every day at midnight and removed every cached record more than 25 days old.
As it was, I was caching much less than 10% of the marker positions being shown (20+ markers updated every minute, but the report was run on maybe 3-5 markers each day, and only every 5th point had its data geocoded).
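A minimal C# sketch of that pattern; the types and the in-memory store are illustrative (a database table works the same way), and the 25-day cutoff keeps you safely inside the 30-day limit:

    using System;
    using System.Collections.Generic;

    // Illustrative server-side geocode cache honoring the 30-day limit
    // (purged at 25 days, as in the answer). Names are hypothetical.
    class CachedGeocode
    {
        public double Lat, Lng;
        public DateTime CachedAtUtc;
    }

    class GeocodeCache
    {
        readonly Dictionary<string, CachedGeocode> store =
            new Dictionary<string, CachedGeocode>();

        // Returns a cached geocode if present and fresh; otherwise null,
        // signalling the browser to geocode and POST the result back.
        public CachedGeocode Lookup(string address)
        {
            CachedGeocode hit;
            if (store.TryGetValue(address, out hit) &&
                (DateTime.UtcNow - hit.CachedAtUtc).TotalDays < 25)
                return hit;
            return null;
        }

        public void Save(string address, double lat, double lng)
        {
            store[address] = new CachedGeocode
                { Lat = lat, Lng = lng, CachedAtUtc = DateTime.UtcNow };
        }

        // Equivalent of the midnight cron job: drop stale entries.
        public void PurgeStale()
        {
            var stale = new List<string>();
            foreach (var kv in store)
                if ((DateTime.UtcNow - kv.Value.CachedAtUtc).TotalDays >= 25)
                    stale.Add(kv.Key);
            foreach (var key in stale)
                store.Remove(key);
        }
    }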

How to handle concurrency in an ASP.Net WebSite? (auctioneer site)

My problem is as follows:
I have an auction site on which many different objects are auctioned.
My problem is probably very simple for a more experienced user to clear up, I think: how can I run business and database logic without a page being opened?
To put it directly: if at 3 a.m. no user is on the site, the winner (for example) must still be set, whether a page is open or not.
So I need some kind of "every 2 seconds, do this method", without a page being open.
My idea was a separate application that uses the same business and database layers as the ASP.NET page and runs on the server. Is that a good or bad idea?
A separate process (a scheduled app or a Windows service) is the only reliable way to achieve this.
Using the same BLL and DAL is exactly the right thing to do, too.
Check out this article on Windows services: http://msdn.microsoft.com/en-us/library/aa984074(VS.71).aspx
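As a minimal sketch of that approach in C#, a console app (or the OnStart of a Windows service) can run a timer that calls into the shared business layer; the service method name here is illustrative, not from the question:

    using System;
    using System.Threading;

    // Minimal polling-worker sketch. The same body fits inside a Windows
    // service's OnStart/OnStop handlers.
    class AuctionWorker
    {
        static void Main()
        {
            // Fire every 2 seconds; the callback reuses the existing BLL/DAL.
            using (var timer = new Timer(_ => CloseExpiredAuctions(), null,
                                         TimeSpan.Zero, TimeSpan.FromSeconds(2)))
            {
                Console.WriteLine("Worker running. Press Enter to stop.");
                Console.ReadLine();
            }
        }

        static void CloseExpiredAuctions()
        {
            // Hypothetical call into the shared business layer, e.g.:
            // AuctionService.CloseAllWhereEndTimePassed(DateTime.UtcNow);
        }
    }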
Letting every user query the database every 2 seconds would create unnecessary traffic on your site, which is not a good idea. (Users tend to refresh the page many times just before the auction closes anyway.)
My thoughts:
Add a closing date to the auction.
The last user to place a bid is always the winner, and you can't place bids after 3 a.m. (the date the auction closes), so if you visit the site after 3 a.m. you can't place a bid, and the winning user is displayed. If somebody opens the site just before 3 a.m. and places a bid after 3 a.m., your business logic should check the date of the bid and deny it (a sketch follows below). Also, users might live in different time zones, so consider displaying the server time on your site.
Setting the closing date to "now" would close the auction immediately.
You can also add JavaScript to refresh the page when the clock passes the hour, or something like that (or use the number of seconds left before the auction closes in a setTimeout function or a meta refresh).
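A minimal C# sketch of that server-side check; the types are illustrative, and your BLL entities will differ:

    using System;

    // Illustrative entities; the real BLL/DAL types will differ.
    class Auction
    {
        public DateTime ClosesAtUtc { get; set; }
        public string WinningBidder { get; set; }
    }

    static class BidService
    {
        // Deny any bid whose server-side timestamp falls after the close;
        // the last accepted bid is the winner.
        public static bool TryPlaceBid(Auction auction, string bidder)
        {
            if (DateTime.UtcNow >= auction.ClosesAtUtc)
                return false;               // auction closed: bid denied

            auction.WinningBidder = bidder;
            return true;
        }
    }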
