Post/Redirect/Get: GET is called multiple times

I have a webpage with a form that is submitted via POST. The POST route processes some data and redirects the user to a GET route. The problem: the GET route is called multiple times, usually three. So three GETs are fired and the user sees the result of the first one; the other two only show up in my logs.
Occasionally the GET is even called more than three times...
(The POST route, of course, is only called once...)
By the way, I'm using JRuby/JRack/Sinatra on Jetty (on Google App Engine). The problem happens both locally and remotely.
Philip

I have had that same issue in my code before (although on a different platform). It turned out to be elements on the page referencing the same URL as the page itself: one broken image and two ignored CSS files whose references resolved to the parent page.
If it's the same kind of issue, you can use Firebug's Net tab to verify and debug.

Not sure without seeing the code, but in most cases a script will continue to execute after a call to a redirect function. Try returning from your method immediately after calling redirect_to.
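To illustrate the point generically (a minimal sketch in Express-style JavaScript rather than your Sinatra stack; the routes are made up):

    const express = require('express');
    const app = express();
    app.use(express.urlencoded({ extended: false }));

    app.post('/submit', (req, res) => {
      if (!req.body.name) {
        // Without the `return`, execution falls through and the
        // processing below still runs after the redirect is sent.
        return res.redirect('/form');
      }
      // ...process the data...
      res.redirect('/result'); // PRG: the browser follows up with a single GET
    });

    app.listen(3000);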

There is a logical bug in your code. Fix it.

Related

Incorrect page saved for visits on the index page

We have two nopCommerce shops running on pretty much the same code. Both have GA handled through GTM with more or less the same setup, yet for some reason on one of the two shops, all visits to the index page end up being registered incorrectly.
There are two index pages:
http://domain.co.uk/en/
http://domain.co.uk/pl/
Visiting http://domain.co.uk redirects to one of the above. When I visit either of those two, Analytics appends the domain name to the path for some reason, so if I visit http://domain.co.uk/en/, GA registers http://domain.co.uk/en/domain.co.uk.
I tried adding a canonical tag (with the actual address) to the index page, but it changed nothing. Note that this problem only happens on the index page; other pages are registered correctly in GA. Does anyone have an idea what could make Analytics save those addresses incorrectly?
It's hard to reproduce this error; the only way I know of is via a "RequestURI" filter. Is there any chance you have one? It could be the reason for this.
Barnettt's suggestion made me (and my boss) look through the Analytics settings some more, and I think I figured it out. Someone had set the default page (under view settings) to domain.co.uk. The value is obviously incorrect, but more interesting is the fact that it was probably there from the start, yet the incorrect pages only started getting registered a few months ago.

DTM giving a different report suite for custom links and page calls?

I'm getting some very strange behavior in DTM. When our page loads (from a local instance of the website), we get the expected call going out with the proper dev report suite. When a custom link call is made from that page, for some reason DTM sends it with a production report suite. And if I look in Adobe Analytics for the custom link name reported under the prod rsid, it does not show up there.
Any ideas on what is going on and how I can fix this issue?
This is my shot in the dark based on what you have said, and it assumes that your statements are accurate (e.g. you aren't seeing pink elephants, the request really did show your prod rsid in the proper portion of the request URL, you did check your prod rsid after an acceptable amount of time had passed, there are no segment or other filter shenanigans, etc.; in short, that you do know how to accurately perform the basic QA song and dance).
Under that assumption, below is a scenario that could plausibly reproduce what you are describing. I could be partially right or totally off for your specific situation, but there's really no way for me to know for sure without access to your DTM instance.
The Scenario
Long story short: it sounds like you have a blend of custom code and DTM automatic settings enabled, and DTM is overriding and/or not caring about your custom code for link tracking.
More specifically, it sounds to me like you have AA implemented as a tool in DTM, and in the config settings, you have your production and staging rsids specified in the text fields.
Then in the General section, you either do NOT have values specified for Tracking Server and Tracking Server Secure, or else they are set to the wrong values.
Then, in the Library Management section, you have either selected "Managed by Adobe" in which case DTM takes care of the library, or else you have selected "Custom" and you are adding the library yourself AND you have NOT checked "Set report suites using custom code below".
Then, somewhere in DTM (e.g. the Library Management > Custom code box, or Customize Page Code codebox) you have code that pops rsid stuff (e.g. s.account, s_account, dynamicAccountList stuff), and possibly also trackingServer and trackingServerSecure.
Finally, you (like most other people, because DTM's double script include for staging vs. prod is.. dumb) just use the prod script include on your page, and either use the debug/staging mode or rely on whatever rsid routing logic you've set up to route to dev.
So.. when the page is first loaded, DTM loads the AA library and sets variables based on what you specified in the tool config. During this time, it also pops any custom code blocks you have in the tool config, which may or may not override the tool config fields, depending on what you enabled. After that, it pops whatever you have in page load rules (if any), etc..
But then comes the link click.. As I have mentioned in other posts on SO, DTM has a caveat (IMO a bug) in how it references the AA object after the initial page load/AA request: basically, it doesn't. Instead, it uses internal methods (the main one being a .getS() method) to create a new instance of the AA object, based on whatever you have configured in the tool config section. Well, here's the rub.. it does NOT account for or execute any custom code you have in code boxes in the tool config section.
That happens whenever an event based or direct call rule is triggered, and it effectively screws you. Why does DTM do this? I do not know. IMO Adobe needs to change this feature caveat bug: either refactor DTM to execute the code boxes, or, you know.. just reference the original AA object, like any normal script would do..
But in any case..
So for example, my theory here is that the page loads fine and points to the dev rsid based on your setup. But then you click a link, an event triggers, and DTM makes a new AA object without caring about your custom code, so all it has to go on is what is in the tool's config fields.
Since DTM doesn't actually have any rules around the prod vs. dev rsids you specify in those fields (you have to write custom code for that, in the very code boxes DTM ignores!), it just pops the prod rsid, because that's the script include you have on your page.
Then, as for not seeing the data actually show up in your prod rsid: again, since DTM ignores what you set in your custom code boxes, it defaults to what is specified in the trackingServer fields in the tool config, and my assumption here is that they are either blank or wrong (you should be able to look at the request URL to Adobe to verify this). This theory follows from your saying the prod rsid is right and you see a request being made, so the next culprit would be a wrong tracking server.
So, that is my theory of what's going on. Maybe it's all right, maybe some of it is right; hopefully it points you in the right direction at least.
Edit:
If you can confirm that this is indeed how you have things set up, then you will naturally ask, "Okay, well what do I do about that?" As I have said in a lot of my other SO answers: basically, your only option is to uncheck all the settings that make DTM automate AA, keep the AA section disabled in all your rules, set whatever AA vars you want to set yourself, and make the s.t() or s.tl() call yourself in a 3rd party script code box, so that everything continues to reference and pop off the originally instantiated AA object.
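For example, the body of such a 3rd party script code box might look something like this (a sketch only; the variables, event, and link name are placeholders, and it assumes your AA library exposes the usual global s object):

    // Runs inside an event based / direct call rule's "JavaScript / 3rd
    // Party Script" box, so it references the original AA object (s)
    // created on page load instead of a fresh one built via .getS().
    s.linkTrackVars = 'prop10,events'; // placeholder variable list
    s.linkTrackEvents = 'event5';      // placeholder event list
    s.prop10 = 'some link context';
    s.events = 'event5';
    s.tl(true, 'o', 'My Custom Link'); // 'o' = custom link call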
Update
Based on your comments below: okay, so yeah, that sounds like what I described, and it accounts for the prod rsid popping. As for the data not showing up in the report: if you are certain the tracking server is set correctly (the request URL looks good), then this isn't a DTM issue. Here are some other explanations for why the data wouldn't show up:
Are you sure the request is being sent to your prod rsid? I don't know what you are looking at to verify this, but this is where you should be looking: In the request URL to AA: "http://[trackingServer value]/b/ss/[s.account value]/1..."
The click request isn't making it to Omniture. Verify in a packet sniffer that the request is actually made and that you are getting a 200 OK or NS_Binding_Aborted response.
You aren't waiting long enough to check for the data. Even basic hit data in "real time" reports takes a little while to show up.
You have a segment/filter active that doesn't jive with the data you are trying to look at. Make sure you don't have anything applied. Or, if you are using those things to find your data (and aren't seeing it), make sure you are applying them correctly.
You recently created the rsid and the "go live" date hasn't passed yet. Data will not show up in the report suite until up to 24 hours after the specified "go live" date.
You have a VISTA rule in place that's affecting whether the data shows up. Some companies have a VISTA rule in place for a number of reasons, and there are a million ways it could affect data (e.g. routing to a different report suite). For shits and grins, check your dev (or other) rsids to see if your data showed up there. Even if that doesn't make sense, at least it's a step forward.
You have a bots / IP exclusion rule in place that's catching data from your location.
The data sent in from the link click isn't relevant to the report. For example, maybe you are looking at a prop10 report, but prop10 isn't actually sent in the click request.
I know a lot of these are basic things to check, and no doubt you've checked, but check again. Have someone else check for you to be sure. I'm not questioning your abilities here, but even the best of coders forget to cross their t's and dot their i's sometimes, and manage to miss obvious things. If you are sure about all of these, then contact Adobe ClientCare, because I really can't think of anything else that wouldn't involve an issue with Adobe's backend.
I ran into a similar problem with my implementation. Essentially what I did was set the s.account variable directly inside doPlugins, so it would be set on all tracking calls. I wrote up the specifics here: DTM Tracking Account
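That approach looks roughly like this (a sketch; the hostname check and the rsids are placeholders):

    // doPlugins runs before every s.t() / s.tl() request, so the rsid
    // is applied to page view and custom link calls alike.
    s.usePlugins = true;
    s.doPlugins = function (s) {
      s.account = (location.hostname === 'www.example.com')
        ? 'myprodsuite'  // production rsid (placeholder)
        : 'mydevsuite';  // dev rsid (placeholder)
    };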

Do I have to use queryProfiles every time to get the profile id?

I am playing with the Google Analytics API and found that when I get the web property list, there is a defaultProfileId, which is very useful. It lets me skip the queryProfiles call, saving one request and making the whole app faster.
But I noticed that some web properties just don't have a defaultProfileId.
For what it's worth, this mostly happens with tracking IDs like UA-XXXX-1.
Any tips?
Thanks!
You are correct: webProperty does not always return a defaultProfileId. I was also unable to find any information on the Web Properties page about how the API decides what a default profile id is. I submitted a bug report for it with the Analytics dev team; you can find it at: defaultProfileId - not always sent with a WebProperty. Let's hope they come back with a response; you are correct that this is a very useful feature.
Yes, you are probably going to have to query the profiles every time to get the correct profile you are after.
I just found this:
https://www.googleapis.com/analytics/v3/management/accounts/~all/webproperties/~all/profiles?oauth_token={Token}
There should be a way of using that to make one request for the accounts, one to get all the Web Properties, and then one to get all the Profiles.
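For instance, something like this (a Node 18+ sketch, run as an ES module; it assumes a valid OAuth2 access token in the ACCESS_TOKEN environment variable):

    // One call to the ~all/~all profiles endpoint returns every profile
    // across all accounts and web properties for the authorized user.
    const url = 'https://www.googleapis.com/analytics/v3/management/'
      + 'accounts/~all/webproperties/~all/profiles';

    const res = await fetch(url, {
      headers: { Authorization: `Bearer ${process.env.ACCESS_TOKEN}` },
    });
    const { items = [] } = await res.json();

    for (const profile of items) {
      // Match each profile back to its web property client-side.
      console.log(`${profile.webPropertyId} -> profile ${profile.id} (${profile.name})`);
    }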

Avoid form Re-Submit

I'm developing ASP.NET applications and am stuck on a "problem" relating to resubmit behaviour.
I'm controlling the re-submit using a counter in the form submit event, which disables the submit if the form has already been posted.
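Roughly like this, in simplified form (the form id is a placeholder):

    // Swallow any submit after the first one and grey out the button.
    const form = document.getElementById('checkout-form'); // placeholder id
    form.addEventListener('submit', (e) => {
      if (form.dataset.submitted) {
        e.preventDefault(); // already posted once: block the repeat
        return;
      }
      form.dataset.submitted = 'true';
      form.querySelector('[type=submit]').disabled = true;
    });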
My application is a 3-step workflow, and when the 3rd step is shown, the transaction has already been submitted from step 2 to step 3.
What's my problem? Well... I want to prevent the user from resubmitting the data by pressing F5 or by any other means. I don't want to disable the key, because there may be workarounds.
I'm wondering if I can remove the post data in an HTTP module that runs after rendering has completed, right before the response is sent to the user.
You can use the Post/Redirect/Get "pattern": when the user posts data, you redirect them (after processing the submitted data) to another page that responds to a GET. Just like Stack Overflow and other sites do.
Here is the Wikipedia page that explains the Post/Redirect/Get pattern.
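The flow itself is small; here is a minimal sketch of the pattern (in Express-style JavaScript rather than ASP.NET, with made-up route names):

    const express = require('express');
    const app = express();
    app.use(express.urlencoded({ extended: false }));

    const orders = new Map();
    let nextId = 1;

    app.post('/order', (req, res) => {
      const id = nextId++;
      orders.set(id, req.body);          // process the submitted data
      res.redirect(303, `/order/${id}`); // 303 See Other: the browser follows with a GET
    });

    app.get('/order/:id', (req, res) => {
      // F5 now only repeats this harmless GET, never the POST.
      res.send(`Order received: ${JSON.stringify(orders.get(Number(req.params.id)))}`);
    });

    app.listen(3000);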
Maybe I misunderstood your question/issue, but it sounds like you may be making things harder than they have to be. If you are already keeping track of whether the form has been submitted, why can't you just check that flag in the code-behind before performing whatever logic you execute on submit? If it has already been submitted, just ignore the resubmit event and maybe set an error message.

Using cookies to prevent access to certain non secure pages in a site

If I have a small microsite and on the first page I want to ensure that the user cannot jump straight to a non-secure page further in (e.g. page 2 or 3), what would be the best way to implement this? The next page should only be reachable if the user selects a certain item in a drop-down box.
My first thought is cookies. If the user goes to the second page and the cookie's value is null, they are redirected to a failure page. If the user chooses the right value, the cookie's value is set to success. Would this approach work if I sent a link to the 2nd page to a friend on another PC?
Is there a better way?
Cheap, down and dirty? A cookie or session value would work. Neither is reliable long term.
If you are making it so a user can only see certain info after selecting from a drop-down, you can hide the info in a panel and only show that panel when the right drop-down option is selected. This is most useful if you don't mind the user having to select from the drop-down each time. You can combine it with a cookie if you want the user to be able to see the data without re-selecting.
Hidden in the same page (drop down in one panel, info in another), you can keep it hidden perpetually.
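In browser terms, that can be as simple as this sketch (the element ids and the expected option value are placeholders):

    // Reveal the "page 2" panel only when the right option is chosen.
    document.getElementById('gate-select').addEventListener('change', (e) => {
      const allowed = e.target.value === 'correct-option';
      document.getElementById('step2-panel').hidden = !allowed;
    });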
If this has to be a second page, you can also put that page in another directory and drop a web.config file in there that requires login. You can then "log in" every person who answers correctly. You end up using the Membership bits, but they are not hooked up to anything.
Cookies are not a good idea for this for one specific reason. They are under the control of the user, not you.
If a user has cookies disabled (globally or just for your site), they won't be able to get to page 2 no matter how many times they've read page 1.
In addition, if they know what your cookie contains (i.e. it's not encrypted), they can easily create it themselves, or forward the method to a friend so the friend can create it.
Regarding your question on whether you could send the page 2 link to someone else: cookies belong to the computer, so the "someone else" would almost certainly not have the correct cookie for properly viewing page 2; they'd get an error.
We implemented a similar scheme (many years ago so there may be better ways to do it now). It involved storing a special "one-time" key when delivering page 1 to an IP address. The links in that page 1 were modified to include this key as an argument so that, when you requested page 2, the key was sent through as well.
The keys had a 30-minute lifetime (configurable but we ended up at 30 minutes). In order for us to deliver a page 2, the request had to come from the same IP address and have the proper key.
This prevented forwarding of links to other places and ensured the links had limited lifetimes.
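A rough sketch of that kind of scheme (in Express-style JavaScript; the key format, in-memory storage, and routes are my assumptions, not the original implementation):

    const express = require('express');
    const crypto = require('crypto');
    const app = express();

    const keys = new Map();        // key -> { ip, expires }
    const TTL_MS = 30 * 60 * 1000; // 30-minute lifetime

    app.get('/page1', (req, res) => {
      const key = crypto.randomBytes(16).toString('hex');
      keys.set(key, { ip: req.ip, expires: Date.now() + TTL_MS });
      res.send(`<a href="/page2?key=${key}">Continue to page 2</a>`);
    });

    app.get('/page2', (req, res) => {
      const entry = keys.get(req.query.key);
      if (!entry || entry.ip !== req.ip || entry.expires < Date.now()) {
        return res.redirect('/page1'); // no key, wrong IP, or timed out: start over
      }
      keys.delete(req.query.key);      // one-time use
      res.send('Page 2 content');
    });

    app.listen(3000);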
Whether that's a viable solution for you is a question only you can answer. I know we got a few complaints from people who brought up page 1, then went out for a coffee. When they got back, their attempt to access page 2 was unsuccessful. We fixed this by simply redirecting them to page 1 with a suitable error message that their key had timed out.
Not perfect but, since the users were educated as to why it was happening, they understood its necessity.
If I understand your question correctly, then the link you send to your friend will not work, as they will not have the cookie stored in their browser memory or on their machine. The same would be true if you stored the value in Session, as they would create their own new session when they opened the link.
To get this kind of behaviour when sharing links, you will need to pass the value in a query string: when you select the desired option on page 1 and submit the form, the postback takes the selected option and then redirects to page 2 with the option appended to the URL as a query string value.
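As a sketch (again in Express-style JavaScript rather than ASP.NET; the routes, field name, and expected value are made up):

    const express = require('express');
    const app = express();
    app.use(express.urlencoded({ extended: false }));

    app.post('/page1', (req, res) => {
      // Carry the selected option across in the query string, not a cookie.
      res.redirect(`/page2?option=${encodeURIComponent(req.body.option)}`);
    });

    app.get('/page2', (req, res) => {
      if (req.query.option !== 'correct-option') return res.redirect('/page1');
      res.send('Page 2 content'); // the full link now works on any machine
    });

    app.listen(3000);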
