Sending a Google Analytics pageview is not logged as an active user

When I send the request from Fiddler or from .NET code, I see the counter in the Analytics Active Users report.
But when it is sent from a cloud machine, I get 200 OK and the counter does not change.
Possible problems:
The response is served from a cache
Results are filtered by region (I couldn't find any active filter)
A required header is missing
Request format: https://www.google-analytics.com/collect?t=pageview&v=1&dp={myDp}&tid={myTid}&cid={myCid}
The request was built based on this: https://developers.google.com/analytics/devguides/collection/protocol/v1/reference
Any more ideas?
Dekel

This can happen if you are looking at a view where Bot Filtering is enabled.
It happens because your call includes only a few parameters (for example, it is missing the screen resolution, host name, etc.), so it is interpreted as a bot.
Try viewing the data in a View that has no filter (or disable this option), or enrich the call with more parameters.
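As an illustration, here is a minimal sketch of an enriched hit using Java 11's built-in java.net.http.HttpClient; the tid and cid values are placeholders, and the extra dh (host name), sr (screen resolution), and ul (user language) parameters come from the Measurement Protocol reference linked above:

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class GaPageview {
    public static void main(String[] args) throws Exception {
        // Placeholder values - replace with your own property / client IDs.
        String tid = "UA-XXXXXXX-1";
        String cid = "555";                      // anonymous client ID
        String dp  = URLEncoder.encode("/home", StandardCharsets.UTF_8);

        // Extra parameters (dh, sr, ul) make the hit look less bot-like.
        String url = "https://www.google-analytics.com/collect"
                + "?v=1&t=pageview"
                + "&tid=" + tid
                + "&cid=" + cid
                + "&dp="  + dp
                + "&dh=example.com"              // document host name
                + "&sr=1920x1080"                // screen resolution
                + "&ul=en-us";                   // user language

        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                // A realistic User-Agent may also help avoid bot classification.
                .header("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
                .GET()
                .build();

        HttpResponse<Void> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.discarding());
        // Note: the endpoint returns 200 even for hits it later discards,
        // which matches the symptom described in the question.
        System.out.println("Status: " + response.statusCode());
    }
}
```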

Related

How to create an article in Drupal 8 via JMeter?

I am trying to create a JMeter test case for article creation in Drupal 8. I am able to add steps for the other navigation, but when I click the Create Article button after entering some values in the form fields, JMeter gets HTTP response 200 and the article is not created.
If I do the same steps in a browser, I get HTTP response 303 and the article is created successfully.
I found a dynamic ID "JJPKbuyIinQT5mQZ" in the request headers of the POST request sent when hitting the Create Article button. I suspect this might be the reason the Drupal server is not accepting the request, because I am not sure how this dynamic ID is generated.
Is it generated by the browser? If yes, how do I do the same in JMeter?
Is it generated by the server? If yes, I don't see this token in any previous request, unlike form_token.
This dynamic ID should be automatically generated by JMeter, provided you tick the "Use multipart/form-data for POST" box; it is the so-called multipart boundary.
Other things to be considered:
Don't forget to add an HTTP Cookie Manager, otherwise you will not even be able to perform a login
Correlate form_build_id and form_token. You can do this using the CSS/JQuery Extractor
Correlate changed. You can generate an epoch timestamp like 1532969982 using the __groovy() function: ${__groovy(Math.round(System.currentTimeMillis() / 1000),)}
Correlate created[0][value][date]. You can do this using the __time() function: ${__time(yyyy-MM-dd,)}
Correlate created[0][value][time]. You can do this using the same __time() function: ${__time(HH:mm:ss,)}
That's probably it; the other values should be fine to use as recorded.
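For reference, here is a plain-Java sketch of the values those functions produce; the form field names come from the question, and the date/time patterns mirror the JMeter functions above:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class DrupalFormValues {
    public static void main(String[] args) {
        // Equivalent of ${__groovy(Math.round(System.currentTimeMillis() / 1000),)}:
        // seconds since the epoch, e.g. 1532969982
        long changed = Math.round(System.currentTimeMillis() / 1000.0);

        LocalDateTime now = LocalDateTime.now();
        // Equivalent of ${__time(yyyy-MM-dd,)}
        String date = now.format(DateTimeFormatter.ofPattern("yyyy-MM-dd"));
        // Equivalent of ${__time(HH:mm:ss,)}
        String time = now.format(DateTimeFormatter.ofPattern("HH:mm:ss"));

        System.out.println("changed=" + changed);
        System.out.println("created[0][value][date]=" + date);
        System.out.println("created[0][value][time]=" + time);
    }
}
```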

Why am I not able to log in successfully to an ASP.NET web application when using JMeter?

I have to record a load test for a successful login and further browsing in the ASP.NET application.
After recording the script in JMeter, my samplers are: 1) a GET request (the login page), 2) a POST request (posting the credentials and clicking login), 3) other samplers (after a successful login).
My problem is that it shows an incorrect username/password error whenever I run the script (manually it works). I have parameterized valid credentials, and I also did correlation (by inspecting the POST request I worked out which fields were posted) of EventValidation, ViewStateGenerator, ViewState, and hdnkey from the GET response (sampler 1) into my POST request (sampler 2) and tried again, but I get the same error every time.
Please let me know what should be done to log in successfully, so I can run the load test on this ASP.NET application. I have come across lots of sites covering this issue but nothing solved it. Please help!
You can try to debug your JMeter script (look here and here). I would use the Debug Sampler as a first step to see variable values.
You can use a tool like Fiddler to record the requests that are made when you manually log in to the site. Then you can compare them with your JMeter script.
Make sure to add an HTTP Cookie Manager to your Test Plan
Make sure to correlate any dynamic parameters like ViewState, EventValidation, etc. (a sketch of this extraction follows below)
Run your test with 1 virtual user and 1 loop and inspect request and response details using the View Results Tree listener. Compare the requests you send with JMeter against what a real browser sends (visible in your browser's Developer Tools "Network" tab) - the requests should be the same (apart from the dynamic parameters)
Check out the ASP.NET Login Testing with JMeter article for an example of how to build the test plan.
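To show what that correlation has to capture, here is a minimal plain-Java sketch of pulling the hidden ASP.NET state fields out of a login-page response - the same job JMeter's Regular Expression or CSS/JQuery Extractor does for you (the HTML fragment is hypothetical):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class HiddenFieldExtractor {
    // ASP.NET renders its state as hidden inputs, e.g.:
    // <input type="hidden" name="__VIEWSTATE" id="__VIEWSTATE" value="..." />
    private static final Pattern HIDDEN = Pattern.compile(
            "name=\"(__VIEWSTATE|__VIEWSTATEGENERATOR|__EVENTVALIDATION)\"[^>]*value=\"([^\"]*)\"");

    public static void main(String[] args) {
        // Hypothetical login-page fragment standing in for the real GET response
        String html = "<input type=\"hidden\" name=\"__VIEWSTATE\" id=\"__VIEWSTATE\" value=\"/wEPDwUK...\" />"
                + "<input type=\"hidden\" name=\"__EVENTVALIDATION\" id=\"__EVENTVALIDATION\" value=\"/wEWAgK...\" />";

        Matcher m = HIDDEN.matcher(html);
        while (m.find()) {
            // These name/value pairs must be re-posted along with the credentials
            System.out.println(m.group(1) + " = " + m.group(2));
        }
    }
}
```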

Generate a GET Request with No User Agent

I have a website that has been experiencing errors because of null references due to poorly coded logic around the user agent. Basically, there has been a slew of incoming requests that contain no user agent, which leads to null reference exceptions in the user-agent tracking (it contained a call to Request.UserAgent.ToLower()). I am correcting this logic to avoid the error condition. Since I'm certain these requests are coming from specialized tools and not ordinary users, I'm also blocking empty user agents via URL rewrite rules.
I need to test both of these changes. However, I can't seem to find a user-agent spoofer that will let me generate a simple GET request with NO USER AGENT. All of the tools I have tried allow me to set a custom agent string, but they won't let that string be left empty, and there is no option I can find to tell them to send no user agent at all.
So my question is, what tools are available, for a Windows-based system, that I can use to emulate a browser request with NO USER AGENT so that I can verify that my changes are working properly?
I believe that value is coming from the request headers. If so, just try Fiddler. Go to the Composer tab: by default it adds a User-Agent header to the request, but when you delete it in the Composer it disappears from the request.
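If you'd rather script the check, a raw socket guarantees that no User-Agent header is sent, because you write the request line and headers yourself. A minimal sketch in Java (example.com stands in for your site; for an HTTPS site you would create the socket via SSLSocketFactory instead):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;

public class NoUserAgentGet {
    public static void main(String[] args) throws Exception {
        // Connect to the target host; only the headers written below are sent,
        // so there is no User-Agent header at all.
        try (Socket socket = new Socket("example.com", 80);
             PrintWriter out = new PrintWriter(socket.getOutputStream());
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(socket.getInputStream()))) {

            out.print("GET / HTTP/1.1\r\n");
            out.print("Host: example.com\r\n");
            out.print("Connection: close\r\n");
            out.print("\r\n");
            out.flush();

            // Dump the raw response so you can verify the rewrite rule fires
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
```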

ASP.NET form scraping not working

I'm trying to scrape some pages on a website that uses ASPX forms. The forms involve adding details of people by updating the server (one person at a time) and then proceeding to a results page that shows information regarding the specified people. There are 5 steps to the process:
Hit the login page (the site is HTTPS) by sending a POST request with my credentials. The response will contain cookies that will be used to validate all subsequent requests.
Hit the search criteria page by sending a GET request (no parameters). The only purpose of this is to discover the __VIEWSTATE and __EVENTVALIDATION tokens in the HTML response to be used in the next step.
Update the server with a person. This involves hitting the same webpage in step 2 but using a POST request with form parameters that correspond to the form controls on the page for adding person details and their values. The form parameters will include the __VIEWSTATE and __EVENTVALIDATION tokens gained from the previous step. The server response will include a new __VIEWSTATE and __EVENTVALIDATION. This step can be repeated using the new __VIEWSTATE and __EVENTVALIDATION, or can proceed to the next step.
Signal to the server that all people have been added. This involves hitting the same page as the previous 2 steps by sending a POST request with form parameters that correspond to the form controls on the page for signalling that all people have been added. The server response will simply be 25|pageRedirect||/path/to/results.aspx|.
Hit the search results page specified in the redirect response from the previous step by sending a GET request (no parameters - cookies are enough). The server response will be the HTML that I need to scrape.
If I follow the process manually with any browser, filling in the form controls and clicking the buttons etc. (testing with just one person), I get to the results page and the results are fine. If I do this programmatically from an application running on my machine, then ultimately the search results HTML is wrong (the page returns valid HTML, but there are no results compared with the browser version, and there are null values where there should not be).
I've run this using a Java application with Apache HttpClient handling the requests. I've also tried it using a Ruby script with Mechanize handling the requests. I've setup a proxy server using Charles to intercept and examine all 5 HTTPS requests. Using Charles, I've scrutinized the raw requests (headers and body) and made comparisons between requests made using a browser and requests made using the application(s). They are all identical (except for the VIEWSTATE / EVENTVALIDATION values and session cookie values, which I would expect to differ).
A few additional points about the programmatic attempts:
The login step returns successful data, and the cookies are valid (otherwise the subsequent requests would all fail)
Updating the server with a person (step 3) returns successful responses, in that they are the same as would be returned from interaction using a browser. I can only assume this must mean the server is updating successfully with the person added.
A custom header, X-MicrosoftAjax: Delta=true, is being added to the requests in step 3 (just as the browser requests do)
I don't own or have access to the server I'm scraping
Given that my application requests are identical to the browser requests that succeed, it baffles me that the server is treating them differently somehow. I can't help but feel that this is an ASP.NET forms issue that I'm overlooking. I'd appreciate any help.
Update:
I went over the raw requests again a bit more methodically, and it turns out I was missing something in the form parameters of the requests. Unfortunately, I don't think it will be of much use to anyone else, because it seems to be specific to this particular ASP server's logic.
The POST request that notifies the server that all people have been added (step 4) requires two form parameters specifying the county and address of the last person that was added to the search. I was including these form parameters in my request, but the values were empty strings. I figured the browser request was just snagging these values because when the user hits the Continue button on the form, those controls would have the values of the last person added. I figured they wouldn't matter and forgot about them, but I was wrong.
It's a peculiar issue that I should have caught the first time. I can't complain though, I am scraping a site after all.
Review the Charles logs again. It is possible that the search results and other content are coming over via Ajax, and that your Java/Ruby apps are not actually performing all of the requests/responses that happen with the browser. Look for any POST or GET requests in between the requests you are already duplicating. If the search results are populated via JavaScript, your client app may not be able to handle this.

HTTPClient to simulate form submission on ASPX - Invalid viewstate

I am trying to simulate a form submission on an ASP.NET site.
The flow of the website when accessed in a browser is as follows:
1) In a browser the user visits http://mysite.com/ which is configured with Basic Authentication
2) Upon correct credentials, the user is shown a form with one input text box and a button (URL stays http://mysite.com/ but the form being served is Default.aspx)
3) User enters some text and presses submit...
4) The page reloads... URL is still http://mysite.com/... but there is a timer which triggers after 10 secs and downloads a file from http://mysite.com/Downloader
I am trying to simulate this flow in my program using HTTPClient.
1) Do a GET on http://mysite.com
2) Extract hidden form fields __EVENTVALIDATION and __VIEWSTATE
3) Create a POST request with the above two values and the other form fields, and POST it to http://mysite.com. This results in an Invalid Viewstate exception.
How do I achieve this in HTTPClient?
The usual way to do this is as follows: First, record the HTTP traffic using Wireshark or Fiddler while you are using the website from the browser. Second, analyze the packet trace in detail, and collect every HTTP header and every HTTP payload from every GET and POST message sent by the browser. Third, try to send the same messages from your code. After sending an HTTP request, you will have to analyze the response of the server and extract all the pieces of data you need to insert into the next request. Don't forget to set the Referer field, for example. Add each request to your code one by one, and record the traffic when you run the code. If you assemble your HTTP requests correctly, your request packets should look like the requests of the browser.
I'm in the same scenario; I have to create a POST request to an external ASPX page.
I have captured the traffic using Fiddler and tried to simulate the call using an online POST request tool like https://www.codepunker.com
I have not been able to recreate the request...
In my opinion (and this requires time) we have to:
Create a basic web request to the source form
Collect all the form elements with their values
Create a POST request submitting all the elements, including the VIEWSTATE
NOTE: it may be that you need to use a web client that accepts cookies; check:
Accept Cookies in WebClient?
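Putting those steps together, here is a minimal sketch using Java 11's java.net.http.HttpClient with a CookieManager; the URL, the txtInput field name, and its value are placeholders, and the site's Basic Authentication (not shown) could be supplied via HttpClient.Builder.authenticator(...) or an Authorization header:

```java
import java.net.CookieManager;
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class AspxFormPost {
    public static void main(String[] args) throws Exception {
        // The cookie manager keeps the ASP.NET session cookie between requests
        HttpClient client = HttpClient.newBuilder()
                .cookieHandler(new CookieManager())
                .build();

        String url = "http://mysite.com/";  // placeholder target

        // Step 1: GET the form page to obtain the current state tokens
        String html = client.send(
                HttpRequest.newBuilder(URI.create(url)).GET().build(),
                HttpResponse.BodyHandlers.ofString()).body();

        String viewState = extract(html, "__VIEWSTATE");
        String eventValidation = extract(html, "__EVENTVALIDATION");

        // Step 2: POST the form fields back, including the tokens unchanged
        String body = "__VIEWSTATE=" + URLEncoder.encode(viewState, StandardCharsets.UTF_8)
                + "&__EVENTVALIDATION=" + URLEncoder.encode(eventValidation, StandardCharsets.UTF_8)
                + "&txtInput=" + URLEncoder.encode("some text", StandardCharsets.UTF_8); // placeholder field

        HttpResponse<String> response = client.send(
                HttpRequest.newBuilder(URI.create(url))
                        .header("Content-Type", "application/x-www-form-urlencoded")
                        .POST(HttpRequest.BodyPublishers.ofString(body))
                        .build(),
                HttpResponse.BodyHandlers.ofString());

        System.out.println("Status: " + response.statusCode());
    }

    // Pull a hidden input's value out of the page (a simple regex is enough here)
    private static String extract(String html, String name) {
        Matcher m = Pattern.compile("name=\"" + name + "\"[^>]*value=\"([^\"]*)\"")
                .matcher(html);
        return m.find() ? m.group(1) : "";
    }
}
```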
Good luck
