Emulating a user browsing session for unit tests - HTTP

I'm searching for a framework that would allow me to emulate a user browsing session.
A typical session looks like:
Browse to home page, get session
Be redirected to current page
Click on some link
Get connected
Submit a form
and so on...
I would like to be able to define this session using API calls.
What frameworks would you recommend for this setup? It should run headless (not inside a browser), so that it can be executed via Hudson.
Language does not matter; Python or Java would be great.
Thank you,
Maxim.

There are multiple frameworks which can do this. Check out:
https://github.com/axefrog/XBrowser
http://htmlunit.sourceforge.net/
and the answer to this question:
Alternative to HtmlUnit

Have a look at HtmlUnit.
It's even got decent JavaScript support, and it's Java-based; a minimal usage sketch follows the feature list below.
Support for the HTTP and HTTPS protocols
Support for cookies
Ability to specify whether failing responses from the server should throw exceptions or should be returned as pages of the appropriate type (based on content type)
Support for submit methods POST and GET (as well as HEAD, DELETE, ...)
Ability to customize the request headers being sent to the server
Support for HTML responses
Wrapper for HTML pages that provides easy access to all information contained inside them
Support for submitting forms
Support for clicking links
Support for walking the DOM model of the HTML document
Proxy server support
Support for basic and NTLM authentication
Excellent JavaScript support
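For illustration, here is a minimal HtmlUnit sketch of the session described in the question (load the home page, follow a link, fill in and submit a form). The URL, anchor text, and form/field names are placeholders, and the exact API can vary slightly between HtmlUnit versions (older releases use closeAllWindows() instead of close()):

```java
import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.html.HtmlAnchor;
import com.gargoylesoftware.htmlunit.html.HtmlForm;
import com.gargoylesoftware.htmlunit.html.HtmlPage;

public class BrowsingSessionTest {
    public static void main(String[] args) throws Exception {
        // Cookies are kept per WebClient instance, so the session survives across steps.
        try (WebClient webClient = new WebClient()) {
            // Browse to the home page (placeholder URL).
            HtmlPage home = webClient.getPage("http://example.com/");

            // Click on some link by its anchor text (placeholder text).
            HtmlAnchor loginLink = home.getAnchorByText("Log in");
            HtmlPage loginPage = loginLink.click();

            // Fill in and submit a form (form and field names are placeholders).
            HtmlForm form = loginPage.getFormByName("login");
            form.getInputByName("username").setValueAttribute("maxim");
            form.getInputByName("password").setValueAttribute("secret");
            HtmlPage result = form.getInputByName("submit").click();

            System.out.println(result.getTitleText());
        }
    }
}
```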

Take a look at Selenium WebDriver with Xvfb.
This post shows an example in Python:
'Python - Headless Selenium WebDriver Tests using PyVirtualDisplay'
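The linked post uses Python with PyVirtualDisplay; as a rough sketch of the same approach in Java, you can start Xvfb yourself (for example `Xvfb :99 &` with `DISPLAY=:99` exported before launching the JVM) and drive a normal browser through Selenium. The URL, link text, and field names below are placeholders:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class HeadlessSessionTest {
    public static void main(String[] args) {
        // Assumes Xvfb is already running and DISPLAY points at it, e.g. DISPLAY=:99.
        WebDriver driver = new FirefoxDriver();
        try {
            driver.get("http://example.com/");                  // browse to the home page
            driver.findElement(By.linkText("Log in")).click();  // click on some link
            driver.findElement(By.name("username")).sendKeys("maxim");
            driver.findElement(By.name("password")).sendKeys("secret");
            driver.findElement(By.name("submit")).submit();     // submit the form
            System.out.println(driver.getTitle());
        } finally {
            driver.quit();
        }
    }
}
```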

Related

How can I test accessibility using Lighthouse on a single page application

A colleague showed me Lighthouse on the Chrome browser. I have a single page application (SPA), and I'm able to run it against the base URL of my application.
However, all subsequent screens are rendered by client-side JavaScript without a change to the URL in the browser.
How can I test the rest of my site?
The only way is to switch the project to server-side rendering, e.g. Angular Universal.
Update: found this issue: https://github.com/GoogleChrome/lighthouse/issues/5187
If you can't switch your project to server-side rendering as Alexey Semerenko said, you can consider using another testing tool like Sitespeed.io, which allows you to define user actions as a script that browses every page of your application.
After your test you will gather the same kind of metrics.

JavaFX WebView, reCAPTCHA won't work (unsupported browser)

I'm trying to code a program with a WebView that includes reCAPTCHA by Google. When I load the web page, it says that my browser doesn't support reCAPTCHA. Is there any way to fix this with a method or something?
Thanks!
Override the user agent string using WebEngine.setUserAgent("use required / intended UA string");
There are a lot of issues with this:
In order to use a ReCaptcha, you need to have an HTTP/HTTPS environment, so you cannot serve the local file statically from WebView. This means that you'll have to either host your page online or on localhost and then open that within WebView.
As Priyanka mentioned, you can also override the user agent string, but the WebView browser still doesn't have all the JS libraries that you need (as of 2020, the "click all squares that have traffic lights" challenge isn't functional).
I would probably recommend opening the auth page in the user's local browser and then using some kind of callback function.
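If you do go the user-agent route that Priyanka suggested, a minimal JavaFX sketch looks like the following. The UA string is just an example of a mainstream-browser string and the URL is a placeholder; as noted above, this alone is not guaranteed to make every reCAPTCHA challenge work:

```java
import javafx.application.Application;
import javafx.scene.Scene;
import javafx.scene.web.WebEngine;
import javafx.scene.web.WebView;
import javafx.stage.Stage;

public class CaptchaBrowser extends Application {
    @Override
    public void start(Stage stage) {
        WebView webView = new WebView();
        WebEngine engine = webView.getEngine();

        // Pretend to be a mainstream browser; example UA string, adjust as needed.
        engine.setUserAgent("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                + "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0 Safari/537.36");

        // Must be an http/https page, not a local file, for reCAPTCHA to load.
        engine.load("https://example.com/page-with-recaptcha");  // placeholder URL

        stage.setScene(new Scene(webView, 800, 600));
        stage.show();
    }

    public static void main(String[] args) {
        launch(args);
    }
}
```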

Simulating a crawler on my website

I need to debug my web app, which is written in ASP.NET, to find out how it behaves when rendering content for crawlers like Googlebot. The first thing I found was some online/offline tools, but none of them can pass the Request.Browser.IsCrawler flag.
Then I tried to simulate a handmade request adding the Googlebot user agent, but still no luck.
I used Telerik Fiddler and Chrome while setting the User-Agent to Googlebot/2.1 (+http://www.googlebot.com/bot.html), including _escaped_fragment_ in the URI, and successfully saw the page from the crawler's perspective.
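Outside of Fiddler, the same check can be scripted by sending the request with the crawler's User-Agent header yourself. A rough Java sketch with a placeholder URL (the original setup is ASP.NET, so treat this purely as an illustration of the header trick):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class CrawlerViewFetcher {
    public static void main(String[] args) throws Exception {
        // Placeholder URL; append _escaped_fragment_= if your app relies on it.
        URL url = new URL("http://localhost/mypage?_escaped_fragment_=");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();

        // Present the request as Googlebot so the server renders the crawler view.
        conn.setRequestProperty("User-Agent",
                "Googlebot/2.1 (+http://www.googlebot.com/bot.html)");

        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
```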

Legitimate cross-site communication

I am building a website, within a large intranet, that wraps and adds functionality to another site within the same intranet. I do not have access to the other site's source, and they do not provide any APIs for the functionality they offer. I need to, somehow, have my server-side code go to that site, fill in some forms, then press a submit button.
Is this possible? If so, how can I accomplish this?
Note: I am working in ASP.NET, if that matters at all.
Not the most efficient, but maybe WatiN can get you started:
http://watin.sourceforge.net/
Just look at the URL the form is supposed to submit to and the method it employs (POST or GET), then send a request to that URL using the same method, passing the fields you want as parameters.
Your server-side code is basically a web client to the other web site. You will need to write the code to send the HTML form data to the other web site and process the response. I would start with the System.Net.WebClient class. Take a look at System.Net.WebClient.UploadValues. That class/method will enable you to POST the form data to the web site via a NameValueCollection.
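The answer above refers to .NET's System.Net.WebClient; purely as an illustration of the same idea of posting the form fields directly to the form's action URL, here is a rough sketch in Java with placeholder URL and field names:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public class FormSubmitter {
    public static void main(String[] args) throws Exception {
        // Placeholder: the action URL taken from the target site's <form> tag.
        URL action = new URL("http://intranet-site/form/submit");

        // Field names must match the name attributes of the form's inputs.
        String body = "field1=" + URLEncoder.encode("value1", "UTF-8")
                + "&field2=" + URLEncoder.encode("value2", "UTF-8");

        HttpURLConnection conn = (HttpURLConnection) action.openConnection();
        conn.setRequestMethod("POST");                     // same method the form uses
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");

        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes("UTF-8"));
        }

        System.out.println("Response code: " + conn.getResponseCode());
    }
}
```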

What is the .MSPX file extension?

I've noticed a lot of Microsoft sites have the *.MSPX extension. While I'm very familiar with ASP.NET, I've not seen this extension before.
Does anyone know what this identifies?
A few internet searches led me to http://www.microsoft.com/backstage/bkst_column_46.mspx, but it was a dead link. Fortunately, it was archived on the Wayback Machine and you can read it here:
http://web.archive.org/web/20040803120105/http://www.microsoft.com/backstage/bkst_column_46.mspx
The .MSPX extension is part of the "Microsoft Network Project," which, according to the article above, is designed to give Microsoft's sites a consistent look-and-feel worldwide, as well as keep the design of the site separate from the content. Here's the gist of the article:
The presentation framework includes a custom Web handler built in ASP.NET. Pages that use the presentation framework have the .mspx filename extension, which is registered in Microsoft Internet Information Services (IIS) on the Web servers. When one of the Microsoft.com Web servers receives a request for an .mspx page, this custom Web handler intercepts that call and passes it to the framework for processing.
The framework first checks to see whether the result is cached. If it is, the page is rendered immediately. If the page is not cached, the handler looks up the URL for that page in the table of contents provided by the site owner (see below) to determine where the XML content for the page is stored. The framework then checks to see if the XML is cached, and either returns the cached content or retrieves the XML from the data store identified in the table of contents file.
Within the file that holds the content for the page, XML tags identify the content template to be used. The framework retrieves the appropriate template and uses a series of XSLTs to assemble the page, including the masthead, the footer, and the primary navigational column, finally rendering the content within the content pane.
I think it's an XML-based template system that outputs HTML. I think it's internal to MS only.
Well, a little googling found this:
"The presentation framework includes a custom Web handler built in ASP.NET. Pages that use the presentation framework have the .mspx filename extension, which is registered in Microsoft Internet Information Services (IIS) on the Web servers. When one of the Microsoft.com Web servers receives a request for an .mspx page, this custom Web handler intercepts that call and passes it to the framework for processing."
I'd like to find out more info though.
I love you guys; I was also asking myself many times why MS uses .mspx and what it is at all! :)
At the time I couldn't find any information quickly and assumed it would just be something on top of ASP.NET, or maybe not even that, since you should be able to map the same ASP.NET DLL to .mspx easily too ;)
But, of course, it could be anything, even a "special" CGI of its own (completely apart from ASP.NET) that processes the request with much better caching, easier editing, and so on.
The end of the story was that I came across the view that maybe it's not important to know what .mspx exactly is :)
