I want to put a check on response time in page loading when my test runs through multiple pages.
If a page doesn't load within 2 seconds, test should fail.
I am using Robot Framework with Selenium2Library, and every time I use the command Wait Until Page Contains    text    2s, the browser waits for the page to load completely before running the command, which defeats the purpose.
Is it possible to put a timeout on page loading in Robot Framework?
The default Selenium behavior is to wait for the page to be loaded before returning control (with the exception of AJAX calls the page's JS may still be making). So with a vanilla Go To navigation call, execution continues only after the load has happened - and thus the Wait Until ... passes almost immediately.
Selenium supports overriding this behavior through settings in the desired_capabilities, but that can be a bit involved (for Firefox, for example, the setting is called "pageLoadStrategy", with the values none/eager/normal).
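For completeness, the capabilities route would look roughly like this (a sketch, assuming your Selenium2Library version passes desired_capabilities through as comma-separated key:value pairs; with pageLoadStrategy "none" the navigation returns immediately, so a Wait Until ... with a 2s timeout really enforces your budget):
Open Browser    https://your-url    firefox    desired_capabilities=pageLoadStrategy:none
Go To    https://your-url
Wait Until Page Contains    expected text    2s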
Here's something much easier though - just use a timer: take timestamps before and after the navigation, and the difference is the full page load time.
${before}=    Get Current Date    result_format=epoch
Go To    https://your-url
${after}=    Get Current Date    result_format=epoch
Should Be True    ${after} - ${before} < 2    msg=The total page load time was more than 2 seconds!
The keyword Get Current Date is in the DateTime standard library, and when called with the argument "result_format=epoch" it returns a float (the seconds since 1970) - the fractional part is the milliseconds.
By subtracting the two values you get the full page load time.
I am trying to use a loop, e.g. asLongAs(), in Gatling, but I can't find much on Google about how to use it.
My scenario is to open an HTML page that takes some time to load; once the report has loaded, there is a CSS selector I can check for in the source code.
My code looks like this:
exec(http("ABC -${ID} - Id -${ID2}")
  .get("web/a/b/c/")
  .check(css(".abc").saveAs("URL")))
  .exec(session => {
    val response = session("URL").as[String]
    println(s"url is: \n$response")
    session
  })
exec(http("Open the redirected report - ${ID1} Id-${ID2}")
  .get(session => session("URL").as[String])
  // some checks
  .check(css(".Image").exists))
I want to loop until css(".Image") has loaded. When the URL is first hit, this CSS selector doesn't appear yet; it takes time to load, and it is exactly that time I want to measure.
Have you tried the official documentation? It has samples for Java, Scala and Kotlin.
https://gatling.io/docs/gatling/reference/current/core/scenario/#aslongas
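For the polling use case in the question, an asLongAs loop could look roughly like this (a sketch in the Scala DSL; the imageLoaded and imageCount session keys are invented names, and the 1-second pause between attempts is an assumption):
exec(session => session.set("imageLoaded", false))
  .asLongAs(session => !session("imageLoaded").as[Boolean]) {
    exec(http("Poll report - ${ID1} Id-${ID2}")
      .get(session => session("URL").as[String])
      // a count check never fails, it just records how many matches were found
      .check(css(".Image").count.saveAs("imageCount")))
      .exec(session => session.set("imageLoaded", session("imageCount").as[Int] > 0))
      .pause(1)
  }
Each iteration shows up in the Gatling report under the "Poll report" request name, so you can read off how long the image took to appear.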
I have an ASP.Net application that accesses user data from a SQL database.
Visual Studio Version 2012
Windows Server 2012 Standard 6.2
Sql Server 2012
Program in service since 11/2007 (the problem has never happened previously)
Problem:
The problem was first reported by 2 of my customers, but I was not experiencing it until after a recent MS update.
I am unsure of the particulars of those updates, or whether it was only a coincidence.
I log into the application and move around a few pages; all seems OK. Then I select a new Active Company (list screens are auto-filtered by the Active Company ID held in a session variable; changing the active company changes the ID stored in that variable). Everything works fine for a while (1-4 mins), switching between screens and even different active companies. Then at some point I go to a page I've visited several times (and that worked fine) and it shows everything from the last time I accessed it (literally the identical page from a few minutes ago). I change to another page and it appears to be updated; I go back to the screen that did not update and, no matter what, it will not update again. I query the database and it indicates the correct active company ID, and I query the session variable and that too is correct.
** The strange thing is I can wait 4-5 mins (I just stop doing anything) and then try to access the page again, and now it updates.
I have been beating at this for almost 2 weeks now and have not been able to determine the source of the problem.
I have literally tried every setting for session caching I could read up on, with no (or minimal) effect.
Since our software uses session variables to hold the user settings that control their environment (like the active company selection), I even went as far as removing the session variables and switching to profile variables (which requires SQL session management), with minimal effect.
It seems to work fine for a few minutes (or page accesses); then, once it stops updating the page, the page will no longer update under any circumstance.
It occurs with pretty much any combination of page changes (after changing the active company, since that actually changes the data displayed).
This design has been out in the field for over 8 years now (and is routinely brought up to date with the latest .NET compiler, .NET Framework and IronSpeed Designer engine updates). This error has never occurred before now, and no update to the development tools took place prior to the appearance of this issue.
I tried various tests.
Test 1:
I added JavaScript code to reset each page.
<script type="text/javascript">
    function RefreshPage() {
        window.location.reload();
    }
</script>
Result: No change
Test 2:
I stopped as soon as the page did not refresh and started timing how long it took the page to update (1-2 mins, or going back and forth between the change-active-company screen and the reports screen several times).
Result:
After 60-90 secs, the current page seemed to do an update (the activity icon would appear then go away), so I would then check the page that was not refreshing, and it was now correct.
Since I was using the report page for my tests, I would run a report when the screen update failed, to see what active company it thought it was on (since the report was also reliant on the session variable, it was bringing up the correct report data, even though the page was not indicating the correct active company). Note: every one of our screens indicates the current user and active company name at the top, so it is easy to see when a page is not updating.
Any direction as to where to look from here would be greatly appreciated; I'm at a loss as to what to check now.
P.S. I installed MS Message Analyzer and had it monitor up to the point where I got a failure. I have never used MS MA before, so I don't have much of an idea of what to look for, other than that the operation status was indicating Found (302) for the GET and POST, and OK (200) for the page on which I got the problem.
Thanks in advance!
John R
I propose checking the caching options: caching of the page, controls, JavaScript, and the browser. As a workaround, I propose adding an extra parameter to your page and AJAX calls. For example, instead of opening "default.aspx", open "default.aspx?id=someNewGuid". Also consider adding some random parameters to your AJAX calls.
Try the following code for the refresh:
<script type="text/javascript">
    function S4() {
        return (((1 + Math.random()) * 0x10000) | 0).toString(16).substring(1);
    }
    function guid() {
        // Build a v4-style GUID out of random hex segments
        return (S4() + S4() + "-" + S4() + "-4" + S4().substr(0, 3) + "-" + S4() + "-" + S4() + S4() + S4()).toLowerCase();
    }
    function RefreshPage() {
        var url = window.location.href;
        if (url.indexOf("?") > -1) {
            url = url.substr(0, url.indexOf("?")); // cut off any existing parameters
        }
        // Navigating to the new unique URL forces a reload that bypasses the cache
        window.location = url + "?id=" + guid();
    }
</script>
Imagine that you click on an element using RSelenium on a page and would like to retrieve the results from the resulting page. How does one check to make sure that the resulting page has loaded? I can insert Sys.sleep() in between processing the page and clicking the element but this seems like a very ugly and slow way to do things.
Set ImplicitWaitTimeout and then search for an element on the page. From ?remoteDriver
setImplicitWaitTimeout(milliseconds = 10000)
Set the amount of time the driver should wait when searching for elements. When searching for a single element, the driver will poll the page until an element is found or the timeout expires, whichever occurs first. When searching for multiple elements, the driver should poll the page until at least one element is found or the timeout expires, at which point it will return an empty list. If this method is never called, the driver will default to an implicit wait of 0 ms.
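In RSelenium that looks roughly like this (a sketch; "#results" is an invented selector standing in for an element that only exists once the new page has rendered):
# Poll for up to 10 seconds while searching for elements
remDr$setImplicitWaitTimeout(milliseconds = 10000)
# Returns as soon as the element appears, or errors when the timeout expires
webElem <- remDr$findElement(using = "css selector", "#results")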
In the RSelenium reference manual (http://cran.r-project.org/web/packages/RSelenium/RSelenium.pdf), you will find the method setTimeout() for the remoteDriver class:
setTimeout(type = "page load", milliseconds = 10000)
Configure the amount of time that a particular type of operation can execute for before they are aborted and a |Timeout| error is returned to the client.
type: The type of operation to set the timeout for. Valid values are: "script" for script timeouts, "implicit" for modifying the implicit wait timeout and "page load" for setting a page load timeout. Defaults to "page load"
milliseconds: The amount of time, in milliseconds, that time-limited commands are permitted to run. Defaults to 10000 milliseconds.
This seems to suggest that calling remDr$setTimeout() before remDr$navigate("...") would make the navigation wait for the page to load, or return a timeout error after 10 seconds.
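A minimal sketch of that approach (the URL is a placeholder, and this assumes the timeout surfaces as an R error you can catch):
remDr$setTimeout(type = "page load", milliseconds = 10000)
loaded <- tryCatch({
  remDr$navigate("http://your-url")
  TRUE
}, error = function(e) FALSE)  # FALSE if the page took longer than 10 seconds to load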
You can also try this code, which asks the browser whether the page has loaded or not:
JavascriptExecutor objExecutor = (JavascriptExecutor) objDriver;
// Poll once per second until the document reports it is fully loaded
while (!objExecutor.executeScript("return document.readyState")
        .toString().equalsIgnoreCase("complete")) {
    Thread.sleep(1000);
}
You can simply put it in your base page so you won't need to write it in every page object. I have never tried it with AJAX-enabled sites, but this might help you, and it removes the scenario-specific wait dependency.
I have a really strange problem and I'm completely puzzled.
I have a piece of code that parses some data and stores the result in our webserver's HttpRuntime.Cache using the Insert method. This is stored for 10 seconds. There seemed to be some problems, so I created a test page that retrieves a simple object from the cache and displays whether it is null or not. To add the object to the cache, I use:
HttpRuntime.Cache.Insert(CHECK_KEY, new object(), null, DateTime.Now.AddSeconds(10), System.Web.Caching.Cache.NoSlidingExpiration);
In a test page, I try to retrieve the object:
var isInCache = this.cacheService.Get<object>(CHECK_KEY) != null;
and the method of the cacheService is:
public T Get<T>(string key)
{
return (T)HttpRuntime.Cache.Get(key);
}
Now the strange part. If I call a URL that calls the 'Insert' method and then go to my test page to retrieve the object, the value of isInCache is false about 99% of the time. Sometimes it works correctly for the whole 10 seconds (e.g. I refresh my test page every second and I get true 10 times), but again, most of the time it just returns false.
Now, when I keep F5 pressed, I sometimes see true in my output, in the blink of an eye, which means that the key CAN be found! This is not browser caching, because it will only flash true intermittently during the 10 seconds of cache duration, after which it will only display false (which is logical, since the key has expired). So my question is:
WHY will retrieving a simple object from the cache fail most of the time?
There are other items in the cache (on the same test page) that do get retrieved, just not that object.
To make things worse, this (of course!) works flawlessly on my local machine and on the test machine, just not in production. I'm pretty clueless. Please help :-)
EDIT:
OK, so I'm now testing in two different browsers, IE9 and Chrome... and IE9 is correctly showing the items in the HttpRuntime.Cache but Chrome is NOT. It always shows false and no other cached data, except when keeping F5 pressed it will occasionally show it. Since when is HttpRuntime.Cache browser dependent???
Extra edit: IE9 shows no more cached data. So while it can differ across browsers, it's not that IE will always work and Chrome not... it differs.
EDIT2:
So I'm passing the variables to my view using ViewData:
ViewData["machineName"] = machineName;
ViewData["isInCache"] = isInCache;
ViewData["A"] = A;
ViewData["B"] = B;
machineName comes from Server.MachineName; isInCache is the object; variable A does not come from the HttpRuntime.Cache; variable B does, and it is also intermittently missing.
After much debugging and thought, it turned out that the hosting provider had set 'Maximum Worker Processes' in the IIS 7 application pool settings to a value larger than 1. The HttpRuntime.Cache is not shared in a web garden, so it could well be that I hit the 'wrong' worker process, which did not have the object cached. Continuously pressing F5 would have me occasionally hit the process that did have the value cached.
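A quick way to confirm this kind of web-garden behaviour (a diagnostic sketch; "workerPid" is an invented key) is to print the worker process ID on the test page next to the cached value:
// If workerPid changes between refreshes while isInCache flips between
// true and false, requests are being served by different worker processes.
ViewData["workerPid"] = System.Diagnostics.Process.GetCurrentProcess().Id;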
Hi guys!
I'm developing an online auction with time limit.
The ending time applies only to the one currently open auction.
After logging into the site I show the time left for the open auction. The time is calculated this way:
EndDateTime = date and time of the end of the auction
DateTime.Now = current date and time
timeLeft = (EndDateTime - DateTime.Now).TotalSeconds
In JavaScript, I then decrement the time left every second:
timeLeft = timeLeft - 1
The problem is that when I log in from different browsers at the same time, the browsers show different countdowns.
Help me, please!
I guess there will always be differences of a few seconds because of the server processing time and the time needed to download the page.
The best way would be to actually send the end time to the browser and calculate the time remaining in JavaScript. That way the times should be the same (on the same machine, of course).
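Roughly like this (a sketch; endTimeMs would be written into the page by the server, and the "countdown" element id is invented):
<script type="text/javascript">
    var endTimeMs = 1285933128000; // server-rendered auction end time, in ms since epoch
    function updateCountdown() {
        // Recompute from the absolute end time instead of decrementing a counter
        var secondsLeft = Math.max(0, Math.round((endTimeMs - new Date().getTime()) / 1000));
        document.getElementById("countdown").innerHTML = secondsLeft + " seconds left";
    }
    setInterval(updateCountdown, 1000);
</script>
This also self-corrects if a setInterval tick is delayed, since each update derives the remainder from the clock rather than from the previous value.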
Roman,
I had a little look at eBay (they know a thing or two about this stuff :)) and noticed that once the item is inside the last 90 seconds, a GET request is fired every 2 seconds to update the variables in the JavaScript via a JSON response. You can look at this in Firebug/Fiddler to see what it does.
Here is an example of the JSON it pulls down:
{
"ViewItemLiteResponse":{
"Item":[
{
"IsRefreshPage":false,
"ViewerItemRelation":"NONE",
"EndDate":{
"Time":"12:38:48 BST",
"Date":"01 Oct, 2010"
},
"LastModifiedDate":1285932821000,
"CurrentPrice":{
"CleanAmount":"23.00",
"Amount":23,
"MoneyStandard":"£23.00",
"CurrencyCode":"GBP"
},
"IsEnded":false,
"AccessedDate":1285933031000,
"BidCount":4,
"MinimumToBid":{
"CleanAmount":"24.00",
"Amount":24,
"MoneyStandard":"£24.00",
"CurrencyCode":"GBP"
},
"TimeLeft":{
"SecondsLeft":37,
"MinutesLeft":1,
"HoursLeft":0,
"DaysLeft":0
},
"Id":160485015499,
"IsFinalized":false,
"ViewerItemRelationId":0,
"IsAutoRefreshEnabled":true
}
]
}
}
You could do something similar inside your code.
[edit] - On looking further at the eBay code: although it only runs the intensive GET requests in the last 90 seconds, the same JSON as above is also embedded when the page is initially loaded. Then, at around 3 minutes remaining, the GET request is run every 10 seconds. I therefore assume the same JavaScript is run against that structure whether it is inside the last 90 seconds or not.
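You could sketch the same pattern like this (the /auction/timeleft endpoint and the "countdown" element are invented; the JSON shape mirrors the eBay sample above):
<script type="text/javascript">
    function pollAuction() {
        var xhr = new XMLHttpRequest();
        xhr.open("GET", "/auction/timeleft?id=160485015499", true);
        xhr.onload = function () {
            var t = JSON.parse(xhr.responseText).ViewItemLiteResponse.Item[0].TimeLeft;
            document.getElementById("countdown").innerHTML =
                t.DaysLeft + "d " + t.HoursLeft + "h " + t.MinutesLeft + "m " + t.SecondsLeft + "s";
        };
        xhr.send();
    }
    setInterval(pollAuction, 2000); // every 2 seconds, as eBay does inside the last 90 seconds
</script>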
This may be a problem with the JavaScript loading at different speeds, or with setInterval triggering at slightly different times depending on the loop. I would look into those two.