Capture all browser URLs navigated in WebDriver - webdriver

I want to print all the browser URLs navigated during a test in WebDriver.
For example: the reference site is m.rechargeitnow.com and I made a transaction fail. The task is to capture every browser URL navigated during the complete transaction.

You'd need to build a list as you perform each step. Something like:
urls = []
urls.append(driver.current_url)  # starting URL
# do step one
urls.append(driver.current_url)  # URL following step 1
# do step two
urls.append(driver.current_url)  # URL following step 2
# do all remaining steps
print(urls)
If you only want to see that list of URLs in the case of a test failure or error, wrap the steps in a try block.
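A minimal sketch of that pattern, assuming a Selenium driver set up with Chrome and hypothetical do_step_one/do_step_two functions standing in for your own test steps:

from selenium import webdriver

driver = webdriver.Chrome()
urls = []
try:
    driver.get("http://m.rechargeitnow.com")   # starting page
    urls.append(driver.current_url)            # starting URL

    do_step_one(driver)                        # hypothetical test step
    urls.append(driver.current_url)            # URL following step 1

    do_step_two(driver)                        # hypothetical test step
    urls.append(driver.current_url)            # URL following step 2
except Exception:
    # print the navigation history only when a step fails
    print("URLs visited before the failure:")
    for url in urls:
        print(url)
    raise
finally:
    driver.quit()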

Related

Understand Dynamic Links Firebase

I would like to better understand Firebase Dynamic Links because I am very new to this subject.
What I would like to know:
Is FirebaseDynamicLinks.instance.getInitialLink() supposed to return "only" the last dynamic link created with the "initial" URL (before it was shortened)?
And why doesn't FirebaseDynamicLinks.instance.getInitialLink() take a String url as a parameter?
FirebaseDynamicLinks.instance.getDynamicLink(String url) doesn't read custom parameters if the URL was shortened, so how can we retrieve custom parameters from a shortened link?
My use case is quite simple: I am trying to share an object through messages in my application, so I want to save the dynamic link in my database and be able to read it later to run a query according to specific parameters.
FirebaseDynamicLinks.instance.getInitialLink() returns the link that opened the app, and if the app was not opened by a dynamic link, it will return null.
Future<PendingDynamicLinkData?> getInitialLink()
Attempts to retrieve the dynamic link which launched the app.
This method always returns a Future. That Future completes to null if
there is no pending dynamic link or any call to this method after the
first attempt.
https://pub.dev/documentation/firebase_dynamic_links/latest/firebase_dynamic_links/FirebaseDynamicLinks/getInitialLink.html
FirebaseDynamicLinks.instance.getInitialLink() does not accept a string url as a parameter because it is only meant to return the link that opened the app.
Looks like there's no straightforward answer to getting the query parameters back from a shortened link. Take a look at this discussion to see if any of the workarounds fit your use case.

Buffering a new URL navigated to on another page using Tosca at run time

There is a website called X. When you click a particular button on website X, it navigates to another tab with a new URL, and I want to buffer that new URL at run time. How can this be done in Tosca?
I was able to successfully buffer a URL from IE. Here's how I did it.
First, I found this article on tricentis: https://support.tricentis.com/community/article.do?number=KB0015575
Following the instructions in that article, I scanned a new module for IE itself by selecting UIA during the scan, as described in the article. I captured the edit box of the URL bar as a module element.
Then, in a test case, I just used action-mode Buffer to read and store the URL into a buffer.

How to download an HTML report from HP ALM Performance Center 11.0 using the REST API

I want to download the default HTML report for a test run from Performance Center storage (using the REST API). Actually I only need the summary.html file.
I was using the following steps in PC 11.5:
Request test scenarios:
http://{server:port}/qcbin/rest/domains/{domain}/projects/{project}/tests?fields=id,last-modified,name,owner&query={subtype-id[=PERFORMANCE-TEST]}&page-size=max
Let user choose the scenario (id) and request all its runs:
http://{server:port}/qcbin/rest/domains/{domain}/projects/{project}/runs?page-size=max&fields=id,owner,pc-start-time,duration,status,test-id&query={test-id[=234]}
Let user choose the run (id) and request Report (result entity):
http://{server:port}/qcbin/rest/domains/{domain}/projects/{project}/results?page-size=max&query={run-id[=123];name[=Reports]}&fields=id,name
Request "summary.html" file using file-id taken from previous step response:
http://{server:port}/qcbin/rest/domains/{domain}/projects/{project}/results/{file-id}/storage/report/summary.html
However, it does not work with Performance Center 11.0. It fails at the last step:
qccore.general-error
Not Found
I guess it is because the path of the report was changed.
Can someone tell me the path to summary.html for Performance Center 11.0?
I've been able to have a little bit of success with this. Rather than using the request above, I used the following:
http://{server:port}/qcbin/rest/domains/{domain}/projects/{project}/results/{file-id}/logical-storage/
This gave me a zip file, which contained the report inside it.
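A rough sketch of that final call in Python with the requests library, assuming the logical-storage endpoint above and the standard ALM authentication point; the server, credentials, domain/project names, and file-id below are placeholders:

import requests

BASE = "http://server:port/qcbin"            # placeholder server and port
DOMAIN, PROJECT = "MY_DOMAIN", "MY_PROJECT"  # placeholder domain and project
FILE_ID = 123                                # file-id of the "Reports" result entity

session = requests.Session()

# sign in against the ALM authentication point (Basic auth)
session.get(BASE + "/authentication-point/authenticate",
            auth=("username", "password"))

# download the zipped report from the logical-storage endpoint
url = "{0}/rest/domains/{1}/projects/{2}/results/{3}/logical-storage/".format(
    BASE, DOMAIN, PROJECT, FILE_ID)
resp = session.get(url)
resp.raise_for_status()

# the response body is a zip archive containing the HTML report
with open("report.zip", "wb") as f:
    f.write(resp.content)

summary.html can then be pulled out of the downloaded archive with Python's zipfile module.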

Benchmarking Solr indexing with JMeter

I want to use JMeter to update a document in Solr using HTTP POST.
I want it to take a different file to update in every iteration, create a proper HTTP POST request, and monitor the responses from the server.
Can someone guide me on how this can be done:
Taking a different file every time.
Creating an HTTP POST from it.
Your use case can be split into 2 parts:
Get the list of files to send
Send them to the server
In regards to point 1, I would suggest obtaining the file list via scripting.
Assuming following Test Plan Structure:
Add a Thread Group (all defaults)
Add a JSR223 Sampler as a child of Thread Group
Select "beanshell" as language
In "Script" area add following code:
File folder = new File("PATH TO FOLDER WITH YOUR FILES");
File[] files2send = folder.listFiles();
int counter = 1;
for (File file : files2send)
{
    vars.put("FILE_" + counter, file.getPath());
    counter++;
}
This will save the files you will be sending as JMeter Variables, like:
FILE_1=d:\2solr\sometxtfile.txt
FILE_2=d:\2solr\somewordfile.docx
FILE_3=d:\2solr\someexcelfile.xlsx
After that you can use a ForEach Controller to iterate through the files and add them to your request:
Add a ForEach Controller as a child of the Thread Group (same level as the JSR223 Sampler)
Make sure that the ForEach Controller has the following configuration:
Input variable prefix: FILE
Output variable name: CURRENTFILE
Add _ before number is checked
Add an HTTP Request sampler as a child of the ForEach Controller
Access the file you want to send as ${CURRENTFILE} in the "Send Files With The Request" section of the HTTP Request
It's just one of the approaches; if you are not very comfortable with JSR223 or Beanshell you may wish to use the CSV Data Set Config instead.
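For reference, what the HTTP Request sampler ends up sending is an ordinary POST to Solr's update handler. A minimal sketch of the same call in Python (the core name "mycore", the handler path, the commit parameter, and the file name are assumptions about a typical Solr setup):

import requests

# post one JSON document file to a typical Solr update handler
# (core name "mycore", commit=true, and the file path are assumptions)
with open("d:/2solr/document.json", "rb") as f:
    resp = requests.post(
        "http://localhost:8983/solr/mycore/update?commit=true",
        data=f,
        headers={"Content-Type": "application/json"},
    )
print(resp.status_code, resp.text)

Rich documents such as the .docx and .xlsx files listed above would usually go through Solr's extracting request handler (Solr Cell) rather than the plain update handler.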

Google Analytics Realtime Sandbox Environment

I am looking for a way to set up a Google Analytics sandbox environment that will allow me to test out my custom JS code in near real time.
My app will be using custom variables for advanced segmentation, and I would like to test out multiple scenarios quickly, as opposed to setting up a dummy GA account and waiting for a whole day to confirm the test.
Thanks
Great question.
For GA, server updates occur every four hours, and after every sixth such update, the entire set is recalculated, which means a 24-hour lag from code change to reliable feedback. This delay also applies to most customizations to the GA Browser (e.g., "custom filters").
So if you are going to use GA as your web metrics system and you expect to actually rely on those data, then a test rig is essential.
For me, it's useful to group test systems for client-side analytics using two rubrics: (i) complete, self-contained (closed-loop) systems; or (ii) simpler automated data pulls from the production system (by "production system" here I mean GA's system, not the Site whose pages the GA code is tracking).
For the latter, just add this line to each page of your Site that contains the GA tracking code, directly below '_trackPageview()':
pageTracker._setLocalRemoteServerMode();
That line will cause a copy of each transaction line to be logged to your server's activity log--so in essence, you get the data captured by GA in real time. That's all you need to do to capture the data; to parse it, you can use, for instance, any of the excellent open source web log analyzers like AWStats, or roll your own (see the sketch below).
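If you do roll your own, a first pass might just pull the __utm.gif hits out of the access log. A minimal sketch, assuming a combined-format Apache log at a placeholder path:

import re

# pull just the __utm.gif tracking hits out of a standard Apache access log
# (the log path and combined log format are assumptions)
gif_requests = []
with open("/var/log/apache2/access.log") as log:
    for line in log:
        m = re.search(r'"GET (\S*__utm\.gif\?\S+) HTTP', line)
        if m:
            gif_requests.append(m.group(1))

print(len(gif_requests), "tracking hits captured")

Each captured request string can then be fed to the same kind of parser shown further down.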
This approach is simple and reliable--but all it can do is tell you (in real time) "does the analytics code I just implemented on pages served by my production server actually work?"
Usually, that's not good enough--you would rather know if your code will work before it's on your production server. To do that, you need to simulate the production environment and find a way to access in real-time the data GA collects.
This kind of test rig is a little more involved, but still not difficult.
In sum, it requires these steps:
host/serve the ga.js script and the tracking pixel locally;
log the __utm.gif requests (in the GA data flow, each request corresponds to one logged transaction); and
parse the headers into some convenient human-readable form.
If you want more detail than that (ie, a step-by-step implementation), here it is:
I. Hosting/Serving the GA Script (& Automating Updates)
To do that, you can create a small shell script like this one to wget the latest ga.js version into your local directory (replacing the extant version it finds there).
#!/bin/sh
rm /My_Sites/sitename.com/analytics/ga.js
cd /My_Sites/sitename.com/analytics/
wget http://www.google-analytics.com/ga.js
chmod 644 /My_Sites/sitename.com/analytics/ga.js
cd ${OLDPWD}
exit 0;
(Thanks to AskApache.com, which provided the original motivation and config details to do this in a production context.)
II. Create __utm.gif file
This is just a transparent 1x1 pixel gif image, which you will place in your Site directory (it doesn't matter where; it just needs to match the location referenced in your pages).
III. Log the __utm.gif Requests
For a testing protocol in which you are the source of the client-side activity (e.g., you want to verify the cross-browser fidelity of some event-tracking code you've added to a page on your Site, so you automate 5000 clicks on the button you just wired up, serving the page from a dev server set up for this purpose), it's probably simplest to just log the Request Headers. It's in those headers that the GA script directs the client to gather various data from the DOM, from the location bar (URL), and from prior HTTP headers, and append them to a request for a resource on the GA server (__utm.gif, which is just a 1x1 transparent pixel).
For this type of protocol, I use the Firefox addon LiveHTTPHeaders. You install it like any other Firefox addon, a few mouse clicks is all. Next, open it, and click the "Generator" tab. From this window, you can see the actual requests in real time. At the bottom of the window is a 'save' button to store the log. I find it easier to configure LiveHTTPHeaders to log only the __utm.gif requests; to do that, just click the 'Edit' tab and create a simple filter to exclude everything except these particular gif images (using the check boxes and the large text box on the right).
Other kinds of test protocols require you to work from your Server Activity Logs; in that case just add this line to each page of your Site, directly below _trackPageview():
pageTracker._setLocalRemoteServerMode();
IV. Parse those logged requests so you can actually read them
So now your log will contain individual transaction lines, each one of which is a string appended to an HTTP Request for the GA tracking pixel. This string is just a concatenation of key-value pairs, each key beginning with the letters "utm" (probably for "urchin tracker"). Each of these parameters corresponds to a variable that you see in the GA Dashboard (here's a complete list and description of them). This is all you need to know to build a parser. In more detail:
First, here's a sanitized __utm.gif request (the entries in your LiveHTTPHeaders log):
http://www.google-analytics.com/__utm.gif?utmwv=1&utmn=1669045322&utmcs=UTF-8&utmsr=1280x800&utmsc=24-bit&utmul=en-us&utmje=1&utmfl=10.0%20r45&utmcn=1&utmdt=Position%20Listings%20%7C%20Linden%20Lab&utmhn=lindenlab.hrmdirect.com&utmr=http://lindenlab.com/employment&utmp=/employment/openings.php?sort=da&&utmac=UA-XXXXXX-X&utmcc=__utma%3D87045125.1669045322.1274256051.1274256051.1274256051.1%3B%2B__utmb%3D87045125%3B%2B__utmc%3D87045125%3B%2B__utmz%3D87045125.1274256051.1.1.utmccn%3D(referral)%7Cutmcsr%3Dlindenlab.com%7Cutmcct%3D%2Femployment%7Cutmcmd%3Dreferral%3B%2B
This is my parser (in Python):
# regular expression module imported
import re
pattern = r'\&{1,2}'
pat_obj = re.compile(pattern)
# splitting the gif request on the '&' character
# (which GA originally used to concatenate each piece to build the request)
# (here, I've bound the __utm.gif request string to the variable 'gfx')
gfx1 = pat_obj.split(gfx)
# create a look-up table to map a descriptive name to each gif request parameter
# (note, this isn't the entire list, which I've linked to above)
keys = "utmje utmsc utmsr utmac utmcc utmcn utmcr utmcs utmdt utme utmfl utmhn utmn utmp utmr utmul utmwv"
values = "java_enabled screen_color_depth screen_resolution account_string cookies campaign_session_new repeat_campaign_visit language_encoding page_title event_tracking_data flash_version host_name GIF_req_unique_id page_request referral_url browser_language gatc_version"
keys = keys.strip().split()
values = values.strip().split()
# create the look-up table
GIF_REQUEST_PARAMS = dict(zip(keys, values))
# parse each request parameter and map the parameter name to a descriptive name:
pattern = r'(utm\w{1,2})=(.*?)$'
pat_obj = re.compile(pattern)
for itm in gfx1:
    m = pat_obj.search(itm)
    if m:
        fmt = '{0:25} {1:10}'
        print(fmt.format(GIF_REQUEST_PARAMS[m.group(1)], m.group(2)))
The result looks like this:
gatc_version              1         
GIF_req_unique_id         1669045322
language_encoding         UTF-8     
screen_resolution         1280x800  
screen_color_depth        24-bit    
browser_language          en-us     
java_enabled              1         
flash_version             10.0%20r45
campaign_session_new      1         
page_title                Position%20Listings%20%7C%20Linden%20Lab
host_name                 lindenlab.hrmdirect.com
referral_url              http://lindenlab.com/employment
page_request              /employment/openings.php?sort=da
account_string            UA-XXXXXX-X
cookies
To avoid making this longer still, I left out the cookies' value. They obviously require a separate parsing step, though it's virtually identical to the step I just showed. Again, each request represents a single transaction, so you can store them as you need to.
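If you do want the cookie values as well, the extra parsing step might look like this minimal sketch, assuming the raw utmcc value from the sanitized request above is bound to a variable named utmcc (the ';+' separator is what '%3B%2B' decodes to):

from urllib.parse import unquote

# URL-encoded utmcc value taken from the sanitized __utm.gif request above
utmcc = "__utma%3D87045125.1669045322.1274256051.1274256051.1274256051.1%3B%2B__utmb%3D87045125%3B%2B__utmc%3D87045125%3B%2B__utmz%3D87045125.1274256051.1.1.utmccn%3D(referral)%7Cutmcsr%3Dlindenlab.com%7Cutmcct%3D%2Femployment%7Cutmcmd%3Dreferral%3B%2B"

# decode the percent-encoding, then split on the ';+' separator between cookies
decoded = unquote(utmcc)
for cookie in decoded.split(';+'):
    if not cookie:
        continue
    name, _, value = cookie.partition('=')
    print('{0:10} {1}'.format(name, value))

The __utma and __utmz values can be split further on their '.' and '|' delimiters if you need the individual visitor and campaign fields.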
