I am using HP LoadRunner for my automated tests.
Every time I run my application, it creates a transfer and also generates an ID in the URL.
How can I get the ID from the URL?
Thanks in advance!
The web_reg_save_param function in LoadRunner is used for this. The following line will save the URL from the Location response header into the parameter {URL}.
web_reg_save_param("URL", "LB/ic=Location: ", "RB=\r\n", "Search=Headers", LAST);
If you know what the ID you're looking for looks like, e.g. http://www.example.com/?id=298374293847, you can adjust the call accordingly so that only the ID itself is captured:
web_reg_save_param("URL", "LB/ic=Location: http://www.example.com/?id=", "RB=\r\n", "Search=Headers", LAST);
Hope this helps.
I had issues recording Siebel 8.1 on LoadRunner 11, posted a question on HP's forum, and got the same comment. But usually you can try the options mentioned below.
You can record in Siebel-Web or Web (HTTP/HTML) and play back as either (if you want to record in Siebel-Web and play back in regular Web, just copy the contents of the script to a regular Web script and save).
Try a proxy mode recording in LR.
Change the registry to disable NTLM.
Turn off all autocorrelation rules.
Turn on record-as-URL mode (as an alternative, use web_custom_request()).
Use a sniffer to capture the traffic and then build the script by hand (best option).
Change settings on the Siebel server side as well (EnableAutomation = TRUE, EnableWebClientAutomation = TRUE).
If you are recording your scripts using Web (HTTP/HTML), you can use automatic correlation; for automatic correlation, go to the Design Studio.
If you are unable to find the value there, then you must correlate manually using web_reg_save_param, supplying the left and right boundaries.
This is going to sound belligerent, condescending and downright offensive. It is not meant as a reflection on you, but on your management, who have placed you in this position.
The topic of Correlation is one covered extensively in the class for LoadRunner web script development. It is the topic of a full 1/3 of the class and an additional appendix. All told some four different techniques for collecting dynamic data are covered, presented or documented as a part of the class materials. This capability, the handling of dynamic data, is a foundation tool skill.
Vardges, your management has placed you in a tough spot. Personally I would bolt for greener fields, for any management which is willing to do this to a line-of-business employee is also willing to toss that same person under the bus to salvage their own hide or a client relationship. Blaming you for something that management is unwilling to address is not a question of "if?" when training and mentoring does not occur, but only "when?" will the blame be placed on you.
James Pulley
Moderator: YahooGroups Advanced-LoadRunner, YahooGroups LoadRunner, SQAForums LoadRunner, LinkedIn LoadRunner, GoogleGroups lr-loadrunner
I work for a nonprofit which helps disabled military veterans. We have all our participants register with us, using Salesforce as the repository of their registrations. We have dashboard components in Salesforce Lightning which total up the number of active participants we have. I would like to display that component on our WordPress site, but I have never done anything like that before. I was hoping to find someone who has done something similar who could offer some direction on how to go about it.
I tried looking up WordPress plugins which integrate with Salesforce. Most seem to be geared towards sending registrations back and forth, not displaying information. From a little research, it seems like coding might need to be involved. Maybe using a REST API with a POST option which sends the data through an HTTP URI? But my understanding is that would require WordPress to act as an API. I am sure there are gaps in my logic.
I don't have an extensive amount of programming experience but am willing to learn. I have taken a few Java and JavaScript classes in school.
I have not attempted this yet. I am just looking for feedback and direction.
A few options here, in no specific order...
Do WordPress users have real Salesforce accounts, or is their data simply stored in SF? Ask your Salesforce admin if there's a "customer community" configured (if your SF org is really old he might refer to it as a customer portal). Communities offer a nice way of exposing SF to people who don't need full SF user licenses. Think collaborating with real SF users on "My Cases", viewing reports & dashboards... But for this you'd really need people logged in to SF, so it won't work if you want something anonymous.
Another option might be using Sites (Visualforce pages that expose SF data to guest users). Think displaying a product catalog, FAQ, web-to-lead form or some other generic "contact us" page that's anonymous. So if you have an SF developer (or an admin with good copy-paste skills) you could use some Visualforce charts. They can be 100% coded or fed data from a report, so it's simpler for the admin to change the report filters or something without really writing code. Not sure if the simple route will work on a Site; there are some old answers that say "no", so you might have to try it out. Worst case you'd need Apex code (or JavaScript) to query SF for the results and display them. Then display that SF Site page as an <iframe> in WordPress.
A slight twist on the Sites option: do you use Chatter (a bit like Twitter inside SF)? There's a way to take a snapshot of a report when a milestone has been met and post it to Chatter ("congrats on hitting X participants"). And you can embed Chatter feeds on Visualforce pages too.
What SF edition are you on (Group/Professional/Enterprise...)? If you have API access to Salesforce you could query the info yourself from WordPress and display it using whatever charting library is easiest for you (Google Charts, Flot...). There are tons of examples of how to connect to SF from PHP (or maybe you could cannibalize a WP plugin). Technically it's one POST message to log in to SF and one GET to run a query (something as simple as SELECT COUNT() FROM Contact WHERE isActive__c = true?).
That'd be more or less everything in terms of pulling data out of Salesforce. I mean, if you have API access enabled you can slice & dice it how you want, extract data with raw PHP code or use some middleware; the overall idea doesn't change. Write queries yourself or use the "Analytics API" to access report results (so your administrator has the power to change things without coding)... A rough sketch of the login + query flow is below.
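This is only a sketch: it uses the REST API's username-password OAuth flow rather than the old SOAP login, and the connected-app credentials, API version and the isActive__c field are all assumptions you'd adapt. It's shown as JavaScript (Node 18+ with global fetch), but the same two HTTP calls translate directly to PHP:

    // one POST to log in, one GET to run the query
    async function activeParticipants() {
      const login = await fetch('https://login.salesforce.com/services/oauth2/token', {
        method: 'POST',
        body: new URLSearchParams({
          grant_type: 'password',
          client_id: '<connected app consumer key>',
          client_secret: '<connected app consumer secret>',
          username: 'api.user@example.org',
          password: '<password + security token>',
        }),
      });
      const { access_token, instance_url } = await login.json();

      // for COUNT() queries the result comes back in totalSize
      const q = encodeURIComponent('SELECT COUNT() FROM Contact WHERE isActive__c = true');
      const res = await fetch(instance_url + '/services/data/v52.0/query?q=' + q, {
        headers: { Authorization: 'Bearer ' + access_token },
      });
      const { totalSize } = await res.json();
      return totalSize;
    }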
So how about pushing? SF could notify you about the current participant count, at scheduled intervals or even in realtime. That'd be "just" raw data though; you'd have to write the visualisation yourself.
Plenty of options here:
Workflow rules (code-free) send an XML message to a specified URL, so you'd need a WP page that can "capture" the result. It could be sent on creation of a new record or update of an existing one. It won't give you totals; it'd be data related to that particular record, so you'd have to build a kind of +1 / -1 counter... Or if you use a report + analytic snapshot (a helper object to store report results) and have a workflow on that, it could be really close to what's needed.
A scheduled Apex job to run some queries and send the results to you. Again, you'd need a WP URL that can be called from SF.
If there's a CometD plugin for WordPress, you should look at the Salesforce Streaming API, Platform Events or (newer and even simpler to configure) Change Data Capture. Basically you "subscribe" to a topic (an SF query) and whenever SF data changes in a way that SF decides would change the results of the query, it pushes the results to you. It's almost realtime. Too much to write about them here; perhaps best if you click through some Trailheads, SF's self-paced training courses (a rough subscription sketch follows the links):
https://trailhead.salesforce.com/en/content/learn/modules/api_basics/api_basics_streaming
https://trailhead.salesforce.com/en/content/learn/modules/change-data-capture
https://trailhead.salesforce.com/en/content/learn/modules/platform_events_basics
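To give a flavour of the Change Data Capture option above, here is a sketch using the cometd JavaScript client. The API version, channel name and token handling are assumptions (it reuses access_token / instance_url from the login sketch earlier); in practice, follow the trailheads:

    // npm install cometd cometd-nodejs-client
    require('cometd-nodejs-client').adapt(); // lets the browser-oriented lib run in Node
    const { CometD } = require('cometd');
    const cometd = new CometD();
    cometd.configure({
      url: instance_url + '/cometd/52.0',
      requestHeaders: { Authorization: 'Bearer ' + access_token },
      appendMessageTypeToURL: false,
    });
    cometd.handshake(function (reply) {
      if (!reply.successful) return console.error('handshake failed');
      // fires whenever a Contact record changes (Change Data Capture channel)
      cometd.subscribe('/data/ContactChangeEvent', function (msg) {
        console.log('change received:', msg.data.payload);
      });
    });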
We were checking the newly implemented Google Analytics for our mobile app, and surprisingly there are a lot of visitors from multiple countries; in actuality we haven't released our app to any store, and it's just a beta among 5 main users.
After checking the Google Analytics report in detail, we found that it got spammed by a bot called "Trumps Bot". When it hits your account, you can see the following line in your language section:
“Secret.ɢoogle.com You are invited! Enter only with this ticket URL. Copy it. Vote for Trump!”
There are a lot of solutions available to keep this data out of your reports using filters, but I was wondering if there is any concrete way to permanently remove this data from my reports, and also whether there is anything we can do to avoid such data in the future, as it seriously affects business strategy.
Due to the technology used by Google Analytics, the only way to eliminate this referral is a filter: check for one common point across all these hits. This case is a hard one, because all the parameters change except for the language, for a well-known reason: so that you see the spam.
So try filtering on the language; in my case it works.
I highly recommend you read the community policy; this could be considered an off-topic question.
Analytics spammers are always trying to find new ways of getting attention, and with this one, this spammer hit it big.
It is not possible to permanently remove it unless you delete the whole property. But you can create an advanced segment to get a clean view.
But the most important part is blocking it so it doesn't pollute your data. For this particular type of spam you should create a custom exclude filter on the language setting with this expression:
\s[^\s]*\s|.{15,}|\.|,
That expression will block any hit that doesn't use a proper language value: anything containing spaces, longer than 15 characters, or containing a period or comma. That, combined with a valid hostname filter, should prevent most of the current spam and save you a lot of headaches.
If you need help, you can look for a step-by-step guide for building these filters and creating the advanced segment to remove it from your historical data.
Log in to your Google Analytics account
Select the ADMIN section
Click All Filters, then Add Filter
Give the filter a name such as "Include only website traffic"
In the Predefined section, select Include Only
I'm getting some very strange behavior in DTM. When our page loads (from a local instance of the website) we get the expected call going out with the proper dev report suite. When a custom link call is made from that page, for some reason DTM sends it with a production report suite. If I look in Adobe Analytics for the custom link name reported under the prod RSID, it does not show up there.
Any ideas on what is going on and how I can fix this issue?
This is my shot in the dark based on what you have said, and it is based on the assumption that your statements are true (e.g. you aren't seeing pink elephants, the request was indeed showing your prod rsid in the proper portion of the request URL, you did in fact check your prod rsid after an acceptable amount of time had passed, no segment or other filter shenanigans, etc.: in short, that you do know how to accurately perform the basic QA song and dance).
Under that assumption, the below is a scenario that can plausibly reproduce what you are describing. I could be partially right or totally off for your specific situation, but there's really no way for me to know for sure without having access to your DTM instance.
The Scenario
Long story short, it sounds like you have a blend of custom code and DTM automatic settings enabled, and DTM is overriding and/or not caring about your custom code for link tracking.
More specifically, it sounds to me like you have AA implemented as a tool in DTM, and in the config settings, you have your production and staging rsids specified in the text fields.
Then in the General section, you either do NOT have values specified for Tracking Server and Tracking Server Secure, or else they are set to the wrong values.
Then, in the Library Management section, you have either selected "Managed by Adobe" in which case DTM takes care of the library, or else you have selected "Custom" and you are adding the library yourself AND you have NOT checked "Set report suites using custom code below".
Then, somewhere in DTM (e.g. the Library Management > Custom code box, or Customize Page Code codebox) you have code that pops rsid stuff (e.g. s.account, s_account, dynamicAccountList stuff), and possibly also trackingServer and trackingServerSecure.
Finally, you (like most other people, because DTM's double script include for staging vs. prod is.. dumb) just use the prod script include on your page, and either use the debug/staging mode or rely on whatever rsid routing logic you've set up to route to dev.
So.. when the page is first loaded, DTM loads the AA library and it sets variables and stuff based on what you specified in the tool config. During this time, it is also popping any custom code blocks you have in the tool config, which may or may not override what you have specified in the tool config fields, depending on what you enabled. Then after that, it pops stuff you have in page load rules (if any), etc..
But then comes the link click.. As I have mentioned in other posts on SO, DTM has this caveat (IMO bug) about how it references the AA object after the initial page load/AA request: basically, it doesn't. Instead, it makes use of internal methods (the main one being a .getS() method) to create a new instance of the AA object, based on whatever things you have configured in the tool config section. Well here's the rub.. it does NOT account for or execute any custom coding you have done in code boxes in the tool config section.
So that basically happens whenever an event based or direct call rule is triggered, and it effectively screws you. Why does DTM do this? I do not know. IMO Adobe needs to change this feature caveat bug. Either they should refactor DTM to execute the code boxes, OR they could, you know.. just reference the original AA object created, like any normal script would do..
But in any case..
So for example, my theory here is that page loads fine, points to dev rsid based on your setup. But then you click a link and an event triggers, and DTM makes a new AA object not caring about your custom code, so all it has to go on is what you have in the tool's config fields.
Since DTM doesn't actually have any rules around the prod vs. dev rsids you specify in those fields (you have to write custom code in the custom code boxes - that DTM ignores!), it just pops the prod rsid, because that's the script include you have on your page.
Then as far as not seeing the data actually show up in your prod rsid: again, since DTM ignores what you set in your custom code boxes, it's defaulting to what is specified in the trackingServer fields in the tool config, and my assumption here is that they are either blank or wrong (you should be able to look at the request URL to Adobe to verify this). I base this theory on your saying the prod rsid is right and that you see a request being made; so the next culprit would be a wrong tracking server.
So, that is my theory of what's going on. Maybe it's all right, maybe it's some right, hopefully it may point you in the right direction at least.
Edit:
If you can confirm that this is indeed how you have things set up, then you will naturally ask "Okay, well what do I do about that?". As I have said in a lot of my other SO answers: basically, your only option is to uncheck all the settings that make DTM automate AA, keep the AA section disabled in all your rules, and for whatever AA vars you want to set, set them yourself and make the s.t() or s.tl() call yourself in a 3rd party script code box, so that it continues to reference and pop based off the originally instantiated AA object.
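As a rough sketch of that (the variables, event and link name are hypothetical, and it assumes your AA library exposes the page-load instance globally as window.s):

    // 3rd party script code box in an event based / direct call rule:
    // reference the AA object instantiated on page load instead of letting
    // DTM build a fresh instance from the tool config fields
    var s = window.s;
    s.linkTrackVars = 'prop1,events';
    s.linkTrackEvents = 'event1';
    s.prop1 = 'header nav click';
    s.events = 'event1';
    s.tl(true, 'o', 'Header Nav Click'); // 'o' = custom link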
Update
Based on your comments below: okay, so yeah, that sounds like what I described, and it accounts for the prod rsid popping. As for the data not showing up in the report: if you are certain the tracking server is set correctly (the request URL looks good), then this isn't a DTM issue. Here are some other explanations for why the data wouldn't show up:
Are you sure the request is being sent to your prod rsid? I don't know what you are looking at to verify this, but this is where you should be looking: In the request URL to AA: "http://[trackingServer value]/b/ss/[s.account value]/1..."
Click request isn't making it to Omniture. Verify in a packet sniffer that the request is actually made and that you are getting a 200 OK or NS_Binding_Aborted response.
You aren't waiting long enough to check for the data. Even basic hit data and looking at "real time" reports takes a little bit of time to show up.
You have a segment/filter active that's not jiving with the data you are trying to look at. Make sure that you don't have anything applied. Or, if you are using those things to find your data (and aren't seeing it), ensure that you are correctly applying it.
You recently created the rsid and the "go live" date hasn't passed yet. Data will not show up in the report suite until up to 24 hours after the specified "go live" date.
You have a vista rule in place that's affecting data showing up. Some companies have a vista rule in place for a number of reasons and there are a million ways it could affect data (e.g. routing to a different report suite). For shits and grins, check your dev (or other rsids) to see if your data showed up there. Even if that doesn't make sense, at least it's a step forward.
You have a bots / ip exclusion rule in place that's catching data from your location.
The data sent in from the link click isn't relevant to the report. For example, maybe you are looking at e.g. prop10 report and prop10 isn't actually sent in the click request.
I know a lot of these are basic things to check, and no doubt you've checked, but check again. Have someone else check for you to be sure. I'm not questioning your abilities here, but even the best of coders forget to cross their t's and dot their i's sometimes, and manage to miss obvious things. If you are sure about all of these, then contact Adobe ClientCare, because I really can't think of anything else that wouldn't involve an issue with Adobe's backend.
I ran into a similar problem with my implementation. Essentially what I did was set the s.account variable directly inside the doPlugins, so it would be set on all tracking calls. I wrote specifics here also: DTM Tracking Account
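A minimal sketch of what that looks like (the rsids and hostname check are placeholders; this goes in the AA tool's custom code so it runs on every tracking call):

    s.usePlugins = true;
    s.doPlugins = function (s) {
      // doPlugins fires each time the library builds a request,
      // so the report suite gets set on link tracking calls too
      s.account = (location.hostname === 'www.example.com')
        ? 'mycompanyprod'
        : 'mycompanydev';
    };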
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 7 years ago.
I'm trying to determine why an enterprise wouldn't want to use Google Analytics.
Here are the main reasons I've seen mentioned:
Inability to track clients that have Javascript disabled.
Lack of ownership of the statistics - Google owns the data.
Most of the web clients with Javascript disabled will probably be bots/spiders. This data is interesting, but probably not very useful.
As for the ownership issue, this is a bit paranoid IMO.
What am I missing here? When is Google Analytics not good enough?
Here are my findings from additional research:
Google Analytics is limited to 5 million pageviews per month
If a web site generates more than 5 million pageviews per month, it will need to be linked to an active AdWords account to avoid interruption of service.
Lack of / slow technical support
All Google support is handled through email, and response times can take a week or more. Commercial analytics products often have much faster, more personalized support.
Inability to track files (PDFs, images, etc.)
GA relies on JavaScript, and files can't execute JavaScript. The workaround is to tag the link, but this won't track requests that go directly to the file.
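For example, the link tagging workaround in the classic async ga.js syntax looks roughly like this (the selector and virtual path are illustrative, not the only way to do it):

    // tag every PDF link to record a virtual pageview on click;
    // requests that go straight to the file still bypass this
    var links = document.querySelectorAll('a[href$=".pdf"]');
    for (var i = 0; i < links.length; i++) {
      links[i].addEventListener('click', function (e) {
        _gaq.push(['_trackPageview', '/virtual' + e.currentTarget.getAttribute('href')]);
      });
    }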
Limited ability to customize
This is a selling point that I see pushed by commercial analytics tools (WebTrends). However, it's never explained which customizations are denied by GA but allowed by WebTrends.
The Google Analytics EULA does not allow you to track individual users by identifying them. So if you wanted to add a custom variable for username, to track how many times each user logs in, you would be in a gray zone if not outright violating the EULA.
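Concretely, this is the kind of call that runs into the problem (classic ga.js syntax; the slot number and username are hypothetical):

    // visitor-scoped custom variable tying hits to a specific login:
    // exactly the personally identifiable tracking the EULA forbids
    _gaq.push(['_setCustomVar', 1, 'username', 'jsmith', 1]);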
I use Google Analytics on about 10 sites right now and it's a great tool. In addition to all the analytics stats, you can tie it in with AdSense and it becomes a marketing/revenue tool and not just "wow, look at all these cool user stats". If there were a way to track by user ID in certain circumstances (e.g. if users agreed to it, or if they work for the company that owns the site) then I would have no issues.
Besides, it's free and all you have to do is add JavaScript to the files, so give it a try and see what you think after a few months.
One reason that was, surprisingly, not posted:
timing / speed of reaction
It takes at least 4 hours (up to 24) for GA to update your data.
This is OK for me personally in most cases, but when reacting fast is crucial (news sites, one-off events, etc.) you may want to employ some other solution (Mint comes to mind, but it's not the only one out there, of course).
Thought I'd add my two pence worth to this thread, as this is a topic close to my heart and one I've debated with colleagues for years. We've used WebTrends in house for as long as I can remember, back to version 4 of the log analyzer (how different things were back then!). Since Google Analytics came along, we've come under increasing pressure from certain parts of our business to switch, as 'it does everything we need from an analytics tool'.
Well, true in many senses it does, especially these days. But I championed the integration of our CRM and web analytics tools back in 2006, and as our business isn't e-commerce (the 'conversion' happens offline, sometimes months after the visitor acquisition) we need to integrate in this way to get a true picture of campaign effectiveness, and notion of ROI.
All of this means, we need access to the raw data, need to be able to join visitor records on sessionID etc, without this access we'd be screwed. I'd love it if we could roll without it, but the current requirements mean we can't, so this alone is a HUGE reason why Google analytics is not good enough.
Over and out
For tracking desktop software or creating a white-label solution there are better options.
For white-label and integration-based analytics, I use MixPanel. For desktop software, I use Deskmetrics.
Google Analytics does not work well with mobile phones. While the iPhone and the Palm may be supported, many of the existing handsets do not support the JavaScript that Google uses.
If you're based in the UK, then theoretically you could be breaking the Data Protection Act by using Analytics.
If information about your users (like which web pages they're looking at) goes "outside the European Economic Area" and onto Google's servers in the US, then you're breaking the DPA.
Pretty obscure, but you did ask :)
Piwik avoids the problem because you host it on your own servers.
Lack of ownership of the statistics - Google owns the data. ... As for the ownership issue, this is a bit paranoid IMO.
One problem with it is that we can't even access the raw data. We had a use case this week where we wanted a visitor map for an executive presentation. We needed more flexibility in how the visitor map is displayed (we wanted to view the map in the Google Earth plug-in). In GA, you can't. You take what they give you. You can see a map of how many visits came from each city, but you can't export a data file of cities and number of visits to run the data through other tools. So, paranoia aside, there are significant limitations on what you can accomplish with GA.
However this is not a problem if you use Urchin, the self-hosted version of GA: you can export the data and do what you want with it. (And the exported data is richer than the web server log's, as it includes some analysis already.)
Since Piwik is open source, and pluggable, I imagine you could enhance the visitor map plug-in any way you wanted to. And export whatever data you want.
Whether this limitation affects you depends on your needs, obviously.
Update: I've now looked at the GA Data Export API, and it turns out that things you cannot do through the UI (as you can with Urchin), you can do with this API. It does look like you can export the visit data I was talking about, via a feed (although there are daily traffic caps on those requests). So sprinkle salt heavily on what I wrote above.
A couple more points that I've come across:
GA doesn't let you dig beyond full-day statistics; I would often like the ability to investigate whether a traffic dip the previous day was caused by the design update I did at 1pm or the soccer match on TV at 8pm.
GA doesn't offer a workaround for traffic spikes caused by DDoS attacks, Slashdotting, etc. When I'm looking at a GA visitor graph of 2009, all I can see is the 2-million-pageview spike on October 16th, pushing the entire rest of the year down flat against the horizontal axis of the graph. To get a meaningful graph, GA should offer the ability to trim or exclude outlying data points, or the ability to limit/bracket the graph window itself.
GA doesn't have an event monitoring client (think Reinvigorate's Snoop tool)
While GA is very user-friendly, I've found it's not as granular as some of the other stats programs (or maybe I'm not looking in the right places). Before the marketing monkeys I work with began pushing GA, we were very satisfied with AWStats. The sheer scope of the data helped us on several occasions hone sites to better suit their audience. While GA is very shiny and laid out well, I personally still prefer the raw numbers like I used to get through AWStats.
Slow data processing speed - can be as low as 15-30 mins for page views, but may be up to 48 hours for eCommerce
The EULA is limiting in some cases
You won't own or have any control of the data. Google's engineers might use it (anonymously) for testing
Anything more complex requires customization - downloads and such are of no issue, but there are limits
Cross-domain tracking by the linker is faulty at best
Visit-based - proper tools report at the visitor level; GA works mostly on visit-based reporting
Limited number of custom vars used at one time (5)
No tech support, if you're realistic
Usually by the time there is a downtime notice, the downtime has already passed
API limitations (4 dimensions and 10 metrics at one time, and not all of them can be used together)
I have many more, but at the end of the day it is a good tool for its price.
From a non-technical point of view, I think the most important reason is that some enterprises have strict data security policies: all data must be controlled and managed by the company itself.
If you use Google Analytics, the data is stored on Google's servers. For certain enterprises, like insurance and financial companies, that policy has to be followed.
I would NOT go with server logs. In fact, I have them disabled on my server. Why, you ask?
For the simple reason that every time you hit my server, that stupid logging program makes an entry in the physical log file on my HDD. So if my server gets 100,000 hits in a day, that's 100,000 HDD write operations.
You think that's cool? Well, it's not. It slows your server down, especially if the log file is huge.
Why would someone even consider doing that to their server? Especially when we're working so hard to minify JavaScript and CSS and to make image files 2 KB smaller!
Please do yourself a favor: don't log directly on your server.
At least Google Analytics logs it on Google's servers, so my server stays healthier.
I wouldn't use it for any of my sites, because you're forcing the user to accept your proprietary JavaScript code in their browser, which is bad. Also, giving your data to Google is a really bad idea.
See Piwik for something you can run yourself as free software, eliminating both of these problems.
The task is simple, and the answers might be many.
But here goes: on my website I'll make an InfoAboutYou.aspx page. So far I've got the IP and the browser name and version, but I'd like to expand with just about everything I can look up about the current user/IP and their browser/OS.
Do any free web services exist that can look up more information about a specific IP?
The idea is to see how precisely a random user can be pinpointed.
Thank you
You can use various services to determine a geographic location.
One example is:
http://www.ipgeo.com/
There are plenty of things that you can determine by using JavaScript and having it post back to the server, like some of the stats that Google Analytics collects. You can determine screen size, etc.
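A minimal sketch of that client-side collection (the /api/client-info endpoint is a placeholder you'd implement server-side, where you can combine it with the IP/geo lookup):

    // collect what the browser will volunteer and post it back
    var info = {
      userAgent: navigator.userAgent,
      language: navigator.language,
      screen: { width: screen.width, height: screen.height, colorDepth: screen.colorDepth },
      viewport: { width: window.innerWidth, height: window.innerHeight },
      timezoneOffsetMinutes: new Date().getTimezoneOffset(),
      cookiesEnabled: navigator.cookieEnabled,
    };

    fetch('/api/client-info', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(info),
    });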