I've been trying to use the Analytics API to get site search terms into a Google Ads script. I've used this basic setup before and it worked perfectly, but this time the data doesn't match what's in the interface by a long shot. I've also tried the Query Explorer, but it also gives me very different numbers from what's in the interface (off by almost a factor of 10).
I've checked and double checked that the metrics and dimensions I'm using are correct, but there really aren't that many options. Does anyone know what I'm doing wrong?
Here's the code I've been using:
// Build the query for the Analytics API
var query = {
  "optionalArgs": { "dimensions": "ga:searchKeyword" },
  "ids": "ga:" + analyticsView,
  "metrics": "ga:searchUniques",
  "start-date": startDate,
  "end-date": "yesterday"
};
var results = Analytics.Data.Ga.get(query.ids, query['start-date'], query['end-date'], query.metrics, query.optionalArgs);
// Convert the results into a plain JavaScript object
var formattedJson = JSON.stringify(results, null, 2);
var jsonData = JSON.parse(formattedJson);
// Iterate through the result rows: [searchTerm, metricValue]
for (var i = 0; i < jsonData.rows.length; i++) {
  var row = jsonData.rows[i];
  var searchTerm = row[0];
  var sessions = row[1];
  Logger.log(searchTerm + ": " + sessions);
}
I've tried ga:sessions instead of ga:searchUniques, and a few other combinations of metrics and dimensions, but nothing works, and based on the documentation the ones in the code really do seem to be the right ones!
I've noticed this myself just now (May 2019) when comparing a regular monthly-run API query to the Query Explorer. The Query Explorer numbers are, similar to your case, at least nine times greater.
The API-sourced numbers have been quite consistent, and in the past the Query Explorer numbers were in line with them, so...
The only conclusion I can draw is that the Query Explorer is presently broken (or has changed enough that it would take significant alteration in how it's used to make it work).
This is consistent with my expectations of Google code "quality" and "QA".
I would suggest you rely on Google as little as possible. They do not make reliable products, and they are not interested in your business's longevity.
In case anyone else gets stuck here: my issue was simply that the API returns a maximum of 10,000 rows per request, and since I wasn't sorting them I was getting an arbitrary 10,000 back, not the ones I expected. As soon as I sorted descending by the metric I needed, I got exactly what I expected!
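For anyone hitting the same cap, here's a minimal sketch of the fix against the query from the question above (sort and max-results are standard v3 Core Reporting API parameters; analyticsView and startDate are assumed to be defined as in the question):
// Sort descending by the metric so the (at most) 10,000 rows returned are the top ones
var query = {
  "optionalArgs": {
    "dimensions": "ga:searchKeyword",
    "sort": "-ga:searchUniques", // leading minus = descending
    "max-results": 10000         // the per-request row cap
  },
  "ids": "ga:" + analyticsView,
  "metrics": "ga:searchUniques",
  "start-date": startDate,
  "end-date": "yesterday"
};
var results = Analytics.Data.Ga.get(query.ids, query['start-date'], query['end-date'], query.metrics, query.optionalArgs);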
I've had an Analytics Reporting API integration running for a while now, and unfiltered view results from the API match the web reporting. The issue appears when adding a segment to the API report request: the web reporting frequently returns different values than the API for a handful of the segment/view_id combinations. I'm looking for recommended settings to review here to understand what is causing the discrepancy, as I'm not sure whether this is a program code/API issue, a web reporting issue, or a configuration issue with the segment/view_id.
Notes:
When incorrect, the web reporting numbers for sessions average about 10% higher than what the API returns.
A single segment is applied to many view_ids we manage; a high percentage (~80%) show the discrepancy, and the remainder match.
The modified and created dates for this segment are 5 months old per the web interface, so a configuration change within the segment isn't causing the discrepancy.
We've compared 2018 YTD to rule out a data-update time lag as the issue.
Segments appear to be linked at our master account level and applied to the accounts we manage.
Currently using v4 of the Analytics API for .NET (C#).
Current Questions:
Could this be a setting in how a particular segment was created?
Why would some segment/view_ids match and others not?
Is there an account, property, or view_id permission/configuration setting to review as it relates to applying segments?
Any help or insights on what to review here would be appreciated.
Forgot the code snippet:
// "ga:segment" adds a column identifying which segment each row belongs to
var segmentDimension = new Dimension { Name = "ga:segment" };
var defaultReportRequest = new ReportRequest
{
    DateRanges = new List<DateRange> { dateRange },
    Dimensions = new List<Dimension> { date, SourceMedium, Campaign, AdContent, Keyword },
    Metrics = new List<Metric> { sessions, Users, NewUsers, Bounces, pageViews, SessionDuration, Goal01Completion, Goal02Completion, Goal03Completion, Goal04Completion },
    ViewId = v_id,
    PageSize = 10000
};
// Only apply a segment when one was supplied
if (segmentId != "")
{
    defaultReportRequest.Dimensions.Add(segmentDimension);
    var segment = new Google.Apis.AnalyticsReporting.v4.Data.Segment { SegmentId = segmentId };
    defaultReportRequest.Segments = new List<Google.Apis.AnalyticsReporting.v4.Data.Segment> { segment };
}
var getReportsRequest5 = new GetReportsRequest
{
    ReportRequests = new List<ReportRequest> { defaultReportRequest }
};
var batchRequest5 = reportingService.Reports.BatchGet(getReportsRequest5);
var response5 = batchRequest5.Execute();
Thanks in advance for your help,
Mike
Update 2:
After reviewing this further: the API call always pulls a single day of data ("yesterday"). When the web reporting pulls that same single specific day, the numbers match. When the web reporting pulls a range around those dates (e.g. +/- 3 days), the numbers no longer match. It seems like sampling could be in play, but the web reports we are running indicate 100% of sessions in both pulls. I think the question is how to determine which is more accurate: a single day or a range of data. Has anyone investigated this? I've reproduced it on several of our view_ids.
Thanks,
Mike
Update 3 (resolution):
Turns out the issue was with how the segment was created and applied to web reporting. The segment was scoped at the User level, meaning aggregated values change based on the time frame selected. Since the desired state was having the filters apply to a single day, Session scope was a better fit than User scope, as it confines the segment to the session.
Thanks all,
Mike
Without knowing too much about the details of the segments and views, the first thing I'd like to confirm with you is that you're aware of sampling in GA.
Unless they're all 360 accounts, you'll be subject to sampling depending on the number of sessions you're returning for 2018 YTD. Note that sampling is based on sessions at the property level, not the view level.
Another thing you can do in your code is check whether the sampling percentage matches the web version via the API response (in v4, each report's data includes samplesReadCounts and samplingSpaceSizes; if those fields are absent, the data is unsampled). On the web version, the sampling info is here: https://i.stack.imgur.com/hcPGD.png
Besides this nifty tool I found (https://immersion.media.mit.edu/), is there any other tool or Google script that can simply give me a list of the top emails sent (ideally with date filters and in a spreadsheet format)?
The issue with Immersion is that it won't show me a subject line.
I don't know of any tools for your specific problem, but you could write a script that connects to a Google spreadsheet on Google Drive (see Google's tutorial), i.e. make your own tool. This would be particularly useful since you want the results formatted as a spreadsheet.
Once you've figured out how to write a Google script, you can use the global object GmailApp to query your emails and iterate through the results like so:
function myFunction() {
  var maxResults = 200; // the number of results the search gives at a time
  var query = "from: Joe"; // the same as a search query you would type into the Gmail UI
  var count = 0; // the index of the last searched email
  var threads;
  do {
    threads = GmailApp.search(query, count, maxResults);
    /* you can manipulate threads here */
    count += threads.length;
  } while (threads.length === maxResults); // when the results are no longer full you've counted them all
}
Or if the total number of results isn't expected to be very large, you can just directly call: GmailApp.search(query);
Lean heavily on the query to tailor your results, because the script can get very slow if you have to make a lot of calls to thread.getMessages() to check things; Gmail's search query can do it all much faster.
Here's how you can make a Gmail query based on the date.
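To tie the pieces together, here's a minimal sketch that tallies top senders within a date window and writes the result to a new spreadsheet (the dates and the spreadsheet name are placeholders; after:/before: are standard Gmail search operators):
function topSendersToSheet() {
  var query = "after:2014/01/01 before:2014/07/01"; // standard Gmail date operators
  var pageSize = 200;
  var start = 0;
  var counts = {};
  var threads;
  do {
    threads = GmailApp.search(query, start, pageSize);
    for (var i = 0; i < threads.length; i++) {
      var msg = threads[i].getMessages()[0]; // first message of each thread
      // msg.getSubject() is also available if you need subject lines
      var from = msg.getFrom();
      counts[from] = (counts[from] || 0) + 1;
    }
    start += threads.length;
  } while (threads.length === pageSize);
  // Write the tallies to a new spreadsheet, one sender per row
  var sheet = SpreadsheetApp.create("Top senders").getActiveSheet();
  sheet.appendRow(["Sender", "Threads"]);
  for (var sender in counts) {
    sheet.appendRow([sender, counts[sender]]);
  }
}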
Trying to get a simple result using a "where"-style query in Firebase, but I get null all the time. Can anyone help with that?
http://jsfiddle.net/vQEmt/68/
new Firebase("https://examples-sql-queries.firebaseio.com/messages")
.startAt('Inigo Montoya')
.endAt('Inigo Montoya')
.once('value', show);
function show(snap) {
$('pre').text(JSON.stringify(snap.val(), null, 2));
}
Looking at the applicable records, I see that the .priority is set to the timestamp, not the username.
Thus, you can't startAt/endAt the user's name as you've attempted here. Those are only applicable to the .priority field. These capabilities will be expanding significantly over the next year, as enhancements to the Firebase API continue to roll out.
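If you control how messages are written, one workaround (a minimal sketch against the legacy SDK; the message fields are assumptions) is to store the name as each record's priority so startAt/endAt can match it:
// Write each message with the user's name as its priority (note: this gives up
// the timestamp ordering the data currently relies on)
new Firebase("https://examples-sql-queries.firebaseio.com/messages")
  .push()
  .setWithPriority({ name: "Inigo Montoya", text: "Hello" }, "Inigo Montoya");
After that, the startAt('Inigo Montoya').endAt('Inigo Montoya') query from the question would match those records.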
For now, your best option for arbitrary search of fields is to use a search engine. It's wicked-easy to spin one up and have the full power of a search engine at your fingertips, rather than mucking with glacial SQL-esque queries. It looks like you've already stumbled on the appropriate blog posts for that topic.
You can, of course, use an index which lists users by name and stores the keys of all their post ids. And, considering this is a very small data set (less than 100k records), you could even just grab the whole thing and search it on the client (larger data sets could use endAt/startAt/limit to grab a recent subset of messages):
new Firebase("https://examples-sql-queries.firebaseio.com/messages").once('value', function(snapshot) {
var messages = [];
snapshot.forEach(function(ss) {
if( ss.val().name === "Inigo Montoya" ) {
messages.push(ss.val());
}
});
console.log(messages);
});
Also see: Database-style queries with Firebase
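For completeness, here's a minimal sketch of the index approach mentioned above (the messages_by_name path and the write-time bookkeeping are illustrative assumptions; .name() is the key accessor in the SDK of this era):
var root = new Firebase("https://examples-sql-queries.firebaseio.com/");
// At write time, record each message's id under its author's name
var msgRef = root.child("messages").push({ name: "Inigo Montoya", text: "Hello" });
root.child("messages_by_name").child("Inigo Montoya").child(msgRef.name()).set(true);
// At read time, fetch the ids for one user, then load each message by id
root.child("messages_by_name/Inigo Montoya").once('value', function(idx) {
  idx.forEach(function(ss) {
    root.child("messages/" + ss.name()).once('value', function(msg) {
      console.log(msg.val());
    });
  });
});
(In production you'd sanitize names before using them as keys, since Firebase keys can't contain characters like '.', '#', '$', '[', ']', or '/'.)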
Ektron 8.01 SP1
I am using the following code to fetch some smart form content. Can I pre-sort (using OrderByField?) before I fetch the 20 rows? I am sorting memberslist, but that is after the fact and kinda useless. What am I missing?
Criteria<ContentProperty> criteria1 = new Criteria<ContentProperty>();
criteria1.AddFilter(ContentProperty.XmlConfigurationId, CriteriaFilterOperator.EqualTo, MEMBERS_ID);
criteria1.PagingInfo = new PagingInfo(20);
List<ContentType<member>> memberslist = contentTypeManager.GetList(criteria1);
I have good news and bad news for you.
First, the good news. You can sort by Content Properties with the Criteria object before you pull the 20 items. You'll want to use the OrderByField and OrderByDirection properties of the criteria.
criteria.OrderByField = ContentProperty.DateCreated;
criteria.OrderByDirection = EkEnumeration.OrderByDirection.Descending;
The bad news comes when trying to order items based on fields within the Smart Form. You might be able to do so using the IndexSearch API, but since Ektron 8.0* still relies on Microsoft's Indexing Service, I'm not a fan of that approach and don't have any code to share. If you choose to go that route, the premise is to use search to return the content IDs in the correct order, then use the criteria, as you are, to get items with those IDs.
What you can do with just the API is use Microsoft LINQ to sort the data after it's loaded, but in order to get the right results in the right order you have to load all items first (and ideally cache them to minimize performance impact). I'm using one of my content types as an example, but you should get the idea.
var membersList = new List<SlideBannerType>(); // load ALL the items into this list first (and ideally cache them)
var sortedList = membersList.OrderBy(s => s.EnableAlternateText);
var firstpage = sortedList.Take(20);
var nextpage = sortedList.Skip(20).Take(20);
It's not ideal, but it does work very well for smaller (in the hundreds, perhaps thousands, but not tens of) data sets.
The second bit of good news, though, is that Ektron uses Microsoft Search Server for versions 8.5 and up. This has a much, much more robust API and performs fantastic (both in terms of speed and reliability). The premise would actually stay the same as that for the IndexSearch, use Search to get the IDs in the right order, and then ContentManager (or ContentTypeManager) to get the items. I've used this approach several times, albeit not with Smart Forms specifically. Your best result would come from upgrading to 8.6 and Microsoft Search Server and using the two APIs together to get each page of data. In doing so, it would actually be almost trivial at that point to mix in advanced search and filter options as well with the new search APIs.
For a web application, I need to get a list or collection of all SalesOrders that meet the following criteria:
Have a WarehouseKey.ID equal to "test", "lucmo" or "Inno"
Have Lines that have a QuantityToBackorder greater than 0
Have Lines that have a RequestedShipDate later than the current day.
I've successfully used these two methods to retrieve documents, but I can't figure out how to return only the ones that meet the above criteria:
http://msdn.microsoft.com/en-us/library/cc508527.aspx
http://msdn.microsoft.com/en-us/library/cc508537.aspx
Please help!
Short answer: your query isn't possible through the GP Web Services. Even your warehouse key isn't an accepted criterion for GetSalesOrderList. To do what you want, you'll need to drop to eConnect or direct table access. eConnect has come a long way in .NET if you use the Microsoft.Dynamics.GP.eConnect and Microsoft.Dynamics.GP.eConnect.Serialization libraries (which I highly recommend). Even in eConnect, though, you're stuck with querying based on the document header rather than line-item values, so direct table access may be the only way you're going to make it work.
In eConnect, the key piece you'll need is generating a valid RQeConnectOutType. Note the "FORLIST = 1" part; that's important. Since I've done something similar, here's what it might start out as (you'd need to experiment with the capabilities of the WhereClause; I've never done more than a straightforward equal):
private RQeConnectOutType getRequest(string warehouseId)
{
    eConnectOut outDoc = new eConnectOut()
    {
        DOCTYPE = "Sales_Transaction",
        OUTPUTTYPE = 1,
        FORLIST = 1, // return a list of documents rather than a single one
        INDEX1FROM = "A001",
        INDEX1TO = "Z001",
        WhereClause = string.Format("WarehouseId = '{0}'", warehouseId)
    };
    RQeConnectOutType outType = new RQeConnectOutType()
    {
        eConnectOut = outDoc
    };
    return outType;
}
If you have to drop to direct table access, I recommend going through one of the built-in views. In this case, it looks like ReqSOLineView has the fields you need (LOCNCODE for the warehouse IDs, QTYBAOR for backordered quantity, and ReqShipDate for requested ship date). Pull the SOPNUMBE values and use them in calls to GetSalesOrderByKey.
And yes, hybrid solutions kinda suck rocks, but I've found you really have to adapt if you're going to use GP Web Services for anything with any complexity to it. Personally, I isolate my libraries by access type and then use libraries specific to whatever process I'm using to coordinate them. So I have Integration.GPWebServices, Integration.eConnect, and Integration.Data libraries that I use practically everywhere and then my individual process libraries coordinate on top of those.