I read some RSS feeds from my application, and since some articles can be updated on the site, I want to know if there is a field I can look at to tell whether I'm getting the same article or a modified version.
I know there's a ttl field, but it is just a hint of how long I can keep the article in the cache; it doesn't actually say whether the article was updated or not.
Unless you mean something like the cloud element, there isn't any.
"Its purpose is to allow processes to register with a cloud to be notified of updates to the channel, implementing a lightweight publish-subscribe protocol for RSS feeds."
See the list of elements at w3.org.
I'm pretty sure the answer to this question is "no", but I would like to get a definitive answer from an official source, and also understand what my alternative options might be.
Long story short, my app has old data in it that used to include user email addresses as a GET parameter. Those URLs are showing up as unique page view URLs in Google Analytics.
I don't want to be recording email addresses in my Google Analytics account (for privacy reasons), and I have fixed the code that was causing this in the first place, but I also want to delete or scrub the old data that currently exists in Google Analytics.
From everything I've read, it doesn't sound like this is possible without completely deleting the property, maybe even the account?
To be clear, I am NOT interested in creating new views that exclude URLs with email parameters, or otherwise changing the view rather than the data. The data needs to be gone and completely inaccessible to anyone with access to this Google Analytics account.
Here are the options I've come up with:
1. Delete the property and start over. I'm pretty sure this will actually delete the collected data, but it's not clear to me whether I would have to delete the account itself to achieve that.
2. Set the data retention time to the lowest possible value (looks like 14 months right now) and wait 14 months for the data to go away: https://support.google.com/analytics/answer/7667196?hl=en
3. Perform some kind of magic to get in contact with an actual human at Google who could help me scrub or remove this data.
Does this sound right? Are there options I'm missing? If there's a way to do this through a Google API that would not be a problem.
In case this is still a relevant issue: Google Analytics provides a way to delete some data. See the documentation for Universal Analytics (https://support.google.com/analytics/answer/9450800?hl=en) and for GA4 (https://support.google.com/analytics/answer/9940393?hl=en&ref_topic=2919631).
You are right: changing data that you have collected and that Google Analytics has already processed is not possible. You do have the option to make changes during processing with various filters, e.g. search-and-replace filters, but as it is written in this official support article:
Like all filters, search-and-replace filters only apply to hits collected after you've applied the filter to the view (filters cannot change historical data).
Regarding your suggested options:
Deleting a view or property results in permanent loss of data after a 35-day waiting period (during which the deletion can still be undone). So unless scrubbing the collected PII is more important than keeping your historical data, this is not the way to go. The same applies to deleting the whole account, so it would be enough to delete the affected properties or views.
From the article you linked, you can also see that data retention is about removing user- and event-level data; it does not affect the data in aggregated reports. My understanding is that an already created, page-level report will keep showing the page with an email address:
Keep in mind that standard aggregated Google Analytics reporting is not affected.
I hope these references help you evaluate your options. Sorry for not being able to come up with a solution, but the basic concept, as highlighted in this Google article, is:
Once Analytics processes the data, it’s stored in a database where it can’t be changed
I work for a nonprofit which helps disabled military veterans. We have all our participants register with us, using Salesforce as the repository of their registrations. We have dashboard components in Salesforce Lightning which total up the number of active participants we have. I would like to display that component on our WordPress site, but I have never done anything like that before. I was hoping to find someone who has done something like this who could offer some direction on how to go about it.
I tried looking up WordPress plugins which integrate with Salesforce. Most seem to be geared towards sending registrations back and forth, not displaying information. From a little bit of research, it seems like coding might be involved. Maybe using a REST API with a POST request which sends the data through an HTTP URI? But my understanding is that this would require WordPress to act as an API. I am sure there are gaps in my logic.
I don't have an extensive amount of programming experience but am willing to learn. I have taken a few Java and JavaScript classes in school.
I have not attempted this yet. I am just looking for feedback and direction.
A few options here, in no specific order...
Do WordPress users have real Salesforce accounts, or is their data simply stored in SF? Ask your Salesforce admin if there's a "customer community" configured (if your SF org is really old he might refer to it as a customer portal). Communities offer a nice way of exposing SF to people who don't need full SF user licenses. Think of collaborating with real SF users on "My Cases", viewing reports & dashboards... But for this you'd really need people logged in to SF, so it won't work if you want something anonymous. Some more info
Another option might be using Sites (Visualforce pages that expose SF data to guest users). Think of displaying a product catalog, FAQ, web-to-lead form or some other generic "contact us" page that's anonymous. So if you have an SF developer (or an admin with good copy-paste skills) you could use some Visualforce charts. They can be 100% coded (like this) or fed data from a report (like this), so it's simpler for an admin to change the report filters without really writing code. Not sure if the simple route will work on a Site; there are some old answers that say "no", so you might have to try it out. Worst case you'd need Apex code (or JavaScript) to query SF for results and display them, and then display that SF Site page as an <iframe> in WordPress.
A slight twist on the Sites option: do you use Chatter (a bit like Twitter inside SF)? There's a way to take a snapshot of a report when a milestone has been met and post it to Chatter ("congrats for hitting X participants"). And you can embed feeds on Visualforce pages too. Docs
What SF edition are you on (Group/Professional/Enterprise...)? If you have API access to Salesforce you could query the info yourself from WordPress and display it using whatever charting library is easiest for you (Google Charts, Flot...). There are tons of examples of how to connect to SF from PHP (or maybe you could cannibalize a WP plugin). Technically it's one POST message to log in to SF and one GET to run a query (something as simple as SELECT COUNT() FROM Contact WHERE isActive__c = true?); a rough sketch follows.
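To make that POST-then-GET idea concrete, here's a minimal sketch in JavaScript (Node 18+, global fetch). It assumes a Connected App that allows the OAuth username-password flow, and isActive__c is the hypothetical custom field from the example query; in a real WordPress setup you'd do the equivalent from PHP.

    // Minimal sketch: log in to Salesforce (one POST), run a COUNT() query (one GET).
    // SF_CLIENT_ID / SF_CLIENT_SECRET come from a (hypothetical) Connected App.
    async function countActiveParticipants() {
      const auth = await fetch('https://login.salesforce.com/services/oauth2/token', {
        method: 'POST',
        headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
        body: new URLSearchParams({
          grant_type: 'password',
          client_id: process.env.SF_CLIENT_ID,
          client_secret: process.env.SF_CLIENT_SECRET,
          username: process.env.SF_USERNAME,
          password: process.env.SF_PASSWORD, // password + security token
        }),
      }).then((r) => r.json());

      const q = encodeURIComponent('SELECT COUNT() FROM Contact WHERE isActive__c = true');
      const result = await fetch(`${auth.instance_url}/services/data/v57.0/query?q=${q}`, {
        headers: { Authorization: `Bearer ${auth.access_token}` },
      }).then((r) => r.json());

      return result.totalSize; // COUNT() queries report the count in totalSize
    }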
That'd be more or less everything in terms of pulling data out of Salesforce. If you have API access enabled you can slice & dice it how you want, extract data with raw PHP code or use some middleware, but the overall idea doesn't change: write queries yourself or use the "Analytics API" to access report results (so your administrator has the power to change the report without coding)...
So how about pushing? SF could notify you about the current participant count, at scheduled intervals or even in realtime. That'd be "just" raw data though; you'd have to write the visualisation yourself.
Plenty of options here:
Workflow rules (code-free): these send an XML message to a specified URL, so you'd need a WP page that can "capture" the result. The message could be sent on creation of a new record or update of an existing one. It won't give you totals; it'd be data related to that particular record, so you'd have to build a kind of +1/-1 counter... Or, if you use a report + analytic snapshot (a helper object that stores report results) and have a workflow on that, it could be really close to what's needed.
A scheduled Apex job to run some queries and send the results to you. Again, you'd need a WP URL that can be called from SF.
If there's a CometD plugin for WordPress, you should look at the Salesforce Streaming API, Platform Events or (newer and even simpler to configure) Change Data Capture. Basically you "subscribe" to a topic (an SF query), and whenever SF data changes in a way that SF decides would change the results of the query, it pushes the results to you. It's almost realtime. There's too much to write about them here; perhaps best if you click through some trailheads, SF's self-paced training courses (a rough sketch of a subscriber follows the links):
https://trailhead.salesforce.com/en/content/learn/modules/api_basics/api_basics_streaming
https://trailhead.salesforce.com/en/content/learn/modules/change-data-capture
https://trailhead.salesforce.com/en/content/learn/modules/platform_events_basics
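For a flavor of the streaming approach, here is a rough sketch using the jsforce JavaScript library; the PushTopic name ParticipantChanges is hypothetical, and you would create the PushTopic in SF first.

    // Rough sketch: subscribe to a (hypothetical) PushTopic with jsforce.
    const jsforce = require('jsforce');
    const conn = new jsforce.Connection({ loginUrl: 'https://login.salesforce.com' });

    async function listen() {
      await conn.login(process.env.SF_USERNAME, process.env.SF_PASSWORD);
      // The PushTopic would wrap a query such as:
      //   SELECT Id FROM Contact WHERE isActive__c = true
      conn.streaming.topic('ParticipantChanges').subscribe((message) => {
        console.log('Change received:', message.sobject);
        // recompute / refresh the displayed participant total here
      });
    }

    listen();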
I am using Branch.io links to implement deep linking. I am using their public API endpoint to generate the links.
Here is their endpoint: https://api.branch.io/v1/url
I append my Branch key and the data I need to associate with the link. Everything is working fine, but I need this link to expire within one hour.
Reading up here: https://github.com/BranchMetrics/branch-deep-linking-public-api#creating-a-deep-linking-url
I added the "duration" key as well, but it didn't expire the link.
It would be great if anyone could help me figure out how to expire a Branch.io link.
Alex from Branch.io here: the duration parameter is used for something different, so it's not going to be able to do what you want. We don't have a built-in feature to expire links, but you could create something close to it yourself:
Add a custom link parameter containing a timestamp for when the link was created.
Check for that timestamp when handling the link at the destination, and do something different if it is more than an hour old. I'm guessing this would be inside your app, and also on whatever fallback URL you have specified for when the app isn't installed or the user is on desktop.
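A minimal sketch of that check in JavaScript; the created_at key and the two handler functions are placeholders for illustration, not Branch API:

    // Assumes a custom link parameter created_at (epoch ms) was set at link creation.
    const ONE_HOUR_MS = 60 * 60 * 1000;

    function handleLinkData(params) {
      const createdAt = Number(params.created_at); // hypothetical custom key
      if (!createdAt || Date.now() - createdAt > ONE_HOUR_MS) {
        showExpiredLinkScreen(); // your own "link expired" handling
      } else {
        routeToContent(params);  // honor the link's contents
      }
    }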
An email from the Branch.io support team suggested this answer:
If you found out about the $exp_date parameter from here then the parameters in that list are only used for iOS Spotlight Indexing but will be used by Branch in the future. A better solution than utilizing $exp_date is to code logic into your client to determine what to do with link data based on date. This way, your deep links will always work and always carry data through, and you won't have to worry about users clicking empty links.

This way, you would include date as an extra meta key/value pair, and examine this date in your client when receiving link params to determine if you want to honor the link's contents or not.
I have a large set of podcast feed URLs which I'm periodically polling to check for updates. I'm really struggling to find a robust way to detect if a feed has changed that doesn't have any false positives. I'd like to be able to detect not just if there is a new episode, but also if an existing episode was updated.
RSS and Atom feeds provide pubDate, lastBuildDate or updated elements. However, I'm finding these frequently misused, with feeds inserting the current date and time into these fields on every request. This makes them difficult to rely on to detect changes.
My next thought was to strip all date information from the feed text, then MD5 hash the feed contents. I can then compare the feed hashes to detect changes.
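In sketch form (Node; the date-stripping regexes here are illustrative, not exhaustive):

    // Strip date-bearing elements and bare timestamps, then hash what's left.
    const crypto = require('crypto');

    function feedFingerprint(xml) {
      const stripped = xml
        .replace(/<(pubDate|lastBuildDate|updated)>[^<]*<\/\1>/g, '') // date elements
        .replace(/\b\d{9,10}\b/g, '');                                // bare unix timestamps
      return crypto.createHash('md5').update(stripped).digest('hex');
    }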
This seems to work for about 90% of the cases. However, there are still hundreds of podcasts that insert dynamic data into their feeds.
One podcast has the following as their podcast cover art:
http://erikglassman.hipcast.com/albumart/1000.1439649026.jpg
Here 1439649026 is what I assume is a timestamp; this second number changes with each request of their feed.
This is starting to seem like a losing battle. If I can't reliably trust the date fields of a podcast feed, and if some percentage of podcasts insert dynamic data into their feed text, how can I reliably detect changes to a feed in a robust way?
Everything you say is true, so it's not a good idea to try to detect changes at the feed level; instead, look for them at the item level.
That generally works; if it doesn't, the feed can't be used by anyone, so the source of the feed is likely to have fixed the problem. That's why I think it works so well.
I've been writing feed readers for as long as they have existed. My current product is called River4; it's available as open source under the MIT License, so you can use it as example code, for this and other issues.
This is where it checks if an item is new:
https://github.com/scripting/river4/blob/master/river4.js#L1411
That might move around as the code changes, so look for a routine called getItemGuid. It shows you how to get a value that uniquely identifies the item. I use this code for my podcatcher, http://podcatch.com/, and it seems to catch the new items without getting false positives.
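For illustration, the idea boils down to something like this (a sketch, not River4's actual code; the item field names are assumptions about how your feed parser shapes an item):

    // Prefer the feed's own <guid>; otherwise hash fields that are stable per item.
    const crypto = require('crypto');

    function getItemGuid(item) {
      if (item.guid) return item.guid;
      const stable = [item.title, item.link, item.enclosureUrl].join('|');
      return crypto.createHash('md5').update(stable).digest('hex');
    }

    // An item counts as new (or changed) if its guid hasn't been seen before.
    function isNewItem(item, seenGuids) {
      const guid = getItemGuid(item);
      if (seenGuids.has(guid)) return false;
      seenGuids.add(guid);
      return true;
    }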
Hope this helps! :-)
I'm looking for some feedback on tracking user activity on an ecommerce website using the Google Analytics ecommerce capabilities.
I can't fully understand these 3 parts:
Adding an item (ecommerce:addItem): obviously, when a user adds something to the cart
Adding a transaction (ecommerce:addTransaction): that's where I'm very confused
Sending the data (ecommerce:send): that's obvious
Can those 3 events happen at different moments? In what manner?
What would be a real-world use case that would make you execute ecommerce:addTransaction and ecommerce:send at different moments?
This makes me wonder a lot, and I'd like to have some experienced feedback on it, as you can easily break your stats if something is not done well enough.
Thanks in advance
EDIT
So the main purpose here is to get stats for pending orders (you add stuff to your cart) and completed orders (you paid for the things you added).
Right now I only send it all when the order is complete, and things are working pretty well in Analytics, but I just don't know anything about the orders that were never completed.
This question came from a lack of knowledge.
The simple ecommerce plugin has nothing to do with the enhanced ecommerce plugin.
You won't track that much with the first one, except checkouts: a plain, one-order-at-a-time revenue value.
If you want deep insight into your users' behavior (and when I say deep, I mean it), you have to go for the second one.
We could debate the uselessness of the first one, and the fact that its very existence alongside the second is completely misleading; as usual with Google, when you first get in you are flooded by endless documentation.
ecommerce:addItem does not add items to a cart; it adds items to a transaction (with "conventional" ecommerce tracking there is no cart tracking; you'd have to use enhanced ecommerce tracking for that. Actually, your title refers to enhanced ("ec:") and your question to conventional ("ecommerce:") tracking).
So ecommerce:addTransaction starts a transaction; this is where the stuff that affects the transaction as a whole goes, like the transaction id, tax on the total purchase, or shipping costs.
Now that you have started the transaction, you can add items to it; the items are associated with it via the transaction id.
Finally, the ecommerce:send command tells Universal Analytics that the transaction should be processed on the server. "send" is a bit of a misnomer: addTransaction and addItem each queue data of their own, and each transaction and each item goes out as a separate request to the tracking server (each counting towards your hit quota).
The reason for this is, as far as I can tell, that the information is transmitted via URL parameters (you call the Google Analytics endpoint, which returns a transparent pixel), and the maximum length of a URL is limited (actual limits depend on the browser and browser version).
So the transaction is broken up into multiple parts not because you want to execute the commands at different moments, but so it can be transmitted via URL parameters without being truncated. The send command merely says that you have finished adding parts to the transaction and the data can now be processed.
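Put together, the conventional (analytics.js) call sequence looks like this; the field values are illustrative:

    // Conventional ecommerce tracking: one transaction, one item, then send.
    ga('require', 'ecommerce');          // load the ecommerce plugin

    ga('ecommerce:addTransaction', {
      id: '1234',                        // transaction id ties the items to the transaction
      revenue: '34.98',                  // grand total, incl. tax and shipping
      shipping: '5.00',
      tax: '2.99'
    });

    ga('ecommerce:addItem', {
      id: '1234',                        // same transaction id
      name: 'T-Shirt',
      sku: 'TS-001',
      price: '26.99',
      quantity: '1'
    });

    ga('ecommerce:send');                // transaction complete, process it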