Adobe Audience Manager External data - adobe

At my organization we are starting to use Adobe Audience Manager. We need to read online data from the website, but also to load data from our private database. Today we do it using FTP, but it actually takes almost 3 days to load all the information before we can use it, which is a lot of time for us. I would like to know the best way, or some alternatives, to load information in a more agile and fast way, and ideally to read it in as close to real time as possible from other sources (like our database or similar).
Thanks a lot for your help.

AAM offline data can be uploaded either to an FTP location or to an AWS S3 bucket, and unfortunately both take 12 to 24 hours to load into AAM (Adobe Audience Manager); it then takes another 12 to 24 hours to load the data into your DSP (Demand-Side Platform).
Given that the only real-time-like signals in AAM (that I know of) come from the online data sources, the best way to achieve your requirement is to do the following:
Send as much information as possible through the online channel.
Build an integration between your CRM data (your database, in this case) and the online data (user behaviour data on your website).
The CRM data should contain the user details that do not change much, such as demographics (age, gender, etc.), and it should also contain the data collected via the non-online channels (e.g. retail purchases, customer service phone calls, etc.). The online data, on the other hand, should contain all the user behaviour data collected from the online channel: for example, search parameters, visited page names, purchased items, clicked links, etc.
The integration between the online and CRM data can be done by using the same user ID in both activities. The following diagram should give you a high-level view of the integration. [Diagram: simple AAM integration]
Here is an example of passing the user ID and online behaviour data to AAM:
var user_id = "<add your website user ID here>"; // e.g. user1234

// Add all your online data here
var my_object = {
  color: "blue",
  price: "900",
  page_name: "cart details"
};

// Create the DIL object
var adobe_dil = DIL.create({
  partner: "<partner name goes here>",
  declaredId: {
    dpid: "<add your online data source ID here>",
    dpuuid: user_id
  }
});

// Prefix every key in the key-value pairs with "c_" and send the data to Audience Manager
adobe_dil.api.signals(my_object, "c_").submit();
And here is an example of the offline data upload:
user1234 "age_range"="1","gender"="F","city"="LA","status"="active"
user5678 "age_range"="2","gender"="M","city"="CA","status"="inactive"
Another idea, which I haven't tried and don't really recommend, is to send all your CRM data as online transactions by calling the online API directly from your back end. It may cost you more, though, given the number of calls you would make to AAM from the back end.
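A rough sketch of what such a server-side call could look like, assuming the documented DCS /event endpoint; treat the host and parameter names as assumptions and verify them against your own AAM configuration:

```javascript
// Sketch: build an AAM Data Collection Servers (DCS) event URL so a
// back end can push a CRM record directly. The "c_" prefix mirrors
// what DIL does on the client side; partner/dpid values are placeholders.
function buildDcsEventUrl(partner, dpid, userId, traits) {
  const params = new URLSearchParams({ d_dpid: dpid, d_dpuuid: userId });
  for (const [key, value] of Object.entries(traits)) {
    params.set('c_' + key, String(value)); // trait key-value pairs
  }
  return `https://${partner}.demdex.net/event?${params.toString()}`;
}

// Example (hypothetical partner name and traits):
// const url = buildDcsEventUrl('mypartner', '12345', 'user1234',
//                              { status: 'active', city: 'LA' });
// ...then issue an HTTP GET to `url` from your server.
```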
References:
https://marketing.adobe.com/resources/help/en_US/aam/c_dil_send_page_objects.html
https://marketing.adobe.com/resources/help/en_US/aam/r_dil_create.html

Related

Find out when user followed an account - rtweet

I'm curious to see when accounts started following me on Twitter (and when I started following accounts). It'd be interesting to see my user activity related to the types of accounts I follow, as well as maps of my followers/followings over time + season.
I've tried getting followers and lookup users in the following manner:
followers <- get_followers("twitterhandlehere", n = 50)
followers_data <- lookup_users(followers$user_id)
followers_data is a data frame with user info including profile picture, bio, and when the user's account was created, but nowhere in there does it indicate when the relationship started, as far as I can tell.
Nor does this function seem to indicate the date on which the follow/following started:
lookup_friendship("BarackObama", "MyUsername")
It appears the API didn't support this functionality in the past, and I understand I can stream this data in the future - but is there any way to salvage specificity in the past data?
No, this is not available in the API. You would have to have been regularly polling the friends and followers endpoints to record those changes. You cannot discover it from the API at a specific point in time; you'd have to make the record of follower-list changes yourself.
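rtweet aside, the bookkeeping itself is language-agnostic: poll the followers endpoint on a schedule, diff each snapshot against the last one, and timestamp the changes yourself. A minimal sketch of the diff step (the ID values are placeholders):

```javascript
// Sketch: compare two follower-ID snapshots and timestamp the changes,
// since the Twitter API does not expose when a follow relationship began.
function diffFollowers(previousIds, currentIds, now) {
  const prev = new Set(previousIds);
  const curr = new Set(currentIds);
  return {
    // IDs present now but not before: new followers as of this poll
    followedAt: currentIds.filter((id) => !prev.has(id))
                          .map((id) => ({ id, since: now })),
    // IDs present before but gone now: unfollows as of this poll
    unfollowedAt: previousIds.filter((id) => !curr.has(id))
                             .map((id) => ({ id, until: now })),
  };
}

// Example:
// diffFollowers(['a', 'b'], ['b', 'c'], '2024-01-01')
// → { followedAt: [{ id: 'c', since: '2024-01-01' }],
//     unfollowedAt: [{ id: 'a', until: '2024-01-01' }] }
```

The resolution of the recorded dates is only as fine as your polling interval.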

GA 360 export to Big Query

We have just linked our GA360 account to BigQuery, but we noticed from the docs that the userId doesn't get exported. This is really annoying, as one of our main use cases was to join the userId with our CRM system.
Why is Google not exporting the userId? Is there a workaround?
Thank you for your answers.
The solution is to create a User level custom dimension and pass your user's ID into that as well.
There's no restriction on exporting your custom dimensions, and these are exported to BigQuery.
Enjoy :)
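With analytics.js (Universal Analytics) that amounts to setting the custom dimension on the tracker before sending hits. A minimal sketch, assuming you created a User-scoped custom dimension in slot 1 (dimension1) in the GA admin; use whatever index your dimension actually has:

```javascript
// Sketch: mirror the CRM user ID into a User-scoped custom dimension
// so it is included in the BigQuery export. "dimension1" is an
// assumption -- substitute the index of the dimension you created.
function customDimensionFields(userId) {
  // Send an internal pseudonymous ID, never PII such as an email address.
  return { dimension1: userId };
}

// Browser usage with analytics.js:
//   ga('set', customDimensionFields('user1234'));
//   ga('send', 'pageview');
```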
How User-ID works
User-ID enables the association of one or more sessions (and the activity within those sessions) with a unique and persistent ID that you send to Analytics.
To implement User-ID, you must be able to generate your own unique IDs, consistently assign IDs to users, and include these IDs wherever you send data to Analytics.
For example, you could send the unique IDs generated by your own authentication system to Analytics as values for User-ID. Any engagement, like link clicks and page or screen navigation, that happen while a unique ID is assigned can be sent to Analytics and connected via User-ID.
In an Analytics implementation without the User-ID feature, a unique user is counted each time your content is accessed from a different device and each time there’s a new session. For example, a search on a phone one day, purchase on a laptop three days later, and request for customer service on a tablet a month after that are counted as three unique users in a standard Analytics implementation, even if all those actions took place while a user was signed in to an account. While you can collect data about each of those interactions and devices, you can’t determine their relevance to one another. You only see independent data points.
When you implement User-ID, you can identify related actions and devices and connect these seemingly independent data points. That same search on a phone, purchase on a laptop, and re-engagement on a tablet that previously looked like three unrelated actions on unrelated devices can now be understood as one user’s interactions with your business.
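With analytics.js, passing such an ID means setting the userId field when the tracker is created. A minimal sketch; the property ID and the source of the ID are placeholders:

```javascript
// Sketch: attach a persistent User-ID (generated by your own auth
// system) to the tracker so sessions across devices can be stitched
// together by Analytics.
function userIdFields(authenticatedUserId) {
  return { userId: authenticatedUserId };
}

// Browser usage ('UA-XXXXX-Y' is a placeholder property ID):
//   ga('create', 'UA-XXXXX-Y', 'auto', userIdFields(currentUserId));
//   ga('send', 'pageview');
```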
As the Google Analytics documentation on the User-ID feature explains, the user ID is used in the background by Google Analytics to analyse your data.
If you want to analyse by user ID, you can simply add it as a custom dimension; you will then be able to see it.

How to check how many tokens have been sold for my smart contract

I want to create a smart contract and launch it for an ICO. I have also created a website where people can buy my token. I want to know how to check how many tokens have been sold (live), so I can create a live bar counter showing what percentage of the tokens has already been sold.
Or is there a way I can monitor the token sale process in the smart contract?
A token contract is no different than any other smart contract. There are no special built in Solidity features or logic associated with them. They are just regular smart contracts that follow a specification.
So, if you want access to the number of tokens sold, you code that into your contract. While tokens sold is not part of the standard ERC20/ERC721 interface, nothing prevents you from adding a constant function to retrieve this information. In fact, if you're using the basic Zeppelin Crowdsale contract, you can just calculate it from the public state variables weiRaised and rate. (Chances are you should be creating your own Crowdsale subcontract anyway, so it's better to add the functionality you want there.)
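A sketch of reading those public state variables off-chain with web3.js. Note that whether the conversion is a multiplication or a division depends on how your contract defines rate (token units per wei vs. wei per token), so check your own contract; the helper below assumes token units per wei, and the ABI/address names are placeholders:

```javascript
// Sketch: derive tokens sold from a Zeppelin-style Crowdsale's public
// state. Assumes `rate` means token units per wei -- verify against
// your contract before trusting the number.
function tokensSold(weiRaised, rate) {
  // BigInt avoids precision loss on large wei amounts.
  return BigInt(weiRaised) * BigInt(rate);
}

// Reading the variables with web3.js (placeholders for ABI/address):
//   const contract = new web3.eth.Contract(crowdsaleAbi, crowdsaleAddress);
//   const weiRaised = await contract.methods.weiRaised().call();
//   const rate = await contract.methods.rate().call();
//   const sold = tokensSold(weiRaised, rate);
```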
We can use the Etherscan Developer API to review transactions against a given contract address and find out the total supply or number of items available for sale.
There is a lot you can do with the Etherscan Developer API. For example, here's one URL that pulls data from Ethereum Mainnet -> Etherscan -> JSON parser -> Shields.io and renders it as an image to calculate the number of Su Squares remaining for sale:
Source: https://img.shields.io/badge/dynamic/json.svg?label=Su+Squares+available&url=https%3A%2F%2Fapi.etherscan.io%2Fapi%3Fmodule%3Daccount%26action%3Dtokenbalance%26contractaddress%3D0xE9e3F9cfc1A64DFca53614a0182CFAD56c10624F%26address%3D0xE9e3F9cfc1A64DFca53614a0182CFAD56c10624F%26tag%3Dlatest%26apikey%3DYourApiKeyToken&query=%24.result
^ I don't know whether Stack Overflow will cache the image here, but that URL is live and pulls the number of Su Squares available hot off the blockchain.
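The same endpoint can be called programmatically for a "percent sold" bar. A sketch (Node 18+ for global fetch; the addresses, total supply, and API key are placeholders, and the parameters mirror the tokenbalance call in the URL above):

```javascript
// Sketch: build an Etherscan tokenbalance query and turn the sale
// contract's remaining balance into a percent-sold figure.
function etherscanTokenBalanceUrl(contractAddress, holderAddress, apiKey) {
  const params = new URLSearchParams({
    module: 'account',
    action: 'tokenbalance',
    contractaddress: contractAddress,
    address: holderAddress,
    tag: 'latest',
    apikey: apiKey,
  });
  return `https://api.etherscan.io/api?${params.toString()}`;
}

function percentSold(totalSupply, unsoldBalance) {
  // Tokens still held by the sale contract are treated as unsold.
  return 100 * (totalSupply - unsoldBalance) / totalSupply;
}

// Usage (hypothetical addresses):
//   const res = await fetch(etherscanTokenBalanceUrl(saleAddr, saleAddr, key));
//   const { result } = await res.json();
//   const pct = percentSold(10000, Number(result));
```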

Google+ userId into google analytics for attribution modeling

Due to the nature of my website, getting a proper attribution model is very complex; cross-device tracking is complex but necessary.
I would like to know if it is somehow possible to obtain a user's Google+ userId whenever he visits my site.
I know there is a feature called User-ID where I need to generate my own ID and track it upon a customer's visit, but due to the nature of my website there is very little probability that users will log in prior to converting (a fact which would make every cross-device interaction untraceable).
You cannot use a Google ID as the User ID in Google Analytics. The User ID must be private and non-PII, i.e. Google must have no way to determine who that ID belongs to. Obviously, if it's a Google-based ID, they would be able to tell pretty easily.
The relevant bit from the TOS is paragraph 7:
You will not (and will not allow any third party to) use the Service to track, collect or upload any data that personally identifies an individual (such as a name, email address or billing information), or other data which can be reasonably linked to such information by Google.

A single google analytics account for many users - possible?

Is it possible to use a single Google Analytics account, in particular for e-commerce, for more than one user? In fact, I need it to be used by a lot of users. What I want, in a nutshell, is this:
The users come to my website and provide me their e-commerce data in JSON or some other format. I have a Google Analytics account, so I take that e-commerce data and send it to Google Analytics, and then show them the reports for their data from Google Analytics via the Google Analytics API (I guess it's the Reporting API?).
The question is not whether or not it is profitable, makes sense, etc. The question is: can I use my single Google Analytics account to achieve what I've described above?
Yes, you can. Since you need to keep the users apart in a way that does not allow them to look into other users' data, you can use a single account for up to 50 users, since this is how many data views you can have per account (view permissions can be set at account level).[1] Filter each view by hostname (or whatever) to record only the current user's data per view.
If you do not need the interface (i.e. if you want to query GA via the api and build custom dashboards) you can have even more - simply store in unique id per user and use that to filter the data before displaying it in a dashboard. So as far as that part of the question is concerned you are safe.
Where things probably start to fall apart is data collection. It looks like you want to do some sort of batch processing of accumulated e-commerce data. Since you cannot send a timestamp for a user interaction, all dates within GA will be off. Plus, you have data limits (I'm thinking of the maximum interactions per minute that you can send), so your insertion process might not be very efficient. It would probably be better to create something on top of the Measurement Protocol that allows your clients to send data in real time.
[1] To make this a little clearer: you can set up 50 entities with different access permissions. Of course, every view can have as many users as you like, but they will all see the same data.
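A sketch of such a Measurement Protocol hit for an e-commerce transaction. The endpoint and parameter names follow the public Measurement Protocol (v1); the property ID, client ID, and transaction values are placeholders:

```javascript
// Sketch: build a Measurement Protocol v1 transaction hit so clients
// can push e-commerce data to a GA property in near real time.
function buildTransactionHit(trackingId, clientId, txId, revenue) {
  const params = new URLSearchParams({
    v: '1',              // protocol version
    tid: trackingId,     // GA property, e.g. 'UA-XXXXX-Y' (placeholder)
    cid: clientId,       // anonymous client ID
    t: 'transaction',    // hit type
    ti: txId,            // transaction ID
    tr: String(revenue), // transaction revenue
  });
  return params.toString();
}

// Usage (Node 18+): POST the payload to the collect endpoint.
//   await fetch('https://www.google-analytics.com/collect', {
//     method: 'POST',
//     body: buildTransactionHit('UA-XXXXX-Y', 'client-1', 'T-1001', 49.99),
//   });
```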
