Determine which marketplace a Sponsored Product belongs to in the Amazon Advertising API

Currently I am pulling Sponsored Products reports and saving them in a database so the results displayed to the user can be modified. The frontend app has filtering by marketplace. How do you determine which marketplace a Sponsored Product belongs to? The Advertising API has no option to pass a marketplace id as a parameter. Here is an example result from the Advertising API:
{
  "sku": "DT-ALF3-XXX",
  "adId": 11308014834513628,
  "asin": "B07V3ZBJSXX",
  "cost": 0,
  "clicks": 0,
  "currency": "EUR",
  "adGroupId": 988845154306152,
  "campaignId": 253559015894968,
  "adGroupName": "halterung",
  "impressions": 0,
  "campaignName": "29.08.2019 - Breit",
  "attributedSales1d": 0,
  "attributedSales7d": 0,
  "attributedSales14d": 0,
  "attributedSales30d": 0,
  "attributedConversions1d": 0,
  "attributedConversions7d": 0,
  "attributedConversions14d": 0,
  "attributedConversions30d": 0,
  "attributedSales1dSameSKU": 0,
  "attributedSales7dSameSKU": 0,
  "attributedUnitsOrdered1d": 0,
  "attributedUnitsOrdered7d": 0,
  "attributedSales14dSameSKU": 0,
  "attributedSales30dSameSKU": 0,
  "attributedUnitsOrdered14d": 0,
  "attributedUnitsOrdered30d": 0,
  "attributedConversions1dSameSKU": 0,
  "attributedConversions7dSameSKU": 0,
  "attributedConversions14dSameSKU": 0,
  "attributedConversions30dSameSKU": 0,
  "attributedUnitsOrdered1dSameSKU": 0,
  "attributedUnitsOrdered7dSameSKU": 0,
  "attributedUnitsOrdered14dSameSKU": 0,
  "attributedUnitsOrdered30dSameSKU": 0
}

Found the solution. Since the Amazon Advertising API requires a specific profile id in every request (the profile id is the id exclusive to a customer who gives a third party permission to fetch data from the Amazon Advertising API on their behalf), I just need to map each profile id to its marketplace by pulling the /v2/profiles endpoint of the Amazon Advertising API, as sketched below.
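A minimal sketch of that mapping in TypeScript, assuming fetch and a valid access token are already available; the regional host, header names, and response fields (profileId, countryCode, accountInfo.marketplaceStringId) are assumptions based on the documented v2 profiles response and should be verified against your own payload:

// Sketch: build a profileId -> marketplace lookup from /v2/profiles.
interface AdsProfile {
  profileId: number;
  countryCode: string;                              // e.g. "DE"
  currencyCode: string;                             // e.g. "EUR"
  accountInfo?: { marketplaceStringId?: string };
}

async function buildMarketplaceLookup(
  accessToken: string,
  clientId: string
): Promise<Map<number, string>> {
  const res = await fetch("https://advertising-api.amazon.com/v2/profiles", {
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Amazon-Advertising-API-ClientId": clientId,
    },
  });
  if (!res.ok) throw new Error(`profiles request failed: ${res.status}`);
  const profiles: AdsProfile[] = await res.json();

  // Every report was pulled under a specific profile id, so tagging the stored
  // report rows via this lookup provides the marketplace filter for the frontend.
  const lookup = new Map<number, string>();
  for (const p of profiles) {
    lookup.set(p.profileId, p.accountInfo?.marketplaceStringId ?? p.countryCode);
  }
  return lookup;
}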

Related

Why do I have two different product ids on the same product in my data Layer?

I have a problem with product id values in my data layer. When I checked the purchase data layer, I could see an ecommerce section that looks like the one below:
ecommerce: {
  currencyCode: "EUR",
  purchase: {
    actionField: {
      id: "394299",
      affiliation: "",
      revenue: 0,
      tax: 0,
      shipping: 0,
      coupon: "test999"
    },
    products: [
      {
        id: 296,
        name: "Test Name",
        sku: "STR99",
        category: "Housewarming",
        price: 0,
        stocklevel: null,
        quantity: 1
      },
      {
        id: 393,
        name: "Test Name2",
        sku: "MN61",
        category: "Wedding",
        price: 0,
        stocklevel: null,
        quantity: 1
      }
    ]
  }
}
So I see that Test Name's id equals 296 and Test Name2's id equals 393. But when I am on a product page, for example the Test Name product page, the value of the ecomm_prodid key in the data layer equals 484.
I compared it to the ID I could find in the WooCommerce admin panel, and the ID from the ecomm_prodid key looks valid. So the question is why I am seeing a different value in the purchase data layer. On the purchase page I don't have both a valid and an invalid id; I only have the invalid one. The data layer is implemented by WordPress plugins.
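A small, hypothetical debugging sketch (TypeScript, browser context) for dumping both id sources so they can be compared side by side; the key names are taken from the snippets above:

// Assumes window.dataLayer is the Google Tag Manager data layer shown above.
declare const dataLayer: Array<Record<string, any>>;

function collectProductIds(): void {
  for (const entry of dataLayer) {
    // Product-page id (the one that matches the WooCommerce admin panel).
    if (entry.ecomm_prodid !== undefined) {
      console.log("ecomm_prodid:", entry.ecomm_prodid);
    }
    // Purchase product ids (the ones that look wrong, e.g. 296 and 393).
    const products = entry.ecommerce?.purchase?.products ?? [];
    for (const p of products) {
      console.log("purchase product:", p.id, p.sku, p.name);
    }
  }
}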

HERE Maps Routing: calculate matrix response shows failed status for some start/destination combinations in a few countries

I am making a REST call to HERE Routing's calculate matrix endpoint with multiple starts and destinations, but I am getting a proper response only for the direct one-to-one start/destination pairs (the principal diagonal values) and status: failed for the other combinations. I am facing the issue only in a few countries (here it is India), while the samples on the website (Europe) work.
REST GET call: https://matrix.route.ls.hereapi.com/routing/7.2/calculatematrix.json?apiKey=<API_KEY>&mode=balanced;car;traffic:disabled&summaryAttributes=distance,traveltime&start0=17.251160,78.437737&destination0=16.506174,80.648018&start1=13.069166,80.191391&destination1=12.971599,77.594566
Response:
{
  "response": {
    "metaInfo": {
      "timestamp": "2020-02-04T12:36:09Z",
      "mapVersion": "8.30.105.150",
      "moduleVersion": "7.2.202005-6333",
      "interfaceVersion": "2.6.75",
      "availableMapVersion": [
        "8.30.105.150"
      ]
    },
    "matrixEntry": [
      {
        "startIndex": 0,
        "destinationIndex": 0,
        "summary": {
          "distance": 286827,
          "travelTime": 24236,
          "costFactor": 24029
        }
      },
      {
        "startIndex": 0,
        "destinationIndex": 1,
        "status": "failed"
      },
      {
        "startIndex": 1,
        "destinationIndex": 0,
        "status": "failed"
      },
      {
        "startIndex": 1,
        "destinationIndex": 1,
        "summary": {
          "distance": 339029,
          "travelTime": 26924,
          "costFactor": 26845
        }
      }
    ]
  }
}
The reason behind the observed behaviour is that the road network in India is quite dense, and in some areas the algorithm is not able to find an optimal route within a reasonable time limit.
We suggest trying out our Large Scale Matrix Service. It supports two use cases:
Matrix Routing calculations with live traffic information for matrices up to 10000x10000 in size, in a limited-size region (up to 400 km in diameter).
Matrix Routing calculations without live traffic information for matrices up to 10000x10000 in size, without region limitations, for fixed sets of parameters (profiles).
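As a side note, the failed pairs in the current response can at least be identified and retried separately. A minimal TypeScript sketch; the field names mirror the JSON shown in the question:

interface MatrixEntry {
  startIndex: number;
  destinationIndex: number;
  status?: string;                            // "failed" when no route was found
  summary?: { distance: number; travelTime: number; costFactor: number };
}

function splitMatrixEntries(body: {
  response: { matrixEntry: MatrixEntry[] };
}): { ok: MatrixEntry[]; failed: MatrixEntry[] } {
  const ok: MatrixEntry[] = [];
  const failed: MatrixEntry[] = [];
  for (const entry of body.response.matrixEntry) {
    (entry.status === "failed" ? failed : ok).push(entry);
  }
  return { ok, failed };
}

// For the response above, failed would contain the (0,1) and (1,0) pairs.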

Why does LinkedIn's organizationalEntityShareStatistics endpoint return -1 for values in certain requests?

I am using the organizationalEntityShareStatistics endpoint but noticed that certain requests will return -1 for a value. Is this another way of indicating null? For example, the following request for my personal organization returns a shareCount of -1.
https://api.linkedin.com/v2/organizationalEntityShareStatistics?q=organizationalEntity&organizationalEntity=urn%3Ali%3Aorganization%3A35526437
{
  "elements": [
    {
      "totalShareStatistics": {
        "shareCount": -1,
        "uniqueImpressionsCount": 434,
        "clickCount": 25,
        "engagement": 0.029905178701677606,
        "shareMentionsCount": 0,
        "likeCount": 10,
        "impressionCount": 1371,
        "commentMentionsCount": 0,
        "commentCount": 7
      },
      "organizationalEntity": "urn:li:organization:35526437"
    }
  ],
  "paging": {
    "count": 10,
    "start": 0,
    "links": []
  }
}
I have also noticed this happening when querying share statistics for a specific share but I cannot provide that specific request because it is client data.
So to reference their docs for the likeCount:
This field can become negative when members who liked a sponsored share later unlike it. The like is not counted since it is not organic, but the unlike is counted as organic.
...so I would assume this also applies to shareCount and who knows what other fields as well.
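A minimal sketch of how such negative counters could be normalized before being stored or displayed (TypeScript; field names follow the response above, and mapping negatives to null is just one possible policy):

interface TotalShareStatistics {
  shareCount: number;
  uniqueImpressionsCount: number;
  clickCount: number;
  engagement: number;
  shareMentionsCount: number;
  likeCount: number;
  impressionCount: number;
  commentMentionsCount: number;
  commentCount: number;
}

function sanitizeCounts(
  stats: TotalShareStatistics
): Record<string, number | null> {
  const out: Record<string, number | null> = {};
  for (const [key, value] of Object.entries(stats)) {
    // Negative counts (e.g. shareCount: -1) can result from sponsored
    // interactions being retracted later; surface them as "no reliable value".
    out[key] = value < 0 ? null : value;
  }
  return out;
}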

How can I handle a lot of data with timestamps in ArangoDB?

I am new to handling a lot of data.
Every 100 ms I currently write 4 JSON blocks to a collection in my ArangoDB.
The content of the JSON is something like this:
{
  "maintenence": {
    "holder_1": 1,
    "holder_2": 0,
    "holder_3": 0,
    "holder_4": 0,
    "holder_5": 0,
    "holder_6": 0
  },
  "error": 274,
  "pos": {
    "left": [
      21.45, // changing every 100ms
      38.36, // changing every 100ms
      10.53  // changing every 100ms
    ],
    "center": [
      0.25, // changing every 100ms
      0,    // changing every 100ms
      2.42  // changing every 100ms
    ],
    "right": [
      0, // changing every 100ms
      0, // changing every 100ms
      0  // changing every 100ms
    ]
  },
  "sub": [
    {
      "type": 23,
      "name": "plate 01",
      "sensors": [
        {
          "type": 45,
          "name": "sensor 01",
          "state": {
            "open": 1,
            "close": 0,
            "middle": 0
          }
        },
        {
          "type": 34,
          "name": "sensor 02",
          "state": {
            "on": 1
          }
        }
      ]
    }
  ],
  "timestamp": "2018-02-18 01:56:08.423",
  "device": "12227225"
}
Every block is another device.
In only 2 days there are ~6 million datasets in the collection.
If I want to get the data to draw a line graph from "device 1 position left[0]"
with:
FOR d IN device
  FILTER d.timestamp >= "2018-02-18 04:30:00.000" && d.timestamp <= "2018-02-18 04:35:00.000"
  RETURN d.pos.left[0]
It takes a very long time to search through these ~6 million datasets.
My question is: is this normal, and can only more machine power fix this problem, or is my way of handling this set of data wrong?
I think ~6 million datasets is not big data, but if I already fail with this, how can I handle it when I add 50 more devices and collect data not for 2 days but for 30 days?
Converting the timestamps to unix timestamps (numbers) helps a lot.
I added a skiplist index over timestamp & device.
Now, with 13 million datasets, my query runs in 920 ms.
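Roughly, those two changes could look like this with arangojs (a sketch: the collection and attribute names mirror the documents above, the timestamp is assumed to already be stored as a number, and the exact index options depend on the arangojs/ArangoDB version, where "skiplist" is treated as an alias of a persistent index in newer releases):

import { Database, aql } from "arangojs";

const db = new Database({ url: "http://localhost:8529" });

async function queryLeftPosition(device: string, fromTs: number, toTs: number) {
  // One-time setup: index the attributes used in the FILTER below.
  await db.collection("device").ensureIndex({
    type: "skiplist",
    fields: ["device", "timestamp"],
  });

  // Range query on numeric (unix) timestamps for a single device.
  const cursor = await db.query(aql`
    FOR d IN device
      FILTER d.device == ${device}
        AND d.timestamp >= ${fromTs}
        AND d.timestamp <= ${toTs}
      RETURN d.pos.left[0]
  `);
  return cursor.all();
}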
Thank you!

Polar H7 unresolved service and characteristics (what are they for?)

I am developing an app that uses a Polar H7 heart rate monitor.
I initially had some trouble discovering its services because I would get a large object and the hex codes didn't mean anything to me.
Now I have resolved almost all services and characteristics, but there are some I haven't figured out yet.
Here is a simplified object containing the service/characteristic UUIDs and their name/functionality:
// first layer keys are serviceUuid's
// second layer keys are characteristicUuid's
// with their respective name/description as values
{
  "1800" /* Generic Access */ : {
    "2a00": "Device Name",
    "2a01": "Appearance",
    "2a02": "Peripheral Privacy Flag",
    "2a03": "Reconnection Address",
    "2a04": "Peripheral Preferred Connection Parameters"
  },
  "1801" /* Generic Attribute */ : {
    "2a05": "Service Changed"
  },
  "180d" /* Heart Rate */ : {
    "2a37": "Heart Rate Measurement",
    "2a38": "Body Sensor Location"
  },
  "180a" /* Device Information */ : {
    "2a23": "System ID",
    "2a24": "Model Number String",
    "2a25": "Serial Number String",
    "2a26": "Firmware Revision String",
    "2a27": "Hardware Revision String",
    "2a28": "Software Revision String",
    "2a29": "Manufacturer Name String"
  },
  "180f" /* Battery Service */ : {
    "2a19": "Battery Level"
  },
  "6217ff4b-fb31-1140-ad5a-a45545d7ecf3" /* unknown */ : {
    "6217ff4c-c8ec-b1fb-1380-3ad986708e2d": "unknown", /* read:true */ // value = uInt16Array [3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
    "6217ff4d-91bb-91d0-7e2a-7cd3bda8a1f3": "unknown"  /* write:true, indicate:true, descriptors:{ descriptorUuid: "2902" } */
  }
}
I couldn't find any documentation for the last serviceUuid and its characteristicUuids.
I don't know what I'm missing, so I can't tell whether it's of any importance to my project.
The ones you are unfamiliar with are vendor-specific UUIDs. Vendors can define their own custom UUIDs.
These may or may not be of importance to your project, depending on what you want to extract from the device. If it's just the heart rate you are interested in, there should be no problem and you can follow the Bluetooth standard for it. There may be extra data in those custom UUIDs that you would like to extract.
Extracting data from those vendor-specific UUIDs is a matter of trial and error though, unless you can get specifications from the vendor themselves.
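If it's just the heart rate, a minimal Web Bluetooth sketch (TypeScript; assumes a browser context with the Web Bluetooth type definitions available, and parses the measurement per the standard Heart Rate Measurement format, where flag bit 0 selects uint8 vs uint16):

// Subscribes to 0x180d / 0x2a37 from the table above and ignores the
// vendor-specific service entirely.
async function subscribeToHeartRate(onRate: (bpm: number) => void): Promise<void> {
  const device = await navigator.bluetooth.requestDevice({
    filters: [{ services: ["heart_rate"] }],                 // 0x180d
  });
  const server = await device.gatt!.connect();
  const service = await server.getPrimaryService("heart_rate");
  const characteristic = await service.getCharacteristic("heart_rate_measurement"); // 0x2a37

  characteristic.addEventListener("characteristicvaluechanged", (event) => {
    const value = (event.target as BluetoothRemoteGATTCharacteristic).value!;
    const flags = value.getUint8(0);
    // Bit 0 of the flags byte: 0 = heart rate as uint8, 1 = heart rate as uint16.
    const bpm = flags & 0x01 ? value.getUint16(1, true) : value.getUint8(1);
    onRate(bpm);
  });
  await characteristic.startNotifications();
}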
