Google Ad Manager COLUMNS_NOT_SUPPORTED_FOR_REQUESTED_DIMENSIONS ReportError

I am trying to fetch a report of line items. It works fine from the UI but fails with an error via the API. The following is the reportQuery:
{'reportQuery': {
    'dimensions': [
        'DATE',
        'LINE_ITEM_NAME',
        'LINE_ITEM_TYPE',
        'CREATIVE_SIZE_DELIVERED'
    ],
    'adUnitView': 'TOP_LEVEL',
    'columns': [
        'TOTAL_LINE_ITEM_LEVEL_IMPRESSIONS',
        'TOTAL_LINE_ITEM_LEVEL_CLICKS',
        'TOTAL_LINE_ITEM_LEVEL_ALL_REVENUE'
    ],
    'dimensionAttributes': [
        'LINE_ITEM_FREQUENCY_CAP',
        'LINE_ITEM_START_DATE_TIME',
        'LINE_ITEM_END_DATE_TIME',
        'LINE_ITEM_COST_TYPE',
        'LINE_ITEM_COST_PER_UNIT',
        'LINE_ITEM_SPONSORSHIP_GOAL_PERCENTAGE',
        'LINE_ITEM_LIFETIME_IMPRESSIONS'
    ],
    'customFieldIds': [],
    'contentMetadataKeyHierarchyCustomTargetingKeyIds': [],
    'startDate': {'year': 2018, 'month': 1, 'day': 1},
    'endDate': {'year': 2018, 'month': 1, 'day': 2},
    'dateRangeType': 'CUSTOM_DATE',
    'statement': None,
    'includeZeroSalesRows': False,
    'adxReportCurrency': None,
    'timeZoneType': 'PUBLISHER'
}}
The above query throws the following error when run via the API.
Error summary: {'faultMessage': "[ReportError.COLUMNS_NOT_SUPPORTED_FOR_REQUESTED_DIMENSIONS # columns; trigger:'TOTAL_LINE_ITEM_LEVEL_ALL_REVENUE']", 'requestId': 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx', 'responseTime': '98', 'serviceName': 'ReportService', 'methodName': 'runReportJob'}
400 Syntax error: Expected ")" or "," but got identifier "TOTAL_LINE_ITEM_LEVEL_ALL_REVENUE" at [1:354]
Did I miss anything? Any ideas about this issue?
Thanks!

This issue was solved by adding the dimension "Native ad format name".
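For illustration, a minimal sketch of the fix as applied to the query from the question. The enum name NATIVE_AD_FORMAT_NAME is assumed from the UI label "Native ad format name"; verify it against the Dimension enum of your Ad Manager API version.

```python
# The reportQuery from the question, reduced to the parts that matter here.
report_query = {
    'dimensions': [
        'DATE',
        'LINE_ITEM_NAME',
        'LINE_ITEM_TYPE',
        'CREATIVE_SIZE_DELIVERED',
    ],
    'columns': [
        'TOTAL_LINE_ITEM_LEVEL_IMPRESSIONS',
        'TOTAL_LINE_ITEM_LEVEL_CLICKS',
        'TOTAL_LINE_ITEM_LEVEL_ALL_REVENUE',
    ],
}

# The added dimension that unblocks TOTAL_LINE_ITEM_LEVEL_ALL_REVENUE
# (enum name assumed from the UI label "Native ad format name"):
report_query['dimensions'].append('NATIVE_AD_FORMAT_NAME')
print(report_query['dimensions'])
```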


jq to filter only the value of id

The following is the JSON data; I need to get only the value of the id key:
{apps:[ {
"id": "/application1/4b693882-ffba-4c93-a0f2-cccafcb4d7dd",
"cmd": null,
"args": null,
"user": null,
"env": {},
"constraints": [
[
"hostname",
"GROUP_BY",
"5"
]
},
{
"id": "/application2/4b693882-ffba-4c93-a0f2-cccafcb4d7dd",
"cmd": null,
"args": null,
"user": null,
"env": {},
"constraints": [
[
"hostname",
"GROUP_BY",
"5"
]
]},
The expected output is:
/application1/4b693882-ffba-4c93-a0f2-cccafcb4d7dd
/application2/4b693882-ffba-4c93-a0f2-cccafcb4d7dd
Thanks in advance
After fixing the errors in your JSON, we can use the following jq filter to get the desired output:
.apps[] | .id
JqPlay Demo
Result of jq -r '.apps[] | .id':
/application1/4b693882-ffba-4c93-a0f2-cccafcb4d7dd
/application2/4b693882-ffba-4c93-a0f2-cccafcb4d7dd
You can use map() to create an array from the properties of the objects. Try this:
let data = {apps:[{"id":"/application1/4b693882-ffba-4c93-a0f2-cccafcb4d7dd","cmd":null,"args":null,"user":null,"env":{},"constraints":["hostname","GROUP_BY","5"]},{"id":"/application2/4b693882-ffba-4c93-a0f2-cccafcb4d7dd","cmd":null,"args":null,"user":null,"env":{},"constraints":["hostname","GROUP_BY","5"]}]}
let ids = data.apps.map(o => o.id);
console.log(ids);
Note that I corrected the invalid brace/bracket combinations in the data structure you posted in the question. I assume this is just a typo in that example, otherwise there would be parsing errors in the console.
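For comparison, the same extraction can be sketched in plain Python, using a corrected (valid) version of the JSON from the question:

```python
import json

# Corrected version of the question's data (braces/brackets fixed).
data_text = """
{"apps": [
  {"id": "/application1/4b693882-ffba-4c93-a0f2-cccafcb4d7dd",
   "cmd": null, "args": null, "user": null, "env": {},
   "constraints": [["hostname", "GROUP_BY", "5"]]},
  {"id": "/application2/4b693882-ffba-4c93-a0f2-cccafcb4d7dd",
   "cmd": null, "args": null, "user": null, "env": {},
   "constraints": [["hostname", "GROUP_BY", "5"]]}
]}
"""

data = json.loads(data_text)
ids = [app["id"] for app in data["apps"]]  # same idea as jq: .apps[] | .id
for app_id in ids:
    print(app_id)
```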

BLE telemetry data via thingsboard-gateway is not updating in ThingsBoard

I have looked everywhere for a solution, but I'm stuck.
My setup is the following:
An ESP32 uses a BLE GATT NOTIFICATION characteristic to push temperature data via thingsboard-gateway into ThingsBoard. Once the BLE connection is established, the first telemetry package is shown in the freshly created device's 'latest telemetry' area. If I turn on gateway debugging, I can see further notifications reaching ThingsBoard, like this:
{"LOGS":"2020-07-20 02:04:19,640 - DEBUG - [ble_connector.py] - ble_connector - 321 - Notification received from device {'device_config': {'name': 'Esp32 v2.2', 'MACAddress': '24:62:AB:F3:43:72', 'telemetry': [{'key': 'temperature', 'method': 'notify', 'characteristicUUID': '0972EF8C-7613-4075-AD52-756F33D4DA91', 'byteFrom': 0, 'byteTo': -1}], 'attributes': [{'key': 'name', 'characteristicUUID': '00002A00-0000-1000-8000-00805F9B34FB', 'method': 'read', 'byteFrom': 0, 'byteTo': -1}], 'attributeUpdates': [{'attributeOnThingsBoard': 'sharedName', 'characteristicUUID': '00002A00-0000-1000-8000-00805F9B34FB'}], 'serverSideRpc': [{'methodRPC': 'rpcMethod1', 'withResponse': True, 'characteristicUUID': '00002A00-0000-1000-8000-00805F9B34FB', 'methodProcessing': 'read'}]}, 'interest_uuid': {'00002A00-0000-1000-8000-00805F9B34FB': [{'section_config': {'key': 'name', 'characteristicUUID': '00002A00-0000-1000-8000-00805F9B34FB', 'method': 'read', 'byteFrom': 0, 'byteTo': -1}, 'type': 'attributes', 'converter': <thingsboard_gateway.connectors.ble.bytes_ble_uplink_converter.BytesBLEUplinkConverter object at 0xb4427eb0>}], '0972EF8C-7613-4075-AD52-756F33D4DA91': [{'section_config': {'key': 'temperature', 'method': 'notify', 'characteristicUUID': '0972EF8C-7613-4075-AD52-756F33D4DA91', 'byteFrom': 0, 'byteTo': -1}, 'type': 'telemetry', 'converter': <thingsboard_gateway.connectors.ble.bytes_ble_uplink_converter.BytesBLEUplinkConverter object at 0xb4427eb0>}]}, 'scanned_device': <bluepy.btle.ScanEntry object at 0xb443a290>, 'is_new_device': False, 'peripheral': <bluepy.btle.Peripheral object at 0xb58f0070>, 'services': {'00001801-0000-1000-8000-00805F9B34FB': {'00002A05-0000-1000-8000-00805F9B34FB': {'characteristic': <bluepy.btle.Characteristic object at 0xb443a210>, 'handle': 2}}, '00001800-0000-1000-8000-00805F9B34FB': {'00002A00-0000-1000-8000-00805F9B34FB': {'characteristic': <bluepy.btle.Characteristic object at 0xb443a270>, 'handle': 21}, 
'00002A01-0000-1000-8000-00805F9B34FB': {'characteristic': <bluepy.btle.Characteristic object at 0xb443a1d0>, 'handle': 23}, '00002AA6-0000-1000-8000-00805F9B34FB': {'characteristic': <bluepy.btle.Characteristic object at 0xb443a2b0>, 'handle': 25}}, 'AB0828B1-198E-4351-B779-901FA0E0371E': {'0972EF8C-7613-4075-AD52-756F33D4DA91': {'characteristic': <bluepy.btle.Characteristic object at 0xb443a6b0>, 'handle': 41}, '4AC8A682-9736-4E5D-932B-E9B31405049C': {'characteristic': <bluepy.btle.Characteristic object at 0xb443a5f0>, 'handle': 44}}}} handle: 42, data: b'25.00'"}
The value I would like to see updated is the string '25.00'.
I know I could update ThingsBoard directly, but it is the use of BLE that I'm interested in, because I like that the sensors are network agnostic.
My question is why the updated temperature, even though it reaches ThingsBoard, won't show up, and what I can change to make it happen.
Any kind of help is much appreciated. I've been wrestling with this the entire weekend.
Adding more clarifications:
The ESP32 code that generates the BLE notifications: https://pastebin.com/NqMfxsK6. The BLE connector configuration:
{
  "name": "BLE Connector",
  "rescanIntervalSeconds": 100,
  "checkIntervalSeconds": 10,
  "scanTimeSeconds": 5,
  "passiveScanMode": true,
  "devices": [
    {
      "name": "Temperature and humidity sensor",
      "MACAddress": "24:62:AB:F3:43:72",
      "telemetry": [
        {
          "key": "temperature",
          "method": "notify",
          "characteristicUUID": "0972EF8C-7613-4075-AD52-756F33D4DA91",
          "byteFrom": 0,
          "byteTo": -1
        }
      ],
      "attributes": [
        {
          "key": "name",
          "characteristicUUID": "00002A00-0000-1000-8000-00805F9B34FB",
          "method": "read",
          "byteFrom": 0,
          "byteTo": -1
        }
      ],
      "attributeUpdates": [
        {
          "attributeOnThingsBoard": "sharedName",
          "characteristicUUID": "00002A00-0000-1000-8000-00805F9B34FB"
        }
      ]
    }
  ]
}
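As a sanity check on the payload itself, here is a small sketch of how a byteFrom/byteTo slice over the notification data from the log (b'25.00', handle 42) could be decoded. The interpretation of byteTo: -1 as "through the end of the payload" is an assumption for illustration, not taken from the gateway source.

```python
# Raw notification payload from the gateway debug log (handle 42).
payload = b"25.00"

# Values from the telemetry section of the BLE connector config.
byte_from, byte_to = 0, -1

# Assumption: byteTo == -1 means "slice through the end of the payload".
sliced = payload[byte_from:] if byte_to == -1 else payload[byte_from:byte_to]
temperature = float(sliced.decode("ascii"))
print(temperature)
```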

Packer-created manifest.json is not valid JSON

I am using Packer to create a base AMI, with a post-processor that creates a manifest.json file.
How can I make this JSON valid?
{
  "builds": [
    {
      "name": "amazon-ebs",
      "builder_type": "amazon-ebs",
      "build_time": 1589466697,
      "files": null,
      "artifact_id": "eu-west-1:ami-04d3331ac647e751b",
      "packer_run_uuid": "add4c072-7ac2-f5e9-b941-6b80003c03ec",
      "custom_data": {
        "my_custom_data": "example"
      }
    }
  ],
  "last_run_uuid": "add4c072-7ac2-f5e9-b941-6b80003c03ec"
2020-05-14T14:31:37.246153577Z stdout P }
Error: Parse error on line 13:
...b941-6b80003c03ec" 2020 - 05 - 14 T14:
----------------------^
Expecting 'EOF', '}', ':', ',', ']', got 'NUMBER'
My eventual goal is to save the artifact_id to a variable using bash.
Thank you for the help,
In order to make it valid JSON, I had to add this post-processor configuration to my Packer template.json:
"post-processors": [
  {
    "type": "manifest",
    "output": "manifest.json",
    "strip_path": true,
    "strip_time": true
  }
]
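Once the manifest parses, the artifact_id can be pulled out with a few lines of Python (a sketch; in bash, `jq -r '.builds[-1].artifact_id' manifest.json` would do the same):

```python
import json

# Minimal manifest shaped like the one in the question (assumed valid JSON).
manifest_text = """
{
  "builds": [
    {
      "name": "amazon-ebs",
      "artifact_id": "eu-west-1:ami-04d3331ac647e751b"
    }
  ],
  "last_run_uuid": "add4c072-7ac2-f5e9-b941-6b80003c03ec"
}
"""

manifest = json.loads(manifest_text)
artifact = manifest["builds"][-1]["artifact_id"]  # most recent build
region, ami_id = artifact.split(":", 1)           # "region:ami-id" format
print(ami_id)
```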

Google Ad Manager historical report COLUMNS_NOT_SUPPORTED_FOR_REQUESTED_DIMENSIONS

I have a Python script to download a Google Ad Manager historical report through the API.
I am using the following report job:
{
    'reportQuery': {
        'dimensions': [
            'DATE',
            'MOBILE_APP_NAME',
            'ADVERTISER_NAME',
            'ADVERTISER_ID',
            'AD_UNIT_ID',
            'AD_UNIT_NAME',
            'LINE_ITEM_NAME',
            'LINE_ITEM_ID',
            'LINE_ITEM_TYPE',
            'CREATIVE_TYPE',
            'CREATIVE_NAME',
            'CREATIVE_ID',
            'MOBILE_DEVICE_NAME',
            'PLACEMENT_NAME',
            'PLACEMENT_ID'
        ],
        'dimensionAttributes': [
            'AD_UNIT_CODE',
            'LINE_ITEM_PRIORITY',
            'CREATIVE_CLICK_THROUGH_URL',
            'ADVERTISER_EXTERNAL_ID'
        ],
        'columns': [
            'AD_SERVER_IMPRESSIONS',
            'AD_SERVER_CLICKS',
            'AD_SERVER_CTR',
            'AD_EXCHANGE_IMPRESSIONS',
            'AD_SERVER_CPM_AND_CPC_REVENUE',
            'AD_SERVER_WITHOUT_CPD_AVERAGE_ECPM'
        ],
        'dateRangeType': 'CUSTOM_DATE',
        'startDate': start_date,
        'endDate': end_date
    }
}
I am getting the following error:
Failed to generate report. Error was: [ReportError.COLUMNS_NOT_SUPPORTED_FOR_REQUESTED_DIMENSIONS # columns; trigger:'AD_EXCHANGE_IMPRESSIONS'
I am not sure which dimension needs to be included for AD_EXCHANGE_IMPRESSIONS. When I generate a report with the same fields in the UI, it works fine without errors.

Why does Tableau log [Vertica][ODBC] (11430) Catalog name not supported?

I have been working on Tableau + Vertica solutions.
I have installed the relevant Vertica ODBC driver from the Vertica-provided packages.
While going through the tdeserver.txt log file, I stumbled upon the following error log:
{
  "ts": "2015-12-16T21:42:41.568",
  "pid": 51081,
  "tid": "23d247",
  "sev": "warn",
  "req": "-",
  "sess": "-",
  "site": "{759FD0DA-A1AB-4092-AAD3-36DA0923D151}",
  "user": "-",
  "k": "database-error",
  "v": {
    "retcode-desc": "SQL_ERROR",
    "retcode": -1,
    "protocol": "7fc6730d6000",
    "line": 2418,
    "file": "/Volumes/build/builds/tableau-9-2/tableau-9-2.15.1201.0018/modules/connectors/tabmixins/main/db/ODBCProtocolImpl.cpp",
    "error-records": [{
      "error-record": 1,
      "error-desc": "[Vertica][ODBC] (11430) Catalog name not supported.",
      "sql-state": "HYC00",
      "sql-state-desc": "SQLSTATE_API_OPT_FEATURE_NOT_IMPL_ODBC3x",
      "native-error": 11430
    }]
  }
}
This piece of log is repeated several times.
The rest of the setup runs smoothly, as expected.
Below are the attributes from ~/Library/ODBC/odbc.ini
[ODBC]
Trace = 1
TraceAutoStop = 0
TraceFile = ~/log
TraceLibrary =
ThreePartNaming=1
What am I missing here ?
