MS Graph API v1.0 cannot filter by onPremisesSamAccountName using Python requests

I'm attempting to use Python + requests to talk to the MS Graph API (v1.0) in order to filter user objects by the onPremisesSamAccountName property, but I'm receiving the error below when sending this simple query:
endpoint = "https://graph.microsoft.com/v1.0/users"
query_parameters = {
    '$filter': 'onPremisesSamAccountName eq \'somevalue\'',
    '$select': 'id,displayName,mail,onPremisesSamAccountName'
}
user_graph_data = requests.get(
    endpoint,
    headers={'Authorization': 'Bearer ' + result['access_token']},
    params=query_parameters
).json()
The response:
{
    "error": {
        "code": "Request_UnsupportedQuery",
        "message": "Unsupported or invalid query filter clause specified for property 'onPremisesSamAccountName' of resource 'User'.",
        "innerError": {
            "date": "...",
            "request-id": "...",
            "client-request-id": "..."
        }
    }
}
I am able to filter on this field using Microsoft's Graph Explorer (https://developer.microsoft.com/en-us/graph/graph-explorer), and the corresponding JavaScript call in the developer console shows a successful call and response based on the onPremisesSamAccountName filter.
The MS Graph docs for v1.0 state that this is a supported field for filtering as well:
Returned only on $select. Supports $filter (eq, ne, NOT, ge, le, in,
startsWith).
I'm also able to successfully filter on other fields such as mail (i.e. changing the $filter string from 'onPremisesSamAccountName eq \'somevalue\'' to 'mail eq \'somevalue\'' works just fine), so I don't believe this is a syntax error.
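For context, onPremisesSamAccountName appears to be one of the directory properties that can only be filtered through Graph's advanced query capabilities, which require adding $count=true to the query and a ConsistencyLevel: eventual header (Graph Explorer tends to set that header for you, which would explain why the same filter works there). A minimal sketch of the adjusted call, assuming the same token and variables as above:

endpoint = "https://graph.microsoft.com/v1.0/users"
query_parameters = {
    '$filter': 'onPremisesSamAccountName eq \'somevalue\'',
    '$select': 'id,displayName,mail,onPremisesSamAccountName',
    '$count': 'true'  # opts the request in to advanced queries
}
user_graph_data = requests.get(
    endpoint,
    headers={
        'Authorization': 'Bearer ' + result['access_token'],
        'ConsistencyLevel': 'eventual'  # also required for advanced queries
    },
    params=query_parameters
).json()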

Related

Issues with startAt in orderBy timestamp - Firebase REST API

I'm using the Firebase REST API to retrieve data with the GET method. This is the URL I'm executing:
const url = `https://firestore.googleapis.com/v1/projects/${projectId}/databases/${dataBase}/documents/${collectionName}/${documentId}?&key=${apiKey}&pageSize=${pageSize}&pageToken=${nextPageToken}&orderBy=timestamp&startAt=${startTime}`;
But it returns this error:
{
    "error": {
        "code": 400,
        "message": "Invalid JSON payload received. Unknown name \"startAt\": Cannot bind query parameter. Field 'startAt' could not be found in request message.",
        "status": "INVALID_ARGUMENT",
        "details": [
            {
                "@type": "type.googleapis.com/google.rpc.BadRequest",
                "fieldViolations": [
                    {
                        "description": "Invalid JSON payload received. Unknown name \"startAt\": Cannot bind query parameter. Field 'startAt' could not be found in request message."
                    }
                ]
            }
        ]
    }
}
If I omit the startAt parameter, it works fine.
I tried startTime in all of the following formats, and all return the same error:
Firebase return format: 2022-06-16T15:46:46.061Z
Unix timestamp: 1655394406
ISO 8601 date: 2022-06-16T15:46:46+00:00
What am I doing wrong?
(For reference, here is the official documentation where startAt is explained.)
You're calling the Firestore REST API, but are referencing the documentation for the REST API of the Realtime Database. While both products are part of Firebase, they are completely separate, and the API of one cannot be applied to the other.
For the documentation of the Firestore REST API, see https://firebase.google.com/docs/firestore/reference/rest
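In the Firestore REST API, ordering and cursors are expressed as a structured query POSTed to the documents:runQuery endpoint rather than as URL query parameters. A rough sketch of what that could look like (project id, collection name, and token are placeholders; adjust auth to whatever your project requires, and note that the startAt cursor values must line up with the orderBy fields):

import requests

project_id = "YOUR_PROJECT_ID"      # placeholder
access_token = "YOUR_OAUTH_TOKEN"   # placeholder
url = ("https://firestore.googleapis.com/v1/projects/" + project_id +
       "/databases/(default)/documents:runQuery")

body = {
    "structuredQuery": {
        "from": [{"collectionId": "YOUR_COLLECTION"}],
        "orderBy": [{"field": {"fieldPath": "timestamp"}, "direction": "ASCENDING"}],
        # startAt is a cursor object; its values correspond, in order, to the orderBy fields
        "startAt": {"values": [{"timestampValue": "2022-06-16T15:46:46.061Z"}], "before": False},
        "limit": 50
    }
}

resp = requests.post(url, json=body, headers={"Authorization": "Bearer " + access_token})
print(resp.json())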

Insert a Map in DynamoDB

I'm trying to insert map data into a DynamoDB table using API Gateway. Here's my payload:
{
    "TableName": "OrderDB",
    "Item": {
        "_id": {"S": "04FA887FP2S5R"},
        "_rev": {"S": "9-12e098e2490e1b7c9782597226689403"},
        "_merchantId": {"S": "AXN3EKXT0SJ61"},
        "doc": {"M": {"something": "storedHere"}}
    }
}
And my body mapping template in API Gateway:
{
    "TableName": "$input.path('$.TableName')",
    "Item": $input.json('$.Item')
}
Everything works as expected if I remove the doc item from the payload. When trying to post with the map, I get the following error:
{
    "__type": "com.amazon.coral.service#SerializationException",
    "Message": "Unexpected value type in payload"
}
All of the examples that I see suggest using the DynamoDB document mapping object, but I don't think this is possible for me because I'm using API Gateway to connect directly to DynamoDB. Is it possible to insert a map this way?
Your map entry needs the same typed format as the rest of the items, so instead of "doc": {"M": {"something": "storedHere"}} it should be "doc": {"M": {"something": {"S": "storedHere"}}}.
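Applied to the payload from the question, the full request body would look like this; every value inside an M map needs its own type descriptor (S, N, BOOL, nested M, etc.), just like the top-level attributes:

{
    "TableName": "OrderDB",
    "Item": {
        "_id": {"S": "04FA887FP2S5R"},
        "_rev": {"S": "9-12e098e2490e1b7c9782597226689403"},
        "_merchantId": {"S": "AXN3EKXT0SJ61"},
        "doc": {"M": {"something": {"S": "storedHere"}}}
    }
}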

Adobe Analytics 2.0 API endpoint to get report suite events, props, and evars

I'm having a hard time finding a way in the 2.0 API to get a list of eVars, props, and events for a given report suite. The 1.4 version has the reportSuite.getEvents() endpoint and similar ones for eVars and props.
Please let me know if there is a way to get the same data using the 2.0 API endpoints.
The API v2.0 github docs aren't terribly useful, but the Swagger UI is a bit more helpful, showing endpoints and parameters you can push to them, and you can interact with it (logging in with your oauth creds) and see requests/responses.
The two API endpoints in particular you want are metrics and dimensions. There are a number of options you can specify, but to just get a dump of them all, the full endpoint URL for those would be:
https://analytics.adobe.io/api/[client id]/[endpoint]?rsid=[report suite id]
Where:
[client id] - The client id for your company. This should be the same value as the legacy username:companyid (the companyid part) from the v1.3/v1.4 API shared secret credentials, with the exception that it is suffixed with "0", e.g. if your old username:companyid was "crayonviolent:foocompany", the [client id] would be "foocompany0", because... reasons? I'm not sure what that's about, but it is what it is.
[endpoint] - Value should be "metrics" to get the events, and "dimensions" to get the props and eVars. So you will need to make two API endpoint requests.
[rsid] - The report suite id you want to get the list of events/props/eVars from.
Example:
https://analytics.adobe.io/api/foocompany0/metrics?rsid=fooglobal
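For reference, here's a rough sketch of calling both endpoints from Python; the Bearer token and x-api-key headers are my assumption of the usual Adobe I/O setup, so adjust them to however you authenticate:

import requests

company_id = "foocompany0"            # the [client id] from above
rsid = "fooglobal"
access_token = "YOUR_ACCESS_TOKEN"    # placeholder OAuth/JWT token (assumed auth scheme)
api_key = "YOUR_API_KEY"              # placeholder Adobe I/O client id (assumed header)

headers = {
    "Authorization": "Bearer " + access_token,
    "x-api-key": api_key
}
base = "https://analytics.adobe.io/api/" + company_id

# two requests: dimensions (props/eVars) and metrics (events)
dimensionsList = requests.get(base + "/dimensions", params={"rsid": rsid}, headers=headers).json()
metricsList = requests.get(base + "/metrics", params={"rsid": rsid}, headers=headers).json()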
One thing to note about the responses: they aren't like the v1.3 or v1.4 methods where you query for a list of only those specific things. It will return a JSON array of objects for every single event and dimension respectively, even the native ones, calculated metrics, classifications for a given dimension, etc. AFAIK there is no baked-in way to filter the API query (in any documentation I can find, anyway), so you will have to loop through the array and select the relevant ones yourself.
I don't know what language you are using, but here is a javascript example for what I basically do:
var i, l, v, data = { prop: [], evar: [], events: [] };

// dimensionsList - the JSON object returned from dimensions API call
// for each dimension in the list..
for (i = 0, l = dimensionsList.length; i < l; i++) {
    // The .id property shows the dimension id to eval
    if (dimensionsList[i].id) {
        // the ones we care about are e.g. "variables/prop1" or "variables/evar1"
        // note that if you have classifications on a prop or eVar, there are entries
        // that look like e.g. "variables/prop1.1" so regex is written to ignore those
        v = ('' + dimensionsList[i].id).match(/^variables\/(prop|evar)[0-9]+$/);
        // if id matches what we're looking for, push it to our data.prop or data.evar array
        v && v[1] && data[v[1]].push(dimensionsList[i]);
    }
}

// metricsList - the JSON object returned from metrics API call
// basically same song and dance as above, but for events.
for (var i = 0, l = metricsList.length; i < l; i++) {
    if (metricsList[i].id) {
        // events ids look like e.g. "metrics/event1"
        var v = ('' + metricsList[i].id).match(/^metrics\/event[0-9]+$/);
        v && data.events.push(metricsList[i]);
    }
}
And then the resulting data object will have data.prop, data.evar, and data.events, each an array of the respective props/eVars/events.
Example object entry for a data.events[n]:
{
    "id": "metrics/event1",
    "title": "(e1) Some event",
    "name": "(e1) Some event",
    "type": "int",
    "extraTitleInfo": "event1",
    "category": "Conversion",
    "support": ["oberon", "dataWarehouse"],
    "allocation": true,
    "precision": 0,
    "calculated": false,
    "segmentable": true,
    "supportsDataGovernance": true,
    "polarity": "positive"
}
Example object entry for a data.evar[n]:
{
    "id": "variables/evar1",
    "title": "(v1) Some eVar",
    "name": "(v1) Some eVar",
    "type": "string",
    "category": "Conversion",
    "support": ["oberon", "dataWarehouse"],
    "pathable": false,
    "extraTitleInfo": "evar1",
    "segmentable": true,
    "reportable": ["oberon"],
    "supportsDataGovernance": true
}
Example object entry for a data.prop[n]:
{
    "id": "variables/prop1",
    "title": "(c1) Some prop",
    "name": "(c1) Some prop",
    "type": "string",
    "category": "Content",
    "support": ["oberon", "dataWarehouse"],
    "pathable": true,
    "extraTitleInfo": "prop1",
    "segmentable": true,
    "reportable": ["oberon"],
    "supportsDataGovernance": true
}

Nexus 3 | How to create (external) users using Nexus 3 APIs?

I'm trying to create an external user on Nexus 3 using the Nexus 3 APIs. Following are the details:
Posting Groovy Script using: http://localhost:8081/nexus3/service/rest/v1/script
{
    "name": "d8b3baeb-628a-43cc-9a9c-9a156f399e2",
    "type": "groovy",
    "content": "security.addUser('q018246a', '', '', '', true, 'd8b3baeb-628a-43cc-9a9c-9a156f399ae2', ['abc_test_role Developer Role']);"
}
Running Script using: http://localhost:8081/nexus3/service/rest/v1/script/d8b3baeb-628a-43cc-9a9c-9a156f399e2/run
Response:
{
    "name": "d8b3baeb-628a-43cc-9a9c-9a156f399e2",
    "result": "User{userId='q018246a', firstName='', lastName='', source='default'}"
}
Hitting it through Postman, everything works fine and the users get created, but through the application server it gives a Bad Request.
The awkward behavior is that it lets me create a user via the Postman POST script with a blank first_name, last_name, email, and password, even though all of these parameters are required in the UI.
Another thing: it shows the source as default, but how do I ensure the source is LDAP?
I assume you're trying to map an LDAP user? If so, this will work:
import org.sonatype.nexus.security.role.RoleIdentifier;
import org.sonatype.nexus.security.user.User;

String userId = 'someuser';
String newRoleId = 'nx-admin'

User user = security.securitySystem.getUser(userId, 'LDAP')
if (user != null) {
    RoleIdentifier newRole = new RoleIdentifier('default', newRoleId);
    user.addRole(newRole)
    security.securitySystem.setUsersRoles(user.getUserId(), 'LDAP', user.getRoles());
} else {
    log.warn("No user with ID of $userId found.")
}
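If it helps, here's a rough sketch of pushing and running that script through the same REST script API the question is using. Note that the run endpoint expects a text/plain Content-Type per the Nexus script API docs, which is worth double-checking in the application server's request as well; credentials and the script name below are placeholders:

import requests

base = "http://localhost:8081/nexus3/service/rest/v1/script"   # same base path as in the question
auth = ("admin", "admin123")                                    # placeholder credentials

groovy_source = open("map-ldap-user.groovy").read()   # the Groovy from the answer above, saved to a file

# 1. upload the script as JSON
requests.post(base, json={"name": "map-ldap-user", "type": "groovy", "content": groovy_source}, auth=auth)

# 2. run it - the run endpoint expects Content-Type: text/plain, even with an empty body
result = requests.post(base + "/map-ldap-user/run", data="", auth=auth,
                       headers={"Content-Type": "text/plain"})
print(result.json())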

How do I delete user analytics data from Firebase using userDeletionRequests:upsert?

Problem Description
My Android app collects data via Google Analytics for Firebase. For privacy reasons, users must be able to wipe their data off the Firebase servers, should they choose to do so.
The app requests a deletion by forwarding its Firebase APP_INSTANCE_ID to my own server. This server has been prepared in advance with credentials, from my personal Google account (via oauth2), for managing the Firebase project. The server authenticates with www.googleapis.com, and, using the supplied APP_INSTANCE_ID, invokes the upsert.
As noted by the documentation, the generic Google Analytics API is appropriate for this task.
After some initial trouble (b/c I didn't have the correct auth scope, and the Analytics API wasn't properly enabled), googleapis.com now returns HTTP 200 for each upsert request. (As an aside, even if you supply a bogus APP_INSTANCE_ID, it returns 200.)
Here is a sample response from the upsert, which shows nothing amiss:
{ kind: 'analytics#userDeletionRequest',
  id:
   { type: 'APP_INSTANCE_ID',
     userId: (REDACTED 32-char hexadecimal string) },
  firebaseProjectId: (REDACTED),
  deletionRequestTime: '2018-08-28T12:46:30.874Z' }
I know the firebaseProjectId is correct, because if I alter it, I get an error. I have verified that the APP_INSTANCE_ID is correct, and stable up until the moment it is reset with resetAnalyticsData().
Test Procedure
To test the deletions, I populated Firebase with several custom events, using the procedure below (Nexus 5X emulator, no Google Play, no Google accounts configured, but that shouldn't make any difference):
Install the app
Fire off some custom events (FirebaseAnalytics.logEvent)
Observe those events appear on the Firebase console
(About a minute later:) Make the upsert call, observe HTTP 200, and note the "deletionRequestTime"
Immediately call FirebaseAnalytics.resetAnalyticsData (to clear any event data cached on the device)
Uninstall the app
Rinse & repeat 7 or 8 times
However, even 24 hours later, 100% of the Firebase events are still present in the events table. No discernible state change has taken place on the Firebase server as a result of the upserts.
Question
So, what am I doing wrong? How do I successfully delete user data from Google Analytics for Firebase?
EDIT
Here's the code I'm using to make a request (from node.js):
const request = require( 'request' );
...
_deletePersonalData( data )
{
    return new Promise( (resolve, reject) => {
        request.post({
            url: 'https://www.googleapis.com/analytics/v3/userDeletion/userDeletionRequests:upsert',
            body: {
                kind: 'analytics#userDeletionRequest',
                id: {
                    type: 'APP_INSTANCE_ID',
                    userId: data.firebaseAppInstanceId
                },
                firebaseProjectId: (REDACTED)
            },
            headers: {
                Authorization: 'Bearer ' + iap.getCurAccessToken()
            },
            json: true
        }, (err, res, body) => {
            console.log( 'user-deletion POST complete' );
            console.log( 'Error ' + err );
            console.log( 'Body ', body );
            if( err )
            {
                reject( err );
                return;
            }
            if( body.error )
            {
                reject( new Error( 'The Google service returned an error: ' + body.error.message + ' (' + body.error.code + ')' ) );
                return;
            }
            resolve({ deletionRequestTime: body.deletionRequestTime });
        });
    });
}
Here's a sample request body:
{
    kind: 'analytics#userDeletionRequest',
    id: {
        type: 'APP_INSTANCE_ID',
        userId: (REDACTED 32-char hexadecimal string)
    },
    firebaseProjectId: (REDACTED)
}
And here's the console output for that same request (same userId and everything):
user-deletion POST complete
Error: null
Body: { kind: 'analytics#userDeletionRequest',
  id:
   { type: 'APP_INSTANCE_ID',
     userId: (REDACTED 32-char hexadecimal string) },
  firebaseProjectId: (REDACTED),
  deletionRequestTime: '2018-08-29T17:32:06.949Z' }
Firebase support just got back to me, and I quote:
Upsert method deletes any individual user data we have logged, but aggregate metrics are not recomputed. This means that you might not see any changes in the events tab in your Analytics console.
So, basically my mistake was expecting the events to disappear from the console.
This, of course, raises the question of how one determines that the API is actually working... but maybe the HTTP 200 is enough.
