Autodesk Forge Webhook API, Error 400, VALIDATION_ERROR - python-requests

Using the Autodesk Forge API, I am trying to create a webhook on a folder using the following information; unfortunately, I am receiving the following response:
{
  "id": "xxxx-xxxx-xxx-xxxxx",
  "status": 400,
  "code": "VALIDATION_ERROR",
  "detail": ["Payload is not valid for serialization"]
}
URL:
https://developer.api.autodesk.com/webhooks/v1/systems/data/events/dm.folder.added/hooks (for the specific folder-added event) or https://developer.api.autodesk.com/webhooks/v1/systems/data/hooks (for all events). Both return the same error.
Header:
{
  "Content-Type": "application/json",
  "Authorization": "<MY_TOKEN>",
  "x-ads-region": "US"
}
Data:
{
  "callbackUrl": "<MY_DOMAIN>:<MY_PORT>/callback",
  "scope": {
    "folder": "urn:adsk.wipprod:fs.folder:co.xxxxxxxxxxxx-xxxxx"
  }
}
Troubleshooting:
I've tried different folders, root and non-root; I can access all the folders I tried using the API.
I am sure that my account is in the US region.
I've tried adding hubId and/or projectId, but I received the same error.
<MY_DOMAIN>:<MY_PORT>/callback is configured and working fine.
Headers and data serialize and deserialize normally using json loads & dumps.
Any suggestion/help?

Answering myself :)
I've discovered that my issue is not related to the Forge API; it's a general one related to Python Requests. When a nested dictionary is passed as the data argument, Requests form-encodes it, so only a flat (one-level) dictionary survives; the nested values are lost and the server rejects the payload. The solution is to stringify the dict (json.dumps) and use that string as the request payload (or pass the dict via the json= parameter so Requests serializes it for you).
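For reference, here is a minimal sketch of the fix with python-requests (the token, domain, port, and folder URN are the same placeholders used in the question):

import json
import requests

url = "https://developer.api.autodesk.com/webhooks/v1/systems/data/events/dm.folder.added/hooks"
headers = {
    "Content-Type": "application/json",
    "Authorization": "<MY_TOKEN>",
    "x-ads-region": "US",
}
payload = {
    "callbackUrl": "<MY_DOMAIN>:<MY_PORT>/callback",
    "scope": {"folder": "urn:adsk.wipprod:fs.folder:co.xxxxxxxxxxxx-xxxxx"},
}

# Passing the dict via data= would form-encode it and lose the nested "scope" object,
# so serialize it to a JSON string first...
response = requests.post(url, headers=headers, data=json.dumps(payload))

# ...or let Requests serialize it for you with the json= parameter.
# response = requests.post(url, headers=headers, json=payload)
print(response.status_code, response.text)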

Related

CORS issue when calling API via Office Scripts Fetch

I am trying to make an API call via Office Scripts (fetch) to a publicly available Azure Function-based API I created. By policy we need to have CORS on for our Azure Functions. I've tried every domain I could think of, but I can't get the call to work unless I allow all origins. I've tried:
https://ourcompanydoamin.sharepoint.com
https://usc-excel.officeapps.live.com
https://browser.pipe.aria.microsoft.com
https://browser.events.data.microsoft.com
The first is the Excel Online domain I'm trying to execute from, and the rest came up during the script run in Chrome's Network tab. The error message in Office Scripts doesn't tell me the domain the request is coming from the way Chrome's console does. What host do I need to allow for Office Scripts to be able to make calls to my API?
The expected CORS setting for this is https://*.officescripts.microsoftusercontent.com.
However, Azure Functions CORS doesn't support wildcard subdomains at the moment. If you try to set an origin with wildcard subdomains, you will get an error.
One possible workaround is to explicitly maintain an "allow-list" in your Azure Functions code. Here is a proof-of-concept implementation (assuming you use Node.js for your Azure Functions):
module.exports = async function (context, req) {
    // List your allowed hosts here. Escape special characters for the regular expressions,
    // and anchor the patterns so look-alike origins don't slip through.
    const allowedHosts = [
        /^https:\/\/www\.myserver\.com$/,
        /^https:\/\/[^.]+\.officescripts\.microsoftusercontent\.com$/
    ];
    if (!allowedHosts.some(host => host.test(req.headers.origin))) {
        context.res = {
            status: 403, /* Forbidden */
            body: "Not allowed!"
        };
        return;
    }
    // Handle the normal request and generate the expected response.
    context.res = {
        status: 200,
        body: "Allowed!"
    };
}
Please note:
Regular expressions are needed to match the dynamic subdomains.
In order to do the origin check within the code, you'll need to set * as the Allowed Origins on your Functions CORS settings page.
Or if you want to build your service with ASP.NET Core, you can do something like this: https://stackoverflow.com/a/49943569/6656547.

How to call a Neo4j database API in RStudio

I created a Neo4j database instance, and I am trying to call it in RStudio using the neo4r and neo4jshell packages. After running the API call, I still get a 404 even though I correctly specified the URL, username, and password. Please find my code below:
library(neo4r)
library(neo4jshell)
myTwitter <- neo4j_api$new(
  url = "http://54.152.83.7:7474",
  user = "neo4j",
  password = "mypassword"
)
myTwitter$ping()
When I run the last line of code, I get a 404 instead of a 200, which obviously means my API call was not successful. I would appreciate your helpful suggestions. Thank you.
The HTTP endpoints changed in version 4 of Neo4j.
Neo4j v3 had the endpoint http://localhost:7474/db/data
Neo4j v4 uses http://localhost:7474/db/{databaseName}/tx instead.
It seems the Neo4j library for R needs to be updated...
I'm not familiar with R, but you could try any available HTTP client for R that supports Basic authentication and send POST requests to the Neo4j API with a JSON payload. I also see you are using the http scheme, which means your credentials travel over the network as plain text; that is not good.
Payload for such requests should be in form of:
{
  "statements": [
    {
      "statement": "MATCH (n) RETURN n"
    }
  ]
}
(adjust the Cypher query to your needs)
The response will be a JSON object with a data section containing the actual results.
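For illustration only, here is roughly what that request looks like from Python's requests library (the question is about R, so treat this purely as a sketch of the HTTP call; the database name "neo4j" is an assumption, and the host and credentials are the ones from the question):

import requests

# Neo4j v4 transactional HTTP endpoint; "neo4j" is the default database name (assumed here).
url = "http://54.152.83.7:7474/db/neo4j/tx/commit"
payload = {"statements": [{"statement": "MATCH (n) RETURN n LIMIT 5"}]}  # sample query

# Basic authentication; note that over plain http the credentials travel unencrypted.
response = requests.post(url, json=payload, auth=("neo4j", "mypassword"))
print(response.json()["results"])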

How to read the apple-app-site-association file on Vapor 4?

For password auto-fill to work on Apple platforms, I am testing with the Apple App Site Association (AASA) validator on this website.
I have added the required JSON in the Public/.well-known/apple-app-site-association file for password auto-fill to work on my iOS application.
The result from this test comes back with this error:
Your file's 'content-type' header was not found or was not recognized.
Has anyone ever encountered this issue? It seems that the AASA file is not being downloaded to my device.
Note that on iOS 14, AASA files will be delivered via Apple's CDN, which is different from how AASA files are currently downloaded.
Is there something else to do about it on my Vapor 4 project to make things work?
I met the same issue; following imike's answer and doing some research, here is the solution.
Create a custom middleware:
struct UniversalLinksMiddleware: Middleware {
    func respond(to request: Request, chainingTo next: Responder) -> EventLoopFuture<Response> {
        // Only touch the AASA route; every other request passes through unchanged.
        guard request.url.string == "/.well-known/apple-app-site-association" else {
            return next.respond(to: request)
        }
        // Let the rest of the chain (FileMiddleware) serve the file,
        // then set the content type the AASA validator expects.
        return next.respond(to: request).map { response in
            response.headers.add(name: "content-type", value: "application/json")
            return response
        }
    }
}
Add this middleware in your configure.swift file. Be aware of the order in which you add middleware: you must add it before FileMiddleware, because responses leaving your application go through the middleware in reverse order.
app.middleware.use(UniversalLinksMiddleware())
app.middleware.use(FileMiddleware(publicDirectory: app.directory.publicDirectory))
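To verify the fix, here is a quick check of the served header, sketched with Python's requests library (the domain is a placeholder for wherever your Vapor app is hosted):

import requests

# Placeholder domain: replace with the host that serves your Vapor app.
response = requests.get("https://example.com/.well-known/apple-app-site-association")
print(response.status_code, response.headers.get("content-type"))  # expect 200 and application/json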

Enable CORs for Swashbuckle swagger.json in .NET Lambda API

I have a .NET Lambda API that was previously using Swashbuckle to generate a swagger.json file that was given to an external site to use. I am now trying to set things up so the swagger.json file is generated by the API and available through a URL for the external site to use, i.e.: mylambdaapi.com/swagger/v2/swagger.json. I was able to get this working by adding a dummy event to my template when pushing to AWS, as follows.
"SwaggerJson": {
"Type": "Api",
"Properties": {
"Path": "/swagger/v2/swagger.json",
"Method": "GET"
}
}
This works for just accessing the file normally; however, the external site runs into CORS "No 'Access-Control-Allow-Origin' header" issues when trying to load the JSON. Is there any way to force the generation to include "Access-Control-Allow-Origin" in this case, or is this not feasible this way? I'm working off what another developer built previously, so I'm trying not to rewrite everything; however, I'm open to another method as long as it produces some swagger JSON that the external site can consume.
EDIT: I should note that I am using API Gateway; however, the swagger.json is only used for documentation purposes for the external site.
I attempted to use the UseCors() functionality, but that did not work. I was able to fix the issue by adding an anonymous function to handle the response before UseSwagger.
The following snippet is from the Configure method in my startup.
app.Use((context, next) =>
{
    context.Response.Headers["Access-Control-Allow-Origin"] = "*";
    return next.Invoke();
});
app.UseSwagger();
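To confirm the header is actually coming back through API Gateway, here is a quick check sketched with Python's requests library (the origin value is a placeholder; the URL is the one from the question):

import requests

response = requests.get(
    "https://mylambdaapi.com/swagger/v2/swagger.json",
    headers={"Origin": "https://external-site.example"},  # placeholder origin
)
print(response.headers.get("Access-Control-Allow-Origin"))  # expect "*"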

Google Cloud Vision API quickstart: error opening file

I am following this Google Cloud Vision quickstart:
https://cloud.google.com/vision/docs/quickstart
This is using the API Explorer, and I get
Error Opening File
I have created a bucket named vision2018, and checked Share Publicly for the file.
My portion of the request related to the file is:
"image":
{
"source":
{
"imageUri":"gs://vision2018/demo-image.jpg"
}
}
The response I get is:
{
  "responses": [
    {
      "error": {
        "code": 5,
        "message": "Error opening file: gs://vision2018/demo-image.jpg\"."
      }
    }
  ]
}
What do I need to specify in order to access files in my GCP storage?
Alternatively, I have read other Stack Overflow posts that talk about GOOGLE_APPLICATION_CREDENTIALS, a simple API key, and "Create Service account key and download the key in JSON format", ... but these seem to be giving commands in the shell, which this quickstart doesn't even open.
Is there initial setup assumed prior to the quickstart?
I am not ready to call the API from code.
You might want to double-check your request. I went to the quickstart, replaced the placeholder imageUri with gs://vision2018/demo-image.jpg, and it worked just fine. The error message you posted is what would be displayed if you had given gs://vision2018/demo-image.jpg\" instead.
Regarding the second part of your question: these are authentication methods. In this particular case, under Authentication you will find a drop-down that lets you choose between API key and Google OAuth 2.0. If you choose the former, you don't need to do anything, as a demo key will be used just for the purposes of the quickstart. If you choose OAuth 2.0, a popup will appear prompting you to authenticate with a Google account. All in all, what you need to do is follow the quickstart's instructions step by step.
I was receiving a similar JSON response from the Google Vision API:
"error": {
"code": 7,
"message": "Error opening file: gs://bucket/file.jpg."
}
The fix was to set the GCS file's permission to public-read:
gsutil acl set public-read gs://bucket/file.jpg
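If you would rather do this from Python than gsutil, here is a rough equivalent using the google-cloud-storage client library (assuming it is installed and application credentials are configured; the bucket and object names are the ones from the gsutil command above):

from google.cloud import storage

client = storage.Client()
blob = client.bucket("bucket").blob("file.jpg")
blob.make_public()  # grants READER access to allUsers for this object
print(blob.public_url)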
Finally, I investigated what happened. The problem is that your API token only grants the right to process the image (i.e. to use the OCR engine); it does not also grant access to the object in GCS.
Hence the "Error opening file" message.
The problem is similar to this post: Authorize Google Cloud Vision API to Google Storage image. Maybe the error message is no clearer than it was many years ago.
The solution is also mentioned in the answer section, but if you want something more explicit (exposing the security side-effect), here it is: make the GCS object publicly readable.
The reason I want to keep using an API key is that it is a better fit for a mobile application; we cannot ship OAuth 2.0 credentials to every phone. However, you still need to find a way to secure the publicly readable bucket.
