Map datatables.js request to pydantic model - fastapi

I'm doing some server-side table generation to support pagination, searching, and so on. On the client side, I'm using the datatables.js library. This library sends out GET requests that look like this:
GET /api/data?draw=22&columns%5B0%5D%5Bdata%5D=name&columns%5B0%5D%5Bname%5D=&columns%5B0%5D%5Bsearchable%5D=true&columns%5B0%5D%5Borderable%5D=true&columns%5B0%5D%5Bsearch%5D%5Bvalue%5D=&columns%5B0%5D%5Bsearch%5D%5Bregex%5D=false&columns%5B1%5D%5Bdata%5D=age&columns%5B1%5D%5Bname%5D=&columns%5B1%5D%5Bsearchable%5D=true&columns%5B1%5D%5Borderable%5D=true&columns%5B1%5D%5Bsearch%5D%5Bvalue%5D=&columns%5B1%5D%5Bsearch%5D%5Bregex%5D=false&columns%5B2%5D%5Bdata%5D=address&columns%5B2%5D%5Bname%5D=&columns%5B2%5D%5Bsearchable%5D=true&columns%5B2%5D%5Borderable%5D=false&columns%5B2%5D%5Bsearch%5D%5Bvalue%5D=&columns%5B2%5D%5Bsearch%5D%5Bregex%5D=false&columns%5B3%5D%5Bdata%5D=phone&columns%5B3%5D%5Bname%5D=&columns%5B3%5D%5Bsearchable%5D=true&columns%5B3%5D%5Borderable%5D=false&columns%5B3%5D%5Bsearch%5D%5Bvalue%5D=&columns%5B3%5D%5Bsearch%5D%5Bregex%5D=false&columns%5B4%5D%5Bdata%5D=email&columns%5B4%5D%5Bname%5D=&columns%5B4%5D%5Bsearchable%5D=true&columns%5B4%5D%5Borderable%5D=true&columns%5B4%5D%5Bsearch%5D%5Bvalue%5D=&columns%5B4%5D%5Bsearch%5D%5Bregex%5D=false&order%5B0%5D%5Bcolumn%5D=0&order%5B0%5D%5Bdir%5D=asc&start=30&length=10&search%5Bvalue%5D=ass&search%5Bregex%5D=false&_=1635453433852 HTTP/1.1
What would be the preferred way to map these query params into a pydantic object with proper validation? Some of these values are potentially used in a SQL query.
Here is the documentation for all the params:
https://datatables.net/manual/server-side
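One possible approach (just a sketch, not the official way): the bracketed DataTables keys (columns[0][data], search[value], ...) don't map onto flat FastAPI query parameters, so you can parse request.query_params into a nested structure yourself and validate it with a pydantic model. The model fields below follow the documented DataTables server-side parameters; the parsing helper and the route are illustrative assumptions:

from typing import List
from fastapi import FastAPI, Request
from pydantic import BaseModel

app = FastAPI()

class Search(BaseModel):
    value: str = ""
    regex: bool = False

class Column(BaseModel):
    data: str
    name: str = ""
    searchable: bool = True
    orderable: bool = False
    search: Search = Search()

class Order(BaseModel):
    column: int
    dir: str = "asc"  # could be tightened to Literal["asc", "desc"]

class DataTablesRequest(BaseModel):
    draw: int
    start: int = 0
    length: int = 10
    search: Search = Search()
    columns: List[Column] = []
    order: List[Order] = []

def parse_datatables_params(request: Request) -> DataTablesRequest:
    # Collect flat keys like "columns[0][search][value]" into nested dicts.
    raw = {}
    for key, value in request.query_params.items():
        parts = key.replace("]", "").split("[")  # -> ["columns", "0", "search", "value"]
        node = raw
        for part in parts[:-1]:
            node = node.setdefault(part, {})
        node[parts[-1]] = value
    # Turn the index-keyed dicts ({"0": {...}, "1": {...}}) into ordered lists.
    for list_key in ("columns", "order"):
        if list_key in raw:
            raw[list_key] = [raw[list_key][i] for i in sorted(raw[list_key], key=int)]
    return DataTablesRequest(**raw)

@app.get("/api/data")
def get_data(request: Request):
    params = parse_datatables_params(request)
    # params.start, params.length, params.order[0].column, ... are validated here.
    # Column names going into SQL should still be checked against an allow-list.
    return {"draw": params.draw, "recordsTotal": 0, "recordsFiltered": 0, "data": []}

Since ordering and search columns end up in SQL, the safest pattern is to treat params.columns[i].data only as a lookup key into a whitelist of known column names, never as a string to interpolate directly into the query.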

Related

JSON schema for FastAPI request body

Is there a way to specify a JSON schema (i.e. not pydantic models) for the request body and use the automated documentation features (i.e. to have "Try it out" with support for that schema)?
I solved the problem by building a RapiDoc-based web page that consumes the JSON (or YAML) OpenAPI specification and generates the HTML contents on the client side.

Getting entity body from HttpURLConnection

I'm currently writing a little library to standardize the use of HttpURLConnection in my Android projects.
In one of my projects I'm communicating with a server using HTTP Digest to authenticate the user. The default Java HttpURLConnection doesn't support Digest, but I've managed to write a simplified digest auth (qop=auth) which is working pretty well.
For future projects I want to enable my library to use auth-int. Therefore I need to modify the construction of the A2 hash and include the HTTP entity body (see RFC 7616 Section 3.4.3).
To do so I thought about extracting the full HTTP request from the URLConnection and then throwing out the unneeded stuff. Unfortunately I was unable to find a way to do this.
Extracting the single header fields is possible via .getContentEncoding(), .getContentLength(), .getContentType(), etc. But with this I can't ensure that the request I'm reconstructing from the getter methods has the same order as the original, and this might lead to a 401.
To sum it up:
Is there a way to extract the full request (or better, only the entity body) from an HttpURLConnection?
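For reference, RFC 7616 section 3.4.3 defines A2 for qop=auth-int as Method ":" request-uri ":" H(entity-body), so only the raw entity body (not the re-assembled headers) feeds into that hash. A rough illustration of the construction, written here in Python only to show the formula (the method, URI, body, and the SHA-256 choice are all placeholders):

import hashlib

def h(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()  # assuming SHA-256 was negotiated

method = "POST"                    # hypothetical request values
request_uri = "/api/resource"
entity_body = b'{"example": true}'

a2 = f"{method}:{request_uri}:{h(entity_body)}"  # A2 for qop=auth-int per RFC 7616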

Third party to PeopleSoft SSO integration

I have to write sign-on PeopleCode to make a service call by passing a token (sent from the third party) to an API and get the response (if the token is valid, the response will contain the username) in JSON format to create a PS_TOKEN.
I am new to PeopleCode. How can I run an HTTP POST request, passing the token, and get the response using PeopleCode?
You would create a synchronous service operation in the Integration Broker. The Integration Broker works best if you are sending XML or JSON. If this is just a regular HTTP POST with form fields, it can cause some issues with the Integration Broker. I had a similar case and could not get a basic HTTP POST to work, but ended up using HTTP POST multipart/form-data and was able to get that to work.
Steps I had to take to make this work:
Create a Message (document based or rowset based are both possible)
Create a Service Operation and related objects
Create a Transform App Engine to convert the Message to an HTTP POST multipart/form-data
Create a routing and modify the connector properties to send the content type of multipart/form-data. Also call the Transform app engine as part of the routing.
The issue with an application/x-www-form-urlencoded POST is that PeopleSoft seems to do another URL encoding after the Transform, which is the last point where you can touch the output with code. This final URL encoding was encoding the = sign in the form post, which made the format invalid.
Your other option would be to write this in Java and call the Java class from within PeopleSoft (or mix the Java objects in with PeopleCode). If you choose to go this way, the App Server needs to have connectivity to your authentication server. My only experience with this is a client that used this approach and had issues under heavy load. The cause of the performance issue was never determined; they switched to LDAP instead to resolve it.

Which REST operation (GET, PUT, or POST) for validating information?

My users enter a few information fields in an iOS app.
This information must be validated on my server, which has a RESTful API.
After validation the UI of the iOS app changes to indicate the result.
Neither GET, PUT, nor POST seems appropriate, because I'm not getting a resource, and no resource is created or updated.
What is the best fitting REST operation to implement this validation?
I have the same scenario as you and use PUT for it. You have to ask yourself: "When I send the same request twice, does it result in a different state on the server?" If yes, use POST; if no, use PUT.
My users enter a few information fields in an iOS app. This information must be validated on my server, which has a RESTful API. After validation the UI of the iOS app changes to indicate the result. ... I'm not getting a resource, and neither is a resource created or updated.
Since you aren't saving anything (not modifying any resource), I'd say this is technically more RPC than RESTful.
The following is my opinion, so don't take it as gospel:
If the information is simply being submitted and you're saying yes or no, and you're not saving it, I'd say POST is fine.
If information were actually being saved / updated, then choosing the proper HTTP method would be a lot more relevant.
POST = CREATE / SUBMIT (in an RPC context)
PUT = UPDATE (or CREATE if there is nothing to UPDATE)
I recommend using a ValidationResource and two requests. Each instance of this resource represents the validation of a set of data. The workflow:
1. Create new ValidationResource
Request: POST /path/to/validations
data to validate as the body
Response: 201 Created
Location: /path/to/validations/<unique-id-of-this-validation>
2. Look up result
Request: GET /path/to/validations/<unique-id-of-this-validation>
Response: 200 OK
body: {'valid': true} or {'valid': false}
This is a RESTful approach in which the Validation is a Resource with server state.
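A rough FastAPI sketch of this two-request workflow; the /validations path, the payload fields, and the validation rule are all invented for illustration:

import uuid
from fastapi import FastAPI, HTTPException, Response
from pydantic import BaseModel

app = FastAPI()
results = {}  # in-memory store: validation id -> result

class Payload(BaseModel):
    email: str  # hypothetical fields to validate
    phone: str

@app.post("/validations", status_code=201)
def create_validation(payload: Payload, response: Response):
    validation_id = str(uuid.uuid4())
    results[validation_id] = "@" in payload.email  # stand-in validation rule
    response.headers["Location"] = f"/validations/{validation_id}"
    return {"id": validation_id}

@app.get("/validations/{validation_id}")
def get_validation(validation_id: str):
    if validation_id not in results:
        raise HTTPException(status_code=404)
    return {"valid": results[validation_id]}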
Google proposes the use of custom methods for REST APIs:
For custom methods, they should use the following generic HTTP mapping:
https://service.name/v1/some/resource/name:customVerb
The reason to use : instead of / to separate the custom verb from the resource name is to support arbitrary paths. For example, undelete a file can map to POST /files/a/long/file/name:undelete
Source: https://cloud.google.com/apis/design/custom_methods
So for validation the URL should be POST /resource:validate
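For what it's worth, a literal colon in a route path works in a framework like FastAPI, so the custom-method style can be sketched like this (the path and fields are illustrative):

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class UserData(BaseModel):
    email: str  # hypothetical field to validate

@app.post("/users:validate")  # Google-style custom method: ":" separates resource and verb
def validate_user(data: UserData):
    return {"valid": "@" in data.email}  # stand-in validation rule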
I believe it is similar to a GET of an entity, but since we need to send data to validate, and putting confidential data in the URL is a bad habit (query strings tend to end up in logs and browser history, even over TLS), the only options left are POST or PUT.
However, you may save or update the data during validation (e.g. "verified": false). Based on the requirement, you can go for POST or PUT (POST is recommended if there is no update).
POST /user/validate-something
It seems like you're not doing it the correct way. If the validation is on the server side, it should happen while submitting the data with a POST method. You then validate that data; if validation fails you can raise a 400 Bad Request error, otherwise you create the resource.
This approach is more RESTful, as the POST method is properly used either to create a resource or to return 400 if validation fails.
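As a sketch of that approach (resource name and fields invented for illustration), the 400 comes straight from the create endpoint:

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class User(BaseModel):
    email: str  # hypothetical fields
    phone: str

@app.post("/users", status_code=201)
def create_user(user: User):
    if "@" not in user.email:  # validate while submitting
        raise HTTPException(status_code=400, detail="invalid email")
    # ...persist the resource here...
    return user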

Amazon DynamoDB exception error

When we call DynamoDB via the HTTP REST API, it gives this error.
Can I know what the problem is? What are all the required things we need to append to the DynamoDB URL?
http://dynamodb.us-east-1.amazonaws.com/?aws_access_key=XXXXXXXXXXXXXXXX&aws_secret_access_key=ZZZZZZZZZZZZZZZZZZZZZZ
Do we need to append any more parameters to this URL? Please let me know.
http://docs.amazonwebservices.com/amazondynamodb/latest/developerguide/UsingJSON.html#JSONMajorExample
Your solution is in the same documentation:
http://docs.amazonwebservices.com/amazondynamodb/latest/developerguide/MakingHTTPRequests.html
If you don't use one of the AWS SDKs, you can perform Amazon DynamoDB operations over HTTP using the POST request method. The POST method requires you to specify the operation in the header of the request and provide the data for the operation in JSON format in the body of the request.
You need to make a POST request with all the required parameters mentioned on that page.
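As a hedged illustration of what that looks like: the operation name goes in the X-Amz-Target header, the data goes in a JSON body, and the request has to be signed with Signature Version 4 rather than passing keys in the URL. The sketch below leans on botocore just for the signing step; the table name is made up:

import json
import requests
from botocore.auth import SigV4Auth
from botocore.awsrequest import AWSRequest
from botocore.session import Session

region = "us-east-1"
endpoint = f"https://dynamodb.{region}.amazonaws.com/"
body = json.dumps({"TableName": "MyTable"})  # example payload for DescribeTable

request = AWSRequest(
    method="POST",
    url=endpoint,
    data=body,
    headers={
        "Content-Type": "application/x-amz-json-1.0",
        "X-Amz-Target": "DynamoDB_20120810.DescribeTable",  # operation goes in the header
    },
)
# Credentials come from the environment/shared config, never from the URL.
SigV4Auth(Session().get_credentials(), "dynamodb", region).add_auth(request)

response = requests.post(endpoint, data=body, headers=dict(request.headers))
print(response.status_code, response.text)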
