Do Desktop apps need HTTP preflight requests?

I'm developing an app that needs to get info from a third-party API. I've been building it as a web application with Vue.js. For the requests I tried axios, jQuery, and the fetch API, but I'm having trouble with the preflight requests: the API doesn't seem to handle the OPTIONS requests properly and it returns a 405 error. (I made a GET request to the same URL through Postman and it worked normally, and I also edited an OPTIONS request in Firefox's network panel to become a GET request and it returned a 200 status.)
Now I'm thinking of abandoning the web application idea and building it as a desktop application instead, but I need to know whether preflight requests are going to be the default behavior in that kind of app too.
Thanks for your attention!

No, CORS preflight requests are made by browsers, and are necessary due to the browser security model. They would not be used by a desktop application.
You can easily test this with curl, Postman, etc. It sounds like you tried this, but the details you've described are off: don't change anything to GET. Use the actual request you're trying to make, but do it outside the browser context. If the API responds appropriately, then it should work in a desktop application.
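To see the difference concretely, here is a minimal, self-contained Python sketch (standard library only, using a throwaway local server rather than the real API) showing that a plain HTTP client, such as a desktop app would use, never sends a preflight OPTIONS request:

```python
# Sketch: a non-browser HTTP client sends its request directly, with no CORS
# preflight. We start a tiny local server that records request methods, make
# a GET with urllib, and confirm the server never saw an OPTIONS request.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

seen_methods = []

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        seen_methods.append(self.command)
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"ok")

    def do_OPTIONS(self):  # only a browser preflight would land here
        seen_methods.append(self.command)
        self.send_response(405)
        self.end_headers()

    def log_message(self, *args):  # silence per-request console logging
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_address[1]}/"
with urllib.request.urlopen(url) as resp:
    body = resp.read()

server.shutdown()
print(seen_methods)  # ['GET'] -- no OPTIONS preflight outside a browser
```

The same holds for any desktop HTTP library: the preflight is added by the browser's security model, not by HTTP itself.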

Related

Retrieve openid bearer token using headless browser setup

Using OkHttp3 I was happily scraping a website for quite some time now. However, some components of the website have been upgraded and are now using an additional OpenID bearer authentication.
I am 99.9% positive my requests are failing because of this bearer token: when I check with Chrome dev tools, I see the bearer token popping up only for these parts. Moreover, a couple of requests go to links that end with ".well-known/openid-configuration". In addition, when I hardcode the bearer token from my browser into my OkHttp3 code, everything works. Without it, I get a 401 Unauthorized message.
I figured that my browser emulation was not close enough to the real situation, so I decided to use a headless browser setup that performs the JavaScript invocations. Since I am using Java, I used HtmlUnit. With this tool I could quickly get to the point where I could successfully scrape parts of the website (just as with OkHttp3), but it would again fail on the newly updated parts. I checked but couldn't find the bearer token in any of the responses (neither in the headers nor in the cookies).
Is there any chance this approach (using a headless browser) could work? Or are there alternative approaches I could try?
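For context on those `.well-known/openid-configuration` URLs: they are standard OpenID Connect discovery documents that advertise, among other things, the endpoint a client can use to obtain a token, which is then sent as `Authorization: Bearer <token>`. A hedged sketch using a made-up sample document (the field names are standard OIDC; the URLs are placeholders, not the asker's site):

```python
# Sketch: what an OpenID Connect discovery document looks like and why it
# matters here. The JSON below is illustrative; a real client fetches it
# from <issuer>/.well-known/openid-configuration.
import json

discovery_doc = json.loads("""
{
  "issuer": "https://auth.example.com",
  "authorization_endpoint": "https://auth.example.com/authorize",
  "token_endpoint": "https://auth.example.com/token",
  "jwks_uri": "https://auth.example.com/jwks"
}
""")

# The token_endpoint is where a client obtains the bearer token that the
# browser then attaches to API calls as an Authorization header.
token_endpoint = discovery_doc["token_endpoint"]
auth_header = {"Authorization": "Bearer <token obtained from token_endpoint>"}
print(token_endpoint)
```

So rather than hunting for the token in scraped responses, it may be more fruitful to replay the token request the browser makes against that endpoint.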

Can browsers be called REST clients?

If browsers use HTTP to connect to a server, and in any web application when we hit a URL the request is received by a controller mapped to that URL, can we say browsers are also REST clients?
That would depend entirely on what you use as a browser, but generally no: a browser lacks meaningful tooling to probe a RESTful server out of the box, and comes with features that a REST client application would not otherwise need, so it would not be considered a REST client. A browser might be considered a more generic HTTP client, but even that does not fully describe the problem domain of a browser (rendering, scripting, etc.). Even if you build a web interface that probes a REST service by submitting forms, that does not make the browser a REST client; rather, your website/web application would be the REST client application.
Yes,
the protocol the browser initially uses to communicate with the web server is clearly a RESTful protocol.
Nothing more is necessary.
But it can get a bit more complicated.
The browser can fetch application code (JavaScript) in a RESTful way (e.g. via GET) and execute that code, which in turn can communicate RESTfully (Ajax).

Block http requests not submitted via UI

This might seem like a strange question, but is it possible to detect and reject requests sent to my web server from outside my UI? For example if someone sent a post request to create a resource using the correct authorization token or session info from a tool such as Postman, could it be detected?
I want to prevent someone from using my application as some makeshift API.
Probably the best you can do is to verify (or come close to verifying) that the request comes from a human being, by using a captcha service such as reCAPTCHA.
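If you go the reCAPTCHA route, the token the widget gives the browser has to be verified server-side against Google's `siteverify` endpoint. A minimal sketch that only builds the verification request without sending it (the secret key and client token values are placeholders; the endpoint and parameter names are from the documented API):

```python
# Sketch: constructing the server-side reCAPTCHA verification request.
# "secret" and "response" are the documented siteverify parameters; the
# values below are placeholders. The request is built but not sent.
import urllib.parse
import urllib.request

VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

def build_verify_request(secret: str, client_token: str) -> urllib.request.Request:
    data = urllib.parse.urlencode({
        "secret": secret,          # your site's secret key
        "response": client_token,  # token the widget handed to the browser
    }).encode()
    return urllib.request.Request(VERIFY_URL, data=data, method="POST")

req = build_verify_request("YOUR_SECRET_KEY", "token-from-client")
print(req.full_url, req.get_method())
```

Sending the request and checking `"success"` in the JSON response is what actually rejects non-human (e.g. Postman-scripted) submissions, since those callers have no valid widget token.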

HTTP Headers logger

How can I log all HTTP headers from a PHP web application, including Ajax requests etc.? Is there some easy-to-use middleman application which can save all requests to a specific HTTP host in a specific file?
I need it for web application behavior analysis.
Postman might be what you're looking for. It's a Chrome plugin with which you can log everything, including the headers.
https://chrome.google.com/webstore/detail/postman/fhbjgbiflinjbdggehcddcbncdddomop?hl=en
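In practice a dedicated proxy (mitmproxy, Charles, Fiddler) is the usual middleman for this, but the core idea can be sketched in a few lines: a tiny local server that appends every incoming request's headers to a file. The file name and the test request below are illustrative only:

```python
# Sketch of a header-logging "middleman": every request that hits this tiny
# server gets its method, path, and headers appended to a log file. A real
# setup would use a proper proxy in front of the PHP application instead.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

LOG_FILE = "headers.log"  # placeholder path

class HeaderLogger(BaseHTTPRequestHandler):
    def do_GET(self):
        with open(LOG_FILE, "a") as f:
            f.write(f"{self.command} {self.path}\n")
            for name, value in self.headers.items():
                f.write(f"  {name}: {value}\n")
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"logged")

    def log_message(self, *args):  # keep the console quiet
        pass

server = HTTPServer(("127.0.0.1", 0), HeaderLogger)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Simulate one Ajax-style request against the logger.
port = server.server_address[1]
req = urllib.request.Request(f"http://127.0.0.1:{port}/page",
                             headers={"X-Requested-With": "XMLHttpRequest"})
urllib.request.urlopen(req).read()
server.shutdown()

log_text = open(LOG_FILE).read()
print(log_text)
```

Pointing the application (or its reverse proxy) at such a middleman gives a file with every request's headers, which is what the behavior analysis needs.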

How to automate logging in and retrieve data?

I want to automate logging into a website and retrieving certain data.
I thought the way to do this would be to sniff the HTTP requests so I know where the login form is being POSTed to so I can do the same using NodeJS/Java/Python.
However I can't seem to find the HTTP request that handles it.
The site seems to use some Java-applet and a lot of Javascript.
This is the site: link
Should I have a different approach?
I also wonder about storing a session cookie and sending it with each HTTP request after logging in.
I'm sorry if I am not too clear; I will try to explain myself further and edit this post if needed.
You can use the developer console (hit F12) in Chrome (this also works in other browsers) and then click the "Network" tab. There you see all network calls.
To detect what http requests are performed from a mobile device, you can use a proxy like Charles Proxy.
Also be aware that if you POST from Node.js, the cookies won't be set in the user's browser.
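Once the login request has been found in the Network tab, the cookie-session part can be sketched like this (standard library only; the `/login` and `/data` paths, the form field names, and the fake in-process server are placeholders for whatever the real site uses):

```python
# Sketch: automating a form login and replaying the session cookie on later
# requests. A throwaway local server stands in for the real site: POST /login
# sets a session cookie, GET /data requires it.
import http.cookiejar
import threading
import urllib.parse
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class FakeSite(BaseHTTPRequestHandler):
    def do_POST(self):  # the login form's target
        self.rfile.read(int(self.headers["Content-Length"]))
        self.send_response(200)
        self.send_header("Set-Cookie", "session=abc123")
        self.end_headers()

    def do_GET(self):   # a page that requires the session cookie
        if "session=abc123" in (self.headers.get("Cookie") or ""):
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"secret data")
        else:
            self.send_response(401)
            self.end_headers()

    def log_message(self, *args):
        pass

server = HTTPServer(("127.0.0.1", 0), FakeSite)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

# The cookie jar captures Set-Cookie from the login response and replays it
# automatically on every later request through the same opener.
jar = http.cookiejar.CookieJar()
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))

form = urllib.parse.urlencode({"user": "me", "pass": "secret"}).encode()
opener.open(f"{base}/login", data=form)    # POST the login form
data = opener.open(f"{base}/data").read()  # cookie sent automatically
server.shutdown()
print(data)  # b'secret data'
```

If the real site builds the login request in JavaScript (or a Java applet, as here), this plain-HTTP approach only works once you've captured the exact request the browser sends; otherwise a headless browser is the fallback.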
