HTTP Headers logger - http

How can I log all HTTP headers from a PHP web application, including AJAX requests? Is there an easy-to-use middleman application that can save all requests to a specific HTTP host to a specific file?
I need it for web application behavior analysis.

Postman might be what you're looking for. It's a Chrome extension that lets you log everything, including the headers.
https://chrome.google.com/webstore/detail/postman/fhbjgbiflinjbdggehcddcbncdddomop?hl=en

Related

Azure ASP.NET Core web api returns 404 for proxied multipart/form-data request

I'm new to Azure and trying to set up my Next.js client app and my ASP.NET Core backend app. Everything seems to play well now, except for file uploads. It works on localhost, but in production the backend returns a 404 web page (attached image) before the request reaches the actual API endpoint. I've also successfully made a multipart/form-data POST request to it in Postman from my computer.
The way I implemented this is that I proxy the upload from the browser through an API route (on the client's server side) to the backend. I have to go via the client's server side to append a Bearer token from an httpOnly cookie.
I've enabled CORS in Startup.cs:
app.UseCors(builder => { builder.AllowAnyOrigin().AllowAnyHeader().AllowAnyMethod(); });
The frontend and backend apps run as separate services, and I've tried to enable CORS in the Azure portal as well, but there I could only allow origins, not headers and methods. The error message doesn't indicate a CORS problem, but I just wanted to make sure.
As far as I can see the requests look good, with correct URLs and origins. I suspect I'm missing some config in azure, but I didn't get any further by following the hints in the error message.
Any suggestions as to what may cause this, or where I can start looking? I'm not sure where to find log output for this issue.
I finally got this working. It turned out the Host header in the proxied HTTP request was unchanged: I had only changed the URL for the proxy request. Setting the Host header manually as well solved it. This also explains why it worked on localhost, since there both the client and the backend were running on the same host.
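The fix can be sketched like this (a hypothetical Next.js-style proxy handler; the backend URL, cookie name, and handler shape are illustrative, not the poster's actual code). The essential part is rewriting the Host header to match the backend's hostname before forwarding, instead of passing the browser's Host value through:

```javascript
// The essential fix, isolated: before forwarding, rewrite the Host
// header so it names the backend host instead of the frontend host.
function rewriteHostHeader(incomingHeaders, backendUrl) {
  const headers = { ...incomingHeaders };
  headers.host = new URL(backendUrl).host; // e.g. 'backend.azurewebsites.net'
  return headers;
}

// Illustrative proxy route using the helper. The URL, cookie name,
// and streaming details are assumptions for the sketch.
async function proxyUpload(req, res) {
  const backendUrl = 'https://backend.example.com/api/upload'; // hypothetical
  const headers = rewriteHostHeader(req.headers, backendUrl);
  headers.authorization = `Bearer ${req.cookies.token}`; // token from the httpOnly cookie

  const upstream = await fetch(backendUrl, {
    method: 'POST',
    headers,
    body: req,      // stream the multipart body through unchanged
    duplex: 'half', // required by Node's fetch when streaming a request body
  });
  res.status(upstream.status).send(await upstream.text());
}
```

Forwarding the browser's original Host is what made Azure's front end reject the request with a 404 before it ever reached the API, since routing there is based on the Host header.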

I'm getting some unknown requests in my App Engine application and want to know what they are

It's just a simple Express.js web app in the standard environment. I added my custom domain and it's working OK,
but a couple of days ago I started receiving these requests and I want to know what they are.
Those requests are typically Let's Encrypt requests, sent to verify that you control the domain during the SSL certificate validation process.
The 404 Not Found responses mean the pages do not exist.
These requests are harmless, provided you do not receive a massive number of them.

Can my Heroku app call an HTTP endpoint on my GCP backend?

I tried deploying a Heroku web app whose Flask backend runs not on Heroku but on GCP, and got the following message in my browser's dev console:
Mixed Content: The page at 'https://x.herokuapp.com/' was loaded over
HTTPS, but requested an insecure XMLHttpRequest endpoint
'http://x:5000/endpoint'. This request has been blocked; the content
must be served over HTTPS.
I have little experience with serving and SSL, but my first temptation here is to find a way to make Heroku accept plain HTTP endpoints. I'd love to avoid setting up SSL if possible.
What are my options from here?
Thanks!
In the end I realised that by pointing my own HTTP domain at the project (rather than using Heroku's domain), I was able to avoid the issue: mixed-content blocking only applies when the page itself is served over HTTPS.
With the frontend on HTTP, I was then able to call HTTP endpoints on my GCP server.

Do Desktop apps need HTTP preflight requests?

I'm developing an app that needs to get info from a third-party API. I've been building it as a web application with Vue.js. For the requests I tried axios, jQuery, and the Fetch API, but I'm having trouble with the preflight requests: the API does not seem to handle OPTIONS requests properly and returns a 405 error. (A GET request to the same URL through Postman worked normally, and when I edited an OPTIONS request in Firefox's network panel to become a GET request, it returned a 200 status.)
Now I'm thinking of abandoning the idea of a web application and building it as a desktop application instead, but I need to know whether preflight requests are default behavior in that kind of app too.
Thanks for your attention!
No, CORS preflight requests are made by browsers, and are necessary due to the browser security model. They would not be used by a desktop application.
You can easily test this with curl, Postman, etc. It sounds like you tried this, but the details you've described are off: don't change anything to GET. Make the actual request you're trying to make, but outside the browser context. If the API responds appropriately, it will work in a desktop application.
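The suggested test can be sketched in Node, which, like a desktop app, sends no preflight. The endpoint and headers below are placeholders for whatever the real API call looks like:

```javascript
// Build the same request the browser would make, minus browser-added
// CORS machinery. (URL and headers are hypothetical placeholders.)
function buildRequest() {
  return {
    url: 'https://api.example.com/items', // substitute the real endpoint
    method: 'GET',
    headers: { Accept: 'application/json' },
  };
}

// Running this in Node sends only the GET itself -- no OPTIONS
// preflight is ever issued outside a browser.
async function replay() {
  const { url, ...init } = buildRequest();
  const res = await fetch(url, init);
  return res.status; // a 2xx here means a desktop app will work too
}
```

If this succeeds where the browser fails, the 405 is purely the API mishandling OPTIONS, and a desktop application, which never sends a preflight, won't hit it.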

HTTP site to HTTPS webservice using CORS

I have an HTML5/JS website on one domain, which uses an ASP.NET MVC web service on another domain for CORS queries.
Everything works fine with HTTP -> HTTP; however, now that we are adding login and authentication mechanisms for user-specific content, we want to enable HTTPS. The browser then simply refuses to send the OPTIONS request to the web service and gives an "Aborted" status.
I am testing with Firefox, and the web service is hosted on IIS7 with a self-signed certificate (generated with SelfSSL7).
Are there any known issues around this? I did check:
Cross domain request from HTTP to HTTPS aborts immediately
However, it says the solution is to make sure the cert is trusted, and to my knowledge SelfSSL does this via the /T option when I call it. Is there anything else that needs to change to get this working?
You will unfortunately need to trust the certificate manually in Firefox. Firefox maintains its own certificate store, so a self-signed certificate trusted in the Windows store (which is what SelfSSL's /T option affects) is not automatically trusted by Firefox. You can add a permanent exception through Firefox's certificate manager, or, I believe, override the behaviour in the profile configuration.
