I am currently using this clumsy way to check if a request is HTTP or HTTPS:
def get_scheme(self, request):
    if request.url.startswith('https:'):
        scheme = 'https'
    else:
        scheme = 'http'
    return scheme
Is there a request.is_secure() method in Python Requests?
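As far as I know, Requests has no is_secure() helper. A less brittle check is to parse the URL and compare the scheme instead of matching a string prefix; a minimal sketch:

from urllib.parse import urlsplit

def is_secure(url):
    # urlsplit normalizes the scheme to lowercase, so 'HTTPS://...' also matches.
    return urlsplit(url).scheme == 'https'

print(is_secure('https://example.com/'))  # True
print(is_secure('http://example.com/'))   # False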
We are implementing short URLs that redirect on our project.
I make an API request to my server using axios, with the info on the short URL.
The server responds successfully with redirect status 308.
I can see in the response headers that the Location header (the URL to redirect to) is correct.
We have also set Access-Control-Allow-Origin: *.
But the redirect does not follow through...
After receiving the 308, the browser attempts a preflight OPTIONS request to the redirect URL, followed by a GET request to the redirect URL.
Both of these return an error.
The preflight request fails with CORS Missing Allow Origin, and the GET request fails with NS_ERROR_DOM_BAD_URI.
I am not sure what the issue is. Is it on the front end or on the server side?
Any advice would be greatly appreciated!
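For context, a minimal Flask sketch of the kind of endpoint being described (the route, slug, and lookup table are hypothetical); note that the CORS header has to be present on the 308 response itself:

from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical store standing in for the real short-URL table.
SHORT_URLS = {'abc123': 'https://example.com/some/long/path'}

@app.route('/s/<slug>')
def expand(slug):
    # 308 (Permanent Redirect) preserves the request method.
    resp = redirect(SHORT_URLS[slug], code=308)
    resp.headers['Access-Control-Allow-Origin'] = '*'
    return resp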
When I send a request via Python 3.6 to some URLs, it waits until a timeout exception is raised (ConnectionError: HTTPSConnectionPool(host={host}, port=443): Read timed out). But when I send the same request via Python 2.7, it completes successfully with status code 200. Can you help me?
Version of Requests Package: 2.23.0
Sample Code:
import requests
url = "https://www.khaneyeshoma.ir/"
requests.get(url=url, timeout=10)
Thanks!
Sometimes the problem is caused by the timeout parameter. Try:
requests.get(url=url)
I think it is because of the website you are trying to access. The request itself is correct, but the site may require some extra headers.
If you try the same request on another address, it works:
import requests
url = "https://www.google.com"
requests.get(url=url, timeout=10)
Response:
<Response [200]>
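Sticking with Requests, the extra headers can be passed directly; a sketch (the User-Agent value is only an example):

import requests

url = "https://www.khaneyeshoma.ir/"
# Some servers stall or reject clients whose User-Agent looks like a script.
headers = {"User-Agent": "PostmanRuntime/7.6.0"}
response = requests.get(url, headers=headers, timeout=10)
print(response.status_code)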
You can use urllib.request with a Postman User-Agent header, and you won't need the timeout anymore:
import urllib.request

url = "https://www.khaneyeshoma.ir/"
req = urllib.request.Request(
    url,
    data=None,
    # Present a Postman-like client so the server responds normally.
    headers={'User-Agent': "PostmanRuntime/7.6.0"}
)
response = urllib.request.urlopen(req)
html = response.read()
print(html)
It is because of the whitespace between the header field-name (access-control-expose-headers) and the colon. From RFC 7230:
No whitespace is allowed between the header field-name and colon. In the past,
differences in the handling of such whitespace have led to security vulnerabilities
in request routing and response handling. A server MUST reject any received request
message that contains whitespace between a header field-name and colon with a
response code of 400 (Bad Request). A proxy MUST remove any such whitespace from a
response message before forwarding the message downstream.
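To illustrate the rule, a small sketch (the function is hypothetical) that flags a raw header line with whitespace before the colon:

def has_whitespace_before_colon(header_line):
    # Split at the first colon; a field-name that ends in whitespace is malformed.
    name, sep, _ = header_line.partition(':')
    return sep == ':' and name != name.rstrip()

print(has_whitespace_before_colon('access-control-expose-headers : value'))  # True
print(has_whitespace_before_colon('access-control-expose-headers: value'))   # False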
I've read https://developer.mozilla.org/en-US/docs/Web/HTTP/Authentication and the Basic Authentication chapter from HTTP: The Definitive Guide.
I thought Proxy-Authenticate + Proxy-Authorization + status code 407 was essentially the same as WWW-Authenticate + Authorization + status code 401. I thought that if the server responded with WWW-Authenticate + 401 or Proxy-Authenticate + 407, then under both conditions the browser would pop up an auth dialog and send the credentials back in the Authorization or Proxy-Authorization header.
The "WWW-Authenticate combination headers" worked as expected, while the "Proxy combination headers" did not. For Proxy-Authenticate + 407, I get ERR_UNEXPECTED_PROXY_AUTH in Chrome, and nothing happens in Firefox (no auth dialog pops up!).
Error in Chrome:
This site can’t be reached.
The webpage at http://localhost:5000/http_auth might be temporarily down or it may have moved permanently to a new web address.
ERR_UNEXPECTED_PROXY_AUTH
So what's the difference between these two sets of similar headers? When and where do I use Proxy-Authenticate? Practical examples that I can run would be much appreciated.
I am using Python with Flask for testing.
My server-side code (imports shared by both routes):
from flask import Flask, request, render_template, make_response

app = Flask(__name__)

WWW-Authenticate
@app.route('/www_auth')
def ha():
    print("====request headers begin======")
    print(request.headers)
    print("====request headers end======")
    # 'MTIzOjQ1Ng==' is base64 for the credentials '123:456'.
    if 'Authorization' in request.headers and request.headers['Authorization'] == 'Basic MTIzOjQ1Ng==':
        return render_template('demo.html')
    else:
        resp = make_response(render_template('demo.html'), 401)
        resp.headers['WWW-Authenticate'] = 'Basic realm="WWW-Authenticate required :)"'
        return resp
Proxy-Authenticate
@app.route('/proxy_auth')
def haha():
    print("====request headers begin======")
    print(request.headers)
    print("====request headers end======")
    if 'Proxy-Authorization' in request.headers and request.headers['Proxy-Authorization'] == 'Basic MTIzOjQ1Ng==':
        return render_template('demo.html')
    else:
        resp = make_response(render_template('demo.html'), 407)
        resp.headers['Proxy-Authenticate'] = 'Basic realm="Proxy-Authenticate required :)"'
        return resp
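For reference, a quick client-side check of the /www_auth route with Requests; the auth tuple below produces exactly the 'Basic MTIzOjQ1Ng==' header the route expects:

import requests

# Requests builds 'Authorization: Basic MTIzOjQ1Ng==' from this tuple.
resp = requests.get('http://localhost:5000/www_auth', auth=('123', '456'))
print(resp.status_code)  # 200 once the credentials match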
I did some tests and here's what I found. (I took a look at the RFC and, as usual, it's overwhelming :))
The Proxy-Authenticate set of headers can indeed produce an auth pop-up dialog too, but the proxy is something that one must first configure manually in the client/browser. In Firefox, for example, it is part of the proxy settings.
The Proxy-Authenticate set of headers is used when you connect to a proxy that requires a username and password.
Attention: you need to map the root path to your proxy function, like this:
@app.route('/')
def haha():
    # rest of the code
The workflow is:
Step 1: The client/browser sends its request to the proxy.
Step 2: The proxy responds with 407 and a Proxy-Authenticate response header, requiring a username and password.
Step 3: The client/browser resends the request with a Proxy-Authorization request header containing the credentials, and the proxy forwards it to the target website.
Subsequent requests still carry the Proxy-Authorization request header with the credentials, and the proxy keeps forwarding them to the target website.
In this case, the Proxy-Authorization header (with credentials) is sent automatically with each request.
If the proxy does not require authentication, the client can reach the target website through it directly, and there is no Proxy-Authorization header in the request. (Most free HTTP proxies that you find on the web work this way, I think.)
I also tried the WWW-Authenticate set of headers while the proxy was configured in Firefox. The result: every time I visit a new website, I need to authenticate again. So obviously the WWW-Authenticate set of headers isn't meant for this case.
Any other in-depth opinions/explanations would be appreciated. After all, I merely ran some tests and I want to know more.
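As a client-side illustration (the proxy host and credentials are made up), Requests derives the Proxy-Authorization header from the userinfo part of the proxy URL:

import requests

# Hypothetical authenticating proxy.
proxies = {
    'http': 'http://user:secret@proxy.example.com:3128',
    'https': 'http://user:secret@proxy.example.com:3128',
}
resp = requests.get('http://example.com/', proxies=proxies)
print(resp.status_code)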
I am a current Apache Tomcat user designing an Akka HTTP based replacement for an HTTPS web service that uses client certificates for authentication and authorization. With Tomcat I am accustomed to retrieving the client X509Certificate via a servlet request attribute:
request.getAttribute("javax.servlet.request.X509Certificate")
I need the certificate for some additional authorization checks inside the handlers for selected routes. How would I retrieve the client certificate in a similar way with Akka HTTP 10.0.x?
You need to enable decorating requests with TLS session info through the config settings for the server:
akka.http.server.parsing.tls-session-info-header = on
And then extract the info for a specific request using the synthetic header akka.http.scaladsl.model.headers.`Tls-Session-Info`, like so:
headerValueByType[`Tls-Session-Info`]() { sessionInfo =>
  val sslSession = sessionInfo.getSession()
  sslSession.getPeerCertificates
  // ... etc ...
}
How do I capture a request (ngx.req) in the lua-nginx-module, depending on the method of the request?
In the lua nginx config, I need to check some conditions before sending the request on, and then return the response as normal. How can I do that?
I have used ngx.location.capture, but its method differs from the method of ngx.req. I have also used ngx.redirect(ngx.req), but it didn't work.
Just use the return statement to let the request continue as normal: return;