api.loganalytics.io unable to get local issuer certificate - python-requests

I've been using code based on the snippet below for several months and it worked.
import requests
resp = requests.get(
    'https://api.loganalytics.io',
    # verify=False
)
Now I have an error:
File "C:....virtualenvs\pythonProject-XaZ9hdp4\lib\site-packages\requests\adapters.py", line 563, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='api.loganalytics.io', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:997)')))
I checked the SSL certificate for api.loganalytics.io with a third-party online service and everything looks fine with it.
I created a new Python project and reinstalled requests and certifi in a new virtual environment.
What other certificate could this error message be referring to? How can I find and update it?
I'm working on Windows 10.

import requests
resp = requests.get(
    'https://api.loganalytics.io',
    # verify=False
)
In this code we need to change verify=False to verify=ssl.CERT_NONE (which requires importing ssl):
import ssl
import requests

resp = requests.get(
    'https://api.loganalytics.io',
    verify=ssl.CERT_NONE
)
If you have a client certificate, you can attach it to a session:
s = requests.Session()
s.cert = '/path/client.cert'
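If you want to confirm which CA bundle your virtual environment is actually using (and whether an environment variable is overriding it), a quick diagnostic sketch looks like this; it is a check, not a fix by itself:
import os
import certifi
import requests

# Bundle that requests uses by default (shipped by the certifi package).
print(certifi.where())
# Environment variables that can point requests at a different bundle.
print(os.environ.get("REQUESTS_CA_BUNDLE"), os.environ.get("CURL_CA_BUNDLE"))

resp = requests.get('https://api.loganalytics.io', verify=certifi.where())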

Related

SSL certificate error while using requests to scrape. Usual solutions return connection error

I am a beginner.
I was trying to use requests to pull a website's data, and it threw an SSL certificate error.
Then I tried to solve it with verify=False, which raised another error. Are the two related? How do I solve this?
The URL is:
https://www.nepalstock.com.np/
I tried:
import requests
web = requests.get("https://www.nepalstock.com.np/)"
which returned error:
Traceback (most recent call last):
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:997)
During handling of the above exception, another exception occurred:
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='www.nepalstock.com.np', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:997)')))
During handling of the above exception, another exception occurred:
requests.exceptions.SSLError: HTTPSConnectionPool(host='www.nepalstock.com.np', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:997)')))
Then I tried the usual suggested solution:
import requests
web = requests.get("https://www.nepalstock.com.np/", verify = False)
which raised error:
C:\Users\pk\miniconda3\envs\data_science\lib\site-packages\urllib3\connectionpool.py:1045: InsecureRequestWarning: Unverified HTTPS request is being made to host 'www.nepalstock.com.np'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
File "C:\Users\pk\miniconda3\envs\data_science\lib\http\client.py", line 287, in _read_status
raise RemoteDisconnected("Remote end closed connection without"
http.client.RemoteDisconnected: Remote end closed connection without response
During handling of the above exception, another exception occurred:
urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
During handling of the above exception, another exception occurred:
requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
I tried using certifi to reference its CA bundle when making the request, but it returned the same initial error.
Is the second error a different issue, or is it caused by the initial problem?
How do we solve this?
PS:
The tracebacks are trimmed for brevity.
First error
The first error is an SSL certificate issue. This can occur if:
- The website you are trying to access does not have a trusted SSL certificate
- Your local certificate store is not correctly set up
In your situation, it is probably the certificate store. Search online for how to correctly install the latest root certificates on your system; that should fix the problem.
Second error
The error message says the remote website closed your connection; this can happen for many reasons (timeout, SSL version mismatch, bot protection, etc.).
The most probable reason is that the website detects you are a script/bot and therefore blocks you. You could try to get around this by faking your user agent, which can be done with the headers field.
Example:
import requests
headers = { 'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36' }
web = requests.get("https://www.nepalstock.com.np/", headers=headers)

Forcing a TLS 1.0 POST request with Requests

To start, I know that TLS 1.0 is ancient and should not be used, but I need to connect to some really old local hardware that doesn't support anything else at the moment.
import ssl
from OpenSSL import SSL
try:
    import urllib3.contrib.pyopenssl
    urllib3.contrib.pyopenssl.inject_into_urllib3()
except ImportError:
    pass

import requests
import sys, os, select, socket
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.poolmanager import PoolManager
from requests.packages.urllib3.util import ssl_
from requests.packages.urllib3.contrib import pyopenssl

CIPHERS = (
    'ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-SHA384:'
    'ECDHE-ECDSA-AES256-SHA384:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-SHA256:AES256-SHA:'
)

class TlsAdapter(HTTPAdapter):
    def __init__(self, ssl_options=0, **kwargs):
        self.ssl_options = ssl_options
        super(TlsAdapter, self).__init__(**kwargs)

    def init_poolmanager(self, *pool_args, **pool_kwargs):
        ctx = SSL.Context(SSL.TLSv1_METHOD)
        self.poolmanager = PoolManager(*pool_args,
                                       ssl_context=ctx,
                                       **pool_kwargs)

session = requests.Session()
adapter = TlsAdapter(ssl.OP_NO_TLSv1_1 | ssl.OP_NO_TLSv1_2)
session.mount("https://", adapter)

data = {"key": "value"}
try:
    r = session.post("https://192.168.1.1", data)
    print(r)
except Exception as exception:
    print(exception)
I've tried several ways. The above code is mostly ripped from similar issues posted here in the past, but Python 3's ssl module no longer supports TLSv1, so it throws an unsupported-protocol error. I added the urllib3.contrib.pyopenssl import to try to force it to use pyOpenSSL instead, per the urllib3 documentation. The current error with this code is:
load_verify_locations() takes from 2 to 3 positional arguments but 4 were given
I know this comes from the verify part of the urllib3 context, and I need to fix the context for pyOpenSSL, but I've been stuck here trying to do that.
I analyzed the website in question at https://www.ssllabs.com/; the handshake simulator there doesn't test with Python. I haven't been successful using Python. However, with JDK 1.8 I was able to comment out the relevant line in the security file, as shown in https://www.youtube.com/watch?v=xSejtYOh4C0, and work around the issue.
The server prefers these cipher suites. Are these ciphers supported in urllib3?
TLS_RSA_WITH_RC4_128_MD5 (0x4) INSECURE 128
TLS_RSA_WITH_RC4_128_SHA (0x5) INSECURE 128
TLS_RSA_WITH_3DES_EDE_CBC_SHA (0xa) WEAK
Right now I'm stuck with the below error:
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='{}', port={}): Max retries exceeded with url: /xxx.htm (Caused by ProtocolError('Connection aborted.', FileNotFoundError(2, 'No such file or directory')))
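For what it's worth, a commonly suggested alternative is to skip the pyOpenSSL injection entirely and build a legacy context with urllib3's own helper. This is only a sketch, assuming urllib3 1.26.x and an OpenSSL build that still allows TLS 1.0 once the security level is lowered; the RC4 suites your server prefers may simply not be available in modern OpenSSL builds:
import ssl
import requests
from requests.adapters import HTTPAdapter
from urllib3.poolmanager import PoolManager
from urllib3.util import ssl_

class LegacyTlsAdapter(HTTPAdapter):
    # Force TLS 1.0 and re-enable legacy ciphers via OpenSSL's security level.
    def init_poolmanager(self, connections, maxsize, block=False, **kwargs):
        ctx = ssl_.create_urllib3_context(
            ssl_version=ssl.PROTOCOL_TLSv1,
            ciphers="ALL:@SECLEVEL=0",
        )
        self.poolmanager = PoolManager(
            num_pools=connections, maxsize=maxsize, block=block,
            ssl_context=ctx, **kwargs)

session = requests.Session()
session.mount("https://192.168.1.1", LegacyTlsAdapter())
# verify=False because old local hardware typically has a self-signed certificate.
r = session.post("https://192.168.1.1", data={"key": "value"}, verify=False)
print(r)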

In Python3.6.5 Requests getting SSL Certificate Error

I tried to get the following URL using requests, but I am getting an SSL certificate error. I have tried all the earlier Stack Overflow suggestions but nothing seems to work.
Code:
import certifi
import requests

resp = requests.get('https://www.magidglove.com/', verify=certifi.where())
I also tried verify=False, and it still does not work.
Error:
raise MaxRetryError(_pool, url, error or ResponseError(cause))urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='www.magidglove.com', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLError("bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')],)",),))
TL;DR - The server is misconfigured. Either fix the server, pass verify=ssl.CERT_NONE, or download and pass www.magidglove.com's certificate explicitly.
The problem here is on the server, not the client. The server is only configured to return its own certificate, which isn't enough for the client to trust it. Servers generally need to be configured to return the full certificate chain.
In order to diagnose this, you can use openssl to view some raw information about the certificate chain returned:
$ openssl s_client -connect www.google.com:443 -showcerts -servername www.google.com
CONNECTED(00000003)
depth=2 OU = GlobalSign Root CA - R2, O = GlobalSign, CN = GlobalSign
verify return:1
depth=1 C = US, O = Google Trust Services, CN = Google Internet Authority G3
verify return:1
depth=0 C = US, ST = California, L = Mountain View, O = Google LLC, CN = www.google.com
verify return:1
... snipped the rest of the output ...
You can see that 3 certificates were returned by the server, and they were verified in reverse order. The GlobalSign certificate is trusted by the certifi library, the cert at depth=1 was created by the cert at depth=2, and the last cert, CN=www.google.com, was created by the cert at depth=1.
Now let's compare that to the server you were trying to connect to:
$ openssl s_client -connect www.magidglove.com:443 -showcerts -servername www.magidglove.com
CONNECTED(00000003)
depth=0 businessCategory = Private Organization, jurisdictionC = US, jurisdictionST = Illinois, serialNumber = 00043176, C = US, ST = Illinois, L = Romeoville, O = "Magid Glove and Safety Manufacturing Company, L.L.C.", OU = web site, CN = www.magidglove.com
verify error:num=20:unable to get local issuer certificate
verify return:1
depth=0 businessCategory = Private Organization, jurisdictionC = US, jurisdictionST = Illinois, serialNumber = 00043176, C = US, ST = Illinois, L = Romeoville, O = "Magid Glove and Safety Manufacturing Company, L.L.C.", OU = web site, CN = www.magidglove.com
verify error:num=21:unable to verify the first certificate
verify return:1
You can see a few things from this output:
- The server only returned a single certificate
- The client tried to verify the certificate and couldn't
It takes some knowledge of SSL to see that verification failed because the client doesn't trust the certificate; knowing that, we can see that having the server return the full certificate chain will fix it. I suspect Chrome and other browsers don't report an error because the browser itself already knows about the intermediate (DigiCert) certificate, so it doesn't require the full chain.
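If disabling verification is not acceptable, the other option from the TL;DR is to download the site's certificate plus the missing issuer certificate(s), concatenate them into a PEM bundle, and point requests at that file. The file name here is only a placeholder:
import requests

# Hypothetical bundle: the certificate printed by "openssl s_client -showcerts"
# plus the issuing CA certificate(s), concatenated into one PEM file.
resp = requests.get('https://www.magidglove.com/', verify='magidglove-chain.pem')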
This problem can also be worked around by importing ssl in your Python code and passing verify=ssl.CERT_NONE, so your code should look something like this:
import requests
import ssl
resp = requests.get('https://www.magidglove.com/', verify=ssl.CERT_NONE)
That being said, when running this code you might come across this warning:
/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/urllib3/connectionpool.py:858: InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
which you could disable by adding the following lines to your code:
import requests
import ssl
import urllib3
urllib3.disable_warnings()
resp = requests.get('https://www.magidglove.com/', verify=ssl.CERT_NONE)
Hope this helps!

SSL: CERTIFICATE_VERIFY_FAILED error displayed while connecting to SignalR through Python

from requests import Session
from signalr import Connection

with Session() as session:
    connection = Connection("https://localhost:443/Platform", session)
    Signalhub = connection.register_hub('MessageRouteHubspot')
    with connection:
        Signalhub.server.invoke('subscribe', '1_FPGA_ACCESS_COMMANDS')
When executing this I get the error:
requests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)
How can I bypass/ignore SSL verification?
Python's signalr-client uses the websocket-client package to establish the connection, which in turn uses OpenSSL for SSL/TLS. It appears that the WebSocket client expects the CA (Certificate Authority) bundle to be passed via the environment variable WEBSOCKET_CLIENT_CA_BUNDLE.
Exporting this variable with a CA bundle that includes the certificates signing the original site's certificate should do the trick. Below is an example on my Ubuntu-based system.
$ export WEBSOCKET_CLIENT_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt
$ python test-signalr.py
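If you prefer to keep everything inside the Python script, setting the variable before the connection is created should work as well (assuming the library reads it at connection time); the bundle path is the Debian/Ubuntu default and may differ on your system:
import os

# Must be set before signalr/websocket-client opens the connection.
os.environ["WEBSOCKET_CLIENT_CA_BUNDLE"] = "/etc/ssl/certs/ca-certificates.crt"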

Python Requests: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:547)

I'm trying to log in and scrape an airline website with the Python Requests package. I am getting the below error just by trying to load the main website. This code used to work last year, but I haven't tried it again until now. Any ideas what is going on?
[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:547)
I'm using Requests 2.2.1.
ssladapter.py
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.poolmanager import PoolManager
from ssl import PROTOCOL_TLSv1


class SSLAdapter(HTTPAdapter):
    '''An HTTPS Transport Adapter that uses an arbitrary SSL version.'''
    __attrs__ = ['max_retries', 'config', '_pool_connections', '_pool_maxsize',
                 '_pool_block', 'ssl_version']

    def __init__(self, ssl_version=None, **kwargs):
        self.ssl_version = ssl_version
        super(SSLAdapter, self).__init__(**kwargs)

    def init_poolmanager(self, connections, maxsize, block=False):
        self.poolmanager = PoolManager(num_pools=connections,
                                       maxsize=maxsize, block=block,
                                       ssl_version=self.ssl_version)
scrape.py
import requests
import ssladapter
from ssl import PROTOCOL_TLSv1

session = requests.Session()
session.mount('https://', ssladapter.SSLAdapter(ssl_version=PROTOCOL_TLSv1))
request = session.get("https://www.delta.com")
The SSLError is raised on the last line.
This error is not a problem with the Requests library itself, which has been rigorously tested.
It is an indication of a 'man-in-the-middle' situation.
You may have a network sniffing or interception tool such as Fiddler or Wireshark running.
There is more detailed info in a related question, where it is suggested that this is how SSL should work.
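If you want to check whether something is intercepting the connection, one quick diagnostic (not a fix) is to look at the certificate actually presented to your machine; an interception tool will show its own issuer instead of the site's real CA:
import ssl

# Fetch the certificate the server (or whatever sits in between) presents.
# get_server_certificate does not verify, so it works even when requests fails.
pem = ssl.get_server_certificate(("www.delta.com", 443))
print(pem)  # inspect the issuer, e.g. with "openssl x509 -noout -issuer"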
