Check if X509Certificate2 will work with HTTP/2 in Google Chrome and Mozilla Firefox - encryption

We are running a .NET Core 3.0 application that provides a web API via Kestrel. By default, Kestrel sets the available protocols to Protocols.Http1AndHttp2.
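A minimal sketch of that kind of setup (the certificate path, password, and port below are placeholders, not our real values):

// Sketch: Kestrel terminating TLS itself with HTTP/1.1 and HTTP/2 enabled
// (Http1AndHttp2 is also the default in ASP.NET Core 3.0).
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Server.Kestrel.Core;
using Microsoft.Extensions.Hosting;
using System.Security.Cryptography.X509Certificates;

public class Program
{
    public static void Main(string[] args) =>
        Host.CreateDefaultBuilder(args)
            .ConfigureWebHostDefaults(web => web
                .ConfigureKestrel(kestrel =>
                {
                    kestrel.ListenAnyIP(5001, listen =>
                    {
                        listen.Protocols = HttpProtocols.Http1AndHttp2;
                        // Placeholder certificate file and password.
                        listen.UseHttps(new X509Certificate2("api.pfx", "password"));
                    });
                })
                .Configure(app => app.Run(ctx => ctx.Response.WriteAsync("ok"))))
            .Build()
            .Run();
}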
On one of our test machines it turned out that HTTP/2 does not work in combination with Google Chrome (77.0.3865.90). The browser shows an error with the message ERR_HTTP2_INADEQUATE_TRANSPORT_SECURITY. No fallback to HTTP/1.1 happens in this case, because the browser does support HTTP/2 in general.
Microsoft Edge, by the way, can request the same endpoint via HTTP/2 without problems.
I tested the certificate on my development machine and there it works as expected. So the problem seems to lie with the server hosting the application rather than with the certificate itself. My local machine runs Windows 10 Pro (1903). The server runs Windows Server 2012 R2 Datacenter (6.3.9600).
The application will be hosted on our customers' servers as a self-contained package, and we have no influence over which servers the API will run on.
Instead of disabling HTTP/2 completely, we would like to check whether all browsers will accept the provided certificate in combination with HTTP/2. I'm not sure how to find this out on the server side. It looks like some kind of problem with the offered cipher suites, but I have no idea how to check that compatibility on the server.
Does anyone have an idea on how to check the compatibility?

It's not the certificate; it's the cipher suites.
Run your site through this tool: https://www.ssllabs.com/ssltest/ (or, if the site is internal and not publicly reachable, download and use https://testssl.sh) and you'll see which cipher suites are configured.
HTTP/2 blacklists older cipher suites, and Chrome won't use HTTP/2 if only those are offered. The list is here: https://www.rfc-editor.org/rfc/rfc7540#appendix-A but basically you should probably be offering TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256 for Chrome.
This post tells you how to change them for IIS: https://medium.com/@rootsecdev/configuring-secure-cipher-suites-in-windows-server-2019-iis-7d1ff1ffe5ea
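If you want to check this from inside the app rather than with an external scanner, one rough option (a sketch, not part of the original answer) is to log what Kestrel actually negotiated via ITlsHandshakeFeature. On .NET Core 3.0 this feature exposes the cipher and key-exchange algorithms rather than the full suite name, but that is usually enough to spot an offending configuration. A Startup along these lines could be plugged in with UseStartup<Startup>():

// Rough sketch (ASP.NET Core 3.0): log the negotiated TLS parameters for each
// request so a server that only offers blacklisted suites is visible in the logs.
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Connections.Features;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;

public class Startup
{
    // ASP.NET Core injects registered services such as ILogger<T> into Configure.
    public void Configure(IApplicationBuilder app, ILogger<Startup> logger)
    {
        app.Use(async (context, next) =>
        {
            // Null when the request did not arrive over TLS.
            var tls = context.Features.Get<ITlsHandshakeFeature>();
            if (tls != null)
            {
                logger.LogInformation(
                    "{HttpProtocol} over {TlsProtocol}, cipher {Cipher} ({Bits} bit), key exchange {KeyExchange}",
                    context.Request.Protocol, tls.Protocol, tls.CipherAlgorithm,
                    tls.CipherStrength, tls.KeyExchangeAlgorithm);
            }
            await next();
        });

        app.Run(context => context.Response.WriteAsync("ok"));
    }
}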

Related

How to control TLS version when WebAPI service accesses IdentityServer configuration on startup

I have an Authentication service built on Identity Server 3 and a set of WebAPI services using BearerTokenAuthentication. On startup, each of the WebAPI services makes a call to the Authentication service's .well-known/openid-configuration endpoint. That had been working fine until we recently configured the firewall between the WebAPI services and the Authentication service to allow only TLS 1.2 traffic. Now the WebAPI services all fail to start and report that they cannot access the Authentication service's .well-known/openid-configuration because they "Could not create SSL/TLS secure channel".
Update: The problem described is occurring in my Test environment and is eliminated when we modify the firewall to allow TLS 1.0. In my Dev and Staging environments, the firewall is configured to require TLS 1.2 and that configuration is not producing the above symptoms. Something else is different. I just haven't figured out yet what it is.
Based on this latest update, I have a different question. Any suggestions on what to look for in the server/environment that would cause the TLS 1.2 requirement not to work in one environment while working in the other two? I've started to look at the registry settings that control the server's use of SSL and TLS protocols, but haven't found a difference there when comparing a working server to the non-working one.
Final Update: I never did figure out what was different. Someone from the systems group rebuilt the Test server, which was exhibiting the problem, by cloning the Staging server, which was not exhibiting the problem. The rebuilt server handles TLS 1.2 just fine. So it was clearly something in the server, not the code. But that's about all we know.
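For completeness, the usual client-side knob for the problem named in the title (not the fix in this case, as the final update shows, and the helper name here is purely illustrative) is to opt the process into TLS 1.2 before the startup code fetches the discovery document:

// Sketch only: allow outgoing calls from the WebAPI process (including the
// fetch of .well-known/openid-configuration at startup) to use TLS 1.2.
// Call this before the bearer-token middleware is registered.
using System.Net;

public static class TlsConfig
{
    public static void RequireTls12()
    {
        // Add TLS 1.2 to the enabled protocols instead of replacing them outright.
        ServicePointManager.SecurityProtocol |= SecurityProtocolType.Tls12;
    }
}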

Microsoft EDGE - Security certificate required to access this resource is invalid

We are getting the following error in Microsoft Edge in our Dev environment when we run our ASP.NET application hosted in IIS 8 on Windows Server 2012 R2.
Error:
XMLHttpRequest: Network Error 0x800c0019, Security certificate required to access this resource is invalid.
Following are more details about implementations and environments.
Our application runs on two different secured (HTTPS) ports. In IIS, both apps are hosted as separate web applications using the same certificate. The certificate was generated with OpenSSL using a SHA-2 signature and has been added to the certificate store.
When we first load our application in Microsoft Edge, it shows a certificate warning and we choose to proceed. Once the page is loaded, a button click calls an API via AJAX, and that API is hosted on a different port.
Edge does not allow that API call to proceed and gives the above-mentioned error.
In Chrome and IE 11 we get the same warning message, but after proceeding they do allow the next API call to execute.
Any help to fix this issue would be appreciated.
If you know your certificate is valid, a possible reason this might happen is a tool running in the background that intercepts SSL connections through a proxy, such as Fiddler.
Since such a tool effectively performs a man-in-the-middle attack to report the requests, the warnings are "normal". It's also pretty easy to forget that such a tool is running.

XSockets.net on Azure Websites

I'm running my ASP.NET Web API app locally with XSockets without any problems. When I publish the project to Azure, it won't connect. I enabled WebSockets for the Azure site and adjusted the XSockets URL on the client side
from:
conn = new XSockets.WebSocket("ws://localhost:50838/api/Chat");
to:
conn = new XSockets.WebSocket("ws://.azurewebsites.net/api/Chat");
Any suggestions?
Arnoud
XSockets passes the subprotocol 'XSocketsNET' by default, and for some reason Azure filters it out of the response. It works fine in the emulator, but on Azure the subprotocol is removed. This causes errors in Chrome, since Chrome checks the subprotocol and gives the error below:
WebSocket connection to 'ws://xmvc.azurewebsites.net/Home' failed: Error during WebSocket handshake: Sent non-empty 'Sec-WebSocket-Protocol' header but no response was received
It works fine in other browsers that do not perform this check; I've tried IE 10 and Firefox on this sample site running XSockets on an Azure Web Site.
In my opinion Chrome is doing the correct thing and Azure has a bug.
EDIT:
Be aware of the fact that Azure Web Sites limits WebSocket connections:
Free site: 5 connections
Shared site: 35 connections
Standard site: 350 connections
According to Microsoft, using WebSockets over wss/https should do the trick, but I just ran a test and the result is still the same.
Most likely the "unnecessary" subprotocol header is removed from the response even if you pass it from the server during the handshake.
So it seems you cannot rely on the subprotocol when using Windows Azure.
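One way to see what the Azure front end actually returns, without involving a browser at all, is a plain .NET client that requests the subprotocol and reports what was negotiated; a sketch, with the host name a placeholder:

// Sketch: connect with the XSocketsNET subprotocol requested and report what
// the handshake actually negotiated.
using System;
using System.Net.WebSockets;
using System.Threading;
using System.Threading.Tasks;

public static class SubProtocolProbe
{
    public static async Task Main()
    {
        using (var socket = new ClientWebSocket())
        {
            socket.Options.AddSubProtocol("XSocketsNET");

            // Placeholder host; substitute the real *.azurewebsites.net address.
            await socket.ConnectAsync(
                new Uri("ws://example.azurewebsites.net/api/Chat"),
                CancellationToken.None);

            // A null SubProtocol here (or, depending on the client, a handshake
            // failure above) means the Sec-WebSocket-Protocol header was stripped.
            Console.WriteLine("Negotiated subprotocol: " + (socket.SubProtocol ?? "(none)"));
        }
    }
}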

Fiddler shows DNS Lookup failure

On my desktop I run the Visual Studio web server (Cassini) and Fiddler as a proxy, then I connect to a web site running in Cassini using an iPad connected via wireless.
This always worked, until I installed and then uninstalled the MVC pack for Visual Studio.
Now I get this message from Fiddler (it is sent to the iPad):
[Fiddler] DNS Lookup for "http://175.33.22.116" failed. The requested name is valid, but no data of the requested type was found
There are similar posts with this message, but none of them with my setup.
Can you suggest what to check?
Thank you
This indicates that the traffic from your iPad client is malformed. Are you using anything in Fiddler (e.g. Tools > HOSTS) to change the traffic? If not, the bug is on the client; e.g. something is trying to connect to http://http://175.33.22.116 which isn't legal (due to the double http:// within the string).

Connect to self-signed HTTPS web services from Flex

In my project I need to connect to an intranet web service, and we need an SSL connection between the two machines.
Because this is an intranet site, the web service's certificate might be self-signed.
The web service and the web page that loads my Flex application reside on the same web server (Tomcat), so when I load the application's web page over HTTPS I am asked to confirm the certificate. I confirm it, but this confirmation does not carry over to the Flex application (in Internet Explorer and Firefox).
I tried Google Chrome, and it actually asked me for confirmation twice, once for the web page and once for the connection to the web service, so there it worked great.
Is there a way to tell Flash Player to also accept self-signed certificates, or is this entirely the browser's concern and something the Adobe side can do nothing about?
In other words, is there a way to connect to HTTPS web services with self-signed certificates from Adobe Flex?
Thanks.
When I put on my security hat, the answer is: I hope not. I don't want browser plugins bypassing my browser's security settings; I'd consider that a vulnerability.
There might be a way for standalone AIR applications, but in the browser, Flash should honor the browser's settings.
In your company, you could create your own root CA, add its certificate to all machines that will access the intranet, and then have the CA issue your web service a certificate. The certificate will no longer be self-signed. The two main issues are: (1) managing the private key of your CA, (2) distributing the CA's root certificate to client PCs.
You need to download the cert and install it on both IE and Firefox.
To install a cert on IE:
http://www.markwilson.co.uk/blog/2008/11/trusting-a-self-signed-certificate-in-windows.htm
To install a cert on FF:
Go to Tools->Options and click on the Encryption tab. Click "View Certificates", then "Import".
In my experience, the cert has to be either verified or installed in the browser in order to get Flash to work properly. The cert also needs to have a valid hostname, but you can just edit your client's hosts file if you need to do this for testing.
Well, Flash Player should just use the browser to make the connection and be done with it. We have a similar setup here: we use self-made certs and communicate over an AMF channel over HTTPS. My guess, though, is that our setup differs in that we load the Flash application itself over HTTPS, so it's talking from HTTPS to the same HTTPS server. Maybe you could try that? This is the setup for our Tomcat server:
Tomcat Server/Client Self-Signed SSL Certificate
