Firebase Cloud Functions for devices with old SSL/TLS

I have an IoT device that makes HTTP requests to a Cloud Function. This works fine until I set up a custom domain in Firebase. From the error I get, it looks like the device only supports SSLv2, SSLv3, and TLS 1.0. Now I'm trying to figure out a solution.
What are the possible solutions for this? Can I "enforce" an SSL/TLS version in a Cloud Function? Or do I have to put a load balancer that supports these old cryptographic protocols in front of the Cloud Function?
Thanks for your help.

It's not possible to enforce the SSL/TLS version directly from Cloud Functions.
Your best bet is to use GCP Cloud Load Balancing, which lets you allow older protocol versions and ciphers.
This is done by creating a new SSL policy and setting the minimum TLS version to TLS 1.0.
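For reference, a minimal sketch of that setup with the gcloud CLI; the policy and proxy names are placeholders, and it assumes an HTTPS load balancer (e.g. with a serverless NEG backend) already sits in front of the function:

    # Create an SSL policy that still accepts TLS 1.0 clients.
    gcloud compute ssl-policies create legacy-tls-policy \
        --profile COMPATIBLE \
        --min-tls-version 1.0

    # Attach the policy to the load balancer's target HTTPS proxy.
    gcloud compute target-https-proxies update my-https-proxy \
        --ssl-policy legacy-tls-policy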

Related

Use Ruby gRPC client with self-signed certificate

I'm trying to use the Ruby gRPC client to connect to a Go gRPC server. The server uses TLS credentials with self-signed certificates. I have trusted the certificate on my system (Ubuntu 20.04), but I still get Handshake failed with fatal error SSL_ERROR_SSL: error:1000007d:SSL routines:OPENSSL_internal:CERTIFICATE_VERIFY_FAILED.
The only way I've got this working is by manually passing GRPC::Core::ChannelCredentials.new(File.read(cert_path)) when initializing the client. Another workaround is passing :this_channel_is_insecure, but that only works if I remove the TLS credentials from the server altogether (which I do not want to do).
Is there any way to get the GRPC client to work with the system certs?
I assume the gem is using roots.pem; trying to override that with GRPC::Core::ChannelCredentials.set_default_roots_pem results in Could not load any root certificate.
Also, I have not found any parameter that would let me skip certificate verification.
The default roots location can be overridden with the GRPC_DEFAULT_SSL_ROOTS_FILE_PATH environment variable, pointing it to a file on the file system that contains the roots. Passing GRPC::Core::ChannelCredentials.new(File.read(cert_path)) also seems fine to me.
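A minimal Ruby sketch of both approaches; the generated stub (Greeter::Stub), host, and certificate paths are placeholders:

    require 'grpc'
    require_relative 'greeter_services_pb'  # placeholder for your generated service code

    # Approach 1: build channel credentials from the self-signed server certificate
    # (a self-signed certificate acts as its own root).
    creds = GRPC::Core::ChannelCredentials.new(File.read('/path/to/server-cert.pem'))
    stub  = Greeter::Stub.new('my-server.example.com:443', creds)

    # Approach 2: keep the default credentials and point gRPC at a custom roots file
    # before the client process starts:
    #   GRPC_DEFAULT_SSL_ROOTS_FILE_PATH=/path/to/roots.pem ruby client.rb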
Skipping certificate verification in TLS is most likely not supported in Ruby. The corresponding feature exists in the underlying core library, but it might not be plumbed through to Ruby yet (at least not that I am aware of). If you need it, feel free to open a feature request on the gRPC GitHub page.
Thank you!

Is gRPC not suited for small projects?

For the past two weeks I have been struggling to set up a simple backend that uses gRPC and communicates with a mobile client.
Reading about this technology online, it felt like the proper, easy-going solution for my needs:
Bridging client/server communication written in multiple languages Java/Kotlin/Swift/Go.
Backwards compatibility checks for the API realized with buf
Efficient communication by transferring binary data and utilising HTTP2
Support for both RPC and REST thanks to grpc-gateway
However, when I decided to go down the gRPC path, I faced a ton of issues (these are highlights of the issues, not actual questions):
How to share protobuf message definitions across clients and server?
How to manage third party protobuf message dependencies?
How to manage stub generation for projects using different build tools?
How to secure the communication using SSL certificates? Keep in mind that I am talking about mobile client <--> server communication here, not server <--> server communication.
How to buy a domain, because SSL certificates are issued against public domains in order to be trusted by certificate authorities?
How to deploy a gRPC server, since it turns out there aren't any easy-to-use PaaS offerings that support gRPC and HTTP/2? Instead you either need to configure the infrastructure yourself (load balancers and the machines hosting the server, with the appropriate certificates installed) or host everything on your own bare metal.
How to manage all of the above in a cost effective way?
This is more of a frustration question.
Am I doing something wrong and misunderstanding how to use gRPC, or is it simply too hard to set up for a small project that should run in production mode?
I feel like I wasted a ton of time without having made any progress.

How to enable HTTP2 in Cloud Foundry using nginx-buildpack?

Is it possible to enable HTTP/2 in Cloud Foundry using the NGINX buildpack (or any other)? I understand that GoRouter will not support HTTP/2, but I'm not sure if there is any workaround for this.
My original requirement is to serve a large JS file from Cloud Foundry, so I'm looking to enable HTTP/2 to improve performance.
Thanks,
Not exactly the same question, but the solution here applies: https://stackoverflow.com/a/55552398/1585136.
If you need public clients (i.e. clients outside CF) to connect to your app, you need to use TCP routing. If your provider doesn't enable this by default, find another provider (see this list of public providers; hint: Pivotal Web Services will provide TCP routes upon request) or self-host.
If you only need to use HTTP/2 and/or gRPC between apps running on CF, you can use the container-to-container network. When you talk app to app, there are no restrictions (so long as you properly open the required ports). You can use TCP, UDP, and any protocol built on top of those. There are some details about how this works here.
You'll also need the Nginx http_v2_module. This is a very recent addition and isn't yet in a build of the Nginx or Staticfile buildpack as I write this. If everything goes right, it should be in the next release, though; that would be Nginx buildpack 1.1.10+ and Staticfile buildpack 1.5.8+.
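A minimal sketch of an nginx.conf for the Nginx buildpack with HTTP/2 enabled; {{port}} is the buildpack's templated variable, and the rest of the config is an assumption about a simple static app:

    worker_processes 1;
    daemon off;

    events { worker_connections 1024; }

    http {
      server {
        # http2 on the listen directive is what requires the http_v2_module.
        listen {{port}} http2;
        root public;
        index index.html;
      }
    }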
My original requirement is to serve a large JS file from Cloud Foundry, so I'm looking to enable HTTP/2 to improve performance.
It might help, it might not. Your mileage may vary. HTTP/2 isn't a silver bullet. This article explains it well:
https://www.nginx.com/blog/http2-module-nginx/

Firebase and MBED TLSV1.2

I am having an issue where I can connect to any HTTPS server other than Firebase Cloud Functions.
I found that the error occurs during the handshake between the client (me) and the server (Firebase), but I have never experienced this when connecting to any other server.
I am using an STM32F105 with a Wiznet W5500, using the mbed TLS 1.2 encryption libraries. Has anyone had experience with a similar situation? Are there any restrictions when connecting to the Firebase servers?
Do you have the right root CA added? I don't have any Firebase Cloud Functions, but I could successfully connect and download files from https://firebase.google.com using the GeoTrustGlobalCA and mbed-http.
Example code here: https://gist.github.com/janjongboom/3db8a6dd00ee23b7ab73f3c0435de853

Google Cloud Endpoints: Websockets and JWT

I've been developing a mobile app (iOS) with gRPC via Firebase auth(z). My server runs on GKE with the NGINX proxy, and now I'm developing the web UI for the deeper configuration of a user account. I prefer not to fall back to REST APIs, so I was wondering whether Google Cloud Endpoints supports WebSockets, and whether it would also prevent non-authorised app users from making requests. With WebSockets I know it's possible, but as I'm tied to gRPC with Cloud Endpoints, I'm just checking before I fall back to REST API calls (which I'd prefer not to do!).
Summary: Does Google Cloud Endpoints support Websockets with JWT auth tokens from Firebase?
Thanks
It looks like ESP supports WebSockets now, using the --enable_websocket flag in the ESP config.
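A hedged sketch of how that flag might be passed to an ESP sidecar on GKE; only --enable_websocket comes from the answer above, and the image tag, ports, and service name are placeholder assumptions:

    # Fragment of a Kubernetes Deployment spec with ESP running as a sidecar.
    containers:
    - name: esp
      image: gcr.io/endpoints-release/endpoints-runtime:1
      args:
      - --http_port=8081
      - --backend=127.0.0.1:8080      # your application container
      - --service=SERVICE_NAME        # your Endpoints service name
      - --rollout_strategy=managed
      - --enable_websocket            # the flag mentioned above
      ports:
      - containerPort: 8081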
Currently, Cloud Endpoints doesn't support WebSockets at all.
By the way, what is your use case for WebSockets? WebSocket won't work with gRPC either. If you just want to talk to your gRPC service from the web UI, transcoding should work. It works with JWT from Firebase auth.
The Google Cloud Endpoints ESP doesn't support WebSockets.
However, Google Cloud Endpoints has open-sourced its Extensible Service Proxy implementation. Internally it's implemented as a custom nginx module. Since nginx supports WebSockets, it should be feasible to add support to their nginx:esp module.
But it's definitely out of scope for me. :-)
