How to pass a HOST header to AWS ALB with gRPC - grpc

I successfully managed to create an ALB that supports gRPC with a private self-signed certificate; however, I was planning on using this ALB to do content-based routing of gRPC requests using the Host header, which ALB supports. So far I have tried to no avail. Has anyone had success with this? Here's my code:
import json
import grpc
import base64
from lib import unified_pb2
from lib import unified_pb2_grpc

# Setup
uri = 'internal-test-lb-micky-18886655.us-east-2.elb.amazonaws.com'
with open('server.crt', 'rb') as f:
    trusted_certs = f.read()
credentials = grpc.ssl_channel_credentials(root_certificates=trusted_certs)
channel = grpc.secure_channel(f'{uri}:443', credentials)
metadata = (('host', 'qa.mydomain.com'),)
cf_greeters_stub = unified_pb2_grpc.GreetersStub(channel)
cf_request = unified_pb2.Request()
cf_request.cf_greeters_request.greeter_text = "hello"
data = cf_greeters_stub.GetResponse(cf_request, metadata=metadata)
print(data)
My ALB has a rule that inspects the Host header and should route to the matching target group, but ALB is not doing that; the request falls through to the default rule. The rule I created returns 200 when there's a match, with a JSON dictionary in the body.
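One thing worth checking, since gRPC runs over HTTP/2: ALB's host-based rules are evaluated against the :authority pseudo-header, which custom call metadata typically cannot override. Below is a minimal sketch of forcing the authority at channel creation instead; grpc.default_authority and grpc.ssl_target_name_override are standard gRPC channel arguments, but whether ALB then matches the rule is an assumption to verify.

import grpc

# Sketch: set the HTTP/2 :authority pseudo-header, which is what ALB
# evaluates for host-header rules on gRPC traffic. The override name
# must also be acceptable to TLS verification against server.crt,
# hence the ssl_target_name_override.
# uri and credentials are as defined in the snippet above.
options = [
    ('grpc.default_authority', 'qa.mydomain.com'),
    ('grpc.ssl_target_name_override', 'qa.mydomain.com'),
]
channel = grpc.secure_channel(f'{uri}:443', credentials, options=options)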

Related

GRPC call for a service which is inside a subdirectory? (Android grpc client)

This question is similar to the one below, but my issue is with the Android gRPC client:
How can I make a GRPC call for a service which is inside a subdirectory? (in .Net Framework)
I am getting a 404 error while accessing the gRPC streaming API:
UNIMPLEMENTED: HTTP status code 404
invalid content-type: text/html
headers: Metadata(:status=404,content-length=1245,content-type=text/html,server=Microsoft-IIS/10.0,request-id=5154500d-fb58-7903-65d6-3d3711129101,strict-transport-security=max-age=31536000; includeSubDomains; preload,alt-svc=h3=":443",h3-29=":443",x-preferredroutingkeydiagnostics=1,x-calculatedfetarget=PS2PR02CU003.internal.outlook.com,x-backendhttpstatus=404,x-calculatedbetarget=PUZP153MB0788.APCP153.PROD.OUTLOOK.COM,x-backendhttpstatus=404,x-rum-validated=1,x-proxy-routingcorrectness=1,x-proxy-backendserverstatus=404,x-feproxyinfo=MA0PR01CA0051.INDPRD01.PROD.OUTLOOK.COM,x-feefzinfo=MAA,ms-cv=DVBUUVj7A3ll1j03ERKRAQ.1.1,x-feserver=PS2PR02CA0054,x-firsthopcafeefz=MAA,x-powered-by=ASP.NET,x-feserver=MA0PR01CA0051,date=Tue, 11 Oct 2022 06:24:18 GMT)
The issue is that the /subdirectory_path is getting ignored by the service in the final outgoing call.
Here's the code I am using to create the gRPC channel in Android (gives 404):
val uri = Uri.parse("https://examplegrpcserver.com/subdirectory_path")
private val channel = let {
    val builder = ManagedChannelBuilder.forTarget(uri.host + uri.path)
    if (uri.scheme == "https") {
        builder.useTransportSecurity()
    } else {
        builder.usePlaintext()
    }
    builder.executor(Dispatchers.IO.asExecutor()).build()
}
The URI is correct, since it works with the web client.
For the web client the channel is defined like this (working):
var handler = new SubdirectoryHandler(httpHandler, "/subdirectory_path");
var userToken = "<token string>";
var grpcWebHandler = new GrpcWebHandler(handler);
using var channel = GrpcChannel.ForAddress("https://examplegrpcserver.com", new GrpcChannelOptions
{
    HttpHandler = grpcWebHandler,
    Credentials = ChannelCredentials.Create(new SslCredentials(), CallCredentials.FromInterceptor((context, metadata) =>
    {
        metadata.Add("Authorization", $"Bearer {userToken}");
        return Task.CompletedTask;
    }))
});
I tried to inject the subdirectory_path into the URI for my Android client but could not find an appropriate API; grpc-kotlin doesn't expose the underlying HTTP client used by the channel.
Could someone please help me with this issue: how can I specify the subdirectory_path (before the service and method name)?
The path for an RPC is fixed by the .proto definition. Adding prefixes to the path is unsupported.
The URI passed to forTarget() points to the resource containing the addresses to connect to. So the fully-qualified form is normally of the form dns:///example.com. If you specified a host in the URI like dns://1.1.1.1/example.com, then that would mean "look up example.com at the DNS server 1.1.1.1." But there's no place to put a path prefix in the target string, as that path would only be used for address lookup, not actual RPCs.
If the web client supports path prefixes, that is a feature specific to it. It would also be using a tweaked grpc protocol that requires translation to normal backends.
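For illustration only, the same target-string rules apply across gRPC implementations; a minimal Python sketch (the host is a placeholder):

import grpc

# 'dns:///example.com:443' means: resolve example.com with the default DNS
# resolver. The scheme and any path here affect only address lookup; the
# RPC path on the wire is always /package.Service/Method, with no prefix.
channel = grpc.secure_channel('dns:///example.com:443',
                              grpc.ssl_channel_credentials())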

Handling X-request-id request header from Here Maps Routing Javascript API

Migrating HERE Maps Routing API from v7 to v8. I need to set X-Request-Id in the request header and read the same from the response.
Old implementation in v7: sending "requestId" as a query param and reading the value from the "response.metaInfo.requestId" node in the response:
router.calculateRoute(
    {
        ...
        "requestId": "Route_1"
    },
    CallBackFn,
    function (error) {
        alert(error.message);
    }
);
Trying to set it as below in v8:
_platform = new H.service.Platform({
    apikey: _appkey,
});
router = _platform.getRoutingService(null, 8);
router.i = {"X-Request-ID": "Route_1"}
With the code above I am able to set the request header, but I am not able to read it from the response header in the callback function.
Routing v7: requestId
Routing v8: use the X-Request-ID header
X-Request-ID
You can tag your requests with a request identifier using the non-standard HTTP header X-Request-Id. The service will forward this value to the response, be it a success or a failure. If no value is provided, a UUID will be generated on the incoming request and this value will be attached to the response as the X-Request-Id. While you can use any string as the request identifier, we recommend using a UUID to uniquely identify your requests or group of requests.
Migration guide: https://developer.here.com/documentation/routing-api/migration_guide/index.html
v8 guide: https://developer.here.com/documentation/routing-api/dev_guide/topics/trace-request.html
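At the REST level the round trip looks like the sketch below, assuming the v8 routes endpoint and Python's requests library; the API key and coordinates are placeholders.

import requests

resp = requests.get(
    'https://router.hereapi.com/v8/routes',
    params={
        'transportMode': 'car',
        'origin': '52.5308,13.3847',      # placeholder coordinates
        'destination': '52.5323,13.3789',
        'return': 'summary',
        'apiKey': '<YOUR_API_KEY>',
    },
    headers={'X-Request-Id': 'Route_1'},  # tag the request
)
# The service forwards the identifier to the response, success or failure
print(resp.headers.get('X-Request-Id'))   # 'Route_1'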

Track python simpleHttp server logging information in azure application insights application map

We have different microservices (function apps, VM servers, etc.) logging to Application Insights. A simple Python HTTP server is hosted on a Linux VM; I want this server to receive a traceparent HTTP header (W3C tracing) and log the information to Application Insights. This Python server should appear as a separate node in the Application Map.
I am able to extract the span context from the traceparent HTTP header and use it to log the information, but I am not able to view it as a separate node in the Application Map.
There are middlewares for Flask and Django that trace requests, but there is no ready-made solution for Python's simple HTTP server.
The goal is to have this Python server on the VM represented as a separate node in the Application Map.
Attaching my Python script for reference (this code was written using the code from the Flask middleware):
import six
import logging
import sys
from opencensus.ext.azure.log_exporter import AzureLogHandler
from google.rpc import code_pb2
from opencensus.ext.azure.trace_exporter import AzureExporter
from opencensus.common import configuration
from opencensus.trace import (
    attributes_helper,
    execution_context,
    print_exporter,
    samplers,
)
from opencensus.trace import span as span_module
from opencensus.trace import stack_trace, status
from opencensus.trace import tracer as tracer_module
from opencensus.trace import utils
from opencensus.trace.propagation import trace_context_http_header_format
from opencensus.trace import config_integration

HTTP_HOST = attributes_helper.COMMON_ATTRIBUTES['HTTP_HOST']
HTTP_METHOD = attributes_helper.COMMON_ATTRIBUTES['HTTP_METHOD']
HTTP_PATH = attributes_helper.COMMON_ATTRIBUTES['HTTP_PATH']
HTTP_ROUTE = attributes_helper.COMMON_ATTRIBUTES['HTTP_ROUTE']
HTTP_URL = attributes_helper.COMMON_ATTRIBUTES['HTTP_URL']
HTTP_STATUS_CODE = attributes_helper.COMMON_ATTRIBUTES['HTTP_STATUS_CODE']
EXCLUDELIST_PATHS = 'EXCLUDELIST_PATHS'
EXCLUDELIST_HOSTNAMES = 'EXCLUDELIST_HOSTNAMES'

config_integration.trace_integrations(['logging'])

trace_parent_header = "00-4bf92f3577b34da6a3ce929d0e0e4736-00f067aa0ba902b7-01"
APP_INSIGHTS_KEY = "KEY HERE"

logging.basicConfig(
    format='%(asctime)s traceId=%(traceId)s spanId=%(spanId)s %(message)s')
log = logging.getLogger(__name__)

def callback_function(envelope):
    envelope.tags['ai.cloud.role'] = 'Pixm Agent'

handler = AzureLogHandler(
    connection_string=f'InstrumentationKey={APP_INSIGHTS_KEY}')
handler.setFormatter(logging.Formatter('%(traceId)s %(spanId)s %(message)s'))
handler.add_telemetry_processor(callback_function)
log.addHandler(handler)

propagator = trace_context_http_header_format.TraceContextPropagator()
sampler = samplers.ProbabilitySampler(rate=1.0)
exporter = AzureExporter(
    connection_string=f'InstrumentationKey={APP_INSIGHTS_KEY}')
exporter.add_telemetry_processor(callback_function)

try:
    span_context = propagator.from_headers(
        {"traceparent": trace_parent_header})
    log.info("he...")
    tracer = tracer_module.Tracer(
        span_context=span_context,
        sampler=sampler,
        exporter=exporter,
        propagator=propagator)
    span = tracer.start_span()
    span.span_kind = span_module.SpanKind.SERVER
    # Set the span name as the name of the current module name
    span.name = '[{}]{}'.format('get', 'testurl')
    tracer.add_attribute_to_current_span(HTTP_HOST, 'testurlhost')
    tracer.add_attribute_to_current_span(HTTP_METHOD, 'get')
    tracer.add_attribute_to_current_span(HTTP_PATH, 'testurlpath')
    tracer.add_attribute_to_current_span(HTTP_URL, str('testurl'))
    # execution_context.set_opencensus_attr(
    #     'excludelist_hostnames',
    #     self.excludelist_hostnames
    # )
    with tracer.span(name="main-ashish"):
        for i in range(0, 10):
            log.warning("identity logs..." + str(i))
except Exception:  # pragma: NO COVER
    log.error('Failed to trace request', exc_info=True)
The Application Map finds components by following HTTP dependency calls made between servers with the Application Insights SDK installed.
OpenCensus Python telemetry processors
You can modify cloud_RoleName by changing the ai.cloud.role attribute in the tags field.
def callback_function(envelope):
    envelope.tags['ai.cloud.role'] = 'new_role_name'

# AzureLogHandler
handler.add_telemetry_processor(callback_function)
# AzureExporter
exporter.add_telemetry_processor(callback_function)
Correlation headers using W3C TraceContext to log the information to Application Insights
Application Insights is transitioning to W3C Trace-Context, which defines:
traceparent: Carries the globally unique operation ID and unique identifier of the call.
tracestate: Carries system-specific tracing context.
The latest version of the Application Insights SDK supports the Trace-Context protocol.
The correlation HTTP protocol, also called Request-Id, is being deprecated. This protocol defines two headers:
Request-Id: Carries the globally unique ID of the call.
Correlation-Context: Carries the name-value pairs collection of the distributed trace properties.
import logging
from opencensus.trace import config_integration
from opencensus.trace.samplers import AlwaysOnSampler
from opencensus.trace.tracer import Tracer

config_integration.trace_integrations(['logging'])
logging.basicConfig(format='%(asctime)s traceId=%(traceId)s spanId=%(spanId)s %(message)s')
tracer = Tracer(sampler=AlwaysOnSampler())

logger = logging.getLogger(__name__)
logger.warning('Before the span')
with tracer.span(name='hello'):
    logger.warning('In the span')
logger.warning('After the span')
You can refer to Application Map: Triage Distributed Applications, Telemetry correlation in Application Insights, and Track incoming requests with OpenCensus Python.
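Putting those pieces together for a plain HTTP server (no Flask/Django), a minimal sketch; the connection string, role name, and port are placeholders, and error handling is omitted:

import logging
from http.server import BaseHTTPRequestHandler, HTTPServer

from opencensus.ext.azure.trace_exporter import AzureExporter
from opencensus.trace import samplers, span as span_module
from opencensus.trace import tracer as tracer_module
from opencensus.trace.propagation import trace_context_http_header_format

def set_role(envelope):
    # cloud_RoleName controls the node name shown in the Application Map
    envelope.tags['ai.cloud.role'] = 'simple-http-server'

exporter = AzureExporter(connection_string='InstrumentationKey=<KEY>')
exporter.add_telemetry_processor(set_role)
propagator = trace_context_http_header_format.TraceContextPropagator()

class TracedHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Continue the caller's trace from the incoming traceparent header
        span_context = propagator.from_headers(
            {'traceparent': self.headers.get('traceparent', '')})
        tracer = tracer_module.Tracer(
            span_context=span_context,
            sampler=samplers.AlwaysOnSampler(),
            exporter=exporter,
            propagator=propagator)
        with tracer.span(name='GET ' + self.path) as span:
            span.span_kind = span_module.SpanKind.SERVER
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b'ok')

HTTPServer(('0.0.0.0', 8080), TracedHandler).serve_forever()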

FastAPI: CORS Middleware not working with GET method

I am trying to use CORS with the FastAPI framework, but it does not work with the GET method.
Here's the code I'm working on:
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

app.add_middleware(
    CORSMiddleware,
    allow_origins=['*'],
    allow_methods=["*"],
    allow_headers=["*"],
)

@app.get("/test1")
async def test1():
    return {"message": "Hello World"}
I had the same issue, and the solution is to not use add_middleware but to do the following:
First import from Starlette:
from starlette.middleware import Middleware
from starlette.middleware.cors import CORSMiddleware
Create the middleware:
middleware = [
    Middleware(
        CORSMiddleware,
        allow_origins=['*'],
        allow_credentials=True,
        allow_methods=['*'],
        allow_headers=['*']
    )
]
and then:
app = FastAPI(middleware=middleware)
This should work.
Thanks @Sam_Ste, I had the same problem! I set my imports back to FastAPI and it still works; I think they are just proxies for the Starlette modules (IMHO). The method is the vital thing, not using add_middleware.
from fastapi.middleware import Middleware
from fastapi.middleware.cors import CORSMiddleware
For me, none of the above-mentioned ideas worked. I had to create a custom middleware like this:
import os
from fastapi import Request, Response

@app.middleware("http")
async def cors_handler(request: Request, call_next):
    response: Response = await call_next(request)
    response.headers['Access-Control-Allow-Credentials'] = 'true'
    response.headers['Access-Control-Allow-Origin'] = os.environ.get('allowedOrigins')
    response.headers['Access-Control-Allow-Methods'] = '*'
    response.headers['Access-Control-Allow-Headers'] = '*'
    return response
When testing, make sure you add an Origin header to your request; otherwise CORSMiddleware will not send back the CORS headers.
It may not be clear at first, but it is written here in the documentation:
Simple requests
Any request with an Origin header. In this case the middleware will
pass the request through as normal, but will include appropriate CORS
headers on the response.
So any request without an Origin header will be ignored by CORSMiddleware, and no CORS headers will be added.
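A quick way to check, assuming the app from the question is running locally on port 8000 (the Origin value is arbitrary):

import requests

# Without an Origin header, CORSMiddleware stays out of the way entirely;
# with one, the response should carry the CORS headers.
resp = requests.get('http://localhost:8000/test1',
                    headers={'Origin': 'http://example.com'})
print(resp.headers.get('access-control-allow-origin'))  # expect '*'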

Pass the url into the parse method in scrapy that was consumed from RabbitMQ

I am using Scrapy to consume messages (URLs) from RabbitMQ, but when I use yield to call the parse method with my URL as a parameter, the program never enters the callback method. Below is the code of my spider:
# -*- coding: utf-8 -*-
import scrapy
import pika
from scrapy import cmdline
import json

class MydeletespiderSpider(scrapy.Spider):
    name = 'Mydeletespider'
    allowed_domains = []
    start_urls = []

    def callback(self, ch, method, properties, body):
        print(" [x] Received %r" % body)
        body = json.loads(body)
        url = body.get('url')
        yield scrapy.Request(url=url, callback=self.parse)

    def start_requests(self):
        cre = pika.PlainCredentials('test', 'test')
        connection = pika.BlockingConnection(
            pika.ConnectionParameters(host='10.0.12.103', port=5672,
                                      credentials=cre, socket_timeout=60))
        channel = connection.channel()
        channel.basic_consume(self.callback,
                              queue='Deletespider_Batch_Test',
                              no_ack=True)
        print(' [*] Waiting for messages. To exit press CTRL+C')
        channel.start_consuming()

    def parse(self, response):
        print(response.url)

cmdline.execute('scrapy crawl Mydeletespider'.split())
My goal is to pass the URL's response to the parse method.
To consume URLs from RabbitMQ, you can take a look at the scrapy-rabbitmq package:
Scrapy-rabbitmq is a tool that lets you feed and queue URLs from RabbitMQ via Scrapy spiders, using the Scrapy framework.
To enable it, set these values in your settings.py:
# Enables scheduling storing requests queue in rabbitmq.
SCHEDULER = "scrapy_rabbitmq.scheduler.Scheduler"
# Don't cleanup rabbitmq queues, allows to pause/resume crawls.
SCHEDULER_PERSIST = True
# Schedule requests using a priority queue. (default)
SCHEDULER_QUEUE_CLASS = 'scrapy_rabbitmq.queue.SpiderQueue'
# RabbitMQ Queue to use to store requests
RABBITMQ_QUEUE_NAME = 'scrapy_queue'
# Provide host and port to RabbitMQ daemon
RABBITMQ_CONNECTION_PARAMETERS = {'host': 'localhost', 'port': 6666}
# Bonus:
# Store scraped item in rabbitmq for post-processing.
# ITEM_PIPELINES = {
# 'scrapy_rabbitmq.pipelines.RabbitMQPipeline': 1
# }
And in your spider:
from scrapy import Spider
from scrapy_rabbitmq.spiders import RabbitMQMixin
class RabbitSpider(RabbitMQMixin, Spider):
name = 'rabbitspider'
def parse(self, response):
# mixin will take urls from rabbit queue by itself
pass
Refer to this: http://30daydo.com/article/512
def start_requests(self) must return a generator of requests, or Scrapy won't work. In the spider above, channel.start_consuming() blocks forever, and because callback() contains a yield, calling it only creates a generator that pika never iterates, so no Request ever reaches Scrapy; see the sketch below.
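A minimal restructuring under that constraint, pulling messages synchronously with basic_get and yielding from start_requests itself (a sketch assuming pika 1.x and the connection details from the question):

import json
import pika
import scrapy

class MydeletespiderSpider(scrapy.Spider):
    name = 'Mydeletespider'

    def start_requests(self):
        # Pull messages synchronously instead of blocking in start_consuming(),
        # so this method remains a generator that Scrapy can iterate.
        cre = pika.PlainCredentials('test', 'test')
        connection = pika.BlockingConnection(
            pika.ConnectionParameters(host='10.0.12.103', port=5672,
                                      credentials=cre))
        channel = connection.channel()
        while True:
            method, properties, body = channel.basic_get(
                queue='Deletespider_Batch_Test', auto_ack=True)
            if method is None:  # queue drained
                break
            url = json.loads(body).get('url')
            if url:
                yield scrapy.Request(url=url, callback=self.parse)
        connection.close()

    def parse(self, response):
        print(response.url)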
