accessing OC configmap data in Robot test - robotframework

We have a microservice architecture, and I want to write an automated Robot Framework test for one of the microservices. The service takes its input on an HTTP endpoint but writes data to a ConfigMap. Does anyone know whether it's possible to access this ConfigMap data from a Robot test, and if so, how?
Thanks
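One possible approach (a sketch, not an official Robot Framework feature): shell out to the `oc` CLI from a small Python keyword library and parse the JSON it returns. The ConfigMap name, namespace, and file name below are placeholders, and `oc` must be on the PATH and logged in.

```python
# ConfigMapLibrary.py - minimal keyword-library sketch for reading a
# ConfigMap via the `oc` CLI. All names here are placeholders.
import json
import subprocess

def parse_configmap_json(raw: str) -> dict:
    """Return the key/value pairs under .data from `oc get configmap -o json` output."""
    return json.loads(raw).get("data", {})

def get_configmap_data(name: str, namespace: str) -> dict:
    """Fetch a ConfigMap and return its data section as a dict."""
    result = subprocess.run(
        ["oc", "get", "configmap", name, "-n", namespace, "-o", "json"],
        check=True, capture_output=True, text=True,
    )
    return parse_configmap_json(result.stdout)
```

In the suite you could then do `Library    ConfigMapLibrary.py`, followed by `${data}=    Get Configmap Data    my-ms-config    my-namespace` and an assertion such as `Should Be Equal    ${data["some-key"]}    expected-value` (again, key and values are hypothetical).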

Related

Custom routing via nginx - read from third party source

I am new to nginx, and am wondering if it can help me to solve a use-case we've encountered.
I have n nodes, which read from a Kafka topic with the same group id, which means that each node holds disjoint data, partitioned by some key.
Nginx has no way of knowing a priori which node has the data corresponding to which keys. But we can build an API, or have a Redis instance, which can tell us the node given the key.
Is there a way nginx can incorporate third party information of this kind to route requests?
I'd also welcome any answers, even if it doesn't involve nginx.
Nginx has no way of knowing a priori which node has data corresponding to which keys
Nginx doesn't need to know. You would need to do this in the Kafka Streams RPC layer with Interactive Queries. (Spring-Kafka has an InteractiveQueryService interface, by the way, that can be used from Spring Web.)
If you want to present users with a single address for the KStreams HTTP/RPC endpoints, then that would be a standard Nginx upstream definition for a reverse proxy, which would route to any of the backend servers; those servers in turn communicate among themselves to fetch the necessary key/value and return the response to the client.
I have no idea how Kafka partitions
You could look at the source code and see that it uses a murmur2 hash, which is available in Lua and can therefore be used in Nginx.
But again, this is a rabbit hole you should probably avoid.
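For reference, Kafka's default partitioner hashes the key with murmur2, masks off the sign bit, and takes the result modulo the partition count. A hand-ported Python sketch of that arithmetic (verify against your client library before relying on it):

```python
# Sketch of Kafka's default key -> partition mapping: 32-bit murmur2
# with Kafka's seed, then (hash & 0x7fffffff) % num_partitions.

def murmur2(data: bytes) -> int:
    """32-bit murmur2 with Kafka's seed (0x9747b28c)."""
    length = len(data)
    m = 0x5bd1e995
    r = 24
    h = (0x9747b28c ^ length) & 0xffffffff
    # Process the key four bytes at a time (little-endian, as in the Java client).
    for i in range(length // 4):
        k = int.from_bytes(data[i * 4:i * 4 + 4], "little")
        k = (k * m) & 0xffffffff
        k ^= k >> r
        k = (k * m) & 0xffffffff
        h = (h * m) & 0xffffffff
        h ^= k
    # Handle the trailing 1-3 bytes, if any.
    tail = length & ~3
    extra = length & 3
    if extra >= 3:
        h ^= data[tail + 2] << 16
    if extra >= 2:
        h ^= data[tail + 1] << 8
    if extra >= 1:
        h ^= data[tail]
        h = (h * m) & 0xffffffff
    # Final mixing.
    h ^= h >> 13
    h = (h * m) & 0xffffffff
    h ^= h >> 15
    return h

def partition_for(key: bytes, num_partitions: int) -> int:
    """Mask off the sign bit, as the Java client does, then take the modulus."""
    return (murmur2(key) & 0x7fffffff) % num_partitions
```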
Other option, use Kafka Connect to dump data to Redis (or whatever database you want). Then write a very similar HTTP API service, then (optionally) point Nginx at that.

Is there a way to process SignalR data using Spark Streaming?

I have a data source provided to me using SignalR, which I’ve never used.
I can’t find any documentation on how to ingest it with Spark Streaming, is there a defined process for that?
If not, are there intermediate steps I should take first? For example, process the data myself with a SignalR client and push it into Kafka with a producer, then read from that topic via Structured Streaming?
Alternatively, I could try to use Airflow, but I'm not sure about that either, since this is streaming data.
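The bridge described above can be fairly small. Assuming the SignalR client hands you messages as JSON-like dicts (the client itself is out of scope here, and the topic, broker address, and field names below are made up), the mapping step to a Kafka record might look like:

```python
import json

def signalr_to_record(message: dict) -> tuple:
    """Map one SignalR hub message (assumed to be a JSON-like dict with a
    'target' field) to a Kafka (key, value) byte pair. Field names are
    illustrative, not part of any fixed SignalR schema."""
    key = str(message.get("target", "")).encode("utf-8")
    value = json.dumps(message).encode("utf-8")
    return key, value

# With kafka-python (assumed broker at localhost:9092, topic "signalr-events"):
# from kafka import KafkaProducer
# producer = KafkaProducer(bootstrap_servers="localhost:9092")
# key, value = signalr_to_record({"target": "priceUpdate", "arguments": [101.5]})
# producer.send("signalr-events", key=key, value=value)
```

Spark can then consume the topic with `spark.readStream.format("kafka")` in Structured Streaming, which keeps the SignalR-specific code out of the Spark job entirely.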

how to work with kafka-kusto-sink and debug

I am ingesting data into a Kafka cluster that should send it on to an ADX (Kusto) database using kafka-sink-azure-kusto.
I can successfully ingest data into the Kafka cluster, but the data is not being transferred to the Kusto database. How can I debug this? Are there any logs I can check?
I have checked the broker log; there are no errors there.
Ref: https://github.com/Azure/kafka-sink-azure-kusto/blob/master/README.md
Could you please provide more information about how you are running Kafka and how you set up the connector?
Debugging steps would be:
The broker logs should mention that the connector was picked up properly; did you see that line in the logs?
The connector logs should show more about what is actually going on under the hood; maybe you will see some errors there. Check /var/log/connect-distributed.log.
Try to ingest data via another method, like one of the SDKs.
Try running the setup according to the steps detailed under deploy.
Update: more info about connector setup in general can be found at this SO question: Kafka connect cluster setup or launching connect workers
Also, Confluent has some helpful docs: https://docs.confluent.io/current/connect/userguide.html
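Beyond the log files, the Kafka Connect REST API (port 8083 by default) reports connector and task state directly via `GET /connectors/<name>/status`; a FAILED task carries its stack trace in the `trace` field. A small sketch (worker URL and connector name below are placeholders):

```python
import json
from urllib.request import urlopen

def failed_tasks(status: dict) -> list:
    """Given the JSON body of GET /connectors/<name>/status, return the ids
    of tasks in the FAILED state (read their 'trace' field for the cause)."""
    return [t["id"] for t in status.get("tasks", []) if t.get("state") == "FAILED"]

def connector_status(worker_url: str, name: str) -> dict:
    """Fetch connector status from a Connect worker,
    e.g. connector_status("http://localhost:8083", "azure-kusto-sink")."""
    with urlopen(f"{worker_url}/connectors/{name}/status") as resp:
        return json.load(resp)
```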

robot framework REST interface

Does Robot Framework have any capability to expose a REST interface to run/stop tests and report status? I need some sort of stateless capability to manage tests. Is there a way to limit how many tests can run in parallel, so that a triggered test either runs immediately or gets queued?
I went through 'remote server' documentation at https://github.com/robotframework/PythonRemoteServer but didn't think this did what I wanted it to do.
Can someone provide more information?
No, Robot Framework does not provide any sort of server that can be used to control tests via a REST interface. Robot also has no built-in support for running tests in parallel, but there is a separate tool that can run Robot tests in parallel: pabot
If you need a RESTful interface, you might want to look at a CI server such as Jenkins.

Using NGINX to forward tracking data to Flume

I am working on providing analytics for our web property based on instrumentation data we collect via a simple image beacon. Our data pipeline starts with Flume, and I need the fastest possible way to parse query string parameters, form a simple text message and shove it into Flume.
For performance reasons, I am leaning towards nginx. Since serving a static image from memory is already supported, my task is reduced to handling the query string and forwarding a message to Flume. Hence, the question:
What is the simplest reliable way to integrate nginx with Flume? I am thinking about using syslog (Flume supports syslog listeners), but I struggle with how to configure nginx to forward custom log messages to a syslog (or just TCP) listener running on a remote server and on a custom port. Is it possible with existing 3rd party modules for nginx or would I have to write my own?
Separately, anything existing you can recommend for writing a fast $args parser would be much appreciated.
If you think I am on a completely wrong path and can recommend something better performance-wise, feel free to let me know.
Thanks in advance!
You should follow the nginx log file the way tail -f does and then pass the results to Flume. That is the simplest and most reliable way. The problem with syslog is that it blocks nginx and may stall completely under high load or if something goes wrong (which is why nginx doesn't support it).
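A minimal sketch of that approach, assuming the combined log format and a Flume TCP/syslog-style source (the log path, host, port, and beacon parameter names are all assumptions); the stdlib `urllib.parse` also covers the $args parsing asked about above:

```python
# Sketch: follow the nginx access log like `tail -f`, parse the beacon's
# query string, and forward each hit to a Flume TCP source.
import socket
import time
from urllib.parse import parse_qsl, urlsplit

def parse_beacon(request_line: str) -> dict:
    """Extract query-string params from a logged request line such as
    'GET /beacon.gif?uid=42&ev=click HTTP/1.1'."""
    path = request_line.split(" ")[1]
    return dict(parse_qsl(urlsplit(path).query))

def follow(path: str):
    """Yield new lines appended to a file, like `tail -f`."""
    with open(path) as f:
        f.seek(0, 2)            # start at the current end of file
        while True:
            line = f.readline()
            if not line:
                time.sleep(0.2) # nothing new yet; poll again shortly
                continue
            yield line

def forward(log_path: str, flume_host: str, flume_port: int) -> None:
    """Stream parsed beacon hits to Flume as one text message per line."""
    with socket.create_connection((flume_host, flume_port)) as sock:
        for line in follow(log_path):
            # Naive: assumes combined log format with the request in quotes.
            request = line.split('"')[1]
            params = parse_beacon(request)
            sock.sendall((repr(params) + "\n").encode("utf-8"))
```

A dedicated shipper process like this keeps the hot nginx request path untouched; only the log file couples the two.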
