Does nginx-rtmp provide an interface to get current live stream info?

As we know, nginx-rtmp provides a stat function, but the result is styled as a web page. I have several nginx servers, and I want to collect the live stream info from all of them and integrate it into my web system. Does nginx-rtmp provide an interface to get this info, or is there another way to do it?
Thanks!

It also provides a stat.xml, based on which the XSL stylesheet builds the HTML. You can programmatically parse the XML structure and build a list of the streams currently running on your server.
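For example, here is a minimal C# sketch of that approach, assuming the /stat location you configured in nginx.conf and the usual stat.xml layout (rtmp > server > application > live > stream); the host name is hypothetical:

    using System;
    using System.Net.Http;
    using System.Threading.Tasks;
    using System.Xml.Linq;

    class RtmpStatCollector
    {
        static async Task Main()
        {
            using var http = new HttpClient();
            // The /stat location is whatever you mapped to the rtmp stat handler in nginx.conf.
            var xml = await http.GetStringAsync("http://my-nginx-server/stat"); // hypothetical host

            var doc = XDocument.Parse(xml);
            // stat.xml layout: rtmp > server > application > live > stream
            foreach (var app in doc.Descendants("application"))
            {
                var appName = (string)app.Element("name");
                foreach (var stream in app.Descendants("stream"))
                {
                    Console.WriteLine($"{appName}/{(string)stream.Element("name")}: " +
                                      $"{(string)stream.Element("nclients")} client(s)");
                }
            }
        }
    }

Run this against each of your nginx servers and merge the results to get the cluster-wide list the question asks for.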

Related

Command / CLI based REST HTTP client to process bulk requests (e.g. .http files)

We need a tool/setup in place that will trigger RESTful web API requests using a preformatted file (which contains the HTTP request config and payload), like the .http file format supported by the VS Code REST Client.
We need a CLI because the process is automated and runs in the background. A typical scenario: a .http file arrives in a folder, the tool picks it up (usually there will be a single HTTP request) and triggers that request. The response is written to another file (not mandatory, but a logging feature would be helpful in debugging).
Here are a few options we've been exploring:
1. Postman Newman
2. cURL with a Windows batch file (or HTTPie instead of cURL)
3. Similar tools we've been exploring: VS Code REST Client, httpYAC, ... (not sure whether such tools can be automated)
4. A console app based solution using Node.js, C#, Python, PHP, ...; it would be highly customizable, but this is like starting from scratch, so it is the last option if all of the above fail (a rough sketch of this option appears below).
We just need this piece to complete our data flow. We have yet to face things like request throttling, delays, auth, ..., but all of this needs to be preconfigured and automated. The setup can vary based on the HTTP service provider we use (i.e. Shopify, Amazon, ...).
EDIT #1:
Option #5. Forgot to mention that we had implemented a web API access demo using SQL stored procedures (OLE Automation), and we could achieve a lot of what we wanted. Here's a similar Ref.
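For what it's worth, option 4 could start as small as the following C# sketch: it reads a simplified .http file (a "METHOD URL" request line, headers, a blank line, then the body), fires the request, and writes the response next to the input file. It covers only a small subset of the real VS Code REST Client format, and the output naming convention is made up:

    using System;
    using System.IO;
    using System.Net.Http;
    using System.Threading.Tasks;

    // Tiny runner for a simplified .http file: a "METHOD URL" request line,
    // then "Header: value" lines, a blank line, then the body.
    class HttpFileRunner
    {
        static async Task Main(string[] args)
        {
            var lines = await File.ReadAllLinesAsync(args[0]); // path to the .http file
            var requestLine = lines[0].Split(' ', 2);          // e.g. "POST https://example.com/api"
            var request = new HttpRequestMessage(new HttpMethod(requestLine[0]), requestLine[1]);

            // Headers run until the first blank line; everything after it is the body.
            int i = 1;
            for (; i < lines.Length && lines[i].Trim().Length > 0; i++)
            {
                var header = lines[i].Split(':', 2);
                // NOTE: content headers such as Content-Type belong on
                // request.Content.Headers; ignored here for brevity.
                request.Headers.TryAddWithoutValidation(header[0].Trim(), header[1].Trim());
            }
            if (i + 1 < lines.Length)
                request.Content = new StringContent(string.Join("\n", lines[(i + 1)..]));

            using var http = new HttpClient();
            var response = await http.SendAsync(request);

            // Write the response next to the input file, as the workflow requires.
            var outPath = args[0] + ".response.txt";
            await File.WriteAllTextAsync(outPath, await response.Content.ReadAsStringAsync());
            Console.WriteLine($"{(int)response.StatusCode} -> {outPath}");
        }
    }

A folder watcher (e.g. FileSystemWatcher) in front of this would cover the "file arrives in a folder" trigger; throttling, delay, and auth would still need to be layered on top.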

Why JMeter can't record sites using Firebase as the data connection

I tried to record a site with JMeter which uses Firebase for data storage, but it fails to access Firebase and I cannot log into the site while recording. Is there any way to access Firebase while recording a load test in JMeter? I imported the JMeter certificate, but the problem is still there. I also tried the Chrome extension, but it didn't give the expected output either.
Most probably it's due to incorrect JMeter configuration for recording: you need to import JMeter's certificate into your browser. The file is called ApacheJMeterTemporaryRootCA.crt; JMeter generates it under its "bin" folder when you start the HTTP(S) Test Script Recorder.
See the HTTPS recording and certificates documentation chapter for more details.
Going forward, consider looking at the View Results Tree listener output and the jmeter.log file; they should provide a sufficient amount of information to get to the bottom of the issue. If you cannot interpret what you see there yourself, add at least the essential parts of the response/log to your question.
Also be aware of an alternative, "non-invasive" way of recording a JMeter test: the JMeter Chrome Extension. In that case you won't have to worry about proxies and certificates, and you should be able to record whatever HTTP(S) traffic your browser generates.

Single Page Application with signalR: performance testing

I need to evaluate the number of concurrent users our website can handle. The website is a Single Page Application built on the .NET Framework with Durandal.js on the frontend. We use SignalR (hubs) for real-time communication between server and client.
The only option I see is "browser testing", where each test runs a browser instance (or uses PhantomJS etc.) to keep a real-time connection with the server, as in real usage. Are there any other options besides tests that drive a browser instance to emulate user behaviour? What is the best way to emulate a load of, e.g., 1000 concurrent users?
I've found several cloud services that support such load testing, e.g. Load Impact and BlazeMeter. It would be great if someone could share their experience with such tools.
SignalR provides a tool called Crank, which can be used to test how many connections a given machine can handle.
More info: http://www.asp.net/signalr/overview/performance/signalr-connection-density-testing-with-crank
Make your own script to create virtual users; that is the most effective way to recreate real-world load/stress. Use the Akka actor model (for creating the virtual users) with the Java SignalR client. (If you want, you can use the Gatling tool as a framework and attach your script, written in Java or Scala, to Gatling's virtual users.)
Make the script dynamic by storing user info (authentication token or user credentials) in an XML document.
Please comment with questions; I can guide you end to end, as I completed building and deploying such a tool...
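This answer proposes Akka with the Java SignalR client; since the application under test is .NET, here is a comparable minimal sketch in C# instead, using the classic SignalR .NET client (Microsoft.AspNet.SignalR.Client) with Tasks standing in for actor-based virtual users. The server URL, hub name, and hub method are hypothetical:

    using System;
    using System.Linq;
    using System.Threading.Tasks;
    using Microsoft.AspNet.SignalR.Client; // classic SignalR .NET client NuGet package

    class SignalRLoadSketch
    {
        // One "virtual user": open a hub connection, call a hub method,
        // hold the connection for a while, then disconnect.
        static async Task RunUserAsync(int id)
        {
            var connection = new HubConnection("http://yourserver/signalr"); // hypothetical URL
            var hub = connection.CreateHubProxy("appHub");                   // hypothetical hub name
            await connection.Start();
            await hub.Invoke("Send", $"user{id}", "hello");                  // hypothetical hub method
            await Task.Delay(TimeSpan.FromSeconds(30)); // keep the connection open, as a real user would
            connection.Stop();
        }

        static async Task Main()
        {
            // Ramp up 1000 concurrent virtual users and wait for them all to finish.
            await Task.WhenAll(Enumerable.Range(0, 1000).Select(RunUserAsync));
        }
    }

To model realistic sessions you would replace the fixed delay with a scripted sequence of hub calls and load credentials/tokens per user, as the answer suggests.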

Change application's Solr url with a master/slave configuration

I'm working on a "home project Solr" (a.k.a. a PoC), using Solr as the search engine.
I'm using a master/slave configuration on two distinct servers, so (in my plan) my search would be available even if the main server goes down. But the problem is:
I'm currently doing my search at the URL
http://mainSOLRSERVER:8080/search/select/?q=*:*
Is there any built-in Solr component to check whether this URL is OK and, if it is not, switch to another server:
http://redundancySOLRSERVER:8080/search/select/?q=*:*
I'm not using SolrJ in my application; it's a simple ASP.NET application that parses the XML in Solr's results.
The only thing I came up with would be to send a ping request to Solr's main server and, if it returns not OK, build the request with the redundancy URL. But that would be necessary on every single search request; is that the right approach?
Thanks in advance!
In a production scenario, you would use a load balancer.
You don't need the ping. Execute the query directly against the main URL and, if it returns an error, query the backup URL. If that also throws an error, then you are out of luck.
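A minimal C# sketch of that try-main-then-fall-back approach, using the two URLs from the question (the HttpClient timeout and the specific exceptions handled are my assumptions, not part of the answer):

    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    class SolrFailoverClient
    {
        // Primary first, then the redundancy server; both hostnames from the question.
        static readonly string[] BaseUrls =
        {
            "http://mainSOLRSERVER:8080/search/select/",
            "http://redundancySOLRSERVER:8080/search/select/"
        };

        static readonly HttpClient Http = new HttpClient { Timeout = TimeSpan.FromSeconds(5) };

        // Try each server in order; return the raw XML of the first successful response.
        static async Task<string> QueryAsync(string query)
        {
            foreach (var baseUrl in BaseUrls)
            {
                try
                {
                    var response = await Http.GetAsync($"{baseUrl}?q={Uri.EscapeDataString(query)}");
                    if (response.IsSuccessStatusCode)
                        return await response.Content.ReadAsStringAsync();
                }
                catch (HttpRequestException) { /* server down: fall through to the next URL */ }
                catch (TaskCanceledException) { /* timeout: fall through to the next URL */ }
            }
            throw new InvalidOperationException("All Solr servers are unreachable.");
        }
    }

The failover cost is paid only when the main server actually fails, which is exactly why the per-request ping is unnecessary.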

Printing on local printer from web application with limitations

I need to develop an application which fetches data from some databases, composes a document and then prints it on a local (label) printer.
It would be nice if the main application could be web based (ASP.NET), but I need control over the printing process so that users cannot print the same document again.
Can you suggest a solution? It is possible to create a desktop application too, but we would prefer a web application.
One idea I have had is to implement a custom protocol, as NetMeeting does (callto://), but I don't know whether it is a good idea or how difficult it would be.
Thank you for your advice.
I have since implemented this, as I wrote, via a custom protocol (callto://) and I can recommend it. The protocol opens a simple desktop application (it can be registered in the Windows registry) configured to print the fetched data on the local printer.
In the protocol link I have a base64-encoded URL of the web service to ask for the data, plus a session id with a customized print request for generating the print data.
Works fine.
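For illustration, here is a minimal sketch of such a handler in C#, assuming a hypothetical myprint:// scheme whose link carries the base64-encoded service URL followed by the session id; the actual scheme name, link layout, and printing code are the author's and are stand-ins here:

    using System;
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;

    // Minimal handler for a custom protocol link such as
    //   myprint://<base64-encoded-service-url>/<sessionId>
    // Windows passes the full link as args[0] once the scheme is registered under
    // HKEY_CLASSES_ROOT\myprint\shell\open\command.
    class ProtocolPrintHandler
    {
        static async Task Main(string[] args)
        {
            var payload = args[0].Substring("myprint://".Length);
            var parts = payload.Split('/');
            var serviceUrl = Encoding.UTF8.GetString(Convert.FromBase64String(parts[0]));
            var sessionId = parts[1];

            // Ask the web service for the data to print, scoped to this session.
            using var http = new HttpClient();
            var printData = await http.GetStringAsync($"{serviceUrl}?session={sessionId}");

            // Hand printData to the local (label) printer here, e.g. via
            // System.Drawing.Printing.PrintDocument or the printer's raw interface.
            Console.WriteLine($"Fetched {printData.Length} chars for session {sessionId}.");
        }
    }

Because the desktop handler fetches the print data from the server per session id, the server side can refuse a second request for the same document, which is what enforces the "print only once" rule.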
