Command / CLI based REST HTTP client to process bulk requests (i.e. like .http files) - webapi

We need a tool/setup that will trigger RESTful web API requests using a preformatted file (containing the HTTP request config and payload), like the .http file format supported by the VS Code REST Client extension.
We need a CLI because the process is automated and runs in the background. A typical scenario: a .http file arrives in a folder, the tool picks it up (usually it will contain a single HTTP request) and triggers that request. The response is written to another file (not mandatory, but a logging feature would help with debugging).
Here are a few options we've been exploring:
1. Postman Newman
2. cURL with a Windows batch file (or HTTPie instead of cURL)
3. Similar tools: VS Code REST Client, httpYAC, ... (not sure whether such tools can be automated)
4. A console app based solution using Node.js, C#, Python, PHP, ...; it would be highly customizable, but it's like starting from scratch, so it's the last option if all of the above fail (a minimal sketch of this approach follows the list).
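A minimal sketch of what option 4 could look like in Python. The file layout assumed here (a "METHOD URL" line, then "Header: value" lines, a blank line, then the body) covers only the simplest .http files, not the full format (no variables, comments, or multiple requests per file), and the file names are illustrative:

```python
# Sketch: read a simple .http file, fire the request, write the response
# to a sibling file so the automated flow can pick it up.
import sys
import requests  # third-party: pip install requests

def parse_http_file(path):
    with open(path, encoding="utf-8") as f:
        lines = f.read().splitlines()
    method, url = lines[0].split(maxsplit=1)   # e.g. "POST https://..."
    headers, body_start = {}, len(lines)
    for i, line in enumerate(lines[1:], start=1):
        if not line.strip():                   # blank line ends the headers
            body_start = i + 1
            break
        name, _, value = line.partition(":")
        headers[name.strip()] = value.strip()
    body = "\n".join(lines[body_start:]) or None
    return method, url, headers, body

if __name__ == "__main__":
    method, url, headers, body = parse_http_file(sys.argv[1])
    resp = requests.request(method, url, headers=headers, data=body, timeout=30)
    with open(sys.argv[1] + ".response", "w", encoding="utf-8") as out:
        out.write(f"{resp.status_code}\n{resp.text}")
```

Invoked as e.g. python run_http.py request.http, this could be triggered by a folder watcher or a scheduled task to cover the "file arrives in a folder" scenario, and throttling/delays could be added around the requests.request call.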
We just need this piece to complete our data flow. We're yet to face things like request throttling, delays, auth, ... but all of this needs to be preconfigured and automated. The setup can vary based on the HTTP service provider we use (i.e. Shopify, Amazon, ...).
EDIT #1:
Option #5: Forgot to mention that we had implemented a web API access demo using SQL stored procedures (OLE Automation), and we could achieve a lot of what we wanted. Here's a similar Ref.

Related

Why JMeter can't record sites using Firebase as the data connection

I tried to record a site with JMeter; the site uses Firebase for data storage, but JMeter fails to access Firebase and I cannot log into the site while recording. Is there any way to access Firebase while recording a load test in JMeter? I imported the JMeter certificate, but the problem is still there. I also tried the Chrome extension, but it didn't give the expected output either. (Error description image attached.)
Most probably it's due to incorrect JMeter configuration for recording: you need to import JMeter's certificate into your browser. The file is called ApacheJMeterTemporaryRootCA.crt; JMeter generates it under its "bin" folder when you start the HTTP(S) Test Script Recorder.
See the HTTPS recording and certificates documentation chapter for more details.
Going forward, consider looking at the View Results Tree listener output and the jmeter.log file; they should provide a sufficient amount of information to get to the bottom of the issue. If you cannot interpret what you see there yourself, add at least the essential parts of the response/log to your question.
Also be aware of an alternative "non-invasive" way of recording a JMeter test: the JMeter Chrome Extension. In that case you won't have to worry about proxies and certificates, and should be able to record whatever HTTP(S) traffic your browser generates.

How can I track downloads of files from remote websites

I am sharing a link to a file (e.g. a PDF) which is stored on my server. Is it possible to track whenever a user downloads the file? I don't have access to the script of the other page, but I thought I could track the incoming requests to my server. Would that be computationally expensive? Any hints on which direction to look in?
You can use the Measurement Protocol, a language-agnostic description of an HTTP tracking request to the Google Analytics tracking server.
The problem in your case is that you do not have a script between the click and the download to send the tracking request. One possible workaround would be to use the server access log, provided you have some control over the server.
For example, the Apache web server can use piped logs: instead of being written directly to a file, each log entry is passed to a script or program. I'm reasonably sure that other servers have something similar.
You could pipe the logs to a script that checks whether a log entry points at the URL of your PDF file and, if so, breaks the entry down into individual data fields and sends them, via a programming language of your choice, to the GA tracking server.
If you cannot control the server to that level, you'd need to place a script with the same name and location as the original file on the server, map the pdf extension to a script interpreter of your choice (in Apache via AddType, which with many hosts can be done via an .htaccess file) and have the script send the tracking request before delivering the original file.
Both solutions require a modicum of programming practice (the latter much less than the former). Piping logs might be expensive, depending on the number of requests to your server (you could create a separate log file just for downloadable files, though). An intermediary script would not be an expensive operation.
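To make the piped-log idea concrete, here is a minimal Python sketch. The log format (Apache "common"), the property ID, and the event fields are assumptions to adapt; the endpoint and the v/tid/cid/t parameters are the documented Universal Analytics Measurement Protocol ones:

```python
# Sketch: consume Apache piped-log lines on stdin and report PDF downloads
# to Google Analytics via the (Universal Analytics) Measurement Protocol.
import re
import sys
import uuid
import urllib.parse
import urllib.request

GA_ENDPOINT = "https://www.google-analytics.com/collect"
PROPERTY_ID = "UA-XXXXX-Y"          # placeholder: your tracking ID
PDF_PATTERN = re.compile(r'"GET (\S+\.pdf) HTTP')

for line in sys.stdin:              # Apache pipes each log entry to stdin
    match = PDF_PATTERN.search(line)
    if not match:
        continue
    params = urllib.parse.urlencode({
        "v": "1",                   # protocol version
        "tid": PROPERTY_ID,         # tracking/property ID
        "cid": str(uuid.uuid4()),   # anonymous client ID
        "t": "event",               # hit type
        "ec": "download",           # event category
        "ea": match.group(1),       # event action: the requested path
    }).encode()
    urllib.request.urlopen(GA_ENDPOINT, data=params, timeout=5)
```

In Apache this would be wired up with a piped CustomLog directive, e.g. CustomLog "|/usr/bin/python3 /path/to/track.py" common (the script path is illustrative).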

Does nginx-rtmp provide an interface to get current live stream info

As we know, nginx-rtmp provides a stat function, but the result is a web page. I have several nginx servers, and I want to collect the live stream info from all of them and integrate it into my web system. Does nginx provide an interface to get this info, or is there any other way to do it?
Thanks!
It also provides a stat.xml, based on which the XSL builds the HTML page. You can programmatically parse the XML structure and build a list of the currently running streams on your server.
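For example, a small Python sketch that polls each server's stat page and lists the live streams. The /stat URLs are placeholders, and the element names should be verified against the stat.xml your module version actually emits:

```python
# Sketch: collect currently-running stream names from several nginx-rtmp
# servers by parsing each server's stat.xml.
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder stat URLs; point these at your servers' stat locations.
SERVERS = ["http://nginx1:8080/stat", "http://nginx2:8080/stat"]

def live_streams(stat_url):
    with urllib.request.urlopen(stat_url, timeout=5) as resp:
        root = ET.fromstring(resp.read())
    # Walk every <application> and yield the <name> of each <stream> in it.
    for app in root.iter("application"):
        app_name = app.findtext("name")
        for stream in app.iter("stream"):
            yield app_name, stream.findtext("name")

for url in SERVERS:
    for app, stream in live_streams(url):
        print(f"{url}: {app}/{stream}")
```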

Single Page Application with SignalR: performance testing

I need to evaluate the number of concurrent users our website can handle. The website is a Single Page Application built on the .NET Framework with Durandal.js on the frontend. We use SignalR (hubs) for real-time communication between server and client.
The only option I see is 'browser testing': each test would run a browser instance (or use PhantomJS etc.) to keep a real-time connection with the server, as in real usage. Are there any other options besides tests that drive a browser instance to emulate user behaviour? What is the best way to emulate a load of e.g. 1000 concurrent users?
I've found several cloud services that support this kind of load testing, e.g. Load Impact and BlazeMeter. It would be great if someone could share their experience of using such tools.
SignalR provides a tool called Crank, which can be used to test how many connections can be handled by a given machine.
More info: http://www.asp.net/signalr/overview/performance/signalr-connection-density-testing-with-crank
Make your own script to create virtual users; that is the most effective way to recreate real-world load/stress. Use the Akka actor model (for creating the virtual users) with the Java SignalR client. (If you want, you can use the Gatling tool as a framework and attach your script, written in Java or Scala, to Gatling's virtual users.)
Make the script dynamic by storing user info (authentication tokens or user credentials) in an XML document.
Please comment with questions; I can guide you end to end, as I've completed building and deploying such a tool.
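If you'd rather not pull in Akka or Gatling, the virtual-user idea can also be sketched in Python with asyncio. Note that this issues plain HTTP requests; a faithful SignalR test would instead keep a hub connection open per user, and the URL and counts here are placeholders:

```python
# Sketch: N concurrent "virtual users", each looping a simple HTTP call
# with think time in between. A real SignalR test would hold a hub
# connection open per user instead of issuing plain GETs.
import asyncio
import aiohttp  # third-party: pip install aiohttp

TARGET = "http://localhost:5000/"   # placeholder endpoint
USERS = 1000
REQUESTS_PER_USER = 10

async def virtual_user(session, user_id):
    for _ in range(REQUESTS_PER_USER):
        async with session.get(TARGET) as resp:
            await resp.read()
        await asyncio.sleep(1)      # think time between user actions

async def main():
    async with aiohttp.ClientSession() as session:
        await asyncio.gather(*(virtual_user(session, i)
                               for i in range(USERS)))

asyncio.run(main())
```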

Any suggestions for good automated web load testing tool?

What are some good automated tools for load testing (stress testing) web applications, that do not use record and replay of HTTP network packets?
I am aware that there are numerous load testing tools on the market that record and replay HTTP network packets. But these are unsuitable for my purpose, for two reasons:
The HTTP packet format changes very often in our application (e.g. when we optimize an AJAX call). We do not want to adapt all test scripts just because there is a slight change in the HTTP packet format.
Our test team should not need to know any internals of our application to write their test scripts. A tool that replays HTTP packets, however, requires the team to know the format of HTTP requests and responses, so that they can adapt details of the replayed packets (e.g. the user name).
The automated load testing tool I am looking for should let the test team write "black box" test scripts such as the following (a sketch of such a script appears after the list):
Invoke web page at URL http://... .
First, enter XXX into text field XXX.
Then, press button XXX.
Wait until response has been received from web server.
Verify that text field XXX now contains the text XXX.
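As a hedged illustration (the question doesn't prescribe a tool), a script in that style could look like this with Selenium in Python; the URL, element IDs, and expected text are placeholders, and Selenium itself is a functional driver rather than a load generator:

```python
# Sketch of the "black box" steps listed above, using Selenium.
# URL, element IDs, and the expected text are placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
try:
    driver.get("http://example.com/form")                        # invoke page
    driver.find_element(By.ID, "inputField").send_keys("XXX")    # enter text
    driver.find_element(By.ID, "submitButton").click()           # press button
    # wait until the server response has updated the result field
    WebDriverWait(driver, 10).until(
        EC.text_to_be_present_in_element_value((By.ID, "resultField"), "XXX")
    )
finally:
    driver.quit()
```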
The tool should be able to simulate up to several thousand users, and it should be compatible with web applications using ASP.NET and AJAX.
I've found JMeter to be pretty helpful. It also has recording functionality, so you don't have to specify each GET/POST manually; instead you "click through" a use case once and then let JMeter repeat it.
http://jmeter.apache.org/
A license for it can be expensive (if you don't have MSDN), but Visual Studio 2010 Ultimate edition has a great set of load and stress testing tools that do what you describe. You can try it out for free for 90 days here.
TestMaker by PushToTest.com can run recorded scripts such as Selenium, as well as many different languages like HTML, Java, Ruby, Groovy, .NET, VB, PHP, etc. It has a common reporting infrastructure, and you can generate load in your own test lab or in cloud environments like EC2 for virtual test labs.
They provide free webinars on using open source testing tools on a monthly basis and there is one next Tuesday.
http://www.pushtotest.com
There are a few approaches; I've been in situations, however, where I've had to roll my own load generating utilities.
As far as your test script is concerned, it involves:
sending a GET request to the http://form entry page (only checking that a 200 response is given)
sending a POST request to the http://form submit page with pre-generated key/value pairs for text XXX, and performing a regexp check on the response
Unless your web page is complex AJAX, there is no need to "simulate a button press"; this is taken care of by the POST request.
Given that your test consists of just a two-step process, there should be several automated load packages that could do this (the two steps are sketched below).
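A minimal sketch of those two steps with Python's requests library; the URLs, form field, and success pattern are placeholders:

```python
# Sketch of the two-step test described above: GET the form page, then
# POST the form fields and regexp-check the response.
import re
import requests  # third-party: pip install requests

session = requests.Session()        # carries cookies between the two steps

# Step 1: GET the form entry page, only checking for a 200 response.
resp = session.get("http://example.com/form")
assert resp.status_code == 200

# Step 2: POST pre-generated key/value pairs and verify the response body.
resp = session.post("http://example.com/form/submit",
                    data={"textFieldXXX": "XXX"})
assert re.search(r"success", resp.text), "expected pattern not found"
```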
I've previously used httperf for load testing a large website: it can simulate a session consisting of several requests and can simulate a large number of users (i.e. sessions) simultaneously. For example, if your website generated a session cookie from the home page, you could make that the first request; httperf would then use that cookie for subsequent requests until it had finished the list of requests supplied.
What about http://watin.sourceforge.net/ ?
