Sending dictionary parameter with JMeter to REST service - ASP.NET

First of all, I am a newbie at JMeter. I searched a lot of documents but unfortunately did not find an answer to my problem. I have a Web API REST service with the following signature, and I can't send a dictionary-format parameter:
[HttpPost,ActionName("DummyService")]
public Dictionary<string,string> DummyService([FromBody] Dictionary<string,string> parameters)
I used the Parameters section and the Body Data section, but the parameters are always null.
How can I achieve this?
Thanks in advance.

I believe you need to pass some form of JSON array to your web service. In order to do it:
Switch to the Body Data tab
Put your JSON payload there, like:
{
  "Parameters": [
    {
      "Key": "Key1",
      "Value": "Value1"
    },
    {
      "Key": "Key2",
      "Value": "value2"
    }
  ]
}
Most likely you will also need to add an HTTP Header Manager to your test plan and configure it to send the Content-Type header with the value application/json.
(Optional) Consider upgrading to JMeter 3.0; judging by the screenshot, you seem to be on an older version.
See the Testing SOAP/REST Web Services Using JMeter guide for more information on setting up JMeter for API testing.
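As an aside, ASP.NET Web API's default JSON formatter (Json.NET) can usually bind a flat JSON object directly to Dictionary<string,string>, so a simpler payload may also work (a sketch; the key names are illustrative):
{
  "Key1": "Value1",
  "Key2": "Value2"
}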

Related

Kafka HTTP Topic Producer configuration acks=all

How do we set acks=all for a Kafka HTTP topic? I tried sending "acks": "all" in the JSON below, but it throws an unrecognized-property error. I tried setting it in the header as well, but the header accepts any value I set, such as 0, 1, all, or even abcd, so I can't rely on it. From Java code, however, we can set this property via ProducerConfig.ACKS_CONFIG as a key-value pair.
{
  "records": [
    {
      "value": { "name": "Firstname, lastname" },
      "key": "123e4567-e89b-42d3-a456-5566424415123591"
    }
  ]
}
Any suggestions are helpful.
In the case of the Kafka REST Proxy, producer instances are shared between clients, i.e. the client(s) connect to REST Proxy instance(s), and those REST Proxy instance(s) in turn connect to the Kafka cluster (brokers).
The JSON that you have supplied only contains data records.
The REST Proxy layer holds the global producer settings you're looking for, and clients end up sharing them. So, if you have access to the REST Proxy instances, you can modify these parameters there directly, as sketched below.
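For example, with the Confluent REST Proxy, standard producer configs can be set in the proxy's configuration file by prefixing them with producer. (a sketch, assuming the Confluent kafka-rest.properties conventions):
# kafka-rest.properties on the REST Proxy instance
# configs prefixed with "producer." are applied to the shared producers
producer.acks=all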

Enable CORS for Swashbuckle swagger.json in .NET Lambda API

I have a .NET Lambda API for which I was previously using Swashbuckle to generate a swagger.json file that was given to an external site to use. I am now trying to set things up so the swagger.json file is generated by the API and available through a URL for the external site to use, i.e.: mylambdaapi.com/swagger/v2/swagger.json. I was able to get this working by adding a dummy event to my template when pushing to AWS, as follows:
"SwaggerJson": {
"Type": "Api",
"Properties": {
"Path": "/swagger/v2/swagger.json",
"Method": "GET"
}
}
This works for just accessing the file normally; however, the external site runs into CORS "No 'Access-Control-Allow-Origin' header" issues when trying to load the JSON. Is there any way to force the generation to use "Access-Control-Allow-Origin" in this case? Or is this not feasible in this way? I'm working off what another developer had built previously, so I'm trying not to rewrite everything; however, I'm open to another method as long as it is able to produce some Swagger JSON that the external site can consume.
EDIT: I should note that I am using API Gateway; however, the swagger.json is only used for documentation purposes for the external site.
I attempted to use the UseCors() functionality, but that did not work. I was able to fix the issue by adding an inline middleware delegate that sets the header on the response, registered before UseSwagger().
The following snippet is from the Configure method in my Startup class.
app.Use((context, next) =>
{
    // Attach the CORS header to every response before the Swagger middleware runs
    context.Response.Headers["Access-Control-Allow-Origin"] = "*";
    return next.Invoke();
});
app.UseSwagger();
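If the external site also issues preflight OPTIONS requests, CORS may additionally need to be enabled on API Gateway itself. A minimal sketch for an AWS SAM template (the Globals/Api/Cors section is from the SAM spec; the values are illustrative):
"Globals": {
  "Api": {
    "Cors": {
      "AllowOrigin": "'*'",
      "AllowHeaders": "'Content-Type'"
    }
  }
}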

Alexa Skill Developers Reference-Based Catalog Management API

This doc says "With the Reference-Based Catalog Management API, you can create a custom slot type that references an external data source to get the slot type values. This API allows you to create and maintain a catalog of slot type values independent of your Alexa skill."
However, as you dig into it, it doesn't provide some needed details on how to actually set up the catalog on an endpoint like S3.
While this resource was provided as an answer to this similar question, it actually refers to content catalogs (like music playlists), not the Reference-Based Catalog Management API, so I assume that was in error and it is not applicable.
So, for the Reference-Based Catalog Management API: the docs say the catalog needs to be in JSON format and offer ingredients.json as an example. However, I used this directly and it fails (see below). Also, they do not describe what the format should be to include synonyms. Please describe this.
I can successfully create the catalog with '/v1/skills/api/custom/interactionModel/catalogs/' and get a catalogId in return. However, creating the catalog version via '/skills/api/custom/interactionModel/catalogs/{catalogId}/versions' fails. I get "Website Temporarily Unavailable" when I issue the POST.
Here's the request body structure that I'm including with that post:
data: {
  "source": {
    "type": "URL",
    "url": "https://s3.amazonaws.com/..../ingredients.json"
  },
  "description": "test S3 bucket"
}
Also, does the S3 endpoint have to be made public? I tried it both ways, didn't seem to matter. If it does have to be public though, how did you handle security?
Thanks for the help.
While the API call fails, I did get this to work using the CLI approach.
ask api create-model-catalog-version -c {catalogID} -f {filename}
The file should be JSON with the following structure:
{
  "type": "URL",
  "url": "[your catalog url]"
}
It remains an open question how to get the API approach to work, so any answers are appreciated. Maybe it is a bug, because I specify the exact same 'source' definition in the data structure of the API call as I do in the JSON file used by the CLI command.
Here's what I learned as I got it to work with the CLI:
Yes, the S3 endpoint must be made public in order for the create-model-catalog-version job to succeed. This strikes me as a problem; I would like to see the ability to wrap some security around these endpoints.
Here is the format of the JSON that you will want to use, including the use of synonyms, which is not described in the official Amazon example. Note that you don't have to include an ID like the one shown in that example.
{
  "values": [
    {
      "name": {
        "value": "hair salon",
        "synonyms": [
          "hairdresser",
          "beauty parlor"
        ]
      }
    },
    {
      "name": {
        "value": "hospital",
        "synonyms": [
          "emergency room",
          "clinic"
        ]
      }
    }
  ]
}

Zerocode: Set system property in host configuration file

Configuration:
zerocode-tdd 1.3.2, with ${host} as a placeholder in the host config file.
At runtime, the system property is set with the -D Java option, and all is well.
Problem / What I Need:
At unit test time, the system property is not set, and the host is not resolved.
The app uses JUnit and Zerocode; I would like to simply configure Zerocode to set the system property.
Example:
host.properties
web.application.endpoint.host:${host}
web.application.endpoint.port=
web.application.endpoint.context=
More Info:
The requirement is for configuration only. I can't introduce new Java code or entries into the IDE.
Any help out there? Any ideas are appreciated.
This feature is available in Zerocode version 1.3.9 and higher.
Use a placeholder like ${SYSTEM.PROPERTY:host}; for example, ${SYSTEM.PROPERTY:java.vendor} resolves to Oracle Corporation or Azul Systems, Inc.
Example link:
https://github.com/authorjapps/zerocode/blob/master/README.md#general-place-holders
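With that placeholder, host.properties could then reference the system property directly (a sketch, assuming Zerocode resolves the placeholder when it reads the properties file):
web.application.endpoint.host=${SYSTEM.PROPERTY:host}
web.application.endpoint.port=
web.application.endpoint.context=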
I found a solution, but I am not sure if this is the correct way to do it.
Step 1: Create a config file and load system properties.
Config.java
import java.util.HashMap;
import java.util.Map;

public class Config {
    // Reads the JVM system property and exposes it to the scenario as this step's response
    public Map<String, Object> readProperties(String optionalString) {
        Map<String, Object> propertiesMap = new HashMap<>();
        final String host = System.getProperty("host");
        propertiesMap.put("host", host);
        return propertiesMap;
    }
}
Step 2: Add a step (before the other steps) in the .json file that loads the properties.
test.json
{
  "scenarioName": "Test ...",
  "steps": [
    {
      "name": "config",
      "url": "com.test.Config",
      "operation": "readProperties",
      "request": "",
      "assertions": {}
    }
  ]
}
Step 3: Use the property loaded by the config step in later steps.
test.json
{
  "scenarioName": "Test ...",
  "steps": [
    {
      "name": "config",
      "url": "com.test.Config",
      "operation": "readProperties",
      "request": "",
      "assertions": {}
    },
    {
      "name": "test",
      "url": "${$.config.response.host}/test/xxx",
      "operation": "GET",
      "request": {},
      "assertions": {
        "status": 200
      }
    }
  ]
}
That's it. Although it is working, I am looking for a better approach.
Some possible options I am trying are:
a common step for load/config (in one place)
directly using properties such as ${host} in the JSON files
a custom client
Again, any help/ideas are appreciated.
My question is: why are you trying to access the actual host/port? Sorry for the long answer, but bear with me; I think there is an easier way to achieve what you are attempting. I find it's best to think about Zerocode usage in two ways:
live integration tests (which is what I think you're trying to do), meaning tests that call a live endpoint / service, or
what I refer to as a thintegration test (an integration test, but using a mock endpoint / service).
Thinking about it this way gives you the opportunity for two different metrics:
when using the mock endpoint / service, how performant / resilient is my code, and
when using live integration tests, what is the rough real-life performance (expect it to be a bit slower than an external load test due to data setup / test setup).
This lets you evaluate both yourself and your partner service.
So, outside of the evaluation above, why would you want to build a thintegration test? The real value is that you still make it all the way through your code, like you would in an integration test, but you control the result of the test, like you would in a standard unit test. Additionally, since you control the result of the test, this may improve build-time test stability vs. a live API.
Obviously it seems you already know how to set up an integration test, so I'll assume you're good to go there, but what about the thintegration tests?
To set up a thintegration test you really have two options:
use a Postman mock server (https://learning.postman.com/docs/designing-and-developing-your-api/mocking-data/setting-up-mock/)
a. more work to set up
b. external config to maintain
c. monthly API call limits
use WireMock (http://wiremock.org/)
a. lives with your code
b. all local, so no limits
If you already have integration tests, you can copy them to a new file and make the updates, or just convert your existing ones.
**** To address your specific question ****
When using WireMock, you can set up a local server URL with a dynamic port using the following:
import org.junit.Rule;
import com.github.tomakehurst.wiremock.junit.WireMockRule;
import static com.github.tomakehurst.wiremock.core.WireMockConfiguration.wireMockConfig;

// Start a local WireMock server on dynamically chosen free HTTP/HTTPS ports
@Rule
public WireMockRule wireMockRule = new WireMockRule(wireMockConfig().dynamicPort().dynamicHttpsPort());

// Build the base URL from the port WireMock was assigned
protected String getUriWithPort() {
    return "http://localhost:" + wireMockRule.port();
}
Note: The above was tested using WireMock version 2.27.1 and ZeroCode 1.3.27
Hope that helps you answer how to dynamically get a server/port for your tests.
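To tie this back to the original question, the dynamic WireMock URL can then be handed to Zerocode through the host system property before the scenario runs (a sketch; the @Before hook and the host property name are assumptions based on the question's setup):
import org.junit.Before;

@Before
public void exposeMockHost() {
    // Make the WireMock base URL visible to the ${host} / ${SYSTEM.PROPERTY:host} placeholder
    System.setProperty("host", getUriWithPort());
}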

Custom authentication provider for Azure logic apps

I have an existing web API using ASP.NET Web API 2, which has its own token-based authentication using an x-auth-token header.
I want to add Azure Logic Apps on top of this existing API, but the Logic Apps have to use that API for authentication. Azure AD, Facebook, Google... are not an option.
Is this possible? How?
In this case you would want to specify the header directly under the headers property of the action:
"Http": {
"conditions": [],
"inputs": {
"headers": {
"x-auth-token": "the auth token"
},
"method": "POST",
"uri": "https://myapiendpoint.com/action"
},
"type": "Http"
}
As a best practice, you would want to specify the actual token value as a parameter of type 'securestring', as sketched below. You can find more information on secure parameters here:
https://msdn.microsoft.com/library/azure/mt643789.aspx
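For illustration, the workflow definition could declare the token as a secure parameter and reference it from the action's headers (a sketch; the parameter name authToken is illustrative):
"parameters": {
  "authToken": {
    "type": "securestring",
    "defaultValue": ""
  }
}
The header in the HTTP action would then become "x-auth-token": "@parameters('authToken')" instead of the literal token value.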
So what I did was create an IOperationFilter (Swashbuckle) in ASP.NET and add the x-auth-token parameter to the Swagger export when required. The parameter then shows up correctly in Azure and can be filled in with the response of the previous authentication action.
