I'm trying to test a gRPC service using ghz. However, I get the error:
Cannot find service "com.server.grpc.Executor"
Config.json file:
"proto": "/Users/dev/Desktop/ghz/execute.proto",
"call": "com.server.grpc.Executor.execute",
"total": 2000,
"concurrency": 50,
"data": {
"param1": "test-data1",
"param2": "test-data2",
},
"max-duration": "10s",
"host": "<ip-address>:9090",
"c": 10,
"n": 200
}
proto file:
option java_package= "com.server.grpc";
option java_multiple_files = true;
service Executor {
rpc execute(ExecuteRequest) returns (ExecuteResponse);
}
message ExecuteRequest {
string param1 = 1;
string param2= 2;
}
message ExecuteResponse {
bool res = 1;
string msg = 2;
}
Running using command: ghz --config=<path/to/config>/config.json
Is there anything I'm missing?
Your protobuf file should contain e.g.:
syntax = "proto3";
package example;
...
Then your service would be fully qualified as example.Executor.execute, not com.server.grpc.Executor.execute, which is the language-specific (Java, judging by your option) fully-qualified name.
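For reference, the full proto file would then look something like this (the package name example is just an illustration; any package name works as long as the call in your config matches it):
syntax = "proto3";

package example;

option java_package = "com.server.grpc";
option java_multiple_files = true;

service Executor {
  rpc execute(ExecuteRequest) returns (ExecuteResponse);
}

message ExecuteRequest {
  string param1 = 1;
  string param2 = 2;
}

message ExecuteResponse {
  bool res = 1;
  string msg = 2;
}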
I assume you unintentionally omitted the opening brace ({) of the JSON file, but that, of course, is required.
JSON is also strict about trailing commas: "param2": "test-data2" must not be followed by a comma because it is the last item in the object, so drop that comma. With those fixes, config.json becomes:
{
"proto": "/Users/dev/Desktop/ghz/execute.proto",
"call": "example.Executor.execute",
"total": 2000,
"concurrency": 50,
"data": {
"param1": "test-data1",
"param2": "test-data2"
},
"max-duration": "10s",
"host": "<ip-address>:9090",
"c": 10,
"n": 200
}
Assuming your service is running on <ip-address>:9090, that should then work!
Given a downloaded deno module like:
import x from "https://deno.land/x/reggi@0.0.1/calendar/denodb/fresh/island.ts"
How can I find where this specific module is located on the host machine?
➜ deno.land cd x
cd: no such file or directory: x
➜ deno.land pwd
/Users/thomasreggi/Library/Caches/deno/deps/https/deno.land
There seems to be no "x" dir here 👆
Where is the file stored locally?
How can I create the local file path from url via code / api?
Notes:
The Linking to Third Party Code section in the manual will be a great entrypoint reference on this topic.
The below information was written when Deno's latest release was version 1.25.1.
Deno has a dependency inspector tool:
deno info [URL] will inspect an ES module and all of its dependencies.
This tool prints all of the dependencies of a module to stdout, and the output includes the module's remote origin as well as its cached path on your local storage device.
It also has a (currently unstable) argument --json which will print the dependencies in JSON format (which is easier to parse programmatically if that's your goal).
You can create a subprocess in your program that uses the above command and argument, then parse its output in order to programmatically determine a module's cached location. (Also note that the subprocess API is changing and the currently unstable APIs Deno.spawn and Deno.spawnChild will replace the current one.)
Here's some example output from a module with fewer dependencies than the one in your question:
% deno info https://deno.land/std@0.154.0/testing/asserts.ts
local: /Users/deno/.deno/deps/https/deno.land/ce1734220fb8c205d2de1b52a59b20401c59d0707abcc9765dbeb60c25483df9
type: TypeScript
dependencies: 3 unique (total 46.65KB)
https://deno.land/std@0.154.0/testing/asserts.ts (23.22KB)
├── https://deno.land/std@0.154.0/fmt/colors.ts (11.62KB)
├─┬ https://deno.land/std@0.154.0/testing/_diff.ts (11.11KB)
│ └── https://deno.land/std@0.154.0/fmt/colors.ts *
└── https://deno.land/std@0.154.0/testing/_format.ts (705B)
Output when using the --json argument:
% deno info --json https://deno.land/std@0.154.0/testing/asserts.ts
{
"roots": [
"https://deno.land/std#0.154.0/testing/asserts.ts"
],
"modules": [
{
"kind": "esm",
"local": "/Users/deno/.deno/deps/https/deno.land/02d8068ecd90393c6bf5c8f69b02882b789681b5c638c210545b2d71e604b585",
"emit": null,
"map": null,
"size": 11904,
"mediaType": "TypeScript",
"specifier": "https://deno.land/std#0.154.0/fmt/colors.ts"
},
{
"dependencies": [
{
"specifier": "../fmt/colors.ts",
"code": {
"specifier": "https://deno.land/std#0.154.0/fmt/colors.ts",
"span": {
"start": {
"line": 11,
"character": 7
},
"end": {
"line": 11,
"character": 25
}
}
}
}
],
"kind": "esm",
"local": "/Users/deno/.deno/deps/https/deno.land/62cb97c1d18d022406d28b201c22805c58600e9a6d837b0fc4b71621ed21e30d",
"emit": null,
"map": null,
"size": 11380,
"mediaType": "TypeScript",
"specifier": "https://deno.land/std#0.154.0/testing/_diff.ts"
},
{
"kind": "esm",
"local": "/Users/deno/.deno/deps/https/deno.land/3f50b09108fe404c8274e994b417a0802863842e740c1d7ca43c119c0ee0f14b",
"emit": null,
"map": null,
"size": 705,
"mediaType": "TypeScript",
"specifier": "https://deno.land/std#0.154.0/testing/_format.ts"
},
{
"dependencies": [
{
"specifier": "../fmt/colors.ts",
"code": {
"specifier": "https://deno.land/std#0.154.0/fmt/colors.ts",
"span": {
"start": {
"line": 10,
"character": 32
},
"end": {
"line": 10,
"character": 50
}
}
}
},
{
"specifier": "./_diff.ts",
"code": {
"specifier": "https://deno.land/std#0.154.0/testing/_diff.ts",
"span": {
"start": {
"line": 11,
"character": 44
},
"end": {
"line": 11,
"character": 56
}
}
}
},
{
"specifier": "./_format.ts",
"code": {
"specifier": "https://deno.land/std#0.154.0/testing/_format.ts",
"span": {
"start": {
"line": 12,
"character": 23
},
"end": {
"line": 12,
"character": 37
}
}
}
}
],
"kind": "esm",
"local": "/Users/deno/.deno/deps/https/deno.land/ce1734220fb8c205d2de1b52a59b20401c59d0707abcc9765dbeb60c25483df9",
"emit": null,
"map": null,
"size": 23776,
"mediaType": "TypeScript",
"specifier": "https://deno.land/std#0.154.0/testing/asserts.ts"
}
],
"redirects": {}
}
However, if your ultimate goal is to cache modules (that come from a remote origin) to your local storage device and import them from that location rather than use Deno's built-in cache, I recommend using the built-in tool for that: deno vendor. From its manual page:
deno vendor <specifiers>... will download all remote dependencies of the specified modules into a local vendor folder.
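For example, a rough usage sketch (main.ts here is a placeholder for your entry module):
% deno vendor main.ts
% deno run --import-map=vendor/import_map.json main.ts
The vendored files then live under ./vendor, so you can inspect and version them directly instead of hunting for hashed filenames in the global cache.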
Update: Here's an example script demonstrating the method I described above:
so-73596066.ts:
/// <reference lib="deno.unstable" />
import { assertExists } from "https://deno.land/std@0.154.0/testing/asserts.ts";
/*
`deno info --json` is unstable, and I didn't find any mention of schema for its
output in the docs, but here's a (conservative) partial type for the bits
that are relevant to this example, derived from looking at just a couple
of outputs from Deno v1.25.1:
*/
type ModuleInfo =
& Record<"kind" | "local" | "mediaType" | "specifier", string>
& Record<"emit" | "map", string | null>
& {
dependencies?: unknown[];
size: number;
};
type DependencyInspectorResult = {
modules: ModuleInfo[];
roots: string[];
};
/**
* Creates a formatted error message and allows for improved error handling by
* discriminating error instances
*/
class ProcessError extends Error {
override name = "ProcessError";
constructor(status: Deno.ChildStatus, stdErr?: string) {
let msg = `The process exited with status code ${status.code}`;
if (stdErr) msg += `. stderr:\n${stdErr}`;
super(msg);
}
}
/**
* Parses output from `deno info --json`. The resulting command will look like:
* `deno info --json [...denoInfoArgs] specifier`
* @param specifier local/remote path/URL
* @param denoInfoArgs optional, additional arguments to be used with `deno info --json`
*/
async function getCachedModuleInfo(
specifier: string | URL,
denoInfoArgs?: string[],
): Promise<ModuleInfo> {
const decoder = new TextDecoder();
const specifierStr = String(specifier);
const args = ["info", "--json"];
if (denoInfoArgs?.length) args.push(...denoInfoArgs);
args.push(specifierStr);
const { stderr, stdout, ...status } = await Deno.spawn("deno", { args });
if (!status.success) {
const stdErr = decoder.decode(stderr).trim();
throw new ProcessError(status, stdErr);
}
const result = JSON.parse(
decoder.decode(stdout),
) as DependencyInspectorResult;
const moduleInfo = result.modules.find((info) =>
info.specifier === specifierStr
);
assertExists(moduleInfo, "Module not found in output");
return moduleInfo;
}
/**
* `console.log` truncates long strings and deep object properties by default.
* This overrides that behavior.
*/
function print(value: unknown): void {
const inspectOpts: Deno.InspectOptions = {
colors: true,
depth: Infinity,
strAbbreviateSize: Infinity,
};
const formattedOutput = Deno.inspect(value, inspectOpts);
console.log(formattedOutput);
}
async function main() {
const moduleInfo = await getCachedModuleInfo(
"https://deno.land/std#0.154.0/testing/asserts.ts",
);
const { local, specifier } = moduleInfo;
print({ specifier, local });
}
if (import.meta.main) main();
% deno --version
deno 1.25.1 (release, x86_64-apple-darwin)
v8 10.6.194.5
typescript 4.7.4
% deno run --allow-run=deno --unstable so-73596066.ts
{
specifier: "https://deno.land/std#0.154.0/testing/asserts.ts",
local: "/Users/deno/.deno/deps/https/deno.land/ce1734220fb8c205d2de1b52a59b20401c59d0707abcc9765dbeb60c25483df9"
}
The k6 docs make using environment variables look very simple, and I tried following their instructions, but I get a Go error when I try to run it:
ERRO[0000] GoError: parse "https://${__ENV.TARGET_ENV}-api.mycompany.com/v1/managers/259999/properties": invalid character "{" in host name
I don't see an extra bracket anywhere. This script was working fine when I had the url as https://green-api.mycompany.com/v1/managers/259999/properties. Am I possibly missing an import or dependency somewhere? All I am trying to do is get it to the point where, when I type k6 run --env TARGET_ENV=green propertiesScript.js, it executes against https://green-api.mycompany.com/v1/managers/259999/properties. Here is the file:
import http from "k6/http";
import { check } from "k6";
export let options = {
thresholds: {
http_req_duration: ["p(90)<300"], // 90% of requests should be below 300ms
Errors: ["count<100"],
},
};
export default function () {
var url =
"https://${__ENV.TARGET_ENV}-api.mycompany.com/v1/managers/259999/properties";
const params = {
headers: {
"X-App-Token": "<our app token>",
"X-Auth-Token":
"<our auth token>",
accept: "application/json",
},
};
let res = http.get(url, params);
console.log(res.body);
console.log(JSON.stringify(res.headers));
check(res, {
"status is 200": (r) => r.status === 200,
});
}
I also tried adding scenarios and adjusting the body of my file. These are the scenarios:
thresholds: {
http_req_duration: ["p(90)<300"], // 90% of requests should be below 300ms
Errors: ["count<100"],
},
scenarios: {
pod_green: {
tags: { my_custom_tag: "green" },
env: { MYVAR: "green" },
executor: "shared-iterations",
},
pod_red: {
tags: { my_custom_tag: "red" },
env: { MYVAR: "red" },
executor: "shared-iterations",
},
staging: {
tags: { my_custom_tag: "staging" },
env: { MYVAR: "staging" },
executor: "shared-iterations",
},
},
};
Then I edited my export default function and I was able to get that script to run, but it runs against every single environment.
You need to use backticks when you set the url variable: string interpolation with ${...} only works inside template literals. See the documentation.
So in your case you should have:
var url =
`https://${__ENV.TARGET_ENV}-api.mycompany.com/v1/managers/259999/properties`;
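If it helps, here is a small sketch of the surrounding script (the fallback value "green" is purely an assumption for illustration):
import http from "k6/http";

export default function () {
  // Value comes from `k6 run --env TARGET_ENV=green propertiesScript.js`;
  // fall back to "green" if the variable was not supplied.
  const target = __ENV.TARGET_ENV || "green";
  const url = `https://${target}-api.mycompany.com/v1/managers/259999/properties`;
  const res = http.get(url);
  console.log(res.status);
}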
I can't seem to find how to correctly call PutItem for a StringSet in DynamoDB through API Gateway. If I call it like I would for a List of Maps, then I get objects returned. Example data is below.
{
"eventId": "Lorem",
"eventName": "Lorem",
"companies": [
{
"companyId": "Lorem",
"companyName": "Lorem"
}
],
"eventTags": [
"Lorem",
"Lorem"
]
}
And my example template call for companies:
"companies" : {
"L": [
#foreach($elem in $inputRoot.companies) {
"M": {
"companyId": {
"S": "$elem.companyId"
},
"companyName": {
"S": "$elem.companyName"
}
}
} #if($foreach.hasNext),#end
#end
]
}
I've tried to call it with a String Set as listed below, but it still errors out and tells me "Start of structure or map found where not expected" or that serialization failed.
"eventTags" : {
"SS": [
#foreach($elem in $inputRoot.eventTags) {
"S":"$elem"
} #if($foreach.hasNext),#end
#end
]
}
What is the proper way to call PutItem for converting an array of strings to a String Set?
If you are using the JavaScript AWS SDK, you can use the DocumentClient API (docClient.createSet) to store the SET data type.
docClient.createSet converts an array into a SET data type.
var docClient = new AWS.DynamoDB.DocumentClient();
var params = {
TableName:table,
Item:{
"yearkey": year,
"title": title
"product" : docClient.createSet(['milk','veg'])
}
};
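To round out the example, a sketch of actually issuing the write with those params (the error handling is illustrative only):
docClient.put(params, function (err, data) {
  if (err) {
    // e.g. a ValidationException if the item shape is wrong
    console.error("PutItem failed:", JSON.stringify(err, null, 2));
  } else {
    console.log("PutItem succeeded");
  }
});
If you are going through an API Gateway mapping template rather than the SDK, keep in mind that in low-level DynamoDB JSON a string set is an array of plain strings, e.g. "SS": ["Lorem", "Lorem"], not an array of {"S": ...} objects; wrapping each element in {"S": ...} is what produces the "Start of structure or map found where not expected" error.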
I am quite familiar with using Python to connect to an Elasticsearch server, but as I am exploring the R language, I am stuck on the connection part. I usually use this kind of code to connect to the server in Python:
import requests
import json
RM_URL = "http://es-int-client-1.senvpc:9200/rm_201609/_search?timeout=10000"
payload = {
"size": 10000000,
"query": {
"filtered": {
"filter" : {
"bool": {
"must": [
{"term": {"events.id": str(event_id)}},
{"range": {"score_content_0": {"gte": score}} },
{"range": {"published_at": { "gte": str(start_date+"T00:00:00"),
"lte": str(end_date+"T23:59:59")}}},
{"term": {"lang": la}}
]
}
}
}
}
}
r = requests.post(RM_URL, json=payload)
results = json.loads(r.content, encoding='utf-8')
I would be glad if anyone could show me how to do the same in R, thanks!
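A minimal sketch of the equivalent request in R, assuming the httr and jsonlite packages are installed (the elastic package is another option not shown here):
library(httr)
library(jsonlite)

RM_URL <- "http://es-int-client-1.senvpc:9200/rm_201609/_search?timeout=10000"

# Nested lists serialize to the same JSON structure as the Python dict above.
# event_id, score, start_date, end_date, and la are assumed to exist, as in the Python snippet.
payload <- list(
  size = 10000000,
  query = list(
    filtered = list(
      filter = list(
        bool = list(
          must = list(
            list(term = list("events.id" = event_id)),
            list(range = list(score_content_0 = list(gte = score))),
            list(range = list(published_at = list(
              gte = paste0(start_date, "T00:00:00"),
              lte = paste0(end_date, "T23:59:59")
            ))),
            list(term = list(lang = la))
          )
        )
      )
    )
  )
)

r <- POST(RM_URL, body = payload, encode = "json")
results <- fromJSON(content(r, as = "text", encoding = "UTF-8"))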
I am trying to convert my JSON ES query to Java; I am using Spring Data. I am almost there, but the problem is I cannot get the "size": 0 into my query in Java.
GET someserver/_search
{
"query": { ...},
"size": 0,
"aggregations" : {
"parent_aggregation" : {
"terms" : {
"field": "fs.id"
},
"aggs": {
"sub_aggs" : {
"top_hits": {
"sort": [
{
"fs.smallVersion": {
"order": "desc"
}
}
],
"size": 1
}
}
}
}
}
}
In Java I am building a NativeSearchQuery object, on which I think it should be possible to set the size?
NativeSearchQuery searchQuery = createNativeSearchQuery(data, validIndices, query, filter);
es.getElasticsearchTemplate().query(searchQuery, response -> extractResult(data, response));
If you are using Elasticsearch before 2.0, you can use the search-type feature to do what you want, which is to not return docs. This can be accomplished using the NativeSearchQueryBuilder: if you set the SearchType to COUNT, you will not get docs back. Beware that in Elasticsearch 2.x this is deprecated and you should use "size": 0 instead. If the Spring Data Elasticsearch project adds support for Elasticsearch 2.0, this will most likely change and the size attribute should be exposed in this builder as well.
SearchQuery searchQuery = new NativeSearchQueryBuilder()
.withQuery(matchAllQuery())
.withSearchType(SearchType.COUNT)
.withIndices("yourindex")
.addAggregation(terms("nameofagg").field("thefield"))
.build();