I was wondering if it is possible to generate the interfaces from a git repository location instead of relative directory locations.
I tried
python -m grpc_tools.protoc \
--proto_path=https://github..../../protos/ \
--grpc_python_out=. https://github..../..protos/required.proto
but it doesn't work, failing with the error: https://github..../../protos/required.proto: No such file or directory
This feature is not supported in any protobuf implementation I know of. You'll have to download the indicated proto files, as well as all of their dependencies, by some other means.
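A workable pattern is to fetch the repository (or the individual files) first and then point protoc at the local copy. A minimal sketch, where the repository URL and the protos/ layout are placeholders for whatever your actual repo uses:
git clone https://github.com/example/repo.git
cd repo
# Compile against local paths as usual; imports resolve under protos/.
python -m grpc_tools.protoc \
    --proto_path=protos/ \
    --python_out=. \
    --grpc_python_out=. \
    protos/required.proto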
Setup
I'm following the installation directions in the mirthSync readme, which are to clone the repo. The next indication of usage I can see is in the Examples section, where the CLI is used to "pull Mirth Connect code from a Mirth Connect instance":
java -jar mirthsync.jar -s https://localhost:8443/api -u admin -p admin pull -t /home/user/
I'm assuming that after cloning the repo, one should cd into that directory and then run the java -jar... command with all the appropriate flag values (server, username, password, etc).
Error
After running the CLI command, I get this error:
Error: Unable to access jarfile mirthsync.jar
Question
Where is this mirthsync.jar file supposed to come from? Is there something I need to do in order to generate the mirthsync.jar file?
Generate it via lein uberjar (which creates target/uberjar/*-standalone.jar) or download it from a release.
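A sketch of the build-it-yourself route, assuming Leiningen is installed (the exact jar name depends on the mirthSync version you check out):
cd mirthsync
lein uberjar
java -jar target/uberjar/mirthsync-*-standalone.jar \
    -s https://localhost:8443/api -u admin -p admin pull -t /home/user/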
I'm trying to generate working Python modules from the containerd API .proto files found here: https://github.com/containerd/containerd/tree/master/api.
Unfortunately, containerd's own .proto files contain references such as (in api/events/container.proto):
import weak "github.com/containerd/containerd/protobuf/plugin/fieldpath.proto";
Now, this import is actually located at protobuf/plugin/fieldpath.proto in the repository, as opposed to (vendor/)github.com/containerd/.... A simple -I ... does not work here, because the import uses a "github.com"-absolute path while the corresponding sources aren't located under a matching vendor directory.
Simply copying the sources into vendor/github.com/... causes runtime errors when using the generated Python modules: there are then two separate instances of the same protocol elements, from two different Python modules, trying to register with gRPC under the same protocol element name. The gRPC Python runtime therefore throws an error and terminates.
How can I resolve this correctly when using python3 -m grpc_tools.protoc ...?
After some trial and error I've finally come up with a working solution, described below, which might be helpful for others facing gRPC-based APIs that are more complex than many gRPC examples.
copy the API .proto files into a directory structure that reflects the final desired Python package and module structure. For containerd, this means having everything within a containerd/ directory hierarchy, avoiding github.com/ folders and import aliasing (aliasing will break things).
fix all import statement paths that would either cause module aliasing or won't fit the final desired package structure. A good way to do this is with sed while copying over the proto files (see the sketch after this list). In my case, that means replacing "github.com/containerd/containerd/..." import paths with just "containerd/...".
in the case of vendored .proto files that belong to the gRPC infrastructure itself and for which PyPI packages exist (such as grpcio and protobuf), put them side by side with your API .proto files, but do not vendor them into the API directory hierarchy. Make sure to put them in a directory structure that mimics the package structure of the already available PyPI packages.
use protoc via the python3 interpreter to generate Python modules only for your API .proto files; make sure the supplemental .proto files from grpc and protobuf are on the include path, but do not generate modules for them. protoc already handles this correctly unless you vendor the supplemental .proto files into your API .proto directory ... so don't do that.
make sure that your grpcio and protobuf PyPI packages are recent and roughly in sync; in particular, avoid totally outdated Debian distro packages and install from PyPI ... even if this is a painfully slow process on arm64, where there are no binary wheels for grpcio and grpcio-tools. Symptoms of not doing so include runtime errors about missing grpc or protobuf object fields.
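A condensed sketch of these steps as a shell script; the exact source trees to copy and the sed pattern are illustrative and will need adjusting to the containerd layout you are working from:
#!/usr/bin/env sh
set -e
# Steps 1+2: copy the protos into a containerd/ tree, rewriting the
# github.com-absolute import paths to package-relative ones on the way.
for f in $(find api protobuf -name '*.proto'); do
    mkdir -p "containerd/$(dirname "$f")"
    sed 's|github.com/containerd/containerd/|containerd/|g' "$f" \
        > "containerd/$f"
done
# Step 4: generate modules only for the copied protos; the grpc/protobuf
# supplemental protos stay on the include path but are not compiled.
mkdir -p gen
python3 -m grpc_tools.protoc -I . \
    --python_out=gen --grpc_python_out=gen \
    $(find containerd -name '*.proto')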
The gRPC documentation describes how to use the protoc command line program to compile a *.proto file to a certain language for all languages except for Node.
For Node, it only describes (at the time of this writing) how to dynamically load the .proto files and generate the JS code at runtime, behind the scenes.
Is it possible to use the protoc program to compile proto files to JS directly, similar to other languages?
I've found how to do static code generation for Node on this GitHub page.
Here's a copy of the example they provide:
npm install -g grpc-tools
grpc_tools_node_protoc --js_out=import_style=commonjs,binary:../node/static_codegen/ --grpc_out=../node/static_codegen --plugin=protoc-gen-grpc=`which grpc_tools_node_protoc_plugin` helloworld.proto
grpc_tools_node_protoc --js_out=import_style=commonjs,binary:../node/static_codegen/route_guide/ --grpc_out=../node/static_codegen/route_guide/ --plugin=protoc-gen-grpc=`which grpc_tools_node_protoc_plugin` route_guide.proto
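If the commands succeed, each .proto file yields two modules, one with the message classes and one with the service stubs, along the lines of:
$ ls ../node/static_codegen/
helloworld_grpc_pb.js  helloworld_pb.js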
I'm currently playing around with Jaeger Query and trying to access its content through the API, which uses gRPC. I'm not familiar with gRPC, but my understanding is that I need to use the Python gRPC compiler (grpc_tools.protoc, from the grpcio-tools package) on the relevant proto file to get useful Python definitions. What I'm trying to do is find a way to access Jaeger Query via the API, without the frontend UI.
Currently, I'm very stuck on compiling the proto files. Every time I try, I get dependency issues (Import "fileNameHere" was not found or has errors.). The Jaeger query.proto file contains import references to files outside the repo. Whilst I can find these and manually collect them, they also have dependencies. I get the impression that following through and collecting each of these one by one is not how this was intended to be done.
Am I doing something wrong here? Jaeger's direct documentation on this is limited. Below is my basic terminal session, before including any manually found files (which themselves have dependencies I would have to track down).
$ python -m grpc_tools.protoc --grpc_python_out=. --python_out=. --proto_path=. query.proto
model.proto: File not found.
gogoproto/gogo.proto: File not found.
google/api/annotations.proto: File not found.
protoc-gen-swagger/options/annotations.proto: File not found.
query.proto:20:1: Import "model.proto" was not found or had errors.
query.proto:21:1: Import "gogoproto/gogo.proto" was not found or had errors.
query.proto:22:1: Import "google/api/annotations.proto" was not found or had errors.
query.proto:25:1: Import "protoc-gen-swagger/options/annotations.proto" was not found or had errors.
query.proto:61:12: "jaeger.api_v2.Span" is not defined.
query.proto:137:12: "jaeger.api_v2.DependencyLink" is not defined.
Thanks for any help.
A colleague of mine provided the answer... It was hidden in the Makefile, which hadn't worked for me since I don't use Golang (and it turned out to be more involved than just installing Golang and running it, but I digress...).
The following .sh script will do the trick. It assumes the query.proto file is in a subdirectory relative to the script's location, under model/proto/api_v2/ (as it appears in the main Jaeger repo).
#!/usr/bin/env sh
set -e
# Create a fresh output directory for protoc (it will not create it itself).
rm -rf ./python_out
mkdir ./python_out
# Include paths covering query.proto's imports, vendored in the Jaeger repo.
PROTO_INCLUDES="
-I model/proto \
-I idl/proto \
-I vendor/github.com/grpc-ecosystem/grpc-gateway \
-I vendor/github.com/gogo/googleapis \
-I vendor/github.com/gogo/protobuf/protobuf \
-I vendor/github.com/gogo/protobuf"
python -m grpc_tools.protoc ${PROTO_INCLUDES} --grpc_python_out=./python_out --python_out=./python_out model/proto/api_v2/query.proto
This will generate the needed Python file for query.proto, but the generated module still imports sibling modules for its dependencies, which are missing until you generate them too.
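The missing pieces can be generated the same way. A sketch for the first missing import, assuming model.proto sits under model/proto as in the main repo:
python -m grpc_tools.protoc ${PROTO_INCLUDES} --python_out=./python_out model/proto/model.proto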
I did the following to get the Jaeger gRPC Python APIs:
git clone --recurse-submodules https://github.com/jaegertracing/jaeger-idl
cd jaeger-idl/
make proto
Use the files inside proto-gen-python/.
Note:
While importing the generated code, if you face the error:
AttributeError: module 'google.protobuf.descriptor' has no attribute '_internal_create_key'
Do:
pip3 install --upgrade pip
pip3 install --upgrade protobuf
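To use the generated modules from your own code, it should be enough to put proto-gen-python/ on the Python path, e.g.:
export PYTHONPATH="$PWD/jaeger-idl/proto-gen-python:$PYTHONPATH"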
Recently I've been reading the nginx source code, but I'm confused about how to build it with autoconf. I've tried my best to write a Makefile.am, but unfortunately I failed to write a correct one, so I can't get a configure file. Does anybody know how to write the Makefile.am?
I know how to write a Makefile.am, but you have no need to here.
The nginx source package ships with a ready-made configure script. (As it turns out, and as noted below, nginx's configure is written by hand rather than generated by autoconf, but you use it exactly like an autotools-generated one.)
You don't do that work yourself; the people who write nginx do. When you download the source package, the configure script and the makefile templates are already there along with all the source code.
To build the package, all you have to do is run the configure script to generate correct makefiles for your system, then run make.
Source packages are distributed from http://nginx.org/download/. Assuming
you want nginx 1.10.2 (the stable release at this time), you simply do this
in a suitable working directory:
$ wget http://nginx.org/download/nginx-1.10.2.tar.gz
$ tar zxf nginx-1.10.2.tar.gz
$ cd nginx-1.10.2
$ ./configure
$ make
Then it's built in ./nginx-1.10.2. If you then want to install nginx in your system, continue:
$ sudo make install
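configure also accepts options to customize the build; ./configure --help lists them all. For example, to pick an install prefix and enable the SSL module:
$ ./configure --prefix=/opt/nginx --with-http_ssl_module
$ make
$ sudo make install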
Building any source package that ships a configure script is essentially the same as this.
For full details and variations, do read INSTALLING NGINX OPEN SOURCE.
I wrote an email to nginx's author; he said the configure file was written by hand.