Google Cloud Endpoints: Transcoding messages with protobuf.Any fields

I have a gRPC service that takes as input a message containing fields of type protobuf.Any, and I can't figure out how to write correct JSON input for it. I am running on GKE, with the Cloud Endpoints ESP and my service running in the same pod.
The protos look like:
message AnyArray {
  repeated google.protobuf.Any value = 1;
}

message Metric {
  string metric = 1;
  int64 timestamp = 2;
  double value = 3;
  map<string, AnyArray> metadata = 4;
}
I have tried multiple combinations of input JSON with no luck; most of the time Cloud Endpoints returns "Proto field is not repeating, cannot start list." Failed examples:
{
  "metadata": {
    "sample-key": {
      "value": [1, "one"]
    }
  },
  "metric": "request-count",
  "timestamp": 1528425789,
  "value": 0
}
{
  "metadata": {
    "sample-key": {
      "value": [{
        "@type": "type.googleapis.com/google.protobuf.Duration",
        "value": "1.212s"
      }, {
        "@type": "type.googleapis.com/google.protobuf.Duration",
        "value": "1.212s"
      }]
    }
  },
  "metric": "request-count",
  "timestamp": 1528425789,
  "value": 0
}
Response from ESP
{
  "code": 3,
  "message": "metadata[0].value: Proto field is not repeating, cannot start list.",
  "details": [{
    "@type": "type.googleapis.com/google.rpc.DebugInfo",
    "stackEntries": [],
    "detail": "internal"
  }]
}
Any help would be greatly appreciated.
Thanks!

google.protobuf.Any holds an arbitrary proto message, so your first example (with bare scalar values) doesn't work. For the second example, google.protobuf.Duration has a special JSON mapping, so the following should work:
{
  "metadata": {
    "sample-key": {
      "value": [{
        "@type": "type.googleapis.com/google.protobuf.Duration",
        "value": "1.212s"
      }, {
        "@type": "type.googleapis.com/google.protobuf.Duration",
        "value": "1.212s"
      }]
    }
  },
  "metric": "request-count",
  "timestamp": 1528425789,
  "value": 0
}
Note that you will have to include all possible proto message types for the google.protobuf.Any fields in the proto descriptor set provided to the Cloud Endpoints service; otherwise it will fail to translate unknown types.
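For illustration, a hedged sketch of generating such a descriptor set with protoc (file names here are assumptions; --include_imports pulls in the well-known types your protos import, and any type that may only appear inside an Any at runtime must be compiled in explicitly):

# Bundle metric.proto plus every type that may appear inside the Any
# fields into the descriptor set deployed to Cloud Endpoints.
protoc --include_imports --include_source_info \
  --descriptor_set_out=api_descriptor.pb \
  metric.proto google/protobuf/duration.proto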
Another possible approach is to use google.protobuf.Struct for your metadata, which eliminates the limitation above, but you will then have to convert it to your expected types in your gRPC service.
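A minimal sketch of that Struct-based variant, assuming AnyArray is replaced with the well-known google.protobuf.ListValue (which transcodes to a plain JSON array):

import "google/protobuf/struct.proto";

message Metric {
  string metric = 1;
  int64 timestamp = 2;
  double value = 3;
  // ListValue maps to a free-form JSON array, so heterogeneous values
  // such as [1, "one"] transcode without type URLs.
  map<string, google.protobuf.ListValue> metadata = 4;
}

With this shape, the first failing request body works once the wrapper object is dropped, i.e. "metadata": {"sample-key": [1, "one"]}.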

Using protobuf.Struct instead of protobuf.Any solved my problem, as @lizan recommended. Thanks!

Related

Put Azure Key Vault value in parameter array

I am trying to deploy an App Service web app via an ARM template and need to put a secret from a key vault into an app setting (environment variable).
I have always simply used an array of values from a parameters file to populate these app settings, but now I am struggling to get a Key Vault value into that array. Something like what is shown below in an ARM parameter file.
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "someStringParam": {
      "value": "stringLiteralValueHere"
    },
    "envVars": {
      "value": [
        {
          "name": "envVarKeyName",
          "value": "stringLiteralValueHere"
        },
        {
          "name": "KVsecret1",
          "value": ##KEY VAULT SECRET HERE##
        }
      ]
    }
  }
}
I have tried using a reference to the Key Vault for the value, but that errors on deployment.
{
  "name": "KVsecret1",
  "reference": {
    "keyVault": {
      "id": "/subscriptions/<subscription_id>/resourceGroups/<resource_group>/providers/Microsoft.KeyVault/vaults/<vault_name>"
    },
    "secretName": "secret1"
  }
}
I have also tried using a parameter inside of the parameter file, but that just used the literal string for the value.
"parameters": {
"KVsecret1": {
"reference": {
"keyVault": {
"id": "/subscriptions/<subscription_id>/resourceGroups/<resource_group>/providers/Microsoft.KeyVault/vaults/<vault_name>"
},
"secretName": "KVsecret1"
}
},
"envVars": {
"value": [
{
"name": "envVarKeyName",
"value": "stringLiteralValueHere"
},
{
"name": "KVsecret1",
"value": "[parameters('KVsecret1')]"
}
]
}
}
Is this possible??
EDIT: Adding some detail here.
I am also trying to shoehorn a reference to another resource in order to put the App Insights instrumentation key into an app setting. Below is what I would like to do, but the copy function needs to use the name of the property, and that is dynamic here because it changes with each member of the array from the parameter file.
{
  "type": "Microsoft.Web/sites/config",
  "apiVersion": "2022-03-01",
  "name": "[concat(parameters('backEndwebAppName'),'/appsettings')]",
  "kind": "string",
  "properties": {
    "APPINSIGHTS_INSTRUMENTATIONKEY": "[reference(concat('microsoft.insights/components/', parameters('appInsightsName')), '2020-02-02').InstrumentationKey]",
    "secret1FromKeyvault": "[parameters('secret1FromKeyvault')]",
    "copy": [
      {
        "name": "envVarsFromParams",
        "count": "[length(parameters('backEndEnvVariables'))]",
        "input": {
          "name": "[parameters('backEndEnvVariables')[copyIndex('envVarsFromParams')].name]",
          "value": "[parameters('backEndEnvVariables')[copyIndex('envVarsFromParams')].value]"
        }
      }
    ]
  },
  "dependsOn": [
    "[resourceId('Microsoft.Web/sites', parameters('backEndwebAppName'))]"
  ]
},
This isn't possible today within the parameter file, but in your scenario (if it's as simple as your OP example) you can just union the two in your template. So in your parameter file you have two params, kvSecret (the Key Vault reference) and envVars (all your other env vars), and then in the template use:
"variables": {
"keySecretObj": {
"name": "kvSecret",
"value": "[parameters('kvSecret')]"
},
"envVarsFinal": "[union(parameters(variables('kvSecretObj`), parameters(`envVars`))]"
That help?

Data Factory: JSON data is interpreted as expression - ErrorCode=InvalidTemplate, ErrorMessage=Unable to parse expression

I want to copy items from
CosmosDB databaseA/productCollection
to
CosmosDB databaseB/productCollection
Therefore I decided to use Azure Data Factory.
I also activated "Export as-is to JSON files or Cosmos DB collection".
The read operation works as expected.
Unfortunately, the write operation stops because of an error related to the data:
ErrorCode=InvalidTemplate, ErrorMessage=Unable to parse expression 'Currency'
{
  "ProductName": "Sample",
  "Price": {
    "@Currency": "GBP",
    "$": "2624.83"
  }
}
I'm not able to change the input data itself.
The output data has to equal the input data.
Is there a possibility that @Currency will not be interpreted as an expression?
In ARM, this part is failing:
Price.{@Currency}
I had the same problem and was able to resolve it accordingly.
I am using a Pipeline with a Source that is a Dataset referencing JSON data.
In the dataset's JSON code view (opened via the button highlighted in the original screenshot, not reproduced here),
I had to change the JSON from
{
  "name": "SourceDataset",
  "properties": {
    "linkedServiceName": {
      "referenceName": "StorageAccountLink",
      "type": "LinkedServiceReference"
    },
    "annotations": [],
    "type": "Json",
    "typeProperties": {
      "location": {
        "type": "AzureBlobStorageLocation",
        "container": "test-data"
      }
    },
    "schema": {
      "type": "object",
      "properties": {
        "@context": {
          "type": "string"
        },
        "value": {
          "type": "array",
          "items": {
            "type": "object",
            "properties": {
              "id": {
                "type": "string"
              }
            }
          }
        }
      }
    }
  }
}
to (escaping the @ with @@):
{
  "name": "SourceDataset",
  "properties": {
    "linkedServiceName": {
      "referenceName": "StorageAccountLink",
      "type": "LinkedServiceReference"
    },
    "annotations": [],
    "type": "Json",
    "typeProperties": {
      "location": {
        "type": "AzureBlobStorageLocation",
        "container": "test-data"
      }
    },
    "schema": {
      "type": "object",
      "properties": {
        "@@context": {
          "type": "string"
        },
        "value": {
          "type": "array",
          "items": {
            "type": "object",
            "properties": {
              "id": {
                "type": "string"
              }
            }
          }
        }
      }
    }
  }
}
I tried to reproduce your issue, but it works for me. I used a copy activity to transfer data from account A to account B.
Additionally, if this operation only needs to be executed once, please consider using the Azure Cosmos DB Data Migration Tool. It's free to use. You could export the data from Cosmos DB A as a JSON file and then import it into Cosmos DB B very simply. It can also be executed from the command line, so it can be set up as a scheduled job on Windows.
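If you take the migration-tool route, a hedged sketch of an export/import pair from the command line (flag spellings as I recall them from the tool's documentation; connection strings, database, and collection names are placeholders):

:: Export databaseA/productCollection to a local JSON file
dt.exe /s:DocumentDB /s.ConnectionString:"AccountEndpoint=...;AccountKey=...;Database=databaseA" /s.Collection:productCollection /t:JsonFile /t.File:products.json

:: Import the JSON file into databaseB/productCollection
dt.exe /s:JsonFile /s.Files:products.json /t:DocumentDB /t.ConnectionString:"AccountEndpoint=...;AccountKey=...;Database=databaseB" /t.Collection:productCollection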

How should a "write" request be structured for Firestore REST API (v1beta1)?

Based on the Google Discovery document and the RPC reference, it appears that the :write resource should be available for Firestore database interactions, but performing such a request (POST https://firestore.googleapis.com/v1beta1/projects/[my project]/databases/(default)/documents:write) results in:
[
  {
    "error": {
      "code": 400,
      "message": "Invalid value (Object), ",
      "status": "INVALID_ARGUMENT",
      "details": [
        {
          "@type": "type.googleapis.com/google.rpc.BadRequest",
          "fieldViolations": [
            {
              "description": "Invalid value (Object), "
            }
          ]
        }
      ]
    }
  }
]
Is this possible? A related SO answer alludes to this being available as a means of performing field transforms, the same reason I require this, but I cannot construct a valid JSON body that succeeds in the request. Currently, variations on the following don't work as expected when trying for a minimal successful request:
{
  "writes": [
    {
      "update": {
        "name": "projects/{projectId}/databases/[my project]/documents/exampleId",
        "fields": {
          "example": {
            "integerValue": 100
          },
          "timestamp": {
            "nullValue": null
          }
        },
        "transform": {
          "document": "projects/[my project]/databases/(default)/documents/examples/exampleId",
          "fieldTransforms": [
            {
              "fieldPath": "timestamp",
              "setToServerValue": "REQUEST_TIME"
            }
          ]
        }
      }
    }
  ]
}
First of all, note that you should use the v1 version of the REST API, not the betas.
To create a document, you would use the createDocument method, while to update a document you would use the patch one.
For the document creation you should therefore make a POST HTTP Request to the following URL
https://firestore.googleapis.com/v1/projects/<your-project-id>/databases/(default)/documents/<the-desired-collection>
with the following Request body:
{
  "fields": {
    "example": {
      "integerValue": 100
    }
  }
}
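For illustration, a hedged curl invocation of that createDocument call (the collection name examples and the gcloud-based auth are assumptions; note that int64 fields are canonically encoded as JSON strings):

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d '{"fields": {"example": {"integerValue": "100"}}}' \
  "https://firestore.googleapis.com/v1/projects/<your-project-id>/databases/(default)/documents/examples"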
You need to use documents:commit instead of documents:write.
Also, the name field should be in this format:
"name": "projects/projectID/databases/(default)/documents/collectionName/DocumentId"
See this post.
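Putting the two answers together, a hedged sketch of a documents:commit body that performs both the update and the server-timestamp transform you were after (project ID and document path are placeholders; the transform goes in its own write entry):

POST https://firestore.googleapis.com/v1/projects/<project-id>/databases/(default)/documents:commit

{
  "writes": [
    {
      "update": {
        "name": "projects/<project-id>/databases/(default)/documents/examples/exampleId",
        "fields": {
          "example": {
            "integerValue": "100"
          }
        }
      }
    },
    {
      "transform": {
        "document": "projects/<project-id>/databases/(default)/documents/examples/exampleId",
        "fieldTransforms": [
          {
            "fieldPath": "timestamp",
            "setToServerValue": "REQUEST_TIME"
          }
        ]
      }
    }
  ]
}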

JAGQL - Why do I need an id for a POST call?

I'm using JAGQL to build a JSON API compatible express server. My database behind it is MongoDB (jsonapi-store-mongodb). I posted my question here as well: https://github.com/holidayextras/jsonapi-store-mongodb/issues/59
According to the JAGQL documentation, https://jagql.github.io/pages/project_setup/resources.html#generateid,
I am told that
generateId
By default, the server autogenerates a UUID for resources which are created without specifying an ID. To disable this behavior (for example, if the database generates an ID by auto-incrementing), set generateId to false. If the resource's ID is not a UUID, it is also necessary to specify an id attribute with the correct type. See /examples/resources/autoincrement.js for an example of such a resource.
But when I send a POST request to one of my resources, I get this:
"jsonapi": {
"version": "1.0"
},
"meta": {},
"links": {
"self": "/myresource"
},
"errors": [
{
"status": "403",
"code": "EFORBIDDEN",
"title": "Param validation failed",
"detail": [
{
"message": "\"id\" is required",
"path": [
"id"
],
"type": "any.required",
"context": {
"key": "id",
"label": "id"
}
}
]
}
]
What am I missing?
See here for more details: https://github.com/jagql/framework/issues/106
In your resource definition, you want to add primaryKey: 'uuid':
{
  resource: 'things',
  handlers,
  primaryKey: 'uuid',
  attributes: {
    ...
  }
}
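For contrast, a hedged sketch of the generateId: false variant the docs describe for database-generated IDs (the exact Joi typing of the id attribute is an assumption; see examples/resources/autoincrement.js in the JAGQL repo):

{
  resource: 'things',
  handlers,
  generateId: false,
  attributes: {
    // id must be declared explicitly when it is not a UUID
    id: jsonApi.Joi.number().integer().required()
  }
}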

Request problem with Google Cloud Datastore and Filter

I'm currently doing some tests on Google Datastore, but I'm having a problem with my queries.
If the documentation is to be believed (https://cloud.google.com/datastore/docs/concepts/queries), we can filter on several properties with the EQUAL operator.
But when testing, I get an error from the API.
While searching on Datastore's GitHub, I found this reference: https://github.com/GoogleCloudPlatform/google-cloud-dotnet/issues/304, which corresponds to my problem, except that in my case the query looks correct.
Here is the request sent:
{
  "kind": [{
    "name": "talk.message"
  }],
  "filter": {
    "compositeFilter": {
      "op": "AND",
      "filters": [{
        "propertyFilter": {
          "property": {
            "name": "Conversation"
          },
          "op": "EQUAL",
          "value": {
            "stringValue": "2f16c14f6939464ea687d316438ad4cb"
          }
        }
      },
      {
        "propertyFilter": {
          "property": {
            "name": "CreatedOn"
          },
          "op": "LESS_THAN_OR_EQUAL",
          "value": {
            "timestampValue": "2019-03-15T10:43:31.474166300Z"
          }
        }
      },
      {
        "propertyFilter": {
          "property": {
            "name": "CreatedOn"
          },
          "op": "GREATER_THAN_OR_EQUAL",
          "value": {
            "timestampValue": "2019-03-14T10:43:31.474175100Z"
          }
        }
      }]
    }
  }
}
And here is the response from the API:
Grpc.Core.RpcException: Status(
  StatusCode=FailedPrecondition,
  Detail="no matching index found. recommended index is:
    - kind: talk.message
      properties:
      - name: Conversation
      - name: CreatedOn"
)
According to the documentation, this should be good... but it's not!
What am I missing?
Your query includes both an EQUAL filter (on Conversation) and inequality filters (on CreatedOn), so you need a composite index to fulfil it. The query itself is valid, but Datastore cannot run it until that composite index exists.
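If you manage indexes with an index.yaml (deployed with gcloud datastore indexes create index.yaml), the index recommended by the error message translates directly:

indexes:
- kind: talk.message
  properties:
  - name: Conversation
  - name: CreatedOn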
