What is the issue in this Google Maps Engine query? - google-maps-api-3

I need to query some data with a where clause. As per the Google Maps Engine API, I have the request below:
https://www.googleapis.com/mapsengine/v1/tables/14538994882799551513-11853667273131550346/features?where=gx_id%3D900
For the above URL it says BAD REQUEST for where=gx_id=900, and the response is:
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "invalid",
        "message": "The value is invalid.",
        "locationType": "parameter",
        "location": "query"
      }
    ],
    "code": 400,
    "message": "The value is invalid."
  }
}
Please suggest what is wrong with this URL.

You use a number in your query, but this particular gx_id seems to be of type String.
Enclose the value in single quotes:
https://www.googleapis.com/mapsengine/v1/tables/14538994882799551513-11853667273131550346/features?where=gx_id%3D%27900%27

Also, you don't have to worry about creating and keeping track of your IDs. The API says you need to include gx_id only to make sure that no feature is sent twice.
Therefore, you can just throw in a string representing your system's current time, for example.
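For reference, here is a minimal Python sketch of building that request with the quoted value; the table ID is taken from the question, while the API key and the use of urllib are just illustrative assumptions:

import urllib.parse

# Hypothetical sketch: build the features URL with the string literal quoted.
# The table ID comes from the question; YOUR_API_KEY is a placeholder.
TABLE_ID = "14538994882799551513-11853667273131550346"
BASE_URL = "https://www.googleapis.com/mapsengine/v1/tables/%s/features" % TABLE_ID

# gx_id is a String column, so wrap the value in single quotes *before*
# URL-encoding; urlencode then escapes =, ' and the rest for us.
params = urllib.parse.urlencode({"where": "gx_id='900'", "key": "YOUR_API_KEY"})
request_url = BASE_URL + "?" + params
print(request_url)
# .../features?where=gx_id%3D%27900%27&key=YOUR_API_KEY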

Related

CRM Portal: WebAPI: Error while executing WebAPI request: Attribute {0} cannot be found for table {1}

I am getting this response back when trying to execute a WebAPI request, but can't figure out why it's erroring out. Moreover, the error is not very helpful, as it doesn't say which entity or which field is the problem, and I cannot identify any missing fields.
{
  "error": {
    "code": "90040100",
    "message": "Attribute {0} cannot be found for table {1}.",
    "innererror": {
      "code": "90040100",
      "message": "Attribute {0} cannot be found for table {1}.",
      "type": "InvalidAttribute"
    }
  }
}
I tried this myself a few times but got the same errors that you describe here. According to the Microsoft documentation, this is explicitly not supported:
Calling actions and functions using the portals Web API is not supported.

BigQueryInsertJobOperator - required Parameter is missing, but which?

I've been trying to get this operator working for some time since switching to Airflow 2.0: BigQueryInsertJobOperator.
The error I'm seeing suggests something is missing from our connection. Oddly enough, this connection works in another DAG where we use Google's API to access Google Sheets:
export AIRFLOW_CONN_GOOGLE_CLOUD_DEFAULT=
"google-cloud-platform://?extra__google_cloud_platform__project=\analytics&extra__google_cloud_platform__keyfile_dict=
{\"type\": \"service_account\", \"project_id\": \"analytics\",
\"private_key_id\": \"${GCLOUD_PRIVATE_KEY_ID}\", \"private_key\": \"${GCLOUD_PRIVATE_KEY}\",
\"client_email\": \"d#lytics.iam.gserviceaccount.com\", \"client_id\": \"12345667\",
\"auth_uri\": \"https://accounts.google.com/o/oauth2/auth\",
\"token_uri\": \"https://accounts.google.com/o/oauth2/token\",
\"auth_provider_x509_cert_url\": \"https://www.googleapis.com/oauth2/v1/certs\",
\"client_x509_cert_url\": \"https://www.googleapis.com/robot/v1/metadata/x509/d#lytics.iam.gserviceaccount.com\"}"
This is the error I'm seeing:
{
  "error": {
    "code": 401,
    "message": "Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.",
    "errors": [
      {
        "message": "Login Required.",
        "domain": "global",
        "reason": "required",
        "location": "Authorization",
        "locationType": "header"
      }
    ],
    "status": "UNAUTHENTICATED"
  }
}
Is there a way I can look up what else might be required in terms of formatting, etc.? Perhaps a really good example of how to set up the correct connection for this operator?
In my logs I'm seeing this error, which makes me think it might not be a credential issue:
File "/usr/local/lib/python3.8/site-packages/google/cloud/_http.py", line 438, in api_request
raise exceptions.from_http_response(response)
google.api_core.exceptions.BadRequest: 400 POST https://bigquery.googleapis.com/bigquery/v2/projects/vice-analytics/jobs?prettyPrint=false: Required parameter is missing
Create a service account JSON key, which contains all the required info mentioned in your error message:
https://cloud.google.com/iam/docs/creating-managing-service-account-keys
Then you can paste the JSON key into the Airflow UI (Admin -> Connections) in the JSON key field, and reference this connection in your DAG with gcp_conn_id="name of connection you created".
Or add the JSON key as an env variable (on macOS):
export GOOGLE_APPLICATION_CREDENTIALS="path to your JSON key file"
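For example, a minimal DAG sketch might look like the following (the DAG ID, query, connection name and location are assumptions, not taken from the question); the key points are the required configuration argument and gcp_conn_id pointing at the connection that holds the key:

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Minimal sketch; dag_id, query, connection name and location are placeholders.
with DAG(
    dag_id="bq_insert_job_example",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    run_query = BigQueryInsertJobOperator(
        task_id="run_query",
        gcp_conn_id="google_cloud_default",  # name of the connection you created
        configuration={
            "query": {
                "query": "SELECT 1",
                "useLegacySql": False,
            }
        },
        location="US",
    )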

Google calendar RRULE fails when updating but not when initially creating calendar entry

I already have an event entry that I created from my custom calendar with the following RRULE
FREQ=WEEKLY;DTSTART=20201203T090000;INTERVAL=2;BYDAY=TH;EXDATE:20201203T090000
When I update the event without changing the RRULE, I get an error:
Error Message: Google_Service_Exception: { "error": { "errors": [ { "domain": "global", "reason": "invalid", "message": "Invalid recurrence rule." } ], "code": 400, "message": "Invalid recurrence rule." } }
The only thing I can think of is that this rrule has dates in the past. Can anyone confirm that this would be the issue?
FREQ=WEEKLY;DTSTART=20201203T090000;INTERVAL=2;BYDAY=TH;EXDATE:20201203T090000 is not a valid recurrence rule. With all the invalid parts removed the rule would look like this:
FREQ=WEEKLY;INTERVAL=2;BYDAY=TH
DTSTART and EXDATE are separate properties.
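As an illustration (a sketch only; the time zone and end time are assumed rather than taken from the question), the event's start time plays the role of DTSTART in the Calendar API, while RRULE and EXDATE go into the recurrence array as separate entries:

# Hypothetical event body; the time zone and end time are assumptions.
event_body = {
    "start": {"dateTime": "2020-12-03T09:00:00", "timeZone": "Europe/Berlin"},
    "end": {"dateTime": "2020-12-03T10:00:00", "timeZone": "Europe/Berlin"},
    "recurrence": [
        "RRULE:FREQ=WEEKLY;INTERVAL=2;BYDAY=TH",
        "EXDATE;TZID=Europe/Berlin:20201203T090000",
    ],
}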

endpoints-proto-datastore - field should be required on POST but not GET

Let's say I have a Model with two mandatory fields:
from google.appengine.ext import ndb
from endpoints_proto_datastore.ndb import EndpointsModel

class ExampleModel(EndpointsModel):
    attr1 = ndb.StringProperty(required=True)
    attr2 = ndb.StringProperty(required=True)
Then I want to use endpoints-proto-datastore to query on either attr1 or attr2:
@ExampleModel.query_method(query_fields=('attr1', 'attr2'),
                           path='example', name='list')
def example_list(self, query):
    return query
This fails if I only provide one of the fields - in the API Explorer it's a required field, and the API itself returns:
{
  "error": {
    "code": 400,
    "errors": [
      {
        "domain": "global",
        "message": "Error parsing ProtoRPC request (Unable to parse request content: Message CombinedContainer is missing required field attr2)",
        "reason": "badRequest"
      }
    ],
    "message": "Error parsing ProtoRPC request (Unable to parse request content: Message CombinedContainer is missing required field attr2)"
  }
}
Obviously I could mark them as not required, then handle the check within the application code - but I was wondering if someone else had come up with a better solution.
Many thanks
This is an old question, but I ran into the same confusion, and this was the answer I found. Basically, if you want to make something required on POST but not on GET, you need to make a custom proto class, which can only be used with method and not query_method.
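A rough sketch of that idea (the names and structure here are illustrative assumptions, not the library's exact API): keep the ndb model's fields required for writes, and define a separate ProtoRPC request message with optional fields for the GET/list endpoint, exposed through a plain endpoints.method:

import endpoints
from protorpc import messages, remote

# Hypothetical request/response messages; attr1 and attr2 are optional here,
# while the ndb model keeps them required for POST/insert.
class ExampleListRequest(messages.Message):
    attr1 = messages.StringField(1)
    attr2 = messages.StringField(2)

class ExampleItem(messages.Message):
    attr1 = messages.StringField(1)
    attr2 = messages.StringField(2)

class ExampleListResponse(messages.Message):
    items = messages.MessageField(ExampleItem, 1, repeated=True)

@endpoints.api(name='example', version='v1')
class ExampleApi(remote.Service):

    @endpoints.method(ExampleListRequest, ExampleListResponse,
                      path='example', http_method='GET', name='list')
    def example_list(self, request):
        # Filter only on the fields the caller actually supplied.
        query = ExampleModel.query()
        if request.attr1:
            query = query.filter(ExampleModel.attr1 == request.attr1)
        if request.attr2:
            query = query.filter(ExampleModel.attr2 == request.attr2)
        return ExampleListResponse(
            items=[ExampleItem(attr1=e.attr1, attr2=e.attr2)
                   for e in query.fetch(20)])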

Google Prediction API "Hello Prediction" tutorial

I have run through the Google Prediction API tutorials and documentation for "Hello Prediction" - https://cloud.google.com/prediction/docs/hello_world
However, when training the model in the Developers Console, my request fails with the following output:
Request:
POST https://www.googleapis.com/prediction/v1.6/projects/959568262740/trainedmodels?key={YOUR_API_KEY}
{
  "id": "language_id",
  "storageDataLocation": "http://storage.googleapis.com/2341234/language_id.txt"
}
Response:
400 OK
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "invalid",
        "message": "Training data file is empty.",
        "locationType": "other",
        "location": "id"
      }
    ],
    "code": 400,
    "message": "Training data file is empty."
  }
}
I've implemented authorized requests using OAuth 2.0 - is there anything else that I should be doing, or that may have changed between Google Prediction API v1.6 and the tutorial? Any link to an additional tutorial or article on the subject would also be extremely valuable so I can debug this myself!
Your storageDataLocation is not correct.
When you go to the Overview tab in the Google Developers Console, you'll find your Project ID (xxxxx, for example), and the bucket you've stored the file in is called yyyy (for example).
Then set "storageDataLocation" to "xxxxx/yyyy". That should solve the problem: you need the relative path within Google Cloud Storage, not the absolute web path.
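Based on the bucket and file shown in the question (assuming the bucket is named 2341234), the request body would then look roughly like this:

{
  "id": "language_id",
  "storageDataLocation": "2341234/language_id.txt"
}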
