Can I use the Watson Data API with the Cloud Pak for Data Lite plan? - watson-knowledge-catalog

I've created a Cloud Pak for Data Lite account (free version) and added a few assets to a catalog. I wanted to try accessing the metadata of those assets using an API. I thought that I could use the Watson Data API for this, but it doesn't seem to be working. For example, the following request in Postman returns a 404 error:
GET https://dataplatform.cloud.ibm.com/v2/assets/{asset-id}?catalog_id={catalog-id}
Authorization: Bearer {access-token}
I successfully created the API key at https://cloud.ibm.com/iam/overview, used a request to https://iam.ng.bluemix.net/identity/token to get the {access-token}, and I found the {asset-id} and {catalog-id} in the asset's Cloud Pak URL on https://dataplatform.cloud.ibm.com/.
I found this Medium article https://medium.com/@vrvignesh/manage-and-automate-assets-of-cloud-pak-for-data-2-5-using-collect-and-organize-rest-apis-part-1-6b1b07d252e1, which states that I need to have Watson™ Knowledge Catalog and Watson Studio set up. Unfortunately, it seems that I can't set up Watson™ Knowledge Catalog on the Lite plan when I'm already using Cloud Pak for Data on the Lite plan. Except that I thought Cloud Pak uses Watson Knowledge Catalog internally, so maybe it already is set up? I'm confused.
I'm new to the IBM Cloud, so I'm probably missing something simple here.
My main question is whether it's possible to request the metadata of assets in Cloud Pak for Data using an API on the Lite plan.

Yes. The Lite version of Cloud Pak for Data does support API access using the Watson Data API.
The problem was that I was using the wrong API endpoint. The correct base URL is
https://api.dataplatform.cloud.ibm.com
Source: https://cloud.ibm.com/apidocs/watson-data-api
I was able to successfully use the following request
GET https://api.dataplatform.cloud.ibm.com/v2/assets/{asset-id}?catalog_id={catalog-id}
As for the other parts of the question, it does seem that I have an active Watson Knowledge Catalog running as part of Cloud Pak for Data; it shows as active on the Resource List page https://cloud.ibm.com/resources
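For anyone else hitting the same 404, the token exchange plus asset request can be sketched in Python with only the standard library. The IAM host and grant type below are the currently documented ones (the older iam.ng.bluemix.net host from the question also worked at the time); treat the exact hosts as assumptions to verify against the API docs:

```python
import json
import urllib.parse
import urllib.request

IAM_TOKEN_URL = "https://iam.cloud.ibm.com/identity/token"
API_BASE = "https://api.dataplatform.cloud.ibm.com"  # note the api. prefix


def iam_token_request(api_key):
    """Build the POST request that exchanges an IBM Cloud API key for a bearer token."""
    body = urllib.parse.urlencode({
        "grant_type": "urn:ibm:params:oauth:grant-type:apikey",
        "apikey": api_key,
    }).encode()
    return urllib.request.Request(
        IAM_TOKEN_URL,
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )


def asset_url(asset_id, catalog_id):
    """URL for fetching one asset's metadata from a catalog."""
    return f"{API_BASE}/v2/assets/{asset_id}?" + urllib.parse.urlencode(
        {"catalog_id": catalog_id}
    )

# To actually call the API (requires a valid API key):
#   token = json.load(urllib.request.urlopen(iam_token_request(my_key)))["access_token"]
#   req = urllib.request.Request(asset_url(aid, cid),
#                                headers={"Authorization": f"Bearer {token}"})
#   metadata = json.load(urllib.request.urlopen(req))
```

Had I built the URL this way from the start, the missing api. prefix would have been obvious.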

Related

Why use a legacy api for a Firebase FCM notifications sample?

What is the impetus for the author of https://github.com/firebase/functions-samples/blob/master/fcm-notifications/functions/index.js Line 74 to use a Legacy API?
Put another way, is it possible to use Firebase FCM non-legacy API to achieve the same outcome?
I checked with the author of that sample. Back when the sample was created, what is now called the legacy API was the best API available.
When the new v1 API was released, we looked into upgrading the sample to use it. But (as Umar commented) since the new API no longer supports sending to multiple tokens with one call, that upgrade got deprioritized behind some other tasks.
I recommend that you file a bug on the GitHub repo to get the sample updated to use the latest API. I'd also recommend filing a feature request to get "sending to multiple tokens in one call" back into the new FCM API, since it seems like a rather useful feature.
Update
It seems that so-called multi-cast send operations are coming to the V1 API. From an #AskFirebase video about FCM:
We are planning to add a multicast feature to HTTP V1 that will allow you to send to multiple tokens in a single API request.
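Until that multicast feature lands, fanning out over the HTTP v1 API means one request body per token. A minimal Python sketch of that fan-out, assuming the standard v1 message shape (the endpoint URL and field names come from the v1 docs, not this thread):

```python
# One request per device token -- the per-token fan-out the v1 API requires,
# in contrast to the legacy API's single registration_ids list.
FCM_V1_URL = "https://fcm.googleapis.com/v1/projects/{project_id}/messages:send"


def v1_payload(token, title, body):
    """Build one HTTP v1 request body targeting a single device token."""
    return {
        "message": {
            "token": token,
            "notification": {"title": title, "body": body},
        }
    }


def payloads_for_tokens(tokens, title, body):
    """Each payload must be POSTed separately (with an OAuth2 bearer token)."""
    return [v1_payload(t, title, body) for t in tokens]
```

Each payload would be sent as JSON to FCM_V1_URL with your project ID substituted in; the per-request overhead is exactly why multicast is a requested feature.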

Cognitive Search requests with API keys generated in Azure console return 401s

I was previously using a Cognitive Search API key with no issues. Recently, it expired (I assume due to a migration to Azure, but it's unclear).
To get a new API key, I took the following steps:
1. created an Azure account
2. added the Cognitive Search APIs service (with image search, the service I'm interested in)
3. selected the standard package (1k req/month at $3/month, if I recall)
4. created the service
When I attempt to use the new API key, either through curl, my app, or the test console, I receive a 401. I recreated the service and the new API key fails as well.
Thanks.
It's been a few months since you asked this question, but having just had this difficulty myself, I thought I'd share the solution.
If you create an instance of the service in Azure, you can presently create it in a whole host of different regions, and it will create successfully and provide you a key. However, if you look at Azure Services by Region, you'll see that most of the Cognitive Services are only actually available in the West US region.
If you go back to the Azure portal, delete your instance and recreate it in the West US region, I expect you'll be more successful.
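For reference, Cognitive Services requests pass the key in the Ocp-Apim-Subscription-Key header; a key from a service created in an unsupported region is what makes this request come back as a 401. A stdlib Python sketch (the v7.0 image-search endpoint here is illustrative and may differ for your subscription):

```python
import urllib.parse
import urllib.request

SEARCH_ENDPOINT = "https://api.cognitive.microsoft.com/bing/v7.0/images/search"


def image_search_request(query, api_key):
    """Build an image-search GET request; the subscription key travels in a
    header, not the URL, so a 401 means the key/region, not the query."""
    url = SEARCH_ENDPOINT + "?" + urllib.parse.urlencode({"q": query})
    return urllib.request.Request(
        url, headers={"Ocp-Apim-Subscription-Key": api_key}
    )
```

Recreating the service in West US and substituting the new key into this request is the fix described above.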

Error when accessing OneDrive for Business via graph API

I have built an ASP.NET app that successfully accesses O365 Exchange with an app-only token and Graph. I am now trying to access a specific user's OneDrive for Business files with the same token and a GET request similar to the following:
https://mycomp-my.sharepoint.com/_api/v2.0/drives/simon@mycomp.com/items
but I get the following error:
3001000;reason='There has been an error authenticating the request.';category='invalid_client'
Any idea what is the cause of the error?
It's possible to do app-delegated access to OneDrive for Business today using the direct API endpoint and the Sites.ReadWrite.All app-delegated permission scope in AAD. I'd consider it more "in preview" than supported, so Yina's answer is technically correct. I'm still getting the documentation for how to build such an app finalized, but we'll be publishing something soon.
Andrew Connell has a good blog post about how to get this setup, available here: http://www.andrewconnell.com/blog/user-app-app-only-permissions-client-credentials-grant-flow-in-azure-ad-office-365-apis
Use of the OneDrive API is possible using this same method.
App Only access to a user's OneDrive is not supported via Microsoft Graph at this point in time.
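For delegated access (or once Graph support arrives), the equivalent request targets graph.microsoft.com rather than the tenant's SharePoint host. A small sketch of building that URL; the user principal name is a placeholder and the required permission scope depends on your app registration:

```python
import urllib.parse

GRAPH_BASE = "https://graph.microsoft.com/v1.0"


def user_drive_children_url(user_principal_name):
    """URL listing the root-folder items of a user's OneDrive for Business
    via Microsoft Graph; send it with an Authorization: Bearer header."""
    return (
        f"{GRAPH_BASE}/users/"
        f"{urllib.parse.quote(user_principal_name)}/drive/root/children"
    )
```

Quoting the principal name matters because the @ in a UPN is not URL-safe.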

Query HealthKit data via REST API

Is it possible to get data from HealthKit the same way you would query a regular API (with the user's consent) to store in my web app?
Something like: healthkit.com/api/v1/user/GetWeight
If yes, where can I find a list of available methods?
If not, are there any workarounds?
You'll have to build:
1. your own REST API service to store and retrieve the desired data;
2. an iOS app that accesses the data on-device using the HealthKit SDK and POSTs it to your API.
Neither step is trivial. Good luck!
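To make the first step concrete, here is a minimal sketch of such a service using only Python's standard library: an in-memory store behind POST/GET endpoints that the hypothetical iOS companion app would call. The route and field names are invented for illustration, not any HealthKit wire format:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

SAMPLES = []  # in-memory stand-in for a real datastore


class WeightHandler(BaseHTTPRequestHandler):
    """POST stores one JSON sample the iOS app read out of HealthKit;
    GET returns everything stored so far."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        sample = json.loads(self.rfile.read(length))
        SAMPLES.append(sample)
        self.send_response(201)
        self.end_headers()

    def do_GET(self):
        body = json.dumps(SAMPLES).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To run the service:
#   HTTPServer(("", 8080), WeightHandler).serve_forever()
```

A production version needs authentication, persistence, and per-user scoping; the point is only that the web side is an ordinary REST service, with all HealthKit access confined to the device.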
An alternative may be to install Google Fit on the iPhone, which would connect to HealthKit and sync that data to the cloud, which can then be queried via Fit's REST API: https://developers.google.com/fit/rest/
If it's a REST/JSON API you want, it's not available, and I guess it never will be.
HealthKit is just a standard API available in the iOS 8 SDK, accessible from an application running on an iDevice and written in Objective-C/Swift.
Not sure if this question is still relevant for someone, but you now have the option to use Shortcuts to gather the desired health data and POST it to your own API; you can also use automations to make it all happen without any hassle.

BigRquery - RUN_QUERY_JOB

I've installed "bigrquery" like this:
devtools::install_github("hadley/bigrquery")
library(bigrquery)
And I get this error when trying to extract data:
Error: Access Denied: Job triple-xxx-xxx:job_zu6P-qSxxx7DBVICij6_QyDv0: RUN_QUERY_JOB
I've looked here and on the web, and everyone says that you just need two things to extract data from Google BigQuery:
1. Have a project for it (BigQuery enabled).
2. Add a billing address for BigQuery.
I've done that, but still got the problem.
IMPORTANT:
For other packages that interact with Google products (Google Analytics), e.g. RGA, you need to create a client ID (OAuth). Do I need to do this with "bigrquery"?
Can someone update the method to get the data?
Ps. I can get the data in the browser (with the Web Interface provided by Google), but not in R from "bigrquery". I'm using the version hosted on CRAN.
Ps2. I don't want the "authentications" to be stored in the cache; is there a way to make "bigrquery" ask for authentication every time it tries to connect to BigQuery?
I found this issue in this post, but the solution is out of date:
Google App Engine authorization for Google BigQuery
This error means that the user that was running the query was not authorized to run jobs in the project (triple-xxx-xxx). You'd need to add the user that is running the query to the project via the developers console (https://console.developers.google.com/project).
To answer some of your other questions:
You don't need to create a client ID to use BigQuery.
I'm not sure if there is a way to force bigrquery to re-authorize every time. That said, looking at the source code (https://github.com/hadley/bigrquery/blob/master/R/auth.r), you may be able to call set_access_cred with NULL to clear the authentication.
