How to get the state of a remote job in Livy using the Java API

Is it possible to monitor the state of an already running remote job in Livy with the Java API? How can this be done?
I looked over the Livy Java API docs. A JobHandle would let me poll the state of the app; however, the only way I can see to obtain one is via LivyClient.submit, while I need a handle to a job that was already submitted outside of my Java code. I'm afraid the Java API doesn't support this and that making REST calls from Java code is the only option. If anyone has found a way to get information about batch jobs via the Livy Java API, please show the code used.
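For what it's worth, here is a minimal sketch of what such a REST call could look like from Java, assuming a Livy server at livy-host:8998 and an already-submitted batch id of 42 (both placeholders), polling the GET /batches/{batchId}/state endpoint from Livy's REST API:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LivyBatchStatePoller {
    public static void main(String[] args) throws Exception {
        // Placeholders: point these at your Livy server and at the id of the
        // already-submitted batch (ids are listed by GET /batches).
        String livyUrl = "http://livy-host:8998";
        int batchId = 42;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(livyUrl + "/batches/" + batchId + "/state"))
                .header("Accept", "application/json")
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The body looks like {"id":42,"state":"running"}; a JSON library
        // such as Jackson is the robust way to pull out the "state" field.
        System.out.println(response.body());
    }
}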

Related

How To Use Eventarc Locally For Cloud Run?

So I am switching from Cloud Functions to Cloud Run and I am trying to figure out how to run Eventarc locally. I know you can set up an emulator for Eventarc using the Firebase Emulator, but I'm not sure how to have it trigger one of my Cloud Run functions when I write to my local Firestore db. Can someone please let me know how this can be done?
I did see one vague answer here:
Emulation of event-driven design in Cloud Run while developing locally?
But to me this doesn't make sense: if I'm using the local DB and local functions, how would a remote instance work with my local dev environment? If this is possible, please let me know how I can accomplish it. Thanks.
It's not an easy task, and the team is working to make local testing easier. For now, I can share my hack.
First of all, you have to know that Eventarc is roughly a wrapper that creates several resources behind the scenes, in particular a Pub/Sub topic and a push subscription to your Cloud Run service. Because of that, an Eventarc event is nothing more than a POST request with the event content as the body.
For my hack, I have a Cloud Run service, on GCP, that logs the headers and body of any incoming request. I set up an Eventarc trigger with that service as the target, and I fire an event.
I go to the logs, copy the headers and the body of the received event, and create a curl POST request with them.
Then, when I want to test my local service, I reuse that POST request and submit it to my localhost server.
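To illustrate the replay step, here is a minimal sketch in Java; the body and the ce-* header values are placeholders to be replaced with the ones copied from the logging service's output:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class EventarcReplay {
    public static void main(String[] args) throws Exception {
        // Placeholder body: paste the JSON captured from the real event here.
        String body = "{\"value\":{},\"oldValue\":{}}";

        // The ce-* headers follow the CloudEvents HTTP binding; copy the
        // real values from your captured request.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/"))
                .header("Content-Type", "application/json")
                .header("ce-id", "1234567890")
                .header("ce-specversion", "1.0")
                .header("ce-type", "google.cloud.firestore.document.v1.written")
                .header("ce-source", "//firestore.googleapis.com/projects/my-project/databases/(default)")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
    }
}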

Azure Key Vault Linked Service not working: debug fails, trigger succeeds

I created a linked service using Key Vault and then used that linked service in a data linked service (Azure SQL Database). Both linked services tested successfully on their own. I used them in a very simple pipeline; when I debug the pipeline, it fails with an error:
'Invalid linked service reference. Name: '.
This refers to the Key Vault linked service.
When I trigger the pipeline, it works fine. I have published my changes many times, but with no success.
So my basic question is: why does my pipeline fail on Debug but work fine when triggered?
I faced exactly the same problem and performed the following actions:
Saved all existing pipelines
Validated all
Published all
Closed the Data Factory browser window/tab
Logged back into Data Factory
Opened the pipeline again, and debug worked fine. I didn't have to touch the Key Vault configuration. It's most likely down to a cached vault configuration (or a sync issue with the cached vault config).
When a pipeline works when triggered but not in debug, that suggests either a difference between the published version and the version in the UI, or parameters that depend on the trigger.
It's a very strange thing I noticed with linked services in ADF. I selected Azure Key Vault next to the password field, passed the AKV linked service name there, and it worked.
That suggests the JSON definition wasn't picking up the Azure Key Vault reference correctly. My issue has been resolved, though logically I am still unclear about why.
If anyone is looking for a resolution to the same issue, please refer to the link below. Thank you.
Key Vault Linked Service
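For reference, a data linked service whose connection string comes from Key Vault ends up looking roughly like this in the JSON view (AzureSqlDatabase1, AzureKeyVault1 and SqlConnectionString are illustrative placeholders):

{
    "name": "AzureSqlDatabase1",
    "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
            "connectionString": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "AzureKeyVault1",
                    "type": "LinkedServiceReference"
                },
                "secretName": "SqlConnectionString"
            }
        }
    }
}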

How to connect a database server running on a local machine, as a service, to a web application hosted on Pivotal Cloud Foundry?

I am trying to test-run a basic .NET web application on Pivotal Cloud Foundry. The web application uses a MongoDB server hosted on my local machine as its database. At the moment I am limited to using the cloud infrastructure through the Apps Manager alone.
I have read the Pivotal Cloud Foundry docs about user-provided services, but cannot figure out how the connection is actually made. I have also come across other approaches, like MongoDB as a service (beta version), but at the moment I am not allowed access to the Operations Manager. I am looking for an explanation of user-provided services specifically, or of how to implement the service broker API.
I am new to Mongo as well, so any suggestion on making the connection work by tweaking Mongo would help too. Thanks
The use case you describe (a web app in PCF connecting to a resource on your local machine) is not recommended.
You can create a MongoDB instance for development purposes in PCF.
$ cf marketplace
...
mlab sandbox Fully managed MongoDB-as-a-Service
...
You can create an mlab service and bind it to your application. You will then have a MongoDB instance in PCF that you can use for development purposes.
Edit:
In that case a user-provided service might help you: you pass in your remote MongoDB instance's configuration, which you can then read in your application, e.g.:
cf.exe cups my-mongodb -p '{"key1":"value1","key2":"value2"}'
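Once the service is bound, the payload you passed to cups shows up in the VCAP_SERVICES environment variable of the application. A minimal sketch of reading it in Java, assuming the service was created with a single "uri" key, e.g. -p '{"uri":"mongodb://user:pass@host:27017/db"}' (all names are placeholders):

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class VcapMongoConfig {
    public static void main(String[] args) {
        // Cloud Foundry injects bound service credentials as JSON.
        String vcap = System.getenv("VCAP_SERVICES");
        if (vcap == null) {
            System.err.println("VCAP_SERVICES not set - not running on Cloud Foundry?");
            return;
        }

        // Crude extraction of the "uri" credential, for illustration only;
        // a real application should parse the JSON with a library such as
        // Jackson, or use a helper like Spring Cloud Connectors.
        Matcher m = Pattern.compile("\"uri\"\\s*:\\s*\"([^\"]+)\"").matcher(vcap);
        if (m.find()) {
            System.out.println("Mongo URI: " + m.group(1));
        }
    }
}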
You can add your local MongoDB as a CUPS (user-provided) service to your PCF Dev.
Check out the following post.
How to create a CUPS service for mongoDB?

How Do You Call A REST API From Within Watson Conversation?

I am testing out this Android chat application using Bluemix: https://github.com/IBM-Bluemix/chatbot-watson-android
At some point in the conversation I will need to call a REST API/web service to retrieve info about data that has been gathered, and send it back to the user as a chat message.
I don't want to do it from within the Android application, as that approach won't work when I deploy the bot to another platform (e.g. Slack).
Is there a way to call REST APIs from within Watson?
I don't think the Conversation service can do it directly, but can it link to another Bluemix service and use the result of that?
If you are already using some form of middleware, this can be achieved by setting an action tag in the JSON editor of the node that should fire the action. The tag then gets picked up by your middleware listener.
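To make that concrete, here is a minimal sketch of the dispatch logic such a listener could run; the action name lookup_order and the backend URL are made-up examples, not part of the Watson API:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ActionDispatcher {
    // Called by the middleware for every conversation response that
    // carries an action tag set by a dialog node.
    public static String handle(String actionTag, String orderId) throws Exception {
        if ("lookup_order".equals(actionTag)) {
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://my-backend.example.com/orders/" + orderId))
                    .header("Accept", "application/json")
                    .GET()
                    .build();
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            // Feed this back into the conversation context so the dialog
            // can present the result to the user on the next turn.
            return response.body();
        }
        return null; // no action requested on this turn
    }
}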
Alternatively, try the new cloud actions feature that has just been released (https://console.bluemix.net/docs/services/conversation/dialog-actions.html#dialog-actions), which is really simple and easy to use.
I would create a server to mediate the communication between your app (Android) and the Conversation service. This server could call/retrieve the required data before sending the conversation response back to your app.
As you're using Bluemix, you could use Node-RED to do this easily.
Here is an example of an app where I did exactly this.
If you are starting with Watson and Bluemix, I strongly advise trying the Node-RED starter pack. It makes it really easy to integrate Watson services and call REST APIs/web services, and even to integrate with a database.
Here is a starting point for this:
https://nodered.org/docs/platforms/bluemix
Happy coding!

How to send data from a REST API to a Kaa server

How can I send data from a REST API to a Kaa server without using the SDK?
Is the above possible, or can I only push new data using the SDK?
I tried to use the API methods, but I don't know which one is appropriate.
I don't think you would be able to use anything other than the SDK to send data to Kaa. The Kaa SDKs carry implicit information about the schemas used on the Kaa server. You can take a sample SDK and modify it for your own use.
The REST log appender provided with Kaa is very easy to use. If you face any problems, you can search for other questions related to the Kaa REST log appender and you should be able to find a solution. If not, please ask for help with your specific error message or issue. You can also refer to the question asked here