How to get list of installed features in Karaf using REST API? - apache-karaf

I know that from the command line it can be retrieved by running feature:list -i, but is there any REST/JSON API available to fetch this?

You can use Jolokia and Hawtio to retrieve that information quite easily. I believe you can add the Hawtio repository from the native Karaf feature repositories (repo-add hawtio). Then you need to install the jolokia and hawtio features and the Karaf web console. From the Karaf web console alone you can see a full list of features, but I find the Hawtio interface to be a godsend.

A REST API can be installed without the need for Hawtio, which itself uses Jolokia under the hood to access the bundle list.
The Jolokia project provides web applications called agents that serve a REST API. For quick experiments you can deploy the jolokia-war-unsecured WAR into the hot-deploy folder of a running Karaf instance. This installs a REST web service at e.g. http://localhost/jolokia-war-unsecured/ which does not require any authentication.
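As a rough illustration, here is a minimal Python sketch that queries such an agent for the feature list. The MBean name org.apache.karaf:type=feature,name=root and the Features attribute are assumptions that differ between Karaf versions, so check the agent's list output (GET <agent-url>/list) before relying on them.

import requests

# Minimal sketch, assuming the unsecured Jolokia WAR from above is reachable
# at JOLOKIA_URL. The MBean and attribute names are assumptions and vary
# between Karaf versions -- inspect GET <agent-url>/list to confirm them.
JOLOKIA_URL = "http://localhost/jolokia-war-unsecured"

payload = {
    "type": "read",
    "mbean": "org.apache.karaf:type=feature,name=root",  # assumed MBean name
    "attribute": "Features",                              # assumed attribute
}

response = requests.post(JOLOKIA_URL, json=payload)
response.raise_for_status()
features = response.json().get("value", {})

# Jolokia serializes the tabular feature data as nested maps; the field name
# "Installed" is likewise an assumption to verify against your instance.
for name, versions in features.items():
    for version, info in versions.items():
        if isinstance(info, dict) and info.get("Installed"):
            print(name, version)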

Related

Remote apache karaf bundle management via jolokia?

I need to remotely manage the bundles that run on my Karaf instances, ideally via HTTP calls or Python scripts.
I set up my Karaf instance and can access it at http://mykarafserver:8040/jolokia.
I found just one example of usage on the Jolokia website:
{
"type":"read",
"mbean":"java.lang:type=Memory",
"attribute":"HeapMemoryUsage",
"path":"used"
}
and I get a result, but I can't find the URLs or JSON syntax to start, stop, restart and get the status of my bundles. I believe this is possible, since tools like Hawtio can manage Camel and Karaf stuff.
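A hedged sketch of what such calls could look like in Python follows; the bundle MBean name matches the one used in the Karaf/Jenkins answer further down, while the start/stop operation signatures are assumptions you should confirm against your Karaf version's JMX tree (for example via Jolokia's list operation).

import requests

# Sketch only: the MBean name follows the Karaf/Jenkins answer below; the
# operation signatures are assumptions -- confirm them on your instance.
JOLOKIA_URL = "http://mykarafserver:8040/jolokia"

def bundle_exec(operation, *arguments):
    payload = {
        "type": "EXEC",
        "mbean": "org.apache.karaf:type=bundle,name=root",
        "operation": operation,
        "arguments": list(arguments),
    }
    response = requests.post(JOLOKIA_URL, json=payload)
    response.raise_for_status()
    return response.json()

# Example calls, selecting the bundle by ID (assumed signatures):
print(bundle_exec("start(java.lang.String)", "42"))
print(bundle_exec("stop(java.lang.String)", "42"))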

Why do we need to deploy a meteor app instead of just starting it?

As we all know, we can run a meteor app by just typing meteor in a terminal.
By default it will start a server and use port 3000.
So why do I need to deploy it using MUP etc.?
I can configure it to use port 80 or use nginx to route to port 80 for the app. So the port is not the point.
Edit:
Assume meteor is running on a VPS or cloud server with public IP address, not a personal computer.
MUP does a few extra things you can do yourself:
it 'bundles' the code into a single archive, using meteor build
the JavaScript ends up in one file and the CSS in another; both are minified and obfuscated, so they are smaller, faster to load, and harder to decipher on the client.
some packages are also meant to be removed when running in production. For example Meteor Toys, the utility toolset for inspecting collections and much more, is not included in the production bundle, as per the instructions in its package. This ensures you don't deploy code with security vulnerabilities (Meteor Toys basically opens up client-side deletes/updates etc... if you're not careful).
So, in short, it installs a minimal version of your site, making sure that what's meant for development only doesn't get pushed to a production environment.
EDIT: One other reason to do this is that you don't need all the Meteor build tools on your production server; they can add up to a lot of stuff, especially if you keep caches going for a while...
I believe it also takes care of hooking up to a remote MongoDB instance (at least that used to be the case on the free meteor site), which is more scalable and fault tolerant than running the database on the same instance as the web server, as well as provisioning storage etc... if needed.
basically, to deploy a Meteor app yourself manually, you need to:
on your dev box:
use meteor build to bundle your app into a tar file (using the architecture flag corresponding to the OS you will deploy to)
on the server:
install Node v0.10 (or whatever version of Node the current Meteor release requires)
you might have to install Fiber#1.0.5 (but I believe this is now part of the Meteor install already)
untar the bundle, get into bundle/programs/server/ and run npm install
run the server with node main.js in the bundle folder (a scripted sketch of these server-side steps follows below).
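As a rough automation of the server-side steps above, a minimal Python sketch might look like this; the tarball name, deploy directory, and the ROOT_URL, PORT and MONGO_URL values are placeholders for your own setup.

import os
import subprocess

# Placeholders -- adjust to your server layout.
BUNDLE_TARBALL = "app.tar.gz"
DEPLOY_DIR = "/opt/myapp"

# Untar the bundle produced by meteor build.
os.makedirs(DEPLOY_DIR, exist_ok=True)
subprocess.run(["tar", "-xzf", BUNDLE_TARBALL, "-C", DEPLOY_DIR], check=True)

# Install the server's npm dependencies.
server_dir = os.path.join(DEPLOY_DIR, "bundle", "programs", "server")
subprocess.run(["npm", "install"], cwd=server_dir, check=True)

# Run the app; Meteor reads its configuration from environment variables.
env = dict(os.environ,
           ROOT_URL="http://example.com",                # placeholder
           PORT="80",                                    # placeholder
           MONGO_URL="mongodb://localhost:27017/myapp")  # placeholder
subprocess.run(["node", "main.js"],
               cwd=os.path.join(DEPLOY_DIR, "bundle"),
               env=env, check=True)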
The purpose of deploying an application is that you are situating your project on hardware outside of your local machine. For example, if you deploy an application to Heroku, you create a repository on Heroku's systems and that code base is used to serve your application off of their servers.
If you just start an application on your personal system, you will suffer from limited network and resource availability, and your machine's compute time is underused at off-peak hours, since it has to stay available for additional users without having other tasks to work on. Hosting providers provide resources as needed, and their diverse client base allows their systems to work around the clock on a global scale.

How do you push updates to a deployed meteor app that has a filesystem?

I have an app running on my own DigitalOcean VM that I'm trying to play around with to figure out how to run a Meteor production server. I deployed it with meteor build, but now I'm a bit unsure about how to push updates. If I build a new tarball on my own machine, I will lose file references that my users have made to files in bundle/uploads, because the remote filesystem isn't incorporated into my local project. I can imagine some hacky ways to work around this, but besides hosting the files on S3 or another third-party server, is there any way to "hot code push" into the deployed app without needing to move files around on my server?
Am I crazy for wondering what the meteor equivalent of git push/pull is in production, or just ignorant?
You can use dokku (https://github.com/progrium/dokku). DigitalOcean allows you to create an instance pre-installed with dokku too.
Once you've set up your SSH keys and set the environment variables ROOT_URL, PORT and MONGO_URL, you can add that server as a git remote and simply git push to it.
Dokku will automatically build the Meteor app, get it running, and keep it up to date whenever you git push.
I find Dokku very convenient. There are also Flynn and Deis, which can do the same in a multi-tenant environment with many more options.
One thing to keep in mind with this is to push the maintainers of the buildpack repo to keep its Node version up to date; Meteor is a bit overzealous about requiring the latest version of Node and refusing older ones.
Meteor does lack a bit in this department. I can't remember where I heard this, but I believe they intend to add this very popular Meteor deployment package to their library. Short of switching to a more compatible host, I'm not aware of any better solutions.

Apache Karaf deployment in a Jenkins build pipeline

Currently I am trying to improve automation in my test environment which uses Apache Karaf 2.4 and Jenkins.
My goal is, if Jenkins successfully builds the project (and updates a Maven repository), to deploy the new version in Karaf. Karaf and Jenkins run on different machines.
My current solution is to use a feature descriptor, but I am facing a lot of trouble with it: I found no easy way to update feature bundles in Karaf 2.4. As far as I know, there is no single command that updates all bundles of an existing feature.
My current approach is to grep the output of the list command for a specific pattern, extract all bundle IDs, and then run update for each ID. This approach tends to create bugs (it may include bundles that are not part of the feature but match the same naming pattern). So I was wondering if there is a better way to automatically update all of a feature's bundles in one go?
There is another problem, too: when a new feature gets added to or removed from the feature file, I found no elegant way to update it. My current approach was to first uninstall all associated bundles (with grep again...), then remove the repository from Karaf and then place the new version of the feature file in the deployment folder. As this is very cumbersome, I was wondering if there is a better way to do this in Karaf 2.4?
I think deployment got better in Karaf 3, but I am unable to use it because of this bug (link). As I am in a testing environment and my software tends to be extremely unstable, I often have to kill the Karaf process and restart it, which is a big problem in Karaf 3. Hopefully this will be fixed in Karaf 4, but I do not want to wait until it is released.
Do you have any suggestions on how my deployment process could be improved? Maybe there are better ways than my solution (I really hope so!), but I haven't seen them yet.
This is a nice question, as I've been working on something similar just recently. Basically I've got the same setup you have.
You just need to install the latest Jolokia OSGi servlet on top of Karaf to make all JMX operations accessible via REST.
For a showcase I created a Maven plugin (I'm going to publish those sources in a couple of weeks; it might even make it into the Karaf Maven plugin) that installs the current bundle on Karaf via a REST request; it is currently also able to check for existing bundles.
Basically you need to issue the following REST POST (a scripted example is shown at the end of this answer):
{
"type":"EXEC",
"mbean":"org.apache.karaf:type=bundle,name=root",
"operation":"install(java.lang.String,boolean)",
"arguments":["mvn:${project.groupId}/${project.artifactId}/${project.version}", true]
}
You'll need to use the latest Jolokia snapshot version to get around the role-based authentication used in the latest Karaf versions.
For this you'll also need to create a configuration file in etc called org.jolokia.osgi.cfg, which contains the following entries:
org.jolokia.user=karaf
org.jolokia.realm=karaf
org.jolokia.authMode=jaas
For more details also check the issue for it.
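For example, from a Jenkins build step that POST could be issued with a small Python script like the one below; the agent URL, the karaf/karaf credentials and the Maven coordinates are placeholders to replace with your own values.

import requests

# Placeholders: agent URL, credentials and Maven coordinates depend on your setup.
JOLOKIA_URL = "http://mykarafserver:8181/jolokia"

payload = {
    "type": "EXEC",
    "mbean": "org.apache.karaf:type=bundle,name=root",
    "operation": "install(java.lang.String,boolean)",
    "arguments": ["mvn:com.example/my-bundle/1.0.0", True],
}

response = requests.post(JOLOKIA_URL, json=payload, auth=("karaf", "karaf"))
response.raise_for_status()
print(response.json())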

Is it possible to publish a git repository or a zip file to CloudControl directly by REST API?

I want to publish a Maven project to CloudControl via a REST API, not via the command line tool; is that possible? That means creating an app via the REST API and deploying source or binary code via the REST API, just like Heroku, where I can build a URL directly into an app using the REST API https://api.heroku.com/apps/myapp/builds. Thanks very much!
We currently don't provide a way to upload images directly. You'd first have to push to the Git remote, which will build the image. All other steps, like creating an app/deployment, deploying, or adding add-ons, are available via the REST API. You can see examples of the API usage in the Python library that the command line client uses.
