Setting up PyDev for use with OpenStack

I am currently looking into OpenStack and want to know what development environment I could use for debugging the different issues that arise when I run it on my VM. By development environment I mean an IDE that can help me understand the functioning and call flow of OpenStack. I have PyDev installed with Eclipse on my machine, but I don't know how to run all the OpenStack daemons (nova-api, nova-compute, nova-network, glance-api, etc.) together in PyDev.
Any help would be much appreciated.
(P.S.: I am trying to avoid pdb for now, as PyDev would let me see the code and my location in it more conveniently...)

I'm not familiar with OpenStack, but anywhere you can use pdb, you can definitely use the PyDev remote debugger the same way: http://pydev.org/manual_adv_remote_debugger.html
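A minimal sketch of how that could look for one of the daemons (the pysrc path, host, and port below are assumptions; check your Eclipse install and the manual linked above):

    # Attach a running OpenStack daemon (e.g. nova-compute) to the PyDev remote
    # debugger. Assumes the debug server has been started in Eclipse
    # (PyDev -> Start Debug Server) and is listening on the default port 5678.
    import sys

    # Path to the pysrc directory shipped with PyDev -- adjust for your install (assumption).
    sys.path.append('/path/to/eclipse/plugins/org.python.pydev/pysrc')

    import pydevd

    # Connect back to the Eclipse debug server; use the host running Eclipse
    # instead of 'localhost' if the daemon runs inside a VM.
    pydevd.settrace('localhost', port=5678,
                    stdoutToServer=True, stderrToServer=True,
                    suspend=False)

    # ...the daemon's normal startup code continues here; breakpoints set in
    # Eclipse will be hit as the service runs.

Dropping a call like this into the entry point of each daemon (nova-api, nova-compute, glance-api, ...) would let them all connect to the same debug server, so you can follow a request's call flow across services from Eclipse.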

It looks like this question was also asked on the OpenStack dev mailing list, in the thread "[OpenStack] Development/Debugging", where there are some answers.

Related

How can you specify your terminal emulator in Corda

Xterm is used when running Corda locally on one computer using Gradle.
Is there a way to specify your terminal emulator when running the nodes, as suggested by the following issue?
https://github.com/corda/corda/issues/2605
I completely share your pain on this. The way runnodes has its tooling baked in makes it impossible to customize how the cordform plugin runs the nodes without digging into the internals.
Some other ideas for you:
One thing you could do is stop using cordform altogether and run your Corda network using dockerform (example here: https://github.com/corda/samples-java/blob/master/Features/dockerform-yocordapp/build.gradle#L93) so that the plugin doesn't need to create new terminals at all.
The much harder way would be to download the Corda Gradle plugins (https://github.com/corda/corda-gradle-plugins#installing-locally) and install them locally with your edits to the cordform task so that it opens the terminal of your choice. You may even be able to PR your changes, as the cordform task that's usually used to generate the runnodes script comes from there, as far as I know.
As a separate note, I saw your GitHub issue and I was disappointed by how it was handled. I'm sorry you had that experience, and I'm going to dig into that issue internally to find out what's happening with it.
Feel free to reach out to me (David Awad) on slack.corda.net and I can let you know what's going on there.
Thanks as always

Shiny as a Stand-Alone Program

I wrote a Shiny app, and now I need to turn it into a stand-alone program. The reason is that I need to share the app but can't do so through shinyapps.io or a server, because the app needs to be able to access users' folders.
So far, I have found these two tutorials: deploying-desktop-apps and packaging-your-shiny-app. Both of them (supposedly) work on Windows, but I have a Mac, and I want the app to be available to users of all systems, or at least Mac and Linux. Any thoughts and suggestions would be appreciated!
I actually tried to follow the tutorial mentioned above and can't even install R-portable on my Mac, so I'm looking for something different.
Running a virtual machine to follow the Windows tutorial is an option, but in that case the app would be Windows-specific, and I don't want that.
This thread is really old, I know, but I'm also trying to find answers on creating a standalone version of R for Mac.
This would add support for
https://github.com/chasemc/electricShine
which currently supports Windows.

How to deploy a Realm Object Server

I'm looking into using the new Realm Mobile Platform for a project of mine. I've gone through the guides and was able to get it up and running locally, no problem. My question is, what's the best way to deploy the Realm Object Server so it can be run remotely? I read through the guide found here but didn't really understand it. I only have minimal experience deploying a Rails app to Heroku. How can I get it deployed to Heroku or a similar service? Any help is appreciated. Thanks!
It's hard to tell you what the "best" way is. There are always drawbacks and benefits to any setup, and everyone has different goals and objectives, so I don't think there is an objective "best way to run it," as you say.
The Realm Object Server doesn't support Heroku for the time being (or at least, no easy one-click-install integration). We know that this is something that people want, so it's on our radar, but I can't give you a definite answer as to when or even if we will do this one day.
The way most people run the Object Server is by running a virtual machine, and running the service inside of that. There are multiple ways to achieve this: start a virtual machine with your favourite cloud provider, and then install the Realm Object Server on top of that. Alternatively, Realm also provides an AMI image, which is Amazon lingo for "a pre-configured virtual machine image," that contains the Object Server pre-installed, and allows you to run your Object Server at the click of a button.
Please bear in mind that Realm Object Server is currently packaged for RHEL/CentOS 6 & 7, and Ubuntu 16.04.
Here are some links that should help you get started:
A basic tutorial on how to setup Ubuntu 16.04 on Digital Ocean
AWS' documentation on launching an EC2 instance from an AMI
Try this image to run the Realm Object Server on OpenShift Online:
https://hub.docker.com/r/viksgyl/realm-object-server/

How to install OpenContrail without OpenStack

I want to understand what magic OpenContrail can do as software-defined networking; I am new to the OpenContrail concept, VMs, etc. To understand this, I just want to install OpenContrail on my Ubuntu VM. I tried to follow the official quick start guide, but it looks like it also installs OpenStack components when I invoke the fabric scripts.
Is it mandatory to use OpenStack to understand the magic of OpenContrail? If yes, why is that so?
Thanks,
Ganesh
You can try to use a simpler set of instructions and use docker containers:
http://www.opencontrail.org/docker-with-opencontrail/
There is also work going on to make it possible to provision OpenContrail with Kubernetes as the cluster management system. Reach out in the #opencontrail channel on freenode.net if you want to try one of these options.

What is a good hosting solution for running node.js with R / Rserve?

I need to run R with Node.js, using Rio (https://github.com/albertosantini/node-rio) as the node binding to Rserve.
I like Heroku, but this seems to push the Heroku envelope beyond what it, or I, am competent with:
I've looked briefly into installing a custom buildpack
https://github.com/virtualstaticvoid/heroku-buildpack-r
to run simultaneously with node.js:
https://github.com/ddollar/heroku-buildpack-multi
This all seems pretty scary. Anyone got any good advice for how best to host this? My app works just fine locally.
http://prgmr.com/xen/
I currently use this solution to run my Node.js server and it's great.
They have wonderful support and their uptime is 100%. I cannot recommend it highly enough, but you will need to know how to set up a simple OS and run it from the ground up.
For example, if you want a server to keep running after you close the SSH connection, you would run it with screen node script.js and detach by pressing [Ctrl] + [A], then [D].
You might already know this, so simply take my advice and view the website.
After some research and recommendations from Heroku, I believe the Heroku solution would be
Use https://github.com/virtualstaticvoid/heroku-buildpack-r
in combination with
https://github.com/ddollar/heroku-buildpack-multi#readme
to build a multi-buildpack setup.
