I've been exploring the problem of how to write a suite of small utilities and serve them up together. It's like writing multiple little .py files, each of which gives us a Panel app, but I wanted to serve them all via a single Docker container with a single entrypoint.
Voila gives us the ability to serve up multiple notebooks by taking advantage of its Jupyter server extension; is something analogous possible with Panel? For example, I'm wondering whether I could run panel serve . [--options] to serve up all the .py files in a directory?
h/t Philipp Rudiger, the lead developer of Panel, who pointed me to this answer:
Use panel serve src1.py nb1.ipynb ... to serve multiple apps simultaneously.
You might want to provide your own index page, since the default Bokeh one isn't too pretty.
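For concreteness, here's a sketch of what that looks like on the command line. The file names are hypothetical, and if memory serves the --index flag is inherited from the underlying Bokeh server:

    # Each app is served at its own path (/app1, /app2, /nb1):
    panel serve app1.py app2.py nb1.ipynb --port 5006

    # Shell globbing covers "all .py files in a directory", and
    # --index swaps in your own landing page:
    panel serve apps/*.py --index index.html

In a Docker image, that second command is exactly the kind of thing you'd put in the single ENTRYPOINT/CMD.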
Related
In Perforce, I notice that my workspace is linked to a specific directory (location) on my local hard drive. Is it possible to change the location of this mapping for each file? For example, I have two scripts in two completely different local directories -
C:/File1.pl & D:/File2.pl
And I want to map these two scripts under the same folder in Perforce. Is this possible?
The root directory must be the same for all files in a single workspace.
However, you can define multiple workspaces, one which resides on your C:\ disk and one which resides on your D:\ disk.
Generally, a single workspace is used for a single project, and generally all files for a single project are located together in a single area of your workstation. I'm having trouble thinking of a scenario in which you'd want to have files be part of a single project, and yet stored in various places scattered around your workstation. Can you explain your scenario further?
There are techniques (the SUBST command, Windows junction points, etc.) that can be used to create aliases for files on a different disk, but given what you've described, using multiple workspaces seems like the clearest approach to me.
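That said, if you really do want a single workspace, here is a hypothetical sketch of the junction-point technique: keep one workspace root on C: and graft the D: directory into it. mklink /J ships with Vista and later; on older Windows, the Sysinternals junction tool does the same job. I believe Perforce then treats the junction like an ordinary directory, though it's worth testing.

    :: Create a junction inside the workspace root that points at the other disk:
    mklink /J C:\p4workspace\d-scripts D:\scripts

    :: Both files now live under one root as far as Perforce is concerned:
    ::   C:\p4workspace\File1.pl
    ::   C:\p4workspace\d-scripts\File2.pl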
I'm very new to the world of git (done some svn in the past) and would like some advice on trying to accomplish the following.
My current workflow is that I set up the static HTML files using Middleman to get the base HTML structure and styles before porting over to a WordPress template. These static files are located at C:/git/project-name/HTMLTemplates.
My WordPress setup uses XAMPP, so the theme files are kept in C:/Xampp/wordpress/wp-content/themes/project-theme.
What I would like to do is have a single git repo that tracks the changes in the two different locations (HTMLTemplates and project-theme).
Is this at all possible, or do I simply create two individual repos (e.g. project-static and project-wordpress)?
No, there is no mechanism in git for this. Git assumes that all files it manages (the "working copy") live in a single directory (and its subdirectories); there is no support for managing two separate directories in one repo.
So you'll have to somehow keep everything in one directory, probably as subdirectories HTMLTemplates and theme or similar.
You could use two git repos, but I'd strongly advise against this. A single repo should contain a whole "project", i.e. everything needed to build one piece of software (excluding things like external libraries). If you split your project across two repositories, you cannot usefully branch and merge (because you'd have to do it in both repos simultaneously), you cannot easily check out old versions, etc.
To solve your problem, I see a few possible solutions:
Have some build / deployment script that copies everything to the right places. You probably already have a script that invokes Middleman, and possibly tells WordPress to refresh its cache, so you could add the copy step there.
Set up a symbolic link for the wordpress directory. On UNIX-like systems this is easy and commonly done. On Windows, you can create "junction points", which I believe work similarly.
Configure Wordpress / Apache to read the directory directly from your git working copy. The path should be configurable.
I would prefer the first solution; this has the added advantage that it will decouple your development environment from the server configuration. This will make it easier if your setup later changes or your project needs to run in a different environment (development on a different machine, someone else also wants to work on your project, you want to deploy to a hosted server somewhere etc.).
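A minimal sketch of the first approach, assuming the single-repo layout suggested above (a theme subdirectory next to HTMLTemplates) and the XAMPP path from the question; robocopy's /MIR flag is one reasonable choice, not the only one:

    :: Build the static files, then mirror the theme into the XAMPP WordPress tree.
    cd C:\git\project-name\HTMLTemplates
    call middleman build

    robocopy C:\git\project-name\theme C:\Xampp\wordpress\wp-content\themes\project-theme /MIR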
Note: The problem, I believe, is that you are trying to use git as a deployment tool. While many people do this, git is not really suited to the purpose. Deployment should usually be a separate step.
I need to deploy a website from SVN to several servers, all within our own network. The code is currently not compiled, but probably will be in the future.
First, the site would need to be deployed to the development server for the developers to test.
Once the Developer signs off, it would be deployed to the staging server for the testers.
Once final sign off was given it would be deployed to a server farm- two live servers.
Each server has a couple of settings in the web.config that are different (except the two live servers, of course). I would like to use templates, the way the Ruby on Rails world does; it seems like an elegant solution to multiple web.config files.
I also need to create a list/report of the files that were changed and what the change was since the last deployment.
I am thinking of writing a script that will do the following (sketched after the list):
1. Take args for server to deploy to, and revision
2. Export a copy of the source to a directory with svn export -r <deploy revision>
3. Delete the web.config file
4. Use ttree (a template tool http://template-toolkit.org/) to create the correct web.config
5. Create a list of file changes with svn diff --summarize -r <current server revision>:<deploy revision>
6. Store the <current server revision> of the website for when the script is run next time
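For what it's worth, here are those six steps as a shell-script sketch. The paths, repo URL, and template name are all hypothetical, and I've used tpage (Template Toolkit's single-file companion to ttree) purely because it's easier to show in a few lines:

    #!/bin/sh
    # Usage: deploy.sh <server> <revision>
    SERVER=$1
    REV=$2
    REPO=http://svn.example.com/website/trunk
    LAST=$(cat "last-deployed-rev.$SERVER")   # revision currently on the server

    # Steps 1-2: export a clean copy of the source at the requested revision.
    rm -rf ./export
    svn export -r "$REV" "$REPO" ./export

    # Steps 3-4: replace web.config with one generated from a template.
    rm ./export/web.config
    tpage --define server="$SERVER" templates/web.config.tt > ./export/web.config

    # Step 5: list what changed between the deployed revision and this one.
    svn diff --summarize -r "$LAST:$REV" "$REPO" > "changes-$SERVER-$REV.txt"

    # Step 6: remember what we deployed for next time.
    echo "$REV" > "last-deployed-rev.$SERVER"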
The problem I have is that this doesn't seem like the most elegant solution. It could become unmaintainable, and I prefer to use tools that are already available rather than reinvent the wheel. Unfortunately I don't think MSDeploy will do what I need, but I'm happy to use it, or anything else, if it does what I need it to. Does anyone know of any tools that are up to the task, or is the script my only option?
Check out TeamCity. I have my build server set up so that it can deploy to different environments with different settings based on the build configuration, all in "one click". It's relatively painless to set up and integrates directly with Subversion and other source control systems. This would be a more elegant solution to the issue you are dealing with...
I've seen several other posts similar to this (namely https://stackoverflow.com/questions/5237/solutions-for-working-with-multiple-branches-in-asp-net) but there are several issues that I have that seem to be different than other similar posts.
I have an ASP .NET application that uses a virtual directory off of localhost. There are several spots in the code where I need to reference the name of the virtual directory, so it needs to be in place and named correctly for the code to work. I'm also using my httpd.conf file to format my URLs and avoid cluttered query strings.
That being said, I just published my application and now need to create a branched environment for bug fixes, for whenever there is a bug in the live code and I don't want to upload the dev code.
The trouble is that I need to be able to easily run my branched code in parallel with my dev code, without doing a bunch of work with IIS and config files every time I want to load the branched code. The constraints are that the parallel environment needs to have the virtual directory in place and work with the same httpd.conf (for URL formatting).
I don't think Cassini would work, because I need SSL, and of course...the httpd.conf and the virtual directories would still need to be in place.
The perfect solution in my mind would be to run a website parallel to localhost with the same httpd.conf and the same virtual directory...but I'm running XP Pro, which doesn't "do" multiple websites.
Have your build process create the virtual directory each time the build is run.
I've used NAntContrib's mkiisdir task for this.
With this approach you can't run multiple branches simultaneously, but you can quickly switch between branches by building the branch you want to run.
I would do as above, but hook it into your solution's post-build event; this wouldn't be parallel so much as a quick switch. I think there's a registry hack out there to get multiple sites in IIS, or, if memory serves, creating an additional site through a script works; it's just the GUI that's locked down. The better solution would be to upgrade to Windows Server and have different branches build to different ports.
I need to download two Excel files onto the client, and then run a (diff) executable against them. I know how to download a single Excel file, from here. But how do I download a second one automatically in succession? And then how do I run a batch command on them? Is this even realistic? Any guidance or pointers would be greatly appreciated.
Thanks,
Mike
To download multiple files at once you have two main options:
1) Just open multiple windows to your page generation script to download multiple files as per http://www.webdeveloper.com/forum/showpost.php?s=b4f6b25edeb6b7ea55434c4685a675fe&p=950225&postcount=6
2) Archive the files into a package (zip/arj/7z, etc.) and send the archive to the client; a minimal sketch follows the list.
e.g. http://www.motobit.com/tips/detpg_multiple-files-one-request/
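The sketch for option 2, assuming a command-line zip is available wherever the files are generated (file names are hypothetical):

    # Bundle both generated files into one archive, then send bundle.zip
    # as a single download (Content-Type: application/zip).
    zip bundle.zip report_a.xls report_b.xls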
As for doing the diff client-side, that is a lot more tricky, as Shhnap has already mentioned. If you are doing this for a controlled client base, you may be able to get them to allow permissions for an ActiveX script that runs something client-side (or fires off a console application) - but if you don't have fine control over the client environment, then I can't think of a way to do it.
As Shhnap suggested, can you not just do the comparison server-side (and then send the result to the client as a third file)?
Well, just some pointers, because I'm not sure I completely understand the problem. You want a user to be given two downloads at the same time and then run a diff command against those two files? On the server or the client, I'm not sure. You'll have a lot of problems automating the client-side version, because forcing people to run client-side code is usually frowned upon by virus-protection software.
The server-side diff sounds exactly like a CGI moment to me: http://www.cs.tut.fi/~jkorpela/perl/cgi.html. That will let you generate a web page that shows the diff between the two files. CGI allows you to run programs on your server and display their output in a web page; that's the simple explanation.
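A minimal CGI sketch of that idea (the paths and the comparison tool's name are placeholders - a plain text diff won't say anything useful about binary .xls files, so substitute whatever diff executable you had in mind):

    #!/bin/sh
    # diff.cgi: run the comparison tool server-side and return its output.
    echo "Content-Type: text/plain"
    echo ""
    /usr/local/bin/excel-diff /srv/files/report_a.xls /srv/files/report_b.xls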
If that was not quite what you wanted, feel free to leave me a comment and I'll try to edit my answer accordingly.