How to Clear/Delete BrowserContext folders? (JxBrowser)

Using JXBrowser 6.14...
I'm using a different BrowserContext for every Browser instance, which means I create a temp folder for each Browser instance I have. I decided to remove these temp context folders when the Java application shuts down. The problem is that somehow these context folders are still being used by JXBrowser, so I'm not able to delete them. I've also used deleteOnExit(), but I'm still facing problems with some files.
So I'm wondering: is there a way to clean up all those context files/folders, probably when the Browser is disposed?
Thanks in advance.

Before you start deleting this folder, make sure that you dispose of all Browser instances whose BrowserContext is configured to use it. If you have at least one running Browser instance that uses the folder, you won't be able to delete it.
Also, make sure that you first delete all files inside the folder and then remove the (now empty) folder itself. As far as I know, Java (at least 1.6) doesn't provide an API that deletes a folder together with the files inside it, so you need to delete each file first.
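A minimal sketch of that order of operations, assuming the JxBrowser 6 API (where Browser has a dispose() method; disposeAndClean and deleteRecursively are hypothetical helper names, not part of JxBrowser):

    import java.io.File;
    import java.util.List;
    import com.teamdev.jxbrowser.chromium.Browser;

    public final class ContextCleanup {

        // Dispose of every Browser first so JxBrowser releases its hold
        // on the context folder, then delete the folder bottom-up.
        static void disposeAndClean(List<Browser> browsers, File contextDir) {
            for (Browser browser : browsers) {
                browser.dispose();
            }
            deleteRecursively(contextDir);
        }

        // Pre-NIO.2 Java has no single-call recursive delete, so remove
        // all children before deleting the directory itself.
        static void deleteRecursively(File dir) {
            File[] children = dir.listFiles();
            if (children != null) {
                for (File child : children) {
                    deleteRecursively(child);
                }
            }
            dir.delete();
        }
    }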

Related

In ASP.NET is it possible to store class files in a folder not under app_code?

My searching skills seem to be failing me on this one. It is a simple question:
In ASP.NET, is it possible to store class files in a folder not under App_Code?
What I am trying to accomplish is to create class files which, when added to (or modified in) the web site, will not cause the web site to restart. Anything under App_Code or in the bin folder causes a restart. We are not using .NET for the presentation layer.
My reason is simple, we make changes somewhat regularly, and I do not want to wait until a specific time to add/change a feature. These are 24 hour websites and there is no great time to restart them.
Edit:
I am using FluorineFX to access the middle tier. I created a folder called "ProdCode" at the root of the application. When I try to access the namespace ProdCode, class Employee_Calendar, method getEvents, I get the error "Failed to locate the requested type ProdCode.Employee_Calendar".
Well, the goal of avoiding a restart is hugely different, and HUGELY separate, from that of being able to place code modules or class modules in some other folder.
When you build the project, most of the modules (including the class modules) are crunched down into a single .dll anyway.
So, while you are free to add new folders and put new code/class modules inside them, that may well not eliminate the need for the site to reload, or in fact for the site to recompile the code again.
All of App_Code, and any other code module, will be cranked out into a single .dll file. So I don't see how you are going to gain or win anything new here.
What you could perhaps do is build some classes outside of the project, compile them, and then set a reference to the external code (and class) modules from the other project; that would be an external .dll. This works during debugging, but an IL merge usually occurs when publishing a non-debug build, and thus the gazillion .dll's are merged into one.
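For example, if the consuming project is a Web Application project with a .csproj, a reference to such an external assembly would look roughly like this (ProdCode.dll and the libs path are illustrative, not taken from the question):

    <ItemGroup>
      <Reference Include="ProdCode">
        <HintPath>..\libs\ProdCode.dll</HintPath>
      </Reference>
    </ItemGroup>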
So, separate out code - great idea.
Adding code to additional folders - sure - no problem (right-click the given class or code module and make SURE the build action is Compile; this is the default for App_Code, but for other folders I can't remember, so you need to check this).
So, up to this point? Hey, all great.
But to save site recompile time? No, this is where the train flies off a big broken bridge and the whole shebang crashes in a HUGE ball of flames.
You might be able to save some time during debugging, but those included modules are pulled into the "main" .dll (same name as your application). Go check the "bin" folder now: you will not see separate App_Code .dll's, only one main .dll with the name of your project.
Such recompile time is usually rather fast, as is the site reload. If you are just changing markup, then fine. But the idea that you want to include compiled code while attempting to avoid a reload? No, I would not consider this, even if you could! I mean, how many times have you seen code fail or NOT take effect after a publish because of FORGETTING to restart the web server? Those .dll's are often loaded into memory, locked, and 100+ more issues exist. Many a time I lost half a day because my .dll's did not take (due to me not restarting the web server). There is pain, and then there is this kind of "pain" in which parts of your application don't load. I just can't imagine the risk vs. reward in trying to save some time - I must be missing something here?

LabVIEW Virtual Folder vs Auto-populated Folder

In a LabVIEW project, when should I use virtual folder and when should I use auto-populated folder? Why?
You don't have to use either. Your project could just have a main VI and have everything else handled automatically as a dependency.
Personally, I would suggest viewing the project window as a logical organization tool for your work and say that you should only put the things you actually need or want to access from there.
The conclusion from this is that you should generally not use auto-populated folders, as you don't need them. If you want to see the disk hierarchy, you can go to the Files tab in the project window.
You can use virtual folders for your logical organization. For example, you might want to have support files for your builds (like an icon for the EXE) and putting those in a folder cleans up the project. Another example might be that you have a library (like a class) and you want to group subitems in that library into some logical groups. You can create virtual folders inside that library.
The one place where I do use auto-populated folders is when I have some dynamically loaded VIs, or another list of files, which are placed in the same folder. Adding that folder to the project as auto-populated allows it to be handled cleanly.
It is your choice. How do you want to manage your files?
For myself, the answer is “never ever use auto-populating folders.” Those folders do not play well with libraries or classes (by design, not because of a bug), and they make it hard to remove items from a project while keeping the files on disk (because deleting from the folder automatically deletes from disk). There are plenty of others who agree with me. That said, auto-populating folders are fine for simple apps that don’t use any modern software design tools.

Let 3rd party work on Symfony design without access to complete project/source

I am working on a Symfony 2 WebApp. The WebApp has been online for about two years, now I would like to update the design. This work should be outsourced to a 3rd party.
Of course the designer needs access to all styles (sass files) and templates (twig files) to work on the design. How can I do this, without giving him access to the complete rest of the project as well?
At first the answer seems obvious: create a user account (e.g. FTP) that allows access only to the style/image/template folders.
The problem is that I do not have a root server on which I could configure access for individual folders. The access control available to me is quite limited: an FTP user can only be restricted to a root folder, but then has access to all of its child folders; SSH users cannot be restricted to any folder at all.
Setting up a root server with full access control to let the designer do his job would of course be possible. However, I consider that solution overkill.
Another solution would be to create a special branch of the project with all important/confidential source code removed. The branch would still have to be usable, though with a limited feature set. This would be possible, but it is more work than doing the design on my own.
Long story short: is there any standard way of letting 3rd parties work on dedicated parts of a Symfony project without giving them access to the whole project?
EDIT:
Of course the designer needs some kind of access to a running instance/copy of the web app. Giving him a standalone copy of the templates/styles folder would be possible and secure (no other code is made public), but in this case it would be impossible to view the result of changes.
Taking into account the comments under the question, I would suggest creating an FTP user whose access is restricted to only his home directory.
Then put all the directories he needs in there. Something like this:
/ftpuser_home
/views
/web
/sass
/anything-other
The last step is to symlink each of these directories into the running project instance directory under the appropriate paths, like:
/ftpuser_home/views => /var/www/symfony/app/Resources/views
And so on.
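For example, on a typical Linux host that could be done by moving each real directory into the FTP home and symlinking it back into the project (paths as in the sketch above):

    mv /var/www/symfony/app/Resources/views /ftpuser_home/views
    ln -s /ftpuser_home/views /var/www/symfony/app/Resources/views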
This way they can work with your project. Once they're finished, you simply remove the symlinks, physically move the directories back into the project directory, and commit the changes to git.
Note: it could look easier to do it vice versa, i.e. to create the symlinks in /ftpuser_home rather than in the project directory, but you would probably come across permission issues.
If you're not afraid of those and know how to solve them, then it may work even better.

How can I skip deleting a folder but still sync the folder contents in a Web Deploy (msdeploy) sync?

Let's say I have a folder 'content' that (for some reason) may or may not be present in the source folder tree for a Web Deploy sync operation.
With what skip setting(s) can I have Web Deploy sync the contents of this folder, regardless of whether it exists (I can use another skip setting to control which files are synced), without also deleting the folder in the sync target when it doesn't exist in the sync source?
[Note – the line breaks in the example commands are purely cosmetic.]
With this command, Web Deploy will attempt to delete the 'content' folder in the sync target (and fail if there are files in it):
msdeploy.exe
-verb:sync
-source:dirPath=%source%
-dest:dirPath=%target%
-skip:skipaction=Delete,objectname=filePath,absolutepath=\\content
With this command, (it appears that) Web Deploy will skip deleting the 'content' folder in the sync target, but it will also fail to sync any files within that folder (which is eminently reasonable, actually):
msdeploy.exe
-verb:sync
-source:dirPath=%source%
-dest:dirPath=%target%
-skip:skipaction=Delete,objectname=dirPath,absolutepath=\\content
It's perfectly acceptable if there is in fact no way to do this! (But I'd like some details or references about why that would be.)
Some possible solutions:
Have a look at the -enableRule:DoNotDeleteRule switch (see rule descriptions).
You may need to include both of those skip params, per this blog post:
...delete rules on a child are only processed if the parent is not
being deleted. So if you skip a file but its containing directory
doesn't exist on the source, the directory (and thus the file) will be
deleted anyway.
Also, remember the absolutePath param takes regex so some chars (like .) should be escaped.
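Putting both skip rules together, the command might look like this (a sketch, untested; the trailing regex on the filePath rule is an assumption about matching the folder's children):

msdeploy.exe
-verb:sync
-source:dirPath=%source%
-dest:dirPath=%target%
-skip:skipaction=Delete,objectname=dirPath,absolutepath=\\content
-skip:skipaction=Delete,objectname=filePath,absolutepath=\\content\\.*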
This doesn't seem possible. If a skip-delete rule prevents a folder from being deleted, then none of the child files in that folder will be deleted either, and thus the child files won't be (completely) synced.
Thinking about this more, especially in light of jkoreska's answer, I think the solution I adopted isn't terrible. Basically, the problem is that the source and target root folders may each contain some subset of a set of folders; say, for example, the full set is bin, content, and templates.
The reason I want to sync the contents of a folder that might not exist in the source (or target) is that I'd like to use the same Web Deploy command(s) for any number of source and target instances.
My solution was to simply guarantee that the source instances always contain the full set of folders, and thus all of the targets will too, after at least one execution of the Web Deploy command(s).

Making plugin folder writable in ASP.NET

I'm using MEF in a plugin-based ASP.NET application. By wiring up a FileSystemWatcher to call Container.Refresh(), any new plugin is loaded automatically upon being copied to the plugin folder.
The problem is when I want to delete or replace a loaded plugin: it is locked by w3wp and cannot be deleted. I read about shadow copying but cannot find a good example or article.
Try adding the plugin folder to AppDomainSetup.ShadowCopyDirectories. This property is a semicolon-separated list of directories containing assemblies that should be loaded via shadow copies.
Normally you also need to set AppDomainSetup.ShadowCopyFiles to "true", but I think this is already the default for ASP.NET app domains.
However, be aware that loading a new version of a plugin will not magically unload the old version. The only way to do that is to unload the AppDomain containing it. Since this requires you to load plugins in a separate appdomain, this is probably more trouble than it is worth.
It is probably simpler, safer and more effective to just stop the service, update the DLLs, and restart.
Make sure you are cleaning up all your unmanaged resources properly. It sounds like you may have opened a file stream but didn't properly close/dispose of it; that can leave the file locked by the process that was working with it in the first place. More info on the using statement here: http://www.blackwasp.co.uk/UsingStatement.aspx
