Linux /var directory vs /opt directory

From my limited experience with Linux, I find that a lot of third-party programs are installed by default to:
/opt
One program in particular, torque, installs by default to:
/var/spool/torque
I can override this with
./configure --with-server-home=/opt/torque
Is there any reason I should leave the default install directory alone, and not move it to
/opt/torque?
I want someone who knows Linux best practices to suggest where they think I should install it.
Update
I've researched the Filesystem Hierarchy Standard (FHS). It seems like /opt and /var/opt are both valid locations for installing programs. I'd still like to know what is considered 'best practice', or even just the most common location, for installing applications.

There is more than one 'correct' place to install programs.
According to the Filesystem Hierarchy Standard, user programs are frequently installed to /opt or /usr/local.
While there is nothing wrong with /var/spool/torque as your install directory, it does not follow the FHS.

This actually depends. If the program is a spooler, then it is installing to the correct location (that is, /var/spool). If not, you should change it.
As for /opt vs /var/opt: in modern systems some root directories and their /var counterparts are treated as equal, and in many distros they are actually symlinked to one another (/var/run -> /run, for example). Note that the FHS intends /opt for a package's static files and /var/opt for its variable data.
So it's pretty much your choice. Personally, I choose to install under /var because many servers have it mounted on a different disk for data security and redundancy, and I would like to make their lives easier!
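If you want to check how your own distro treats these directories, two quick commands (illustrative; output varies by system) will tell you:
ls -ld /var/run    # on many modern distros this is a symlink to /run
df /var            # shows whether /var sits on its own filesystem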

Related

Is there a convention for git version control between multiple operating systems in R?

Apologies if this isn't an appropriate question for SO - if not, please let me know and I'll delete/move it. I just haven't found any resources on this myself. Anything I google related to "multiple operating systems git" gives me pages for applications that work on multiple OSes, like GitHub or Tower.
I currently work regularly between two operating systems - a PC at the office, a Mac at home. I've been managing this with git by using my master branch for PC/Windows R code, while using an OSXversion branch for Mac R code. This is fine whenever I'm updating Windows- or Mac-specific code on each branch (such as package installation instructions in the comments). Where this gets tricky is with general improvements to my code that apply to both Mac and PC. What I've been doing is manually copy-pasting any general improvements between my Mac/PC code, or cherry-picking my merges. Is there a better way to be doing this?
It's fine to store code that runs on different operating systems in a Git repository. Simply check out the repository on each of the operating systems you're going to be working on.
The only thing you need to watch out for is line endings. Unless you're dealing with files that specifically require native CRLF / LF end-of-line styles, you're best off turning the automatic conversion off.
This can be done with:
git config --global core.autocrlf false
Further notes on autocrlf can be found on the GitHub help page itself.
As for your actual committing, you'll want to be following Git Flow: have a develop branch that bases off of master, and create individual feature branches from it. These feature branches can be worked on whilst on either Windows or Mac. The package installation instructions should really go in your README.md file.
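A minimal sketch of that layout (the feature branch name is hypothetical):
git checkout -b develop master
git checkout -b feature/general-improvements develop
# ...commit your cross-platform changes here, then fold them back in:
git checkout develop
git merge --no-ff feature/general-improvements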

deploying a Qt application

In a nutshell, the question is: I just finished my first application using Qt Creator on a computer running Linux Ubuntu; now how do I make it available to everyone? Now follows the more detailed version ;)
I must apologize for asking this, I am aware that this question has probably been asked many times and that there is official documentation that I can read. I am just completely new to programming and I am very confused by everything I've read so far. If you are kind enough to help, please assume I know absolutely nothing :)
Here we go: I've just finished designing my first application (a scientific program) with Qt Creator on my laptop, which runs under Linux Ubuntu. It works fine and I'm very proud of it ;)
Here's what my project consists of: 40 header files, 42 source files, 1 pro file, 1 qrc file, 1 html file and 7 png files. In the code, I use #include for a bunch of fairly standard Qt classes (QWidget, QTextBrowser and so forth, maybe like 40 of those).
Now I'd like to make it available to other people. For Linux and Mac users, I've figured out a way to do that: I can compress the folder containing my project, tell them to install Qt on their computer, then download and extract the files to their hard disk, open a terminal in the folder, and run
qmake myProject.pro
qmake
make
That seems to work fine (by the way, does it matter that this is not precisely what Qt Creator does? The qmake step there is qmake-qt4 myProject.pro -r -spec linux-g++ and the make step is make -w). Now, I assume there is a solution where I don't ask them to download and install something like 200 MB of Qt material. As for Microsoft Windows users, I don't have a clue.
I would be very grateful if you could explain to me in a very concrete way what I need to do. Needless to say, I'll go for the best and easiest solution, I don't need to understand everything about deployment. Many thanks in advance!
Edit: In case that's useful : I've been using Qt Creator 2.5.0 based on Qt 4.8.1 (64 bit), I'm working on a laptop with Ubuntu 12.04 64bits
For Linux and Mac users, I would compile the software for them in 32- and 64-bit formats - no-one likes compiling unknown software from source. Obviously keep the source-code option for those on more unusual architectures/OSes (and provide a shell script for them that mimics the commands Qt Creator calls - see the sketch below). As Qt runtimes are available from package managers on just about every distro (and come pre-installed on most anyway - KDE requires them, for example), not asking users to compile from source gives them a much smaller download (if any) and doesn't require them to download software from a website potentially unknown to them. Of course the best way would be to try to get your software added as a package into the major distros' repositories, but that may take some time to organise.
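Such a script can be tiny; a sketch built from the exact commands quoted in the question:
#!/bin/sh
# build.sh - mimic what Qt Creator 2.5 / Qt 4.8 runs on Linux
qmake-qt4 myProject.pro -r -spec linux-g++
make -w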
Compile your software for Windows users in both 32- and 64-bit formats. It's generally frowned upon to ask users to download runtime libraries they potentially don't know and put them into their system32 folder, so most applications bundle all the libraries they need with the application. Qt-based applications are no different, and put the runtimes into the folder where the executable is. It is also much more professional to create a proper installer; there are a few free installer applications for Windows, and a web search will give you the most popular (I think I saw a thread on SO about it as well).
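For a Qt 4 application the bundled folder typically looks something like this (the exact DLL list depends on the Qt modules you use; the compiler-runtime DLLs only apply to MinGW builds):
MyApp\
  myProject.exe
  QtCore4.dll
  QtGui4.dll
  mingwm10.dll
  libgcc_s_dw2-1.dll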
As you can see, the platforms aren't too dissimilar. The main point I would make is: do not force people to compile from source! The vast majority of people on Earth do not even know what compiling is, so provide builds for the major architectures/OSes yourself.

Installing software on Solaris

I'd like to install several Unix utilities (incl. xmlstarlet, wget) on a Solaris 10 machine which I don't have root access to (obviously, I have a user account). I'm not that experienced with Solaris and am wondering if I can simply get hold of an uber binary for each utility I need and just place it in my home directory? Is this feasible?
Many thanks
wget is installed by default on Solaris 10 in /usr/sfw/bin/wget.
xmlstarlet requires four libraries that aren't included in Solaris 10, so it's going to be trickier, but of course you can build them, and then xmlstarlet, from their respective source code.
Have a look there for information about what is needed: http://www.opencsw.org/packages/xmlstarlet
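If you do build from source, everything can live under your home directory, so no root access is needed; a minimal sketch (the prefix path is just an example), repeated for each library and then xmlstarlet itself:
./configure --prefix=$HOME/local
make
make install
export PATH=$HOME/local/bin:$PATH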
If you really don't want to compile the binaries, there is certainly a way to manually install the files stored in these Solaris packages elsewhere and patch/fix them to make the whole thing work. I have done that already.
Finally, don't underestimate the willingness of the system administrator to help.
As long as the binary doesn't try to do something that requires superuser privileges and the binary is compiled for your platform, you should be ok.
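One quick way to check the platform part is the file utility, which prints the architecture a binary was compiled for (e.g. SPARC vs. x86); the path here is just an example:
file /usr/sfw/bin/wget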

Packaging to use to deploy cross-platform?

On Windows, applications are typically packaged as MSI, and on Red Hat Linux as RPM; what would be the best open-source packaging method to deploy applications to all platforms, including different flavors of Unix and Windows?
Contents would include exes, unix binaries, java jar files, user data, even database scripts to be run.
(I recognize contents would vary per destination OS, i.e. binaries would be different - Windows exe vs. Unix binary, etc. - but, for example, config files may be the same, or in the case of Java even the bytecode jars.)
Key feature I'd like the packaging to support is different users and permissions for different directories, however I recognize supporting this feature multiplatform may be very difficult.
Rather than build a package that is supposed to work across all of your platforms, which is likely impossible, you should have your build system build different packages for each target platform.
With CPack (it comes with CMake) you can create packages for Windows (with NSIS), Linux (RPM and DEB), and OS X with "make package". CMake also simplifies cross-platform building.
For a sample you can look at avogadro's CMakeLists.txt and AvoCPack.cmake
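Assuming the project's CMakeLists.txt already does include(CPack), the packaging step itself is then just:
mkdir build && cd build
cmake ..
make package    # emits an NSIS installer, .rpm or .deb depending on CPACK_GENERATOR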
I have a client that uses IzPack to create a single installer (it's Java-based) that installs their app on Windows, OS X and Linux.
http://izpack.org/
NSIS is an open-source solution which, as far as I know, is able to build installers that run on Windows and UNIX-likes alike. However, for software deployment on Windows (especially in corporate environments), MSI is the way to go, and NSIS is more of a headache.
So I wouldn't advise that you try to build a single package/installer for different platforms, but rather, as RibaldEddie indicated, multiple packages: one for each platform. That also allows you to restrict the contents of each package to the files relevant to that platform.
If you'd like to support packaging for multiple distributions, I'd suggest helping the packagers for those distributions out: use some sort of well-known build system for your software (GNU's autotools, or something like SCons or Waf), and document the build, optional dependencies, and so forth pretty well.
That way, when a Debian, Ubuntu, Red Hat, SuSE, whatever, packager comes along, they'll be able to create the package for you. You can optionally include packaging templates for one or more distributions in a separate VCS tree that is available, if you'd like.
If you are looking at packaging a closed-source/proprietary application for multiple systems, you'd probably do best to package up a .tar.gz file and document the installation process for it. You'll also want to make sure that the build process used doesn't embed any path information into the application, so that it can be run in /opt, /usr, or /usr/local, which are some popular choices for third-party add-on software.
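A minimal sketch of such a relocatable tarball (all names here are hypothetical):
tar czf myapp-1.0-linux-x86_64.tar.gz myapp-1.0/
# the user unpacks it wherever they prefer, e.g.:
tar xzf myapp-1.0-linux-x86_64.tar.gz -C /opt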
BitRock InstallBuilder allows you to create installer packages for each of the platforms you mentioned (as well as creating RPM, DEB, etc. packages from a single project file).

How can I uninstall Win32 assemblies and cleanup WinSxS?

After a lot of trial and error (mostly due to lack of documentation and examples) I have managed to create MSI installers that install custom DLLs to WinSxS as side-by-side assembly. There is only one problem: Uninstalling leaves all files (DLLs, manifests and catalogs) in the WinSxS directory. How can or should I best clean that up? I know for sure that nothing else references it.
I have read somewhere that WinSxS has a self-scavenging process that cleans up over time but I could not find more information about that. Can you manually invoke this to clean up stuff?
The only other way I see is manually deleting those bits. First you have to change the owner of all files (assembly, catalog, manifest and their respective directories) from SYSTEM to an administrator account, adjust the permissions, and delete them. There are also pieces left in the registry (I think HKLM\COMPONENTS\DerivedData\Components may be one place), but since WinSxS should be treated as opaque, it is hard to find any information.
Scavenging isn't exposed anywhere that I know of. I'm not even sure when it is kicked off automatically. Maybe on uninstall of a service pack? Maybe some tool admins can run? I really forget.
Anyway, my suggestion is don't fight it. There are so many twisty turns down there that it just isn't worth trying to get the disk space back. Once uninstalled the bits still in the SxS cache will not be activated so they are just wasting space.
It's a dumb design but blame Microsoft and don't try to overcompensate.
Here is an article that is a fairly complete guide to WinSxS.
In short, you can only uninstall some components (all of their versions live in this folder), and you can run the Service Pack "bridge-burning" utility (in Vista it is named VSP1CLN.EXE and ships with SP1). Note that after running it, you will no longer be able to uninstall the SP or roll components back to any state prior to the SP release.
No-one is convinced you can - short of a complete reinstall, your bloaty WinSxS directory is there to stay.
There's been a long "discussion" of the problem on TechNet.
There is no documentation of the format, nor any instructions on how to remove files that are no longer needed - MS seems to think that disc space is cheap. There is a self-scavenging feature, but no-one's convinced it works; or if it does, it is very conservative (as you'd hope, since you don't want it to break your OS).
You can tell if the scavenger is working by checking the C:\Windows\winsxs\Temp\PendingDeletes folder, as this is where Windows Update or an installer moves files to - the scavenger just deletes the files in there.
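To peek at it (assuming the default Windows directory layout):
dir C:\Windows\winsxs\Temp\PendingDeletes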
You'll notice that after you uninstall your assembly, while the files are still there, they can no longer be bound to - so they are just "staged", or cached, but not really installed.
Rob & gbjbaanb are correct - you cannot manually invoke a scavenge yourself. Don't try to delete the files yourself - there are multiple places in the registry where they are registered, DerivedData\Components being only one of the many references.
I think the rule for Vista is that scavenging is kicked off by the TrustedInstaller service after 10 minutes of machine inactivity, following the last servicing operation (service pack, hotfix, etc.). But it's very fickle, so it doesn't run as often as it should. So just be patient, and the files will disappear on their own.
Well, I was having some issues, as I have an 80 GB SSD for my Windows install and the WinSxS folder was about 12 GB.
I was searching the net and I found this command:
DISM.exe /online /Cleanup-Image /spsuperseded
Now my WinSxS is 7 GB, which was wonderful news.
There are a few updates regarding the cleanup method that apply to newer OS versions. Check http://www.karafilis.net/winsxs-cleanup
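For instance (not from the linked article, but a documented DISM option on Windows 8 / Server 2012 and later), the component store can be cleaned directly with:
DISM.exe /online /Cleanup-Image /StartComponentCleanup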
