How do I make a makefile that works on AIX, Linux and SunOS and has the ability to provide different compiler options for each environment?
I have access to an environment variable which describes the OS, but the AIX make utility does not like ifeq, so I can't do something like:
ifeq($(OS), AIX)
CFLAGS = $(CFLAGS) <IBM compiler options>
endif
You can use a construct like this:
CFLAGS_AIX = AIX flags
CFLAGS_Linux = Linux flags
CFLAGS_SunOS = SunOS flags
CFLAGS = $(CFLAGS_$(OS))
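With that in place, each platform picks up its own flags automatically, assuming the OS environment variable is set as you describe (the invocations below are illustrative):

make              # uses the OS variable from the environment, e.g. OS=AIX
make OS=SunOS     # or override it explicitly on the command line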
The portability of a Makefile is not directly related to the operating system, but to the implementation of make on the platform in question. (There is an indirect relationship in that the implementation of make may be guessed, though not accurately, from the OS.) In general, this is a difficult problem for which many solutions have been proposed. You might want to look into automake, which will generate portable makefiles for you. However, automake's solution to the problem of setting flags for different unixen may not appeal to you, as the solution is (basically) "don't do it". Rather than setting options based on the platform, the philosophy is to determine what flags are needed based on the functionality provided by the host or by the user at configure time.
One convenient autoconf/automake-based solution for assigning flags based on platform is to have a distinct file for each of your platforms which assigns CFLAGS at configure time, and have that file be specified in the CONFIG_SITE environment variable of the user running configure. You can assign CONFIG_SITE in the login script of the user based on the platform (i.e., push the problem away from configure/make and into the login setup). This makes the assignment transparent to the user building the software (transparent, but easily overridden).
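A minimal sketch of that CONFIG_SITE arrangement, with hypothetical paths and flags:

# /usr/local/etc/aix-config.site (hypothetical per-platform site file)
CFLAGS="-O2"

# In each user's login script, select the site file for the current platform:
export CONFIG_SITE=/usr/local/etc/$(uname -s | tr '[:upper:]' '[:lower:]')-config.site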
In gRPC, when building for arm, I need to disable these three variables:
-DRUN_HAVE_STD_REGEX=OFF
-DRUN_HAVE_POSIX_REGEX=OFF
-DRUN_HAVE_STEADY_CLOCK=OFF
It is not super clear to me what they do, so I wonder:
Why is it that CMake cannot detect them automatically when cross-compiling?
What is the impact of disabling them, say on a system that does support them? Will it sometimes crash? Reduce performance in some situations?
Because they are not auto-detected by CMake, it would be easier for me to always disable them, if that works everywhere without major issues for my use case.
gRPC uses CMake's try_run to automatically detect if the platform supports a feature when cross-compiling. However, some variables need to be supplied manually. From the documentation (emphasis added):
When cross compiling, the executable compiled in the first step usually cannot be run on the build host. The try_run command checks the CMAKE_CROSSCOMPILING variable to detect whether CMake is in cross-compiling mode. If that is the case, it will still try to compile the executable, but it will not try to run the executable unless the CMAKE_CROSSCOMPILING_EMULATOR variable is set. Instead it will create cache variables which must be filled by the user or by presetting them in some CMake script file to the values the executable would have produced if it had been run on its actual target platform.
Basically, it's saying that when cross-compiling, CMake won't try to run the compiled test executables on the build machine; instead, the results of those tests (which would have been produced on the target machine) must be supplied manually. The tests below will usually cause problems:
-DRUN_HAVE_STD_REGEX
-DRUN_HAVE_GNU_POSIX_REGEX
-DRUN_HAVE_POSIX_REGEX
-DRUN_HAVE_STEADY_CLOCK
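For example, presetting them on the configure command line might look like this (the toolchain file name is hypothetical):

cmake -DCMAKE_TOOLCHAIN_FILE=arm-linux.toolchain.cmake \
      -DRUN_HAVE_STD_REGEX=OFF \
      -DRUN_HAVE_GNU_POSIX_REGEX=OFF \
      -DRUN_HAVE_POSIX_REGEX=OFF \
      -DRUN_HAVE_STEADY_CLOCK=OFF \
      ..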
Hopefully that answers your first question. I do not know how to answer your second question, as I have always just set those variables manually to match the features of whatever system I've compiled for.
We recently decided to make Julia Language available on our cluster systems. The cluster system is not able to connect to the internet.
Is there any way to download all Julia packages and make them available for our different users to install and use them offline?
Another option that we have is a system that can connect to the internet temporarily, but it is always connected to the main cluster system. Is there any way to use this system as a mirror for the Julia packages or not?
We want to use "Julia 1.0.1".
Our cluster operating system is "CentOS 5.5".
Notes: I have seen this question asked before here, but it is for Julia 0.6 and a single package that is copied by hand. I want users to be able to run Pkg.add <pkgName>, but have the package manager get the packages from our offline system instead of the internet.
Thank you for your help and time.
Caution:
Side effects are not known!
Please test this properly before putting it into production!
a) Collect the required packages, along with their dependencies, in compiled form, and put them in the stdlib folder (for example: /opt/julia/julia-1.1.0/shared/julia/stdlib/v1.1/)
b) Add the stdlib path to the JULIA_DEPOT_PATH and JULIA_LOAD_PATH environment variables
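A sketch of step (b), assuming the example path above:

export JULIA_DEPOT_PATH=/opt/julia/julia-1.1.0/shared/julia/stdlib/v1.1:$JULIA_DEPOT_PATH
export JULIA_LOAD_PATH=/opt/julia/julia-1.1.0/shared/julia/stdlib/v1.1:$JULIA_LOAD_PATH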
The following is a crosspost of https://stackoverflow.com/a/74800608/18431399
PackageCompiler.jl seems like the best tool for using modern Julia (v1.8) on secure systems. The following approach requires a build server with the same architecture as the deployment server, something your institution probably already uses for developing containers, etc.
Build a sysimage with PackageCompiler's create_sysimage() (a sketch of this step follows the script below)
Upload the build (sysimage and depot) along with the Julia binaries to the secure system
Alias a script to julia, similar to the following example:
#!/bin/bash
set -Eeu -o pipefail
unset JULIA_LOAD_PATH
export JULIA_PROJECT=/Path/To/Project
export JULIA_DEPOT_PATH=/Path/To/Depot
export JULIA_PKG_OFFLINE=true
/Path/To/julia -J/Path/To/sysimage.so "$@"
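For the first step, the build on the build server might look something like this (the package name is hypothetical, and PackageCompiler is assumed to be available in the project environment):

julia --project=/Path/To/Project -e 'using PackageCompiler; create_sysimage(["MyPackage"]; sysimage_path="sysimage.so")'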
I've been able to run a research pipeline on my institution's secure system, for which there is a public version of the approach.
Apologies if this isn't an appropriate question for SO - if not please let me know and I'll delete/move it. I just haven't found any resources on this myself. Anything I google related to "multiple operating systems git" gives me pages for applications that work on multiple OS's like GitHub or Tower.
I currently work regularly between two operating systems - PC at the office, Mac at home. I've been managing this with git by using my master branch for PC/Windows R code, while using an OSXversion branch for Mac R code. This is fine whenever I'm updating Windows- or Mac-specific code on each branch (such as package installation instructions in the comments). Where this gets tricky is for general improvements in my code that apply to both Mac & PC. What I've been doing is manually copy-pasting any general improvements between my Mac/PC code or cherry-picking my merges. Is there a better way to be doing this?
It's fine to store code that runs on different operating systems in a Git repository. Simply check out the repository on each of the operating systems you're going to be working on.
The only thing you need to watch out for is line endings. Unless you're dealing with files that specifically require native CRLF / LF end-of-line styles, you're best off turning the automatic conversion off.
This can be done with:
git config --global core.autocrlf false
Further notes on autocrlf can be found on the GitHub help page itself.
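If you do have files that require a specific end-of-line style, you can pin them explicitly in a .gitattributes file instead of relying on autocrlf (the patterns here are illustrative):

*.sh   text eol=lf
*.bat  text eol=crlf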
As for your actual committing, you'll want to follow Git Flow and simply have a develop branch that branches off of master. From there, you'll create individual feature branches. These feature branches can be worked on from either Windows or Mac. The package installation instructions should really go in your README.md file.
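A rough sketch of that workflow (branch names are illustrative):

git checkout -b develop master
git checkout -b feature/general-improvements develop
# ...commit improvements that apply to both platforms, from either OS...
git checkout develop
git merge --no-ff feature/general-improvements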
I had this question on my exam. In the diagrams I saw, we have: hardware, kernel, system call interface to the kernel, then (compilers, shells, system libraries), and on top some applications. Does the scope of an OS include only the kernel, with everything else being additional software we choose to install, or does a Unix OS include everything from the list I gave above?
An OS has more or less two definitions:
academic: an OS is software that provides an abstraction layer between hardware and software.
pragmatic: an OS is the software that comes with the hardware when you buy it.
A compiler and a shell don't fall under definition 1; they can fall under definition 2. And usually, users who are interested in a compiler or a shell prefer to consider the OS as the abstraction layer (the academic definition).
Simple answer: no. They are not an internal part of Unix, but additional functionality that helps make the operating system more usable.
The OS scope applies primarily to the kernel only.
Whilst you need a compiler to build the kernel, you don't necessarily require one for general day-to-day use of the system. Most operating systems don't ship a compiler by default; instead, the kernel and applications are built on one machine, and the resulting binaries are packaged and distributed either with the computer directly (Windows/Unix) or via the internet for others to download and install (Linux/BSD).
Likewise with the shell. Although all operating systems ship with a default one (sh/bash/dash on Linux/Unix systems, Command Prompt/PowerShell on Windows), most general users can go their entire lives without using it.
Having said that, if you were to delete the shell, you'll almost certainly find your system won't boot up. This is because a lot of core start-up scripts rely on the shell to stop / start the services presenting interfaces between the user and the kernel.
In summary:
You need a compiler to build the kernel and applications but not for running the OS.
You need a shell to execute applications (which also includes the compiler)
This is sort of related to a previous post of mine. I have the need to use the bigmemory library on my 32-bit Windows PC to do some ugly matrix calculations. Unfortunately, it appears that the maintainers have temporarily ceased production of Windows binaries. I have Ubuntu on my home PC. I would really like to take the .tar.gz file and build it into a Windows binary that I can actually run at work. I realize there are more efficient ways, like installing Rtools on the Windows device. However, our IT keeps our admin rights on lockdown, so I can never edit my PATH environment variable. Could anyone provide some general guidance for doing this? Are there any tools I need to install on my Ubuntu PC above and beyond R?
I found similar questions, but nothing that thoroughly answered my questions.
Unless the package source is incompatible with current versions of R, you could use the R project's win-builder site to build a Windows binary. Quoting from the linked site, win-builder is a service:
intended for useRs who do not have Windows available for checking and building Windows binary packages.
As a convenience, Hadley Wickham's devtools package includes a utility function, build_win(), that you can use for this purpose. From ?build_win:
Works by building source package, and then uploading to http://win-builder.r-project.org/. Once building is complete you'll receive a link to the built package in the email address listed in the maintainer field. It usually takes around 30 minutes.
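For instance, from the package's source directory, the call can be as simple as this (a sketch assuming devtools is installed):

Rscript -e 'devtools::build_win()'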
Windows has four sets of environment variables (system, user, volatile and process sets). The first three sets are stored in the registry, but the process set is not. So even if they have locked down the registry, it's typically still possible to set process environment variables (including the PATH) in a local process, i.e. on a temporary basis, so you might double-check your assumption that you can't modify anything. It's more likely that you can't modify the system variables and registry but can still modify the set in your local process. To check this from the Windows cmd line, enter this:
set mytest=123
set mytest
and if the second line shows that mytest has the value 123 then you likely have all the permissions you need.
Furthermore, anything you need to set is handled automatically for you by R.bat in the batchfiles distribution, so you don't have to set anything yourself.
Just ensure that Rtools and R are installed into the standard locations (you can tell them to skip the setting of any registry keys during the installation process), ensure R.bat is on your path or in the current directory, and run:
R.bat CMD INSTALL mypackage.tar.gz
without setting environment variables, registry keys or path.
If that does not work, try Rpathset.bat, also from the batchfiles, which is not automatic like R.bat but is extremely flexible, since you modify the SET statements in it to whatever you want.
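Its SET statements are along these lines (the paths shown are hypothetical; you would edit them to match your local installs):

set R_HOME=C:\Program Files\R\R-3.0.1
set PATH=%R_HOME%\bin;C:\Rtools\bin;%PATH%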
There is a PDF document that comes with the batchfiles which gives more info.