R help files on different OS

Is there some way to access the Windows version of a helpfile on a Linux computer within R?
I write a decent amount of R code on my Linux machine, but I do have to make sure the code will run on a Windows machine for collaborators.
I've been burned a number of times: I read a help file on my Linux machine, write my code, and then spend hours wondering why it isn't working on the Windows machine, until I check the help file there and realise it differs from the one on Linux.
It'll usually have a "NOTE: On Windows, xxxx behaves differently...", and I wish I had known that while I was writing the code on my Linux machine!
I do realise that many help files are system-specific (for example ?system), but sometimes I would like to read the Windows version on my Linux computer. Today I found myself wanting to read ?windows but had to boot up my Windows laptop just to read that helpfile, because that function isn't available on Linux and so there's no help file.
cheers.

You can always look at the source which gives you clear conditionals -- this is from man/system.Rd:
#ifdef windows
Only double quotes are allowed on Windows: see the examples. (Note: a
Windows path name cannot contain a double quote, so we do not need to
worry about escaping embedded quotes.)
[...]
#endif
#ifdef unix
Unix-alikes pass the command line to a shell (normally \file{/bin/sh},
and POSIX requires that shell), so \code{command} can be anything the
shell regards as executable, including shell scripts, and it can
contain multiple commands separated by \code{;}.
[...]
#endif
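If you don't have the R sources checked out locally, the same .Rd files can also be fetched directly from the R Subversion repository. A minimal sketch (the URL reflects the usual layout of that repository and is my assumption, not something from the answer above; the sed filter just prints the Windows-only blocks, and the path would need adjusting for pages in other packages, e.g. ?windows lives under the grDevices sources):

curl -s https://svn.r-project.org/R/trunk/src/library/base/man/system.Rd |
  sed -n '/#ifdef windows/,/#endif/p'   # show only the Windows-specific text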

Related

Downloading R on Linux for multiple clients

I've created a program that runs in R and that I plan on distributing to a lot of other people. Currently the R script is run completely automatically and behind the scenes by a single .sh script, which is exactly how it is intended to be; I'm trying to make it so there's no need for client intervention. The R script itself loads the packages and installs them if they aren't present, which takes away the task of the clients installing the packages themselves.
Is there a way I can provide a folder, inside the application folder they already download, that contains Rscript and its dependencies, so the code can use that copy of Rscript to run the R program I have created? The goal is to be able to download it and run it without needing an internet connection to download R, and perhaps even the required packages if possible.
Any help or ideas are appreciated.
I assume the process you want is called "creating a binary package". A binary is a program (like an EXE file) that can run directly on the target CPU without any interpreter software (such as the Python interpreter for Python scripts, or the Java VM for Java applications). I'm not so familiar with packaging R programs, but I found some materials regarding this issue:
1 - Building binary package R
2 - https://seandavi.github.io/post/build-linux-r-binary-packages/
3 - https://support.rstudio.com/hc/en-us/articles/200486508-Building-Testing-and-Distributing-Packages
The second link assumes Linux as the target system. Unlike interpreted languages, binary files are usually OS dependent (Linux, Windows, or Mac). I personally don't know how compatible packages are between Linux systems with different library sets.
Please comment if you find some of this information misleading, and I'll correct the answer.
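As a rough illustration of what "creating a binary package" means in R terms, here is a minimal sketch; 'mypkg' and the version number are hypothetical, and the resulting binary is tied to the OS and architecture it was built on:

R CMD build mypkg                        # create the source package mypkg_1.0.tar.gz
R CMD INSTALL --build mypkg_1.0.tar.gz   # install it and emit a binary package for this platform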

Run .bat script in unix

I have a .bat script on a Windows share that is mounted on my UNIX machine. The .bat script transfers files between two Windows shares, but I would like to trigger this script from the UNIX machine if that is possible. I have read that you can do it with Wine or DOSBox, but I don't have either installed on my UNIX machine. Is it possible to solve this with some additional .sh script that will trigger my .bat script correctly?
Thank you in advance.
Best regards.
You cannot run a .bat script on a Unix machine, for several reasons:
Unix does not have the same commands (on the command line) as Windows. The POSIX standard defines a set of commands; if you use them, you'll be portable across various POSIX systems (but not on Windows). For example, to list a directory you use DIR on MSDOS and Windows but ls on Unix and POSIX; to copy a file it is COPY on MSDOS and Windows but cp on Unix and POSIX; and so on.
Unix does not have the same command interpreter as Windows. The POSIX standard and the Unix tradition provide a Unix shell, and POSIX has standardized /bin/sh (a.k.a. the POSIX shell). Windows has CMD (inherited from MSDOS) and PowerShell.
The way commands are interpreted is different (on Windows, look also into PowerShell, which I don't know). On Unix it is the shell (not the invoked programs) that expands your command and does the globbing. See this answer for more. The notion of current working directory is also different.
The operating system concepts are (slightly or significantly) different on Windows and on Unix or POSIX. For example, files, directories, processes, and libraries behave differently (a file can be written by one process and removed by another on Unix, and it can have several names on Linux through hard links). You could read Operating Systems: Three Easy Pieces for an overview.
The Unix philosophy is not (always) applicable to Windows.
So you need to study Unix (or POSIX) and write your own shell script from scratch. Don't try to "translate" a bat script to a Unix shell script, but redesign it entirely (starting from the problem you want it to solve).
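For a task as simple as the one described (copying files between two mounted Windows shares), the redesigned script can be very small. A minimal sketch, assuming both shares are already mounted; the mount points and file pattern are hypothetical:

#!/bin/sh
# Copy files from one mounted Windows share to another (hypothetical paths).
SRC=/mnt/share_a/reports
DST=/mnt/share_b/archive
cp -p "$SRC"/*.csv "$DST"/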
(and Wine or DosBox is not helpful in your case)
Read also about SCP and perhaps FTP. Perhaps using some distributed version control system like git could be relevant for you (e.g. to share scripts, source code, etc...).
If you need to run remotely some Windows .bat script on a distant Windows machine (e.g. from a Unix machine), you should use some remote command running service (that is, find and use some equivalent of SSH service on Windows, and use the corresponding client on Unix). See this.
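If an SSH server (for example, the OpenSSH server available for Windows) is set up on the Windows machine, triggering the existing .bat remotely could look like this sketch; the host name, user, and path are hypothetical:

ssh user@winhost 'cmd /c "C:\scripts\transfer.bat"'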
So if you need to remotely run something (e.g. some program, script, or command) on a Windows server from a Unix machine, you should ask a different question (or at least improve the current one a lot).
Read about the client-server model and about application layer to use the correct terminology. You should name what protocol, server, client, service you want to involve. Nothing is magically "triggered" without using them.
PS. I'm using Unix since 1987, Linux since 1993. I never used Windows.

Will a shell script written on an Ubuntu kernel always be able to run on RHEL also

I want to know whether a shell script created and tested on an Ubuntu kernel will always, without fail, also run on a RHEL kernel, provided the correct shell is invoked to run the script.
Ways in which the execution may differ when used on different distributions and different kernels:
Differences in the version and configuration of the Linux kernel - this may affect presence and format of the contents of files such as those in /proc and /sys, or the presence of particular device drivers.
Differences in the version of the shell used - /bin/sh may be Bash on one system and Dash on another, or Bash 3.x on one system and Bash 4.x on the other.
Differences in the installed programs your script invokes (and, if you got your package dependencies wrong, whether those programs are even present - what's "essential" on one distribution may be "optional" on another).
In short, different distributions have the same issues as different versions of one distribution, but more so.
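A few quick checks run on both machines make the differences above concrete (a small sketch, nothing more):

ls -l /bin/sh               # is /bin/sh bash, dash, or something else here?
bash --version | head -n 1  # which bash version is installed?
uname -r                    # which kernel version is running?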
It depends on which shell/interpreter, and which version of that shell, the script was written for. For example, a bash script written using bash 4.4 may not work in bash 2.0, and so on. It's not so much about the distribution or kernel version you use as about the shell you use.
So, without details, it's not possible to assert whether a script that works on Ubuntu will work on RHEL. If you use the same shell and the same version on both machines, then yes, it's going to work as expected (barring some very odd cases).
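One way to make that dependency explicit in the script itself is to pin the interpreter and guard on its version; a minimal sketch (the minimum version of 4 is just an example):

#!/bin/bash
# Refuse to run on bash versions older than the one the script was tested with.
if [ "${BASH_VERSINFO[0]}" -lt 4 ]; then
    echo "This script needs bash >= 4" >&2
    exit 1
fi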

Wrong Architecture when running executable from Xcode4 on UNIX

First of all, I'm very new to programming.
I have built a program using Xcode 4 on Snow Leopard.
The architecture of the project is set to "Standard (32/64-bit Intel)".
Afterwards I copied the executable to a UNIX computer to run it there:
ssh to that computer
Typing ./programname in the terminal (Of the UNIX computer) gives the following response:
Exec format error. Wrong Architecture.
The program runs just fine on my Mac laptop.
When you compile a program it will (*) be compiled for a specific platform and a specific operating system. It will also most likely be compiled against a specific set of libraries. Usually those parameters are exactly those of the computer doing the compilation (the other cases are called cross-compilation).
In other words: compiling a program on a Mac will produce a binary that runs only on a Mac (unless, again, you're cross-compiling). Your UNIX system (which UNIX, by the way?) has a different operating system, different libraries and probably even a different CPU architecture.
Somewhat related: Apple advertised (or used to advertise) Mac OS X as a UNIX. While Mac OS X is certainly a UNIX-class operating system, that doesn't mean it's binary compatible with every other UNIX-class OS out there.
* almost always, with the exception of systems designed to avoid this (e.g. Java)
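Two standard commands make the mismatch visible; a small diagnostic sketch (the binary name is taken from the question):

file ./programname   # reports the executable format and the architecture it targets
uname -sm            # reports the OS and CPU architecture of the machine you're on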
Programs compiled by XCode will only run under MacOS X. Unless the "UNIX computer" in step 2 is running MacOS, the program will not be able to run.

Installing software on Solaris

I'd like to install several Unix utilities (including xmlstarlet and wget) on a Solaris 10 machine that I don't have root access to (I do have a user account). I'm not that experienced with Solaris and am wondering whether I can simply get hold of a self-contained binary for each utility I need and just place it in my home directory. Is this feasible?
Many thanks
wget is installed by default on Solaris 10 in /usr/sfw/bin/wget.
xmlstarlet requires four libraries that aren't included in Solaris 10, so it's going to be trickier, but of course you can build them, and then xmlstarlet, from their respective source code.
Have a look there for information about what is needed: http://www.opencsw.org/packages/xmlstarlet
If you really don't want to compile the binaries, there is certainly a way to manually install the files stored in these Solaris packages elsewhere and patch/fix them to make the whole thing work. I have done that before.
Finally, don't underestimate the willingness of the system administrator to help.
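If you do end up building from source, the classic no-root pattern is to install into your home directory. A rough sketch, assuming the compiler and tools shipped under /usr/sfw are usable; the paths are hypothetical, and the libraries xmlstarlet needs (libxml2, libxslt) would need the same treatment first:

export PATH=/usr/sfw/bin:$PATH   # pick up the bundled tools (wget, gcc, ...)
./configure --prefix=$HOME/sw    # run inside each source tree
make && make install
export PATH=$HOME/sw/bin:$PATH   # make the new binaries visible to your shell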
As long as the binary doesn't try to do something that requires superuser privileges and the binary is compiled for your platform, you should be ok.
