Does Lasso 8.6 have a means of extracting an uploaded zip file to a specified path?

I am trying to provide a means of allowing people to upload zips and have them extracted to a particular file path. It seems like zip functionality has been added in Lasso 9 but I'm curious if there is in fact a method for doing this in 8.6 or if anyone has any suggestions.

There are a couple of options (besides upgrading to 9):
First, you could use [os_process] to call the unzip command-line utility and have it do the extraction for you (see the sketch after these options).
In 8.5, the LJAPI documentation included an example that created a [zip] custom type which you should be able to use. (I'm not sure whether the 8.6 installer ships it, but on OS X, after installing 8.5 you could find it here: /Applications/Lasso Professional 8/Documentation/3 - Language Guide/Examples/LJAPI/Tags/ZipType/) Chapter 67 of the Language Guide documents how to get it installed and working.
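To illustrate that first option, the command line that [os_process] would need to run could look something like the following sketch (the archive and destination paths here are placeholders):
unzip -o /path/to/uploaded.zip -d /path/to/destination
Here -d sets the directory to extract into, and -o overwrites any existing files without prompting.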

Expanding further on option 1 in bfad's answer: you might like the Lasso 8 shell tag from TagSwap, which makes this even easier. Here's an example where I extract tarred and gzipped archives:
// authenticate with user that has file permissions in this directory
inline(-username='username', -password='password');
// load shell tag from TagSwap
library_once('shell.inc');
// call tar from bash shell
shell('tar -zxf myfile.tgz');
/inline;

Related

How to stop RStudio from creating an empty "R" folder within the "/home" directory at every startup

After having set the path for the default working directory, as well as for my first (and only) project, in the RStudio options, I wonder why RStudio keeps creating an empty folder named "R" within my "/home" directory every time it is started.
Is there any file I could delete/edit (or, if necessary, create) to stop this annoying behaviour, and if so, where is it located?
System: Linux Mint v. 19.3
Software: RStudio v. 1.3.959 / R version 3.4.4
Thanks in advance for any hints.
Yes, you can prevent the creation of the R directory — R is configurable via a set of environment variables.
However, setting these correctly isn’t trivial. The first issue is that many R packages are sensitive to the R version they’re installed with. If you upgrade R and try to load the existing package, it may break. Therefore, the R package library path should be specific to the R version.
On clusters, an additional issue is that the same library path might be read by various cluster nodes that run on different architectures; this is rare, but it happens. In such cases, compiled R packages might need to be different depending on the architecture.
Consequently, in general the R library path needs to be specific both to the R version and the system architecture.
Next, even if you configure an alternative path, R will silently ignore it if it doesn’t exist. So be sure to manually create the directory that you’ve configured.
Lastly, where to put this configuration? One option would be to put it into the user environment file, the path of which can be specified with the environment variable R_ENVIRON_USER — it defaults to $HOME/.Renviron. This isn’t ideal though, because it means the user can’t temporarily override this setting when calling R: variables in this file override the calling environment.
Instead, I recommend setting this in the user profile (e.g. $HOME/.profile). However, when you use a desktop launcher to launch your RStudio, this file won’t be read, so be sure to edit your *.desktop file accordingly.1
So in sum, add the following to your $HOME/.profile:
export R_LIBS_USER=${XDG_DATA_HOME:-$HOME/.local/share}/R/%p-library/%v
And make sure this directory exists: re-source ~/.profile (launching a new shell inside the current one is not enough), and execute
mkdir -p "$(Rscript -e 'cat(Sys.getenv("R_LIBS_USER"))')"
The above is using the XDG base dir specification, which is the de-facto standard on Linux systems.2 The path is using the placeholders %p and %v. R will fill these in with the system platform and the R version (in the form major.minor), respectively.
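To double-check that R actually picks up the new location once the directory exists, you can print the configured value and the active library paths from a fresh shell, for example:
Rscript -e 'cat(Sys.getenv("R_LIBS_USER"), .libPaths(), sep="\n")'
The configured directory should show up among the library paths.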
If you want to use a custom R configuration file (“user profile”) and/or R environment file, I suggest setting their location in the same way, by configuring R_PROFILE_USER and R_ENVIRON_USER (since their default location, once again, is in the user home directory):
export R_PROFILE_USER=${XDG_CONFIG_HOME:-$HOME/.config}/R/rprofile
export R_ENVIRON_USER=${XDG_CONFIG_HOME:-$HOME/.config}/R/renviron
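R won’t create these locations for you either, so if you use this setup, create the directory yourself as well, e.g.:
mkdir -p "${XDG_CONFIG_HOME:-$HOME/.config}/R"
The rprofile and renviron file names above are just the ones chosen here; R reads whatever paths these variables point to.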
1 I don’t have a Linux desktop system to test this on, but I believe that editing the Exec entry to the following should do it:
Exec=env R_LIBS_USER=${XDG_DATA_HOME:-$HOME/.local/share}/R/%p-library/%v /path/to/rstudio
2 Other systems require different handling. On macOS, the canonical setting for the library location would be $HOME/Library/Application Support/R/library/%v. However, setting environment variables on macOS for GUI applications is frustratingly complicated.
On Windows, the canonical location is %LOCALAPPDATA%/R/library/%v. To set this variable, use [Environment]::SetEnvironmentVariable in PowerShell or, when using cmd.exe, use setx.

p7zip / unzip case sensitivity - extracts archive to separate folders

I am supplied data by a third-party company in the form of self-extracting EXE archives.
Take this example:
fvdata.exe
When I extract this archive on my Windows machine I get this (which is what I am trying to achieve):
fvdata_d
However, on my CentOS 6.5 machine I get two folders:
fvdata_d FVdata_d
I believe the tech putting this archive together was a bit sloppy about case sensitivity. However, I am not sure how to fix this.
Commands I have tried on the Linux machine to extract:
7za x fvdata.exe -y -ssc-
unzip fvdata.exe -C
unzip fvdata.exe
If I can't do this, maybe someone can recommend a workaround?
In unzip, the -LL option forces conversion of every filename to lowercase, regardless of the originating file system.
http://www.info-zip.org/mans/unzip.html
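As a concrete sketch (assuming the self-extracting EXE is an ordinary zip archive underneath, which it appears to be since unzip and 7za can read it), extracting with lowercased names should merge the two case-variant folders into one:
unzip -LL fvdata.exe
Note that -LL lowercases every extracted name, not just the top-level folder, so this only matches the Windows result if case differences elsewhere in the archive don't matter.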

Preserve files/directories across an RPM upgrade in a .spec file (rpmbuild)

I wrote a .spec file on RHEL and I am building an RPM using rpmbuild. I need ideas on how to handle the situation below.
When it installs for the first time, my RPM creates an empty log directory inside the installation folder, like below:
/opt/MyInstallation-1.0.0-1/ (some executables)
/opt/MyInstallation-1.0.0-1/lib/ (shared objects, .so files)
/opt/MyInstallation-1.0.0-1/config/ (XML and other custom configuration files)
/opt/MyInstallation-1.0.0-1/log/ (this is where the application writes logs)
When my RPM upgrades MyInstallation-1.0.0-1 to MyInstallation-1.0.0-2, for example, everything works as I wanted.
But my question is: how do I preserve the log files written under MyInstallation-1.0.0-1, or, more precisely, copy the log directory over to MyInstallation-1.0.0-2?
I believe that if you tag the directory as %config, RPM expects the user to have files in there, so it will leave it alone.
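For what it's worth, a minimal, hypothetical sketch of the corresponding %files entries for the layout in the question (paths are illustrative; %dir packages the directory itself without claiming its contents, and %config(noreplace) keeps user-modified config files across upgrades):
%files
%dir /opt/%{name}-%{version}-%{release}/log
%config(noreplace) /opt/%{name}-%{version}-%{release}/config/*.xml
This only tells RPM to leave user content alone; it does not by itself copy logs from an old versioned directory into a new one.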
I found a solution, or rather a workaround, for this by trial and error :)
I am using rpmbuild version 4.8.0 on RHEL 6.3 x86_64. I believe it will work on other distros as well.
If you install under a single name like "MyInstallation" rather than "MyInstallation-<version>-<RPM build number>", and create the log directory as a standard directory with no additional flags on it (see the original question for the scenario), then an upgrade normally doesn't touch the log directory and RPM leaves its contents as they are. All you have to do is ensure that you keep the line below in the %install section.
%install
install --directory $RPM_BUILD_ROOT%{_prefix}/%{name}/log
Here, %{_prefix} and %{name} are macros; they have nothing to do with the underlying concept.
Regarding config files, the following is a very precise table that will help you guard your config files. Again, this rule can't be applied to the logs our applications create.
http://www-uxsup.csx.cam.ac.uk/~jw35/docs/rpm_config.html
Thanks & Regards.

Editing SAS config files to execute R (making SAS play well with others)

There are many things that R just does better. Hence, I am trying to set my system up so I can execute R commands from within SAS using the [submit /R;] and [endsubmit;] commands. However, I need some help getting my config files set up properly to do this.
First Steps (to allow SAS to read the R language):
I checked to see if my system was set up to read the R language (code below).
proc options option=rlang;
run;
I got the following in my log:
SAS (r) Proprietary Software Release 9.3 TS1M0
NORLANG Do not support access to R language interfaces
This meant I needed to add the -RLANG option to the config file. I did that. Below is an example of my config file (C:\Program Files\SASHome\SASFoundation\9.3\sasv9.cfg):
-RLANG
-config "C:\Program Files\SASHome\SASFoundation\9.3\nls\en\sasv9.cfg"
(NOTE: the -RLANG had to be above the config reference for this to be recognized properly.)
And the resulting output in my log after re-opening enterprise guide and re-running the proc options code above:
SAS (r) Proprietary Software Release 9.3 TS1M0
RLANG Support access to R language interfaces
Issue (specific to Enterprise Guide?):
I am using SAS 9.3 and R 2.15.2, so according to this (http://blogs.sas.com/content/iml/2013/09/16/what-versions-of-r-are-supported-by-sas/) these versions are compatible.
However, I execute SAS through Enterprise Guide 4.3 (I prefer its organization). It appears that Enterprise Guide may require some additional settings in the config file to allow R to run and to recognize where it is installed on my computer.
For example, I try running the following code:
Proc iml;
submit /R;
directory <- "C:\\Data\\Filepath"
FILEpattern1 <- "Fall 12-13.xlsx"
setwd(directory)
filenames1 <- list.files(pattern=FILEpattern1)
endsubmit;
And I get the following error:
15 Proc iml;
NOTE: IML Ready
16 submit /R;
17 directory <- "C:\\Data\\Filepath"
18 FILEpattern1 <- "Fall 12-13.xlsx"
19
20 setwd(directory)
21 filenames1 <- list.files(pattern=FILEpattern1)
22 endsubmit;
ERROR: SAS could not initialize the R language interface.
statement : SUBMIT at line 16 column 1
According to this thread (https://communities.sas.com/thread/34758), individuals using Enterprise Guide also need to define where R_Home is on their computer. The thread discusses changing something in sasenv_local but I need more specific directions.
Any suggestions or advice on how to get this working?
If the issue is solely defining R_HOME in the local environment variables, you have at least three options. You can add this to your config file if you have access to that (the file referenced in -config in the OP):
-SET R_HOME "r_home location"
If you don't have permission to modify your config file, you could instead use options set to do the same thing: options set=R_HOME='r_home location';
You should also be able to modify the environment variable in Windows directly: right-click My Computer, choose Properties, then Advanced, then Environment Variables, and set it there. Note that this requires administrative rights.
See this paper for more information.
As noted by the OP, R_HOME needs to be set to the base directory for R (such as C:\Program Files\R), not to the \bin folder or any other particular location.
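If you're not sure what that base directory is on a given machine, you can ask R itself from a command prompt (assuming Rscript is on your PATH):
Rscript -e "cat(R.home())"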

Searching pdfs in Plone 4 on Mac OS

I've got Plone 4 running on Mac OS Server 10.6. I'd like to make it possible for the search facility on my Plone site to search for text within the pdf files which are stored there.
I've searched around, but the closest I can find is information about doing this on Plone 3 with Linux.
Please could anyone help?
The basic idea is the same. You need to install the external "pdftohtml" command-line utility. In Plone 4 you don't need to do any other configuration in the ZMI or elsewhere: once the pdftohtml tool is installed, newly uploaded files will get their contents indexed. You can go to the catalog in the ZMI, open the Indexes tab, and reindex the "SearchableText" index to also cover files that were already uploaded.
One way to install system packages on Mac is to use MacPorts (http://www.macports.org/install.php). If you use that approach, you can call:
$ sudo port install poppler
Once that has finished, you should be able to call the tool and see something like:
$ pdftohtml -v
pdftohtml version 0.16.5
Copyright 2005-2011 The Poppler Developers - http://poppler.freedesktop.org
You might need to add /opt/local/bin to the PATH variable of the user running the Plone process.
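For example, if the Plone process is started from a shell or a startup script you control, a line along these lines in that user's environment should make the tool findable (this is the default MacPorts binary path):
export PATH=/opt/local/bin:$PATH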
The documentation for Plone 3 applies for Plone 4 in the same way.
