How can I keep the variables used in a script in a secondary file? For example, store the variables in myscript.env for use in a script myscript.sh.
That way, whenever I need to change a variable I can edit myscript.env instead of myscript.sh.
Simply include the file using the "dot operator".
Assuming these two files are in the same directory, the following will work:
t.sh:
#!/usr/bin/sh
# Note the first dot on the following line
. ./t.env
echo $TESTVAR
t.env:
TESTVAR="Hello world"
When run:
~/tmp$ sh ./t.sh
Hello world
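If programs started by the script also need to see the variables (not just the script itself), one variation is to switch on allexport while sourcing. A minimal sketch, using the same t.env as above:
# Same t.env as above; set -a exports every variable assigned while it is on
set -a
. ./t.env
set +a
# A child process can now see TESTVAR as well
sh -c 'echo "$TESTVAR"'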
This is quite a common design pattern. Think along the lines of:
myscript.sh:
#!/path/to/shell
CONFIG_FILE=myscript.env
CONFIG_DIR=`dirname $0`
# or e.g. CONFIG_DIR=/etc/myscript
CONFIG="$CONFIG_DIR/$CONFIG_FILE"
. $CONFIG
echo "SOMEVAR=$SOMEVAR"
myscript.env:
SOMEVAR="The value of some var"
Now /path/to/myscript.sh will output SOMEVAR=The value of some var
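If the config file might be absent, you can guard the dot and fall back to a default. A small sketch along the same lines (the default value here is just an illustration):
#!/path/to/shell
CONFIG="`dirname $0`/myscript.env"
# Only source the file if it exists and is readable
if [ -r "$CONFIG" ]; then
    . "$CONFIG"
fi
# Fall back to a default when the config did not set SOMEVAR
: "${SOMEVAR:=some default value}"
echo "SOMEVAR=$SOMEVAR"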
I have a file params.out containing 3 variable names:
p_id
p_level
p_name
The values for these variables should come from user input at run time. I tried using a while loop, but instead of asking for user input it moves on to the next line and throws an error.
I used the code:
cat params.out |\
while read line
do
echo "Please provide the value for $line:\c"
read $line
done
There are a few things I changed to make this work:
Put "read $line" on a seperate line (or user && or ; to seperate the two commands).
Remove the "\" after pipe to while (I removed the cat and pipe way to input the file).
Add "/dev/tty" to the read command, to let it know where the input should come from. Otherwise it'll take input from the file.
while read line
do
echo "Please provide the value for $line:"
read $line </dev/tty
echo "p_id: $p_id"
echo "p_level: $p_level"
echo "p_name: $p_name"
done < params.out
This gives the following output:
Please provide the value for p_id:
a
p_id: a
p_level:
p_name:
Please provide the value for p_level:
s
p_id: a
p_level: s
p_name:
Please provide the value for p_name:
d
p_id: a
p_level: s
p_name: d
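Because the file is fed to the loop with a redirection instead of a cat pipeline, the variables set by read are still available after the loop in the same script. A minimal sketch:
while read line
do
    echo "Please provide the value for $line:"
    read $line </dev/tty
done < params.out

# The values collected above are still set here
echo "Collected: p_id=$p_id p_level=$p_level p_name=$p_name"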
I want to use a relative file path as a command line argument, but as the example and assessment below demonstrate, the variable passes \..\ through as a literal string; it doesn't evaluate it.
Can I force the command line to parse and expand the variable into the full path?
For example, I have an R script file I want to launch from the command line:
Set RPath=C:\Program Files\R\R-3.1.0\bin\Rscript.exe
SET RScript=%CD%\..\..\HCF_v9.R
SET SourceFile=%CD%\..\Source\
ECHO String used for Source Location - %SourceFile%
"%RPath%" "%RScript%" %SourceFile%
The inclusion of \..\ works in the call to R as an external program because the batch file can resolve its own commands.
The SourceFile variable, however, doesn't work: \..\ hasn't been expanded, it has just been included as part of the string, and R can't process \..\.
You can use for's replaceable parameters (with the ~f modifier) to resolve the relative path to a fully qualified one:
for %%a in ("..\..\HCF_v9.R") do set "RScript=%%~fa"
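For instance, a minimal sketch applying the same trick to both paths from the question, inside the batch file (hence the doubled %%):
Set RPath=C:\Program Files\R\R-3.1.0\bin\Rscript.exe
REM Resolve both relative paths to fully qualified paths with %%~f
for %%a in ("%CD%\..\..\HCF_v9.R") do set "RScript=%%~fa"
for %%a in ("%CD%\..\Source") do set "SourceFile=%%~fa"
ECHO String used for Source Location - %SourceFile%
"%RPath%" "%RScript%" "%SourceFile%"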
@MC ND has provided the batch-file approach; an R-centric approach would be to pass the current directory to R and modify the path there.
REM batch file
Set RPath=C:\Program Files\R\R-3.1.0\bin\Rscript.exe
SET RScript=%CD%\..\..\HCF_v9.R
"%RPath%" "%RScript%" %CD%
# in R
srcpath <- commandArgs(TRUE)[1]
srcpath <- normalizePath(file.path(srcpath, "../Source"))
Using QMake, I read some boilerplate code, make modifications, and write the modified code to a file.
However, I get very strange results. I have simplified the problem down to the following:
# Open the boilerplate file
interfaceBoilerPlateCode = $$cat($$boilerPlateFile, blob)
# Make sure we read the right content
message("Content read: $$interfaceBoilerPlateCode")
# Write the read text into a file
output = $$system(echo $$interfaceBoilerPlateCode >> $$targetFile) # Doesn't work
output = $$system(echo "Howde" >> $$targetFile) # This works
The file being read is a plain text file containing only the string "Howde".
The contents of the file get read correctly.
However, when I try to write the contents of the file to another target file, I get no output: no errors or warnings, but also no new file. If I instead echo a string defined in the code itself (as in the last line of the snippet above), a new file is generated with the string "Howde" inside it.
What is going on? What am I doing wrong that the penultimate line does not generate a new file?
Use write_file. Instead of:
$$system(echo $$content >> $$file_path)
use
write_file($$file_path, content)
Note that write_file takes the name of the variable holding the text (without $$) as its second argument, not its expanded value.
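Applied to the variables from the question, a minimal sketch (assuming boilerPlateFile and targetFile are already defined in the .pro file) could look like:
# Read the boilerplate as a single blob
interfaceBoilerPlateCode = $$cat($$boilerPlateFile, blob)
# ... modify interfaceBoilerPlateCode here ...
# Write the (modified) text out; the variable is passed by name, without $$
write_file($$targetFile, interfaceBoilerPlateCode)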
Say I have an .Rmd file like this:
The total number of steps per day can also be calculated
using `tapply`.
```{r}
tapply(d$steps, INDEX=d$date, FUN=sum)[1:5]
```
What seems to be different is that, per default, `xtabs`
returns 0 for `NA` values and `tapply` returns `NA`.
In my terminal window, the chunk is displayed as plain, unhighlighted text.
It would be great if somehow I could inform vim that the R chunk is actually R code which it could highlight just as it does when working in an actual .R file.
Is this possible?
Yes, you can. This code is taken from a Vim Tip on using different syntax highlighting within regions of a file.
Put this in ~/.vim/r.vim (if the directory or file does not exist, create them):
function! TextEnableCodeSnip(filetype,start,end,textSnipHl) abort
  let ft=toupper(a:filetype)
  let group='textGroup'.ft
  if exists('b:current_syntax')
    let s:current_syntax=b:current_syntax
    " Remove current syntax definition, as some syntax files (e.g. cpp.vim)
    " do nothing if b:current_syntax is defined.
    unlet b:current_syntax
  endif
  execute 'syntax include @'.group.' syntax/'.a:filetype.'.vim'
  try
    execute 'syntax include @'.group.' after/syntax/'.a:filetype.'.vim'
  catch
  endtry
  if exists('s:current_syntax')
    let b:current_syntax=s:current_syntax
  else
    unlet b:current_syntax
  endif
  execute 'syntax region textSnip'.ft.'
        \ matchgroup='.a:textSnipHl.'
        \ start="'.a:start.'" end="'.a:end.'"
        \ contains=@'.group
endfunction
Now you can use
:call TextEnableCodeSnip( 'r', '```{r}', '```', 'SpecialComment')
This works as long as an r.vim syntax file is available.
You could also call this function automatically every time you open a .Rmd file:
autocmd BufNewFile,BufRead *.Rmd :call TextEnableCodeSnip( 'r', '```{r}', '```', 'SpecialComment')
If you want to highlight chunks where r is followed by additional text (e.g. chunk options), you can use a regular expression:
:call TextEnableCodeSnip( 'r', '```{r.*}', '```', 'SpecialComment')
Or in your .vimrc:
autocmd BufNewFile,BufRead *.Rmd :call TextEnableCodeSnip( 'r', '```{r.*}', '```', 'SpecialComment')
The .* in the pattern matches any sequence of characters, so r.* matches r followed by anything.
Therefore this will work with
```{r whatever you want to put here}
Some r code here
```
You might also be interested in my SyntaxRange plugin, which is based on the Vim Tip in @Zach's answer. It simplifies the setup of such syntax regions.
The setup would be like this (e.g. in ~/.vim/ftplugin/markdown.vim):
call SyntaxRange#Include('^```{r}', '^```', 'r', 'SpecialComment')
I am writing a CGI script in Perl with a section of embedded R script which produces a graph. The original data filename is unknown as it has been uploaded by the CGI script and is stored in a Perl variable called $filename.
I would now like to open that file in R using read.table(). I am using Statistics::R, so I have tried:
my $R = Statistics::R->new();
$R->set('filename',$filename);
my $out1 = $R->run(
q`rm(list=ls())`,
# Fetch data
q`setwd("/var/www/uploads")`,
q`peakdata<-read.table(filename, sep="",col.names=c("mz","intensity","ionsscore","matched","query","index","hit"))`,
q`attach(peakdata)` ...etc
I can get this to work ONLY if I change $filename to something static and known, like 'data.txt', before trying to open the file in read.table. Is there a way to open a file whose name is held in a variable?
Thank you in advance.
One possible way to do this is by doing a little more work in Perl.
This is untested code, to give you some ideas:
my $filename   = 'fileNameIGotFromSomewhere.txt';
my $source_dir = '/var/www/uploads';
my $file       = "$source_dir/$filename";

# make sure we can read it
unless ( -r $file ) {
    die "can't read that data file: $!";
}
Then instead of $R->set, you could interpolate the file name into the R program. Where you've used the single-quote operator, use the double-quote operator instead:
So instead of:
q`peakdata<-read.table(filename, sep="",col.names= .... )`
Use:
qq`peakdata<-read.table("$filename", sep="", col.names= .... )`
(Note the double quotes around $filename so that R receives a quoted string rather than a bare symbol.)
Now, this looks like it would be inviting problems similar to SQL/code injection, which is why I put in the logic to ensure that the file exists and is readable. You might be able to think of other checks to add to safeguard your use of user-supplied info.
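Putting it together, an untested sketch (the file name is a placeholder for whatever the CGI upload provides) might look like this:
use strict;
use warnings;
use Statistics::R;

my $filename   = 'fileNameIGotFromSomewhere.txt';   # placeholder for the uploaded file's name
my $source_dir = '/var/www/uploads';
my $file       = "$source_dir/$filename";

# Refuse anything we cannot read; also a basic sanity check on user-supplied input
die "can't read that data file: $!" unless -r $file;

my $R = Statistics::R->new();
my $out1 = $R->run(
    # Double-quoted qq so $file interpolates; the inner "" make it an R string
    qq`peakdata <- read.table("$file", sep="", col.names=c("mz","intensity","ionsscore","matched","query","index","hit"))`,
    q`attach(peakdata)`,
);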