I currently use jq (1.5) on Windows 10 to format different JSON files. Today I tried to move the filters into a filter file to shorten my cmd commands.
I copied the filter directly from the command, with all quotation marks, but received a syntax error. I tried removing the quotation marks or changing them to ', but I still get the syntax error:
jq: error: syntax error, unexpected IDENT, expecting $end (Windows cmd shell quoting issues?) at <top-level>, line 1:
[.cruises[] | { nid: .cruise_nid, shipcategory: .ship_category, ship: .ship_title, company: .company_title, includeflight: .includes_flight, nights, waypoints: .waypoint_cities, title: .route_title}] C:\import\dreamlines_cruises.json > C:\Import\import_cruises.json
Any tips?
Regards Timo
Your jq filter as given (i.e. without quotation marks) looks fine, so let's assume you have successfully placed the text (hopefully formatted for readability :-)) in a file, say format.jq.
Then you would run something like this:
jq -f format.jq dreamlines_cruises.json
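For reference, format.jq would then contain only the filter itself, with no surrounding quotation marks and no file names (those stay on the command line). Reformatted for readability, the filter from the question would look like this:

```
[ .cruises[]
  | { nid:           .cruise_nid,
      shipcategory:  .ship_category,
      ship:          .ship_title,
      company:       .company_title,
      includeflight: .includes_flight,
      nights,
      waypoints:     .waypoint_cities,
      title:         .route_title } ]
```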
Related
I have a JSON output with a Linux command in one of its values:
... ,"proc.cmdline":"sh -c pgrep -fl \"unicorn.* worker\[.*?\]\"", ...
In some cases the command contains a backslash, so the resulting JSON will contain a backslash too.
I need to parse the output with jq, but it fails with an error:
parse error: Invalid escape at line 1, column 373
It refers to this: \[
However, this is a part of the command, so it is expected to be there.
If I manually edit the line, converting \[ to \\[, then it passes. However, the resulting output contains both backslashes:
...
"proc.cmdline": "sh -c pgrep -fl \"unicorn.* worker\\[.*?\\]\"",
...
Now, I can't be there to manually edit it every time. This output is produced automatically by another piece of software, and I need to parse it with jq every time it comes in.
Also, even if I were able to edit every \[ to \\[ (say with sed), the output becomes a lie: the second \ is fake.
Any ideas on how to work around this?
EDIT: here is the full JSON for reference (received raw from the output of the program I'm using, falco):
{"priority":"Debug","rule":"Run shell untrusted","time":"2019-05-15T07:32:36.597411997Z", "output_fields": {"evt.time":1557905556597411997,"proc.aname[2]":"gitlab-mon","proc.aname[3]":"runsv","proc.aname[4]":"runsvdir","proc.aname[5]":"wrapper","proc.aname[6]":"docker-containe","proc.aname[7]":"docker-containe","proc.cmdline":"sh -c pgrep -fl \"unicorn.* worker\[.*?\]\"","proc.name":"sh","proc.pcmdline":"reactor.rb:249 ","proc.pname":"reactor.rb:249","user.name":null}}
The JSON standard is quite explicit about which characters have to be escaped, and [ is not one of them (though reverse solidus, \, is). So it's the script/software generating your JSON that violates the JSON standard. You can validate it on any of the well-known online JSON validators, e.g. this one: https://jsoncompare.com/#!/simple/ - it will produce the error too.
If you cannot enhance/fix the script generating that JSON, then you'll need to double those non-compliant backslash escapes before passing the text to a JSON processor, e.g.
... | sed -E 's/\\([][])/\\\\\1/g' | ...
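As a sketch, using the proc.cmdline fragment from the question: the sed call doubles each backslash that precedes a bracket, turning the invalid \[ and \] escapes into valid \\[ and \\]:

```shell
# Double any backslash that precedes [ or ], so \[ becomes \\[ and \] becomes \\]
bad='{"proc.cmdline":"sh -c pgrep -fl \"unicorn.* worker\[.*?\]\""}'
fixed=$(printf '%s' "$bad" | sed -E 's/\\([][])/\\\\\1/g')
printf '%s\n' "$fixed"
# {"proc.cmdline":"sh -c pgrep -fl \"unicorn.* worker\\[.*?\\]\""}
```

Note that the \" escapes are untouched; only backslashes followed by [ or ] are doubled.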
You'll need to fix whatever is generating that "json" string. Use something that produces compliant json.
If that's not an option for you, then you will have to modify the text so that it is valid JSON. Fortunately jq can handle that: read it in raw, fix the string, then parse it.
Assuming we just need to fix the \[ and \] sequence:
$ ... | jq -R 'gsub("\\\\(?<c>[[\\]])"; "\\\\\(.c)") | fromjson | "your filter"'
Remember, "sh -c pgrep -fl \"unicorn.* worker\\[.*?\\]\"" is a string with escapes... it represents the value:
sh -c pgrep -fl "unicorn.* worker\[.*?\]"
So it's absolutely correct to have "both backslashes."
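A quick way to see why both backslashes are correct: shell double quotes happen to use the same \\ and \" escapes as JSON strings for these characters, so printing the escaped form recovers the value the string represents:

```shell
# \" encodes a quote and \\ encodes a single backslash, so this prints
# the command that the JSON string value represents
printf '%s\n' "sh -c pgrep -fl \"unicorn.* worker\\[.*?\\]\""
# sh -c pgrep -fl "unicorn.* worker\[.*?\]"
```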
I need to get different SFTP exit codes for each error. For instance, 'no such file or directory' should produce exit code 552 or 550 instead of returning 1.
I've tried the following and it did not work:
//A05FTP EXEC PROC=SFTPROC,COND=(0,NE)
//COPSFTP.MYSTDIN DD *
host="xpto.xpty.xptz"
lzopts mode=text
cd /home/apl/files/unl
ls
a=`ls | wc -l`
echo `$a`
echo $?
QUIT
//*
and the output in spool is:
cozsftp> lzopts mode=text
mode=text
cozsftp> lzopts mode=text
mode=text
cozsftp> cd /home/apl/files/unl
[09.807] Invalid command.
cozsftp> a= 1
CoZBatch[I]: returning rc=exitcode=1
Can anyone help me?
COZBATCH allows you to embed shell scripts in JCL, so you don't need to use BPXBATCH. BPXBATCH really is a poor utility. If you're using Co:Z, then good for you: it rocks.
If you want to run shell commands, you need to use the ! escape character:
!echo $a
FWIW, SFTP always returns 1 on error. I'm not sure if you can change that. Errors should be logged in the sysout.
Your problem may simply be the echo `$a`: the backticks try to execute the value of $a as a command. Try enclosing it in quotes instead of backticks.
More generally, if you want to do more detailed error checking, instead of using the SFTP procedure (SFTPROC), I think you'd do better to write yourself a simple script that you execute with BPXBATCH. The script would issue the same SFTP commands, but you could capture and redirect the output (STDOUT/STDERR) and based on the return value ($?) and any error messages, you could certainly detect all the unusual conditions you might want.
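As a hedged sketch of that idea (the message patterns and the 550/552 mappings are illustrative, borrowed from the question; adapt them to whatever your SFTP client actually logs, and the command name in the usage comment is a placeholder), such a script could translate captured error messages into site-specific codes:

```shell
# Hypothetical helper: map known SFTP error messages (captured from
# STDOUT/STDERR) to site-specific codes; anything unrecognised keeps the
# generic 1 that SFTP itself returns on error.
map_sftp_rc() {
  case "$1" in
    *"No such file or directory"*) echo 552 ;;  # illustrative mapping
    *"Permission denied"*)         echo 550 ;;  # illustrative mapping
    *)                             echo 1   ;;
  esac
}

# Usage sketch (command name is a placeholder):
#   out=$(cozsftp ... 2>&1); rc=$(map_sftp_rc "$out")
# Note: a plain shell exit status is limited to 0-255, so codes like 552
# would need to be surfaced another way (e.g. written to the job log).
```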
I want to run a Shiny app from the command line, and this works well:
R -e "shiny::runApp('~/User/Appname',launch.browser=TRUE)"
But I got syntax error near unexpected token `(' once I set an alias in .profile:
alias report="R -e "shiny::runApp('~/User/Appname',launch.browser=TRUE)""
Need your help here; I guess something is wrong with the quotes?
Since you're using double quotes inside double quotes you need to escape the inner ones like this:
alias report="R -e \"shiny::runApp('~/User/Appname',launch.browser=TRUE)\""
However, a cleaner approach is to use a shell function instead and avoid all the escaping:
report() {
  R -e "shiny::runApp('~/User/Appname',launch.browser=TRUE)"
}
I have a list of 87 text files populated daily in UNIX.
E.g. FP*.txt (file names start with FP and are .txt files).
Daily, new content gets appended to the end. I want to back up all the text files using a shell script and create blank files with the same names. I tried the following.
echo "" > FP*.txt
and > FP*.txt
Both gave the same error.
FP*.txt: ambiguous redirect
However, when used on a single file, both work fine. So how do I resolve this? Also, creating a list of hard-coded names is not an option, as 87 are too many and they may increase in the future.
Also, some file names contain ( and , in their names. I tried the following:
echo "" > FP---Sample.txt
This worked fine. But the following gave an error:
echo "" > FP(1)---Sample.txt
syntax error near unexpected token `('
So how do I accomplish this?
Wrap the filename in quotes:
echo "" > "FP(1)---Sample.txt"
You get a
FP*.txt: ambiguous redirect
because you can only redirect to one file, while FP*.txt expands to more than one. That's why it works when the expansion returns just one name.
To do what you need, you'd better loop through the files matched by FP*.txt. For example:
bk_date=$(date "+%Y%m%d_%H%M%S")   # 20140210_150526 format of date
for file in FP*.txt
do
  cp -p "$file" "${file}.${bk_date}"   # make a backup of the file
  > "$file"                            # blank it (like echo "" > "$file", minus the trailing newline)
done
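A quick sanity check of that loop in a throwaway directory (the file names here are made up, including one with parentheses to show the quoting holds up):

```shell
# Create two sample files, back them up with a timestamp suffix, then blank them
tmp=$(mktemp -d)
cd "$tmp"
printf 'day1\n' > FP1.txt
printf 'day1\n' > 'FP(2)---Sample.txt'   # parentheses are fine when quoted
bk_date=$(date "+%Y%m%d_%H%M%S")
for file in FP*.txt
do
  cp -p "$file" "${file}.${bk_date}"     # timestamped backup
  > "$file"                              # blank the original
done
```

After the loop, each FP*.txt is empty and its timestamped copy holds the old contents.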
Regarding the error:
syntax error near unexpected token `('
Martin Dinov correctly explained that you need to quote the name so the shell treats ( as part of the file name.
I am struggling with something small but important: passing a pre-defined path and filename to awk within the system() call in R (OS X, R 3.0.1; readLines() and scan() can NOT accomplish what I need).
Using system() with the file name written directly into the command works fine:
system("awk 'NR==2' ~/path/filename", intern=TRUE)
However
filename<-"~/path/filename"
system("awk 'NR==2' filename", intern=TRUE)
returns the frustrating error
character(0)
attr(,"status")
[1] 2
Warning message:
running command 'awk 'NR==2' filename' had status 2
awk: can't open file filename
source line number 1
I expect I need to escape something somewhere in the filename, but I don't know where, or how.
This would be my first line of R code. :)
I guess the problem is that you wrote the variable name filename inside a literal string, so it is never expanded. You should first build the awk command by string concatenation and then pass it to system(), like:
system(paste("awk 'NR==2' ", filename), intern=TRUE)
Try replacing ~/path/filename with its absolute form instead, e.g. /home/user/path/filename.