I am getting an error with the scan function. Why?
https://jqplay.org/s/E-0qbbzRPS
I need to do this without -r.
There are two issues with your filter. Firstly, you need to separate the parameters to a function with a semicolon ;, not a comma ,:
scan("([0-9A-Za-z_]+) == '([0-9A-Za-z_]+)"; "g")
Secondly, scan with two parameters is not implemented (in contradiction to the manual).
jq: error: scan/2 is not defined at <top-level>, line 1:
But as you are using scan, your regex will match multiple occurrences anyway, so you may as well just drop it:
.spec.selector | [scan("([0-9A-Za-z_]+) == '([0-9A-Za-z_]+)") | {(.[0]): .[1]}]
[
{
"app": "nginx"
}
]
Related: As in Passing bash variable to jq, we should be able to use a jq variable as $VAR in a jq expression.
projectID=$(jq -r --arg EMAILID "$EMAILID" '
.resource[]
| select(.username==$EMAILID)
| .id' file.json
)
So, to extract project_id from the JSON file sample.json:
{
"dev": {
"gcp": {
"project_id": "forecast-dev-1234",
"project_number": "123456789",
"endpoint_id": "6837352639743655936"
}
}
}
I ran the jq expression using a variable, but it did not work.
# TARGET=dev
$ jq -r --arg TARGET "${TARGET}" '.$TARGET.gcp.project_id' sample.json
-----
jq: error: syntax error, unexpected '
.$TARGET.gcp.project_id
(Unix shell quoting issues?) at <top-level>, line 1:
.$TARGET.gcp.project_id
jq: error: try .["field"] instead of .field for unusually named fields at <top-level>, line 1:
.$TARGET.gcp.project_id
jq: 2 compile errors
Please help me understand why, and how to use the variable to form an expression that extracts project_id.
The jq manual does not provide a clear explanation of variables and --arg. Is there a good resource that clearly explains jq variables and how to use them?
--arg name value:
This option passes a value to the jq program as a predefined variable. If you run jq with --arg foo bar, then $foo is available in the program and has the value "bar". Note that value will be treated as a string, so --arg foo 123 will bind $foo to "123".
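As a quick illustration that the value is bound as a string (jq -n runs the filter without reading any input):
$ jq -n --arg foo 123 '$foo, ($foo | type)'
"123"
"string"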
Workaround
Using string interpolation:
$ TARGET=dev
$ jq -r --arg TARGET "${TARGET}" '."\($TARGET)".gcp.project_id' sample_interpolation.json
-----
forecast-dev-1234
Version
jq --version
---
jq-1.6
You're using variables fine, the problem is that object-identifier syntax doesn't allow general expressions. It's a shorthand syntax for when the key you're looking up is a fixed identifier-like string, like .foo or .project_id. As noted in the manual, you can use the more general generic object index filter for arbitrary keys including those that are calculated by some expression, such as .[$TARGET]:
$ TARGET=dev
$ jq -r --arg TARGET "${TARGET}" '.[$TARGET].gcp.project_id' sample.json
forecast-dev-1234
A Bash script finds tags in an ECR repo:
aws ecr describe-images --repository-name laplacelab-backend-repo \
  --query 'sort_by(imageDetails,& imagePushedAt)[*]' \
  --output json | jq -r '.[].imageTags'
Output:
[
"v1",
"sometag",
...
]
How can I extract the version number? Only one version tag of the form v<number> can exist. I need to get the number, increment it, and set the next version in a variable. The output of sort_by(imageDetails,& imagePushedAt)[*] is either an array with image details (set 1):
[
{
"registryId": "057296704062",
"repositoryName": "laplacelab-backend-repo",
"imageDigest": "sha256:c14685cf0be7bf7ab1b42f529ca13fe2e9ce00030427d8122928bf2d46063bb7",
"imageTags": [
"v1"
],
"imageSizeInBytes": 351676915,
"imagePushedAt": 1593514683.0
}
]
or, if the repo has no images, sort_by(imageDetails,& imagePushedAt)[*] returns an empty JSON array [] (set 2) instead of set 1.
As a result, I am trying to set a variable VERSION to the next version number for an update, or to 1 if the repo is empty.
You could use the select() function on the imageTags array to get only the tag starting with v, and then increment it.
jq '( .[].imageTags[] | select(startswith("v")) | ltrimstr("v") | tonumber | .+1 ) // 1'
For other cases, like the tags array being empty or containing null strings (error case), the value defaults to 1.
For storing the result into a variable, e.g. version (avoid using uppercase variable names in user scripts), use command substitution. See How do I set a variable to the output of a command in Bash?
version=$( <your-pipeline> )
Note: This does not work well with version strings following the Semantic Versioning spec, e.g. v1.2.1, as jq does not have a library to parse them.
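Putting the pieces together, a minimal sketch of the whole pipeline (the aws call and repository name are taken from the question; adjust to your setup):
version=$(aws ecr describe-images --repository-name laplacelab-backend-repo \
  --query 'sort_by(imageDetails,& imagePushedAt)[*]' --output json \
  | jq '( .[].imageTags[] | select(startswith("v")) | ltrimstr("v") | tonumber | . + 1 ) // 1')
echo "next version tag: v${version}"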
I would like to pass an argument without quotes (a jq arg has double quotes by default), since it should be used as a filter. For example:
propt='.properties'
final=($(jq -r -c --arg p $propt '$p' sample.json))
echo $final
sample.json
{
"type": "object",
"description": "Contains information",
"properties": {
"type": {
"description": "Type"
}
}
}
So ultimately it prints out .properties instead of the expected {"type":{"description":"Type"}}
I use a bash shell for this purpose.
Please let me know what I am doing wrong.
If I understand you correctly, you're getting sidetracked by thinking you need to set up a variable in jq, instead of just letting the shell do an expansion:
% foo='.properties'
% jq -r -c "$foo" sample.json
output:
{"type":{"description":"Type"}}
Note the double quotes on $foo to still allow the shell to expand the variable to .properties. That said you could unsafely use: jq -r -c $foo sample.json
You can't use --arg in that way. The value of a --arg is a string, not a jq filter expression. If you do --arg p .properties, then $p will contain the string ".properties", it won't be evaluated as a program. Find a different way to do what you want, perhaps by defining a function.
For example, if you prefixed your program with def p: .properties; then you could use .|p in your program in the way that you're using $p now, and it would access the .properties of whatever value is in context.
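Applied to the shell snippet from the question, a sketch of that def-based approach could look like this (the function name p is just an illustration):
$ jq -r -c 'def p: .properties; . | p' sample.json
{"type":{"description":"Type"}}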
Since jq does not have an “eval” function, the appropriate way to specify a path programmatically in jq is using a JSON array in conjunction with jq’s getpath and setpath built-ins, as appropriate.
Thus in your case you could use the --argjson command-line option to pass in the path of interest, e.g.
--argjson p '["properties"]'
and your jq program would use getpath($p).
Needless to say, this approach works for arbitrarily nested paths.
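A complete invocation along those lines might look like this (a sketch against the sample.json shown above):
$ jq -c --argjson p '["properties"]' 'getpath($p)' sample.json
{"type":{"description":"Type"}}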
I am able to use grep in normal command line.
grep "ABC" Filename -C4
This is giving me the desired output which is 4 lines above and below the matched pattern line.
But if I use the same command in a Unix shell script, I am unable to grep the lines above and below the pattern. It gives me only the lines where the pattern is matched, and an error at the end that says it cannot open grep : -C4.
The results are similar if I use -A4 and -B4
I'll assume you need a portable POSIX solution without the GNU extensions (-C NUM, -A NUM, and -B NUM are all GNU, as are arguments following the pattern and/or file name).
POSIX grep can't do this, but POSIX awk can. This can be invoked as e.g. grepC -C4 "ABC" Filename (assuming it is named "grepC", is executable, and is in your $PATH):
#!/bin/sh
die() { printf '%s\nUsage: %s [-C NUMBER] PATTERN [FILE]...\n' "$*" "$0" >&2; exit 2; }
CONTEXT=0 # default value
case $1 in
  -C  ) CONTEXT="$2"; shift 2 ;;    # extract "4" from "-C 4"
  -C* ) CONTEXT="${1#-C}"; shift ;; # extract "4" from "-C4"
  --|-) shift ;;                    # no args or use std input (implicit)
  -*  ) [ -f "$1" ] || die "Illegal option '$1'" ;; # non-option non-file
esac
[ "$CONTEXT" -ge 0 ] 2>/dev/null || die "Invalid context '$CONTEXT'"
[ "$#" = 0 ] && die "Missing PATTERN"
PATTERN="$1"
shift
awk '
/'"$PATTERN"'/ {
  matched='"$CONTEXT"'                                    # lines of trailing context still to print
  for(i='"$CONTEXT"'; i>=1; i--) if(NR>i) print last[i];  # leading context, oldest line first
  print
  next
}
matched { print; matched-- }
{ for(i='"$CONTEXT"'; i>1; i--) last[i] = last[i-1]; last[1] = $0 }
' "$@"
This sets up die as a fatal error function, then finds the desired lines of context from your arguments (either -C NUMBER or -CNUMBER), with an error for unsupported options (unless they're files).
If the context is not a number or there is no pattern, we again fatally error out.
Otherwise, we save the pattern, shift it away, and reserve the rest of the options for handing to awk as files ("$@").
There are three stanzas in this awk call:
Match the pattern itself. This requires ending the single-quoted portion of the string in order to incorporate the $PATTERN variable (which may not behave correctly if imported via awk -v). Upon a match, we store the number of lines of context in the matched variable, loop through the previous lines saved in the last hash (oldest first, if we've gone far enough to have had them), and print them. We then skip to the next line without evaluating the other two stanzas.
If there was a match, we need the next few lines for context. As this stanza prints them, it decrements the counter. A new match (previous stanza) will reset that count.
We need to save previous lines for recalling upon a match. This loops through the number of lines of context we care about and stores them in the last hash. The current line ($0) is stored in last[1].
I'm new to shell scripting and need some help. I am trying to write a script to bounce some servers, and I am having a few issues with my if statements. The first and second ones below are giving me a "too many arguments" error.
For the first one, the variable $jmsProcess holds the output of a ps -ef | grep command, and I only want to enter the if-statement if this returns some results. The second one has the same issue.
In the third if-statement I want to check whether either of those variables has the value true, but this gives me a
if[ [ false || false ] == true ]: command not found
error.
#Check the JMS process has been killed
if [ $jmsProcess != null ] # SHOULD THIS BE NULL???
then
echo "JMS Process is still running"
$jmsRunning = "true"
fi
#Check the Bone process has been killed
if [ $boneProcess != null ] # SHOULD THIS BE NULL???
then
echo "B-One Process is still Running"
$boneRunning = "true"
fi
if[ [ $jmsRunning || $boneRunning ] == true ] # CHECK THIS FOR QUOTES
then
# $killProcess
fi
null is not a Bash keyword. If you want to check whether a variable is defined, you can do the following:
if [ "${var+defined}" = defined ]
To check whether it's empty or undefined:
if [ -z "${var-}" ]
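Applied to the question's variables, a minimal sketch (assuming $jmsProcess holds the output of the ps -ef | grep pipeline) would test for a non-empty result with -n, the complement of -z:
if [ -n "${jmsProcess-}" ]; then
    echo "JMS Process is still running"
    jmsRunning=true
fi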
We don't know how you're setting any of the variable values (jmsProcess, boneProcess).
Try surrounding all var values like "$var" (with dbl-quotes) and see if the error messages change.
Also, there are numerous syntax issues in the code visible above. It's hard to tell whether they are artifacts of posting here (the code block feature is pretty robust), so I'm going to comment on what I see wrong.
if[ [ false || false ] == true ]: command not found
There are a lot of issues here: false is a shell command. Try typing it on the command line and then do echo $?; you should see 1. true; echo $? will return 0. But if statements continue or fall over to the else block based on the last return code (with some special-case exceptions).
Also, I'm guessing you're trying to make some sort of regular expression with [ false || false ] == true. That won't work; see below.
You can set status variables to have the value of false (or true) which will be evaluated correctly by the shell.
Also, if[ (with no space after if) will give the 'command not found' message. So, by using vars that hold the value false or true, try:
jmsRunning=false ; boneRunning=true
if ${jmsRunning} || ${boneRunning} ; then
    echo at least 1 is running
else
    echo both NOT running
fi
Change both to false to see the message change.
Also, null is just a string in a shell script, you probably mean "".
Finally, var assignments cannot have spaces surrounding the '=' sign AND do not use the '$' char at the front when the variable is on the left-hand side of the statement, i.e.
boneRunning=true
I hope this helps.
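Putting these answers together, a sketch of how the original snippet could be rewritten ($killProcess is the asker's placeholder and is left as-is):
#Check the JMS process has been killed
if [ -n "$jmsProcess" ]; then
    echo "JMS Process is still running"
    jmsRunning=true
fi

#Check the Bone process has been killed
if [ -n "$boneProcess" ]; then
    echo "B-One Process is still Running"
    boneRunning=true
fi

if ${jmsRunning:-false} || ${boneRunning:-false} ; then  # default to false if unset
    $killProcess                                         # asker's placeholder for the kill command
fi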