The command below is returning an error (jq version: 1.6):
$ jq --arg b bar . <<< '{ "foo": $b }'
parse error: Invalid numeric literal at line 1, column 12
Expected output:
{
"foo": "bar"
}
The jq 1.6 manual describes the --arg option as follows:
--arg name value: This option passes a value to the jq program as a predefined variable. If you run jq with --arg foo bar, then $foo is available in the program and has the value "bar". Note that value will be treated as a string, so --arg foo 123 will bind $foo to "123".
Named arguments are also available to the jq program as $ARGS.named.
My usage appears correct. What's going on here?
My variable call was not within the jq program
The here-string I'm passing into jq
{ "foo": $b }
is not "the jq program" mentioned in the manual's --arg description. The lone . was the entire program, and did not use the variable $b.
I was trying to construct JSON from scratch by passing in my pattern on stdin. Instead, I should have provided the --null-input option, and replaced the . with the pattern I was attempting to pass in.
Description of --null-input
--null-input/-n:
Don't read any input at all! Instead, the filter is run once
using null as the input. This is useful when using jq as
a simple calculator or to construct JSON data from scratch.
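As a quick aside illustrating the "simple calculator" use mentioned above (not part of the original question):
$ jq --null-input '1 + 2'
3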
Here's the correct invocation:
$ jq --arg b bar --null-input '{ "foo": $b }'
{
"foo": "bar"
}
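Since the manual excerpt above also mentions $ARGS.named, the same result can be produced without referencing $b directly; a small sketch, assuming jq 1.6 as in the question:
$ jq --null-input --arg b bar '{ foo: $ARGS.named.b }'
{
"foo": "bar"
}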
Related
curl http://api.open-notify.org/iss-now.json
{"message": "success", "timestamp": 1665708640, "iss_position": {"longitude": "-114.2621", "latitude": "8.5148"}}
I want to parse the JSON to get the message property.
x=$(curl http://api.open-notify.org/iss-now.json | jq .message)
echo $x
"success"
I want to get success without the surrounding double quotes.
x=$(curl http://api.open-notify.org/iss-now.json | jq .message | sed 's/"//g')
echo $x
success
Can jq achieve the same result with one of its own options, without piping to sed?
Just use jq's command-line option -r.
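For example, reusing the command from the question, something like:
x=$(curl http://api.open-notify.org/iss-now.json | jq -r .message)
echo $x
success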
As in "Passing bash variable to jq", we should be able to use a jq variable as $VAR in a jq expression.
projectID=$(jq -r --arg EMAILID "$EMAILID" '
.resource[]
| select(.username==$EMAILID)
| .id' file.json
)
So, to extract project_id from the JSON file sample.json:
{
"dev": {
"gcp": {
"project_id": "forecast-dev-1234",
"project_number": "123456789",
"endpoint_id": "6837352639743655936"
}
}
}
I ran the jq expression using a variable, but it did not work.
# TARGET=dev
$ jq -r --arg TARGET "${TARGET}" '.$TARGET.gcp.project_id' sample.json
-----
jq: error: syntax error, unexpected '
.$TARGET.gcp.project_id
(Unix shell quoting issues?) at <top-level>, line 1:
.$TARGET.gcp.project_id
jq: error: try .["field"] instead of .field for unusually named fields at <top-level>, line 1:
.$TARGET.gcp.project_id
jq: 2 compile errors
Please help me understand why this fails and how to use the variable to form an expression that extracts project_id.
The jq manual does not provide a clear explanation of variables and --arg. Is there a good resource that clearly explains jq variables and how to use them?
--arg name value:
This option passes a value to the jq program as a predefined variable. If you run jq with --arg foo bar, then $foo is available in the program and has the value "bar". Note that value will be treated as a string, so --arg foo 123 will bind $foo to "123".
Workaround
Using interpolation.
$ TARGET=dev
$ jq -r --arg TARGET "${TARGET}" '."\($TARGET)".gcp.project_id' sample_interpolation.json
-----
forecast-dev-1234
Version
jq --version
---
jq-1.6
You're using variables fine; the problem is that the object-identifier syntax doesn't allow general expressions. It's a shorthand for when the key you're looking up is a fixed, identifier-like string, like .foo or .project_id. As noted in the manual, you can use the more general generic object index filter for arbitrary keys, including keys computed by an expression, such as .[$TARGET]:
$ TARGET=dev
$ jq -r --arg TARGET "${TARGET}" '.[$TARGET].gcp.project_id' sample.json
forecast-dev-1234
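An alternative sketch, assuming you are willing to pass TARGET through the environment rather than via --arg, is the env builtin:
$ TARGET=dev jq -r '.[env.TARGET].gcp.project_id' sample.json
forecast-dev-1234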
I would like to pass an argument without quotes (a jq --arg value gets double quotes by default) since it should be used as a filter. For example:
propt='.properties'
final=($(jq -r -c --arg p $propt '$p' sample.json))
echo $final
sample.json
{
"type": "object",
"description": "Contains information",
"properties": {
"type": {
"description": "Type"
}
}
}
So ultimately it prints out .properties instead of the expected {"type":{"description":"Type"}}
I use a bash shell for this purpose.
Please let me know what I am doing wrong.
If I understand you correctly, you're getting sidetracked by thinking you need to set up a variable in jq, instead of just letting the shell do an expansion:
% foo='.properties'
% jq -r -c "$foo" sample.json
output:
{"type":{"description":"Type"}}
Note that the double quotes around $foo still allow the shell to expand the variable to .properties. That said, you could (unsafely) use: jq -r -c $foo sample.json
You can't use --arg in that way. The value of a --arg is a string, not a jq filter expression. If you do --arg p .properties, then $p will contain the string ".properties", it won't be evaluated as a program. Find a different way to do what you want, perhaps by defining a function.
For example, if you prefixed your program with def p: .properties; then you could use .|p in your program in the way that you're using $p now, and it would access the .properties of whatever value is in context.
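A rough sketch of that suggestion against the sample.json above (the function name p is just illustrative):
$ jq -c 'def p: .properties; p' sample.json
{"type":{"description":"Type"}}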
Since jq does not have an “eval” function, the appropriate way to specify a path programmatically in jq is using a JSON array in conjunction with jq’s getpath and setpath built-ins, as appropriate.
Thus in your case you could use the --argjson command-line option to pass in the path of interest, e.g.
--argjson p '["properties"]'
and your jq program would use getpath($p).
Needless to say, this approach works for arbitrarily nested paths.
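A minimal sketch of the getpath approach against the same sample.json:
$ jq -c --argjson p '["properties"]' 'getpath($p)' sample.json
{"type":{"description":"Type"}}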
I am using jq to parse log data, occasionally the logs contain malformed stuff (invalid json), when this happens, jq aborts processing at that point.
Is there a way to have jq keep processing what it can, while reporting the problems via stderr?
I understand that if your JSON contains newlines, jq may have trouble when it picks up again at the next line, but in such cases it will still eventually reach the start of a legitimate JSON message and can continue processing.
With jq-1.5 I was able to do the following:
With this example file:
cat << EOF > example.log
{"a": 1}
{invalid
{"b": 2}
EOF
Output non-json lines as unquoted strings:
cat example.log | jq --raw-input --raw-output '. as $raw | try fromjson catch $raw'
{
"a": 1
}
{invalid
{
"b": 2
}
Silently skip non-json lines:
cat example.log | jq --raw-input 'fromjson?'
{
"a": 1
}
{
"b": 2
}
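If you also want the problems reported on stderr, as the question asks, one sketch (assuming the debug builtin, available since jq 1.5) is to route the raw line through debug before discarding it:
cat example.log | jq --raw-input '. as $raw | try fromjson catch ($raw | debug | empty)'
{
"a": 1
}
{
"b": 2
}
with the offending line written to stderr as ["DEBUG:","{invalid"].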
You can add --slurp if the entire input is expected to be a single multiline json blob.
Example files:
cat << EOF > valid-multiline.log
{
"a": 1,
"b": 2
}
EOF
cat << EOF > invalid-multiline.log
{
asdf
"b": 2
}
EOF
Outputs
cat valid-multiline.log | jq --slurp --raw-input --raw-output '. as $raw | try fromjson catch $raw'
{
"a": 1,
"b": 2
}
cat invalid-multiline.log | jq --slurp --raw-input --raw-output '. as $raw | try fromjson catch $raw'
{
asdf
"b": 2
}
If you have jq 1.5, the answer is: yes, though in general, preprocessing (e.g. using hjson or any-json) would be preferable.
Anyway, the idea is simply to take advantage of the try/catch feature. Here is an illustration using the inputs filter. Note that jq should in general be invoked with the -n option for this to work.
recover.jq
# Stream the remaining inputs, pairing each with a message about its length.
def handle: inputs | [., "length is \(length)"] ;
# On a parse error, emit "Failed" and recurse to keep consuming the rest of the input.
def process: try handle catch ("Failed", process) ;
process
bad.json
[1,2,3]
{id=546456, userId=345345}
[4,5,6]
See jq run:
$ jq -n -f recover.jq bad.json
[
"[1,2,3]",
"length is 3"
]
"Failed"
[
"[4,5,6]",
"length is 3"
]
I am trying to use jq 1.5 to develop a script that can take one or more user inputs that represent keys and recursively remove them from JSON input.
The JSON I am referencing is here:
https://github.com/EmersonElectricCo/fsf/blob/master/docs/Test.json
My script, which seems to work pretty well, is as follows.
def post_recurse(f):
def r:
(f | select(. != null) | r), .;
r;
def post_recurse:
post_recurse(.[]?);
(post_recurse | objects) |= del(.META_BASIC_INFO)
However, I would like to replace META_BASIC_INFO with one or more user inputs. How would I go about accomplishing this? I presume with --arg from the command line, but I am unclear on how to incorporate this into my .jq script?
I've tried replacing del(.META_BASIC_INFO) with del(.$module) and invoking with cat test.json | ./jq -f fsf_key_filter.jq --arg module META_BASIC_INFO to test but this does not work.
Any guidance on this is greatly appreciated!
ANSWER:
Based on a couple of suggestions I was able to arrive at the following, which works and uses jq.
Invocation:
cat test.json | jq --argjson delete '["META_BASIC_INFO","SCAN_YARA"]' -f fsf_module_filter.jq
Code:
def post_recurse(f):
def r:
(f | select(. != null) | r), .;
r;
def post_recurse:
post_recurse(.[]?);
(post_recurse | objects) |= reduce $delete[] as $d (.; delpaths([[ $d ]]))
It seems the name module is a keyword in jq 1.5, so $module will result in a syntax error. You should use a different name. There are other builtins that do the recursion for you; consider using them instead of churning out your own.
$ jq '(.. | objects | select(has($a))) |= del(.[$a])' --arg a "META_BASIC_INFO" Test.json
You could also use delpaths/1. For example:
$ jq -n '{"a":1, "b": 1} | delpaths([["a"]])'
{
"b": 1
}
That is, modifying your program so that the last line reads like this:
(post_recurse | objects) |= delpaths([[ $delete ]])
you would invoke jq like so:
$ jq --arg delete "META_BASIC_INFO" -f delete.jq input.json
(One cannot use --arg module ... as "$module" has some kind of reserved status.)
Here's a "one-line" solution using walk/1:
jq --arg d "META_BASIC_INFO" 'walk(if type == "object" then del(.[$d]) else . end)' input.json
If walk/1 is not in your jq, here is its definition:
# Apply f to composite entities recursively, and to atoms
def walk(f):
. as $in
| if type == "object" then
reduce keys[] as $key
( {}; . + { ($key): ($in[$key] | walk(f)) } ) | f
elif type == "array" then map( walk(f) ) | f
else f
end;
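To sanity-check the definition on a tiny inline document (hypothetical data, not from the question; assumes either jq 1.6, where walk is built in, or the definition above prepended):
$ jq -c 'walk(if type == "object" then del(.a) else . end)' <<< '{"a":1,"b":{"a":2,"c":3}}'
{"b":{"c":3}}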
If you want to recursively delete a bunch of key-value pairs, then here's one approach using --argjson:
rdelete.jq:
def rdelete(key):
walk(if type == "object" then del(.[key]) else . end);
reduce $strings[] as $s (.; rdelete($s))
Invocation:
$ jq --argjson strings '["a","b"]' -f rdelete.jq input.json
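A quick check with an inline document (hypothetical data; again assumes walk is built in, or that its definition is prepended to rdelete.jq):
$ echo '{"a":1,"c":{"a":2,"b":3,"d":4}}' | jq -c --argjson strings '["a","b"]' -f rdelete.jq
{"c":{"d":4}}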