I have a number of JSON files that look like this:
{
"property": "value1",
...
}
What I want is an output file that looks like this:
{
"<filename1>": "<value1>",
"<filename2>": "<value2>",
"<filename3>": "<value3>",
...
}
This can be achieved with two jq invocations and a shell pipe:
jq '{(input_filename):.property}' * | jq -s add
However, I was wondering whether this is possible with a single jq invocation (or any other simpler way).
I'm currently using jq version 1.5-1 in case it's relevant.
Use inputs in combination with the -n option to sequentially access all input files.
In direct analogy, you could just create the array that would have been created by the -s option using [inputs], and then add up the items as you did before:
jq -n '[inputs | {(input_filename): .property}] | add' *
But in a more straightforward way, you could employ reduce to iteratively build up your result object:
jq -n 'reduce inputs as $in ({}; .[input_filename] = $in.property)' *
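For illustration, a minimal sketch with two hypothetical files: a.json containing {"property": "value1"} and b.json containing {"property": "value2"} (the names are just placeholders). The reduce form should then produce:
jq -n 'reduce inputs as $in ({}; .[input_filename] = $in.property)' a.json b.json
{
  "a.json": "value1",
  "b.json": "value2"
}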
I have a list of country codes like FR, IT, DE and have been trying to figure out how to use this in a select statement. I was doing something like
cat stuff | jq -c '.[]| select(.country_iso3166_alpha2 == "US")'
But then my list grew to a large number of countries I want to match on. So I tried using IN since I'm using jq 1.6 and did something like this:
eu=("FR", "IT"); cat stuff | jq -c '.[]| select(.country_iso3166_alpha2 IN($eu)'
I've been reading the docs and looking at the cookbook but it's not making any sense to me. Thanks!
You can use --argjson to pass the list to jq and IN to select the matching entries.
jq -c --argjson eu '["FR", "IT"]' '.[]| select(.country_iso3166_alpha2 | IN($eu[]))' <stuff
Broken out to show the individual parts:
jq -c \
--argjson eu '["FR", "IT"]' \
'.[]| select(.country_iso3166_alpha2 | IN($eu[]))' \
<stuff
invoke jq with compact output
pass in the list of countries as a JSON array named "eu"
select using the IN operator, unpacking $eu[] to get its values
redirect the input file into jq
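To make the assumed input explicit: stuff is expected to hold a JSON array of objects, each with a country_iso3166_alpha2 field. A hypothetical run might look like this:
$ cat stuff
[{"name":"a","country_iso3166_alpha2":"FR"},{"name":"b","country_iso3166_alpha2":"US"}]
$ jq -c --argjson eu '["FR", "IT"]' '.[]| select(.country_iso3166_alpha2 | IN($eu[]))' <stuff
{"name":"a","country_iso3166_alpha2":"FR"}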
Unfortunately, jq does not understand bash arrays.
Things would probably be simplest if you can arrange to have your shell variable be a string representing a JSON array:
eu='["FR", "IT"]'
jq -n --argjson eu "$eu" '$eu'
Or if you prefer:
eu='["FR", "IT"]' jq -n 'env.eu | fromjson'
Another possibility would be to write your bash array into a file, and then slurp the file.
There are also many variants of the above....
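If the list really does start out as a bash array, one hedged variant (eu_codes is just an example name) is to turn it into a JSON array string first and then pass it with --argjson:
eu_codes=(FR IT DE)
eu_json=$(printf '%s\n' "${eu_codes[@]}" | jq -R . | jq -s .)
jq -n --argjson eu "$eu_json" '$eu'
Here jq -R . quotes each line as a JSON string, and jq -s . collects those strings into an array.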
I'm new to zsh scripting and I was wondering if it's possible to use the sha256sum function to encrypt every value in a list.
Here is what I have tried so far:
#!/bin/zsh
filenames=`cat filenames.txt`
output='shaNames.txt'
for name in $filenames
do
echo -n $name | sha256sum >> $output
done
What I'm trying to accomplish is to encrypt every name in the list and append it to a new text file.
Any suggestions on what I am doing wrong are appreciated.
You are assigning the output of cat filenames.txt to a single multiline string variable, so the for loop only loops once, over the entire content.
What you want to do instead is e.g.:
for name in $(cat filenames.txt)
do
echo -n "$name" | sha256sum >> "$output"
done
Note that while you can still use them, backticks are deprecated in favor of $(somecommand).
Also note that you should always put variables in double quotes, as they could contain spaces.
Your method would fail anyway if a line of your text file contained a space.
You could use the following instead:
while IFS= read -r name
do
echo -n "$name" | sha256sum >> "$output"
done < filenames.txt
To anyone who might need the same. What I was doing wrong was assigning the values in the file to a single string variable instead of a list.
To correct that one must use:
filenames=($(cat filenames.txt))
The parentheses indicate that a list (array) is stored in the filenames variable.
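Putting the pieces together, a minimal zsh sketch (file names taken from the question; the (f) expansion flag splits the file content on newlines, so names containing spaces stay intact):
#!/bin/zsh
filenames=(${(f)"$(<filenames.txt)"})
output='shaNames.txt'
for name in "${filenames[@]}"; do
  printf '%s' "$name" | sha256sum >> "$output"
done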
I am trying to run a jq query on a Windows machine, and it prints each extracted value on a separate line:
jq -r .Accounts[].Id
Output
204359864429
224271824096
282276286062
210394168456
090161402717
How do I run the jq query so that it combines the output on a single line, separated by spaces?
This is what I need:
204359864429 224271824096 282276286062 210394168456 090161402717
Any help would be appreciated.
The usual way would be to use the @csv or @tsv operators to convert the result to CSV or tab-separated format. These operators need the result to be contained in an array. Since you want the values delimited by a single space instead, a simple join(" ") will do:
jq -r '[.Accounts[].Id]|join(" ")'
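For example, assuming the Ids are JSON strings and a hypothetical input file accounts.json of the shape {"Accounts":[{"Id":"204359864429"},{"Id":"224271824096"}]}:
jq -r '[.Accounts[].Id]|join(" ")' accounts.json
204359864429 224271824096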
You can use the @sh formatter:
jq -r "[.Accounts[].Id] | @sh"
From the jq docs:
The input is escaped suitable for use in a command-line for a POSIX shell. If the input is an array, the output will be a series of space-separated strings.
Reference:
https://stedolan.github.io/jq/manual/#Basicfilters
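Note the quoting this implies: assuming the Ids are JSON strings (a hypothetical run on the same accounts.json shape as above), each item comes out wrapped in single quotes:
jq -r "[.Accounts[].Id] | @sh" accounts.json
'204359864429' '224271824096'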
At first I thought the join() solution above did not work. Then I realized that I was "overfeeding" the join() filter, causing it to fail because I was providing more than a simple array as input. I had concatenated several filters with , and failed to limit the scope of my join().
Did not work:
jq -r \
'.ansible_facts |
.ansible_hostname,
.ansible_all_ipv4_addresses | join(" "),
.ansible_local."aws-public-ipv4".publicIP'
This gave the error,
jq: error (at <stdin>:0): Cannot iterate over string ("hostone")
because jq was attempting to "consume" not only ansible_all_ipv4_addresses but also the output of the preceding ansible_hostname filter (the comma binds more tightly than the pipe in jq, so the filter parses as (.ansible_hostname, .ansible_all_ipv4_addresses) | join(" "), and join cannot iterate over the hostname string).
Does work:
jq -r \
'.ansible_facts |
.ansible_hostname,
(.ansible_all_ipv4_addresses | join(" ")),
.ansible_local."aws-public-ipv4".publicIP'
Here, I restrict join() to .ansible_all_ipv4_addresses only (ansible_all_ipv4_addresses is an array of IP addresses I wish to translate into a single, space-separated string).
P.S.: I found that the @sh filter produces space-separated output as desired, but in addition delimits each output item in single quotes.
P.P.S.:
Here was my workaround, until I discovered that join() works just as well when used properly (see above):
jq -r '[.Accounts[].Id] | @tsv | sub("\t"; " "; "g")'
Explanation: the @tsv filter produces tab-separated values, then the sub() filter substitutes tabs with spaces, globally.
Using Home Assistant 0.92 to test my CLI for creating automated backups. After a successful backup, the command responds with an output and I need to capture that value. I'm trying to use jq to parse it but only get an error.
$ hassio snapshots new --name="Testbackup"
This gives an output of slug: 07afd144 and I want to capture 07afd144
Tried following:
$ hassio snapshots new --name="Testbackup" | jq --raw-output '.data.slug'
This gives an output of parse error: Invalid numeric literal at line 1, column 5
The final result is planned to be:
slug=$(hassio snapshots new --name="${name}" | jq --raw-output '.data.slug')
where $slug would contain 07afd144.
What am I doing wrong?
jq is a tool for parsing and transforming JSON documents. What you have shown is not legal JSON. It is, however, a legal YAML document and can be transformed with yq. yq uses jq-like syntax, but can handle JSON, YAML, XML, and CSV files.
slug=$(hassio snapshots new --name="${name}" | yq '.slug')
slug: 07afd144 isn't valid JSON and as such cannot be parsed with jq. Furthermore, it doesn't contain a data property anywhere, so .data.slug doesn't make sense.
If the format is always this simple (property name, colon, space, value), then the value can be easily extracted with other common tools generally available on GNU+Linux systems:
cut (different invocations possible):
cut -d' ' -f2-
cut -c7-
awk:
awk '{print $2}'
sed:
sed 's/^slug: //'
perl:
perl -lane 'print $F[1]'
or even grep (different invocations possible):
grep -o '[^ ]*$'
grep -o '[[:xdigit:]]*$'
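Any of these can stand in for the jq call in the final command; for example, a hedged sketch with the awk variant (Testbackup is just the example name from the question):
slug=$(hassio snapshots new --name="Testbackup" | awk '{print $2}')
echo "$slug"   # expected: 07afd144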
I’ve just discovered jq and have been really loving it. One thing I find myself doing a lot, though, is stuff like:
result=$(jq --raw-output '.some | .filters // ""')
if [[ $result ]]; then
foo
else
bar
fi
The default to an empty string seems to play more nicely with bash "truthiness" than e.g. if [[ $result != "null" ]], and --raw-output is usually necessary to store just the resultant string in a variable. My question is: since I'm using these two tweaks so consistently in scripts, is there perhaps a better way to achieve the same functionality? Or would it make sense (as a possible enhancement to jq) to be able to set a couple of env vars to control this behavior for the duration of the script?
You can use the -e flag, which makes jq return exit code 0 if the last output value was neither false nor null, so your logic can become:
result=$(jq -e -r '.some | .filters') && foo || bar
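Spelled out as an if/else, a minimal sketch (config.json is a hypothetical input file; foo and bar are the placeholders from the question):
if result=$(jq -e -r '.some | .filters' config.json); then
  foo   # .filters exists and is neither false nor null
else
  bar
fi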