jq combine output on a single line separated by space

I am trying to run a jq query on a Windows machine, and it prints each extracted value on a separate line.
jq -r .Accounts[].Id
Output
204359864429
224271824096
282276286062
210394168456
090161402717
How do I run the jq query so that it combines the output on a single line, separated by spaces?
This is what I need:
204359864429 224271824096 282276286062 210394168456 090161402717
Any help would be appreciated.

The usual way would be to use the @csv or @tsv operators to convert the result to CSV or tab-delimited format. These operators need the result to be contained in an array. Since you want a single space as the delimiter, we can do a simple join(" ") operation:
jq -r '[.Accounts[].Id]|join(" ")'
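For example, with the questioner's data saved in a hypothetical file accounts.json, this yields:
$ jq -r '[.Accounts[].Id]|join(" ")' accounts.json   # accounts.json is an assumed file name
204359864429 224271824096 282276286062 210394168456 090161402717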

You can use the @sh formatter (collecting the Ids into an array so the output is a single space-separated line):
jq -r "[.Accounts[].Id] | @sh"
From the jq docs:
The input is escaped suitable for use in a command-line for a POSIX shell. If the input is an array, the output will be a series of space-separated strings.
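For illustration, assuming the Ids are JSON strings, the command above would print one line with each item single-quoted:
'204359864429' '224271824096' '282276286062' '210394168456' '090161402717'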
Reference:
https://stedolan.github.io/jq/manual/#Basicfilters

At first I thought the join() solution above did not work. Then I realized that I was "overfeeding" the join() filter, causing it to fail because I was providing more than a simple array as input. I had concatenated several filters with , and failed to limit the scope of my join().
Did not work:
jq -r \
'.ansible_facts |
.ansible_hostname,
.ansible_all_ipv4_addresses | join(" "),
.ansible_local."aws-public-ipv4".publicIP'
This gave the error,
jq: error (at <stdin>:0): Cannot iterate over string ("hostone")
because in jq the comma operator binds more tightly than the pipe, so the expression was parsed as (.ansible_hostname, .ansible_all_ipv4_addresses) | join(" "), and jq attempted to feed the hostname string into join() as well, not just ansible_all_ipv4_addresses.
Does work:
jq -r \
'.ansible_facts |
.ansible_hostname,
(.ansible_all_ipv4_addresses | join(" ")),
.ansible_local."aws-public-ipv4".publicIP'
Here, I restrict join() to .ansible_all_ipv4_addresses only (ansible_all_ipv4_addresses is an array of IP addresses I wish to translate into a single, space-separated string).
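To make this concrete, here is a minimal reproduction with made-up values (hostone comes from the error message above; the addresses and public IP are invented):
$ echo '{"ansible_facts":{"ansible_hostname":"hostone","ansible_all_ipv4_addresses":["10.0.0.1","10.0.0.2"],"ansible_local":{"aws-public-ipv4":{"publicIP":"203.0.113.7"}}}}' | jq -r '.ansible_facts | .ansible_hostname, (.ansible_all_ipv4_addresses | join(" ")), .ansible_local."aws-public-ipv4".publicIP'
hostone
10.0.0.1 10.0.0.2
203.0.113.7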
P.S.: I found that the @sh filter produces space-separated output as desired, but in addition delimits each output item in single quotes.
P.P.S.:
Here was my workaround, until I discovered that join() works just as well when used properly (see above):
jq -r '[.Accounts[].Id] | @tsv | sub("\t";" ";"g")'
Explanation: the @tsv filter produces tab-separated values (it also requires an array input, hence the [...]), then the sub() filter substitutes tabs with spaces, globally.

Related

How to encrypt every name in a list using a for loop in zsh scripting

I'm new to zsh scripting and I was wondering if it's possible to use the sha256sum function to encrypt every value in a list.
Here is what I have tried so far:
#!/bin/zsh
filenames=`cat filenames.txt`
output='shaNames.txt'
for name in $filenames
do
echo -n $name | sha256sum >> $output
done
What I'm trying to accomplish is to encrypt every name in the list and append it to a new text file.
Any suggestions on what I am doing wrong are appreciated.
You are assigning the output of cat filenames.txt to a multiline string variable. The for loop will then only loop once, over the entire content.
What you want to do instead is e.g.:
for name in $(cat filenames.txt)
do
echo -n "$name" | sha256sum >> "$output"
done
Note that while you can still use them, backticks are deprecated in favor of $(somecommand).
Also note that you should always put variables in double quotes, as they could contain spaces.
Your method would fail anyway if a line of your text file contained a space.
You could use the following instead:
while read name
do
echo -n "$name" | sha256sum >> "$output"
done < filenames.txt
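A slightly more defensive variant of the same loop clears IFS and passes -r to read, so that leading whitespace and backslashes in a name survive unchanged:
while IFS= read -r name   # IFS= keeps surrounding whitespace, -r keeps backslashes literal
do
echo -n "$name" | sha256sum >> "$output"
done < filenames.txt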
To anyone who might need the same: what I was doing wrong was assigning the values in the file to a single string variable instead of a list.
To correct that one must use:
filenames=(`cat filenames.txt`)
The parentheses indicate that a list or array is stored in the filenames variable.
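In zsh specifically, a quoting-safe alternative (a sketch, not from the original answer) uses the (f) expansion flag to split the file content on newlines only:
filenames=(${(f)"$(<filenames.txt)"})   # (f) splits on newlines; $(<file) reads the file without cat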

How to determine the last part of URL with jq?

I have to distinguish between the following two paths.
shorter: https://www.example.com/
longer: https://www.example.com/foo/
In a Bash script, using Bash parameter expansion as follows returns only the longer one.
$ url1=https://www.example.com/
$ url2=https://www.example.com/foo/
$ cut -d/ -f4 <<<${url1%/*} # this returns nothing
$
$ cut -d/ -f4 <<<${url2%/*} # this returns the last part of the path
foo
So the longer one can be identified in a Bash script, but now I have to define the same filter for a JSON value handled in jq.
If jq could be written like the following, my goal would be achieved...
jq '. | select( .url | (cut -d/ -f4 <<< ${url2%/*})!=null) )'
But jq cannot do that. How can I achieve it?
jq has many string-handling functions -- one could do worse than checking the jq manual. For the task at hand, using a regex function would probably be best, but since you mentioned cut -d/ -f4, it might be of interest to note that much the same effect can be achieved by:
split("/")[3]
For the last non-trivial part you could consider:
sub("/ *$";"") | split("/")[-1]

jq in CLI creates an error when I want to parse the output

Using Home Assistant 0.92 to test my CLI for creating automated backups. After a successful backup, the command responds with an output and I need to catch that value. I'm trying to use jq to parse it but only get an error.
$ hassio snapshots new --name="Testbackup"
This gives an output of slug: 07afd144 and I want to catch 07afd144
Tried following:
$ hassio snapshots new --name="Testbackup" | jq --raw-output '.data.slug'
This gives an output of parse error: Invalid numeric literal at line 1, column 5
The final result is planned to be:
slug=$(hassio snapshots new --name="${name}" | jq --raw-output '.data.slug')
where ${slug} should end up as 07afd144
What am I doing wrong?
jq is a tool for parsing and transforming JSON documents. What you have shown is not legal JSON. It is however a legal YAML document and can be transformed with yq. yq uses jq-like syntax, but can handle JSON, YAML, XML, and CSV files.
slug=$(hassio snapshots new --name="${name}" | yq '.slug')
slug: 07afd144 isn't valid JSON and as such cannot be parsed with jq. Furthermore, it doesn't contain a data property anywhere, so .data.slug doesn't make sense.
If the format is always this simple (property name, colon, space, value), then the value can be easily extracted with other common tools generally available on GNU+Linux systems:
cut (different invocations possible):
cut -d' ' -f2-
cut -c7-
awk:
awk '{print $2}'
sed:
sed 's/^slug: //'
perl:
perl -lane 'print $F[1]'
or even grep (different invocations possible):
grep -o '[^ ]*$'
grep -o '[[:xdigit:]]*$'
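Putting one of these together with the assignment planned in the question (awk chosen here arbitrarily):
slug=$(hassio snapshots new --name="${name}" | awk '{print $2}')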

jq doesn't work with keys which contain a dash when the key comes from a variable

If you'd like jq to escape dashes, you need to put your key between square brackets, like this:
jq '.["key-key"]'
and apart from that, if you'd like to include a variable in jq, you need to use double quotes instead of single quotes:
jq "."${var[i+1]}""
but my variable contains a dash in it, and in this case I've tried to merge the two examples above, but it didn't work:
var=key-key
jq ".["${var[i+1]}"]."key""
How can I get this to work?
Update:
This is the final script, which I forgot to mention:
declare -a var=(
"key-key"
"key2-key2"
"key3-key3"
)
for ((i=0; i<${#var[@]}; i++)); do
curl -s "url" | jq ".["${var}"]."something""
done
To have double quotes in a jq command you've enclosed in double quotes, you'd escape them with a backslash:
jq ".[\"key-key\"]"
Another problem with your final command is that ${var[i+1]} expands to the empty string, because this syntax is used to index elements of an array, and you previously defined var as a simple string.
A better way to work with variables in jq commands is to define them through the --arg name value option, after which you can refer to them with $foo in a single-quote enclosed command:
jq --arg keyName key-key '.[$keyName]'
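A quick demonstration with a made-up document:
$ echo '{"key-key":{"key":1}}' | jq --arg keyName key-key '.[$keyName].key'
1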
To fix the code included in the update, I would use the following :
declare -a var=(
"key-key"
"key2-key2"
"key3-key3"
)
json=$(curl -s "url")
for searchedKey in "${var[@]}"; do
echo "$json" | jq --arg keyName "$searchedKey" '.[$keyName].something'
done

programmatic grep command output

Is there a way to get XML or equivalent output from the grep command that can be passed on to other programs?
For example, grep can give the file names, line numbers and context of the pattern matched.
Filename and line number extraction can be done using some split command with the delimiter ':'. However, if the filename contains a ':' character (I know it is weird, but there is a possibility), it would need a lot more processing.
With context (the grep -C option), it becomes even more complex. If the contexts of two matches overlap, grep merges the output, and it will be difficult to separate them.
So I am wondering if grep command can simply generate an XML or JSON like output that other programs can just load.
There is an option -Z (--null) to grep which produces unambiguous output by using NUL characters: each file name is followed by a zero byte instead of the usual colon.
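A sketch of how that NUL-delimited output can be consumed from a shell (pattern and paths are placeholders):
grep -rnZ 'pattern' . | while IFS= read -r -d '' filename && IFS= read -r rest; do
  lineno=${rest%%:*}    # with -n, the line number precedes the first remaining colon
  match=${rest#*:}      # everything after that colon is the matched line
  printf 'file=%s line=%s text=%s\n' "$filename" "$lineno" "$match"
done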
