How to use jq to convert seconds since Unix epoch to a time string in human readable format but adjusted to Sydney, Australia time zone?
I tried this filter:
now | strftime("%Y-%m-%dT%H:%M:%SZ")
But I don't know how to adjust the time format string to convey Sydney, Australia time zone.
Possibly I need to replace "Z" with the relevant time zone?
Both of the following convert to the time zone indicated by the TZ environment variable:
localtime | strftime(...)
strflocaltime(...)
For example,
$ jq -nr 'now | strftime("%FT%T")'
2022-02-14T06:14:07
$ jq -nr 'now | gmtime | strftime("%FT%T")'
2022-02-14T06:14:07
$ jq -nr 'now | localtime | strftime("%FT%T")'
2022-02-14T02:14:07
$ jq -nr 'now | strflocaltime("%FT%T")'
2022-02-14T02:14:07
That uses your local time, as determined by the TZ environment variable. Adjust as needed.
$ TZ=America/Halifax jq -nr 'now | strflocaltime("%FT%T")'
2022-02-14T02:14:07
$ TZ=America/Toronto jq -nr 'now | strflocaltime("%FT%T")'
2022-02-14T01:14:07
$ TZ=America/Vancouver jq -nr 'now | strflocaltime("%FT%T")'
2022-02-14T22:14:07
If you want to convert to different time zones in a single run of jq, you're out of luck: jq doesn't support converting to or from time zones other than UTC and the local time zone selected by TZ.
Tested with both 1.5 and 1.6.
If you are using jq 1.6 or later, use strflocaltime instead of strftime; it formats the time in your local timezone.
jq -n 'now | strflocaltime("%Y-%m-%dT%H:%M:%S")'
From the manual:
The strftime(fmt) builtin formats a time (GMT) with the given format. The strflocaltime does the same, but using the local timezone setting.
If your timezone is different, set the shell's TZ variable accordingly before calling jq:
TZ=Australia/Sydney jq -n 'now | strflocaltime("%Y-%m-%dT%H:%M:%S")'
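To convert a specific epoch value rather than now, pipe the number into strflocaltime the same way (a sketch with a made-up sample timestamp; jq accepts a plain number here):

```shell
# 1634590800 is 2021-10-18T21:00:00 UTC; Sydney is on AEDT (UTC+11) that day
TZ=Australia/Sydney jq -rn '1634590800 | strflocaltime("%Y-%m-%dT%H:%M:%S")'
# → 2021-10-19T08:00:00
```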
If I do:
$ jq -cn 'now | localtime'
[2022,3,12,21,9,29.65448808670044,2,101]
It correctly gives the "broken down time" representation of current local time.
But if I do:
$ jq -cn 'now | localtime | mktime | localtime'
[2022,3,13,7,10,36,3,102]
It gives back a broken-down time representation that differs from the current local time.
I think that when the output of localtime is converted back to seconds since the Unix epoch by mktime, the conversion goes wrong because mktime wrongly assumes the input is GMT?
If I do:
$ jq -cn 'now | gmtime | mktime | localtime'
Now this gives the correct result (the broken-down time representation of the current local time).
Am I correct? Thanks.
Yes.
From the jq docs:
The mktime builtin consumes "broken down time" representations of time output by gmtime and strptime.
You originally passed a local time, but it expects a UTC time. As you surmised, this is why your original code failed and the latter code worked. jq's mktime is the inverse of gmtime.[1]
$ jq -rn 'now | ., ( gmtime | mktime )'
1649770973.430903
1649770973
jq does not appear to provide a means to convert from a local time to epoch time.[2]
[1] This differs from the behaviour of C's mktime, which expects a local time, making it the inverse of localtime.
[2] In C, mktime can serve both roles: while it normally converts from local time, it can also convert from UTC by setting the local time zone to UTC.
I want to be able to grab the valid time from a GRIB2 file.
$ gdalinfo this_file_20211018_1300.grib2
Output:
....*a bunch of stuff I don't really need*...
....
.....
*what i actually need*
GRIB_VALID_TIME= 1634590800 sec UTC
So it is in UTC seconds.
I want to convert 1634590800 to a date format that looks like this:
this_file_20211018_1300.grib2
Can I use gdalinfo or some other Linux utility to extract just the valid time from a GRIB2 file?
I found a way by grepping for the metadata I need (note: date -d needs @ before an epoch value):
TEMP=`gdalinfo this_pngfile_20211018_1300.grib2 | grep -i GRIB_VALID_TIME -m 1`
IFS=' '
read -a strarr <<< "$TEMP"
EPOCHTIME="${strarr[1]}"
DATEU=`date -u -d "@$EPOCHTIME" +%Y%m%d_%H%M`
echo "$DATEU.png"
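For what it's worth, the grep/read steps can be collapsed into a single awk call (a sketch assuming GNU date and the GRIB_VALID_TIME line shown above):

```shell
# Pull the second word of the GRIB_VALID_TIME line and format it as UTC
EPOCHTIME=$(gdalinfo this_file_20211018_1300.grib2 | awk '/GRIB_VALID_TIME/ {print $2; exit}')
date -u -d "@$EPOCHTIME" +%Y%m%d_%H%M
```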
You can use gdalinfo with jq:
gdalinfo -json gfs.t00z.pgrb2.0p25.f001 | jq -r '.bands | .[0].metadata."".GRIB_VALID_TIME'
1661907600
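Since jq is already in the pipeline, it can also do the epoch-to-string formatting itself, so no shell plumbing is needed (a sketch using the same file name as above):

```shell
gdalinfo -json gfs.t00z.pgrb2.0p25.f001 \
  | jq -r '.bands[0].metadata."".GRIB_VALID_TIME | tonumber | gmtime | strftime("%Y%m%d_%H%M")'
```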
I know how to get the most used shell commands in zsh with
history 1 | awk '{$1="";print substr($0,2)}' | sort | uniq -c | sort -n | tail -n 20
but is there a way to restrict myself to let's say the last two or three months?
I need this because I would like to create aliases for the commands I am currently using most.
history in zsh has several flags to show the date and time stamp. For this to work, you have to add setopt extended_history to your .zshrc file.
If you have extended_history enabled, history -i will show full date-time stamps in ISO 8601 'yyyy-mm-dd hh:mm' format. Dates in this format can be compared as strings, so just change your awk script to select only the lines after some date.
history -i 1 | awk '{ if ($2 >= "2020-05-01") { $1=$2=$3="";print $0; } }' | sort | uniq -c | sort -n -r | head -n 20
Be aware that if you have HIST_IGNORE_ALL_DUPS or HIST_IGNORE_DUPS options enabled, this will not work as intended.
You can also use the date command to compute the cutoff date automatically.
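For example, with GNU date the cutoff can be computed relative to today instead of being hardcoded (history -i is the zsh builtin from above):

```shell
# Select only history entries from the last two months
cutoff=$(date -d '2 months ago' +%Y-%m-%d)
history -i 1 | awk -v d="$cutoff" '$2 >= d { $1=$2=$3=""; print }' | sort | uniq -c | sort -rn | head -n 20
```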
How to add 1 hour to a Unix timestamp
date +%Y%m%d_%H%M
I need to add 1 hour and print it in the above format.
A literal answer to your question is to use
date --date="next hour" +%Y%m%d_%H%M
but I'm guessing that you actually want to display the time in another timezone. To display in UTC:
date --utc +%Y%m%d_%H%M
and in another timezone, e.g.
TZ="Europe/Stockholm" date +%Y%m%d_%H%M
assuming of course that the system clock is set up correctly.
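The same relative-item syntax also works on an arbitrary timestamp, not just the current time (GNU date; the timestamp is a made-up example):

```shell
date -u -d '2021-10-18 13:00 UTC + 1 hour' +%Y%m%d_%H%M
# → 20211018_1400
```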
GNU date makes life easy. Without GNU date you can manipulate your timezone:
echo "$(TZ=GMT+1 date +%Y%m%d_%H%M)"
Be careful with Daylight Saving Time.
You can use this trick when you want to get the date (without time) of yesterday. Just adding 24 hours to the timezone offset can give problems during Daylight Saving Time. You can use a trick to find yesterday:
You are sure that yesterday is either 20 or 30 hours ago. Which one? The most recent one that is not today.
echo "$(TZ=GMT+30 date +%Y-%m-%d)\n$(TZ=GMT+20 date +%Y-%m-%d)" |
grep -v $(date +%Y-%m-%d) | tail -1
The above command is for ksh. When you use bash, you want echo -e.
Or use printf:
printf "%s\n%s\n" "$(TZ=GMT+30 date +%Y-%m-%d)" "$(TZ=GMT+20 date +%Y-%m-%d)" |
grep -v $(date +%Y-%m-%d) | tail -1
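For comparison, with GNU date the whole yesterday trick collapses to a single relative item:

```shell
date -d yesterday +%Y-%m-%d
```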
I am getting data from the server in a file (in format1) every day; however, each file contains the data for the last week.
I have to archive the data for exactly 1.5 months, because this data is used to build some graphical representations.
I have tried to merge the files of two days and sort them uniquely (code1), but it didn't work because the name of the raw file changes every day. The timestamp in this file is unique, but I am not sure how to keep only the unique data based on a specific column. Also, is there any way to delete the data older than 1.5 months?
For deletion, the logic I thought of is today's date minus the least date in the file, but I am unable to fetch the least date.
Format1
r01/WAS2/oss_change0_5.log:2016-03-21T11:13:36.354+0000 | (307,868,305) | OSS_CHANGE |
com.nokia.oss.configurator.rac.provisioningservices.util.Log.logAuditSuccessWithResources | RACPRS RNC 6.0 or
newer_Direct_Activation: LOCKING SUCCEEDED audit[ | Source='Server' | User identity='vpaineni' | Operation
identifier='CMNetworkMOWriterLocking' | Success code='T' | Cause code='N/A' | Identifier='SUCCESS' | Target element='PLMN-
PLMN/RNC-199/WBTS-720' | Client address='10.7.80.21' | Source session identifier='' | Target session identifier='' |
Category code='' | Effect code='' | Network Transaction identifier='' | Source user identity='' | Target user identity='' |
Timestamp='1458558816354']
Code1
cat file1 file2 |sort -u > file3
Data on day 2; the input file name differs:
r01/WAS2/oss_change0_11.log:2016-03-21T11:13:36.354+0000 | (307,868,305) | OSS_CHANGE |
com.nokia.oss.configurator.rac.provisioningservices.util.Log.logAuditSuccessWithResources | RACPRS RNC 6.0 or
newer_Direct_Activation: LOCKING SUCCEEDED audit[ | Source='Server' | User identity='vpaineni' | Operation
identifier='CMNetworkMOWriterLocking' | Success code='T' | Cause code='N/A' | Identifier='SUCCESS' | Target element='PLMN-
PLMN/RNC-199/WBTS-720' | Client address='10.7.80.21' | Source session identifier='' | Target session identifier='' |
Category code='' | Effect code='' | Network Transaction identifier='' | Source user identity='' | Target user identity='' |
Timestamp='1458558816354']
I wrote an almost similar kind of code a week back.
Awk is a good tool if you want to do any operation column-wise.
Also, sort -u will not work here, as the file name prefix changes every day.
Both the unique rows and the least date can be found using awk.
1. To get the unique file content:
cat file1 file2 | awk -F "\|" '!repeat[$21]++' > file3
Here -F specifies your field separator.
repeat[$21] keys on the 21st field, the timestamp, so only the first occurrence of each timestamp is printed and the rest are ignored.
So, finally, the unique content of file1 and file2 will be available in file3.
2. To get the least date and find the difference between two dates:
Least_Date=`awk -F: '{print substr($2,1,10)}' RMCR10.log | sort | head -1`
Today_Date=`date +%F`
Diff=`echo "( \`date -d $Today_Date +%s\` - \`date -d $Least_Date +%s\` ) / (24*3600)" | bc -l`
Diff1=${Diff/.*}
if [ "$Diff1" -ge "45" ]   # 1.5 months is about 45 days
then
    : # delete or archive the old data here
fi
Here we have used : as the field separator, then a substring to get the exact date field, then sorting and taking the first entry to find the least value.
We subtract the least date from today's date as epoch seconds using the binary calculator (bc) and then strip the decimals.
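The bc step can also be avoided entirely with shell integer arithmetic on epoch seconds (a sketch assuming GNU date; the dates here are hypothetical examples, and -u sidesteps DST off-by-one errors):

```shell
# Whole days between two dates via epoch seconds (86400 s per day)
Least_Date=2016-03-21                       # e.g. the least date found in the log
Today_Date=2016-05-05                       # stands in for $(date +%F)
Diff=$(( ( $(date -u -d "$Today_Date" +%s) - $(date -u -d "$Least_Date" +%s) ) / 86400 ))
echo "$Diff"
# → 45
```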
Hope it helps.