select current_timestamp(0) ;
Current TimeStamp(0)
2021-08-30 17:29:30+00:00
The actual CDT time now is 2021-08-30 17:29:30, but the timezone is shown as GMT (+00:00) instead of CDT (-05:00). How can this be fixed?
Related
I have a collection with the following item, and one of its properties is a date:
id: xxxxxx
name: xxxx
date: August 21, 2018 at 1:00:00 AM UTC+8 (timestamp)
Inside a Firebase Cloud Function I am trying to query all objects from a period; this period can be the day, the week, the year, etc.
I want to query the items by the current server day, so I do this in a Firebase Cloud Function:
let auxDate = moment();
dateStart = auxDate.startOf('day').toDate();
dateEnd = auxDate.endOf('day').toDate();
await admin.firestore().collection('items')
.where("date", ">=", dateStart)
.where('date', '<=', dateEnd).get();
The values of dateStart and dateEnd printed in the console are:
Date start: Tue Aug 21 2018 00:00:00 GMT+0000 (UTC)
Date end: Tue Aug 21 2018 23:59:59 GMT+0000 (UTC)
And the query returns 0 items. But when I change the date of the item to
id: xxxxxx
name: xxxx
date: August 21, 2018 at 8:00:00 AM UTC+8 (timestamp)
The query returns the item correctly.
So now I know the problem is about the offset, but how can I fix it? And why does Firebase save all dates in UTC+8?
I found the answer to my own question. I will explain in detail.
Understanding the problem:
The first thing to remember is that we are doing this query inside a Cloud Function, so we do not know the client's timezone. In Firestore all dates are saved in UTC+00. So if I have an item like:
id: xxxxxx
name: xxxx
date: August 22, 2018 at 12:00:00 AM UTC+8 (timestamp)
this date is equal to August 21, 2018 at 04:00:00 PM UTC+0.
So if I want to query all dates between:
Date start: August 22, 2018 at 12:00:00 AM UTC+8 (timestamp)
Date end: August 22, 2018 at 11:59:59 PM UTC+8 (timestamp)
This means the query MUST be:
Date start: August 21, 2018 at 04:00:00 PM UTC+0 (timestamp)
Date end: August 22, 2018 at 03:59:59 PM UTC+0 (timestamp)
So the problem is that when I do let auxDate = moment(), this creates a date with the server time and no offset, so the value of auxDate is, for example: August 21, 2018 at 11:23:43 PM UTC+0.
The start and end dates derived from auxDate then become:
auxDate start: August 21, 2018 at 12:00:00 AM UTC+0 (timestamp)
auxDate end: August 21, 2018 at 11:59:59 PM UTC+0 (timestamp)
As we can see, this range does not match the MUST-be query above, so many items will be left out.
Problem conclusion:
We cannot query correctly if we do not know the time zone of the client.
Solution:
Now I pass the client's current time to the Cloud Function and calculate the start and end dates from it:
const clientCurrentDateTime = "2018-08-22T11:23:15+08:00";
let startDate = moment(clientCurrentDateTime).utcOffset(clientCurrentDateTime).startOf('day').toDate();
let endDate = moment(clientCurrentDateTime).utcOffset(clientCurrentDateTime).endOf('day').toDate();
This creates the correct start and end date-times for different offsets/timezones.
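For reference, here is a minimal sketch of how the pieces fit together inside the function. The wrapper name getItemsForClientDay and the require calls are mine, not part of the original code:
const admin = require('firebase-admin');
const moment = require('moment');
// admin.initializeApp() is assumed to have been called elsewhere.

// Hypothetical helper: the client sends its current local time as an
// ISO-8601 string with offset, e.g. "2018-08-22T11:23:15+08:00".
async function getItemsForClientDay(clientCurrentDateTime) {
  // Keep the client's offset so startOf/endOf operate on the client's calendar day.
  const dateStart = moment(clientCurrentDateTime)
    .utcOffset(clientCurrentDateTime)
    .startOf('day')
    .toDate();
  const dateEnd = moment(clientCurrentDateTime)
    .utcOffset(clientCurrentDateTime)
    .endOf('day')
    .toDate();

  // toDate() yields absolute instants, so the comparison against the
  // UTC-stored Firestore timestamps now lines up with the client's day.
  return admin.firestore().collection('items')
    .where('date', '>=', dateStart)
    .where('date', '<=', dateEnd)
    .get();
}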
I work in DBeaver. I have a table x.
Table x has a column "timestamp" with values like:
1464800406459
1464800400452
1464800414056
1464800422854
1464800411797
The result I want:
Wed, 01 Jun 2016 17:00:06.459 GMT
Wed, 01 Jun 2016 17:00:00.452 GMT
Wed, 01 Jun 2016 17:00:14.056 GMT
Wed, 01 Jun 2016 17:00:22.854 GMT
Wed, 01 Jun 2016 17:00:11.797 GMT
I tried this Redshift query:
SELECT FROM_UNIXTIME(x.timestamp) as x_date_time
FROM x
but it didn't work.
Error occurred:
Invalid operation: function from_unixtime(character varying) does not exist
I also tried
SELECT DATE_FORMAT(x.timestamp, '%d/%m/%Y') as x_date
FROM x
Error occurred:
Invalid operation: function date_format(character varying, "unknown") does not exist
Is there anything wrong with the syntax? Or is there another way to convert these to a human-readable date and time?
Redshift doesn't have a from_unixtime() function. You'll need to use the SQL below to get the timestamp. It just adds the number of seconds to the epoch and returns the result as a timestamp.
select timestamp 'epoch' + your_timestamp_column * interval '1 second' AS your_column_alias
from your_table
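As a hedged follow-up for the table in the question: the error messages show the column is character varying and the sample values look like epoch milliseconds, so applied there the query would presumably need a cast and a division by 1000, something like:
-- assumes x."timestamp" holds epoch milliseconds stored as text;
-- integer division drops the millisecond fraction
select timestamp 'epoch' + (x."timestamp"::bigint / 1000) * interval '1 second' as x_date_time
from x;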
A UDF is going to be pretty slow. I checked the execution time of 3 solutions over 1k rows.
The slowest -
-- using UDF from one of the answers
SELECT from_unixtime(column_with_time_in_ms/ 1000)
FROM table_name LIMIT 1000;
Execution time: 00:00:02.348062s
2nd best -
SELECT date_add('ms',column_with_time_in_ms,'1970-01-01')
FROM table_name LIMIT 1000;
Execution time: 00:00:01.112831s
And the fastest -
SELECT TIMESTAMP 'epoch' + column_with_time_in_ms / 1000 * INTERVAL '1 second'
FROM table_name LIMIT 1000;
Execution time: 00:00:00.095102s
Execution time calculated from stl_query -
SELECT *
,endtime - starttime
FROM stl_query
WHERE querytxt ilike('%table_name%limit%')
ORDER BY starttime DESC;
The simplest solution is to create a from_unixtime() function:
CREATE OR REPLACE FUNCTION from_unixtime(epoch BIGINT)
RETURNS TIMESTAMP AS
'import datetime
return datetime.datetime.fromtimestamp(epoch)
'
LANGUAGE plpythonu IMMUTABLE;
See the Redshift documentation on UDFs for details.
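A quick usage sketch against the table from the question (an assumption on my part: the varchar column holds epoch milliseconds, hence the cast and the division by 1000, since fromtimestamp() expects seconds):
-- the UDF takes seconds, so divide the millisecond values by 1000
SELECT from_unixtime("timestamp"::bigint / 1000) AS x_date_time
FROM x;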
For quick reference, here is the SQL UDF implementation of the from_unixtime function shown above in Python. I've not tested the performance but I imagine it would be similar to the plain SQL version. It's a whole lot easier to write though.
Note: the body divides the input by 1000, so it expects milliseconds since the epoch and converts them to whole seconds (the fractional part is dropped).
CREATE FUNCTION from_unixtime (BIGINT)
RETURNS TIMESTAMP WITHOUT TIME ZONE
IMMUTABLE
as $$
SELECT TIMESTAMP 'epoch' + $1 / 1000 * interval '1 second'
$$ LANGUAGE sql;
I used it like this:
CAST(DATEADD(S, CONVERT(int,LEFT(column_name, 10)), '1970-01-01')as timestamp) as column_name
SELECT
    task_id
    ,CAST(DATEADD(s, CONVERT(int, LEFT(SLA, 10)), '1970-01-01') AS timestamp) AS SLA
FROM my_schema.my_task_table;
How can I convert a string in Indian Standard Time format to an Oracle date?
E.g.:
Mon May 23 2016 00:00:00 GMT+0530 (India Standard Time)
Required format:
23-May-16
This parses the input string and returns a date value to you:
select cast(to_timestamp_tz('Mon May 23 2016 00:00:00 GMT+0530', 'Dy Mon DD YYYY HH24:MI:SS "GMT"TZHTZM') as date) as converted_to_date_value
from dual;
This parses the input string to a "timestamp with time zone" value and formats the value back to a string in your desired format:
select to_char(to_timestamp_tz('Mon May 23 2016 00:00:00 GMT+0530', 'Dy Mon DD YYYY HH24:MI:SS "GMT"TZHTZM'), 'DD-Mon-RR') as converted_to_your_format
from dual;
Enjoy!
Footnote: please note that there's no such thing as the "Oracle date format" you refer to. Oracle has its DATE data type, which can have many different display interpretations depending on your client as well as server locale settings.
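To illustrate that footnote, one way to influence how a plain DATE value displays is the session-level NLS_DATE_FORMAT setting; a sketch (clients such as SQL Developer or DBeaver may still apply their own formatting on top):
ALTER SESSION SET NLS_DATE_FORMAT = 'DD-Mon-RR';

SELECT cast(to_timestamp_tz('Mon May 23 2016 00:00:00 GMT+0530', 'Dy Mon DD YYYY HH24:MI:SS "GMT"TZHTZM') as date) as converted_to_date_value
FROM dual;
-- the DATE value now displays as 23-May-16 in clients that honour the NLS settings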
ZPUBLICATIONDATETIME is of type TIMESTAMP.
So when I do this:
SELECT strftime('%d - %m - %Y ', datetime(ZPUBLICATIONDATETIME, 'unixepoch')) FROM ZTNNEWS;
I get 26 - 05 - 1984 instead of 2015. iOS (Core Data) stores datetimes relative to 1 Jan 2001. What is the best approach to get the right date conversion?
Shall I just add 31 years to it, or is there an alternative to unixepoch to put in there?
Essentially what I am trying to do is to get the records from past two days:
select *
from ZTNNEWS
where DATETIME(ZPUBLICATIONDATETIME) > DATETIME('now', '-2 day')
But because ZPUBLICATIONDATETIME is of type TIMESTAMP rather than Datetime, it doesn't output anything.
Any advice please?
Just had the same problem. Adding 978307200, the Unix timestamp of 1 Jan 2001, to the stored value seems to do the job:
SELECT * FROM ZTNNEWS WHERE DATETIME(ZPUBLICATIONDATETIME + 978307200, 'unixepoch') > DATETIME('now', '-2 day');
I used Ruby to get the seconds of 1 Jan 2001:
$ irb
2.2.3 :001 > require "time"
=> true
2.2.3 :002 > Time.parse( "2001-01-01T00:00:00Z").to_i
=> 978307200
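Applying the same offset to the strftime query from the question (a sketch, assuming ZPUBLICATIONDATETIME holds seconds since 1 Jan 2001):
-- shift the Core Data reference date (2001-01-01) onto the Unix epoch, then format
SELECT strftime('%d - %m - %Y', datetime(ZPUBLICATIONDATETIME + 978307200, 'unixepoch')) FROM ZTNNEWS;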
I am using the Joda-Time 2 library to create a date object with a given timezone, as follows:
import java.util.Date;

import org.joda.time.DateTime;
import org.joda.time.DateTimeZone;
DateTimeZone tz = DateTimeZone.forID("America/New_York");
System.out.println("timezone=" + tz);
Date d = new DateTime(2013, 1, 1, 0, 0, tz).toDate();
System.out.println("Cur Date = " + d);
However, when I print this date, the timezone reported is CST. What am I missing?
timezone=America/New_York
Cur Date = Tue Jan 01 13:00:00 CST 2013
You're printing out the value of a Date object. Date doesn't have a time zone - Date.toString() always just uses the "default" time zone. A Date is just a number of milliseconds since the Unix epoch; it doesn't know about calendars or time zones.
You should either just stick within the Joda Time world, or (if you must) use a SimpleDateFormat to convert a Date to a String - you can set the time zone on the formatter.
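For example, roughly along these lines (a sketch; the class name and formatter pattern are just illustrative):
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

import org.joda.time.DateTime;
import org.joda.time.DateTimeZone;

public class ZoneDemo {
    public static void main(String[] args) {
        DateTimeZone tz = DateTimeZone.forID("America/New_York");
        DateTime dt = new DateTime(2013, 1, 1, 0, 0, tz);

        // Option 1: stay in Joda Time - DateTime carries its zone, so printing it shows the -05:00 offset
        System.out.println("Joda DateTime = " + dt);

        // Option 2: if a java.util.Date is required, format it with an explicit zone
        Date d = dt.toDate();
        SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss zzz");
        sdf.setTimeZone(TimeZone.getTimeZone("America/New_York"));
        System.out.println("Formatted Date = " + sdf.format(d));
    }
}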