Convert an epoch timestamp to datetime in Azure Data Factory

I'm working with a data flow in Azure Data Factory and I'm trying to convert an epoch-formatted timestamp to a date.
The value of the timestamp is '1574067907751' and I tried these expressions:
toDate(toTimestamp(1574067907751*1000l))
or
toDate(toTimestamp(toInteger('1574067907751')*1000l,'yyyy-MM-dd HH:mm:ss'))
Is there any other way to do that?

https://learn.microsoft.com/en-us/azure/data-factory/concepts-data-flow-expression-builder#convert-to-dates-or-timestamps
"To convert milliseconds from epoch to a date or timestamp, use toTimestamp(). If time is coming in seconds, multiply by 1,000.
toTimestamp(1574127407*1000l)
The trailing "l" at the end of the previous expression signifies conversion to a long type as inline syntax."

Related

ZonedDateTime is returning same String for two different Timezone

I am trying to format the current timestamp in two different time zones, but ZonedDateTime's format is returning the same output.
DateTimeFormatter outputFormat = new DateTimeFormatterBuilder().appendPattern("ddMMMyyyy HH:mm:ss").toFormatter();
System.out.println(ZonedDateTime.of(LocalDateTime.now(), ZoneId.of("America/Phoenix")).format(outputFormat));
System.out.println(ZonedDateTime.of(LocalDateTime.now(), ZoneId.of("Asia/Calcutta")).format(outputFormat));
The above code snippet returns the same timestamp twice:
16Apr2022 13:28:19
16Apr2022 13:28:19
Can someone help me understand why it's returning the same String? What is the correct way to format a time in different time zones using the Java 8 Date-Time APIs?
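For what it's worth, the two strings come out identical because ZonedDateTime.of(LocalDateTime.now(), zone) attaches each zone to the same local wall-clock value; nothing is converted. A minimal sketch of one way to get different output, starting from a single Instant (assuming java.time from Java 8):

import java.time.Instant;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;
import java.util.Locale;

public class TwoZones {
    public static void main(String[] args) {
        DateTimeFormatter outputFormat =
                DateTimeFormatter.ofPattern("ddMMMyyyy HH:mm:ss", Locale.ENGLISH);

        // One instant in time, rendered as wall-clock time in each zone
        Instant now = Instant.now();
        System.out.println(now.atZone(ZoneId.of("America/Phoenix")).format(outputFormat));
        System.out.println(now.atZone(ZoneId.of("Asia/Calcutta")).format(outputFormat));
    }
}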

How to convert the UTC date into local date in Moment.js?

I have a date in this format "2020-12-16T02:48:00" that came from the server. How can I convert this into local date and time? I tried some code but couldn't succeed.
Below is the attempt I made in Angular after receiving the date from the server.
response.data.map(date => {
    var centralDate = moment(date).zone("-06:00");
    date = moment(centralDate).local().format('YYYY-MM-DD hh:mm:ss');
});
If indeed the value is in UTC (as per the title of your question), and it looks like "2020-12-16T02:48:00", and you want to convert it to local time, then you should do the following:
moment.utc(date).local().format('YYYY-MM-DD HH:mm:ss');
That does the following:
Parses the input in terms of UTC
Converts it to local time
Formats it as a string in the given format
Note also that you had hh in your original format. That is for hours on a 12-hour clock, so you shouldn't use it without also including A or a to indicate AM/PM or am/pm. Otherwise, use HH for hours on a 24-hour clock.
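For comparison only, the same three steps look like this with Java's java.time API (not something the Moment.js code above needs, just an illustration of the parse-as-UTC / convert / format sequence):

import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class UtcToLocal {
    public static void main(String[] args) {
        String fromServer = "2020-12-16T02:48:00";
        String local = LocalDateTime.parse(fromServer)         // 1. parse the wall-clock value
                .atOffset(ZoneOffset.UTC)                       // 2. declare it to be UTC
                .atZoneSameInstant(ZoneId.systemDefault())      // 3. convert to the local zone
                .format(DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss"));
        System.out.println(local);
    }
}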
If your issue is that the timezone doesn't change, you can resolve it using utcOffset (https://momentjscom.readthedocs.io/en/latest/moment/03-manipulating/09-utc-offset/) like this:
response.data.map(date => {
    date = moment(date).utcOffset(-360);
});
where 360 is the -6 hour offset expressed in minutes.
var d = new Date();
d = new Date(d + "Z");
I am not an expert in Angular, but I guess the trouble with your date is the letter "T". Maybe you can remove the "T" with a string-replacement function so that it becomes a proper date-time value?

Calculate a date 24 hours behind in Hive

My requirement is really simple: basically, I need to go back 24 hours in a timestamp column in Hive.
So far I have tried two different ways, but neither is working:
select
    recordDate, -- original date
    cast(date_sub(cast(recorddate as timestamp), 1) as timestamp), -- going one day behind, without the hour
    cast((cast(cast(recorddate as timestamp) as bigint) - 1*3600) as timestamp) -- crazy year
from mtmbuckets.servpro_agents_events limit 10;
My output is not what I expect.
I appreciate the support you can give me.
Thanks
There is no straightforward function in Hive. Either:
1. Create a UDF to do it, or
2. Convert the date to a number of seconds, do your calculation (subtract 24*60*60 seconds), and then convert the result back to a date.
Use from_unixtime and unix_timestamp to achieve this, as in the code below.
select from_unixtime(unix_timestamp(recorddate) - 86400)
from mtmbuckets.servpro_agents_events limit 10;
from_unixtime
Converts a number of seconds from the Unix epoch (1970-01-01 00:00:00 UTC) to a string representing the timestamp in the current time zone.
unix_timestamp
Converts a time string in format yyyy-MM-dd HH:mm:ss to a Unix timestamp (in seconds), using the default timezone and the default locale; returns 0 on failure: unix_timestamp('2009-03-20 11:30:01') = 1237573801
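To make the arithmetic concrete, here is the same "epoch seconds minus 86,400" operation sketched in plain Java rather than Hive, using a made-up sample timestamp:

import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class MinusOneDay {
    public static void main(String[] args) {
        // Equivalent of from_unixtime(unix_timestamp(recorddate) - 86400):
        // timestamp -> epoch seconds -> subtract 24*60*60 -> back to a timestamp string
        Instant recordDate = Instant.parse("2019-11-18T09:05:07Z"); // sample value
        Instant dayBefore = Instant.ofEpochSecond(recordDate.getEpochSecond() - 86400);

        DateTimeFormatter fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
                .withZone(ZoneOffset.UTC);
        System.out.println(fmt.format(dayBefore)); // 2019-11-17 09:05:07
    }
}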

How to convert the format of inserted datetime in Informix?

I insert my date/time data into a CHAR column in the format: '6/4/2015 2:08:00 PM'.
I want this to be converted automatically to the format
'2015-06-04 14:08:00' so that it can be used in a query, because the format of DATETIME is YYYY-MM-DD hh:mm:ss.fffff.
How to convert it?
Given that you've stored the data in a string format (CHAR or VARCHAR), you have to decide how to make it work as a DATETIME YEAR TO SECOND value. For computational efficiency, and for storage efficiency, it would be better to store the value as a DATETIME YEAR TO SECOND value, converting it on input and (if necessary) reconverting on output. However, if you will frequently display the value without doing computations (including comparisons or sorting) it, then maybe a rococo locale-dependent string notation is OK.
The key function for converting the string to a DATETIME value is TO_DATE. You also need to look at the TO_CHAR function because that documents the format codes that you need to use, and because you'll use that to convert a DATETIME value to your original format.
Assuming the column name is time_string, then you need to use:
TO_DATE(time_string, '%m/%d/%Y %I:%M %x') -- What goes in place of x?
to convert to a DATETIME YEAR TO SECOND — or maybe DATETIME YEAR TO MINUTE — value (which will be further manipulated as if by EXTEND as necessary).
I would personally almost certainly convert the database column to DATETIME YEAR TO SECOND and, when necessary, convert to the string format on output with TO_CHAR. The column name would now be time_value (for sake of concreteness):
TO_CHAR(time_value, '%m/%d/%Y %I:%M %x') -- What goes in place of x?
The manual pages referenced do not immediately lead to a complete specification of the format strings. I think a relevant reference is GL_DATETIME environment variable, but finding that requires more knowledge of the arcana of the Informix product set than is desirable (it is not the first thing that should spring to anyone's mind — not even mine!). If that's correct (it probably is), then one of %p and %r should be used in place of %x in my examples. I have to get Informix (re)configured on my machine to be able to test it.
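Purely as an illustration of the conversion being discussed (this is Java, not Informix SQL), parsing the stored string and re-rendering it in the DATETIME YEAR TO SECOND layout looks like:

import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.Locale;

public class ReformatChar {
    public static void main(String[] args) {
        DateTimeFormatter stored =
                DateTimeFormatter.ofPattern("M/d/yyyy h:mm:ss a", Locale.ENGLISH);
        DateTimeFormatter wanted =
                DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");

        LocalDateTime value = LocalDateTime.parse("6/4/2015 2:08:00 PM", stored);
        System.out.println(value.format(wanted)); // 2015-06-04 14:08:00
    }
}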

understanding how elasticsearch stores dates internally

I would like to understand how ES stores date values internally in its indexes. Does it convert to UTC?
I have a field "t" of type date. Here's the mapping:
"t": { "type" : "date" },
Now when I insert/add a document to ES, how does it get stored in the index?
"t" : "1427700477165" (milliseconds generated from the Date.now() function). Does ES recognize this as epoch time in UTC and store it as is?
"t" : "2015-03-29T23:59:59" (I adjust the mapping date format accordingly) - how does ES store this? If it converts to UTC, how does it know what time zone this date is in? Does ES get the default time zone from the machine it's running on?
Thank you!
Internally (within an index) Elasticsearch stores all dates as numbers in epoch format - i.e. the number of milliseconds since 01 Jan 1970 00:00:00 GMT.
However, by default Elasticsearch also stores the raw JSON message you posted, so when returning the _source you'll see whatever was posted to Elasticsearch.
To be able to import date strings into the epoch format you need to specify the format in your mapping, for example either a predefined date format:
"t": { "type" : "date", "format" : "basic_date_time" }
for yyyyMMdd'T'HHmmss.SSSZ.
or specify a custom date format:
"t": { "type" : "date", "format" : "YYYY-MM-dd" }
If no format is specified, the default date parsing used is
ISODateTimeFormat.dateOptionalTimeParser.
Multiple date formats can be specified in the mapping, e.g. yyyy/MM/dd HH:mm:ss||yyyy/MM/dd.
If no timezone is specified, then Elasticsearch assumes UTC.
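A small Java sketch of what that amounts to, assuming a zone-less date string is treated as UTC and stored as epoch milliseconds (illustration only, not Elasticsearch code):

import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneOffset;

public class EpochStorage {
    public static void main(String[] args) {
        // A date string with no time zone information, interpreted as UTC
        long millis = LocalDateTime.parse("2015-03-29T23:59:59")
                .toInstant(ZoneOffset.UTC)
                .toEpochMilli();
        System.out.println(millis); // 1427673599000

        // And back from epoch milliseconds to a UTC timestamp
        System.out.println(Instant.ofEpochMilli(1427700477165L)); // 2015-03-30T07:27:57.165Z
    }
}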

Resources