I have some graph data with date-type values. My Gremlin query for the date property works, but the output value does not match the stored date.
Environment:
Janusgraph 0.3.1
gremlinpython 3.4.3
Below is my example:
Data (JanusGraph): {"ID": "doc_1", "MY_DATE": [Tue Jan 10 00:00:00 KST 1079]}
Query: g.V().has("ID", "doc_1").valueMap("MY_DATE")
Output (gremlinpython): datetime(1079, 1, 16)
The result is off by 6 days (1079-01-10 -> 1079-01-16). The mismatch does not occur when the years are above 1600.
Is there a serialization/deserialization problem with timestamps between JanusGraph and gremlinpython?
Thanks
There were some issues with Python and dates, but they should be fixed in 3.4.3, which is the version you stated you were using. The issue is described at TINKERPOP-2264 along with the fix; basically there were some problems with timezones. From your example data, it looks like you store your date with a timezone (i.e. KST). I'm not completely sure, but I would imagine things would work as expected if the date were stored as UTC.
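If the date carries a zone like KST, one way to avoid the ambiguity is to make the offset explicit and normalize to UTC before writing. A minimal standard-library sketch (the UTC+9 offset for KST is an assumption about the data above, and a modern year is used to keep calendar issues out of the picture):

```python
from datetime import datetime, timedelta, timezone

# KST (Korea Standard Time) is a fixed UTC+9 offset.
KST = timezone(timedelta(hours=9), name="KST")

# Attach the zone to the wall-clock value, then normalize to UTC.
local = datetime(2019, 1, 10, 0, 0, 0, tzinfo=KST)
utc = local.astimezone(timezone.utc)

print(utc)  # 2019-01-09 15:00:00+00:00
```

The UTC value is what you would then hand to the driver, so both sides agree on the instant regardless of their local zones.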
After some trial and error, I found that there is a difference between Java's Date and Python's datetime: Java formats historical dates on the Julian calendar, while Python's datetime uses the proleptic Gregorian calendar. So I replaced SimpleDateFormat with Joda-Time and got the expected result, as below:
Data (Raw): {"ID": "doc_1", "MY_DATE": "1079-1-29"}
Data (JanusGraph): {"ID": "doc_1", "MY_DATE": [Wed Jan 23 00:32:08 KST 1079]}
(I believe JanusGraph uses the java.util.Date object internally.)
Query: g.V().has("ID", "doc_1").valueMap("MY_DATE")
Output (gremlinpython): datetime(1079, 1, 29)
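The six-day shift matches the gap between the Julian and proleptic Gregorian calendars in the 11th century, which can be verified with the standard Julian Day Number formulas (a sketch using only the Python standard library):

```python
from datetime import date, timedelta

def julian_to_jdn(y, m, d):
    """Julian Day Number of a Julian-calendar date."""
    a = (14 - m) // 12
    yy = y + 4800 - a
    mm = m + 12 * a - 3
    return d + (153 * mm + 2) // 5 + 365 * yy + yy // 4 - 32083

def gregorian_to_jdn(y, m, d):
    """Julian Day Number of a (proleptic) Gregorian-calendar date."""
    a = (14 - m) // 12
    yy = y + 4800 - a
    mm = m + 12 * a - 3
    return d + (153 * mm + 2) // 5 + 365 * yy + yy // 4 - yy // 100 + yy // 400 - 32045

# How far apart were the two calendars in January 1079?
offset = julian_to_jdn(1079, 1, 10) - gregorian_to_jdn(1079, 1, 10)
print(offset)  # 6

# Julian 1079-01-10 is therefore the same day as proleptic Gregorian 1079-01-16:
greg = date(1079, 1, 10) + timedelta(days=offset)
print(greg)  # 1079-01-16
```

So the driver was not corrupting anything: a Julian-calendar January 10 and a proleptic-Gregorian January 16 are the same physical day in 1079, and the gap shrinks to zero near 1600, which is why later dates matched.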
Thanks
Related
According to the header file of Poco::Timestamp, timestamps are in UTC, see Timestamp documentation. If timestamps are in UTC, shouldn't a method converting a Poco::LocalDateTime to Poco::Timestamp make sure that the returned timestamp is in UTC? Currently, Poco::LocalDateTime::timestamp() does not do this, and the returned timestamp is in local time.
It's especially strange since the assignment operator Poco::LocalDateTime::operator = (const Timestamp& timestamp) does a UTC to local time conversion. The following code asserts because of this:
Poco::LocalDateTime local1 = Poco::LocalDateTime( 2020, 1, 30 );
Poco::Timestamp timestamp = local1.timestamp();
Poco::LocalDateTime local2 = timestamp;
assert( local1 == local2 );
local1 will not have the same value as local2 in this example. Am I the only one who thinks this is strange behavior?
If you look at LocalDateTime::timestamp(), you will see that it converts the value via Timestamp::fromUtcTime before returning, so that function returns a Timestamp in local time, not UTC.
You can use the Timestamp::utcTime() function or the Timestamp::raw() function but those return different types to prevent you from accidentally doing the wrong thing.
What are you actually trying to achieve here?
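The round-trip mismatch can be mimicked outside Poco. This is an analogy only, not Poco code (the UTC+1 offset is an arbitrary example): labeling a local wall-clock value as if it were already UTC, and then converting that instant back into the local zone, shifts the result by the zone offset, just like the asserting snippet above:

```python
from datetime import datetime, timedelta, timezone

zone = timezone(timedelta(hours=1))          # example local zone, UTC+1

local1 = datetime(2020, 1, 30, tzinfo=zone)  # the intended local value

# Bug analogy: take the wall-clock fields and label them UTC...
as_if_utc = local1.replace(tzinfo=timezone.utc)
# ...then convert that instant back into the local zone.
local2 = as_if_utc.astimezone(zone)

print(local1)  # 2020-01-30 00:00:00+01:00
print(local2)  # 2020-01-30 01:00:00+01:00 -- shifted by the offset
```

One conversion treats the value as local and one treats it as UTC, so the offset is applied twice and the round trip does not return the original value.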
As in a similar question on this forum, I need to combine a date and an AM/PM time into a timestamp, but I need it to work in Impala.
As a workaround, my colleague and I attempted the following:
-- combine start date and time into a datetime
-- impala can't handle am/pm so need to look for pm indicator and add 12 hours
-- and then subtract 12 hours if it's 12:xx am or pm
================
t1.appt_time,
hours_add(
to_timestamp(concat(to_date(t1.appt_date),' ',t1.appt_time),'yyyy-MM-dd H:mm'),
12*decode(lower(strright(t1.appt_time,2)),"pm",1,0) -
12*decode(strleft(t1.appt_time,2),'12',1,0)
) as appt_datetime,
t1. ...
=========
Does anybody have an easier, more elegant approach?
Your workaround is valid; Impala does not currently support AM/PM formats in its timestamp functions. There are a few related open issues:
https://issues.apache.org/jira/browse/IMPALA-3381
https://issues.apache.org/jira/browse/IMPALA-5237
https://issues.apache.org/jira/browse/IMPALA-2262
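The +12/-12 arithmetic in the workaround can be sanity-checked outside Impala. A Python sketch (the `h:mm` + `am`/`pm` input shape is an assumption) that applies the same two rules and compares the result against strptime's own %I/%p parsing:

```python
from datetime import datetime

def impala_style(appt_time: str) -> str:
    """Replicate the workaround: parse the time as 24h, then adjust."""
    h, m = appt_time[:-2].strip().split(":")
    # add 12 hours for "pm", subtract 12 if the string starts with "12"
    hours = (int(h)
             + 12 * (appt_time[-2:].lower() == "pm")
             - 12 * (appt_time[:2] == "12"))
    return f"{hours:02d}:{m}"

for t in ["12:30am", "12:30pm", "1:05pm", "9:00am", "11:59pm"]:
    reference = datetime.strptime(t, "%I:%M%p").strftime("%H:%M")
    assert impala_style(t) == reference, t
print("all cases match")
```

The two corrections together cover all four quadrants (12:xx am, 12:xx pm, other am, other pm), which is why the decode pair in the SQL works.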
[EDITED]
I'd like to query and filter by dates in realm. Here's my code:
let factories = realm.objects('Factory')
for (factory of factories) {
  let toys = realm.objects('Toy').filtered('factory_id == factory.id')
  let lowerBound = new Date(year + '-1-1T00:00:00Z')
  let upperBound = new Date(year + '-2-1T00:00:00Z')
  let janToys = toys.filtered('created_at > lowerBound AND created_at < upperBound')
}
year is a variable declared before the code snippet above.
This doesn't work and I'm pretty sure it's because the date format is not correct.
Here is the date format when I log toys:
Fri Mar 24 2017 16:01:59 GMT+0800 (Malay Peninsula Standard Time)
I'd like to know how to query Realm dates. I can't find it in the documentation or in other posts here. I'd appreciate any help. If this is not a date format issue, please tell me.
Thank you!
EDIT: I added the outside loop. This may not be a date format issue. Here is the error message:
Error: Predicate expressions must compare a keypath and another keypath or a constant value`
It seems that Realm does not support dot operators inside its query strings; forcing Realm to be relational in this way is not allowed. As a solution, I restructured my data to follow a NoSQL convention, embedding objects instead of declaring relationships.
I am using Splunk 6.2.X along with Django bindings to create a Splunk app.
To get access to the earliest/latest dates from the time range picker, I am using the following in my JS:
mysearchbar.timerange.val()
I'm getting back a map where the values are in epoch format:
Object {earliest_time: 1440122400, latest_time: 1440124200}
When I convert them with Moment.js as follows, I get a different datetime than expected:
> moment.unix('1440122400').utc().toString()
"Fri Aug 21 2015 02:00:00 GMT+0000"
However, the time does not correspond to the values selected on the time range picker, i.e. 08/20/2015 22:00:00.000.
I'm not sure what is causing the difference. I suspected the timezone was not the factor, since the gap did not seem derivable by a simple timezone add/subtract.
An explanation of this behaviour, and of how to convert the Splunk epoch time to UTC, would be helpful.
I was able to get rid of the timezone issue by doing the following:
Setting the timezone of the Splunk engine to UTC in props.conf:
TZ = GMT
Setting the CentOS server hosting Splunk to UTC as well.
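For what it's worth, the gap in the example above is exactly a fixed 4-hour offset (consistent with US Eastern daylight time; the server's actual zone is an assumption here), which can be checked with the standard library:

```python
from datetime import datetime, timedelta, timezone

epoch = 1440122400  # earliest_time from the picker

utc = datetime.fromtimestamp(epoch, tz=timezone.utc)
print(utc)  # 2015-08-21 02:00:00+00:00  (what moment.unix(...).utc() showed)

edt = utc.astimezone(timezone(timedelta(hours=-4)))
print(edt)  # 2015-08-20 22:00:00-04:00  (what the time range picker showed)
```

In other words, the picker's wall-clock time was converted to epoch using the server's zone, so forcing everything to UTC makes the two agree.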
Hope this helps anyone else who stumbles upon similar issues.
Thanks.
I have a test that checks to see if an item was shipped today.
let(:todays_date) {I18n.l(Date.today, format: '%m/%d/%Y')}
expect(order.shipped_date.strftime("%m/%d/%Y")).to eq(todays_date)
This test fails with the following error:
Failure/Error: expect(order.shipped_date.strftime("%m/%d/%Y")).to eq(todays_date)
expected: "10/14/2014"
got: "10/15/2014"
When I check, the date in SQLite is one day ahead of the system date.
sqlite> select date('now');
2014-10-15
sqlite> .exit
u2#u2-VirtualBox:~/tools/$ date
Tue Oct 14 20:13:03 EDT 2014
I appreciate any help you can provide.
Thanks!
The documentation says:
Universal Coordinated Time (UTC) is used.
To get the time in the local time zone, use the localtime modifier:
select date('now', 'localtime');
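The same behaviour is easy to confirm from Python's built-in sqlite3 module (a sketch; how far the two values differ depends on the machine's timezone and the time of day):

```python
import sqlite3
from datetime import datetime, timezone

con = sqlite3.connect(":memory:")
utc_date, local_date = con.execute(
    "SELECT date('now'), date('now', 'localtime')"
).fetchone()

# date('now') tracks today's date in UTC, not the system's local date.
print(utc_date, local_date)
print(datetime.now(timezone.utc).strftime("%Y-%m-%d"))
```

After UTC midnight but before local midnight, the first column is already the "next" day, which is exactly the test failure described above.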
Thanks to #CL, I resolved this. I now select all dates in UTC so that they compare correctly:
let(:todays_date) {I18n.l(Time.now.utc, format: '%m/%d/%Y')}