Add a day to a datetime project parameter and store the result in a datetime variable using an SSIS expression

I am trying to add a day to a project parameter of type DATETIME and store the result in an SSIS variable which is also of type DATETIME.
I am using the expression below in the variable:
dateadd(day,1,(DT_DATE)(DT_DBDATE) @[$Project::Start_Date])
and I get the error below:
TITLE: Expression Builder
Expression cannot be evaluated.
For help, click:
http://go.microsoft.com/fwlink?ProdName=Microsoft%C2%AE%20Visual%20Studio%C2%AE%202015&ProdVer=14.0.23107.0&EvtSrc=Microsoft.DataTransformationServices.Controls.TaskUIFramework.TaskUIFrameworkSR&EvtID=FailToEvaluateExpression&LinkId=20476
------------------------------ ADDITIONAL INFORMATION:
The expression contains unrecognized token "day". If "day" is a
variable, it should be expressed as "@day". The specified token is not
valid. If the token is intended to be a variable name, it should be
prefixed with the @ symbol.
Attempt to parse the expression "dateadd(day,1,(DT_DATE)(DT_DBDATE)
@[$Project::Start_Date])" failed and returned error code 0xC00470A4.
The expression cannot be parsed. It might contain invalid elements or
it might not be well-formed. There may also be an out-of-memory error.
(Microsoft.DataTransformationServices.Controls)
------------------------------ BUTTONS:
OK
Can anyone help me resolve the above problem?

TL;DR:
The argument to the first parameter of dateadd is a string, not a constant/enumeration, so it should be
dateadd("day",1,(DT_DATE)(DT_DBDATE) @[$Project::Start_Date])
The long way around
I assume the desire is to get the next day's date with the supplied expression
dateadd(day,1,(DT_DATE)(DT_DBDATE) @[$Project::Start_Date])
When I run into issues with expressions, I break them down into the most atomic statement and then compose from there.
I'm using an SSIS-scoped variable instead of a project parameter, but the logic will hold true.
I have an SSIS variable, Start_Date, of data type DateTime with an initial value of 2022-06-01 09:22 AM (convert that to your current locale's preference for date presentation).
I created a new variable, Start_DateOnly, and used the following expression
(DT_DATE)(DT_DBDATE) @[User::Start_Date]
Great, that shows 2022-06-01 (no time component in the Variables window, although if you evaluate it in the Expression editor, it will show midnight). And the explainer: we cast to the DT_DBDATE data type to drop the time component, but DT_DBDATE is incompatible with the variable's DateTime data type, so we explicitly cast back to DT_DATE.
Cool beans, now all we need to do is confirm the dateadd function works as expected with our new variable
dateadd(day, 1, @[User::Start_DateOnly])
What the heck?
Expression cannot be evaluated.
The expression contains unrecognized token "day". If "day" is a variable, it should be expressed as "@day". The specified token is not valid. If the token is intended to be a variable name, it should be prefixed with the @ symbol.
Oh... yeah, while this language is similar to T-SQL, the datepart parameter is a string, not an enum/constant, so the syntax should be
dateadd("day", 1, #[User::Start_DateOnly])
Yup, that evaluates to 2022-06-02 12:00 AM
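Putting it back together against the original project parameter gives the expression from the TL;DR (shown here assuming the parameter really is named Start_Date and that you skip the intermediate Start_DateOnly variable):
dateadd("day", 1, (DT_DATE)(DT_DBDATE) @[$Project::Start_Date])
Assign that expression to the target DateTime variable and set its EvaluateAsExpression property to True so it is recalculated when the variable is read.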

Related

How to perform parameterized date comparison with Azure Data Factory mapping data flows?

In mapping data flows I'm using the following filter expression against a late binding (schema drift) source:
toString(byName('modifiedon')) > '2022-04-19 00:00:00'
The filter expression returns the correct output. However, I run into problems when I attempt to parameterize the expression.
For example, let's replace 'modifiedon' with a string-type parameter as shown below:
toString(byName($WatermarkColumnName)) > '2022-04-19 00:00:00'
This will give me no output at all. If I try to parameterize the timestamp with a string-type parameter, I'll get the following error message when attempting a data preview:
Incompatible data types between declared type and actual parameter value
I have tried defining the WatermarkValue parameter as TimeStamp and then using toString($WatermarkValue) in the filter expression, but then it returns all records instead!

Find the documents which have element in proper "dateTime" format

I am trying the following query in marklogic-9:
cts:element-value-match(xs:QName("cd:modificationDate"), "[Y0001]-[M01]-[D01]T[H01]:[m01]:[s01].000Z", ("type=dateTime","timezone=TZ"))
to achieve this, but this gives me the following error:
[1.0-ml] XDMP-ARG: cts:element-value-match(xs:QName("cd:modificationDate"), "[Y0001]-[M01]-[D01]T[H01]:[m01]:[s01].000Z", ("type=dateTime", "timezone=TZ")) -- arg2 is invalid
What I want to do is find all the documents that conform to this specific pattern of dateTime. We have a date-range index on this element, modificationDate.
How best can we do this using MarkLogic and the XQuery API?
cts:element-value-match is really only useful on string range indexes, and even there it only takes simple wildcards (* and ?), not general regular expressions or date formats.
If your range index is a dateTime range index, then every value must conform to the proper xs:dateTime format, so this query would tell you nothing.
This will give you a list of all the URIs where you have a valid dateTime in that element:
cts:uris("", (),
  cts:element-range-query(xs:QName("modificationDate"), ">", xs:dateTime("0001-01-01T00:00:00"))
)

Joi unix timestamp set max value

I'm using the Joi package to validate a timestamp field, but how can I set a max() value on it? I want the input timestamp to be less than the current timestamp.
var schema = Joi.object().keys({
    t: Joi.date().timestamp('unix').max(moment().unix()),
});
but I get this error:
child "t" fails because ["t" must be less than or equal to "Sun Jan 18
1970 07:35:17 GMT+0330 (IRST)"]
I'm sure that moment().unix() returns the current timestamp, but here it is cast to a string.
It seems that the max() and min() functions can do the trick, but they only work if the threshold is specified in milliseconds.
t: Joi.date().timestamp('unix')
    .max(moment().unix() * 1000)
    .min(moment().subtract('42', 'weeks').unix() * 1000),
It doesn't look like Joi.date().max() accepts unix timestamps properly despite being able to specify in your schema that a unix timestamp is expected for incoming values.
If you need to use the current date in your schema, you can pass the string 'now' instead of using the date. Or just make sure you enter the current date in a format that .max() expects. I tried this using milliseconds and it seems to work as expected. I think Joi is using the default Date constructor under the hood to construct dates to compare, which expects milliseconds.
var schema = Joi.object().keys({
    t: Joi.date().timestamp('unix').max(moment().unix() * 1000)
});
From the docs on date.max()
Notes: 'now' can be passed in lieu of date so as to always compare relatively to the current date, allowing to explicitly ensure a date is either in the past or in the future.
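A minimal sketch of the schema using that 'now' keyword (this assumes a reasonably recent Joi version; the millisecond arithmetic goes away entirely):
var Joi = require('joi');

// 'now' is resolved when validation runs, so the comparison is always
// against the current date rather than the date the schema was built
var schema = Joi.object().keys({
    t: Joi.date().timestamp('unix').max('now')
});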

momentJS UTC versus specifying the timezone in the moment constructor

Are the two syntaxes below the same?
moment(1456261200367, 'H:mm:ss.SSS').utc().valueOf() //1456343786120
moment(1456261200367 +0000, 'H:mm:ss.SSS Z').valueOf() //1456325786120
But as you can see, if both of them convert the given value to UTC mode, then why is there a difference in the output?
Also, I would like to know how a.valueOf() and b.valueOf() are the same when a.format() and b.format() are different, because moment() (which parses and displays in local time) is different from moment.utc() (which displays a moment in UTC mode).
var a = moment();
var b = moment.utc();
a.format();
b.format();
a.valueOf();
b.valueOf();
In the first part, you're using it incorrectly. You've passed numeric input, which would normally be interpreted as a unix timestamp, but then you've supplied a string-based format string, so the number is converted to a string. The format string here tells moment how the input is specified, but it doesn't match what you're actually parsing.
This doesn't error though, because by default moment's parser is in "forgiving" mode. You can read more about this in the docs.
The correct way to pass a timestamp into moment is with one of these:
moment(1456261200367)
moment(1456261200367).utc()
moment.utc(1456261200367)
The last two are equivalent, but the moment.utc(timestamp) form is preferred.
With any of those, all three will have the same .valueOf(), which is just the timestamp you started with. The difference is in the mode that the moment object is in. The first one is in local mode, reflecting the time zone of the computer where it's running, while the other two are in UTC mode.
This is evident when you format the output using the format function, as with many other functions. I believe that answers your second question as well.
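A quick sketch of that difference using the timestamp from the question (the local format() output will depend on the zone of the machine it runs on):
var moment = require('moment');

var ts = 1456261200367;          // 2016-02-23T21:00:00.367Z
var a = moment(ts);              // local mode
var b = moment.utc(ts);          // UTC mode

a.valueOf() === b.valueOf();     // true - same instant, same timestamp
b.format();                      // "2016-02-23T21:00:00+00:00"
a.format();                      // same instant in your local offset, e.g. "2016-02-23T16:00:00-05:00"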

Using UTCTime with SQLite in Yesod

While using a UTCTime field in my model in Yesod, I get the following error:
PersistMarshalError "field timestamp: Expected UTCTime, received PersistText \"09:18:07\""
I am using SQLite to store my database. My model looks as follows:
Myobject
    timestamp UTCTime default=CURRENT_TIME
    otherfield Text
Note that this error occurs both with and without the default value.
I am selecting the list of Myobject-entities as follows:
myobjects <- selectList [] [Desc MyobjectTimestamp]
Using MyobjectOtherfield instead of MyobjectTimestamp does not help either, which makes sense since all data is fetched and therefore marshaled anyway.
A similar question has been asked here, but the answer did not help me.
How can I use UTCTime in Yesod while using SQLite?
Edit:
The PersistText \"09:18:07\" that is mentioned in the error is the value the field defaulted to.
You stored a Text value "09:18:07", while it expected a UTCTime value. Did you insert values by hand?
getCurrentTime from Data.Time returns a value of type IO UTCTime, so you can either evaluate getCurrentTime in GHCi to see a valid representation, or use now <- liftIO getCurrentTime in your handler function.
EDIT:
Because getCurrentTime returns a timestamp like: 2013-10-25 10:16:32.1627238 UTC, inserting a value like that in your database should resolve the error.
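For example, a minimal sketch of a handler that inserts a row with a real UTCTime (the handler name and the otherfield text are made up; only the Myobject model comes from the question, and liftIO, runDB and insert are assumed to be in scope as in a scaffolded Yesod site):
-- import Data.Time.Clock (getCurrentTime)
postMyobjectR :: Handler ()
postMyobjectR = do
    now <- liftIO getCurrentTime                      -- e.g. 2013-10-25 10:16:32.1627238 UTC
    _   <- runDB $ insert $ Myobject now "some text"  -- timestamp is a genuine UTCTime, not Text
    return ()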
