I have a specific problem where I have records in my DB table like the following:
LastUpdated
10 January 2017
(The dates are stored in the database as a DateTime type.)
Now I need to fetch the difference in days between today's date and the last updated date, including today. For example, if today is the 12th, the difference would be 2 days.
Now the second part of the problem is that I have another table setup like the following:
TransactionDate
1 January 2017
2 January 2017
3 January 2017
4 January 2017
5 January 2017
6 January 2017
7 January 2017
8 January 2017
9 January 2017
10 January 2017
So now, after I perform a LINQ query, the updated results in my DB table should look like the following:
3 January 2017
4 January 2017
5 January 2017
6 January 2017
7 January 2017
8 January 2017
9 January 2017
10 January 2017
11 January 2017
12 January 2017
So basically I'm trying to get the difference between the current date and the last updated date, and then add that many days to the transaction details table. For every day added, I want to remove one day from the start, so that the total date span stays at 10 days.
Can someone help me out with this?
Edit: this is the code so far:
var usr = ctx.SearchedUsers.FirstOrDefault(x => x.EbayUsername == username);
var TotalDays = (DateTime.Now - usr.LastUpdatedAt).Value.TotalDays;
Is this a correct way of fetching the difference between two dates like I've mentioned above?
Now after this I perform an HTTP request where I get the remaining two dates and insert them like:
ctx.TransactionDetails.Add(RemainingTwoDates);
ctx.SaveChanges();
Now I have dates spanning from 1 January to 12 January, but I want to remove 1 and 2 January so that the total range stays at 10 days. How can I do this?
You can remove the transaction dates that are older than 10 days.
ctx.TransactionDetails.Add(RemainingTwoDates);
// Now remove the transactions older than 10 days
var today = DateTime.Today;
var tenDaysAgo = today.AddDays(-10);
var oldTransactions = ctx.TransactionDetails.Where(t => t.TransactionDate <= tenDaysAgo).ToList();
foreach (var t in oldTransactions) {
    ctx.TransactionDetails.Remove(t);
}
ctx.SaveChanges();
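Language aside, the windowing rule itself is easy to sanity-check. Here is a minimal Python sketch of the same "keep only the last 10 days" logic, with a plain list of dates standing in for the table (names are purely illustrative, not part of your EF model):

```python
from datetime import date, timedelta

def trim_to_window(transaction_dates, today, window_days=10):
    """Keep only dates strictly inside the last `window_days` days."""
    cutoff = today - timedelta(days=window_days)
    return [d for d in transaction_dates if d > cutoff]

# Dates from 1 Jan to 12 Jan 2017, as in the question
dates = [date(2017, 1, d) for d in range(1, 13)]
trimmed = trim_to_window(dates, today=date(2017, 1, 12))
# 1 and 2 January fall outside the 10-day window and are dropped
```

With today = 12 January, the cutoff is 2 January, so exactly the first two dates are removed and 10 remain, matching the expected output above.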
I have some fantasy football data from my league: 12 teams x 8 years = 96 observations. I'm trying to create tibble(year, team, record). The team and record variables are organized correctly, but my year column is in the wrong order. Its current order is below; I need to reverse it so that 2019 starts at the top and 2012 is the last observation. Each value in the year column repeats 12 times since there are 12 teams. There are no NA values. Thanks in advance.
year team record
2012
2012
2012
2012
2012
2012
2012
2012
2012
2012
2012
2012
2013
2013
2013
.
.
.
2019
This was quite easy after all. I'll leave the question up for others, and I'll accept any other answer that works. I just inverted year numerically with year <- year[96:1] and then did tibble(year, team, record).
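The same idea in a quick Python sketch (standard library only). Note this works only because the team and record columns are already in the desired order, so only the year column needs reversing:

```python
# 8 seasons x 12 teams; each year repeats 12 times, ascending
years = [y for y in range(2012, 2020) for _ in range(12)]

# Reverse in place-order so 2019 comes first and 2012 last,
# equivalent to R's year[96:1]
years_desc = years[::-1]
```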
I have a column like this of the Data data.frame:
Month
3
6
9
3
6
9
3
6
9
...
I want to replace 3 with March, 6 with June, and 9 with September. I know how to do it when there are only two months, say 3 and 10, with: mutate(Data, Month=if_else(Month==3,"March","October")) How can I do it for three months?
Expected output:
Month
March
June
September
March
June
September
March
June
September
...
You could just use your numerical month values to access month.name, which is R's built-in vector of month names, starting at index 1:
Data <- data.frame(Month=c(3,6,9))
Data$MonthName <- month.name[Data$Month]
Data
Month MonthName
1 3 March
2 6 June
3 9 September
I have OHLC daily data for US stocks. I would like to derive a weekly time series from it and compute an SMA and EMA. To do that, the requirement is to create one weekly time series from the maximum high per week, and another weekly time series from the minimum low per week. After that I would compute their SMA and EMA and assign them to every day of the week (one period forward). So, first problem first: how do I get the weekly series from the daily data in R (any package)? Better yet, can you show me an algorithm for it, in any language (Golang preferred)? I can rewrite it in Go if needed.
Date High Low Week(High) Week(Low) WkSMAHigh(2DP) WkSMALow(2DP)
(the SMA columns are assigned one period forward)
Dec 24 Fri 6 3 8 3 5.5 1.5
Dec 23 Thu 7 5 5.5 1.5
Dec 22 Wed 8 5 5.5 1.5
Dec 21 Tue 4 4 5.5 1.5
Assume Holiday (Dec 20)
Dec 17 Fri 4 3 6 2 None
Dec 16 Thu 4 3
Dec 15 Wed 5 2
Dec 14 Tue 6 4
Dec 13 Mon 6 4
Dec 10 Fri 5 1 5 1 None
Dec 9 Thu 4 3
Dec 8 Wed 3 2
Assume Holiday (Dec 6 & 7)
I'd start by generating a column which specifies which week it is.
You could use the lubridate package to do this; that would require converting your dates into Date types. It has a function called week, which returns the number of complete 7-day periods that have passed since 1 January, plus 1. However, I don't know whether your data spans several years, and I think there's a simpler way to do this anyway.
The example I'll give below will simply do it by creating a column which just repeats an integer 7 times up to the length of your data frame.
Pretend your data frame is called ohlcData.
# Create a week index that repeats each integer 7 times, truncated to
# nrow(ohlcData) so the vector is exactly as long as the data frame
ohlcData$Week <- rep(seq(1, ceiling(nrow(ohlcData)/7)), each = 7)[1:nrow(ohlcData)]
With that created we can then go ahead and use the plyr package, which has a really useful function called ddply. This function applies a function to columns of data grouped by another column of data. In this case we will apply the max and min functions to your data based on its grouping by our new column Week.
library(plyr)
weekMax <- ddply(ohlcData[,c("Week", "High")], "Week", numcolwise(max))
weekMin <- ddply(ohlcData[,c("Week", "Low")], "Week", numcolwise(min))
That will then give you the min and max of each week. The dataframe returned for both weekMax and weekMin will have 2 columns, Week and the value. Combine these however you see fit. Perhaps weekExtreme <- cbind(weekMax, weekMin[,2]). If you want to be able to marry up date ranges to the week numbers it will just be every 7th date starting with whatever your first date was.
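If you do end up writing this in another language, the grouping step is straightforward. Here is a Python sketch (standard library only) that groups by ISO calendar week instead of fixed 7-row chunks, which also copes with holidays; the sample rows are taken from the question's December example:

```python
from datetime import date
from collections import defaultdict

# (date, high, low) rows, illustrative values from the question
rows = [
    (date(2021, 12, 21), 4, 4),
    (date(2021, 12, 22), 8, 5),
    (date(2021, 12, 23), 7, 5),
    (date(2021, 12, 24), 6, 3),
]

week_high = defaultdict(lambda: float("-inf"))
week_low = defaultdict(lambda: float("inf"))
for d, high, low in rows:
    key = d.isocalendar()[:2]  # (ISO year, ISO week number)
    week_high[key] = max(week_high[key], high)
    week_low[key] = min(week_low[key], low)
```

For the four days above, all in the same ISO week, this yields a weekly high of 8 and a weekly low of 3, matching the Week(High)/Week(Low) columns in the question.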
I am using sentiment analysis function sentiment_by() from R package sentimentr (by trinker). I have a dataframe containing the following columns:
review comments
month
year
I ran the sentiment_by function on the dataframe to find the average polarity score by year and month, and I get the following values.
review_year review_month word_count sd ave_sentiment
2015 March 8722 0.381686065 0.163440921
2015 April 7758 0.387046768 0.158812775
2015 May 7333 0.389256472 0.149220636
2015 November 14020 0.394711478 0.14691745
2016 February 7974 0.400406931 0.142345278
2015 September 8238 0.379989344 0.141740366
2015 February 7642 0.361415304 0.141624745
2015 December 24863 0.387409099 0.141606892
2016 March 8229 0.389033232 0.138552943
2016 January 10472 0.388300946 0.134302612
2015 August 7520 0.3640285 0.127980712
2016 May 3432 0.422246851 0.125041218
2015 June 8678 0.356612924 0.119333949
2015 January 9930 0.351126449 0.119225549
2016 April 9344 0.397066458 0.111879315
2015 July 8450 0.349963536 0.108881821
2015 October 7630 0.38017201 0.1044298
Now I run the sentiment_by function on the dataframe based on the comments alone, and then I run the following function on the resultant data frame to find the average polarity score by year and month.
sentiment_df[,list(avg=mean(ave_sentiment)),by="month,year"]
I get the following results.
month year avg
January 2015 0.110950199
February 2015 0.126943461
March 2015 0.146546669
April 2015 0.148264268
May 2015 0.143924126
June 2015 0.110691204
July 2015 0.106472437
August 2015 0.118976304
September 2015 0.135362187
October 2015 0.111441484
November 2015 0.137699548
December 2015 0.136786867
January 2016 0.128645808
February 2016 0.129139898
March 2016 0.134595706
April 2016 0.12106743
May 2016 0.142801514
As per my understanding both should return the same results; correct me if I am wrong. The reason I went for the second approach is that I need the average polarity both by month and year and by month alone, and I don't want to call the method twice because of the additional time delay. Could someone let me know what I am doing wrong here?
Here is an idea: maybe the first function is taking the average over the individual sentences, while the second one takes the average of ave_sentiment, which is already an average. The average of averages is not always equal to the average of the individual elements.
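A tiny worked example shows why the two disagree whenever the groups have unequal sizes:

```python
# Sentence-level scores for two groups of different sizes
group_a = [0.2, 0.4]  # group mean: 0.3
group_b = [0.9]       # group mean: 0.9

# Averaging all sentences at once weights group_a twice as heavily
overall_mean = sum(group_a + group_b) / len(group_a + group_b)  # 0.5

# Averaging the per-group means gives each group equal weight
mean_of_means = (0.3 + 0.9) / 2  # 0.6
```

In your case, the comments contain different numbers of sentences (the word_count column hints at this), so averaging ave_sentiment per month is a different, unweighted quantity from what sentiment_by computes when grouping directly by year and month.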
I'm challenged with a Leave Table setup issue and would like some guidance.
Background: I have a division at work where employees do not accumulate any vacation time during their first year of service. All accrued vacation time is backloaded: you receive the hours the following calendar year, based on the previous year's service. I am having issues setting up the accrual service for the First Year Award Values, because when I try to set the "Month Eligible" field to 13, it gives me an error. Screenshots can be provided, or I can try to explain this better, but I'm open to any suggestions since I have a test environment to play around with this setup.
Example 1:
DOH = Jan 1, 2015. On Jan 1, 2016, the member would accrue 10 days based on service from Jan 1, 2015 to Dec 31, 2015. On Jan 1, 2017, the member would accrue 10 days based on service from Jan 1, 2016 to Dec 31, 2016.
The breakdown for the 1st year of service is prorated based on month of hire:
Example 2:
DOH = Feb 1, 2015. On Jan 1, 2016, the member would accrue 9 days based on service from Feb 1, 2015 to Dec 31, 2015. On Jan 1, 2017, the member would accrue 10 days based on service from Jan 1, 2016 to Dec 31, 2016.
Example 3:
DOH = Mar 1, 2015. On Jan 1, 2016, the member would accrue 8 days based on service from Mar 1, 2015 to Dec 31, 2015. On Jan 1, 2017, the member would accrue 10 days based on service from Jan 1, 2016 to Dec 31, 2016.
The breakdown continues in the same way until the 12th month of hire.
Example 4:
DOH = Dec 1, 2015. On Jan 1, 2016, the member would accrue 0 days based on service from Dec 1, 2015 to Dec 31, 2015. On Jan 1, 2017, the member would accrue 10 days based on service from Jan 1, 2016 to Dec 31, 2016.
Will this be part of the "Special Calculation Routine" checkbox?
I suggest using the Service Calc at Year Begin box instead. That will calculate leave accruals based on service as of Jan. 1 of the current year. For the accrual setup, try the following:
Service Units = Months
Accrual Rate Units = Hours per Year (Award Frequency = First Run of Year)
First Year Award Values ==> NOT USED
Accrual Rate Values (You did not indicate subsequent years, so you may need more intervals.)
After Service Interval Accrue Hours At
13 Service Months 10 Hours per Year
Service Bonus Values (Assuming no accrual if hired after October)
After Service Interval Award Bonus Hours
3 Service Months 1.000000
4 Service Months 1.000000
5 Service Months 1.000000
6 Service Months 1.000000
7 Service Months 1.000000
8 Service Months 1.000000
9 Service Months 1.000000
10 Service Months 1.000000
11 Service Months 1.000000
12 Service Months 1.000000
The SBVs plus Service Calc at Year Begin should cover your first-year requirement, but you may need to tweak the setup if I did not understand it correctly.