Setting the date format on an Excel cell programmatically - ASP.NET

I would like to generate an Excel sheet out of a table in ASP.NET. The exporting works fine using an in-house developed framework to export the table. The problem I'm having is that dates are written into columns as plain strings, not with the proper cell format. Is there any way to format the date without any third-party software, just by setting the value of a cell with a string? Such as:
sheet[i,j].Value = "{:DateTime}"+dateStringProperlyFormatted

I'm not in front of an appropriate box to test this at the moment, but you should be able to format the range with something like:
rangeOfInterest.NumberFormat = "MM/DD/YYYY"
The range could be your single cell, or whatever you'd like. You could also do the entire column with:
rangeOfInterest.EntireColumn.NumberFormat = "MM/DD/YYYY"
There's probably an EntireRow version too, although I don't recall having ever used it.

Related

Importing dates from Excel (some formatted as dates, some as numbers)

I am working with an uploaded document, originally from Google Docs, downloaded to an .xlsx file. The data was hand-entered and formatted as DD-MM-YY, but it has uploaded inconsistently (see the example below). I've tried a few different things (kicking myself for not saving the code), and in the end I just removed the incorrectly formatted dates.
Any suggestions for fixing this in Excel or (preferably) in R? This is longitudinal data, so it would be frustrating to have to go back into every Excel sheet to update it. Thanks!
data <- read_excel("DescriptiveStats.xlsx")
ex:
22/04/13
43168.0
43168.0 is a correct date value. 22/04/13 is not a valid date; it is a text string. To convert it into a date you will need to change it into 04/13/2022.
There are a few options. One is to change the locale so that 22/04/13 would be valid. See more over here: locale differences in google sheets (documentation missing pages)
The second option is to use a regex to convert it. See these examples:
https://stackoverflow.com/a/72410817/5632629
https://stackoverflow.com/a/73722854/5632629
However, it is very likely that 43168 is also not the correct date. If your date before import was 01/02/2022, then after import it could be 44563, which is actually 02/01/2022, so be careful. You can check it with:
=TO_DATE(43168)
and whether a string is a valid date can be checked with:
=ISDATE("22/04/13")
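Since the question prefers an R fix, here is a minimal sketch of repairing the mixed column after import. It assumes the column arrives as character, the serial numbers use Excel's 1899-12-30 day origin, and the text dates really are DD/MM/YY as the question states (swap the format string to "%y/%m/%d" if they are actually year-first, as the conversion above reads them); date_col is a hypothetical column name:
library(readxl)
data <- read_excel("DescriptiveStats.xlsx")
# Treat everything as text first; some cells are serial numbers
# ("43168.0") and some are date strings ("22/04/13").
x <- as.character(data$date_col)
is_serial <- grepl("^[0-9]+(\\.[0-9]+)?$", x)
fixed <- rep(as.Date(NA), length(x))
# Serial numbers count days from the 1899-12-30 Excel/Sheets origin.
fixed[is_serial] <- as.Date(as.numeric(x[is_serial]), origin = "1899-12-30")
# The remaining text dates are parsed as day/month/year.
fixed[!is_serial] <- as.Date(x[!is_serial], format = "%d/%m/%y")
data$date_col <- fixed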

How do I get Excel to interpret a character variable without scientific notation in R using fwrite?

I have a relatively simple issue: when writing out in R with fwrite from the data.table package, I am getting a character vector interpreted as scientific notation by Excel. You can run the following code to reproduce the issue:
#create example
library(data.table)
samp = data.table(id = c("7E39", "7G32", "5D99999"))
fwrite(samp, "test.csv", row.names = FALSE)
When you read this back into R, you get the values back no problem (provided scientific-notation display is disabled). My less code-capable colleagues, however, work with the CSV directly in Excel, and there they see the values in scientific notation: Excel reads "7E39" as the number 7×10^39. They can attempt to change the column to text, but by then Excel has already converted the value, so they get the expanded number with all the zeros instead. I want them to see the original "7E39" from the data table created. Any ideas how to avoid this issue?
PS: I'm working with millions of rows, so write.csv is not really an option.
EDIT:
One workaround I've found is to just create a mock variable with quotes:
samp = data.table(id = c("7E39", "7G32","5D99999"))[,id2:=shQuote(id)]
I prefer a tidyr solution (pun intended), as I hate unnecessary columns
EDIT2:
Following r2evans's solution, I adapted it to data.table as follows (adding another numeric column to check whether anything else changed):
#create example
samp = data.table(id = c("7E39", "7G32", "5D99999"))[, second_var := c(1, 2, 3)]
fwrite(samp[, id := sprintf("=%s", shQuote(id))],
       "foo.csv", row.names = FALSE)
It's a kludge, and dang-it for Excel to force this (I've dealt with it before).
write.csv(data.frame(id = sprintf("=%s", shQuote(c("7E39", "7G32", "5D99999")))),
          "foo.csv", row.names = FALSE)
This is forcing Excel to consider that column a formula, and interpret it as such. You'll see that in Excel, it is a literal formula that assigns a static string.
This is obviously not portable and prone to all sorts of problems, but that is Excel's way in this regard.
(BTW: I used write.csv here, but frankly it doesn't matter which function you use, as long as it passes the string through.)
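One caveat to hedge, going beyond the original answer: shQuote's quoting style depends on its type argument, and Excel only understands double quotes inside the formula. Spelling the quotes out with sprintf avoids the ambiguity entirely; a minimal sketch:
library(data.table)
samp <- data.table(id = c("7E39", "7G32", "5D99999"))
# Build the formula text explicitly: ="7E39", ="7G32", ...
# fwrite quotes the field and doubles the inner quotes, so the CSV cell
# reads "=""7E39""" and Excel evaluates it back to the literal string 7E39.
samp[, id := sprintf('="%s"', id)]
fwrite(samp, "foo.csv")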
Another option, but one that your consumers will need to do, not you.
If you export the file "as is", meaning the cell content is just "7E39", then an auto-import within Excel will always try to be smart about that cell's content. However, you can manually import the data.
Using Excel 2016 (32bit, on win10_64bit, if it matters):
1. Open Excel (first), and have an (optionally empty) worksheet already open.
2. On the ribbon: Data > Get External Data > From Text.
3. Navigate to the appropriate file (CSV).
4. Select "Delimited" (file type), click Next, select "Comma" (and optionally deselect any others that may default to selected), then Next.
5. Click on the specific column(s) and set the "Default data format" to "Text" (this will need to be done for any/all columns where this is a problem). Multiple columns can be Shift-selected (for a range of columns), but not Ctrl-selected. Finish.
6. Choose the top-left cell to import/paste the data (or a new worksheet).
7. Select Properties..., and deselect "Save query definition". Without this step, the data is considered a query into an external data source, which may not be a problem but makes some things a little annoying. (For example, try to highlight all data and delete it ... Excel really wants to make sure you know what you're doing there.)
This method provides a portable solution. It "punishes" the Excel users, but anybody/anything else will still be able to consume the files directly without change. The biggest disadvantage is that you won't know if somebody loads it incorrectly unless/until they get odd results when they try to use the data and some fields have been silently converted.

EPPlus is rounding up a large number

I am using EPPlus to generate an Excel report in ASP.NET. I have the following code in use:
aCell.Value = CDec(aValue)
aCell is of type OfficeOpenXml.ExcelRange.
This works fine most of the time. However, when aValue is 201600000000515561, the cell in Excel is set with the value of 201600000000516000.
Stepping through the code and watching the value of aCell.Value shows that the correct value is being set. But when the Excel file is opened, the value has been rounded up to the next 1000.
Does anybody have a solution or is this a bug in EPPlus?
It's not an EPPlus thing, more of an Excel thing. Because of the way Excel stores numbers, you get at most 15 significant figures, after which Excel basically inserts 0's. If you count the digits of 201600000000515561 starting from the left, you will see this. Have a look at this:
https://en.wikipedia.org/wiki/Numeric_precision_in_Microsoft_Excel
I think the most popular workaround would be to store the numbers as text, which of course leads to its own set of issues. Here is a good thread on it:
http://answers.microsoft.com/en-us/office/forum/office_2007-excel/15-digit-number-limitation-non-text-workaround/a4974853-7c3c-4830-8562-2e88369d981b?auth=1
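To make the 15-digit limit concrete, here is a rough illustration of my own in R (not EPPlus; it just exploits the fact that both environments store numbers as IEEE-754 doubles, and mimics Excel's truncation with signif):
x <- 201600000000515561
# 18 digits is beyond double precision (~15-17 significant digits),
# so even the stored value differs slightly in the last digits:
sprintf("%.0f", x)
# Excel then keeps only 15 significant figures and zero-fills the rest,
# which signif() reproduces: 201600000000516000.
sprintf("%.0f", signif(x, 15))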

Can SAS still read or create the combination of {.dat fixed-column ASCII data file, .sas syntax file}, or is it obsolete?

In the past I have used the excellent SAScii package in R to read in this type of data: {.dat fixed-column data file + the corresponding .sas "syntax" file}. I want to be quite precise about that because there is no end of ambiguity surrounding phrases like "SAS file". These .dat files contain only integers, and the .sas files specify both the way to parse the columns and the way the integers represent the values in the actual data (this feature is sometimes called the "codebook".) I have found very good data in that format (i.e. in the form of the pair of files {.dat, .sas}) from places like Minnesota Population Center's IPUMS https://usa.ipums.org/usa/, and built up a lot of tools to analyze it using R and SAScii.
Now I have access to SAS itself, but would still like to re-use some of my tools and techniques. However, I can find no reference in SAS to data like that {fixed-column data in .dat, syntax file in .sas}. Has that format been entirely superseded within SAS (perhaps by the SAS7BDAT format)? Or perhaps the {.dat, .sas} format was never used within SAS? The reason I ask is, now that I have access to SAS and so much data in SAS7BDAT format, I would like to be able to export some of it in {.dat, .sas} format for use with my own tools.
Thanks very much, and cheers - Ed
I don't think this is something built into SAS. You could, however, write such a program pretty easily.
First off, Chris Hemedinger has written something that basically does this (it creates datalines rather than a .dat file, but that shouldn't be too hard to change if you know .NET, or you could modify the R module to accept datalines). That is discussed and available here; the title of the post is "Turn your data set into a data step program". This is roughly equivalent to the SQL Server task that creates "CREATE TABLE" code out of a table. This would only work in Enterprise Guide, although you should be able to do roughly the same thing in a standalone .NET program.
Second, you can easily write something like this in Base SAS. Creating the datalines is easy; there are numerous ways to write out to a file.
For a CSV, for example, you can do this.
ods csv file="c:\temp\mydata.csv";
proc print data=mydata;
run;
ods csv close;
If you're going to write a flat file, you might as well create the input/output .sas first; after all, it can be almost the same code. You can query dictionary.columns to generate the code, both the input side and the output side. Create a table with the variable names, lengths, and formats for each variable, then process it in a data step, advancing the start position by the length of each variable (so each variable begins just after the previous one ends). If you need formats for your R project, then proc format cntlout=<datasetname> will generate a dataset that contains those formatted value translations, and you can write that out in whatever format you need as well.
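On the R side, once such a {.dat, .sas} pair has been written out, it should plug straight back into the toolchain the question describes; a minimal sketch (both file names are hypothetical placeholders):
library(SAScii)
# Read the fixed-column .dat file using the layout parsed from the
# generated .sas input statement.
mydata <- read.SAScii("mydata.dat", "mydata_input.sas")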

How do I prevent data from displaying in scientific format?

I have this SQL Server query that I am running in my .NET app.
(CONVERT(VARCHAR(8), EventDate, 112)+ substring(RequestedBy,1,1)+right( '0000000' + convert( varchar( 7 ), ContactID ), 7 )) as Contacts
It produces the following results in the following format:
20120731e0000001
20120731f0000002
20120731p0000003
This is the result and format that we want.
The problem is that when we click the export icon to export these results to Excel, the first one changes to scientific format, like 2.01E+08.
Any value that has an e in the middle, such as 20120731e0000001, turns into scientific notation.
The rest are just fine.
Any ideas how to fix this?
I want to apologize in advance if I stick the wrong tag in the Tags section since I am not sure where the fix could come from.
The formatting is happening when Excel opens your exported file. Simply set the column's format to Text so that it displays the values in their original form.
When you open the exported data file directly with Excel, all formats are set as General. In General format, Excel applies formatting to what it recognizes as numbers and dates. So, to be clear, the issue is not with your export, but rather with how Excel is reading your data by default. Try the following to get around this issue.
Export to CSV. Then, rather than opening the CSV in Excel, use Excel's 'Get External Data From Text' tool to import the data into an empty workbook. Here you can specify to treat this field as TEXT rather than as GENERAL.
Note that once Excel applies a number (or date) format to a cell, the data in the cell has been changed; applying a new format afterwards will not bring back your original data. For this reason you must specify the format before Excel opens the file.
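If any downstream consumers can work outside Excel, the same principle (declare the column as text up front) is easy to apply elsewhere; for instance, a minimal sketch in R (file and column names are hypothetical):
library(readr)
# Force the Contacts column to character so values like "20120731e0000001"
# are never parsed as numbers.
contacts <- read_csv("export.csv", col_types = cols(Contacts = col_character()))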
