XML or text file logging in ASP.NET with thread safety

I need very simple text file logging. I'll only append lines to it, never change existing ones nor delete them. An XML file would be easier to bind to grids for viewing, but the question applies to both text files and XML files, since both live in the file system.
On the web server there will be file locking while log entries are appended, and maybe also while they are read, so the method has to be thread safe. Multiple instances can write data to the file at the same moment.
I know there are third-party tools like Serilog etc., but I want to know:
How can I append (not change) lines to a text file (or XML file) without worrying about file locks?
If I read the XML file into a DataSet, add a new row to it, and save it back as XML, I would lose entries made by other instances in the meantime.
If I open a text file with a StreamWriter and append a line to it, other instances would get a lock error.
If I read the list of logs from the admin panel, the file will again be locked and the instances wouldn't be able to append logs.
Any ideas?

After long hours of research and experiments I found that NLog is the best option for me. Most importantly, the people who use it are very happy with it. I created a small example page that writes a log entry every time it is called, and tested it with a multithreaded application that calls this sample page again and again. It was fast enough that I couldn't follow the counting thread numbers, and no problems have come up so far.
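For reference, here is roughly the setup I tested, as a minimal sketch assuming NLog 4.x with programmatic configuration; the target name and file path are illustrative:

using NLog;
using NLog.Config;
using NLog.Targets;

public static class LogSetup
{
    public static readonly Logger Log = LogManager.GetCurrentClassLogger();

    // Call once at application start (e.g. from Application_Start in Global.asax).
    public static void Configure()
    {
        var config = new LoggingConfiguration();

        // NLog serializes writes internally; these two settings additionally let
        // several worker processes append to the same file without lock errors.
        var file = new FileTarget("file")
        {
            FileName = "${basedir}/logs/app.log",  // illustrative path
            ConcurrentWrites = true,               // coordinate with other processes
            KeepFileOpen = false                   // release the handle between writes
        };

        config.AddRule(LogLevel.Info, LogLevel.Fatal, file);
        LogManager.Configuration = config;
    }
}

Each page then just calls LogSetup.Log.Info("...") and NLog takes care of the locking.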
So, I'll stick to NLog.
Best.

Related

How to write encrypted data on Scrapy (using Feed Export)?

I'm quite new to Scrapy (I started using it a week ago). I use the -o option on the command line to generate a file, and I'd like that file to be encrypted. I believe I need to write a custom feed exporter (rather than an item exporter, because I need to encrypt the whole file and not each item separately), but I honestly have no idea how. I'm kinda lost; I've read the Scrapy docs, but they are not very clear and detailed.
Note: I also use custom settings like 'FEED_EXPORT_FIELDS' that I'd like to keep working.
For anyone in the future trying to solve this problem: I created a custom pipeline. I instantiate an exporter in open_spider, and in close_spider I call exporter.finish_exporting(); once that method has run, all the scraped items have been written to the file. Before returning, I read all the data from the file, truncate the file, encrypt the data, and write it back. Then I just close the file. A sketch of that pipeline is below.
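For concreteness, here is a rough sketch of that pipeline, assuming a CSV exporter and the cryptography package's Fernet cipher for the encryption step; the output file name and the ENCRYPTION_KEY setting are placeholder names of mine:

from cryptography.fernet import Fernet  # any symmetric cipher would do
from scrapy.exporters import CsvItemExporter

class EncryptedCsvExportPipeline:
    """Export items to a CSV file, then encrypt the finished file in place."""

    def open_spider(self, spider):
        # w+b so the file can be read back and truncated in close_spider
        self.file = open('output.csv', 'w+b')
        # fields_to_export mirrors what FEED_EXPORT_FIELDS would have done
        fields = spider.settings.getlist('FEED_EXPORT_FIELDS') or None
        self.exporter = CsvItemExporter(self.file, fields_to_export=fields)
        self.exporter.start_exporting()

    def process_item(self, item, spider):
        self.exporter.export_item(item)
        return item

    def close_spider(self, spider):
        self.exporter.finish_exporting()
        self.file.seek(0)
        plaintext = self.file.read()                  # every exported row
        self.file.seek(0)
        self.file.truncate()                          # wipe the plaintext
        key = spider.settings.get('ENCRYPTION_KEY')   # placeholder setting name
        self.file.write(Fernet(key).encrypt(plaintext))
        self.file.close()

The pipeline still has to be enabled via ITEM_PIPELINES in settings.py.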

MS Word mailmerge like functionality to allow export to Word document from ASP.Net Web application

I have a requirement where I need to allow users to upload a Word document with placeholders for certain fields which can be found in the database. This will be their template. The placeholders might be prefixed with ## or something similar, for example:
Dear ##Title ##Lastname
They can then grab a record and hit "export to Word document", pick a template, and click continue. I will then take the template and replace ##Title with the Title field in the database for the selected record. I am not sure where to start or what components I need to do this.
From my initial investigation it seems that I can do this with the new Open XML standard for Office 2007. So perhaps I should read in the template and save its contents to a db table somewhere. Then when the user wants to export, I get the contents again, do a search and replace for the ## placeholders, and link them properly. Then I save the document to the output stream, which will bring up the save dialog in their browser.
I am using ASP.NET MVC and am in a hosted environment. I was also contemplating dynamically creating a new view type and dynamically creating new views when the user uploads a template. I'm not sure that approach will work, though.
Is this a good approach?
What tools should I be looking at?
Any other suggestions?
This is similar to an approach we took for inserting data into Word documents and then returning them to the user. We opened the .docx file (it's a zip file, so it's easy to extract), extracted the document part (word/document.xml), did the replace, then put the part back into the .docx file and returned it to the user.
One issue we hit was that Word inserts tags in strange places, especially around things like spelling/grammar errors, so we needed to be careful when doing the search/replace.
We decided not to store the fields from the document in a database, to allow the documents to be easily updated.
We used the DotNetZip component to open the .docx files.
Something we also did was combine several documents into a single large document to save on the number of downloads. If I remember correctly, we used the Open XML toolkit to do the merging; its website also has loads of other information that may be of use.
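As a rough sketch of that extract/replace/repack flow, here is what it could look like using the built-in System.IO.Compression classes instead of DotNetZip; the method name and placeholder convention are illustrative, and note the caveat above about Word splitting placeholders across runs:

using System.Collections.Generic;
using System.IO;
using System.IO.Compression;

public static class DocxMerger
{
    // Copies the template, rewrites word/document.xml with the merged values,
    // and leaves the result at outputPath. 'values' maps placeholder names
    // (without the ## prefix) to replacement text.
    public static void Merge(string templatePath, string outputPath,
                             IDictionary<string, string> values)
    {
        File.Copy(templatePath, outputPath, overwrite: true);

        using (var zip = ZipFile.Open(outputPath, ZipArchiveMode.Update))
        {
            var entry = zip.GetEntry("word/document.xml");
            string xml;
            using (var reader = new StreamReader(entry.Open()))
                xml = reader.ReadToEnd();

            // Real code should XML-escape the replacement values.
            foreach (var pair in values)
                xml = xml.Replace("##" + pair.Key, pair.Value);

            // Rewrite the entry with the merged XML.
            entry.Delete();
            var merged = zip.CreateEntry("word/document.xml");
            using (var writer = new StreamWriter(merged.Open()))
                writer.Write(xml);
        }
    }
}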
Check out Scott Guthrie's blog post about the new view engine, code-named "Razor", coming shortly from Microsoft. In the comments there is talk of it being usable in mail-merge scenarios like the one you describe, with ASP.NET MVC views.

How can I use an XML file to select dynamic templates for an ASP.NET Repeater?

I am developing an application with a repeater that will use a dynamic template for each row based on the underlying DataItem (in this case a product). What I would like is some sort of XML file that stores which templates are to be used with which products, falling back to a default template if none is specified for the product. My product catalog does not contain a particularly large number of products, but opening and parsing an XML file for each row would almost certainly hurt performance. What I would like is for the ASP.NET engine to compile the entries in the XML file into some sort of global collection that can easily be accessed when needed. Ideally, the application would detect when I have made changes to the file and would automatically recompile the collection, restarting the application if necessary. If my understanding is correct, this is already how the engine deals with the web.config file.
Does anyone know if an approach like this is possible, and how I might be able to accomplish it?
Thanks,
Mike
Well you could likely open and parse the XML file on each page load without any significantly adverse performance issues. Toss the result in a page-level collection and for each repeater row, read from that. This will at least prevent you from having to manage a global collection with a file change update dependency.
I do use XML in similar ways, albeit for mostly non-critical company Intranet type applications, so I'd certainly say your approach isn't too awful. :) In my specific cases, I have ultimately put the XML in a global application level object, with the trade off being that I have to manually restart the application to re-load the XML, should it change.
If you do want to tackle your ideal scenario, I would look to store the XML templates in the Cache object and set up a CacheDependency on the XML file.
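A minimal sketch of that Cache/CacheDependency approach, assuming the XML maps product ids to template names (the file location, element names, and cache key are illustrative):

using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Caching;
using System.Xml.Linq;

public static class TemplateMap
{
    private const string CacheKey = "RepeaterTemplateMap";  // illustrative key

    // Returns the product-id -> template map, reloading it only when the XML
    // file changes: the CacheDependency evicts the cache entry on any edit.
    public static Dictionary<string, string> Get(HttpContext context)
    {
        var map = context.Cache[CacheKey] as Dictionary<string, string>;
        if (map == null)
        {
            string path = context.Server.MapPath("~/App_Data/templates.xml");
            map = XDocument.Load(path)
                .Descendants("product")
                .ToDictionary(p => (string)p.Attribute("id"),
                              p => (string)p.Attribute("template"));

            context.Cache.Insert(CacheKey, map, new CacheDependency(path));
        }
        return map;
    }
}

The repeater's ItemDataBound handler can then look up each product's id and fall back to the default template when the key is missing.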

Maintaining same piece of code in multiple files

I have an unusual environment in a project where I have many files, each an independent standalone script. All of the code required by a script must be in that one file; I can't reference outside files with includes etc.
There is a common function in all of these files that does authorization, and it is the last function in each file. If this function changes at all (as it does now and then), it has to be changed in all the files, and there are plenty of them.
Initially I was thinking of keeping the authorization function in a separate file and running a batch process that produced the final files by combining the auth file with each of the others. However, this is extremely cumbersome when debugging because the auth function needs to be in the main file for this purpose. So I'd always be testing and debugging in the folder with the combined file and then have to copy changes back to the uncombined files.
Can anyone think of a way to solve this problem? i.e. maintain an identical fragment of code in multiple files.
I'm not sure what you mean by "the auth function needs to be in the main file for this purpose", but a typical Unix solution might be to use make(1) and cpp(1) here.
Not sure what environment/editor you're using, but one thing you can do is use pre-build events. Create a start tag/end tag that defines the import region, and then in the pre-build event copy the common code between the tags and compile...
//$start-tag-common-auth
..... code here .....
//$end-tag-common-auth
In your pre-build event, just find those tags, replace everything between them with the imported code, and then finish compiling.
VS supports pre- and post-build events that can call external processes (such as batch files or scripts), but those processes do not directly interact with the IDE environment.
Instead of keeping the authentication code in a separate file, designate one of your existing scripts as the primary or master script. Use this one to edit/debug/work on the authentication code. Then add a build/batch process like you are talking about that copies the authentication code from the master script into all of the other scripts.
That way you can still debug and work with the master script at any time, you don't have to worry about one more file, and your build/deploy process keeps everything in sync.
You can use a technique like the one Priyank Bolia suggested to make it easy to find/replace the required bit of code.
An ugly way I can think of:
Have the original code in all the files, and surround it with markers like:
///To be replaced automatically by the build process with the latest code
String str = "my code copy that can be old";
///Marker end.
This code block can be replaced automatically by the build process, from one common code file.
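Either marker variant above can be driven by a small build-step tool. A sketch, assuming the scripts sit in one folder and one of them (the master) holds the canonical block; the marker strings, file pattern, and paths are illustrative:

using System.IO;
using System.Text.RegularExpressions;

public static class AuthSplicer
{
    const string Start = "//$start-tag-common-auth";
    const string End = "//$end-tag-common-auth";

    // Copies whatever sits between the markers in the master file over the
    // marked region of every other script in the folder.
    public static void Run(string masterFile, string scriptsDir)
    {
        string pattern = Regex.Escape(Start) + ".*?" + Regex.Escape(End);

        string master = File.ReadAllText(masterFile);
        string block = Regex.Match(master, pattern, RegexOptions.Singleline).Value;

        foreach (string script in Directory.GetFiles(scriptsDir, "*.js"))  // illustrative pattern
        {
            if (Path.GetFullPath(script) == Path.GetFullPath(masterFile))
                continue;  // the master is the source of truth; leave it alone

            string text = File.ReadAllText(script);
            // A MatchEvaluator avoids '$' substitution surprises in the block.
            File.WriteAllText(script,
                Regex.Replace(text, pattern, m => block, RegexOptions.Singleline));
        }
    }
}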

Excel Upload to database table

I'm looking for the best solution to allow our users to upload XLS spreadsheets so that they can be used to populate tables in our data warehouse (DW).
Our users are heavy Business Objects (BO) users, and BO lets you export to XLS. When they have data in a spreadsheet that needs to be loaded into the DW, they need a process to upload the data from the XLS to the DW's db. As a result, we end up with many of these "interfaces" when I think what we really need is a programmatic, automated feed. Using Excel as a data source for inter-system feeds just seems, in my gut, like a bad idea to me.
Question #1: I'd like to see if you agree and why or why not.
OK, there is no swimming against that tide, so I now take as a given that XLS uploads are here to stay for us. Now I need to find the best solution. First, I'll explain what we do now and then what I don't like about it:
Via web pages, we provide empty XLS files (no rows) with a defined set of columns. Each file is intended to be used to update a different destination table. In each spreadsheet is an "upload" button. Pushing the Upload button causes the macro in the spreadsheet to serialize the contents of the file to CSV and FTP the data to a server folder. Periodically, a scheduler fires off an Informatica ETL job that uses the CSV file as input and loads the data into a custom XLS-specific staging table and then, if the records pass edits, into the appropriate target table. Any errors encountered are logged to an error table. For each XLS file uploaded, the data ends up in a separate staging and error table that is specific to the file.
Some of the things I don't like about our process are:
1) The macro code in the XLS is too exposed (it includes passwords, for example), it can be tampered with, and there are issues ensuring that users are on the latest XLS templates.
2) Business-rule edits are placed in the ETL program, where they probably should be, but because we would like to catch errors as soon as possible, i.e., in the spreadsheet, the edits are also added to the macro code. This results in duplicated business edits. I want these rules in one place and centrally controlled. IMHO, putting any macro code in the XLS introduces a maintenance issue, even calls to stored procedures (some of which we have) or calls to web services (we haven't yet tried calling .NET web services from XLS macros).
3) Every XLS file upload template has its own process with a distinct set of staging and error tables and a custom screen for reporting errors encountered. It seems like we need a more generalized, re-usable solution.
Besides often getting data exported to XLS from BO, the users also like Excel because editing a large number of records there is easier and less clunky than editing individual records via a web interface.
This is the general direction that I am thinking:
First, I want the users to have the editing ease of Excel, but without embedded macros in the spreadsheet. I experimented with Farpoint's Grid with Excel compatibility...
http://www.fpoint.com/netproducts/spreadweb/tour/excel.aspx
...and I found that it was quite easy to let a user open an XLS file that resides on their PC, have it open in the browser, and easily access the data from server-side .NET web code. Excel isn't running locally in their browser, but the functionality of Excel is reproduced, presumably through a lot of client-side scripting that I expect would be a real pain to duplicate myself. You can even cut and paste from a local spreadsheet into the web spreadsheet. This sounds great; my biggest problem is cost. Our company is near death and won't allow us to purchase any new software.
Next, I want to identify the common components across all spreadsheet upload processing and come up with generic processing code. For example, I imagine a table which defines each of our spreadsheets and its format, including the column names and data-type definitions, perhaps in terms of their destination columns instead of hard-coding. Based on this template-definition table, I can generate the XLS templates for download. I can also perform simple generic edits to ensure that the data entered matches the table definition, and one common web page can be used to present the data, report data-type mismatch errors, and let the user correct them. I would also define one common "staging" table for storing the data: a narrow table with columns such as submission #, row number, column name, and value, perhaps. No more "custom everything" is the goal.
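To make the "generic edits" idea concrete, here is a sketch of the kind of shared per-cell check the template-definition table could drive; the class and method names are mine, not an existing library:

using System;

// One row of the (assumed) template-definition table: a destination column
// name plus the .NET type its values must parse to.
public class ColumnSpec
{
    public string Name { get; set; }
    public Type DataType { get; set; }
}

public static class GenericEdits
{
    // Returns an error message, or null when the raw cell value fits the
    // declared type. Every upload template shares this one routine, and the
    // single-record web page can call it too.
    public static string ValidateCell(ColumnSpec spec, string raw)
    {
        try
        {
            Convert.ChangeType(raw, spec.DataType);
            return null;
        }
        catch (Exception)  // FormatException, OverflowException, InvalidCastException
        {
            return string.Format("'{0}' is not a valid {1} for column {2}",
                                 raw, spec.DataType.Name, spec.Name);
        }
    }
}

Type-level checks like this can live in one shared routine; the deeper business rules are a separate question, which brings me to the next point.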
Next I need to decide where to put the business rules. My dept's mgmt firmly believes that all loading of data should be done by Informatica ETL batch processes, and that the rules/edits therefore belong "in Informatica". I have zero experience with the Informatica tools; I am more of a .NET guy. I am therefore unsure how these rules are implemented, but I suspect they are not reusable in the sense that a .NET web page could validate a particular record against them. You see, in some cases, when the user is not performing a bulk upload, they can edit a specific record, and I would like the same edits that are applied by the ETL bulk insert process to be applied to an individual update attempt on a single record via a web page. Is the solution to write a single web service or stored procedure that can be called both from the web page updating a single record and thousands of times, once per record, during a bulk upload? The latter sounds inefficient.
Your thoughts on anything above would be very much welcomed.
From a cost perspective, the effort you'd need to put into re-creating spreadsheet functionality on the web will exceed the cost of Farpoint or other controls. Even if you made $20 an hour, do you think you could complete a working product in under 2 weeks? I think you have the facts on your side when you discuss the maintenance issues of allowing ETL functionality to exist in Excel: you have twice the amount of work to maintain the transformation rules. I think you need to convince management that in order to create a maintainable, robust solution you need some flexible utilities.
Farpoint is a good choice. There is also SpreadsheetGear, a .NET engine that interprets Excel macros and can run on a web server. It has a Win32 control that lets you create a WinForms solution with very Excel-like interface functionality. Last time I checked there was no web control for the product. It does an excellent job of providing Excel capability for processing large amounts of data.
Good luck. I think you will find a good solution, since you seem to have a good grasp of the pros and cons of all the different potential solutions.