About 900 thousand database records are used by a calculation, so I want to fill a DataTable when the Web API starts up and have that DataTable available to every controller and method.
What's the proper way to do this?
If the database query and calculation take a long time to complete, you can implement a hosted service with background-task logic and cache the calculation result in Redis or a similar store. Your controllers and methods can then read the cached data from Redis.
Alternatively, if the query and calculation logic are not very complex, you can create a custom service that queries the database and performs the calculation, register it with the DI container, and use it from your controllers.
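A minimal sketch of the hosted-service approach, assuming ASP.NET Core; the names `CalculationCache` and `CalculationWarmupService` are illustrative, not from the original, and the database load is stubbed out:

```csharp
// Sketch: load the ~900k rows once at startup and publish the result
// for all controllers. CalculationCache / CalculationWarmupService are
// hypothetical names.
using System.Data;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

public class CalculationCache
{
    // volatile so readers always see the most recently published table
    private volatile DataTable _table;
    public DataTable Table => _table;
    public void Publish(DataTable table) => _table = table;
}

public class CalculationWarmupService : BackgroundService
{
    private readonly CalculationCache _cache;
    public CalculationWarmupService(CalculationCache cache) => _cache = cache;

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        // Replace with the real query, e.g. SqlDataAdapter.Fill(table).
        var table = await Task.Run(() => LoadFromDatabase(), stoppingToken);
        _cache.Publish(table);
    }

    private static DataTable LoadFromDatabase() => new DataTable();
}

// Registration (e.g. in Startup.ConfigureServices or Program.cs):
//   services.AddSingleton<CalculationCache>();
//   services.AddHostedService<CalculationWarmupService>();
```

Controllers then take `CalculationCache` as a constructor dependency and read `cache.Table`; until the warm-up finishes, `Table` is null, so callers should handle that case.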
I'm working with ASP.NET Web API and Entity Framework.
I noticed that I can create an API controller with the VS template based on one of my entities. The controller automatically gets PUT, GET, POST, and DELETE methods.
It's very useful and makes life easier.
But the DB connection is initialized in every single controller; I mean, every controller has its own DB connection object
(for example: ExampleEntities db = new ExampleEntities()).
Is that the right way to work?
Or should I create a BL (business layer) or something to wrap the DB, so the controllers access it via the BL?
Thanks!
You can use a Unit of Work + Repository + IoC architecture.
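A rough sketch of what that architecture looks like over the `ExampleEntities` context from the question; all interface and class names here are illustrative, and the EF-backed implementation is abbreviated:

```csharp
// Sketch: Repository + Unit of Work over an EF context so controllers
// never new up ExampleEntities themselves. Names are hypothetical.
using System;
using System.Collections.Generic;
using System.Linq;

public interface IRepository<T> where T : class
{
    IEnumerable<T> GetAll();
    void Add(T entity);
}

public interface IUnitOfWork : IDisposable
{
    IRepository<User> Users { get; }
    void Commit();
}

public class EfRepository<T> : IRepository<T> where T : class
{
    private readonly ExampleEntities _db;
    public EfRepository(ExampleEntities db) => _db = db;
    public IEnumerable<T> GetAll() => _db.Set<T>().ToList();
    public void Add(T entity) => _db.Set<T>().Add(entity);
}

public class EfUnitOfWork : IUnitOfWork
{
    private readonly ExampleEntities _db = new ExampleEntities();
    public IRepository<User> Users => new EfRepository<User>(_db);
    public void Commit() => _db.SaveChanges();   // one SaveChanges per unit of work
    public void Dispose() => _db.Dispose();
}
```

The IoC container is then configured to inject `IUnitOfWork` (typically one per request) into controllers, which keeps the DbContext out of the controller code and makes it swappable in tests.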
I am in a situation where the requirement is to keep an application-level object in Web API that can be accessed by all requests. I know one can use HttpContext.Current, but that doesn't fit, since HttpContext only lives for the lifetime of a request. I need a solution where I can keep an object that all requests can access and update as required.
Use a static class to hold your application-level objects. Static classes and static data members are created once for the application lifetime, and all ASP.NET requests can access them.
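A minimal sketch of such a static holder; `AppState` is a hypothetical name, and `ConcurrentDictionary` is used so many requests can read and write without manual locking. Note that such statics should only hold truly application-wide data, never per-user values:

```csharp
// Sketch: a thread-safe, application-lifetime store in a static class.
// AppState is an illustrative name.
using System.Collections.Concurrent;

public static class AppState
{
    private static readonly ConcurrentDictionary<string, object> Items =
        new ConcurrentDictionary<string, object>();

    // Set or overwrite an application-level value.
    public static void Set(string key, object value) => Items[key] = value;

    // Returns null when the key has not been set.
    public static object Get(string key) =>
        Items.TryGetValue(key, out var value) ? value : null;
}
```

Any controller can then call `AppState.Get`/`AppState.Set` directly; the data survives across requests until the application recycles.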
I learnt this the hard way. Some time back, I mistakenly created a static field to hold a customer-specific database connection string in an ASP.NET Web API project, and it became a mess. On each customer's login the field was set (overriding the previous value), so requests from previously logged-in customers used the newly set connection string for their queries. It was an embarrassing situation when customers inadvertently saw each other's data.
You could use SessionState (per session).
For example:
Session["YourDataKey"] = ApplicationLevelObject;
And then check the session state variable on each request that requires it.
However, if you require the object for longer, i.e. beyond a single user session, then I would suggest persisting it to a database. You could use an ORM such as Entity Framework.
Cheers
Actually I want to send an object from a controller action to a WebForm's load method.
I don't want to use Session or the QueryString.
As I understand your question, you want to use an object that you create during the MVC request in a later WebForms request.
In addition to using a Session variable or the QueryString, you can also store the object data in a cookie and retrieve it in the WebForm.
Each of the options has its advantages:
Session variable: the object can be stored as is, with no need to reload it (e.g. from the database). Decreases scalability because server memory is used per user.
Query string: the data is visible and only viable for short strings.
Cookie: the data is stored on the client and can be tampered with. It is transferred between server and client several times, size restrictions apply, and cookies might be disabled.
Handling a huge amount of data:
In the comments you mention that the data is huge. Therefore, I'd suggest storing the data once it is generated on the MVC side, e.g. in the database (or even the file system), and transferring only the id needed to retrieve the data on the WebForms end via one of the methods above. You might also need to erase the prepared data once it has been used, or after some time, in order to clean up left-over records.
Recreating the data in the Webform:
If you do not want to store the data in some kind of cache (database, file, server cache), and if you can recreate the data in the WebForms request (obviously you are able to create it in the MVC request), you can instead transfer to the WebForm only the bit of data required to recreate it. This is bad for performance, but good in that the user always sees up-to-date information and you don't have to clean a cache when the data is no longer needed.
In order to share the data-creation functionality between the MVC controller and the WebForm, you should move it into a dedicated class that both web front ends use.
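The id-based hand-off described above can be sketched like this; `IReportStore` and the in-memory implementation are illustrative stand-ins (a real store would be a database table or file):

```csharp
// Sketch: persist generated data under an id, pass only the id to the
// WebForm, and load the data back there. Names are hypothetical.
using System;
using System.Collections.Generic;

public interface IReportStore
{
    Guid Save(byte[] data);
    byte[] Load(Guid id);
}

// In-memory stand-in for a database- or file-backed store.
public class InMemoryReportStore : IReportStore
{
    private readonly Dictionary<Guid, byte[]> _items =
        new Dictionary<Guid, byte[]>();

    public Guid Save(byte[] data)
    {
        var id = Guid.NewGuid();
        _items[id] = data;
        return id;
    }

    public byte[] Load(Guid id) => _items[id];
}

// MVC action:      var id = store.Save(GenerateHugeData());
//                  return Redirect("~/Report.aspx?id=" + id);
// WebForm load:    var data = store.Load(Guid.Parse(Request.QueryString["id"]));
```

Only the short id travels between the two front ends, so none of the size or visibility limits of query strings and cookies apply to the payload itself.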
If the pages are in different domains:
From the MVC controller action, make an HttpWebRequest to the WebForm page and put the data you want to send in the request body. On the receiving side, read the data from the Request object.
If the pages are in the same domain:
You can use the Cache (server side), cookies (client side), or hidden fields (for a form POST from the MVC controller to the ASPX page).
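The cross-domain variant can be sketched as follows; the URL and helper name are illustrative, and HttpWebRequest is the classic API the answer refers to (HttpClient is the modern alternative):

```csharp
// Sketch: POST data from an MVC action to a WebForm page on another
// domain. CrossDomainSender and the URL are hypothetical.
using System.IO;
using System.Net;
using System.Text;

public static class CrossDomainSender
{
    public static void Send(string url, string payload)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "POST";
        request.ContentType = "application/x-www-form-urlencoded";

        byte[] body = Encoding.UTF8.GetBytes(payload);
        request.ContentLength = body.Length;
        using (Stream stream = request.GetRequestStream())
            stream.Write(body, 0, body.Length);

        // The WebForm reads the body via Request.Form / Request.InputStream.
        using (request.GetResponse()) { }
    }
}
```

Usage would be something like `CrossDomainSender.Send("https://other.example/Page.aspx", "key=value")` from inside the controller action.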
I was looking through an old project and wanted to see if anyone had a suggestion on how to hide certain methods from being called by various layers. This was a 3-tier project: web application -> web service -> database.
In the application there is a User object, for example. When a User was being updated, the web application would create a User object and pass it to the web service. The web service would use the DataAccessLayer to save the User object to the database. After looking at this, I wondered whether I should instead have made a Save method on the User class. That way the service could simply call Save on the User object, which would trigger the DB update.
However, doing it this way would expose Save to be called from the web application as well, correct? Since the web application also has access to the same User object.
Is there any way around this, or is it better to avoid this altogether?
There is a separation of concerns in keeping the User object as an object that only holds data, with no logic in it. You'd better keep it separated, for the following reasons:
As you stated, it is bad practice, since the 'Save' functionality would be exposed to other places/classes where it is irrelevant (this is important in programming generally).
Modifying the service layer: I guess you are using a WCF web service, since you can transfer a .NET (C#/VB) object to the service via SOAP. If you put the saving logic in the 'User' object, you can't replace the service with another one that receives simple textual data structures like JSON or XML, or that simply doesn't support .NET objects.
Modifying the data storage layer: if you want, for example, to store the data somewhere else, such as another database like MongoDB, RavenDB, or Redis, you would have to reimplement every class responsible for updating the data. This is also relevant for unit testing and mocking, which become more complicated.
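The separation described above can be sketched like this; the interface and in-memory implementation are illustrative (the real one would wrap the DataAccessLayer), and the point is that `User` stays a pure data holder:

```csharp
// Sketch: User is a plain DTO; persistence lives behind IUserRepository,
// so only layers that receive the repository can save. Names are
// hypothetical.
using System.Collections.Generic;

public class User                 // data only, no Save()
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public interface IUserRepository
{
    void Save(User user);
}

// In-memory stand-in; the real implementation would use the
// DataAccessLayer / database.
public class InMemoryUserRepository : IUserRepository
{
    private readonly Dictionary<int, User> _users = new Dictionary<int, User>();
    public void Save(User user) => _users[user.Id] = user;
    public User Find(int id) => _users.TryGetValue(id, out var u) ? u : null;
}
```

Because the web application only ever sees `User` (not the repository), it cannot trigger a save; the web service, which is handed an `IUserRepository`, can. Swapping SQL Server for MongoDB or Redis then means writing one new implementation of the interface.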
While developing a web site (using Entity Framework) I have encountered the following questions:
1. What happens if a lot (let's say 10,000) of people try to write simultaneously to the same table in the DB (SQL Server) via Entity Framework?
2. In my project I have modules, and for decoupling reasons I use a singleton class (ModulesManager) which takes an Action from each module and executes it asynchronously, like the following:
public void InsertNewRecord(Action addNewRecordAction)
{
    if (addNewRecordAction != null)
    {
        addNewRecordAction.BeginInvoke(recordCallback, null);
    }
}
Is it a good approach to use a singleton class as the only place responsible for writing to the DB?
3. Can Entity Framework provide the same speed as raw SQL queries?
What happens if a lot (let's say 10,000) of people try to write simultaneously to the same table in the DB (SQL Server) via Entity Framework?
If you mean inserting into the same table, those inserts will be processed according to the transaction isolation level in the database. Usually only a single transaction can hold a lock for insertion, so inserts are processed in sequence (this has nothing to do with EF). Having 10,000 users inserting concurrently doesn't seem like a sustainable architecture - some of them may time out.
In my project I have modules, and for decoupling reasons I use a singleton class (ModulesManager) which takes an Action from each module and executes it asynchronously.
Your manager invokes the action asynchronously, so the answer mostly depends on what the action does. If it opens its own context, performs some changes, and saves them, you should not have any technical problem on the EF side.
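A sketch of such a safe action, assuming a context class and entity that are illustrative here (`MyDbContext`, `Record` are not from the original); the key point is that each invocation gets its own context, since a DbContext is not thread-safe:

```csharp
// Sketch: the Action passed to ModulesManager opens and disposes its
// own EF context, so no context is shared across threads.
// MyDbContext / Record are hypothetical names.
Action addNewRecordAction = () =>
{
    using (var db = new MyDbContext())
    {
        db.Records.Add(new Record { /* populate fields */ });
        db.SaveChanges();   // isolated per invocation
    }
};

// modulesManager.InsertNewRecord(addNewRecordAction);
```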
Can Entity Framework provide the same speed as raw SQL queries?
No. EF does additional processing (such as translating queries and materializing entities), so it will always be somewhat slower.