Life cycle of business objects in a web application [closed] - web-deployment

This is quite a general question about software design in a web application intended to run on a company intranet. The basic design is simple: I have a backend (PHP) on a server, where a database (MySQL) holds all data for the application. The application is used in a company and is meant to support employee-related tasks. The frontend webpage (HTML, CSS) shows a UI for manipulating employee data etc.
A basic business object here is the employee, which is represented by a PHP class. When the frontend asks to show data for an employee, the query goes to the PHP class, which loads the employee data from the database, instantiates an employee object and sends the employee data back to the client. Further requests from the client involve some logic that is done in the employee object, so I wanted to keep the just-instantiated employee object in memory and use it again. I wanted to avoid hitting the database and instantiating the object anew every time the client makes a request.
So I created an object manager in PHP on the server side which should store already-instantiated business objects and provide them on demand, or load objects at the moment they are needed (lazy loading). But I realized that (of course) the server does not keep instances in memory between different HTTP requests from the client, so my object manager does not work.
In a desktop application this would work, but not in a web application. Is this a bad approach for web application design? Is it normal that business objects are loaded and instantiated on every client request?
Another possibility is to define and instantiate the employee objects in a JavaScript class and do this logic on the client side, where objects can be kept in memory. But I thought it is better to do the business logic on the server so as not to stress the client too much.

I wanted to avoid hitting the database and instantiating the object anew every time the client makes a request.
You are headed for a great deal of complexity if this is really a requirement in your environment. I would benchmark the actual cost of simply re-reading the data and judge whether the added complexity is really needed.
You are trying to implement an object cache. If you solve the issue of keeping objects in memory between HTTP requests (it can be done), you will soon run into questions of concurrency (multiple clients acting on objects that you have cached) and of transactionality (changes to the object cache must be written to the database in a transactionally consistent manner), among other issues.
If you really need this functionality, look into existing object cache implementations.

Related

How to structure a Client–Server data Model solution?

I need to write a client–server solution. The server will perform scheduled operations and also serve up data from a SQL DB to the client.
The client is yet to be fully defined, but it will make requests to the server, display data for the user and pass data back for persistence.
The whole solution is dealing with entities (Users, Products, etc. with their associated attributes).
In my head, both the server and the client need to be aware of these entities in order for them to be efficiently manipulated in code rather than having to unpack JSON and duplicate code.
My question is, should I make a class library containing models (classes or structs) representing these entities that is referenced by both the client- and server-side projects?
Otherwise, is there some standard way of building such a solution?
Thus far I have a client, a server (based on ASP.NET 2) and a Class Library containing entity Models along with some data access logic. Both the client and server projects reference the Class Library. One day in and I’m already starting to doubt my approach as being too clumsy.
I will be working with VS2019 using C#.
This isn't really a question well suited to StackOverflow, which aims to solve specific code/tech problems.
It is possible to use the same model (entity) in both client and server, but I highly recommend separating the client model (view model) from the domain model (entity). The reasons for this are:
Clients rarely need, or should be exposed to, every domain field and relationship. Sending entities from server to client involves serialization. This can result in performance issues or errors as the serializer "touches" properties and tries to lazy-load them, or you add the cost of eager-loading everything; or it results in incomplete models where unloaded relationships are left null (not because there aren't any, they just weren't loaded). Client models should be trimmed down to just the data the client needs to see, formatted in a way it can use. Sending full entities ultimately ships more data than needed to the client and back; keep the payloads over the wire as small as possible.
Security can be an issue when passing entities from client to server. Your UI may only allow users to change a few values in a very particular way, but the temptation is to take that entity, attach it to a DbContext and update it (one-line updates). However, an entity sent from the client can very easily be tampered with in the browser, which can result in changes being made that you don't expect or allow (i.e. changing an FK relationship).
At best this allows stale data overwrites, where changes made after that record was sent to the client are silently overwritten when the client gets around to submitting its change. Don't trust data coming from a client, especially under the premise of "saving time". Update requests should validate the incoming data and re-load the entity to check things like the row version before updating allowed values.
Populating view models can be done using a technique supported in EF called projection. This can either be hand-written using .Select or done with tools like AutoMapper and its ProjectTo method to easily transform entities and LINQ expressions into simple, dumb, serializable view models. When a view model comes back to the server, you simply load the entity and its associations from the DB by ID, update the allowed values after validation, and call SaveChanges to persist.
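As a rough illustration of the hand-written projection approach (the DbContext, entity and view model names below are assumptions for the sketch, not from the original post):

using System.Linq;

public class EmployeeSummary
{
    public int Id { get; set; }
    public string FullName { get; set; }
    public string DepartmentName { get; set; }
}

public class EmployeeReadService
{
    private readonly AppDbContext _db; // assumed DbContext exposing an Employees DbSet

    public EmployeeReadService(AppDbContext db)
    {
        _db = db;
    }

    public EmployeeSummary GetSummary(int id)
    {
        // Projection: only the selected columns are queried, nothing is lazy-loaded,
        // and no tracked entity ever leaves the server.
        return _db.Employees
            .Where(e => e.Id == id)
            .Select(e => new EmployeeSummary
            {
                Id = e.Id,
                FullName = e.FirstName + " " + e.LastName,
                DepartmentName = e.Department.Name
            })
            .SingleOrDefault();
    }
}

AutoMapper's ProjectTo produces essentially the same Select expression from the mapping configuration, so you don't have to write the projection by hand for every view model.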

ASP.Net MVC and Database Connections [closed]

I currently have an ASP.NET MVC application that uses a static class to connect to the database. In some tests I noticed that when I start using the application from several different sessions, it becomes extremely slow. With only one session it works acceptably, but from 3 sessions on it becomes unusable.
The static connection is opened in Application_Start of Global.asax. I believe this slowness is due to all sessions competing for the same connection, right?
Given this, I decided to change how it works, and I have two approaches in mind, but I would like an opinion on which would be better:
1) Establish one connection per session, started in Global.asax; however, I am afraid that, given certain actions the application executes almost simultaneously, this approach will also become slow at some point.
2) Establish a connection for each query action against the database: instantiate the connection, open it, execute the action and close it (as sketched below). But again, given the high number of actions the application performs when loading some pages, I'm afraid of exhausting the connection pool by working this way.
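For reference, a minimal sketch of what option 2 usually looks like with ADO.NET (the connection string name and query here are placeholders); since ADO.NET pools connections by default, opening and closing per operation normally just borrows and returns a pooled connection:

using System.Configuration;
using System.Data.SqlClient;

public static class ReportQueries
{
    // Hypothetical query helper: one pooled connection per operation, disposed immediately.
    public static int CountOrders()
    {
        var connectionString = ConfigurationManager.ConnectionStrings["Default"].ConnectionString;
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT COUNT(*) FROM Orders", connection))
        {
            connection.Open();
            return (int)command.ExecuteScalar();
        }
    }
}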
Can you help me? Do you see another approach that could be used?
Currently we are using ADO.NET; in some tests we did with NHibernate, the slowness was enormous even with just one user.
Thanks for your attention.
(Translated Post)
You can try to use Entity Framework.
http://www.entityframeworktutorial.net/what-is-entityframework.aspx
https://learn.microsoft.com/en-us/aspnet/mvc/overview/getting-started/getting-started-with-ef-using-mvc/creating-an-entity-framework-data-model-for-an-asp-net-mvc-application
Assuming that you already have the application (ASP.NET MVC) and the database created, you need to open Visual Studio, select the Models folder > Add > New Item, and then choose ADO.NET Entity Data Model. After that you will see the Entity Data Model Wizard. In that window choose the first option, EF Designer from Database. Click Next, add the DB connection parameters (DB server, database name), click Next, select the objects from the DB you want to use (tables, stored procedures) and click Finish.
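Once the wizard finishes, queries go through the generated context rather than a shared static connection; a minimal sketch of using it from a controller (the context and entity names below depend on what the wizard generates and are only placeholders):

using System.Linq;
using System.Web.Mvc;

public class ProductsController : Controller
{
    public ActionResult Index()
    {
        // The generated context manages its own connection and returns it to the pool
        // when disposed, so nothing is shared between sessions.
        using (var db = new MyDatabaseEntities()) // hypothetical wizard-generated context
        {
            var products = db.Products.OrderBy(p => p.Name).ToList();
            return View(products);
        }
    }
}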
Hope it helps!

When is the session abandoned during a typical session? [closed]

I'm trying to figure out an issue I'm having with Sitecore. I'm wondering if my issue is basically a problem with their reliance on Session.Abandon():
For performance reasons Sitecore only writes contact data to xDB (this is Mongo) when the session ends.
This logic seems somewhat flawed (unless I misunderstand how sessions are managed in ASP.NET).
At what point (without explicitly calling Session.Abandon()) is the session flushed in this model? I.e., when will the Session_End event be triggered?
Can you guarantee that the logic will always be called, or can sessions be terminated without triggering an Abandon event, for example when the app pool is recycled?
I'm trying to figure this out as it would explain something I'm experiencing, where the data is fine in session but is written only intermittently to MongoDB.
I think the strategy of building up the data in session and then flushing it to MongoDB fits xDB well.
xDB is designed to be high volume, so it makes sense for the data to be aggregated rather than constantly being written into a database table. That is the way DMS worked previously, and it doesn't scale very well.
The session end is, in my opinion, pretty reliable, and Sitecore gives you various options for persisting session state (InProc, MongoDB, SQL Server); MongoDB and SQL Server are recommended for production environments. You can write contact data directly to MongoDB by using the Contact Repository API, but for live capturing of data you should use the Tracker API. When using the Tracker API, as far as I am aware, the only way to get data into MongoDB is to flush the session.
If you need to flush the data to xDB for testing purposes, then Session.Abandon() will work. I have a module here which you can use for creating contacts and then flushing the session, so you can see how reliable the session abandon is by checking in MongoDB.
https://marketplace.sitecore.net/en/Modules/X/xDB_Contact_Creator.aspx

Static variable across multiple requests

In order to improve the speed of a chat application, I am remembering the last message id in a static variable (actually, a Dictionary).
However, it seems that every thread has its own copy, because users do not get updates in production (single-server environment).
private static Dictionary<long, MemoryChatRoom> _chatRooms = new Dictionary<long, MemoryChatRoom>();
No ThreadStatic attribute is used...
What is a fast way to share a few ints across all application processes?
Update
I know that the web must be stateless. However, for every rule there is an exception. Currently all data is stored in MS SQL, and in this particular case some piece of shared memory would increase performance dramatically and avoid needless SQL requests.
I have not used statics for years, so I even missed the moment when a single application started running as multiple instances.
So the question is: what is the simplest way to share in-memory objects between processes? For now my workaround is remoting, but there is a lot of extra code and I am not 100% sure about the stability of this approach.
I'm assuming you're new to web programming. One of the key differences between a web application and a regular console or Windows Forms application is that it is stateless. This means that every page request is basically initialised from scratch. You're using the database to maintain state, but as you're discovering this is fairly slow. Fortunately you have other options.
If you want to remember something frequently accessed on a per-user basis (say, their username) then you could use session. I recommend reading up on session state here. Be careful, however, not to abuse the session object -- since each user has his or her own copy of session, it can easily use a lot of RAM and cause you more performance problems than your database ever was.
If you want to cache information that's relevant across all users of your apps, ASP.NET provides a framework for data caching. The simplest way to use this is like a dictionary, eg:
Cache["item"] = "Some cached data";
I recommend reading in detail about the various options for caching in ASP.NET here.
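A slightly fuller sketch of the same cache used with an absolute expiration (the key format, helper class and 10-minute lifetime are just illustrative choices):

using System;
using System.Web;
using System.Web.Caching;

public static class LastMessageCache
{
    // Hypothetical helper around the ASP.NET cache; the cache is shared by all
    // requests within one worker process, unlike Session which is per user.
    public static void SetLastMessageId(long roomId, long messageId)
    {
        HttpRuntime.Cache.Insert(
            "lastMessage:" + roomId,
            messageId,
            null,                            // no cache dependency
            DateTime.UtcNow.AddMinutes(10),  // absolute expiration
            Cache.NoSlidingExpiration);
    }

    public static long? GetLastMessageId(long roomId)
    {
        return HttpRuntime.Cache["lastMessage:" + roomId] as long?;
    }
}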
Overall, though, I recommend you do NOT bother with caching until you are more comfortable with web programming. As with any type of globally shared data, it can cause unpredictable issues which are difficult to diagnose if misused.
So far, there is no easy way to communicate between processes. (And maybe this is good, in terms of isolation and scaling.) For example, this is mentioned explicitly here: ASP.Net static objects
When you really need a web application/service to remember some state in memory, and NOT in the database, you have the following options:
1) Set the Max Processes count to 1. This requires moving this piece of code into a separate web application. If you make it a separate subdomain, you will have cross-site scripting issues when accessing it from JS.
2) Remoting/WCF - you can host the critical data in a remoting application and access it from the web application.
3) Store the data in every process and synchronize changes via memcached. Memcached doesn't hold the actual data, because it would take too long to transfer it - only the last-changed date per collection.
With #3 I am able to achieve more than 100 pages per second from a single server.

What are the common issues and best practices when using ASP.NET session state? [closed]

For example, I make extensive use of the session in my ASP.NET application, but have heard somewhere that objects stored in session can be removed by the system when server memory runs low. Is this true? Is there any session 'callback' functionality to allow you to re-populate scavenged objects?
More generally, what other potential problems of using session state are there, and what are the suggested best practices for dealing with them?
No matter which precautions you use, always assume your Session may disappear and double check:
Dim sessionObj As Object = CType(Session("SessionKey"), Object)
If sessionObj Is Nothing Then
    sessionObj = ReCreateObj()
    Session("SessionKey") = sessionObj
End If

object sessionObj = Session["SessionKey"];
if (sessionObj == null)
{
    sessionObj = ReCreateObj();
    Session["SessionKey"] = sessionObj;
}
A Session wrapper works well for this so you don't have to repeat the check everywhere you access your Session vars.
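A minimal sketch of such a wrapper (the class, method name and generic shape are just one way to do it):

using System;
using System.Web;

public static class SessionStore
{
    // Re-creates and re-stores the value if the session entry was scavenged or expired.
    public static T GetOrCreate<T>(string key, Func<T> recreate) where T : class
    {
        var session = HttpContext.Current.Session;
        var value = session[key] as T;
        if (value == null)
        {
            value = recreate();
            session[key] = value;
        }
        return value;
    }
}

// Usage (ShoppingCart is a placeholder type):
// var cart = SessionStore.GetOrCreate("Cart", () => new ShoppingCart());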
I've had good luck with custom session-state providers. A couple of useful tweaks include:
When using a database-backed session, fetch all of your session vars at the beginning of the request and store them all at the end, versus fetch-per-access/store-per-set. This is useful if you're using more than a couple of session vars, as it cuts down on round trips to an external server.
Implement a two-step memory/database store. On save, store your var in memory on the local server and save it to the database as well. When it is accessed, check memory first (if you find it there, you save a network hop), then the database.
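A rough sketch of that two-step read/write path, separate from the actual provider API (the cache lifetime and the database calls are placeholders):

using System;
using System.Runtime.Caching;

public class TwoLevelSessionStore
{
    private readonly MemoryCache _local = MemoryCache.Default;

    public object Get(string key)
    {
        // 1) Check the local in-memory copy first to avoid a network hop.
        var value = _local.Get(key);
        if (value != null)
            return value;

        // 2) Fall back to the database-backed store and repopulate the local copy.
        value = LoadFromDatabase(key);
        if (value != null)
            _local.Set(key, value, DateTimeOffset.UtcNow.AddMinutes(5));
        return value;
    }

    public void Save(string key, object value)
    {
        // Write-through: keep the local copy and persist to the database.
        _local.Set(key, value, DateTimeOffset.UtcNow.AddMinutes(5));
        SaveToDatabase(key, value);
    }

    // Hypothetical persistence calls; a real provider would serialize the session blob here.
    private object LoadFromDatabase(string key) { return null; }
    private void SaveToDatabase(string key, object value) { }
}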
I'd definitely see if one of the three standard session modes works before implementing something you'll have to support. Custom sessions are useful once you know more about your app's quirky needs.
The session is also erased every time the application pool is restarted. For example, create a page that sets a session variable and shows it on the page; now update your BIN, App_Code, Global or Local Resources folder and you will see that the session is wiped.
In shared hosting environments it is even trickier, because they normally lock the session timeout at 20 minutes, no matter what Session.Timeout value you set.
For these and more reasons I use SQL session state. The web application is a little slower because everything (all sessions) lives in SQL Server instead of in memory, but you have much more control over it and you can set the session timeout to whatever you want! That is very good in my case because I want users to keep working while I'm updating the application (it was very bad that every time I updated the App_Code folder all users lost what they were working on and had to log in again), and I want to extend the shared hosting's 20 minutes to 45.
Read more about SQL Session State
Sorry, I don't know about the removal of items from session state, but in response to your general question, the most important consideration is whether you anticipate running your web app on just one web server machine or on many.
Typically, you keep the session state in the memory of one single machine, so if you run multiple web servers, the load balancer that fronts them must be configured to make things "sticky", so that the same user keeps being routed to the same machine.
Alternatively, you could store session state in a central location, such as SQL Server, but then you pay the performance hit of the network and IO interaction with SQL Server; you give up the speed of local, in-memory state.
Since the session timeout is 20 minutes by default, if you need more than that but your hosting company won't let you override it, you would have to have some AJAX or hidden-iframe JavaScript code running in your web pages that pings the server every (less than) 20 minutes to keep the session alive.
