Creating Global Variable For Web Application - ASP.NET

I'm building my web application to connect with a database. So far I have managed to deal with it (although I didn't build a BLL and DAL).
I have a table with an "id" column. I know there is a way to declare it in SQL Server so that it is incremented automatically, but I don't want that.
I want to declare a global application variable that will hold the value.
I have 2 questions:
How do I declare it?
Where do I create it and initialize it? (I have several login pages.)
Thanks!
P.S. It would be helpful if someone could tell me how to build the DAL with my stored procedures, and what I need the BLL for that I can't do in the DAL.

You can use the Application object - it is part of the HttpContext and is directly accessible on any page.
If you don't want to use it, you may want to write a Globals class (or whatever name you like) that holds static members.
public class Globals
{
    public static int Counter { get; set; }
}

// accessed from other classes:
Globals.Counter++;
Either approach will not work if you have a web farm or several web applications, and neither will survive application restarts.
Regardless of these options, the right solution (even if you don't want to use it - can you explain why?) is to use the ID field with the IDENTITY clause.
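Also note that Globals.Counter++ is not atomic, so two concurrent requests can read the same value. A minimal thread-safe sketch of the static-member route (the Globals name is just the placeholder from above):
using System.Threading;

public class Globals
{
    private static int _counter;

    // Atomically returns the next value; safe for concurrent requests within
    // a single application instance (still not safe across a web farm, and
    // still lost on an application restart).
    public static int NextCounter()
    {
        return Interlocked.Increment(ref _counter);
    }
}

// usage from a page or handler:
int nextId = Globals.NextCounter();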

Storing the variable is the easy part. Managing your own ID generation and the contention and concurrency issues is the hard part. Good luck.

There really is no such thing as a global variable in ASP.NET. Remember, HTTP is stateless.
The closest you can come is storing something in the Application object:
Application["myvar" ] = x;
x = Application["myvar"];
But even here, this variable is lost when the app needs to restart, which it can do from time to time.
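Also, if you do keep an incrementing value in the Application object, the read-modify-write should be wrapped in Lock/UnLock so two simultaneous requests don't grab the same number. A rough sketch (the "NextId" key is just an illustrative name):
// inside a page or handler, where Application is the HttpApplicationState instance
Application.Lock();
try
{
    int current = (int)(Application["NextId"] ?? 0);
    Application["NextId"] = current + 1;
}
finally
{
    Application.UnLock();
}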
A much better solution for what you describe is a database value.

Incrementing an integer and then throwing that incremented ID into the db is fraught with danger. Multithreading? What happens when the application bounces? Do dev and prod deployments share the same set of numbers?
It sounds like you need a globally unique identifier that can be created outside of the database. That sounds like a job for a GUID. Sure, it takes up more space in the db, but it probably isn't the worst thing you are going to do to the database.
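A minimal sketch of assigning such an ID in code before the insert (MyRecord and its properties are placeholders, not anything from the question):
using System;

var record = new MyRecord
{
    // Guid.NewGuid() is unique across machines, so it keeps working on a web
    // farm and survives application restarts, unlike an in-memory counter.
    Id = Guid.NewGuid(),
    Name = "example"
};
// pass the record to the stored procedure / DAL insert as usual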

EF Caching: How to detach objects *completely* before inserting them into HttpRuntime cache?

Some background:
Working with:
.NET 4.5 (thinking of migrating to 4.5.1 if it's painless)
Web Forms
Entity Framework 5, Lazy Loading enabled
Context Per Request
IIS 8
Windows 2012 Datacenter
Point of concern: Memory Usage
In the project we are currently on, probably our first bigger project, we often read larger chunks of data, coming from CSV imports, that are likely to stay the same for very long periods of time.
Unless someone explicitly re-imports the CSV data, it is guaranteed to stay the same. This happens in more than one place in our project, and a similar approach is used for some regular documents that are often read by the users. We've decided to cache this data in the HttpRuntime cache.
It goes like this, and we pull about 15,000 records consisting mostly of strings.
//myObject and related methods are placeholders
public static List<myObject> GetMyCachedObjects()
{
    if (CacheManager.Exists(KeyConstants.keyConstantForMyObject))
    {
        return CacheManager.Get(KeyConstants.keyConstantForMyObject) as List<myObject>;
    }
    else
    {
        List<myObject> myObjectList = framework.objectProvider.GetMyObjects();
        CacheManager.Add(KeyConstants.keyConstantForMyObject, myObjectList, true, 5000);
        return myObjectList;
    }
}
The data retrieving for the above method is very simple and looks like this:
public List<myObject> GetMyObjects()
{
    return context.myObjectsTable.AsNoTracking().ToList();
}
There are probably things to be said about the code structure, but that's not my concern at the moment.
I began profiling our project as soon as I saw high memory usage and found many parts where our code could be optimized. I had never faced 300 simultaneous users before, and our internal tests, done by ourselves, were not enough to reveal the memory issues. I've identified and fixed numerous memory leaks, but I'd like to understand some Entity Framework related unknowns.
Given the above example, and using ANTS Profiler, I've noticed that 'myObject', and other similar objects, are referencing many System.Data.Entity.DynamicProxies.myObject instances; additionally, there are lots of EntityKeys which hold on to integers. They aren't taking up much memory, but their count is relatively high.
For instance, 124 instances of 'myObject' reference nearly 300 System.Data.Entity.DynamicProxies instances.
Usually it looks like this, whatever the object is:
a cache entry, then the object I cached (and I now noticed many of them had been detached from the dbContext prior to caching), then the dynamic proxies, and finally the ObjectContext. I have no idea how to untie them.
My progress:
I did some research and found out that I might be caching something Entity Framework related together with those objects. I pulled them with AsNoTracking, but those DynamicProxies are still in memory and probably hold on to other things as well.
Important: I've observed some live instances of ObjectContext (74), slowly growing, but no instances of my unitOfWork, which holds the dbContext. Those seem to be disposed of properly on a per-request basis.
I know how to detach, attach or modify the state of an entry from my dbContext, which is wrapped in a unitOfWork, and I often do it. However, that doesn't seem to be enough, or I am asking for the impossible.
Questions:
Basically, what am I doing wrong with my caching approach when it comes to Entity Framework?
Is the growing number of ObjectContexts in memory a concern? I know the cache will eventually expire, but I'm worried about open connections or anything else those contexts might be holding on to.
Should I be detaching everything from the context before inserting it into the cache?
If yes, what is the best approach? Especially with a List, I cannot think of anything other than iterating over the collection and calling Detach one by one.
Bonus question: about 40% of the consumed memory is free (unallocated); I have no idea why .NET reserves so much free memory in advance.
You can try using a non-entity class with only the specific properties you need, populated with a Select projection:
public class MyObject2
{
    public int ID { get; set; }
    public string Name { get; set; }
}

public List<MyObject2> GetObjects()
{
    return framework.provider.GetObjects().Select(
        x => new MyObject2
        {
            ID = x.ID,
            Name = x.Name
        }).ToList();
}
Since you will be storing plain C# objects, you will not have to worry about dynamic proxies, and you will not have to call Detach on anything at all. You can also store only the few properties you need.
Even if you disable tracking, you will still see dynamic proxies, because EF uses a dynamic class derived from your class that stores extra metadata for the entity (relations, e.g. the names of foreign keys to other entities).
Steps to reduce memory here:
Renew the context often. Don't try to delete content from the context or set entries to Detached - the context hangs around like a fart in a phone box.
e.g. context = new MyContext();
But if possible you should be using short-lived contexts, which is best practice:
using (var context = new MyContext()) { ... }
On your context you can set configuration options:
this.Configuration.LazyLoadingEnabled = false;
this.Configuration.ProxyCreationEnabled = false; //<<<<<<<<<<< THIS one
this.Configuration.AutoDetectChangesEnabled = false;
You can disable proxies if you still feel they are hogging memory, but that may be unnecessary if you apply the using pattern to the context in the first place.
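A minimal sketch of what that could look like, assuming an EF 5 DbContext subclass named MyContext (the class, connection string name and DbSet are placeholders):
using System.Data.Entity;
using System.Linq;

public class MyContext : DbContext
{
    public MyContext() : base("name=MyConnectionString")
    {
        // no proxies, no lazy loading: queried entities stay plain objects and
        // don't drag proxy metadata or the context itself into the cache
        this.Configuration.ProxyCreationEnabled = false;
        this.Configuration.LazyLoadingEnabled = false;
        this.Configuration.AutoDetectChangesEnabled = false;
    }

    public DbSet<myObject> MyObjects { get; set; }
}

// short-lived usage, disposed at the end of the request/operation:
using (var context = new MyContext())
{
    var list = context.MyObjects.AsNoTracking().ToList();
    // cache 'list' - it now holds plain myObject instances, not DynamicProxies
}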
I would redesign the solution a bit:
You are storing all data as a single cache entry - I would move this and have an entry per cached item.
You are using the HttpRuntime cache - I would use AppFabric Caching, also MS, also free.
Not sure where you are calling that code from - I would call it on application start, so all data is in memory before any user needs it.
You are using Entity SQL - for this I would use an EntityDataReader: http://msdn.microsoft.com/en-us/library/system.data.entityclient.entitydatareader(v=vs.110).aspx
See also:
http://msdn.microsoft.com/en-us/data/hh949853.aspx
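A rough sketch of the entry-per-item idea mentioned above, still using HttpRuntime.Cache and loading once at application start (it assumes myObject exposes an Id property; the key format is made up purely for illustration):
// Global.asax.cs
protected void Application_Start()
{
    List<myObject> all = framework.objectProvider.GetMyObjects();
    foreach (var item in all)
    {
        // one cache entry per record instead of one big list
        HttpRuntime.Cache.Insert("myObject:" + item.Id, item);
    }
}

// later, fetch a single record without touching the database:
var single = HttpRuntime.Cache["myObject:" + someId] as myObject;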

How to setup single-code multiple-database ASP.NET SaaS application

I am building a SaaS application and I would like to retain the single code base I have. I would like each customer to be on a separate sub-domain: cust1.saascompany.com, cust2.saascompany.com, etc.
However, I don't have any TenantIDs and would prefer, for multiple reasons, to stay with a separate database for each customer (the primary one being that it's already coded that way and it doesn't make sense to change until usage warrants it). The database holds the user login membership as well.
I'm guessing I would need separate web.configs for the connection strings? Or should I create a separate database that stores all the connection strings and any application-level variables/constants? Eventually, I would like to be able to automate this provisioning (again, when usage warrants it).
Are there some articles or posts that anyone can point me to regarding how to set this up with steps? I haven't been able to find what I'm looking for.
Technically, this is simple; we have been doing this for years. Although we use a different convention (my.domain.com/cust1, my.domain.com/cust2 plus URL rewriting), this doesn't change anything.
So, this is what you do. You create an abstract specification of a connection string provider:
public interface ICustomerInformationProvider
{
    string GetConnectionString( string CustomerId );
    ... // perhaps other information
}
then you provide any implementation you want like:
public class WebConfigCustomerInformationProvider : ICustomerInformationProvider { ... }
public class DatabaseConfigCustomerInformationProvider : ICustomerInformationProvider { ... }
public class XmlConfigCustomerInformationProvider : ICustomerInformationProvider { ... }
and you map your interface onto the implementation somehow (for example, using an IoC Container of your choice).
This gives you the chance to configure the provider during deployment; for example, one provider can be used by developers (reading connection strings from a file) and another one in the production environment (reading connection strings from a database, which can be easily provisioned).
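For instance, a rough sketch of the web.config-backed variant, assuming each customer's connection string is registered under a name such as "Customer_cust1" (that naming convention is made up purely for illustration):
using System.Configuration;

public class WebConfigCustomerInformationProvider : ICustomerInformationProvider
{
    public string GetConnectionString( string CustomerId )
    {
        // look up a connection string named after the customer, e.g. "Customer_cust1"
        var settings = ConfigurationManager.ConnectionStrings["Customer_" + CustomerId];
        if (settings == null)
            throw new ConfigurationErrorsException(
                "No connection string configured for customer " + CustomerId);

        return settings.ConnectionString;
    }
}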
If you have other questions, feel free to ask.

Security Issue with ASP.NET and SQL Server

A problem appears when two users are logged on to our service system at the same time and looking at the service list gridview. If user1 does a search to filter the gridview and user2 happens to click to another page, user2 sees the results of the search performed by user1. That means one company can see another company's data.
It's an ASP.NET application that was developed in house with C#/ASP.NET 3.5. The data is stored in a SQL Server 2000 database and relies very heavily on stored procedures to update, select, and delete data. There are multiple user types that are restricted in what data they can see. For example, we have a company user that can only see data relevant to that company.
From what I've seen, the security is handled through if statements in the front end. For example, if userlevel = 1 then do this; if userlevel = 2 do this. These statements are used to show or hide columns in a grid, run queries to return data, and apply any other restrictions needed. For a company user, the code-behind gets the companyid assigned to the user and uses it in a query to return all the data associated with that companyid (services, ships, etc.).
Any recommendations for fixing this will be highly appreciated.
It's hard to say without seeing any implementation details, but on the surface it appears that there may be some company-level caching. Check for OutputCache settings, DataSource caching, explicit caching with Page.Cache, etc.
This article is a little dated, but at a glance it looks like most information is still relevant in ASP.NET 4.0.
ASP.NET Caching: Techniques and Best Practices
In addition to jrummerll's answer, check the Data Access Layer of your app and make sure that you don't have any static variables defined. Having a static variable defined could cause this sort of issue too, since two contending requests may overwrite the value of the CompanyID, for example.
Your basic model should work. What you've told us is not enough to diagnose the problem, but I've got a few guesses. Most likely your code is confusing UserID or CompanyID values.
Are you mistakenly storing the CompanyID in the Cache, rather than the session?
Is the CompanyID stored in a static variable? A common (and disastrous!) pitfall in web applications is that a value stored in a static variable will remain the same for all users! In general, don't use static variables in asp.net apps.
Maybe your db caching or output caching doesn't vary properly by session or other variables. So, a 2nd user will see what was created for the previous user. Stop any caching that's happening and see if that fixes it, but debug from there.
Other variations on the above themes: maybe the query is stored in a static variable. Maybe these user-related values are stored in the cache or db, but the key for that record (UserID?) is stored in a static variable?
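To illustrate the static-variable pitfall and the per-user alternative, a rough sketch, assuming the methods live in a page code-behind where Session is available (all names are placeholders):
// WRONG: a static field is shared by every request in the application,
// so the last user to log in overwrites the value for everyone.
public static class UserContext
{
    public static int CompanyID;
}

// BETTER: store the value per user in session state when the login succeeds...
protected void OnLoginSucceeded(int companyIdFromLogin)
{
    Session["CompanyID"] = companyIdFromLogin;
}

// ...and read it back on each request when building the query.
protected int GetCurrentCompanyId()
{
    return (int)Session["CompanyID"];
}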
You can put those if statements in a thread. Threading gives you the option of allowing only one user at a time to access the application, or the gridview in your case.
See this link: http://msdn.microsoft.com/en-us/library/ms173179.aspx
Here is some sample code, used throughout the entire application, for filtering results. What is the best way to fix this so that when one user logs on, another user doesn't see that user's results?
protected void PopulategvServiceRequestListing(string _whereclause)
{
    _dsGlobalDatasource = new TelemarServiceRequestListing().GetServiceRequestListingDatasource(_whereclause);
    if (_dsGlobalDatasource.Tables[0].Rows.Count != 0)
    {
        gv_ServiceRequest.DataSource = _dsGlobalDatasource;
        gv_ServiceRequest.DataBind();
    }
    else
    {
        gv_ServiceRequest.DataSource = new TelemarServiceRequestListing().DummyDataset();
        gv_ServiceRequest.DataBind();
        gv_ServiceRequest.Rows[0].Visible = false;
        gv_ServiceRequest.HeaderStyle.Font.Bold = true;
    }
}

NHibernate compromising domain objects

I'm writing an ASP.NET MVC application using NHibernate as my ORM. I'm struggling a bit with the design though, and would like some input.
So, my question is where do I put my business/validation logic (e.g., an email address requires an "@", a password must be >= 8 characters, etc.)?
So, which makes the most sense:
Put it on the domain objects themselves, probably in the property setters?
Introduce a service layer above my domain layer and have validators for each domain object in there?
Maintain two sets of domain objects. One dumb set for NHibernate, and another smart set for the business logic (and some sort of adapting layer in between them).
I guess my main concern is with putting all the validation on the domain objects used by NHibernate. It seems inefficient to have unnecessary validation checks every time I pull objects out of the database. To be clear, I think this is a real concern since this application will be very demanding (think millions of rows in some tables).
Update:
I removed a line with incorrect information regarding NHibernate.
To clear up a couple of misconceptions:
a) NHib does not require you to map onto properties. Using access strategies you can easily map onto fields. You can also define your own custom strategy if you prefer to use something other than properties or fields.
b) If you do map onto properties, getters and setters do not need to be public. They can be protected or even private.
Having said that, I totally agree that domain object validation makes no sense when you are retrieving an entity from the database. As a result of this, I would go with services that validate data when the user attempts to update an entity.
My current project is exactly the same as yours: MVC for the front end and NHibernate for persistence. Currently, my validation sits in a service layer (your option 2). But while coding I keep feeling that my code is not as clean as I would like. For example:
public class EntityService
{
    public void SaveEntity(Entity entity)
    {
        if( entity.Property1 == something )
        {
            throw new InvalidDataException();
        }
        if( entity.Property2 == somethingElse )
        {
            throw new InvalidDataException();
        }
        ...
    }
}
This makes me feel that the EntityService is a "God Class". It knows way too much about the Entity class and I don't like it. To me, it feels much better to let the Entity classes worry about themselves. But I also understand your concern about the NHibernate performance issue. So, my suggestion is to implement the validation logic in the setters and use field access for the NHibernate mapping, as in the sketch below.
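A minimal sketch of that suggestion, assuming a User entity with an email address and an NHibernate mapping that uses a field access strategy (e.g. access="field.camelcase" on the property mapping), so loading from the database bypasses the setter while application code still gets validated:
using System;

public class User
{
    // NHibernate hydrates this field directly on load, skipping the setter.
    private string email;

    public virtual string Email
    {
        get { return email; }
        set
        {
            // runs only when application code assigns a new value
            if (string.IsNullOrEmpty(value) || !value.Contains("@"))
                throw new ArgumentException("A valid email address is required.");
            email = value;
        }
    }
}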

Need to store a static value for the duration of a request. How?

I have an ASP.NET MVC application. I have come up with the idea of generating auto-incremented values to be used as unique element ids. The question is, how can I have and work with a global variable which exists for the duration of a request (page generation) but no longer?
I thought of using TempData for this shared variable and then just deleting the key when the page is done. But then, where in the code should I purge this TempData key? Obviously it has to be some very last piece of code, where the page has already been rendered.
Any input is highly appreciated.
EDIT: I have a number of HTML helpers that can be called from various views and partial views, so declaring a variable on a page and passing it to each helper is obviously not a good solution. I wish to just use the helpers and know they all are getting unique ids behind the scenes.
Okay, I have googled a little bit and found a solution on ASP.NET forums.
http://forums.asp.net/t/1401685.aspx
Obviously, I can use the HttpContext.Current.Items collection to have my little static variable for the duration of a request.
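A minimal sketch of how that collection could back a per-request counter, for example inside an HTML helper (the key and class names are made up for illustration):
using System.Web;

public static class UniqueIdHelper
{
    private const string Key = "__uniqueElementId";

    // Returns 1, 2, 3, ... within a single request; the count resets for the
    // next request because HttpContext.Items only lives as long as the request.
    public static int NextId()
    {
        var items = HttpContext.Current.Items;
        int current = items.Contains(Key) ? (int)items[Key] : 0;
        items[Key] = ++current;
        return current;
    }
}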
If all you need is to store a number, the resources it would take to manage its lifetime would cost more than just having one static integer and reusing it.
Do not bother deleting the key after each request. Just use a static (I think this is Shared in Visual Basic) integer, and use and increment it every time you need a unique value. Also take its modulo with a ridiculously high number each time, to make sure it will not be reused within a single request and will never overflow.
Why don't you define your integer variable at the top of the page view file?
Use it throughout the view rendering execution, and at the end you can simply leave it as is. You don't have to explicitly destroy anything; your variables live for the duration of the request only. IIS is a stateless service (if you subtract Session, Cache and Application variables), so it doesn't really remember anything explicitly.
I would imagine you could use the Application_BeginRequest and Application_EndRequest methods in global.asax.cs; note that I can't double-check the method names at the moment, but I think they are close.
You could create a member variable in your controller which would be regenerated for each request:
public class ItemController : Controller
{
    private int _UniqueID = 0;

    public ActionResult Index()
    {
        foreach (var item in items)
        {
            item.UniqueID = _UniqueID++;
        }
        // etc...
    }
}
