How to recognize a client's computer? - asp.net

I have a problem recognizing a client's computer data. I'm developing an application that requires some kind of recognition of the machine the client is currently using. I tried
System.Environment.MachineName;
and also
string hostName = Dns.GetHostName();
var add = Dns.GetHostAddresses(hostName);
, but these always return data for the server computer that IIS is running on. I also tried to get the processor and baseboard IDs, but with the same result as the previous examples.
ManagementObjectSearcher mos = new ManagementObjectSearcher("SELECT SerialNumber FROM Win32_BaseBoard");
ManagementObjectCollection moc = mos.Get();
foreach (ManagementObject mo in moc)
{
    serial = mo["SerialNumber"].ToString();
}

ManagementObjectSearcher mbs = new ManagementObjectSearcher("Select * From Win32_Processor");
ManagementObjectCollection mbsList = mbs.Get();
string id = "";
foreach (ManagementObject mo in mbsList)
{
    id = mo["ProcessorID"].ToString();
}
The system will be used inside a private network (it will not be used publicly).
Is there any way to recognize any unique data about the client's computer in an ASP.NET web application, or does .NET simply deny access to client computer data for safety reasons?
Thanks

Since the code runs on the server, "System" is the server. In ASP.NET you can get information about who is making the request (the client) with the Request object. Try this:
Request.UserHostAddress
Or this:
Request.UserHostName
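For example, a minimal sketch of how these might be combined in a controller or page (the helper name is hypothetical; note that UserHostAddress may show a proxy or NAT address, and UserHostName may simply repeat the IP if reverse DNS is not configured):

```csharp
// Minimal sketch: identify the requesting client from the Request object.
// These values describe the remote endpoint as the server sees it, which
// is the closest ASP.NET gets to "client computer data" without running
// code on the client itself.
public string GetClientIdentifier()
{
    string clientIp = Request.UserHostAddress;  // client IP as seen by the server
    string clientHost = Request.UserHostName;   // DNS name, if resolvable
    return string.IsNullOrEmpty(clientHost) ? clientIp : clientHost;
}
```

On a private network with proper reverse DNS, the host name is usually stable enough to recognize a machine; hardware IDs (baseboard, processor) are only reachable from code running on the client, not from the server.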

Related

IIS Worker process using 6gb RAM on web server with ASP.NET MVC web site

I have a web site running in its own Application Pool (IIS 8). Settings for the pool are default i.e. recycle every 29 hours.
Our web server only has 8 GB of RAM, and I have noticed that the worker process for this web site regularly climbs to 6 GB and slows the server to a crawl. This is the only site currently on the web server.
I also have SQL Express 2016 installed as well. The site is using EF version 6.1.3.
The MVC site is very straightforward. It has a GETPDF controller which finds a row in a table, gets the PDF data stored in a field, and then serves it back to the browser as follows:
using (eBillingEntities db = new eBillingEntities())
{
    try
    {
        string id = model.id;
        string emailaddress = Server.HtmlEncode(model.EmailAddress).ToLower().Trim();
        eBillData ebill = db.eBillDatas.ToList<eBillData>().Where(e => e.PURL == id && e.EmailAddress.ToLower().Trim() == emailaddress).FirstOrDefault<eBillData>();
        if (ebill != null)
        {
            // update the 'Lastdownloaded' field.
            ebill.LastDownloaded = DateTime.Now;
            db.eBillDatas.Attach(ebill);
            var entry = db.Entry(ebill);
            entry.Property(en => en.LastDownloaded).IsModified = true;
            db.SaveChanges();
            // Find out from the config record whether the bill is stored in the table or in the local pdf folder.
            Config cfg = db.Configs.ToList<Config>().Where(c => c.Account == ebill.Account).FirstOrDefault<Config>();
            bool storePDFDataInEBillTable = true;
            if (cfg != null)
            {
                storePDFDataInEBillTable = cfg.StorePDFDataInEBillDataTable;
            }
            // End of Modification
            byte[] file;
            if (storePDFDataInEBillTable)
            {
                file = ebill.PDFData;
            }
            else
            {
                string pathToFile = "";
                if (string.IsNullOrEmpty(cfg.LocalPDFDataFolder))
                    pathToFile = cfg.LocalBackupFolder;
                else
                    pathToFile = cfg.LocalPDFDataFolder;
                if (!pathToFile.EndsWith(@"\"))
                    pathToFile += @"\";
                pathToFile += ebill.PDFFileName;
                file = System.IO.File.ReadAllBytes(pathToFile);
            }
            MemoryStream output = new MemoryStream();
            output.Write(file, 0, file.Length);
            output.Position = 0;
            HttpContext.Response.AddHeader("content-disposition", "attachment; filename=ebill.pdf");
            return new FileStreamResult(output, "application/pdf");
        }
        else
            return View("PDFNotFound");
    }
    catch
    {
        return View("PDFNotFound");
    }
}
Are there any memory leaks here?
Will the file byte array and the memory stream get freed up?
Also, is there anything else I need to do concerning clearing up the entity framework references?
If the code looks OK, where would be a good place to start looking?
Regards
Are there any memory leaks here?
No.
Will the file byte array and the memory stream get freed up?
Eventually, yes. But that may be the cause of your excessive memory use.
Also, is there anything else I need to do concerning clearing up the entity framework references?
No.
If the code looks OK, where would be a good place to start looking?
If this code is the cause of your high memory use, it's because you are loading entire files into memory, and you're keeping two copies of each file in memory: once in a byte[] and again by copying it to a MemoryStream.
There's no need to do that.
To eliminate the second copy of the file, use the MemoryStream(byte[]) constructor instead of copying the bytes from the byte[] to an empty MemoryStream.
To eliminate the first copy in memory, you can stream the data into a temporary file that will be the target of your FileStreamResult, or initialize the FileStreamResult using an ADO.NET stream.
See https://learn.microsoft.com/en-us/dotnet/framework/data/adonet/sqlclient-streaming-support
If you go with ADO.NET streaming, your DbContext will need to be scoped to your Controller instead of a local variable, which is a good practice in any case.
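A minimal sketch of the first of these suggestions, using the names from the question's code: the MemoryStream(byte[]) constructor wraps the existing buffer instead of copying it, so only one copy of the PDF lives in memory.

```csharp
// Before: new MemoryStream() + Write() duplicates the file bytes.
// After: the MemoryStream(byte[]) constructor shares the existing buffer.
byte[] file = ebill.PDFData;
MemoryStream output = new MemoryStream(file);
HttpContext.Response.AddHeader("content-disposition", "attachment; filename=ebill.pdf");
return new FileStreamResult(output, "application/pdf");
```

This halves the per-request memory cost but still holds the whole file at once; the temporary-file or ADO.NET streaming approaches avoid even that.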
In addition to David's advice, I noticed that I was doing the following:
db.eBillDatas.ToList<eBillData>()
Therefore I was loading the entire table from the database and then filtering it again in memory with the Where clause.
I didn't notice the problem until the database started to fill up.
I removed that part and now the IIS worker process sits at about 100 MB.
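For reference, the difference is whether the filter runs in SQL or in memory; a sketch based on the question's query:

```csharp
// Before: ToList() materializes the whole table, then LINQ-to-Objects
// filters it in memory — every row travels over the wire.
// eBillData ebill = db.eBillDatas.ToList<eBillData>()
//     .Where(e => e.PURL == id && e.EmailAddress.ToLower().Trim() == emailaddress)
//     .FirstOrDefault<eBillData>();

// After: keeping the query as IQueryable lets Entity Framework translate
// the Where clause to SQL, so only the matching row is loaded.
eBillData ebill = db.eBillDatas
    .Where(e => e.PURL == id && e.EmailAddress.ToLower().Trim() == emailaddress)
    .FirstOrDefault();
```

The same fix applies to the db.Configs.ToList<Config>() query later in the action.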

"Unable to determine a valid ordering for dependent operations" on production server

I have been working on a WCF web service, which is used by a mobile app that would send some data to it and save to DB.
One of the test case is that we try to append 2 (or more) records in the app, and the service is called to do a batch insert / update action.
Everything goes fine when I test using localhost, but when we test on the production server, only the first record is saved, while the other record triggers the error message:
Unable to determine a valid ordering for dependent operations...store-generated values.
I have no idea what the cause is or how to solve it. I have done some research, and I am quite sure that the related model/DB table has NO circular dependency or self dependency.
Below is a snippet of the web service:
public void submit(List<SubmissionParameter> param)
{
    using (var context = ObjectContextManager.AuditEnabledInstance)
    {
        foreach (var item in param)
        {
            ReadingSubmission readingSubmission = context.ReadingSubmissions.Where(p => p.ReadingSubmissionUniqueIdentifier == item.Readingsubmissionuniqueidentifier).SingleOrDefault();
            if (readingSubmission == null)
            {
                readingSubmission = new ReadingSubmission();
                context.ReadingSubmissions.AddObject(readingSubmission);
            }
            readingSubmission.ReadingSubmissionUniqueIdentifier = item.Readingsubmissionuniqueidentifier;
            readingSubmission.SystemID = item.Systemid;
            readingSubmission.UserID = item.Userid;
            foreach (var record in item.Readings)
            {
                SystemReading systemReading = context.SystemReadings.Where(p => p.SystemReadingUniqueIdentifier == record.Systemreadinguniqueidentifier).SingleOrDefault();
                if (systemReading == null)
                {
                    systemReading = new SystemReading();
                    readingSubmission.SystemReadings.Add(systemReading);
                }
                systemReading.SystemReadingUniqueIdentifier = record.Systemreadinguniqueidentifier;
                systemReading.MeasurementID = record.Measurementid;
            }
            context.SaveChanges();
        }
    }
}
ReadingSubmission and SystemReading have a 1-to-many relationship.
SubmissionParameter is just a transmission object, as the mobile client sends a JSON object to this web service.
I use Telerik Fiddler to post the JSON into this web service for testing, so I am quite sure the problem is not on the mobile client side.
Any help is appreciated! Thanks!
Finally I solved the problem, though I am not quite sure why it works.
I moved the context.SaveChanges() out of the foreach loop, and then it all works again on both localhost and production.
Hope it can help someone save some time.
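A sketch of that change against the code in the question; only the placement of SaveChanges differs:

```csharp
foreach (var item in param)
{
    // ... find or create readingSubmission, set its fields,
    //     and add its SystemReadings as in the original code ...
}
// Saving once, after the loop, lets Entity Framework compute the insert
// ordering for the entire changeset at once, instead of trying to resolve
// dependencies between store-generated keys on every iteration.
context.SaveChanges();
```

Fewer round trips is also a small performance win, since each SaveChanges call is its own database transaction.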

Cannot access user's Active Directory password information on production server

I'm developing an MVC application and I have a routine that gets the currently logged-on user's password info. It works fine on my PC, but when I publish the application to a live server on the domain, I don't seem to be able to gain access to the AD information. I have used very similar code in a currently running asp.net web application and it works just fine. I compared security settings on both applications and they look identical. Here is the routine:
public int GetPasswordExpiration()
{
    PrincipalContext domain = new PrincipalContext(ContextType.Domain);
    string currUserName = WindowsIdentity.GetCurrent().Name;
    UserPrincipal currLogin = UserPrincipal.FindByIdentity(domain, currUserName);
    DateTime passwordLastSet = currLogin.LastPasswordSet.Value; // here is where it chokes***
    int doyPasswordSet = passwordLastSet.DayOfYear;
    int doy = DateTime.Today.DayOfYear;
    int daysSinceLastset = (doy - doyPasswordSet);
    int daysTilDue = (120 - daysSinceLastset);
    return (daysTilDue);
}
I am an administrator on the domain so I think I have an application permissions issue, but since the failing application has the same permissions as the working application, I'm not sure where to look next. Any help is appreciated.
I'm answering my own question because I want to post the code that works. Wiktor Zychla nailed it when he asked whether WindowsIdentity.GetCurrent().Name applied to the identity of the application pool rather than the logged-in user. As a matter of fact it did, thanks Wiktor!
Here is the modified code that works. I did change the way I got the user's identity (explained below).
Controller Code:
using MyProject.Utilities; // Folder where I created the CommonFunctions.cs Class
var cf = new CommonFunctions();
string user = User.Identity.Name;
ViewBag.PasswordExpires = cf.GetPasswordExpiration(user);
Code in CommonFunctions
public int GetPasswordExpiration(string user)
{
    PrincipalContext ctx = new PrincipalContext(ContextType.Domain);
    UserPrincipal currLogin = UserPrincipal.FindByIdentity(ctx, user);
    DateTime passwordLastSet = currLogin.LastPasswordSet.Value;
    int doyPasswordSet = passwordLastSet.DayOfYear;
    int doy = DateTime.Today.DayOfYear;
    int daysSinceLastset = (doy - doyPasswordSet);
    int daysTilDue = (120 - daysSinceLastset);
    return (daysTilDue);
}
The one thing that clued me in was that I decided to just run all the code in my controller, and when I did, I got a red squiggly saying "The name WindowsIdentity does not exist in this context":
string currUserName = WindowsIdentity.GetCurrent().Name;
Also, the reason I retrieved User.Identity.Name in the Controller and passed it to the function is that once I got things working and wanted to thin out my controller, I tried to get User.Identity.Name in the function, but I got another red squiggly with the same message under User in this line:
string user = User.Identity.Name;
So I figure this is a .NET thing and just went with getting User.Identity.Name in the controller and passing it to the function, and all is well. This one really tested my patience, and I hope this post can help someone else.

Small performance test on a web service

I'm trying to develop a small application that tests how many requests per second my service can support, but I think I'm doing something wrong. The service is in an early development stage, but I'd like to have this test handy in order to check from time to time that I'm not doing something that decreases performance. The problem is that I cannot get the web server or the database server to reach 100% CPU.
I'm using three different computers: one runs the web server (WinSrv Standard 2008 x64, IIS7), another the database (Win 2K, SQL Server 2005), and the last is my computer (Win7 x64 Ultimate), where I'll run the test. The computers are connected through a 100 Mbit Ethernet switch. The POST request is 9 bytes and the response is 842 bytes.
The test launches several threads, and each thread has a while loop; in each iteration it creates a WebRequest object, performs a call, increments a common counter, waits between 1 and 5 milliseconds, and then does it again:
static int counter = 0;

static void Main(string[] args)
{
    ServicePointManager.DefaultConnectionLimit = 250;
    Console.WriteLine("Ready. Press any key...");
    Console.ReadKey();
    Console.WriteLine("Running...");
    string localhost = "localhost";
    string linuxmono = "192.168.1.74";
    string server = "192.168.1.5:8080";
    DateTime start = DateTime.Now;
    Random r = new Random(DateTime.Now.Millisecond);
    for (int i = 0; i < 50; i++)
    {
        new Thread(new ParameterizedThreadStart(Test)).Start(server);
        Thread.Sleep(r.Next(1, 3));
    }
    Thread.Sleep(2000);
    while (true)
    {
        Console.WriteLine("Request per second: "
            + counter / DateTime.Now.Subtract(start).TotalSeconds);
        Thread.Sleep(3000);
    }
}

public static void Test(object ip)
{
    Guid guid = Guid.NewGuid();
    Random r = new Random(DateTime.Now.Millisecond);
    while (true)
    {
        String test = "<lalala/>";
        WebRequest req = WebRequest.Create("http://"
            + (string)ip + "/WebApp/" + guid.ToString()
            + "/Data/Tables=whatever");
        req.Method = "POST";
        req.ContentType = "application/xml";
        req.Credentials = new NetworkCredential("aaa", "aaa", "domain");
        byte[] array = Encoding.UTF8.GetBytes(test);
        req.ContentLength = array.Length;
        using (Stream reqStream = req.GetRequestStream())
        {
            reqStream.Write(array, 0, array.Length);
            reqStream.Close();
        }
        using (Stream responseStream = req.GetResponse().GetResponseStream())
        {
            String response = new StreamReader(responseStream).ReadToEnd();
            if (response.Length != 842) Console.Write(" EEEE ");
        }
        Interlocked.Increment(ref counter);
        Thread.Sleep(r.Next(1, 5));
    }
}
If I run the test, neither of the computers shows excessive CPU usage. Let's say I get X requests per second; if I run the console application twice at the same moment, I get X/2 requests per second in each one... but still the web server is at 30% CPU and the database server at 25%...
I've tried to remove the Thread.Sleep in the loop, but it doesn't make a big difference.
I'd like to push the machines to the maximum to check how many requests per second they can serve. I guessed that I could do it this way... but apparently I'm missing something here... What is the problem?
Kind regards.
IMO, you're better off using SoapUI for the test. You can easily adjust the test case for the number of threads, number of iterations, etc., and it'll graph the results. When you hit the plateau where you overwhelm the server, you'll see it on the graph. If one PC isn't enough, just run more of them on other PCs. You can do all of this with the free version.
There are a lot of limiting factors besides the CPU on a web server, and a lot of IIS settings that throttle the number of connections that can be served.
I would read this:
http://www.eggheadcafe.com/articles/20050613.asp
I know it is for IIS 6, but there are things that will still apply.
If you have access to MSDN and have VS 2010 Ultimate, I would check out its load testing tools. Purchasing the load testing program can be expensive, but if you need to test something specific, you can use the trial version to accomplish what you need. You can use it to monitor response time, server utilization, etc. Well worth looking into.
I agree with Chris, and would go a step further to recommend JMeter, as it can also test the database and webapp, all within the same script.

Use WMI to create IIS Application Directory with C#

We have a web application that is installed on Windows 2003 and Windows 2008 systems. In the past, our install code used ADSI to create a couple of application directories in IIS, but this requires the IIS 6 management components to be installed in Windows 2008. I have been trying to use WMI to create the application directories so we can support both operating systems.
I have been trying this code
public static void AddVirtualFolder(string serverName, string websiteId, string name, string path)
{
    ManagementScope scope = new ManagementScope(string.Format(@"\\{0}\root\MicrosoftIISV2", serverName));
    scope.Connect();
    string siteName = string.Format("W3SVC/{0}/Root/{1}", websiteId, name);
    ManagementClass mc = new ManagementClass(scope, new ManagementPath("IIsWebVirtualDirSetting"), null);
    ManagementObject oWebVirtDir = mc.CreateInstance();
    oWebVirtDir.Properties["Name"].Value = siteName;
    oWebVirtDir.Properties["Path"].Value = path;
    oWebVirtDir.Properties["AuthFlags"].Value = 5; // Integrated Windows Auth.
    oWebVirtDir.Properties["EnableDefaultDoc"].Value = true;
    // date, time, size, extension, longdate
    oWebVirtDir.Properties["DirBrowseFlags"].Value = 0x4000003E;
    oWebVirtDir.Properties["AccessFlags"].Value = 513; // read, script
    oWebVirtDir.Put();
    ManagementObject mo = new ManagementObject(scope, new System.Management.ManagementPath("IIsWebVirtualDir='" + siteName + "'"), null);
    ManagementBaseObject inputParameters = mo.GetMethodParameters("AppCreate2");
    inputParameters["AppMode"] = 2;
    mo.InvokeMethod("AppCreate2", inputParameters, null);
    mo = new ManagementObject(scope, new System.Management.ManagementPath("IIsWebVirtualDirSetting='" + siteName + "'"), null);
    mo.Properties["AppFriendlyName"].Value = name;
    mo.Put();
}
However, I get path-not-found errors on known directories. If anybody has some references I can use, I would greatly appreciate it. Any other suggestions on how to go about this are also welcome.
Using the code above, you will still need the IIS 6 compatibility bits on Windows 2008/IIS7. The reason is that the calls that set properties such as DirBrowseFlags, AccessFlags and so on are IIS 6 metabase properties, which are not supported in IIS7 without the IIS 6 management components.
For IIS7 I'd recommend programming directly against the Microsoft.Web.Administration namespace, but if you really need to use WMI then see this article:
Managing Sites with IIS 7.0's WMI Provider (IIS.NET)
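A minimal sketch of the Microsoft.Web.Administration approach on IIS 7; the site name and paths here are placeholders, and the code must run with administrative rights on the IIS machine:

```csharp
using Microsoft.Web.Administration;

public static void AddApplication(string siteName, string appPath, string physicalPath)
{
    // ServerManager talks to IIS 7's applicationHost.config directly,
    // so no IIS 6 compatibility components are needed.
    using (ServerManager serverManager = new ServerManager())
    {
        Site site = serverManager.Sites[siteName];
        // Creates a real IIS application (not just a virtual directory)
        // under the given site.
        site.Applications.Add(appPath, physicalPath);
        serverManager.CommitChanges();
    }
}
```

Usage would look something like AddApplication("Default Web Site", "/MyApp", @"C:\inetpub\myapp"); Microsoft.Web.Administration.dll ships with IIS 7 and can be referenced from %windir%\System32\inetsrv.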
