From a security standpoint, what are a couple of major points in favor of using a .master file versus a .aspx file?
From a security standpoint, there isn't really a difference between a .master file and a .aspx file. They have unique execution paths within the page life cycle, but they are executed in the same way and are prone to the same security flaws & protections.
That said, the reduction of code that a .master file allows (as well as forcing you to think about generalization) will go a long way in helping you develop a reliable, and therefore more secure, website.
Technically, there's no major difference from a security standpoint. You could implement your security logic within the master page, which would ensure that it's included on every page that uses that master page. You could argue this makes things more secure because there is less chance for human error :).
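For illustration, a minimal sketch of what such a centralized check might look like in a master page's code-behind; the redirect target is a placeholder, not anything from the original question:

```csharp
using System;
using System.Web.UI;

// Because every content page using this master runs this code,
// the check can't be forgotten on an individual page.
public partial class SiteMaster : MasterPage
{
    protected void Page_Load(object sender, EventArgs e)
    {
        if (!Request.IsAuthenticated)
        {
            // "~/Login.aspx" is an illustrative placeholder.
            Response.Redirect("~/Login.aspx");
        }
    }
}
```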
Both the master page and the content page make up one rendered page in your browser. The master page is there to create a common look and feel throughout your entire site, or parts of it. You can even nest master pages if needed.
ASP.NET as a whole already has quite a few security hooks built in by default: request validation to prevent malicious input, parameterized queries to prevent SQL injection, Membership for authentication, Roles for authorization, UrlAuthorization to keep people from guessing URLs and seeing sensitive data, ...
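On the parameterized-queries point, a minimal sketch (table, column, and connection string are placeholders):

```csharp
using System.Data.SqlClient;

// The untrusted value is sent as data, never concatenated into the
// SQL text, which is what prevents SQL injection.
string GetEmail(string connectionString, string userName)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "SELECT Email FROM Users WHERE UserName = @userName", conn))
    {
        cmd.Parameters.AddWithValue("@userName", userName);
        conn.Open();
        return (string)cmd.ExecuteScalar();
    }
}
```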
Be sure to also check out patterns & practices Security How Tos Index.
It's a trick homework question -- neither should have anything to do with security.
I want to create a quick page-load response in ASP.NET MVC.
If I use [OutputCache], it saves the whole page including the dynamic parts, so a new client will see the previous client's information.
What is the best practice for doing this?
I saw that there is a Cache Tag Helper, but will it be faster?
Because I still have to go into the action and render the page, except for the section covered by the Cache Tag Helper.
Many thanks to anyone who has an optimal and fast solution.
In the docs for response caching, Microsoft has a prominent warning:
"Disable caching for content that contains information for authenticated clients. Caching should only be enabled for content that doesn't change based on a user's identity or whether a user is signed in."
As you indicate, your scenario involves dynamic, authenticated content, so you should avoid caching the rendered output as a whole. Consider caching specific data or fragments within a page only if you're very careful about what varies per user and performance actually requires it; otherwise it's safer to leave the defaults. ASP.NET Core is very fast -- rendering is unlikely to be the bottleneck in most cases.
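As a rough sketch of that middle ground in ASP.NET Core: cache the expensive, user-independent data with IMemoryCache instead of the whole response. The IProductService interface and the five-minute lifetime below are illustrative assumptions, not part of your code:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Caching.Memory;

public interface IProductService
{
    Task<IReadOnlyList<string>> GetFeaturedAsync(); // hypothetical data service
}

public class HomeController : Controller
{
    private readonly IMemoryCache _cache;
    private readonly IProductService _products;

    public HomeController(IMemoryCache cache, IProductService products)
    {
        _cache = cache;
        _products = products;
    }

    public async Task<IActionResult> Index()
    {
        // Shared, user-independent data can be cached under a single key...
        var featured = await _cache.GetOrCreateAsync("featured-products", entry =>
        {
            entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
            return _products.GetFeaturedAsync(); // expensive query runs at most every 5 min
        });

        // ...while anything tied to the signed-in user is produced fresh.
        ViewData["UserName"] = User.Identity?.Name;
        return View(featured);
    }
}
```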
Is there a way to programmatically prime the asp.net output cache? I've investigated the caching API and can't seem to find an obvious way to do this. Has anyone tried something like this? If so, what method did you use?
I gave some thought to this last year and concluded it wasn't that important in my case, but if it's important for your website, all you have to do is request the pages from somewhere like the Application_Start event (after all initialization code has run) -- but you shouldn't stop there!
The cache will eventually expire, and to avoid that you should set up some way to cache the pages again before any client requests them.
Make the output cache dependent on some other object in the cache and set an expiration callback.
Then, when that cache object expires, so do your pages, and you can make HTTP requests to the pages you want to re-cache, and so on.
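A rough sketch of that callback approach against the classic ASP.NET cache API; the URLs and 20-minute window are placeholders, and this is an illustration of the idea rather than production code:

```csharp
using System;
using System.Net;
using System.Web;
using System.Web.Caching;

// A sentinel entry whose expiration callback re-requests a fixed list of
// pages (re-populating their output cache) and then re-inserts itself.
public static class CachePrimer
{
    private static readonly string[] Urls =
    {
        "http://example.com/",
        "http://example.com/Products.aspx"
    };

    public static void Start() // call from Application_Start
    {
        HttpRuntime.Cache.Insert(
            "primer-sentinel",
            DateTime.UtcNow,
            null,                            // no CacheDependency
            DateTime.UtcNow.AddMinutes(20),  // just inside the output cache window
            Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable,
            OnExpired);
    }

    private static void OnExpired(string key, object value, CacheItemRemovedReason reason)
    {
        // Note: expirations are scavenged periodically, so this may fire
        // slightly after the exact expiry time.
        using (var client = new WebClient())
        {
            foreach (var url in Urls)
            {
                try { client.DownloadString(url); }      // warms the output cache
                catch (WebException) { /* page unavailable; skip this round */ }
            }
        }
        Start(); // re-insert the sentinel so the cycle repeats
    }
}
```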
I'm answering this question, but the amount of effort involved, and the question marks I still have in my mind, lead me to advise against going through with this...
UPDATE
The only kind of dependency you can set on the output cache is a SQL dependency. Use it if you want, but if you need your output cache to depend on some other business object, this can get very difficult. You could set up a database object, make the output cache depend on it, and expire it yourself using some kind of timer.
Man, the longer I write, the more solutions and difficulties I find! I can't write a book for something that isn't worth your precious time. Believe me, the usefulness of this will be nearly zero.
Priming the cache is, as others have suggested, as easy as requesting the pages you want cached. Of course, if you do this programmatically it will only request the HTML and not all the linked resources (CSS, JavaScript, images...), which is a good thing, since it avoids wasted bandwidth.
For many websites, the cached items that carry the biggest performance penalties are common to many or all pages. For example, the navigation system on a large CMS or storefront may query the database and do a bunch of rendering work that can then be cached for all pages. A big part of the initial load in ASP.NET also happens when the website is first accessed and loaded into memory. Both of these issues can be addressed by calling even a single page on your site, but there is nothing stopping you from making a list of URLs and calling each one periodically.
If your cache policy is set for a 20-minute timeout, request each page once every 17-18 minutes.
Here are some resources with source code to help you get started:
Good Simple Primer on requesting web URL in C#
Website Monitoring Windows Service
Asynchronous Website Monitor
As I mentioned before, you can easily extend these to "foreach" over an array or list of URLs to be requested -- something like the sketch below.
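A minimal timer-based sketch along those lines; the URLs and the 17-minute interval are placeholders chosen to sit just inside a 20-minute cache window:

```csharp
using System;
using System.Net;
using System.Threading;

public static class PrimingTimer
{
    private static readonly string[] Urls =
    {
        "http://example.com/",
        "http://example.com/About.aspx"
    };

    // Keep a reference to the returned Timer so it isn't garbage collected.
    public static Timer Start()
    {
        return new Timer(_ =>
        {
            using (var client = new WebClient())
            {
                foreach (var url in Urls)
                {
                    try { client.DownloadString(url); }  // re-warms the output cache
                    catch (WebException) { /* skip unreachable pages */ }
                }
            }
        }, null, TimeSpan.Zero, TimeSpan.FromMinutes(17));
    }
}
```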
I've been doing the HTML and CSS for a site and sending it off to a guy to implement on a web server. I get a call from the designer freaking out about the progress, saying the clients aren't happy. He wants me to personally integrate my CSS with what's on the site. The site is done in ASP.NET, time is short, and I'm a little in over my head. I have an understanding of how PHP works, but have never worked extensively with it.
Looking at the stuff on the FTP, I can't even find the equivalent of the index.html file. (I know that when I go to the site itself, there is nothing after the base URL, i.e., www.site.com/ brings me to the homepage.)
Can anyone give me a few tips or links as to what I'm supposed to do with this, or where to even begin navigating this site?
EDIT: It's -not- a .NET Web Application, from the looks of it.
ASP.NET can be run in a compiled or a scripted environment, and it's important to understand which one your client has. If it's completely scripted, you're likely looking for the default.aspx file and its contents. If it's a compiled environment, you may be in for a ride. A compiled site may use master pages as a templating engine, in which case you'll need to apply your HTML/CSS modifications in several places.
You should start with the default.aspx page if there is one, and look for master page directives (the master will be named something like masterpage.master). If there isn't one, then you're in luck: you'll just need to implement your changes on a page-by-page basis. The .aspx page is in a templated XML-like format, so avoid touching the server-side tags (anything prefixed with asp: or marked runat="server").
If you are making changes to divs and structures of that nature, you may need to modify the CssClass attribute of the controls. I would recommend, however, that you make a backup, give it a shot, and under no circumstances attempt to do something you aren't really ready to do. You will only anger the client and ruin your reputation. It may actually be prudent to bring in an actual ASP.NET developer to analyze the files separately and determine what you need to do.
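A small sketch of how CssClass maps to the class attribute you're used to; the SidePanel control and "sidebar" class are placeholders:

```csharp
using System;
using System.Web.UI;
using System.Web.UI.WebControls;

public partial class Home : Page
{
    // Normally generated from the markup by the designer; declared here so
    // the sketch is self-contained. In the .aspx it would appear as:
    //   <asp:Panel ID="SidePanel" runat="server" CssClass="sidebar" />
    protected Panel SidePanel;

    protected void Page_Load(object sender, EventArgs e)
    {
        // Renders to the browser as <div class="sidebar">...</div>
        SidePanel.CssClass = "sidebar";
    }
}
```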
I suggest you read the Wikipedia article about ASP.NET to get familiar with it, as it summarizes the basic building blocks.
Then, just to get you started: take a look at the more recent ASP.NET MVC (Model-View-Controller) paradigm. There's also development in what's called ASP.NET WebForms.
For example: when you go to www.site.com/ (a friendly URL), it may be routing you to an action method inside a controller. This is called routing; there's also URL rewriting.
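As a rough illustration of routing in ASP.NET MVC (the pattern shown is the conventional default route, used here only as an example):

```csharp
using System.Web.Mvc;
using System.Web.Routing;

public static class RouteConfig
{
    // Registered once from Global.asax's Application_Start:
    //   RouteConfig.RegisterRoutes(RouteTable.Routes);
    public static void RegisterRoutes(RouteCollection routes)
    {
        // With these defaults, both "/" and "/Home/Index" end up in
        // HomeController.Index() -- no physical index file exists.
        routes.MapRoute(
            name: "Default",
            url: "{controller}/{action}/{id}",
            defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional });
    }
}
```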
In the MVC world, a controller can send/redirect the user to a specific view/page.
A view (an .aspx form/page) containing HTML markup and CSS on the server side is basically an HTML (.htm) page that'll be rendered on the client side.
I've recently started embedding JavaScript and CSS files into our common library DLLs to make deployment and versioning a lot simpler. I was just wondering if there is any reason one might want to do the same thing with a web application, or if it's always best to just leave them as regular files in the web application, and only use embedded resources for shared components?
Would there be any advantage to embedding them?
I had to make this same decision once. The reason I chose to embed my JavaScript/CSS resources into my DLL was to prevent tampering of these files (by curious end users who've purchased my web application) once the application's deployed.
I doubt the validity of Easement's comment about how browsers download JavaScript files. I'm pretty sure the embedded JavaScript/CSS files are recreated temporarily by ASP.NET before the page is sent to the browser, so that the browser can download and use them. I'm curious about this and I'm going to run my own tests. I'll let you know how it goes....
-Frinny
Of course, anyone who knew what they were doing could use an assembly Reflector and extract the JS or CSS. But that would be a heck of a lot more work than just using something like Firebug to get at this information. A regular end user is unlikely to go to all of this trouble just to mess with the resources; anyone interested in that sort of thing is likely a malicious user, not an end user. And if a user is able to run a tool like Reflector on your DLL, your server has already been compromised and you have much bigger security problems. So security was not the factor in my decision for embedding the resources.
The point was to keep users from doing something silly with these resources, like deleting them thinking they aren't needed, or otherwise tampering with them.
It also makes the application a lot easier to package for deployment because there are fewer files involved.
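For context, a minimal sketch of how embedding typically works in WebForms; the resource name and control are hypothetical:

```csharp
using System;
using System.Web.UI;

// script.js has its build action set to "Embedded Resource"; this attribute
// tells ASP.NET it may be served via WebResource.axd. The resource name
// follows "<default namespace>.<file name>" and is a placeholder here.
[assembly: WebResource("MyCompany.Web.script.js", "text/javascript")]

namespace MyCompany.Web
{
    public class ScriptControl : Control
    {
        protected override void OnPreRender(EventArgs e)
        {
            base.OnPreRender(e);
            // Emits <script src="/WebResource.axd?d=..."></script> so the
            // browser downloads the embedded file like any external script.
            Page.ClientScript.RegisterClientScriptResource(
                typeof(ScriptControl), "MyCompany.Web.script.js");
        }
    }
}
```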
It's true that the DLL (class library) used by the pages is bigger, but this does not make the pages themselves any bigger. ASP.NET generates the content that needs to be sent to the client (the browser), and no more content is sent than what the page needs to work. I don't see how the class library serving these pages would affect the amount of data sent between client and server.
However, Rjlopes has a point: it might be true that the browser can't cache embedded JavaScript/CSS resources. I'll have to check, but I suspect Rjlopes is correct, in which case the JavaScript/CSS files would have to be downloaded on every full-page postback to the server. If that proves true, the performance hit should be a factor in your decision.
I still haven't been able to test the performance differences between embedded resources, .resx files, and single files because I've been busy with my own endeavors. Hopefully I'll get to it later today, because I'm very curious about this and about the browser-caching point Rjlopes has raised.
Reason for embedding: browsers don't download JavaScript files in parallel, so you have a blocking condition until the file is downloaded.
Reason against embedding: you may not need all of the JavaScript code, so you could be increasing the bandwidth/processing unnecessarily.
Regarding the browser cache: as far as I've noticed, the response for WebResource.axd says "304 Not Modified", so I guess they are being served from the cache.
You know that if somebody wants to tamper with your JS or CSS they just have to open the assembly with Reflector, go to the resources, and edit what they want (it probably takes a lot more work if the assemblies are signed).
If you embed the JS and CSS in the page, you make the page bigger (more KB to download on each request) and the browser can't cache the JS and CSS for subsequent requests. The good news is that you have fewer requests (at least two fewer if, like me, you combine multiple JS and CSS files into one); the bad news is that JavaScript files have the problem of being downloaded serially.
A customer is asking if there is anything we can do to remove "/Pages" from his Internet-facing MOSS publishing site. Some Googling reveals that clever use of HttpModules may be able to hide the presence of /Pages, but I've yet to see an end-to-end working solution. Have any of you come up against this particular requirement, and if so, how did you resolve it?
The customer's main concern with /Pages is the SEO impact of it - if anyone has any way to mitigate those issues or can explain why having this extra level in your URL would not be a concern, that would be appreciated as well (and probably better, in the long run!)
Check out this posting: http://blog.mastykarz.nl/semantic-urls-in-moss-2007-imtech-sharepoint-semantic-urls-free-feature/
The main issue you'll have is that Microsoft won't provide support for a SharePoint instance that has "hidden" the pages library.
Yes, you can use a URL rewriter to exclude the /Pages section of the path, but you will also need to perform a search and replace on the response stream to strip it out of all generated URLs. This will obviously have a performance hit on the server, but with careful use of caching it might not be that noticeable.
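To make that concrete, a simplified sketch of the response-stream half of the approach. It's an illustration only: a real filter must also handle matches split across Write calls and non-UTF-8 encodings, and it must be paired with a URL rewriter that maps the short URLs back to /Pages/:

```csharp
using System;
using System.IO;
using System.Text;
using System.Web;

// Intercepts outgoing HTML and rewrites "/Pages/" out of generated URLs.
public class StripPagesModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.ReleaseRequestState += (sender, e) =>
        {
            var ctx = ((HttpApplication)sender).Context;
            if (ctx.Response.ContentType == "text/html")
                ctx.Response.Filter = new ReplaceFilter(ctx.Response.Filter);
        };
    }

    public void Dispose() { }

    // Extending MemoryStream avoids re-implementing every abstract Stream
    // member; only Write is intercepted.
    private class ReplaceFilter : MemoryStream
    {
        private readonly Stream _inner;
        public ReplaceFilter(Stream inner) { _inner = inner; }

        public override void Write(byte[] buffer, int offset, int count)
        {
            string html = Encoding.UTF8.GetString(buffer, offset, count); // assumes UTF-8
            byte[] replaced = Encoding.UTF8.GetBytes(html.Replace("/Pages/", "/"));
            _inner.Write(replaced, 0, replaced.Length);
        }
    }
}
```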
PSS will require you to remove the setup before they will investigate any issues with your site, so you (or your client) will need to weigh the perceived benefits against the performance and support issues.
I believe we've done it for one of our clients in the past, but most are happy to stick with the /pages element - it really doesn't have that much effect.
I know ASP.NET 3.5 SP1 has the URL routing engine that ASP.NET MVC uses built in. If you wanted to run against that version of the .NET framework, you could use routes to eliminate the /Pages part of the URL. But I'm not positive about running MOSS on that version of .NET. That's the first place I'd check, though.
You can get a list of public-facing websites using MOSS here. You can see that they use the "Pages" libraries, and you can check your favorite search engines against the content.
Hopefully this will be enough to demonstrate that the "Pages" libraries aren't going to be too much of an issue, and you can save them a bunch of cash.
You can change the name (and the url) of the /Pages library.