When I searched for my problem, I found this old question:
How to find out what technology a program is using?
It explains how to find out what technology a site uses, but not how to hide it. I have several ASP.NET sites. Some of them use CKEditor and others use JavaScript libraries.
When I type my URL into:
http://builtwith.com/
it shows the technologies my site is built with.
Is there any web.config setting, or any other setting, that stops my site from revealing the technology or program it uses? I have searched a lot but have been unable to find anything. Any help would be much appreciated. Thanks.
By default ASP.NET shouts about itself a lot. It sends HTTP headers with each response telling the world and his dog what version of ASP.NET your site is hosted on and even what version of MVC you are using. Below is an example of the extra headers needlessly being sent with every request:
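(Reconstructed here from a typical default ASP.NET response, since the original listing is missing; your exact versions will differ:)

Server: Microsoft-IIS/8.0
X-AspNet-Version: 4.0.30319
X-AspNetMvc-Version: 5.1
X-Powered-By: ASP.NET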
To fix this problem you need to do a few things. The first is to set the enableVersionHeader attribute on the httpRuntime section to false.
<!-- enableVersionHeader - Remove the ASP.NET version number from the response headers. Added security through obscurity. -->
<httpRuntime targetFramework="4.5" enableVersionHeader="false" />
Then you need to clear the custom headers as shown below.
<httpProtocol>
  <customHeaders>
    <!-- X-Powered-By - Remove the HTTP header for added security and a slight performance increase. -->
    <clear />
  </customHeaders>
</httpProtocol>
For more, read this post: Securing the ASP.NET Web.config
There is also a project on GitHub called NWebsec. NWebsec lets you configure quite a few security headers; some are useful for most applications, while others are a bit more specialized. Here's the project link:
Getting started with NWebsec.
In addition to obfuscating your scripts, your website may also give away information in the form of HTTP headers and HTML meta tags. For example, one of my sites shows these HTTP response headers:
Server: Microsoft-IIS/8.5
X-AspNet-Version: 2.0.50727
X-Powered-By: ASP.NET
These show that my site is running IIS 8.5 and which .NET version it uses, which is exactly the first information shown on builtwith.com. Most, if not all, web servers have a way of suppressing these, and of course you can control the meta tags.
The URL can contain clues as well. If you have URLs that end in .aspx, .jsp, or .php, that is a dead giveaway. You can solve this with search-engine-friendly (SEF) URLs or some sort of URL rewriter for whatever server technology you are using; a sketch for IIS follows below.
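For example, on IIS with the URL Rewrite module installed, a rule along these lines hides the .aspx extension (a minimal sketch; the rule name and patterns are illustrative):

<system.webServer>
  <rewrite>
    <rules>
      <!-- Internally rewrite /foo to /foo.aspx so public URLs carry no extension. -->
      <rule name="HideAspxExtension" stopProcessing="true">
        <match url="^(.*)$" />
        <conditions>
          <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
          <add input="{REQUEST_FILENAME}.aspx" matchType="IsFile" />
        </conditions>
        <action type="Rewrite" url="{R:1}.aspx" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>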
Please note that there are already questions exactly like this one:
StackOverflow
Nick says that if you're running Apache, you can set ServerTokens to Prod and ServerSignature to Off to remove the list of modules used from HTTP responses. Hopefully you can find a similar property for your ASP.NET application.
Authentictech also says, on behalf of gary, that you can ask them to remove your sites from their lookup service via this link. Looking at that link, you (as the domain owner) can remove your sites' entries from their lookup index forever.
WebMasters
Su' says that there's a BuiltWith page that states:
The technology has to be discoverable in either the page body, cookies
or server headers.
It also mentions security through obscurity, but concludes that (from a security point of view) keeping your modules secure is much more important than hiding what you're using.
Assuming that you are interested in a general explanation (since there might be other builtwith-like sites doing similar things):
Those kinds of applications probably also analyze hints like the HTML structure, HTML attributes, HTML meta tags, HTTP headers, the built URL and its file extensions, and the HTML view state. Given a few technology-specific patterns and standards, they can infer which technologies you use: ASP.NET, JSF, and others. So even if you could somehow really obfuscate the generated script, customize the built URLs, and handle the HTTP headers, I don't think you would be fully able to hide the technologies used, because of the HTML structure, including its attributes, meta tags, and view state controls (see the example below).
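For instance, the hidden view state field that ASP.NET Web Forms renders into every page is an instant fingerprint, whatever you do to your headers (the value is elided here):

<input type="hidden" name="__VIEWSTATE" id="__VIEWSTATE" value="..." />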
Related
I wish to hide the IIS server version and ASP.NET version in the response headers. For web pages, I was able to do this by making changes in the web.config file, but that does not solve the issue for CSS or image files.
Could anyone suggest how to hide these header fields for all requests, including CSS/image files?
We tried the solution below. It does not hide those version headers for CSS or image files.
<httpRuntime maxRequestLength="4096" targetFramework="4.5" enableVersionHeader="false" />
Note: I am looking for a solution with minimal changes, to keep code-testing time down. If it cannot be done from the code side, what can be done from the server side, or through some other configuration?
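One server-side avenue worth trying (a sketch, under the assumption that the remaining headers are added at the IIS level, which enableVersionHeader does not touch): remove X-Powered-By in the <system.webServer> section so it applies to every response, and on IIS 10 or later suppress the Server header via request filtering:

<system.webServer>
  <httpProtocol>
    <customHeaders>
      <!-- Applies to all responses, including static CSS/image files. -->
      <remove name="X-Powered-By" />
    </customHeaders>
  </httpProtocol>
  <security>
    <!-- removeServerHeader requires IIS 10 or later. -->
    <requestFiltering removeServerHeader="true" />
  </security>
</system.webServer>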
I stumbled upon some strange behaviour today on a website at work. Our SEO consultant wanted some strange-looking links taken out of Google's index, a seemingly straightforward task. But it turned out to be very difficult.
The website was a .NET MVC 5.2.3 application. We looked at routing, our own libraries, etc. Nothing strange. After a while we gave up and tried simply redirecting requests to these URLs by setting up a rule in web.config. It turns out these URLs are unmatchable! Somehow, under the right conditions, the critical part of the URL seems to avoid matching rules, as well as routing later on in the MVC application.
We narrowed down the mystical URLs to the format (T(anything)), where T can be any capital letter and anything can be, eh, anything. This is placed at the beginning of the URL path as if it were a directory. In regex: \([A-Z]\([a-zA-Z0-9]*\)\)
I've tested and found the same behaviour on:
.net MVC5 sites
.net MVC3 sites
.net Web Forms sites
http://asp.net
http://stackoverflow.com
Some examples from stackoverflow.com:
Bypasses routing: https://stackoverflow.com/(K(jonas))/questions
Routes normal (404): https://stackoverflow.com/jonas/questions
Bypasses routing: https://stackoverflow.com/(G(hello))/users/1049710/jonas-%C3%84ppelgran
Routes normal (404): https://stackoverflow.com/gandhello/users/1049710/jonas-Äppelgran
It doesn't seem to affect the whole web, so it shouldn't be a browser or HTTP issue. Some examples:
Routes normal (404): http://php.net/(T(testing))/downloads
Routes normal (404): https://www.iana.org/(T(testing))/domains/reserved
Can anybody explain what is going on?
And what can I do to stop these URLs from bypassing routing?
Apparently this is a feature called a "cookieless session" in ASP.NET. See the "Cookieless SessionIDs" section in the MSDN docs.
The basic idea is that instead of storing the session id (if session state is enabled) in a cookie, it's now embedded in the URL.
We (Stack Overflow) disable session state entirely (by setting sessionState mode to off). As far as I know, the end result is that any time one of the URLs that match the session id format is used, that information is simply discarded.
None of the links leading to us in Google include it either, which makes me think that your site may be configured to actually generate session IDs in URLs? Short of disabling the feature, there's probably not much you can do here. Although, see "Regenerating Expired Session Identifiers" on the MSDN page I linked above to see how to at least prevent accidental session sharing if that's not already done.
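For reference, turning session state off as described above is a one-line web.config change (a minimal sketch; if you still need sessions, setting cookieless="UseCookies" instead stops IDs from being embedded in URLs):

<system.web>
  <!-- With session state off, URL-embedded session IDs are simply discarded. -->
  <sessionState mode="Off" />
</system.web>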
I have way too many pages in the application that load basically the same set of XML and JS files for client-side interaction and validation. So I have about a dozen lines like this one: <script type="text/javascript" src="JS/CreateMR.js"></script>, or like this one: <xml id="DefaultDataIslands" src="../XMLData/DataIslands.xml">.
These same files are included in every page, and as such the browser sends a request to read them every time. It takes about 900 ms just to load these files.
I am trying to find a way to load them on just the login page, and then use that temporary copy as the source. Is it possible to do so? If yes, how and where should I start?
P.S. A link to a tutorial will work too, as I currently have no knowledge about this.
Edit:
I can't cache the whole page, because the pages are generated at runtime based on the different possible view modes. I can only cache the JS and XML files. Caching everything might be a problem.
Anyway, I am reading through the suggested articles to figure out how to do this, so I may not be able to accept an answer right away while I finish reading and try to implement it in one page.
Edit:
Turns out caching is already enabled; it is just that my server is acting crazy. Check the screenshots below.
With Cache
Without cache
As you can see, with the cache it is actually taking more time to process some of the requests. I have no idea what that problem is, but I guess I should go to the server stack exchange to figure it out.
As for the actual problem, it turns out I don't have to do anything to enable caching of XML and JS files. I had no idea browsers automatically cache JS files without a specific tag.
Totally possible and in fact recommended.
Browsers cache content that has been sent down with appropriate HTTP caching headers, and will not request it again until the cache has expired. This will make your pages faster and more responsive, and your server's load much lighter.
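For instance, a response header along these lines (the max-age value here is illustrative) tells the browser it may reuse the file for a full year without asking the server again:

Cache-Control: public, max-age=31536000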
Here is a good read to get you started.
Here is ASP.NET MVC caching guide. It focuses on caching content returned from controllers.
Here is a read about caching static content on IIS with ASP.NET MVC.
Basically, you want to use the browser's caching mechanism to cache the source files after the first request.
If you're using the F12 tools in your browser to debug network requests, make sure you have the "Disable cache" option unchecked. Otherwise, it forces the browser to ignore cached files.
Make sure your server sends and respects cache headers - it should return HTTP status 304 Not Modified after the first request to a static file.
Take a look at ASP.NET bundling and minification - if you have, for example, multiple JS source files, you can bundle them into one file that will be cached on the first request.
Additionally, if you use external JS libraries, you can load them from a CDN instead of your server - this will both offload your server and let the user's browser use a cached version of the script (meaning that if some other page the user has visited used the same script, the browser should already have it cached).
One approach is caching static files via IIS by adding the <clientCache> element to the web.config file. The <clientCache> element of the <staticContent> element specifies cache-related HTTP headers that IIS 7 and later sends to web clients, which control how web clients and proxy servers will cache the content that IIS 7 and later returns.
How to configure static content cache per folder and extension in IIS7?
Client Cache
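A minimal sketch of that element (the 30-day max-age is an arbitrary choice; tune it per folder as the links above describe):

<system.webServer>
  <staticContent>
    <!-- Ask browsers and proxies to cache static files for 30 days. -->
    <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="30.00:00:00" />
  </staticContent>
</system.webServer>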
For more info on client-side caching, read this part of the Ultra-Fast ASP.NET 4.5 book:
Browser Cache and Caching Static Content
Another approach is caching portions of the page.
If you are using Web Forms:
Caching Portions of an ASP.NET Page
And if you are using MVC, use donut hole caching:
ASP.NET MVC Extensible Donut Caching
Donut Caching and Donut Hole Caching with Asp.Net MVC
The browser has to ask the server whether the file has been modified since it was put into the cache, hence the HTTP status code 304. Read more at https://httpstatuses.com/304.
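The exchange looks roughly like this (using the question's CreateMR.js as an illustrative resource; dates are placeholders):

GET /JS/CreateMR.js HTTP/1.1
If-Modified-Since: Tue, 12 Apr 2016 00:00:00 GMT

HTTP/1.1 304 Not Modified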
As this is ASP.NET, please make sure you are first running it with
<compilation debug="false"/>
as enabling debugging has some side effects, which include:
"All client-javascript libraries and static images that are deployed via
WebResources.axd will be continually downloaded by clients on each page
view request and not cached locally within the browser."
Read more at https://blogs.msdn.microsoft.com/prashant_upadhyay/2011/07/14/why-debugfalse-in-asp-net-applications-in-production-environment/
I am just curious to know whether there is a specific reason why the .NET Framework adds the 'X-Powered-By: ASP.NET' HTTP header to its responses. Do other web servers (Apache, httpd) do the same thing?
EDIT: I know that it can be changed. I want to know whether there is a reason to keep it or not.
I know that PHP does this. I guess there is no real purpose, other than marketing and making it easier for script kiddies to find suitable victims. For PHP it's better to disable the flag entirely since it shows the PHP version and therefore makes the server more vulnerable to attacks.
Edit: Who knows, it might also lead to better search results on bing... ;-)
It is a default custom header when using IIS. It is a setting in IIS; you can change it if you wish.
Using IIS 6:
Click on the HTTP Headers tab
You can edit or remove the header in the Custom HTTP Headers box.
It is probably there so that sites like Netcraft can pull together statistics for the number of servers running IIS and ASP.NET. This used to be considered an important thing when .NET was released. By stating that n sites had started using ASP.NET, Microsoft could provide metrics for companies that only adopt technology based on the number of other users out there.
I don't believe there is a strong technical reason for having it, since a PHP app could imitate an ASP.NET application by setting the same header in Apache. I could imagine some naive client applications, like FrontPage 2003 or SharePoint Designer, might use headers like this to validate that they are indeed connecting to an ASP.NET-enabled site, but that is speculation on my part.
It is fairly common to see a signature for the server/executing engine sent with the headers of a page whether you're running Apache and PHP or IIS and ASP.NET. Just acts as some free publicity, I suppose.
"X-Powered-By:" isn't a standard header, but "Server: " is (and it clearly serves the same purpose).
In a world of SaaS and cloud services, web frameworks are 'strategic' assets, and every little piece of real estate is avidly conquered... sometimes by cheating.
Tomcat, Apache, WebSphere, JBoss, you name it..
Apparently, it's not actually a standard HTTP header field.
If "Why" used in context of "how to change it" - go to IIS properties of your site ant open tab "HTTP Headers" and correct Custom HTTP Header.
Any suggestions on how to do browser caching within an ASP.NET application? I've found some different methods online but wasn't sure which would be best. Specifically, I would like to cache my CSS and JS files. They do change, but usually once a month at most.
Another technique is to store your static images, CSS, and JS on another server (such as a CDN) which has the Expires header set properly. The advantage of this is twofold:
The expires header will encourage browsers and proxies to cache these static files
The CDN will offload from your server serving up static files.
By using another domain name for your static content, browsers will download faster. This is because serving resources from four or five different hostnames increases parallelization of downloads.
If the CDN is configured properly and uses cookieless domain then you don't have unnecessary cookies going back and forth.
It's worth bearing in mind that even without Cache-Control or Expires headers, most browsers will cache content such as JS and CSS. What should happen, though, is that the browser requests the resource every time it's needed but typically gets a "304 Not Modified" response, after which it uses the cached item. This can still be quite costly, since it's a round trip to the server, but the resource itself isn't sent, so the bytes transferred are limited.
IE, left with no specific instructions regarding caching, will by default use its own heuristics to determine whether it should even bother re-requesting an item it has cached, despite never having been explicitly told that it may cache the resource. Its heuristics are based on the Last-Modified date of the resource: the older it is, the less likely it is to have changed by now, is its typical reasoning. Very woolly.
Frankly, if you want to make a site performant, you need to have control over such cache settings. If you don't have access to these settings, then don't worry about performance; just inform the sponsor that the site may not perform well because they haven't provided a platform that lets you deliver it.
Your best bet is to set an Expires header in IIS on the folders whose content you want cached. This will tell most modern browsers and proxies to cache the static content. In IIS 6:
Right click on the folder (example CSS or JS) you want to be cached by the browser.
Click properties
Go to the HTTP Headers Tab
Check "Enabled content expiration"
Set some long period for expiration, like "Expires after 90 days"
Yahoo Developer's Blog talks about this technique.
Unless you configure IIS to give ASP.NET control of JS/CSS/image requests, it won't see them by default. Hence your best plan (for long-term maintainability) is to deliberately tweak the response headers at your firewall/traffic manager/server, or (better, and what most of the world does at this point) to version your files in the path, i.e.:
Instead of writing this in your mark-up:
http://www.foo.com/cachingmakesmesad.css
Use this:
http://www.foo.com/cachingmakesmesad.css?v1
...and change the version number when you need to effectively clear the cache. If that's every time, then you could even append a GUID or datestamp instead, but I can't think of any occasion where I would really want to do that.
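In the markup, that just means referencing the versioned URL, for example (foo.com being the placeholder domain from above):

<link rel="stylesheet" type="text/css" href="http://www.foo.com/cachingmakesmesad.css?v1" />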
I thought your question was anti-cache but re-reading it I see I wasted a good answer there :P
Long story short, browsers are normally very aggressive about caching "simple" resources, so you shouldn't have to worry about this; but if you did want to do something about it, you would need access to a firewall/traffic manager/IIS for the reasons above (ASP.NET won't be given a chance by default).
However... there is no way you can absolutely force caching, nor should you. What is and isn't cached is rightfully the decision of the end user; all you can do is strongly request it.
In .NET you can set up your JavaScript, CSS, and images as embedded resources.
.NET will then handle the file expiration for you.
The downside to this approach is that you have to do a new build for each set of changes (this might be an upside, depending on your deployment and workflow).
You could also use ETags, but from what I understand they don't work well in some cases if you have a mix of IIS and Apache web servers hosting your images (or if you plan to switch in the future).
You can also just make sure the file date is newer and let the server handle it for you, but you've got to make sure the server is configured correctly.
You can cache static content by adding the following code to web.config:
<system.webServer>
  <staticContent>
    <clientCache httpExpires="Tue, 12 Apr 2016 00:00:00 GMT" cacheControlMode="UseExpires" />
  </staticContent>
</system.webServer>
See the clientCache documentation for more details.