NODE.JS versus IIS (Why node) [closed] - asp.net

What's the point of Node.js creating its own server and listening on it? Doesn't IIS/Apache already give us all of that? I understand it's based on I/O completion, but we already have web-server technology in place. Can someone explain what can be achieved via Node (apart from JavaScript on the server side, which can also be achieved via SignalR) that can't be done via ASP.NET, and why we should focus so much on Node when we have a ton of technology under the ASP.NET stack?
Is there a classic example of Node, typically for an enterprise dev shop?
Most web programming is for data display and eCommerce applications, which are mostly database intensive, though lately there have been mash-ups with web services as well. Yes, mobile web is a different game due to hardware sensors, I agree, but what is Node giving us that ASP.NET with SignalR can't?
TIA

What I find very interesting with Node is that everything is event based, which is different from programming ASP.NET or PHP, where behavior is more sequential. Not a bad thing, just a different way of doing things.
You can program the server itself (as opposed to programming applications that run on the server) to do more than serve files. The typical example with Node is the chat room application, where you broadcast messages to all participants and each participant can send messages to the server. By handling your own server events (like listen, error, connect, etc.) you have a lot of control over how things go server side.
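To make that concrete, here is a minimal sketch (in TypeScript, against Node's built-in net module) of the kind of chat-room broadcast described above; the port number is a placeholder and the @types/node definitions are assumed for compilation. Every connection and every incoming message is just an event handler, and the broadcast is a loop over the currently connected sockets:

```typescript
import * as net from "net";

const clients: net.Socket[] = [];                 // everyone currently connected

const server = net.createServer((socket) => {
  clients.push(socket);
  socket.write("welcome to the chat\n");

  // every chunk a client sends is broadcast to all the other clients
  socket.on("data", (chunk) => {
    for (const other of clients) {
      if (other !== socket) other.write(chunk);
    }
  });

  // drop the socket from the list when the client disconnects
  socket.on("end", () => {
    const i = clients.indexOf(socket);
    if (i !== -1) clients.splice(i, 1);
  });

  socket.on("error", () => socket.destroy());
});

// the "server events" mentioned above: error, listening, etc.
server.on("error", (err) => console.error("server error:", err));
server.listen(4000, () => console.log("chat server listening on port 4000"));
```

Point a couple of telnet sessions at port 4000 and whatever is typed in one shows up in the others; there is no IIS or Apache in front of it, the script itself is the server.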
Then of course npm, the Node package manager, is definitely a plus over having to manage dependencies manually if you want to use third-party libs.
To host an ASP.NET site/app you need IIS, which is a proprietary system, whereas Apache and Node are more open. Granted though, Node hosting is not as widespread as Apache-based hosting.
Hope this answers some of your questions

Each technology can achieve anything. If you prefer ASP.NET over Node, use it. ASP.NET is extremely powerful and there is no reason to use Node over ASP.NET when you have the expertise and the software/money to run your services. Node is just different; it has a different execution model (a single-threaded event loop with non-blocking I/O rather than a thread per request) and, above all, it is open source and free. It is easy to get started on any OS, and easy to deploy on any OS. But in the end it comes down to: what do you prefer?
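As a small illustration of that execution model (just a sketch, with a placeholder file path): a slow operation is started, a callback is registered for when it completes, and the single thread moves on to the next event instead of blocking.

```typescript
import { readFile } from "fs";

console.log("start");

// the read is handed off to the OS; the callback runs later, on the same thread
readFile("/etc/hosts", "utf8", (err, contents) => {
  if (err) {
    console.error("read failed:", err);
    return;
  }
  console.log(`the file is ${contents.length} characters long`);
});

// this prints before the file contents arrive, because nothing blocked
console.log("end");
```

The same thread keeps handling other callbacks while the file read (or a database query, or an HTTP call) is in flight, which is the core of the Node pitch.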

Related

Best tool: Distributed load test for asp.net applications [closed]

We want a high-performance testing tool for a distributed scenario.
We want to collect data from the clients and from the server (memory usage, CPU usage, response time, .NET calls, etc.).
Most of our applications use .NET 4.0 or Classic ASP.
We have four servers. We want one controller and three agents working together to run the tests and collect the data.
What's the best tool for this scenario?
PS: We've tried Visual Studio 2012 Ultimate and it seems promising. I don't know of other tools that fit the scenario.
Give Load Tester a try: http://www.webperformance.com/load-testing/ (disclaimer: I work there). It has a monitoring agent that will run on your Windows servers to collect the metrics you mentioned and a lot more. It also collects client-side metrics such as page load time. The LITE version is free and can run simple tests with unlimited users.
Take a look at Rational Performance Tester. I was about to purchase a license for one of our projects but didn't push through for reasons not related to the software. Looked promising back then.
I would split things up to keep it simple.
First I would check what the average requests per second is when using your servers to generate load. For that there is a small tool included with Apache HTTP Server called ab.exe (ApacheBench). It's easy to set up to generate requests; for example, running "ab -n 1000 -c 50 http://yourserver/page" fires 1000 requests at a concurrency of 50 and reports requests per second and response times.
If you find that you get acceptable response times, all is well.
If not, use something like JetBrains dotTrace (attached to your app) to collect profiling data while generating load from one server.
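If installing Apache just for ab is not an option, the same idea (fire batches of concurrent requests and look at the average response time) can be sketched by hand. The following is only an illustration of what ab automates, not a replacement for a real load tool; the URL and counts are placeholders and it assumes Node 18+ for the built-in fetch:

```typescript
// Rough concurrent-load sketch; TARGET, TOTAL and CONCURRENCY are placeholders.
const TARGET = "http://localhost:8080/";
const TOTAL = 1000;
const CONCURRENCY = 50;

async function timedGet(url: string): Promise<number> {
  const start = Date.now();
  const res = await fetch(url);
  await res.arrayBuffer();          // drain the body so timing covers the full response
  return Date.now() - start;
}

async function run(): Promise<void> {
  const durations: number[] = [];
  for (let sent = 0; sent < TOTAL; sent += CONCURRENCY) {
    const batchSize = Math.min(CONCURRENCY, TOTAL - sent);
    const batch = Array.from({ length: batchSize }, () => timedGet(TARGET));
    durations.push(...(await Promise.all(batch)));   // wait for the whole batch
  }
  const avg = durations.reduce((a, b) => a + b, 0) / durations.length;
  console.log(`${durations.length} requests, average response time ${avg.toFixed(1)} ms`);
}

run().catch(console.error);
```

This only tells you whether response times are in the right ballpark from a single injector; for the controller/agent setup in the question you still want a proper tool.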

Various Websites / Applications - Changing the architecture [closed]

I have a number of websites and web based applications running on a dedicated web server. The same box currently also runs the database (and this is unlikely to change.)
I have noticed that the requirements I am getting across different projects often duplicate each other. So, whilst I am quiet, I am trying to re-plan the architecture of the various sites.
Ideally I'd like to extract the duplicate functionality (logins, some of the user reports, error reporting and so on) into a core library, which got me thinking.
If I make a core assembly and add it to the bin of each website, then it will communicate with whatever database is in the app config file. But that gives me versioning/maintenance headaches when things change.
I can put this core library in the GAC, which means I need to register/unregister it, but all apps can include and use it as necessary.
Or the third way I can see of doing this is to use WCF web services and add another internal tier to my apps, where they hand off the core work to a separate set of web services. The advantage of this is that if/when we expand, all the interfaces can stay as a set of web services, leaving my apps just making HTTP or TCP calls rather than me having to worry about moving bin files or GAC'ed assemblies.
Basically I am here to see if anyone has any thoughts/comments/criticisms of any of these approaches, as I'd hate to start going down one road and then have to redo it all going the other way; as Murphy's law states, it will go wrong just as we have a major piece of work come in :)
I suggest referencing the library as you need it (so it resides in the bin of each application).
What if you need a slight change to one of your reports? After you make the changes, do you plan to do regression tests on all existing apps?
Keeping a separate copy of the DLL per app allows you to upgrade it as and when necessary.
Make sure you can find out which app is running which version of the dll.
From a maintenance point of view, you do not always want to change what works.
"What can go wrong, will go wrong", so limit the area where it can go wrong.

SQL Server vs Access Database for Web: Compelling Arguments [closed]

I know you're thinking "hands down SQL Server" (as am I), but I'm finding myself in a delicate situation that requires me to "sell" this to my new supervisor (not a developer).
What I'm looking for are compelling arguments for non-technical people, and some for those who are "slightly" technical but don't really understand the differences. I'm having a hard time convincing my current shop that this is not only extremely inefficient but dangerous in so many ways. I won't be able to give them a dissertation to convince them, however. What arguments can I give them "quickly" that will make them understand how serious this could be?
Thanks!
It depends, really. I'd suggest SQL Server Express if money is the problem, though.
Also there is this:
http://support.microsoft.com/kb/303528
Microsoft Jet is not intended for use with high-stress server applications, high-concurrency server applications, or 24 hours a day, seven days a week server applications. This includes server applications, such as Web applications, commerce applications, transactional applications, and messaging server applications. For these types of applications, the best solution is to switch to a true client/server-based database system, such as Microsoft Data Engine (MSDE) or Microsoft SQL Server. When you use Microsoft Jet in high-stress applications such as Microsoft Internet Information Server (IIS), you may experience any one of the following problems:
- Database corruption
- Stability issues, such as IIS crashing or locking up
- Sudden failure or persistent failure of the driver to connect to a valid database that requires re-starting the IIS service
You don't provide any info to really answer this. What is your application all about? What load will it need to handle? How much data will it retain? What are the backup and availability requirements? Etc.
If you are building a little web page for internal use only, Access may get you there. For anything else, or for future expansion and better tool integration, SQL Server is the right tool. Just download the free Express version and build your application; the available features and the compatibility with the paid version are worth it alone. When you outgrow Access you'll have to throw everything away and start again, whereas with SQL Server Express you can migrate without changing anything.

Are there any guides on configuring ASP.NET Trust levels on IIS [closed]

I am looking for either a best-practice, supported guide from Microsoft or a blogger's/developer's guide to the same. Or both.
I am setting up some servers for hosting and I want to configure them with just enough permissions. I have done this before, where I modified the Medium trust level and gave it database permissions and so on, but I only skimmed over it.
I want to set up solid machines with the common permissions that people actually use. Is there a resource that explains in detail what each trust level grants by default? That way I could compare and go from there.
To start with security, I have made a rule on my machines that I only create dedicated application pools per site/user. I know Microsoft says that each website is virtually separate, even in a shared application pool, but I just don't trust it.
I also know I shouldn't run in Full Trust as I am opening up my server to all kinds of attacks.
I have a bit of knowledge of this, but not enough, so hopefully you lot can help me. I'm not wanting to be spoon-fed what to do; I have no problem figuring it out, I just can't find the info to start with.
I appreciate your help.
Anthony
I'm running:
Windows Server 2008 R2 64-bit with IIS 7.5 and a combination of 2.0/3.5 and 4.0 application pools.
The strict best practice is "don't let anything do anything to anything" but that is counterproductive in general -- if you aren't taking HTTP requests, you don't have a working HTTP application server.
That said, your question is very general and very nebulous. The first key question is "what sort of hosting scenario is this?" For example, full trust isn't necessarily a bad thing in a dedicated scenario, or even a shared server between "friendly" apps that should trust each other. But it is bad in a hotel server situation where you've got random guests sharing space.
The second question is what sorts of apps are you hosting? You've got completely different frontages depending on what you are doing -- spammers don't try as hard as thieves. Spies try even harder.

Load Testing tool for web applications [closed]

I am trying to load test a web application, but I am having a hard time finding good tools that are affordable. I came across the Web Performance Load Testing Tool, which is pretty cool but limits you to 10 users, and after that it costs thousands.
Does anyone know any good techniques for load testing a web application?
Thanks
JMeter is definitely worth a look:
http://jmeter.apache.org/
Relatively straightforward to learn and pretty easy to get up and running.
We've been using a site called http://loadimpact.com/ - it allows tests from multiple locations and scripted paths. As long as you understand what it's doing and what it's telling you, it's pretty reasonable.
Trying to do load testing on a budget and still get meaningful results is pretty tough, as you'll run into network bottlenecks if you're not using multiple injectors and the like to simulate real users.
John
I reviewed some more tools here.
I went with Visual Studio.
If you want load testing to be 100% compatible with your website, I would recommend www.browsermob.com; they use real browsers in their load tests. All you need to do is record a Selenium script using the Firefox Selenium IDE add-on, upload the script to BrowserMob, and execute it.
However, there are also other tools that work on different layers. BrowserMob/Selenium works on the GUI layer with real browsers; I believe it is the only load testing tool which does this. HP LoadRunner (Click and Script) and Neotys NeoLoad also work on the GUI layer; however, they don't use real browsers, they have their own built-in browser that really isn't suited for next-gen web applications (Web 2.0/AJAX and the like).
On the other hand, you have the tools that work on the HTTP layer (they sniff the HTTP traffic and record the GET and POST requests). This toolset includes Grinder and JMeter. HP LoadRunner also has a URL-based protocol to handle this (which is also the most popular way of using HP LoadRunner, AFAIK). This route, however, forces you to include application logic in your tests if you are testing a rich internet application (e.g. AJAX push/poll).
So basically it's all about what application you are going to test and what your budget is for a testing tool.
For rich internet applications I would probably recommend browsermob.com, which is based on Selenium, or roll your own Selenium testing platform using Selenium Grid (which requires a lot of hardware). The only downside is that you will have to implement or purchase your own server-side monitoring toolset, as BrowserMob doesn't have this at the moment. Also, if you need more monitoring on the client side, you probably want to put a proxy in between BrowserMob/Selenium and your application.
Monitoring is where LoadRunner really does a good job: it has all kinds of monitors and graphing (client side and server side) where you can correlate graphs and data and easily spot patterns and problems in your application and server systems. However, HP LoadRunner doesn't come cheap.
NeoLoad is a tool I'm not very familiar with, but I'm guessing it's like the little brother of HP LoadRunner: much cheaper but with less functionality and monitoring.
JMeter and Grinder are open-source tools for load testing, and they are powerful. The drawback is that they only work on the HTTP layer; however, that's only a drawback if you have to handle AJAX applications. For a basic website with no AJAX/Web 2.0 features, where a single HTTP request is a new page load, these are probably the tools for you.
Disclaimer: I am a founder of Cloud Assault
https://www.cloudassault.com is a service that provides load and scalability testing for websites, APIs, and Internet infrastructure. If you're doing continuous integration, check us out. You can drive everything through our API: creating tests, monitoring them, and retrieving their results.
