How to distribute ASP.NET app on lightweight hardware (such as NAS)? - asp.net

I want to ship a piece of hardware to clients that they plug in to their network via Ethernet or USB. This device contains an ASP.NET web application that they access via a web browser on any PC in their network.
This needs to be a small device that costs less than $500, meaning it can't be a full server with a Win2008 server license. This would be repeated hundreds or thousands of times - once for each new customer.
Are there external hard drives or NAS devices that can run as an IIS/ASP.NET web server?
Thanks,
Roger

If you stick with a PC setup, you might be able to use a desktop OS and IIS Express. It should support everything you want; you might even be able to run this on a cheap netbook.

I'm sure you could build a small PC based on an embedded motherboard, or even a Mini-ITX board. But this is a programming Q&A site and not really the place to ask about building servers.

If you're looking to keep it cheap, I would highly recommend looking into Mono, which is free and runs ASP.NET very well. If you have any Windows-specific dependencies you may need to change those, but hopefully you wouldn't have many of them in a website.
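To illustrate guarding the Windows-specific bits mentioned above, one widely used idiom is to detect at runtime whether the code is running under Mono and branch to a cross-platform code path. This is only a minimal sketch; the class and the messages are illustrative, not from any particular project:

    using System;

    static class RuntimeInfo
    {
        // Common idiom: the Mono runtime exposes a "Mono.Runtime" type
        // that does not exist on Microsoft's .NET Framework.
        public static bool IsRunningOnMono()
        {
            return Type.GetType("Mono.Runtime") != null;
        }
    }

    class Demo
    {
        static void Main()
        {
            if (RuntimeInfo.IsRunningOnMono())
                Console.WriteLine("Mono detected: use the cross-platform code path.");
            else
                Console.WriteLine(".NET Framework detected: Windows-specific APIs are available.");
        }
    }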

You should look into converting your app to Mono, running in a virtualized environment. The OS and the runtime environment would be open source and would allow you to distribute it freely.
Mono.Net
VirtualBox - VM environment
Ubuntu Linux OS

Buy a netbook with Windows 7 Home Premium on it, as that bundles IIS7. If you need any more "capacity", then you should look at bigger hardware anyway.

Related

How to Protect program from using on the SERVER?

I have a program that is a converter for .NET and can be used in other .NET projects.
I have two kinds of license:
Developer license for DESKTOP software
Developer license for WEB server deployed software.
How can I protect my program so that if a client buys license (1), they CANNOT use it on the SERVER?
Disclaimer: I don't know anything about .Net, other than how to spell it, and I'm not completely sure about that.
It seems like one difference between a person using your file converter on their desktop and using it on a web server is that only a single instance will be running at a time on the desktop; a web page will probably have multiple instances, one per concurrent request. This seems like something you could enforce in software, and also something you could easily write into a license agreement.
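As a rough sketch of that software enforcement (purely illustrative; the mutex name and messages are made up for this example), a desktop-licensed build could refuse to start when another copy already owns a named mutex, something a web server handling concurrent requests would trip over almost immediately:

    using System;
    using System.Threading;

    static class SingleInstanceGuard
    {
        // Hypothetical mutex name; any unique, product-specific string works.
        const string MutexName = @"Global\MyConverter_DesktopLicense";

        public static Mutex Acquire()
        {
            bool createdNew;
            var mutex = new Mutex(true, MutexName, out createdNew);
            if (!createdNew)
            {
                mutex.Dispose();
                throw new InvalidOperationException(
                    "Another instance is already running; the desktop license allows only one.");
            }
            return mutex; // keep this handle alive for the lifetime of the process
        }
    }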
Does IIS run with a graphical console on Windows? If it doesn't, and your desktop version does, maybe you could detect that?
Ultimately, though, if someone wants to get around your server/desktop distinction badly enough, they will; they could, for example, have the web server send the document to a desktop machine and have the desktop send it back to the server. So, at some point, you'll have to give in and either ignore it or say that's a problem for legal to handle.
If it is desktop software (I'm not sure from the question or the tag), you could use the Environment object to check what OS the code is running on and stop it from running on server editions of Windows. This won't help if they run a server using XP or the like, but it's a start.
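A minimal sketch of that kind of check, assuming a reference to System.Management.dll: Environment.OSVersion by itself doesn't say whether the OS is a server SKU, but the Win32_OperatingSystem WMI class exposes a ProductType field (1 = workstation, 2 = domain controller, 3 = server) that can be used as a guard. The class name and messages below are illustrative only:

    using System;
    using System.Management; // add a reference to System.Management.dll

    static class LicenseGuard
    {
        // Win32_OperatingSystem.ProductType: 1 = workstation, 2 = domain controller, 3 = server.
        public static bool IsServerOs()
        {
            using (var searcher = new ManagementObjectSearcher(
                "SELECT ProductType FROM Win32_OperatingSystem"))
            {
                foreach (ManagementObject os in searcher.Get())
                    return Convert.ToUInt32(os["ProductType"]) != 1;
            }
            return false;
        }

        public static void EnforceDesktopLicense()
        {
            if (IsServerOs())
                throw new InvalidOperationException(
                    "This desktop license does not permit running on a server operating system.");
        }
    }

As the answer above notes, this still won't catch someone serving requests from a workstation OS, so it's a deterrent rather than real protection.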

VMWare - network applications

I am developing a distributed file system using Java; I cannot give many details at this moment. I need to test some things on Linux, so I will use VMware Server and install Linux inside a virtual machine. Is there any difference between the simulated network card and a real Ethernet interface?
I am developing a distributed file system ... I will use VMware Server and install Linux inside a virtual machine.
VMware is great for this sort of thing. There should be no difference except, as RichieHindle said, in performance, especially if you're planning to run multiple VMs on the same server.
Use real hardware if you want usable performance benchmark results.
Java is its own 'VM'... on top of a layer of virtualization in the guest OS... on top of VMware... on a virtual execution model CPU. Take a little virtualization here, add a little virtualization there, and pretty soon we're talking about some real abstraction!
From the point of view of application code, no, there's no difference.
The only visible difference might be in performance - the speed of response and exact timings of things might be different, but you're talking microseconds.
There's so much general-purpose software that works flawlessly under VMs that the answer to almost every question of the form "Are VMs different from real machines?" at the application level is "No".
(Things might be different if you were talking about kernel-level driver software.)

Are you using virtual machine as your primary development environment?

Recently I purchased a notebook that came with Windows Home Basic (which doesn't come with ASP.NET/IIS). I thought about upgrading the Windows version to one with ASP.NET/IIS, but then I considered another possibility:
I have a hard disk enclosure with a 360 GB drive. I thought about creating a virtual machine with Windows Ultimate (also installing ASP.NET, IIS and Visual Studio 2008) on this external drive, so I could access my "development environment" on any computer I work on (my desktop machine and my notebook).
But I was worried about the performance. I don't have experience working in virtual machines (I use them only for quick compatibility tests).
Are you using a virtual machine as your primary development environment? What are your findings?
Edit
Thanks for your answers! It really did help me!
I would also like to know about portability, i.e., will the virtual machine that I created on my laptop work on the desktop? Will I need to re-activate Windows?
I use VMWare and Microsoft VPC-based VMs quite a lot, hosted in a Quad 6600-based XP Pro box.
My use of VMs was initially to test in different environments, and for debugging I've had to install SQL Server and VS2008 in one or two of them.
For those purposes, VMs are very convenient.
But based on that experience, I wouldn't make a VM my primary dev environment, simply for performance reasons. VM performance is surprisingly good, but the difference (for pretty much everything), although not huge, is enough to notice.
When I'm compiling dozens of times a day, running big queries, etc, etc, I don't want my dev machine to be any slower than it absolutely has to be.
Working with a virtual machine is fine as long as you have enough RAM for both operating systems.
You should also be aware that virtual machines have some limitations e.g. when it comes to supporting graphic cards, so you'll want to make sure that whatever you are developing does not depend on a feature that is not available on your virtual machine.
I have been using VMWare as my primary development environment for a couple of years now.
Some Environments I typically switch between
Windows Forms / WPF Development (XP, .Net 3.5)
Ruby Development - did one website in ruby (XP, Instant Rails)
ASP.net 3.5 - for playing with new stuff (XP, .net 3.5, IIS, SQL Server)
ASP.net 2.0 - sometime places are still stuck on 2.0 (XP, .net 2.0, IIS, SQL Server 2005)
Some things I have learned
Use XP, not Vista. When you are running multiple VMs, the extra fluff is really noticeable.
Give each VM around 1 GB (sometimes as little as 512 MB). You want to give them the least amount possible that prevents swapping for what you regularly do.
Keep a snapshot of the base install for your stack before doing any development.
Quad core + 8 GB of RAM is cheap now. I typically have several VMs running while developing. If you have less RAM or fewer cores, keep the number of running VMs down.
Turn off software mouse cursors and run in full screen mode (most people don't realize it's a vm until I show them).
Benefits
I can be up and running in any of my major stacks in 5 minutes on any PC I own.
I can move my entire development environment onto a laptop or another PC in a pinch.
I can easily keep separate dev stacks around that would otherwise step on each other's toes.
Hard Drives:
Your first bottleneck is going to be RAM, but RAM is cheap now, so there is no reason not to have 8-16 GB. Your next major bottleneck is hard drives. I now try to have one hard disk per active virtual machine (for a desktop workload, not server stuff). RAID setups can help tremendously, and SSDs completely solve the problem if you can afford them.
I have been using VMware since 2002. My first use was to create a development environment in a guest VM, then replicate it for my teammates. When it came time for a hardware upgrade, I switched to my new desktop in about an hour (install VMware, copy VM).
I use VMware constantly, on desktops, on notebooks, and on servers. I use them for development, testing, and production. I have tried playing games inside a VM, but most games just don't cut it (and VMware says so, but I tried anyway). However, the newest VMware Workstation versions just might play a few games okay.
I particularly like VMware on my laptop, and I really like to use it for Ubuntu Linux. I find it best so far to use Windows (2000 Pro SP4 or XP) for the host OS because of the superior device drivers. However, I prefer to actually work on Ubuntu for my development, but that works great in a VM. I have installed Ubuntu directly on various hardware, but so far have not been satisfied enough to leave it for more than a few months before reverting to Windows again.
However, my laptop does run Ubuntu nicely, and I only reverted back to Windows XP because I want to install an eSATA card that will give me high-performance access to an external hard drive to...run VMs!!! I have not yet done that install due to distractions at work.
Speaking of work...I have "acquired" three old orphaned desktops that I am turning into VM hosts. I am about to attempt loading VMware ESXi, although I just finished loading two with Windows 2000 Pro SP4 and VMware Server 1.0.8. I manage a development team, and I am primarily targeting these VMs for development environments since our company cripples our primary desktops/laptops too much for real work.
Drink the Koolaid!!! VMware is awesome, and there are lots of other good VM options as well!
Best wishes.
EDIT: more goodies...
In particular, check out VMware ESXi, VMware Server, and Ubuntu JeOS. Yummy stuff!!!
No, but I use one as a test machine when I am testing web pages in IE.
Yes, I use VMWare workstation 6.5 and ESX 3.0 for my servers. Works like a charm. No noticeable performance penalties.
I have used VM's for development in the past, and I use them a lot for testing of various sorts. Using a VM for development works quite well, the only thing I would caution you on is that some external hard drives are quite slow, which may give you a problem, but fast drives work well.
I'm using Ubuntu as my linux development environment on top of Vista 64.
The machine has a 10k drive, lots of fast memory, and a dual core CPU, so it runs very well. I ended up with this hodge-podge because, at the time, I built a machine Ubuntu wouldn't run on, and going with a VM was the easiest way to deal with that. I've found it quite convenient, though, so even though Ubuntu would likely be fine with the machine now, I'm staying with the VM.
It also makes it trivial to fully back up my dev environment and take it offsite or distribute it (i.e., GPL compliance is a cinch - no need to walk people through getting a dev environment set up and dealing with the quirks of different software versions, etc.).
Needed for embedded ARM linux development.
-Adam
I use VMWare Fusion on a Mac to run Windows Server 2003, Visual Studio and all my IIS requirements.
I have no problems, but my MacBook Pro does have 4 GB of RAM, with 2 GB allocated to VMware when it's running.
My primary Windows development environment is a native Vista x64. For the graphics card reasons mentioned above and (possibly unfounded) concerns about the VM environment and debugging, I decided I still wanted a native environment for my .Net and Windows Win32/64 development. I'm working a lot more in GUI development at present.
However, one very important kind of development I've used VM's for in the past is Python-based programs, whether pure Python with wxWindows or embedded Python called from C++. Using a VM allowed me to control the Python environment and work against different installs that were guaranteed to match the deployment environment. I'd suggest this for anyone using a dynamic language where you tend to install lots of external packages and it's hard to cleanly revert to earlier versions.
Another thing to consider is using a VM as your target with a remote debugging nub. Many REALbasic developers on Macintosh do this for their Windows testing (REALbasic has fabulous cross-platform debugging) but I've also used it in the past for Visual Studio.
Snapshots are handy. You can use multiple VM's for testing on different OS's.
Our engineers run a Windows VM on VMware ESX. We probably have 12 Windows VMs running on a single Dell PowerEdge (yes, it's beefy, but still). They almost seem snappier over the network than my local install of XP on a Core 2 Duo!
And on a local machine, as long as you have the RAM for it, it can still perform very well. A stripped-down VM of XP (something like TinyXP) performs as well as my 6-month-old native install!
Regarding portability; assuming the same architecture (and operating system), then the virtual machine should run fine on both physical computers. Provided the hardware configuration of the VM doesn't change too much, you shouldn't need to re-activate Windows within the VM.
I run Ubuntu as my primary OS, and then use a Windows 2003 virtual machine (using VirtualBox) to develop in Windows. Mainly use it for Visual Studio 2008 web development. Been doing so at home for 3 years, but now in the process of trying it out at work.
Works fine, even with ReSharper and a solution of around 50 projects. It's not quite as fast as running it all on Windows directly, but having one nicely set up virtual machine means I can share it with other developers, plus easily switch between VMs (we're looking at trying out Windows Server 2008).
It also means I can use Windows but let Linux take care of things like IM, Firefox, music (of course), indexing (Tracker), FTP, etc. Plus I have the terminal at my disposal (grep, ssh, you name it).
We tried this with ghost images of Windows but found that as people have different hardware the image wouldn't always work.
I run Windows 7 64-bit on my laptop with 3 GB of RAM (yes, it's low).
I find running my dev environment in VMs a real pain in the butt.
When I have 5 IDEs, SQL Server 2008, Notepad++, the Oxygen XML editor, and Chrome/IE/Firefox all open in my VM, while my main machine has Outlook, OneNote, and a few other programs running, it turns to crap.
Using multiple monitors isn't easy with VMs either, especially if you take your laptop away a lot and then hook it back up; the resync is time consuming.
My other co-workers have the same issues, even with 5-6 GB of RAM.
If I added up all the time wasted waiting on the extra processing the VM causes, it would be more than the time it takes to rebuild my computer from scratch - which is under 8 hours.
This depends on the situation, really. The most horrible environment I've encountered in the corporate world is a Windows laptop + Linux virtual environment (where the laptop itself isn't top of the breed; I dislike having a laptop as a development machine in the first place). I'm mostly a Java developer and like to write a lot of unit tests, and usually with this combination (I really don't want to use one, but I'm too lazy to complain all that much) running unit tests takes a hell of a lot of time.
Of course this depends on the types of tests, but in this case my guess is that disk I/O is just slowing things down. I like to compile and test a lot, so that's the main reason I'm in favor of native environments: speed. Even a little hit there sometimes feels like too much. Sorry for not answering on a broader scope, and for being very subjective.
I use Parallels on a Mac - have no issues.
At work we use VMs for most of our test environments - they work very well
I used to use a virtual machine as just a sandbox to keep unruly applications from doing unruly things (Sandboxie is an awesome program for this), but I always ended up forgetting to keep them contained and would just reinstall Windows every now and then, much easier than constant maintenance really.
As for a virtual machine, you're stuck with the hardware you're given, as your BIOS and hardware are all emulated. That makes writing something low-level practically impossible in certain circumstances. However, when using a hosted server on a virtual machine through a remote desktop connection... Absolutely wonderful, so easy! I can be in and out in a couple of clicks, so I guess that's the number one virtualization I use.
I use VMware to debug device and filter drivers. VirtualBox is nice and fast for occasional Ubuntu, and for an XP VM holding IE6, FF2, Chrome, Opera, & Safari for testing, plus installs of apps I don't trust.
I have a Windows 2008 Hyper-V machine that runs a couple of my development environments. It's not slow at all (that I can notice). Some of my environments are not virtualized though. Usually if the setup requires something where rollback is difficult, then I'll use a virtual machine, if not, I'll just use a desktop.
Keep in mind licensing costs. If you're going to virtualize a copy of Windows Server, you're going to need a license for Windows Server as well. It's probably cheaper for you to upgrade your Vista Home Basic to a version that runs IIS (although my suggestion would be to run the Windows 7 beta since it's free, and then upgrade to the final version of Win 7 when it's ready).
Now that it's been almost a year, are you guys still using virtual machines as your development environments? I used to, but have stopped since performance is getting bad. Just wondering.

What program can I use to remotely help clients?

I have a lot of people who ask me to fix their computers. Usually it is "slow computer" or "my computer has pop-ups," etc. In other words, they have viruses and spyware. I thought I could use a remote program to do it, instead of them bringing their computer to me or me traveling to their house.
I thought of UltraVNC, though I'm not sure how I would get them to use it. What I would like to have is a program they can download from my website.
What program would you recommend for this? Remote Desktop? VNC? Something else? I'm happy to pay a small fee if necessary to make things as seamless as possible. Word of mouth is valuable and a good referral for an easy to work with computer person (me) is worth that monthly or one time fee.
I have Vista Home Premium and Mac OS X; most clients will have Vista Home Premium or XP Home. I can use Linux if necessary; I just don't have it installed right now.
Thanks.
EDIT: Is there an alternative to Copilot? I like it, but I'm afraid to stake everything on one provider.
https://www.copilot.com/
It's made to be simple so even the most novice computer users can figure it out.
Copilot helps you fix someone's computer problems by letting you connect to their computer, see what they see, and control their mouse and keyboard to help fix the issue.
It's nice because they just go to the site and enter the code you give them. The installation is simple from there.
LogMeIn has a free version that works very well. It runs in the user's system tray, and you can log in and control their computer as long as they have the program running. The free version has fewer features, but they're mostly luxuries rather than necessities.
TeamViewer is a desktop-sharing and remote control support tool. It is free for non-commercial, personal use.
There are a few different options:
Remote Desktop: Nice interface, integrates with Windows very well (I had no trouble connecting to my Vista desktop from my XP laptop). I think your client would need to have Windows XP Pro; XP Home does not include the Remote Desktop server.
RealVNC: Nice interface, the free version is very useful. Encrypted connections are available with the non-free version.
There are others (like Copilot), but I have only used Remote Desktop and RealVNC.
With either of these, you need to make sure port forwarding is set up if they have a router, and that the firewall whitelists the program.
Windows XP has built-in Remote Assistance, which lets you send an invite to another Windows machine (typically via e-mail) and allows you to remotely control the machine while they watch. This is a nice option because it is already built into Windows (albeit not as well known as Remote Desktop or LogMeIn).
The advantage over Remote Desktop is that the user can see what you are doing to their machine, and control can be passed back and forth.
This link has the steps to do this.

Can Windows Web Server 2008 be used to host games?

I'm currently using a Linux server; we run a couple of web sites off it, PHP apps with MySQL, the usual. Since the server is privately owned by some friends and myself (we do have it hosted at a professional datacenter, though), from time to time we also use it to host our smallish Counter-Strike: Source and Call of Duty 4 matches by running the released dedicated game server packages.
I've recently subscribed to DevExpress' excellent WinForms and ASP.NET component suite, and I am contemplating moving to Windows to make use of those ASP.NET components. I'm currently trying to decide between the Web and Standard editions of Windows Server, since there is a difference of nearly a thousand bucks (where I come from).
For Windows Web Server 2008, Microsoft has softened the database server restrictions and made it clear there is no need for CALs. But would one be able to run the above-mentioned game servers? I've been googling and searching through forums to no avail.
Need some help before I plunk in the cash.
Thanks.
Before I give any opinion, I'll start by answering your core questions:
Yes, you can run dedicated game servers on Windows Server Web ed.
The differences between web and standard:
Web only supports 2 GB of RAM. Standard in 64-bit mode can support 32 GB (and more?).
Standard comes with more things that are better suited to local server environments (e.g. Active Directory). If you want LDAP-controlled Exchange email, you'll need Standard. Most web servers don't need these.
Web (apparently) won't support full-blown SQL Server versions. Express should run, though.
Opinion time.
The monetary overhead of dedicated and virtual dedicated Windows servers is a lot... to the degree where you're paying more for the software than the hardware costs, at least for the first year.
Renting the software (as part of a managed dedicated server or VPS) is initially a lot cheaper, but over the course of a couple of years it will cost you about the same, and if you run it longer, it'll eventually cost you more.
Shared Windows hosts can be good. I've been with a company called Hostek (Florida-based) and they've bent over backwards to make hosting a fairly busy site (around 6000 uniques a day) very cheap for me. It can also be atrocious. I've had bad hosting companies too. Shop around.
About a year ago, I dropped Windows at home in favour of Linux. I'm not going to enumerate the many benefits and drawbacks; I'll just tell you that that's when I stopped doing .NET in favour of more open frameworks. I'm now using Django (a Python-based web framework). While you might not like it (or other frameworks, e.g. Ruby on Rails), I'd plead that you check out what's happening in the open-source world before you pay for anything Windows-related, since you already have the infrastructure available for hosting Django/Rails/et al.
If you wanted your own Linux server, VPSs start from around $20 per month. As I said before, far cheaper than the Windows counterparts. I now use Linode to host everything new I make. Highly affordable, and they'll easily run dedicated game servers like your current setup does.
Mono isn't an option for you. Not yet, anyway. It does go some way towards helping people migrate their applications, but it's still pretty sketchy on the ASP.NET front. And as a comment says on another answer: the controls you want to use are strictly Windows-only for the moment.
Linux will consume fewer baseline resources than Windows will. On an old server (Windows 2000, IIRC) I had to administer, the core of Windows would consume anywhere from 100-200 MB of RAM. My current Ubuntu server eats 40 MB. I'm not sure how much RAM you have to play with on your server, but if it's a lower amount, you're going to fit a lot more on a Linux host. (Remember that if you have more than 2 GB, you don't have the choice of the Web Server edition.)
It's clear from this that I'm a complete Linux super-enthusiast, but I know my needs differ from yours. ASP.NET is a great platform but it costs a lot of money even if you're splitting it between friends. You could opt for Windows... Or you could go Linux, donate a bit to the projects you use and buy a new plasma or something shiny for the lady.
SPLA? Isn't that for service providers? My friends and I use the hosted services for ourselves (games, email and web), though of course our web sites are publicly viewable by all; but I think that hardly qualifies as "providing a service"?
Unfortunately, staying with Linux would mean that I would not be able to use my DevExpress components, which are my reason for considering Windows Server in the first place. .NET may be partially supported by Mono, but not fully, and DevExpress makes use of certain features of .NET that aren't (at least as yet) supported by Mono.
We also already own our dedicated server, so we are only looking for a suitable OS.
Still, your reply is appreciated.
