I have the same problem as the poser of this question:
System.Data.OracleClient requires Oracle client software version 8.1.7
I have made the changes to the security settings on the Oracle folder, and have to wait for the server to reboot overnight.
My question is: why is this reboot necessary? I am still getting the same error after making the changes without rebooting, so I don't doubt that it is. Is there an alternative to rebooting the server, such as IISRESET? (Although I wouldn't be allowed to run IISRESET during the day either.)
Perhaps not an answer to your specific question, but for the record, it is for this kind of reason that I always favor Oracle Instant Client:
You don't have to install anything on the target machines (including dev boxes!). So no tricky manual setup and goat sacrificing.
You can make sure that your application will run with the specific client you picked (version, x86/x64).
You could even easily have multiple applications working with different client versions on the same computer.
As a downside, it adds significant weight to your application (~19 MB minimum), and you can't participate in distributed transactions.
If you can still switch, this is the way to go IMHO. Check What is the minimum client footprint required to connect C# to an Oracle database? for more information.
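For what it's worth, here is a minimal sketch of the Instant Client approach from managed code, assuming the Instant Client DLLs (oci.dll and friends) are copied next to the executable. The host, port, service name and credentials are placeholders, and the provider shown is the same System.Data.OracleClient the error message refers to:

```csharp
using System;
using System.Data.OracleClient; // deprecated, but it is the provider the error message refers to

class InstantClientDemo
{
    static void Main()
    {
        // A full connect descriptor avoids any need for tnsnames.ora on the client.
        // Host, port, service name and credentials are placeholders.
        const string connectionString =
            "Data Source=(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=dbhost)(PORT=1521))" +
            "(CONNECT_DATA=(SERVICE_NAME=orcl)));User Id=scott;Password=tiger;";

        using (var connection = new OracleConnection(connectionString))
        {
            // oci.dll is resolved from the application directory when the
            // Instant Client files sit next to the executable.
            connection.Open();
            Console.WriteLine(connection.ServerVersion);
        }
    }
}
```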
Starting with Server 2003 (hosting IIS 6), restarting the service is enough to bring environment and security changes into effect.
But that is done with iisreset, which isn't allowed either.
That's a pity; I see no other way than to wait.
I've got a Windows 2008 Enterprise R2 Server running Ektron 8.02 SP3 that is causing me some trouble that I can't diagnose.
So my question isn't asking for a solution, but simply how to better profile a .NET application / Windows server. Whenever you try to POST a form built by the software, it takes 8 seconds and change to return the page (on this specific server; it doesn't do it on other machines with the same codebase). It appears that it's trying to make a connection to something for 8 seconds, fails, then returns the page without error. Some more facts:
This is a beefy VM that is not being over-utilized
The database is running on the same machine, so there is no lag there.
The 8 second delay even happens when submitting from the server itself
The event viewer for the server doesn't report any errors that seem related
Profiling MSSQL doesn't report any issues either
Microsoft Network Monitor doesn't report any glaring networking issues, though it's hard to say, as the tool doesn't report long connection attempts clearly (from what I've seen of it)
I feel sufficiently confident that there is some process during the form POST on the server that is trying to make a connection somewhere, failing, and then continuing through the rest of the process. It doesn't report any errors and the forms all submit fine.
So, all that being said, is there anything else I can do to debug this? I feel like I'm shooting in the dark. Thanks for any help you can offer.
It sounds like this isn't an application you wrote. If so, the potential solutions are different.
A first-level step would be to use Windows' Performance Monitor. Select the options (process and/or performance attribute to watch), then run the process through its paces.
If Performance Monitor can't help, you'll need to get into the real guts of the app. Most likely this will be more difficult than you want unless you wrote the app yourself. You can debug a .NET app that is not your own using Reflector (paid) or ILSpy (free). HOWEVER, this can violate the license agreement (and likely does if the app was purchased).
Apart from that, you're going to need to contact the product developer if you want to both stay sane and find this problem quickly.
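If you want to rule the web tier in or out before reaching for a decompiler, one rough option (just a sketch; the URL and form fields below are hypothetical) is to time a bare POST from the server itself and see whether the ~8 seconds shows up without a browser involved:

```csharp
using System;
using System.Diagnostics;
using System.Net;
using System.Text;

class PostTimer
{
    static void Main()
    {
        // Placeholder URL and form body -- substitute the real form page and fields.
        byte[] body = Encoding.UTF8.GetBytes("field1=value1&field2=value2");
        var stopwatch = Stopwatch.StartNew();

        var request = (HttpWebRequest)WebRequest.Create("http://localhost/your-form-page.aspx");
        request.Method = "POST";
        request.ContentType = "application/x-www-form-urlencoded";
        request.ContentLength = body.Length;
        using (var stream = request.GetRequestStream())
            stream.Write(body, 0, body.Length);

        using (var response = (HttpWebResponse)request.GetResponse())
            Console.WriteLine("{0} after {1} ms", response.StatusCode, stopwatch.ElapsedMilliseconds);
    }
}
```

If the bare request also stalls, the time is going inside the page's own processing (for example an outbound connection attempt), which narrows the search considerably.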
Sometimes a Visual FoxPro app doesn't find files on a file share even though they are there.
For example, when checking an existing file on a network share with File() in a loop, about 5% of the tries don't find the file.
This works on most machines, but sometimes it doesn't. In the current scenario I have a Windows Server 2008 machine as the file server (perhaps an SMB2 issue?).
I would patch your 2K8 server to SP1 (and any Windows 7 clients too); this will take care of any SMB2 issues. Those issues were around CDX index file corruption, though.
It's also possible that this is due to the caching that SMB2 uses, which can produce 'File Not Found' errors. The client registry settings involved are:
FileInfoCacheLifetime
FileNotFoundCacheLifetime
DirectoryCacheLifetime
There is a discussion regarding this on Alaska Software's website, and a useful MSI installer which can be run per workstation to adjust the settings. This company produces a product called Xbase++ but I would guess it is close enough to Visual FoxPro in terms of low-level file IO and locking.
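For reference, those three lifetimes live under the SMB client's LanmanWorkstation parameters key, and they can be set from .NET as well as via the MSI mentioned above. The sketch below is an assumption-laden example rather than a recommendation: zero disables each cache outright, the code must run elevated, and you should confirm the values against Microsoft's guidance for your environment first.

```csharp
using Microsoft.Win32;

class SmbClientCacheSettings
{
    static void Main()
    {
        // Standard location of the SMB client (workstation) parameters.
        const string keyPath = @"SYSTEM\CurrentControlSet\Services\LanmanWorkstation\Parameters";

        using (RegistryKey key = Registry.LocalMachine.CreateSubKey(keyPath))
        {
            // 0 disables each cache entirely -- verify these values before deploying.
            key.SetValue("FileInfoCacheLifetime", 0, RegistryValueKind.DWord);
            key.SetValue("FileNotFoundCacheLifetime", 0, RegistryValueKind.DWord);
            key.SetValue("DirectoryCacheLifetime", 0, RegistryValueKind.DWord);
        }
    }
}
```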
Not positive if it's an issue with Fox or your network. Going way back in time, I had a client with somewhat similar problems. We took FoxPro out of the equation and just used Windows Explorer, and it would still hang for a moment. It ended up that their network cards were set to energy-saving mode and would basically time out / shut down due to inactivity. The network drive share would apparently be released, and until the network card reconnected and got established again, they had issues. By changing the setting so the network card NEVER went into energy-save mode, the problem went away for them.
Yes. I have versions of FoxPro deployed on various servers, with various versions of Windows Server, and have never experienced an issue as described.
Maybe you could try a similar test using a different programming discipline: .NET, Access, Ruby, etc.
Post your test loop, just out of interest?
My company has been using Hamachi to access our SVN repository for a number of years. We are a small yet widely distributed development team with each programmer in a different country working from home. The server is hosted by a non-techie in our central office. Hamachi is useful here since it has a GUI and supports remote management.
This system worked well for a while, but recently I have moved to a country with poor internet speeds. Hamachi will no longer connect 99% of the time - instead I get a "Probing..." message that doesn't resolve. It's certain to be a latency issue, as the same laptop will connect without problems when I cross the border and connect using a different ISP with better speeds.
So I really need to replace Hamachi with some other VPN/protocol that handles latency better. The techie managing the repository is not comfortable installing and configuring Apache or IIS, so it looks like HTTP is out. I tried to convince my boss to go for a web hosting company, but he doesn't trust a 3rd party with our source.
Any other recommended options / experiences out there for accessing our SVN repos that would be as simple as Hamachi for setup; but be more tolerant of network latency issues?
Perhaps it's a bit much to ask of your team, but if you have a distributed team then you could switch to a distributed version control system (e.g. Mercurial or Git). These don't need to use the network so much, so you won't suffer from latency problems. It is an entirely new paradigm, though, and your team's development processes will have to change, so you might not consider it appropriate in your case.
First I should ask why you need a VPN in the first place. Subversion can operate over HTTPS, so as long as you open the proper port on the server there shouldn't be any security or connectivity issues.
Assuming that you do need a VPN, I find it difficult to believe that an administrator uncomfortable with Apache would be more comfortable installing a whole new VPN system (much more complicated and tricky, in my estimation).
I have a program that is a converter for .NET and can be used in other .NET projects.
I have two kinds of license:
Developer license for DESKTOP software
Developer license for WEB server deployed software.
How can I protect my program so that if a client buys license (1) he CANNOT use it on the SERVER?
Disclaimer: I don't know anything about .Net, other than how to spell it, and I'm not completely sure about that.
It seems like one difference between a person using your file converter on their desktop and using it on a web server is that only a single instance will be running at a time on the desktop; a web page will probably have multiple instances, one per concurrent request. This seems like something you could enforce in software, and also something you could easily write into a license agreement.
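As a rough illustration of the single-instance idea (a sketch only; the mutex name is made up), a named mutex is the usual way to enforce one running copy per machine:

```csharp
using System;
using System.Threading;

class SingleInstanceDemo
{
    static void Main()
    {
        bool createdNew;
        // The Global\ prefix makes the mutex visible across all sessions on the machine.
        using (var mutex = new Mutex(true, @"Global\MyConverterSingleInstance", out createdNew))
        {
            if (!createdNew)
            {
                Console.WriteLine("Another instance is already running; the desktop license allows one.");
                return;
            }

            // ... normal converter work here, while the mutex is held ...
        }
    }
}
```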
Does IIS run with a graphical console on Windows? If it doesn't, and your desktop version does, maybe you could detect that?
Ultimately, though, if someone wants to get around your server/desktop distinction badly enough, they're going to; they could, for example, have the web server send the document to a desktop machine, and have the desktop send it back to the server. So, at some point, you'll have to give in and either ignore it or say that's a problem for legal to handle.
If it is desktop software (I'm not sure from the question and the tag), you could use the Environment object to check which OS the code is running on and stop it from running on server technology. This won't help if they run a server on XP or the like, but it's a start.
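A sketch of that kind of check (heuristics only, and easy for a determined user to defeat) might combine the ASP.NET hosting flag, the interactive-desktop flag, and the OS product name from the registry, since Environment.OSVersion by itself cannot distinguish client from server editions:

```csharp
using System;
using Microsoft.Win32;
using System.Web.Hosting; // add a reference to System.Web

static class LicenseGuard
{
    // Returns true when the converter appears to be running on a server / web host.
    public static bool LooksLikeServerUsage()
    {
        // ASP.NET sets this for any hosted application domain (IIS, the dev web server, ...).
        if (HostingEnvironment.IsHosted)
            return true;

        // IIS worker processes and Windows services run without an interactive desktop.
        if (!Environment.UserInteractive)
            return true;

        // Fall back to the OS product name, e.g. "Windows Server 2008 R2 Enterprise".
        using (RegistryKey key = Registry.LocalMachine.OpenSubKey(
            @"SOFTWARE\Microsoft\Windows NT\CurrentVersion"))
        {
            string product = key != null ? key.GetValue("ProductName") as string : null;
            if (product != null && product.Contains("Server"))
                return true;
        }

        return false;
    }
}
```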
It's one of those things I see a lot but never really think about. For the purpose of web application development (specifically ASP.NET WebForms/MVC), do you think it's advantageous to virtualize, and if so, what kind of advantages come out of it?
By virtualization I mean using products like Hyper-V to separate server contexts, such as your SQL and web servers.
First question is, virtualization of what? Do you mean server virtualization? Do you mean running VMWare on each dev's laptop with multiple OSes? Do you mean moving everything to the cloud?
Virtualization of servers, in a web-app context, is not really different from virtualization in general IT - most of the servers on the Internet, including Stack Overflow's, are bought to handle peak loads and spend most of their time idling away the cycles, so virtualizing them makes sense once you have more than a certain number.
VMWare on the desktop (or its parallels on other operating systems) is superb because your devs can run a full instance of your server environment, including multiple virtual servers connected in a virtual network - this is about as close to the real thing as you can get, minus hardware costs and minus devs messing with each other's servers. For clients, you can use Linux and multiple Windows installs to test various browsers, font sizes, etc. quickly - also a big win.
Moving everything to the cloud makes sense in many cases, but is probably a topic for a separate full-sized question :)
One big advantage I see is that every developer can have his/her own sandbox to work in. If someone messes up his/her sandbox, he/she can take a clean image and all is OK again. So I guess that means there is room to experiment without losing valuable time getting back to the normal setup; you can simply do a rollback.
I'm a bit doubtful about whether you should use virtualisation for production environments, though. It depends on the application, of course.
The only time I would use a virtual machine for ASP.NET development is if the app required a specific setup, such as relying on installed software, weird settings or particular shares. Every developer has their own web server and can run their own database, so if it's a "basic" web app I don't see much value in virtual machines... it's pretty hard to break anything with a basic web app deployment :)
With a virtual server, you can test your code in a production-like environment. It is also possible to quickly revert back to the original setup. For many applications, it is useful in that time period just after you write the code, but before it goes to production.
I'm a fan of virtualization and use it in testing and production (VMWare and Hyper-V), but over the last year I have found it less important on a dev machine. TFS provides me with all the backup/rollback ability that I need, multiple versions of .NET can now exist on the same machine, and VS2008 can target all those versions.
In a development environment a virtual environment is useful to put several different servers on one box, you can have an instance for your web app, one for your services, one for database, etc. That way it mimics your production environment if you are using separate servers.
One of the benefits of using virtualization in production is that your application is not tied to a specific machine. If you wanted to move your web server instance to another box, it is trivial to do so. You don't need to install or configure things on the new server and hope that everything is set up properly.
One problem I have had though in testing virtual instances is that it can run slower for some applications, specifically engineering apps that like running the CPU at 100%. So test before you leap.