Our customer uses an SSO module that sets the current user ID in the HTTP_IV_USER server variable. I have inherited code from an application that was patched to allow automatic login, and I have made many modifications to it over the past year, so it works fine for customers with a standard setup.
Now I am asked to deploy the application at a customer site with the SSO plugin, so I need to test whether the patch still works. Unfortunately, the customer won't allow us to access their systems for testing (they will install our package in QA and do their QA independently of us). Nor will they provide us details on the SSO (e.g. SiteMinder version xxxx) because "you [i.e. us] already have the code".
I just need to perform a few login tests with my development version to ensure that it will work in their environment, nothing more.
I am simply asking: how can I emulate Request.ServerVariables.GetValues("HTTP_IV_USER")[0]? How does IIS (which is going to be upgraded to 7.5 as part of the software upgrade process) set that variable?
I have tried to do my homework and found claims that HTTP_IV_USER is not a request header (which could be set using Firebug or other tricks) but an environment variable.
Answering for posterity.
The trick is that HTTP_IV_USER really is an HTTP header, exactly like Request.Headers["IV_USER"]. Being able to emulate the IV_USER header opens the gate to emulating HTTP_IV_USER, because IIS translates all incoming HTTP headers into HTTP_-prefixed server variables.
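To sanity-check this on a development box, here is a minimal C# sketch (the URL, the header spelling, and the user name are assumptions for a local test instance) that forges the header; on the server side, Request.ServerVariables["HTTP_IV_USER"] should then return the forged value:

    using System;
    using System.Net.Http;

    class FakeSsoTest
    {
        static void Main()
        {
            using (var client = new HttpClient())
            {
                var request = new HttpRequestMessage(HttpMethod.Get,
                    "http://localhost/myapp/default.aspx"); // assumed dev URL

                // IIS surfaces this request header to the application
                // as the HTTP_IV_USER server variable.
                request.Headers.Add("IV_USER", "testuser01"); // assumed user

                var response = client.SendAsync(request).Result;
                Console.WriteLine(response.StatusCode);
            }
        }
    }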
I've recently created two C# console applications. The first transforms a bunch of command outputs into an XML file, and the second transforms the XML into a Word document using a template.
I'd like to know how I could get this onto the web, i.e. having a web page where the command output can be uploaded, the two-step conversion executed, and finally the Word document made available for download.
Should the web page be created in ASP.NET or are there other (better) options? Do I need to rewrite the console applications in some other format?
This question is fairly broad, with plenty of room for novel-sized explanations, but here's a brief high-level walkthrough of what likely needs to happen to achieve the proposed results (language agnostic):
Get a hosting provider that allows users to spin up their own machine (e.g. AWS).
Spin up a machine that is compatible with the "console" programs in question.
Install "console" programs on machine.
Install a programming language (i.e. Node.js, PHP, ASP.NET, even C# could do) on the machine.
Install a web server (i.e. NGINX, Apache) on machine, configure it to serve public requests and run with chosen language.
On server request, execute appropriate commands from within the chosen language. Languages typically come with a exec method (i.e. in node.js: require('child_process').exec(command,options,callback))
Get the results of said commands and send it back to the client. Alternatively (for downloads), write the result to a path on the system that is publicly available to the internet and redirect the user to that url (additional configuration might be required to make sure the browser downloads the file as oppose to just serving it).
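To make the last two steps concrete, here is a minimal ASP.NET (C#) sketch. The executable paths, argument conventions, and upload field name are all assumptions; your real console apps may well take different arguments:

    using System.Diagnostics;
    using System.Web;

    public class ConvertHandler : IHttpHandler
    {
        public bool IsReusable { get { return false; } }

        public void ProcessRequest(HttpContext context)
        {
            // Save the uploaded command output to a work folder.
            HttpPostedFile upload = context.Request.Files["commandOutput"];
            string input = context.Server.MapPath("~/App_Data/input.txt");
            string xml = context.Server.MapPath("~/App_Data/output.xml");
            string doc = context.Server.MapPath("~/App_Data/result.docx");
            upload.SaveAs(input);

            // Run the two converters in sequence (assumed CLI contracts).
            Run(@"C:\tools\OutputToXml.exe", "\"" + input + "\" \"" + xml + "\"");
            Run(@"C:\tools\XmlToWord.exe", "\"" + xml + "\" \"" + doc + "\"");

            // Return the document as a download.
            context.Response.ContentType =
                "application/vnd.openxmlformats-officedocument.wordprocessingml.document";
            context.Response.AddHeader("Content-Disposition",
                "attachment; filename=result.docx");
            context.Response.TransmitFile(doc);
        }

        private static void Run(string exe, string args)
        {
            var info = new ProcessStartInfo(exe, args) { UseShellExecute = false };
            using (Process p = Process.Start(info))
            {
                p.WaitForExit();
            }
        }
    }

In production you would also want unique temp file names per request and a timeout around WaitForExit().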
The steps above should get you pretty close to what you want. As for your questions:
Should the web page be created in ASP.NET or are there other (better) options?
The "better" options is whatever you feel most comfortable with at the moment, you could always change it later with reasonable effort (assuming that your "console" apps are not unsuspecting unicorns).
Do I need to rewrite the console applications in some other format?
No, unless you have strong reasons to do so (e.g. multi-environment compatibility). You could also rewrite them to simplify things significantly (e.g. bypass the CLI work entirely and do everything in C#).
Try thinking through these high-level steps, begin working on an implementation, and post more specific questions here on Stack Overflow when you get stuck.
I hope that helps!
I am wondering if there exists a web analytics product that meets these criteria:
Can be installed on a private server, not using anything cloud-based
Can be installed on IIS/.Net and does not require PHP, or any server side language beyond ASP.NET
Can use a local SQL Server as its datastore (not MySQL or any kind of cloud storage)
Can work with an internal intranet web application without a fully qualified domain name
Can track page views and button clicks using simple JavaScript and/or C# APIs
Is free or at least cheap
I am trying to avoid installing PHP on IIS to run Piwik or something similar, so this is a last-ditch effort. My searches are turning up nothing.
The answer to this question is no.
Until a few years ago, Webtrends (www.webtrends.com) provided exactly what you are looking for (though not for free).
I am not sure, though, whether the On Premises version of their software is still available.
Hope this helps!
We are migrating from WebSphere BPM 8.0.1.3 to 8.5.6, and our plan is to move application by application rather than in a big bang. The idea is that when we move an application to the new server, we create an IHS rule which redirects the related URLs to the new server. That means we keep some applications running on the old server while others are already migrated to the new one.
Is this possible to achieve? Or is there any alternative to rewriting IHS rules, such as making use of the WebServer plug-ins?
Unfortunately, I don't think that your current approach is going to work well for you. I've outlined the various options for IBM BPM upgrades here. I see several major problems with your approach, all of which come down to the fact that many of the URLs used by IBM BPM contain no details about the context for the request.
The first issue I see is that IBM uses a portal for a given user's work. That is, all their tasks across the various BPM solutions appear in the same web UI, and this URL does not differ across the Process Applications in the install. This means all your users get their task list by going to a URL like https://mybpmserver/portal. There isn't a way to tell which process app a given user may be working with in this context, so you don't know whom to redirect to the new server.
The second issue is that users are able to work with multiple process apps, so even if the context were known in the above URL, you would run into complexities with users working in two different process apps unless both have been migrated.
The third issue is that BPM is essentially a state engine. IBM does not supply a way to "migrate" that state from an old install to a new install on a per-Process-App (PA) basis; you have to migrate all or none. Assuming "none", because it feels like you want to follow the drain approach in my article, you then have the problem that the URLs for executing a task carry no PA context, so you won't know which server to direct which task to. That is, for a given PA you will have tasks on both the old server (those which existed before the upgrade) and the new server (those created after the upgrade), but the URLs for these tasks will look essentially the same.
There are additional issues, but the main one comes down to properly understanding how the runtime BPM engines work. Some of the above issues may be mitigated if you have a separate UI layer for presenting the tasks to the users (my company makes a portal replacement that can do this), which would permit it to understand the context of the tasks; but if you have that, then you can get the correct behavior in that code and not worry about WAS configuration settings.
You could use the plugin-cfg.xml merge tool on the two generated plugin-cfg.xml files. That way the WAS Plugin would always know which server has which applications.
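For illustration only (the file is generated, and every name below is invented), the merged plugin-cfg.xml ends up with a UriGroup and a Route per cluster, which is what lets the plug-in send each application's URLs to the right server:

    <UriGroup Name="old_cell_URIs">
      <Uri Name="/legacyApp/*"/>
    </UriGroup>
    <UriGroup Name="new_cell_URIs">
      <Uri Name="/migratedApp/*"/>
    </UriGroup>

    <Route ServerCluster="old_cell_cluster" UriGroup="old_cell_URIs"
           VirtualHostGroup="default_host"/>
    <Route ServerCluster="new_cell_cluster" UriGroup="new_cell_URIs"
           VirtualHostGroup="default_host"/>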
In Azure Websites we have a staging feature: we can deploy to a staging site, test it, fill all caches, and then swap production with staging.
Now, how could I do this on a normal Windows Server with IIS?
Possible Solution
One strategy I was thinking about is having a script which copies content from one folder to another. But there can be file locks, and since this is not transactional, the websites are for some time in a kind of invalid state.
First problem: I have an external load balancer, but it is externally hosted and unfortunately currently not able to handle this scenario.
Second problem: As I want my scripts to always deploy to staging, I want a fixed name in IIS for the staging site, which I use in the build server scripts. So I would also have to rename the sites.
Third problem: The sites are synced between multiple servers for load balancing. If I rebuilt the bindings on a site (to have a consistent staging server), I could get timing issues, because not all servers would be pointed at the same folder at the same time.
Are there any extensions / best practices on how to do that?
You have multiple servers, so you are running a distributed system. It is impossible in principle to have an atomic release of the latest code version. Even if you made the load balancer atomically direct traffic to the new sites, some old requests would still be in flight. You need to be able to run both code versions at the same time for a small window. This capability is a requirement for your application. It is also handy to be able to roll back bad versions.
Given that requirement you can implement it like this:
Create a staging site in IIS.
Warm it up.
Swap bindings and site names on all servers; see the sketch below. This does not need to be atomic because, as explained above, it is impossible for this to be atomic.
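As a sketch of the swap step, something along these lines could run on each server using the Microsoft.Web.Administration API (the site names "Production" and "Staging" are assumptions; the program needs a reference to Microsoft.Web.Administration.dll, which ships with IIS, and elevated rights):

    using System.Linq;
    using Microsoft.Web.Administration;

    class SwapBindings
    {
        static void Main()
        {
            using (var iis = new ServerManager())
            {
                Site prod = iis.Sites["Production"];  // assumed site names
                Site stage = iis.Sites["Staging"];

                // Remember each site's bindings as (information, protocol) pairs.
                var prodBindings = prod.Bindings
                    .Select(b => new { Info = b.BindingInformation, b.Protocol })
                    .ToList();
                var stageBindings = stage.Bindings
                    .Select(b => new { Info = b.BindingInformation, b.Protocol })
                    .ToList();

                // Swap them: staging's bindings go to production and vice versa.
                prod.Bindings.Clear();
                stage.Bindings.Clear();
                foreach (var b in stageBindings)
                    prod.Bindings.Add(b.Info, b.Protocol);
                foreach (var b in prodBindings)
                    stage.Bindings.Add(b.Info, b.Protocol);

                // Applied in one commit per server, but not across the farm.
                iis.CommitChanges();
            }
        }
    }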
As explained via Skype, you might like to have a look at "reverse proxy IIS". The following article actually looks very promising:
http://weblogs.asp.net/owscott/creating-a-reverse-proxy-with-url-rewrite-for-iis
This way you might set up a public-facing "frontend" website which can easily be switched between two (or more) private/protected sites, even if they reside on the same machine. Furthermore, this would also allow you to have two public-facing URLs that are simply swapped depending on your requirements and deployment.
Just an idea... I haven't tested it in this scenario, but I'm running a reverse proxy on Apache publicly and serving a private IIS website through VPN as content.
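For reference, the article's approach boils down to a web.config fragment along these lines (it requires the URL Rewrite module plus Application Request Routing with proxying enabled; the backend host name is a placeholder):

    <system.webServer>
      <rewrite>
        <rules>
          <rule name="ReverseProxyToStaging" stopProcessing="true">
            <match url="(.*)" />
            <!-- Point this at whichever private site is currently "live". -->
            <action type="Rewrite" url="http://internal-staging-host/{R:1}" />
          </rule>
        </rules>
      </rewrite>
    </system.webServer>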
I am the webmaster at our company and we are in the process of picking a new web hosting company. The old company sold us a hosting package years ago and has since left us on the hardware we were given back then: a Pentium 3 box, 1 GB RAM, Windows 2000 Server. They told us that we would have to pick a new hosting package and pay more money to get newer hardware. I only found out about this because Microsoft's Site Server, which we use to replicate our site from dev to prod, now gives us trouble because it uses an unsigned Java app, which will soon no longer be supported. All this, and the company pays over $300 a month. Ouch.
The problem I am having is this: on the Windows 2000 Server machine there is an indexing service that is leveraged to generate a catalog of the site, which is used as part of a site search feature. I've contacted several web hosting companies, and when I ask about the indexing service I am told that they can't provide me the same catalog. Some hosts tell me I can get the service if I purchase a VPS account as opposed to the cheaper shared service.
What I'd like to know is whether there is a different way to go about developing a search feature for my site. Is there a way to create a search feature that does not use the indexing service?
If your website content comes from a database, then it would be possible to develop your own search facility in ASP and SQL.
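As a rough sketch of that idea (shown in C# here; the Pages table and its columns are hypothetical, and classic ASP with ADO would follow the same pattern), the search itself can be a simple parameterised query:

    using System;
    using System.Data.SqlClient;

    class SiteSearch
    {
        static void Main(string[] args)
        {
            string term = args.Length > 0 ? args[0] : "example";

            using (var conn = new SqlConnection(
                "Server=.;Database=MySite;Integrated Security=true")) // assumed
            using (var cmd = new SqlCommand(
                "SELECT Url, Title FROM Pages WHERE Title LIKE @q OR Body LIKE @q",
                conn))
            {
                // Parameterised to avoid SQL injection.
                cmd.Parameters.AddWithValue("@q", "%" + term + "%");
                conn.Open();
                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        Console.WriteLine("{0} - {1}",
                            reader["Title"], reader["Url"]);
                }
            }
        }
    }

A real implementation would add paging, and SQL Server full-text indexing would scale better than LIKE for large sites.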
If your content is static pages, then there are external website services available to index and search these too, similar in functionality to what you are using now but hosted externally. A Google search for "search your website" will bring up many such products.
Another similar option is to create a Google Custom Search (paid and free options available), which will index your site; it is easy to add a form to your pages to provide this search function.
I imagine you are referring to the Microsoft Indexing Service, which has actually been a built-in component since the release of Windows Server 2003. Referred to as the Microsoft Windows Search Service, it is installed by default on some versions of Windows Server and is an optional component on others (optional just as IIS is considered an optional component of Windows Server at installation). Prior to Windows Server 2003, it was a separate download on microsoft.com as Windows Search Server. Once installed, depending on the size and number of documents on the server, it may take several hours before an initial search index is built. Before the index has finished building, the search feature will not return all (or any) expected results.
I mention all this because I have actually found it installed by default on hosts we have used in the past, without asking. So I am assuming the hosts you have inquired with may not realize you are referring to this built-in component of Windows Server, and you might want to clarify that with your preferred host(s).
I looked into several alternative methods for developing the search function, but none of them would work with shared hosting and integrate with the site the way it is hosted now. I've decided to focus on VPS hosting, as I can install the Indexing Service there and have the page function as it does now on my old host, which runs my site on a Windows 2000 Server machine. To test the Indexing Service's functionality, I installed the service on my Windows 7 dev machine and learned two things:
The search page only functions in a 64-bit environment. This means that I have to move the search page to a new folder and use a 64-bit application pool to get that page to run.
In 64-bit mode, the line Set rstResults = objQuery.CreateRecordset("nonsequential") returned the error "No such interface supported". Googling revealed that a Windows update had broken this functionality and that a hotfix was released to fix the error. The hotfix, #2740317, is located here: http://support.microsoft.com/kb/2740317
Now my search function works and I get results. The only problem is that the results point to file:///c:/Inetpub/... instead of website/path/page.html. I had to extract the path field from the recordset and use the Replace function to remove the physical path up to the folder containing my site. I now get a relative link that points to the correct files on the site.
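Sketched in C# for clarity (my page does the same with VBScript's Replace; the physical root folder here is an assumption), the clean-up amounts to this:

    using System;

    class PathToUrl
    {
        static void Main()
        {
            // Turn the physical path from the Indexing Service catalog into a
            // site-relative URL. Both paths are assumptions for illustration.
            string physicalPath = @"c:\Inetpub\wwwroot\docs\page.html";
            string siteRoot = @"c:\Inetpub\wwwroot";

            string relativeUrl = physicalPath
                .Replace(siteRoot, string.Empty) // strip the physical prefix
                .Replace('\\', '/');             // web-style separators

            Console.WriteLine(relativeUrl); // prints "/docs/page.html"
        }
    }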