Replacement for Microsoft Index Server? - asp.net

We're moving a legacy .ASP application to a new hosting provider that doesn't support Microsoft Index Server, on which one portion of the site depended. The application has a directory tree containing around 10,000 documents (text, MS Word and PDF) whose contents need to be indexed and to be searchable.
The application is staying classic .ASP for now but the search portion could be written in anything. We tried a tool called SiteSearchASP.Net but that number of documents was outside its reach.
A Google appliance is outside the client's budget, and these documents need to stay private so Google search isn't an option.
Anyone have experience with anything that might work?

Microsoft Search Server 2008 Express is free, much like the other great express products. Easy to configure, powerful and definitely within your budget ($0).

Try Lucene.Net, or Lucene.Net combined with SQL Server.

I have been investigating the same question for my own line-of-business application that uses Index Server, since it has been dropped from Windows Server 2012.
Alas, Windows Search is not really the successor to Index Server; it lacks interfaces for configuring multiple catalogs, among other things. The interface provided is oriented toward searching all the content on a workstation, rather than serving as a platform for content searching in a line-of-business app.
MS Search Server is more complex to set up than Index Server, and is oriented toward URL crawling rather than file searching. The versions I looked at did not seem to provide the flexible API of Index Server.
The Lucene.Net toolkit is attractive, but you have to write a lot of infrastructure around it to make it work. It is not an out-of-the-box tool in the way Index Server was. It does offer the potential of a much better integrated solution than you could achieve with Index Server, if you have the time to invest.
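As a rough illustration of what that infrastructure looks like, here is a minimal index-and-search round trip against the Lucene.Net 3.x API (the index path, field names, and query term are placeholder values, and extracting text from Word and PDF files requires a separate component, e.g. an IFilter wrapper or a PDF library):

using System;
using System.IO;
using Lucene.Net.Analysis.Standard;
using Lucene.Net.Documents;
using Lucene.Net.Index;
using Lucene.Net.QueryParsers;
using Lucene.Net.Search;
using Lucene.Net.Store;
using Version = Lucene.Net.Util.Version;

class LuceneSketch
{
    static void Main()
    {
        var indexDir = FSDirectory.Open(new DirectoryInfo(@"C:\SearchIndex")); // placeholder location
        var analyzer = new StandardAnalyzer(Version.LUCENE_30);

        // Indexing: one Lucene document per file; text extraction from Word/PDF is up to you.
        using (var writer = new IndexWriter(indexDir, analyzer, true, IndexWriter.MaxFieldLength.UNLIMITED))
        {
            var doc = new Document();
            doc.Add(new Field("path", @"docs\example.txt", Field.Store.YES, Field.Index.NOT_ANALYZED));
            doc.Add(new Field("content", "extracted document text goes here", Field.Store.NO, Field.Index.ANALYZED));
            writer.AddDocument(doc);
            writer.Optimize();
        }

        // Searching: parse the user's query against the "content" field and list matching paths.
        using (var searcher = new IndexSearcher(indexDir, true))
        {
            var parser = new QueryParser(Version.LUCENE_30, "content", analyzer);
            var hits = searcher.Search(parser.Parse("invoice"), 20); // placeholder query, top 20 hits
            foreach (var hit in hits.ScoreDocs)
                Console.WriteLine(searcher.Doc(hit.Doc).Get("path"));
        }
    }
}

The real work is in walking the directory tree, extracting text from each file type, and keeping the index up to date as documents change.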
dtSearch is quite close to the concept of Index Server, but costs significant money. This is probably the easiest option if it is cost-effective.
Index Server was the unsung hero of the original Cairo project. Perhaps some of the underlying engine lives on in the 'successor' products, but it is sad to lose it from Windows Server 2012. Microsoft has been very effective in recent years at monetizing its server business; I feel this may be one of the casualties of that strategy.

Perhaps "Windows Search"? It's the successor to Indexing Service.

You want the Windows Search Service, which is the current incarnation of Index Server. The Search Service is available on Windows Server 2008 and 2012; however, it is not installed by default.
Note that the Search Service is distinct from Search Server. Search Server is a different animal, with different APIs (but a similar overall product goal).
Search Service details:
https://stackoverflow.com/a/23742911/147637
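For a flavour of how it is used: the Search Service exposes its catalog through an OLE DB provider, so it can be queried with a SQL-like dialect from ASP.NET (or from classic ASP via ADO). A rough C# sketch, where the folder path and search term are placeholders:

using System;
using System.Data.OleDb;

class WindowsSearchSketch
{
    static void Main()
    {
        // Windows Search exposes the catalog through this OLE DB provider.
        const string connectionString =
            "Provider=Search.CollatorDSO;Extended Properties='Application=Windows';";

        // SCOPE limits the query to one directory tree; the path and term are examples.
        const string sql = @"SELECT System.ItemPathDisplay
                             FROM SYSTEMINDEX
                             WHERE SCOPE = 'file:C:/Documents'
                               AND CONTAINS(*, '""invoice""')";

        using (var connection = new OleDbConnection(connectionString))
        using (var command = new OleDbCommand(sql, connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine(reader.GetString(0)); // full path of each matching document
            }
        }
    }
}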

Related

Are there any non-cloud web analytics platforms for IIS/.Net?

I am wondering if there exists a web analytics product that meets these criteria:
Can be installed on private server, not using anything cloud-based
Can be installed on IIS/.Net and does not require PHP, or any server side language beyond ASP.NET
Can use a local SQL Server as its datastore (not MySQL or any kind of cloud storage)
Can work with an internal intranet web application without a fully qualified domain name
Can track page views and button clicks using simple JavaScript and/or C# APIs
Is free or at least cheap
I am trying to avoid installing PHP on IIS to run Piwik or something similar, and this is a last-ditch effort. My searches are turning up nothing.
The answer to this question is no.
Until a few years ago, Webtrends (www.webtrends.com) provided exactly what you are looking for (though not for free).
I am not sure, though, whether the on-premise version of their software is still available.
Hope this helps!

Windows Server 2008 and classic ASP catalog queries

I am the webmaster at our company and we are in the process of picking a new web hosting company. The old company sold us a hosting package years ago and has since left us on the hardware we were given back then: a Pentium 3 box, 1GB RAM, Windows 2000 Server. They told us that we would have to pick a new hosting package and pay more money to get newer hardware. I only found out about this because Microsoft's Site Server, which we use to replicate our site from dev to prod, now gives us trouble because it uses an unsigned Java app, which will soon no longer be supported. All this, and the company pays over $300 a month. Ouch.
The problem I am having is this: on the Windows 2000 Server machine there is an indexing service that is leveraged to generate a catalog of the site, which is used as part of a site search feature. I've contacted several web hosting companies, and when I ask about the indexing service I am told that they can't provide me the same catalog. Some hosts tell me I can get the service if I purchase a VPS account as opposed to the cheaper shared service.
What I'd like to know is whether there is a different way to go about developing a search feature for my site. Is there a way to create a search feature that does not use the indexing service?
If your website content comes from a database then it would be possible to develop your own search facility in ASP and SQL.
If your content is static pages, then there are external services available to index and search them, similar in functionality to what you are using now. A Google search for "search your website" will bring up many such products.
Another option is to create a Google Custom Search (paid and free options are available), which will index your site; it is easy to add a form to your pages to provide this search function.
I imagine you are referring to the Microsoft indexing service, which has actually been a built-in component since the release of Windows Server 2003. Referred to as the Microsoft Windows Search Service, it is installed by default on some versions of Windows Server and is an optional component on others (optional just as IIS is considered an optional component of Windows Server at installation). Prior to Windows Server 2003, it was a separate download from microsoft.com as Windows Search Server. Once installed, depending on the size and number of documents on the server, it may take several hours before an initial search index is built. Until the index has finished building, the search feature will not return all, or any, of the expected results.
I mention all this because I have actually found it installed by default on hosts we have used in the past, without asking. So I am assuming the hosts you have inquired with may not realize you are referring to this built-in component of Windows Server, and you might want to clarify that with your preferred host(s).
I looked into several alternative methods for developing the search function, but none of them would work on shared hosting while integrating with the site the way it is hosted now. I've decided to focus on VPS hosting, as I can install the indexing service and have the page function as it does now on my old host, which runs my site on a Windows 2000 Server machine. To test the indexing service's functionality, I installed it on my Windows 7 dev machine and learned two things:
The search page only functions in a 64-bit environment. This means that I have to move the search page to a new folder and use a 64-bit application pool to get that page to run.
In 64-bit mode, the code line "Set rstResults = objQuery.CreateRecordset("nonsequential")" was returning an error, "No such interface supported". Googling this revealed that a Windows update breaks this functionality and that a hotfix was provided to fix the error. The hotfix, #2740317, is located here: http://support.microsoft.com/kb/2740317
Now my search function works and I get results. The only problem is that the results point to file:///c:/Inetpub/... instead of website/path/page.html. I had to extract the path field from the recordset and use the Replace function to remove the physical path up to the folder containing my site. I now get a relative link that points to the correct files on the site.
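In .NET terms, that last fix boils down to a string replacement along these lines (the paths are example values; the live page does the same thing with VBScript's Replace):

using System;

class PathToUrlSketch
{
    static void Main()
    {
        var physicalPath = @"C:\Inetpub\wwwroot\mysite\products\page.html"; // value returned by the catalog
        var siteRoot = @"C:\Inetpub\wwwroot\mysite";                        // physical folder containing the site

        var relativeUrl = physicalPath
            .Replace(siteRoot, string.Empty) // strip the physical path up to the site folder
            .Replace('\\', '/');             // switch to URL-style separators

        Console.WriteLine(relativeUrl); // /products/page.html
    }
}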

Migrate Access to ASP.NET

The current application is a kind of CRM application built on MS Access, for internal use. My job is to migrate it to an ASP.NET web-based application. My boss now requires that we keep Access as the database and develop the ASP.NET code against it.
My question is: are there any disadvantages to using Access as the database in an ASP.NET application (e.g. optimistic concurrency issues)? Should I persuade my boss to upgrade from Access to MS SQL?
Many thanks!
We've used Access as a backend for web sites with good success. It's cheap, can be used effectively by moderately skilled programmers, and you can store the MDB on a document server so it gets backed up.
Most IT people dislike Access, but from a business perspective, Access can be very valuable.
MS Access is notoriously unstable in multi-user environments, and a web app is by definition heavily multi-user.
So, IMHO, leaving MS Access as the underlying DB is asking for trouble. At least use SQL Server Express (it is free).
The problem you are going to face in upgrading from Access to MS SQL is that there is a major cost investment for the application. If your company already has the infrastructure in place (licensing, hardware...), then you won't have such a hard fight to persuade your boss.
As for a technical answer:
I'd say you need to let your boss know that Access databases aren't ideal for the concurrent usage that a web application implies. My view is that Access is for data that a SMALL set of users will use for light data entry and querying. NEVER use Access to build an enterprise-level solution.
If you are planning to upgrade a Microsoft Access database to SQL Server 2008, use the SQL Server Migration Assistant (SSMA) rather than the upsizing wizard built into MS Access.
See also: 10+ tips for upsizing an Access database to SQL Server
Your boss probably likes to do ad-hoc work with Access / Excel. If you move the DB to SQL Server Express, you can use Access and its linked table feature to let your boss keep meeting his ad-hoc needs through Access while keeping the data in SQL Server Express. If you keep the linked tables named the same as the old physical ones, all his reports and queries should keep working.
I'm an Access promoter, but not for use on websites, because Jet/ACE is not threadsafe (though Michael Kaplan once said that it is threadsafe if you access it via ADO/OLEDB; I don't quite understand how a database abstraction layer can wash away a characteristic of the underlying database engine it's calling, but if MichKa said it, it's 99% likely to be true).
Now, the exceptions would be if you're using it for prototyping something that will use a different database, or if it's read-only, or is read-write but will only ever have a very small number of users.
Michael Kaplan's website, trigeminal.com, used to use a Jet database as the back end (it may still -- I don't know that MichKa ever changed it), and when that was his main website he reported getting 100K hits a day. But it's a read-only site, so fits my restrictions.
There are so many different alternatives, most of them easy to use, that I just don't see the point of trying to use Jet/ACE as the back end for a website. I'd never do it myself (all the websites I'm responsible for use MySQL).
Simply put, go with MS SQL. The Express edition is free and will give you everything you need to migrate away from Access. These articles are talking about Access applications specifically, but the same issues will plague you.
http://resources.zdnet.co.uk/articles/features/0,1000002000,39285074,00.htm
https://web.archive.org/web/1/http://techrepublic%2ecom%2ecom/5208-6230-0.html?forumID=102&threadID=205509&messageID=2136367
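One practical note on the migration itself: if the data access is written against ADO.NET's provider-agnostic classes, the later switch from the .mdb file to SQL Server Express is mostly a configuration change. A sketch, assuming hypothetical connection strings and table names:

using System;
using System.Data.Common;

class ProviderSwitchSketch
{
    static void Main()
    {
        // Start with the Access .mdb ...
        var providerName = "System.Data.OleDb";
        var connectionString = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\crm.mdb;";

        // ... and later switch to SQL Server Express by changing only these two values
        // (normally read from web.config):
        // providerName = "System.Data.SqlClient";
        // connectionString = @"Data Source=.\SQLEXPRESS;Initial Catalog=Crm;Integrated Security=True;";

        var factory = DbProviderFactories.GetFactory(providerName);
        using (var connection = factory.CreateConnection())
        using (var command = factory.CreateCommand())
        {
            connection.ConnectionString = connectionString;
            command.Connection = connection;
            command.CommandText = "SELECT COUNT(*) FROM Customers"; // table name is an example
            connection.Open();
            Console.WriteLine(command.ExecuteScalar());
        }
    }
}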

Is there an ASP.NET full text search system for websites?

We host websites in a shared hosting environment where Microsoft SQL Server full text searching is not allowed. We would love an ASP.NET API that allowed similar functionality to get around this restriction.
We can't easily install software on the shared servers, so the API would have to be written in ASP.NET.
SQL "like" queries are our alternative and they are fast enough (our websites never exceed more than 50Mb of text) but they don't rank results well, have a dictionary, do stemming etc
For this type of circumstance I'd rely on Google and create a proper sitemap. You can integrate Google search right into your website too with Google Site Search.
If you need more control over full-text search, you can use features of the RDBMS to support this. You don't say which brand of RDBMS you're using. I assume it's likely Microsoft SQL Server if you're using ASP.NET.
See the docs for Full-Text Search at MSDN.
For other brands of RDBMS, see my answer: How best to develop the sql to support Search functionality in a web application?
Lucene is what we were looking for: http://incubator.apache.org/lucene.net/

Profiling/Optimizing (SharePoint 2007) Web Parts

I just wonder what options there are to properly measure/profile/optimize ASP.NET 2.0 web parts, especially ones for SharePoint 2007.
As web parts are a layer on top of another layer of technology, getting resource usage, open handles, and so on for just the web part seems to be a bit difficult.
Does anyone know some good tools or practices for profiling and optimizing web parts?
I've had success profiling SharePoint 2010 with EQATEC Profiler. Bonus is that they have a free edition. Since it worked in SharePoint 2010, I expect it will work with SharePoint 2007.
Here's how I got it working with SharePoint 2010: http://blogs.visigo.com/chriscoulson/performance-profiling-a-sharepoint-2010-project-using-eqatec-profiler/
Back when we started with SP 2003, we used to worry about apps or web parts not closing connections. We used the following query to check whether the base number of connections (not counting the initial spike) would increase as the app was used on the development server:
SELECT hostname, sysdatabases.name, sysprocesses.status, last_batch
FROM sysprocesses, sysdatabases
WHERE sysprocesses.dbid = sysdatabases.dbid
  AND nt_username = 'SP Service Account'
  AND (hostname = 'WFE1' OR hostname = 'WFE2')
  AND sysprocesses.dbid = 10
ORDER BY last_batch DESC
(replace the host names, service account name, and dbid with values appropriate for your environment)
We haven't tried this since the upgrade to MOSS though.
I have found that separating out all the business logic into a separate DLL that is easily unit-testable has been the easiest method for me. But to be honest, there is really no good way that I have found beyond what I have just mentioned. The same has been true for me with Facebook applications recently. I think this is common for any application that runs inside another platform, especially when performance and testing were never a goal when the platform developers started to build the system.
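To make that concrete, the split amounts to keeping the web part a thin shell over a class you can profile and unit-test outside SharePoint. A minimal sketch (all type and member names here are invented for illustration):

using System.Web.UI;
using System.Web.UI.WebControls.WebParts;

// Lives in its own assembly with no SharePoint references,
// so it can be profiled and unit-tested in isolation.
public interface IOrderSummaryService
{
    decimal GetOutstandingBalance(string customerId);
}

public class OrderSummaryService : IOrderSummaryService
{
    public decimal GetOutstandingBalance(string customerId)
    {
        // ... data access and calculations go here ...
        return 0m;
    }
}

// The web part stays a thin shell that only renders the result.
public class OrderSummaryWebPart : WebPart
{
    private readonly IOrderSummaryService service = new OrderSummaryService();

    protected override void RenderContents(HtmlTextWriter writer)
    {
        writer.Write(service.GetOutstandingBalance("CUST-001")); // example customer id
    }
}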
