Partial keyword / n-gram search in Drupal

How does one implement partial keyword search in Drupal, since by default Drupal does not match partial keywords?

If you only need to search on a single field, you can create a view with an exposed filter and set the operator to "contains"; that will allow partial keyword matching on that field.
You can also patch the core search module, which is going to make maintenance a pain in the backside later on. Patch at http://drupal.org/node/103548 .
The Porter Stemmer module (http://drupal.org/project/porterstemmer) attempts to extrapolate partial searches into full keyword matches.
For real fine-grained control over search, you'll need to ditch the Drupal core search functionality completely and move to Apache Solr search, which is exceptionally powerful but requires a fair amount of configuration.
You can also drop Drupal search entirely and use a Google Custom Search Engine (http://drupal.org/project/google_cse) if you just want to do text matches across the site.
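For reference, the Views "contains" filter mentioned above compiles down to a SQL LIKE '%keyword%' condition. A minimal standalone sketch of that behaviour (the table and data here are hypothetical, using plain sqlite3):

```python
import sqlite3

# In-memory database with a hypothetical node table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE node (nid INTEGER, title TEXT)")
conn.executemany("INSERT INTO node VALUES (?, ?)", [
    (1, "Searching in Drupal"),
    (2, "Apache Solr setup"),
    (3, "Theming basics"),
])

def contains_search(keyword):
    """Partial match: equivalent to the Views 'contains' operator."""
    rows = conn.execute(
        "SELECT title FROM node WHERE title LIKE ?",
        ("%" + keyword + "%",),
    )
    return [r[0] for r in rows]

print(contains_search("Sol"))  # matches "Apache Solr setup"
```

Note that a leading-wildcard LIKE cannot use an index, which is one reason core search avoids it and why Solr scales better for this.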

Related

Search multiple Plone site indexes

I need to implement a central search for multiple Plone sites on different servers/machines. Being able to select which sites to search would be a plus, but it is not the primary concern. A few approaches I have come across:
- Export the ZCatalog indexes to an XML file and periodically run a crawler to collect all the XML files so a search can be done on them, but this way does not allow live searching.
- There is a way to use a common catalog, but it is not optimal and cannot be implemented on the sites I am working on because of some requirements.
- I read somewhere that people have used Solr, but I need help on how to use it.
I would prefer a way to use the existing ZCatalog indexes rather than create another index, which I think is the case with Solr, because of the extra overhead and the extra index that has to be maintained. But I will use it if no other solution is possible. I am a beginner at search, so please give as much detail as possible.
You should really look into collective.solr:
https://pypi.python.org/pypi/collective.solr/4.1.0
Searching multiple sites is a complex use case, and you most likely need a solution that scales. In the end it will take far less effort to go with Solr than to come up with your own solution; Solr is built for exactly this kind of requirement.
As an alternative, you can also use collective.elasticindex, an extension that indexes Plone content into ElasticSearch.
According to its documentation:
This doesn't replace the Plone catalog with ElasticSearch, nor interact with the Plone catalog at all; it merely indexes content in ElasticSearch when it is modified or published.
In addition, it provides a simple search page called search.html that queries ElasticSearch using JavaScript (so Plone is not involved in searching) and offers the same features as the default Plone search page. A search portlet lets you redirect people to this new search page as well.
That can be an advantage over collective.solr.
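For the record, the search.html page shipped with collective.elasticindex just sends standard ElasticSearch query DSL from the browser. A rough sketch of the kind of query body involved, in Python (the field name is an assumption mirroring Plone's SearchableText index; the real field depends on how content was indexed):

```python
import json

def build_search_body(text, size=10):
    """Build an ElasticSearch full-text query body (standard query DSL).

    The field name 'SearchableText' is a guess based on Plone's catalog
    index of the same name; adjust it to match your actual mapping.
    """
    return {
        "query": {"match": {"SearchableText": text}},
        "size": size,
    }

body = build_search_body("plone search")
print(json.dumps(body))
```

A body like this would be POSTed to the index's _search endpoint; because the browser talks to ElasticSearch directly, Plone itself is never involved in the query.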

Drupal 7 internationalisation (i18n): loading nodes

I'm a part-time developer (ranked Enthusiastic Amateur) and am trying to develop my first multilingual site in Drupal 7 using the Internationalisation (i18n) suite.
After following the initial setup directions, I'm now trying to implement a "featured article" column where Articles flagged as Featured appear in a list in the right-hand column of the page.
Ordinarily I'd simply do a "db_query" to return a list of node ids and then do a node_load() for each result.
However, how do I do this with i18n to ensure that I'm pulling out the correct language versions of the Articles?
Any and all help is greatly appreciated,
~Matt Bridger
Use the i18n_node_get_lang() function to check the language of the nodes and load only the ones with the right language.
Typically you would use the Views module (http://drupal.org/project/views) to create a block listing the nodes you want, and then, under Filter, use Content: Language set to Current user's language so that only the language being viewed is shown. With this solution there is no need to write any code, unless you need some advanced theming for your list.

How to index & search hierarchical nodes with solr + drupal + cck

My Drupal 6 site uses 3 custom node types that are hierarchically organized: page, book, library. I want to use Solr to index and search the content.
I want the search to return only Book nodes in the results.
But I want the search to use the contents of children (pages) and parents (libraries) when performing the search.
Can Solr be configured to index & search in this way?
Thanks!
You are going to have a couple of issues with this:
Solr isn't hierarchical by nature; it's denormalized, so indexing a hierarchy is hard.
You're going to have to figure out how to boost various terms/fields based on where in the hierarchy they are (is the library more important than the book, so to speak).
Drupal has a specific configuration related to nodes and modifying that, by default, wouldn't be the easiest.
The Solr implementation is tightly tied to the database, so modifying the configuration would probably take a lot of effort on your part.
I would recommend you don't try to implement this, but if you did, you could look at the Apache Solr Attachments module. You would have to do something similar... basically:
- hook_modify_query to modify the actual indexing of the node
- custom theme your search results to display this hierarchy
Or you could create a single giant field with a bunch of searchable text and use that as part of your searches.

Using Solr for multiple sites

I have set up a Solr server. Now I have two sites that I want to index and search using SolrNet.
How do I differentiate the two sites' content in Solr?
You may want to take a look at this document: http://wiki.apache.org/solr/MultipleIndexes
I think the best approach is to use Multiple Solr Cores.
Another option is to simply add a field that indicates the item's web site. For example, you can add a field called type.
Searches on website1.com would require you to filter on the type field.
&fq=type:website1.com
That way you only need to deal with one core and one schema.xml file. This works if the pages of both sites have a very similar field set, and it will make it easier to search across both sites if you plan on doing that.
http://wiki.apache.org/solr/MultipleIndexes#Flattening_Data_Into_a_Single_Index
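To illustrate the single-core approach, the per-site filter is just an extra fq parameter on the select request. A sketch of building such a query URL in Python (the host, port, and field name are assumptions):

```python
from urllib.parse import urlencode

def solr_select_url(base, text, site):
    """Build a Solr /select URL restricted to one site via an fq filter."""
    params = {
        "q": text,
        "fq": "type:" + site,  # 'type' field stores the source site
        "wt": "json",
    }
    return base + "/select?" + urlencode(params)

url = solr_select_url("http://localhost:8983/solr", "drupal", "website1.com")
print(url)
```

Because fq is a filter query, it is cached independently of the main query, so repeatedly filtering on the same site is cheap.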

Intelligent Keyword Searching

I have a taxonomy vocab assigned to a content type in Drupal 6. I've then exposed (using "is one of") it as a field in views which allows a user to search via keywords.
The problem is that when it runs the query, it first resolves the term: instead of using a LIKE statement, it looks the term up in the taxonomy table and brings back its ID. So if a user searches for one term that exists and another that doesn't, the whole thing fails rather than returning relevant results for the term that does exist.
Is there any way to do partial matching using Views?
Thanks
For anyone looking for the answer: the best implementation I could find was Apache Solr.
It is relatively easy to set up if you have your own server.
More information can be found here: http://drupal.org/project/apachesolr
