Making autocomplete search faster in ASP.NET

I have implemented an auto-complete search on my website using the AJAX AutoComplete control. It uses a web service that returns results from the database.
I have a stored procedure which searches all text values in all columns of all tables for this purpose.
The problem is that the results take a long time to show up in the autocomplete control.
I have also applied indexing on the most frequently searched table columns, but that didn't help much either. Could this be because of the load on the server, since the server is not a dedicated one? If not, how can I fetch the results faster?

You can always optimise the queries to load the data faster and use server-side caching so the database is not hit on every keystroke.
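For example, here is a minimal sketch of server-side caching for the autocomplete web service; the class and method names are hypothetical, and the stored-procedure call is only a placeholder:
using System;
using System.Collections.Generic;
using System.Runtime.Caching;

public class SuggestionService
{
    // hypothetical helper that caches suggestion lists per prefix for 10 minutes
    private static readonly MemoryCache Cache = MemoryCache.Default;

    public IList<string> GetSuggestions(string prefix)
    {
        string key = "suggest:" + prefix.ToLowerInvariant();
        var cached = Cache.Get(key) as IList<string>;
        if (cached != null)
        {
            return cached; // served from memory, no database round trip
        }

        IList<string> results = QuerySuggestionsFromDatabase(prefix); // your existing stored procedure call
        Cache.Set(key, results, DateTimeOffset.Now.AddMinutes(10));
        return results;
    }

    private IList<string> QuerySuggestionsFromDatabase(string prefix)
    {
        // placeholder: execute the existing stored procedure here
        return new List<string>();
    }
}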
Also, on the UI side I would recommend the jQuery UI autocomplete plugin:
<script>
$(function() {
  var availableTags = [
    "ActionScript",
    "AppleScript"
  ];
  $("#tags").autocomplete({
    source: availableTags
  });
});
</script>

Related

How to do pattern searching in Firebase Realtime Database [duplicate]

I am using Firebase for data storage. The data structure is like this:
products: {
  product1: {
    name: "chocolate"
  },
  product2: {
    name: "chochocho"
  }
}
I want to perform an auto-complete operation on this data, and normally I would write the query like this:
"select name from PRODUCTS where productname LIKE '%" + keyword + "%'";
So, in my situation, if the user types "cho", I need to return both "chocolate" and "chochocho" as results. I thought about fetching all the data under the "products" block and then running the query on the client, but this may need a lot of memory for a big database. So, how can I perform an SQL LIKE operation?
Thanks
Update: With the release of Cloud Functions for Firebase, there's another elegant way to do this as well, by linking Firebase to Algolia via Functions. The tradeoff here is that Functions/Algolia is pretty much zero maintenance, but probably at increased cost over roll-your-own in Node.
There are no content searches in Firebase at present. Many of the more common search scenarios, such as searching by attribute, will be baked into Firebase as the API continues to expand.
In the meantime, it's certainly possible to grow your own. However, searching is a vast topic (think creating a real-time data store vast), greatly underestimated, and a critical feature of your application--not one you want to handle ad hoc or even depend on someone like Firebase to provide on your behalf. So it's typically simpler to employ a scalable third-party tool to handle indexing, searching, tag/pattern matching, fuzzy logic, weighted rankings, et al.
The Firebase blog features a post on indexing with ElasticSearch which outlines a straightforward approach to integrating a quick, but extremely powerful, search engine into your Firebase backend.
Essentially, it's done in two steps. Monitor the data and index it:
var Firebase = require('firebase');
var ElasticClient = require('elasticsearchclient');

// initialize our ElasticSearch API
var client = new ElasticClient({ host: 'localhost', port: 9200 });

// listen for changes to Firebase data
var fb = new Firebase('<INSTANCE>.firebaseio.com/widgets');
fb.on('child_added', createOrUpdateIndex);
fb.on('child_changed', createOrUpdateIndex);
fb.on('child_removed', removeIndex);

function createOrUpdateIndex(snap) {
  client.index(this.index, this.type, snap.val(), snap.name())
    .on('data', function(data) { console.log('indexed ', snap.name()); })
    .on('error', function(err) { /* handle errors */ });
}

function removeIndex(snap) {
  client.deleteDocument(this.index, this.type, snap.name(), function(error, data) {
    if (error) console.error('failed to delete', snap.name(), error);
    else console.log('deleted', snap.name());
  });
}
Query the index when you want to do a search:
<script src="elastic.min.js"></script>
<script src="elastic-jquery-client.min.js"></script>
<script>
  ejs.client = ejs.jQueryClient('http://localhost:9200');
  client.search({
    index: 'firebase',
    type: 'widget',
    body: ejs.Request().query(ejs.MatchQuery('title', 'foo'))
  }, function (error, response) {
    // handle response
  });
</script>
There's an example, and a third party lib to simplify integration, here.
I believe you can do:
admin
.database()
.ref('/vals')
.orderByChild('name')
.startAt('cho')
.endAt("cho\uf8ff")
.once('value')
.then(c => res.send(c.val()));
This will find vals whose names start with "cho".
source
The ElasticSearch solution basically binds to add, set and del and offers a get by which you can accomplish text searches.
It then saves the contents in MongoDB.
While I love and recommend ElasticSearch for the maturity of the project, the same can be done without another server, using only the Firebase database.
That's what I mean:
https://github.com/metaschema/oxyzen
For the indexing part, the function basically:
JSON-stringifies a document,
removes all the property names and JSON punctuation to leave only the data (regex),
removes all XML tags (therefore also HTML) and attributes (remember the old guidance, "data should not be in XML attributes") to leave only the pure text if XML or HTML was present,
removes all special characters and substitutes them with spaces (regex),
substitutes all instances of multiple spaces with one space (regex),
splits on spaces and cycles: for each word it adds refs to the document in some index structure in your DB that basically contains children named after the words, each with children named after an escaped version of "ref/inthedatabase/dockey",
then inserts the document as a normal Firebase application would do.
In the oxyzen implementation, subsequent updates of the document actually read the index and update it, removing the words that no longer match and adding the new ones.
Subsequent searches for words can then directly find documents under a word's child. Multiple-word searches are implemented using hits.
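As a rough illustration of those normalization steps in C# terms (the actual oxyzen library is JavaScript; the regexes and names below are only an approximation):
using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;

static class OxyzenStyleTokenizer
{
    // Approximate the pipeline described above: strip JSON property names,
    // strip XML/HTML tags, drop special characters, collapse whitespace,
    // then split into words for the index.
    public static IEnumerable<string> Tokenize(string json)
    {
        string text = Regex.Replace(json, @"""[^""]*""\s*:", " "); // remove property names
        text = Regex.Replace(text, @"<[^>]*>", " ");               // remove xml/html tags and attributes
        text = Regex.Replace(text, @"[^\w\s]", " ");               // replace special chars with spaces
        text = Regex.Replace(text, @"\s+", " ").Trim();            // collapse multiple spaces

        // each word would become a child key in the index, holding an
        // escaped "ref/inthedatabase/dockey" reference to the document
        return text.ToLowerInvariant().Split(' ');
    }
}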
An SQL "LIKE" operation on Firebase is possible:
let node = await db.ref('yourPath').orderByChild('yourKey').startAt('!').endAt('SUBSTRING\uf8ff').once('value');
This query works for me; it behaves like the statement below in MySQL (a prefix match):
select * from StoreAds where University LIKE 'ps%';
query = database.getReference().child("StoreAds").orderByChild("University").startAt("ps").endAt("ps\uf8ff");

Firebase and AngularFire nightmare migration for update

I am new to Firebase and I am having a bit of a nightmare trying to adapt old code to what is now deprecated and what is not. I am trying to write a function which updates one "single" record in my data source using the now-approved $save() promise, but it is doing some really strange stuff to my data source.
My function should enable you to modify a single record and then update the posts JSON array. However, instead of doing this, it deletes the whole data source on the Firebase server, and it is lucky that I am only working with test data at this point because otherwise everything would be gone.
$scope.update = function() {
  var fb = new Firebase("https://mysource.firebaseio.com/Articles/" + $scope.postToUpdate.$id);
  var article = $firebaseObject(ref);
  article.$save({
    Title: $scope.postToUpdate.Title,
    Body: $scope.postToUpdate.Body
  }).then(function(ref) {
    $('#editModal').modal('hide');
    console.log($scope.postToUpdate);
  }, function(error) {
    console.log("Error:", error);
  });
}
Funnily enough I then get a warning in the console "after" I click the button:
Storing data using array indices in Firebase can result in unexpected behavior. See https://www.firebase.com/docs/web/guide/understanding-data.html#section-arrays-in-firebase for more information. Also note that you probably wanted $firebaseArray and not $firebaseObject.
(No shit?) I am assuming here that $save() is not the right call, so what is the equivalent of $routeParams/$firebase $update() to do a simple binding of the modified data and my source? I have been spending hours on this and really don't know what the right solution is.
Unless there's additional code that you've left out, your article $firebaseObject should most likely use the fb variable you created just before it.
var article = $firebaseObject(fb);
Additionally, the way in which you're using $save() is incorrect. You need to modify the properties on the $firebaseObject directly and then call $save() with no arguments. See the docs for more.
article.Title = $scope.postToUpdate.Title;
article.Body = $scope.postToUpdate.Body;
article.$save().then(...

What keeps caching from working in WebMatrix?

I have a number of pages in a WebMatrix Razor ASP.NET site where I have added one line of code:
Response.OutputCache(600);
From reading about it, I had assumed that this meant IIS would cache the HTML produced by the page, serve that HTML for the next 10 minutes, and after 10 minutes, when the next request came in, run the code again.
Now the page is being fetched as part of a timed jQuery call. The timer code in the client runs every minute. The code there is very simple:
function wknTimer4() {
  $.get('PerfPanel', function(data) {
    $('#perfPanel').html(data);
  });
}
It occasionally appears to cache, but when I look at the number of database queries done during the 10-minute period, I might have well over 100 database queries. I know the caching isn't working the way I expect. Does the cache only work for a single session? Is there some other limitation?
Update: it really shouldn't matter what the client does, whether it fetches the page through a jQuery call or straight HTML. If the server is caching, it doesn't matter what the client does.
Update 2: complete code dumped here. Boring stuff:
@{
    var db = Database.Open("LOS");
    var selectQueryString = "SELECT * FROM LXD_funding ORDER BY LXDOrder";
    // cache the results of this page for 600 seconds
    Response.OutputCache(600);
}
@foreach (var row in db.Query(selectQueryString)) {
    <h1>
        @row.quotes Loans @row.NALStatus, oldest @(NALWorkTime.WorkDays(row.StatusChange, DateTime.Now)) days
    </h1>
}
Your assumptions about how OutputCache works are correct. Can you check Firebug or the Chrome dev tools to look at the outgoing requests hitting your page? If you're using jQuery, sometimes people set the cache property on $.get or $.ajax to false, which causes the request to the page to have a funky trailing querystring. I've made the mistake of setting this up globally to fix some issues with jQuery and IE:
http://api.jquery.com/jQuery.ajaxSetup/
The other thing to look at here is the grouping of DB calls. Are you just making a lot of calls within one request? Are you executing a db command in a loop, within another reader? Code in this case would be helpful.
Good luck, I hope this helps!

Creating search functionality in ASP.NET

I have a website whose content (HTML) is generated using ASP.NET (C#) from a SQL Server database.
Now I want to add a search function to the website so users can search the content. It would bring up a page with the results.
What is the best way to do this?
The 2 best solutions:
Google Custom Search (GCS)
SQL Server (manual)
GCS:
Here you will rely totally on Google. If they take 60 days to index your webpage, then good luck. You won't find info which isn't publicly accessible like a webpage, so forget about any content behind a login.
You will also rely on search engine optimization: if you don't optimize your page titles, meta descriptions etc., the search won't be of much use.
Custom SQL Server:
If you put a full-text index on your data fields, you can search for your keywords. This is a decent solution, but remember the indexes (otherwise it will be very slow).
I would search for "SQL Server full-text search" for help on this solution.
The benefit here is you have full control and you can access everything.
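For instance, a minimal sketch of querying a full-text index from ADO.NET with CONTAINS; the table, column and connection-string names are hypothetical, and the full-text catalog must already exist on those columns:
using System.Collections.Generic;
using System.Configuration;
using System.Data.SqlClient;

public static class ContentSearch
{
    public static List<string> Search(string term)
    {
        var titles = new List<string>();
        string connStr = ConfigurationManager.ConnectionStrings["Content"].ConnectionString;

        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(
            "SELECT Title FROM Articles WHERE CONTAINS((Title, Body), @term)", conn))
        {
            // CONTAINS expects a quoted search condition, e.g. "chocolate"
            cmd.Parameters.AddWithValue("@term", "\"" + term.Replace("\"", "") + "\"");
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    titles.Add(reader.GetString(0));
                }
            }
        }
        return titles;
    }
}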
EDIT:
There are of course many other solutions. I would also suggest looking into Lucene, or some implementations on top of Lucene such as Solr. However, search functionality is usually very difficult and time-consuming to build, hence my first two suggestions.
In the company I work at we've previously used FAST, and use Apptus today.
EDIT 2:
Today I would advise one solution only: ElasticSearch. It's a great solution: easy to work with, runs on all platforms, is based on a nice REST API and JSON, and performs very well.
Microsoft Index Server: http://www.c-sharpcorner.com/UploadFile/sushil%20saini/UsingIndexServer11262005045132AM/UsingIndexServer.aspx
or ...
Google Custom Search: http://www.google.com/coop/cse/
Your pages are generated from a SQL database. I think it's safe to assume that the relevant data also lies in the SQL DB rather than in the ASP templates or the C# code. To search that data you could write multiple queries against the database, based on the CONTAINS("search term") function.
You could have a simple search that executes all those queries, and also an advanced search with checkboxes that control which queries are executed to refine the search.
That would make more sense than doing a raw search over generated content, imo.
Use Lucene (The Apache Lucene project develops open-source search software).
http://lucene.apache.org/
http://ifdefined.com/blog/post/Full-Text-Search-in-ASPNET-using-LuceneNET.aspx
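If you go the Lucene.NET route, the basic index-then-search flow looks roughly like this (a sketch against the Lucene.NET 3.x API; verify the calls against the version you actually install):
using System;
using Lucene.Net.Analysis.Standard;
using Lucene.Net.Documents;
using Lucene.Net.Index;
using Lucene.Net.QueryParsers;
using Lucene.Net.Search;
using Lucene.Net.Store;
using Version = Lucene.Net.Util.Version;

public static class LuceneDemo
{
    public static void Run()
    {
        var dir = new RAMDirectory(); // use FSDirectory.Open(path) for a real site
        var analyzer = new StandardAnalyzer(Version.LUCENE_30);

        // index a document
        using (var writer = new IndexWriter(dir, analyzer, IndexWriter.MaxFieldLength.UNLIMITED))
        {
            var doc = new Document();
            doc.Add(new Field("title", "Full text search in ASP.NET", Field.Store.YES, Field.Index.ANALYZED));
            writer.AddDocument(doc);
        }

        // search the index
        var searcher = new IndexSearcher(dir, true);
        var parser = new QueryParser(Version.LUCENE_30, "title", analyzer);
        TopDocs hits = searcher.Search(parser.Parse("search"), 10);
        foreach (ScoreDoc hit in hits.ScoreDocs)
        {
            Console.WriteLine(searcher.Doc(hit.Doc).Get("title"));
        }
    }
}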
If you are using the SQL DB, try wiring up your own code for the search box. For example, I'm creating a video portal and I search videos from my own search box using the following code:
<script type="text/javascript">
  $(document).ready(function () {
    SearchText();
  });

  function SearchText() {
    $(".autosuggest").autocomplete({
      source: function (request, response) {
        $.ajax({
          type: "POST",
          contentType: "application/json; charset=utf-8",
          url: "Home.aspx/GetAutoCompleteData",
          data: "{'username':'" + document.getElementById('txtSearch').value + "'}",
          dataType: "json",
          success: function (data) {
            response(data.d);
          },
          error: function (result) {
            alert("Error");
          }
        });
      }
    });
  }
</script>
/// <summary>
/// To AutoSearch. . .
/// </summary>
/// <param name="userName"></param>
/// <returns></returns>
public List<string> GetAutoComplete(string userName)
{
    List<string> lstStr = new List<string>();
    sqlCon = new SqlConnection(strCon);
    // note: @SearchText (not #SearchText) is the parameter marker in T-SQL
    sqlCmd = new SqlCommand("select DISTINCT OldFileName from UploadedVideo where OldFileName LIKE '%' + @SearchText + '%'", sqlCon);
    sqlCmd.Parameters.AddWithValue("@SearchText", userName);
    sqlCon.Open();
    using (SqlDataReader reader = sqlCmd.ExecuteReader())
    {
        while (reader.Read())
        {
            lstStr.Add(reader["OldFileName"].ToString());
        }
    }
    sqlCon.Close();
    return lstStr;
}
I've created an auto-complete box. The main thing here is that we can use our own code.
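For the $.ajax call above to reach that method, it has to be exposed as an ASP.NET page method; a minimal sketch of the wiring (the class layout here is hypothetical) would be something like:
// In Home.aspx.cs: page methods must be public, static and marked [WebMethod]
// for Home.aspx/GetAutoCompleteData to be callable from $.ajax.
[System.Web.Services.WebMethod]
public static List<string> GetAutoCompleteData(string username)
{
    // delegate to the search code shown above
    var search = new VideoSearch(); // hypothetical class holding GetAutoComplete
    return search.GetAutoComplete(username);
}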
It's a little difficult not knowing which direction you prefer to go in with search functionality, or what languages you prefer and are comfortable using.
So, how about something simple and use hosted search?
This site, for free, will index up to 1000 and you get all sorts of reporting with it too. It looks like you just have to add some simple HTML to your site to get it all working.
You can also re-index on demand or set up a schedule to do it for you. No need to wait for Google.
The site is Site Level
Use Google Search
If possible, you can use SharePoint for website development; search is already there for each website.
Here you can find a good tutorial:
http://www.codeproject.com/Articles/42454/Implement-Search-Functionality-into-your-ASP-NET-M
If your content is stored in a SQL database and you need to search for it inside that DB, then you need some kind of query builder.
There are a few of them on the market. I can remember Aspose Query and EasyQuery, but you will find more if you google for "query builder asp.net" or something similar.

Retrieve comments from a website using Disqus

I would like to write a scraping script to retrieve comments from CNN articles. For example, this article: http://www.cnn.com/2012/01/19/politics/gop-debate/index.html?hpt=hp_t1
I realize that CNN uses Disqus for their comment discussion. As the comment loading is not page-based (i.e., prev page, next page) and is dynamic (i.e., you need to click "load next 25"), I have no idea how to retrieve all the 5000+ comments for this article.
Any idea or suggestion?
Thanks so much!
I needed to get comments by scraping a page that had Disqus comments loaded via AJAX. Because they were not rendered on the server, I had to call the Disqus API. In the page source, you will need the identifier code:
var identifier = "456643" // take note of this from the page source
// this is the ident url query param in the following js request
Also, look in the JS source code to get the page's public key and forum name. Place these in the URL where appropriate.
I used JavaScript (Node.js) to test this, i.e.:
var request = require("request");
var publicKey = "pILMw27bsbJsdfsdQDh9Eh0MzAgFL6xx0hYdsdsdfaIfBHRvLGqFFQ09st";
var disqusUri = "https://disqus.com/api/3.0/threads/listPosts.json?&api_key=" + publicKey + "&thread:ident=456643&forum=nameOfForumFromSource";
request(disqusUri, function(err, res, body) {
  if (err) {
    console.log("ERR: " + err);
    return;
  }
  console.log(body);
});
The option for scraping (other than just getting the page), which might be less robust (depends on your needs) but will offer a solution for the problem you have, is to use some kind of wrapper around a full-fledged web browser and literally code the usage pattern and extract the relevant data. Since you didn't mention which programming language you know, I'll give 3 examples: 1) Watir - Ruby, 2) WatiN - IE & Firefox via .NET, 3) Selenium - via C#/Java/Perl/PHP/Ruby/Python
I'll provide a little example using WatiN & C#:
IE browser = new IE();
browser.GoTo(YOUR CNN URL);
List visibleComments = browser.List(Find.ById("dsq-comments"));
//do your scraping thing
Link moreComments = browser.Link(Find.ByClass("dsq-paginate-append-text"));
moreComments.Click();
//wait until the ajax call has ended by searching for some indicator
browser.WaitUntilContainsText(SOME TEXT);
//do your scraping thing
Notice:
I'm not familiar with Disqus, but it might be a better option to force all the comments to show by looping the Link & Click parts of the code I posted until all the comments are visible, and then scrape the List element dsq-comments.
