I have a web page that loads a URL and does the processing using
... Server.CreateObject("Microsoft.XMLDOM")
every 5 minutes. It processes the data and stores it in a database.
Basically, it needs to run in a loop every 5 minutes and parse the XML.
Is there a better way to do this?
I'm open to suggestions.
TIA
Steve42
There are two ways to do this. One is to make a call from the client to the server using JavaScript; the other is to move that call into a web service and host it in a Windows Service or something similar.
The first option could be something like:
$(document).ready(function() {
    // setInterval repeats the call; setTimeout would only fire once
    setInterval(functionToCall, 1000 * 60 * 5); // make a call every five minutes
});

function functionToCall() {
    $.ajax({
        url: 'call to your page',
        type: 'GET'
    }).done(function() {
        alert("Things processed");
    }).fail(function() {
        alert("Error occurred");
    });
}
This is off the top of my head to give you an idea. It might not be the ideal solution, but it should give you a guideline.
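For the second option, here is a minimal sketch of the out-of-band approach, assuming you can run a standalone Node.js process (Node 18+ for the built-in fetch) instead of a full Windows Service; the URL is a placeholder:

// poller.js - run with "node poller.js"; fetches and processes the XML every five minutes
const TARGET_URL = 'https://example.com/your-feed.xml'; // placeholder - your real URL here
const FIVE_MINUTES = 1000 * 60 * 5;

async function processOnce() {
    try {
        const response = await fetch(TARGET_URL);
        const xml = await response.text();
        // parse the XML and write the results to your database here
        console.log('Fetched ' + xml.length + ' bytes at ' + new Date().toISOString());
    } catch (err) {
        console.error('Fetch failed: ' + err.message);
    }
}

processOnce();                          // run once on startup
setInterval(processOnce, FIVE_MINUTES); // then repeat every five minutes

This keeps the polling off the web page entirely, so it runs whether or not anyone has a browser open.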
Could you please give me some hints, websites, books, or research papers that explain how to calculate URL dwell time?
In case you don't know what dwell time is: it denotes the time a user spends viewing a document after clicking a link on a search engine results page.
Thanks in advance
One crude way to do this on a page would be to send a small GET request to the server on a timer - an "I'm still here" heartbeat. The frequency would be a trade-off. This is relatively easy to do with jQuery or a similar framework.
You would not know whether the page is in an abandoned tab, or open but not actually being looked at.
A sample for the client end (using jQuery):
var $server_url = "https://example.com"; // wherever the heartbeat gets registered
var $session = Math.floor((1 + Math.random()) * 0x10000);

function still_alive() {
    var $url = $server_url + "/still_alive";
    $.get($url, {location: location.href, session: $session});
}

// call it once to prime it
still_alive();

// set it up on a timer - setInterval repeats; setTimeout would only fire once
window.setInterval(still_alive, 1000);
1000 is the interval in milliseconds, so this is on a 1-second interval. $server_url is the server to register this at; I am adding "/still_alive" as an endpoint to register it at. $session can be some way of identifying the current session - set once when the page loads - and could be the result of a uuid function.
Inside still_alive() is a jQuery GET request to that whole URL. It is passed a plain object, with the key location holding the URL of the current page. It may be more appropriate to use a POST instead of a GET, but the principle is still the same.
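If you do switch to POST, the client-side change is a one-liner; a sketch assuming the same endpoint accepts POST:

function still_alive() {
    var $url = $server_url + "/still_alive";
    // $.post sends the same payload in the request body instead of the query string
    $.post($url, {location: location.href, session: $session});
}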
I have a mobile app written using Apache Cordova. I am using Azure Mobile Apps to store some data.
I created Easy Tables and one Easy API. The purpose of the API is to delete/update more than one record at a time. Below is the implementation of the API.
exports.post = function (request, response) {
    var mssql = request.service.mssql;
    var sql = "delete from cust where deptno in ( ? )";
    mssql.query(sql, [request.parameters], {
        success: function(result) { response.send(statusCodes.OK, result); },
        error: function(err) { response.send(statusCodes.BAD_REQUEST, { message: err }); }
    });
}
Is there any other way to implement it? The del() method on the table object only takes an id to delete, and I didn't find any other approach to delete multiple rows in the table.
I am having difficulty testing the implementation, as changes to the API code take 2-3 hours on average to get deployed. I change the code through the Azure website, and when I run it, the old code is hit rather than the latest changes.
Is there any limitation based on the plans we choose?
Update
The updated code worked.
var sql = "delete from trollsconfig where id in (" + request.body.id + ")";
mssql.query(sql, [request.parameters],{
success : function(result){ response.send(statusCodes.OK, result); },
error: function(err) { response.send(statusCodes.BAD_REQUEST, { message: err}); }
});
Let me cover the last one first. You can always restart your service to use the latest code. The code is probably there, but Easy API is not noticing the change. Once your site "times out" and goes to sleep, the code gets reloaded as normal. Logging onto the Azure Portal, selecting your site, and clicking Restart should solve the problem.
As to the first problem - there are a variety of ways to implement deletion, but you've pretty much got a good implementation there. I've not run it to test it, but it seems reasonable. What don't you like about it?
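One thing I would tighten regardless: concatenating request.body.id straight into the SQL string opens you up to SQL injection. Here is a sketch of a parameterized version, assuming the same mssql helper from your snippet and that request.body.id arrives as an array:

exports.post = function (request, response) {
    var mssql = request.service.mssql;
    var ids = request.body.id; // assumed to be an array, e.g. [1, 2, 3]
    // build one "?" placeholder per id so every value is passed as a parameter
    var placeholders = ids.map(function () { return "?"; }).join(", ");
    var sql = "delete from trollsconfig where id in (" + placeholders + ")";
    mssql.query(sql, ids, {
        success: function (result) { response.send(statusCodes.OK, result); },
        error: function (err) { response.send(statusCodes.BAD_REQUEST, { message: err }); }
    });
};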
I'm developing a web application in which a constant stream of data is received every 5 seconds or so in a Java servlet (read from a file written by another application). I want to push this data to an HTML page and read it in JavaScript so I can graph it with the d3 library.
At the moment I'm using a JavaScript function that calls the 'doGet' function of the servlet every 5 seconds. I'm worried this is creating a lot of overhead and that it could be done more efficiently.
I know it's also possible to run "response.setIntHeader("Refresh", 5);" from the servlet.
Are there any other better ways?
Thanks in advance for the help!
Short polling is currently probably the most common approach to solving the problem you describe.
If you can cope with a few seconds lag in notification, then short polling is really simple, here is a basic example:
On page load, call this in your JS:
setInterval(checkFor, 30000);
The above will call a function checkFor() every 30 seconds (you can change the 30 seconds to any length of time; just adjust the 30000 in the line above according to how regularly you want users to be updated).
Then, in your checkFor function, just make an AJAX call to your server asking if there are any updates. If the server says yes, display the alert in JS; if not (which will likely be most of the time), do nothing:
function checkFor() {
    $.ajax({
        url: "your/server/url",
        type: "POST",
        success: function(notification) {
            // Check if any notifications are returned - if so, display the alert
        },
        error: function(data) {
            // handle any error
        }
    });
}
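Since you mention graphing with d3, the success handler is the natural hand-off point. A sketch assuming the servlet returns JSON, with updateChart as a hypothetical function of yours that redraws the d3 graph:

function checkFor() {
    $.ajax({
        url: "your/server/url",
        type: "POST",
        dataType: "json", // have jQuery parse the response as JSON
        success: function(points) {
            if (points && points.length) {
                updateChart(points); // hypothetical: your d3 redraw function
            }
        },
        error: function() {
            // optionally back off or stop polling after repeated failures
        }
    });
}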
I have a number of pages in a WebMatrix Razor ASP.Net site where I have added one line of code:
Response.OutputCache(600);
From reading about it I had assumed this meant that IIS would cache the HTML produced by the page, serve that HTML for the next 10 minutes, and after 10 minutes, when the next request came in, run the code again.
Now the page is being fetched as part of a timed jQuery call. The timer code in the client runs every minute. The code there is very simple:
function wknTimer4() {
    $.get('PerfPanel', function(data) {
        $('#perfPanel').html(data);
    });
}
It occasionally appears to cache, but when I look at the number of database queries done during the 10-minute period, I might see well over 100. I know the caching isn't working the way I expect. Does the cache only work for a single session? Is there some other limitation?
Update: it really shouldn't matter what the client does, whether it fetches the page through a jQuery call or as straight HTML. If the server is caching, the client's behavior is irrelevant.
Update 2: complete code dumped here. Boring stuff:
@{
    var db = Database.Open("LOS");
    var selectQueryString = "SELECT * FROM LXD_funding ORDER BY LXDOrder";
    // cache the results of this page for 600 seconds
    Response.OutputCache(600);
}
@foreach (var row in db.Query(selectQueryString)) {
    <h1>
        @row.quotes Loans @row.NALStatus, oldest @(NALWorkTime.WorkDays(row.StatusChange, DateTime.Now)) days
    </h1>
}
Your assumptions about how OutputCache works are correct. Can you check Firebug or the Chrome dev tools to look at the outgoing requests hitting your page? If you're using jQuery, sometimes people set the cache property on $.get or $.ajax to false, which causes the request to the page to have a funky trailing querystring. I've made the mistake of setting this up globally to fix some issues with jQuery and IE:
http://api.jquery.com/jQuery.ajaxSetup/
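For example, this global setting (or the per-request cache: false option) makes jQuery append a unique trailing querystring such as ?_=1412345678901 to every GET, which can prevent the cached copy from ever being reused:

// the global form - disables jQuery's own caching by adding "_=<timestamp>" to GETs
$.ajaxSetup({ cache: false });

// with it in effect, the polling call requests PerfPanel?_=<timestamp> each time,
// so the server sees a different URL every minute
$.get('PerfPanel', function(data) {
    $('#perfPanel').html(data);
});

If that is what's happening, removing the global setting (or passing cache: true for this call) should let the output cache do its job.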
The other thing to look at here is the grouping of DB calls. Are you making a lot of calls within one request? Are you executing a db command in a loop, or within another reader? Code would be helpful in this case.
Good luck, I hope this helps!
I have an ASP.NET website to manage projects. I realized that when I click through to the next project really fast and often, it overwrites all the following projects with the data of the first one. I call a method to save before going to the next project. I also use a session variable for the id of the project.
EDIT:
It looks like the server stacks the save method and the ids, but keeps the values of the first project in the controls.
Am I right?
This is the AJAX that calls a server method to get the id and set it in a hidden field:
function NextClick() {
    var tabvalue = $("#<%=TabsToFocus.ClientID%>").val();
    $.ajax({
        type: "POST",
        url: "Projet.aspx/NextProj",
        data: "{tab:'" + tabvalue + "'}",
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        success: function(d) {
            if (d.d != "False") {
                $("#<%=hid_selProjetID.ClientID%>").val(d.d);
                var btn = $("#<%=btnClickLstProjet.ClientID%>");
                fillHidden();
                btn.click();
            }
        }
    });
}
And btn.click() calls this method on the server side to save
Private Sub Button1_ServerClick(ByVal sender As Object, ByVal e As System.EventArgs) Handles btnClickLstProjet.ServerClick
    If HttpContext.Current.Session.Item("isUser") IsNot Nothing AndAlso HttpContext.Current.Session.Item("isUser") = True Then
        If HttpContext.Current.Session.Item("curProjetID") IsNot Nothing Then
            btnSaveIndicateurs_Click()
            btnSaveEnTete_Click()
            btnSaveGen_Click()
            btnSavePlanif_Click()
        End If
    End If
    HttpContext.Current.Session.Item("curProjetID") = hid_selProjetID.Value
    Response.Redirect("Projet.aspx")
End Sub
Thank you
The very first thing you should do is STOP using session.
Seriously, back away from the session object.
A proper use of session is long-term, fairly unchanging data. Data that has to change on literally every postback belongs in the page itself.
Here is what's happening.
1. You click on a link to load up the project. The session variable is set with the current project id.
2. You then click on a link to get the next project.
3. Then you quickly click on the link to get the one after that.
The server, meanwhile, is multithreaded. Request #3 basically interrupted #2's execution and ran before #2. This means your session variable is jacked up.
Why would the 3rd request run before the 2nd? Well, you are executing a number of queries. It's likely that the queries for request 2 take slightly longer to execute than the ones for request 3.
Solution: Stop using Session.
Why: You cannot predict the order in which IIS is going to respond to requests. IIS is a parallel (not serial) engine and requests might very well happen out of the sequence you think they should.
Finally, the guy who said that session is locked by the first requestor wasn't entirely accurate. It is WRITE locked, but that only occurs when the page starts writing to session. Reads are not locked.
So when request 3 executes, it is using the ID from request 1 or 2, depending on which one is still active by the time it hits the write code.
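A minimal sketch of the alternative on the client, assuming you send the project id with the request itself instead of reading it back from session (the extra curProjetID parameter is an assumption; your server-side NextProj would accept it as a parameter rather than touching Session):

function NextClick() {
    var tabvalue = $("#<%=TabsToFocus.ClientID%>").val();
    // read the current project id from the page itself and send it with the request,
    // so the server never relies on a session variable another request can overwrite
    var currentId = $("#<%=hid_selProjetID.ClientID%>").val();
    $.ajax({
        type: "POST",
        url: "Projet.aspx/NextProj",
        data: "{tab:'" + tabvalue + "', curProjetID:'" + currentId + "'}",
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        success: function(d) {
            if (d.d != "False") {
                $("#<%=hid_selProjetID.ClientID%>").val(d.d);
            }
        }
    });
}

That way each request carries its own id, and out-of-order execution can no longer cross-contaminate projects.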