Environment : ASP.NET, C#
I am writing a web program that will export 12 tables to another database. When the user clicks the "Export" button, the export process gets started. The thing is, I would like to show some status messages to the client, such as "preparing..., deleting old records..., exporting..., export completed."
Right now, it shows all of the status messages at once, only after all 12 tables are completed.
Please guide me how to do this.
Doing long processes like this on a web page is probably not a good idea; you will need to change the timeout settings on pages, etc.
Rather, build a Windows service that does all the work. You can then use the page to initiate the process. The service could update some status value that is read by the page. This way the page does very little work, i.e. it initiates the process and polls for status updates. This can be done via simple jQuery/Ajax calls.
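As a rough sketch of the polling side (the handler name ExportStatus.ashx, the panel ID, and the returned status text are assumptions, not part of the original setup), the page could do something like this:
function pollExportStatus() {
    // ExportStatus.ashx is a hypothetical endpoint that returns the current status text
    $.get('ExportStatus.ashx', function (status) {
        $('#statusPanel').text(status); // e.g. "deleting old records..."
        if (status !== 'export completed.') {
            setTimeout(pollExportStatus, 2000); // poll again in 2 seconds
        }
    });
}
pollExportStatus();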
Try this: in your C# code-behind, register a JavaScript call per process step, for example:
Page.ClientScript.RegisterClientScriptBlock(this.GetType(), "1", "alert('add to table now');", true);
Use a javascript timer to periodically load content from a status page or service.
The timer is fairly simple:
setTimeout(checkProcessStatus, waitTime);
The update itself depends on your choice of tech - for instance if using MVC and jQuery:
function checkProcessStatus() {
    // dynamically add the HTML from the status page to <div id="taskStatusPanel">
    $('#taskStatusPanel').load('export/status/' + taskId);
    // set the timer to call this again
    setTimeout(checkProcessStatus, waitTime);
}
Then the ExportController.Status action would return a simple view that displayed the HTML for the current status.
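For illustration, a minimal sketch of such an action, assuming a hypothetical ExportStatusStore that the background export job keeps up to date (the store, view name and parameter are assumptions):
using System.Web.Mvc;

public class ExportController : Controller
{
    // Returns a small partial view containing the current status text for the task.
    public ActionResult Status(string id)
    {
        // ExportStatusStore is a hypothetical store updated by the background job.
        string statusText = ExportStatusStore.GetStatus(id);
        return PartialView("_ExportStatus", statusText);
    }
}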
I have a few feature files in my project and I need to execute only a specific Cucumber tag (@Regression) from the feature files using the terminal. I am able to run the feature files using the tags, but the test/browser window gets closed and reopened for each feature file. Because of that, I have to write a login script in all the feature files to work around this.
Expectation: the test/browser should not be closed each time, and login should happen only at the beginning of the script execution.
Can someone help me to overcome this problem?
Explanation
Having to run the login for each Scenario in your Feature is the expected behavior, since each test should be as independent as possible.
In order not to have to add a login step for each Scenario again and again, there are so-called Backgrounds in Cucumber. Backgrounds describe steps that apply as a precondition for all Scenarios in a Feature.
Backgrounds behave like normal Scenarios, so for example you can create a Background in each of your Features with a Given step for the login so that it is automatically executed before each scenario.
Example
Each Feature would receive the following Background, which is then automatically executed once before each Scenario:
@SomeTag
Feature: Some Feature

  Background: User is logged in
    Given the user is logged in

  Scenario: Some first scenario
    Given ...
    When ...
    Then ...

  Scenario: Some second scenario
    Given ...
    When ...
    Then ...
The implementation of the step definition is then the same as for steps for your normal Scenarios and can be reused in all Features:
import { defineStep, Given } from 'cypress-cucumber-preprocessor/steps';

Given('the user is logged in', () => {
  // logic for login
});

// or, more generically, using defineStep
defineStep('the user is logged in', () => {
  // logic for login
});
Regarding the logic for the login, it is often suitable to use Cypress Custom Commands (example: Login for Azure AD).
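A minimal sketch of such a custom command (the command name, selectors and route are assumptions, not from the original answer):
// In cypress/support/commands.js -- a hypothetical login command
Cypress.Commands.add('login', (username, password) => {
  cy.visit('/login');
  cy.get('input[name=username]').type(username);
  cy.get('input[name=password]').type(password);
  cy.get('button[type=submit]').click();
});

// The step definition can then simply reuse it:
// Given('the user is logged in', () => cy.login('someUser', 'somePassword'));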
We have a notification which will post data to an application using the application's endpoint.
notification ABC {
    post = savedetailsurl
    body = {{.|json}}
    useBody = true
}
So the endpoint will save all the details in a MySQL DB.
Now, in our template, we call another endpoint to get the details which we saved via the webhook in the notification.
template ABC {
    use the "getDetailsUrl" and use the details in forming the email
}
Now the problem is a race condition: sometimes the details are not yet saved in the backend (MySQL) when getDetailsUrl is called, so we get an empty result.
Is there a way to solve the race condition?
Bosun's notification system is designed to be very basic. If you want something more advanced, you will need to use a separate system to generate the notification details and/or handle the alert workflow. Some people have used PagerDuty or other monitoring systems like Shinken to do more advanced notifications or alert management.
Your best bet is to skip the built-in notifications and do everything in an external system. You can still use the API (http://bosun.org/api) to integrate with the various alert states (crit/warn/ack/close/etc.), or you can change your alerts to use log = true to bypass all the built-in states and create your own workflow.
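For illustration, a rough sketch of a log-style alert (the alert name and the crit expression are made up; log = true is the only point here):
alert export.details {
    template = ABC
    # example expression only; use whatever your alert actually evaluates
    crit = avg(q("sum:requests.failed{host=*}", "5m", "")) > 0
    # log alerts bypass the built-in ack/close workflow
    log = true
    critNotification = ABC
}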
I am working on offline support for a Meteor application. I have researched this, and everything points to one answer: 'ground:db'. I looked into that solution; it's a really nice effort by @raix. I started with that package. It's an already-working project, so the first thing I did was ground all the collections with the following syntax:
var Users = Meteor.users;

if (Meteor.isClient) {
    SmtGroundCollections.Users = Ground.Collection(Users);
}
After that I tried my application offline, but it still shows "loading" and I don't get my DOM elements. I then wrapped all the waitOn subscriptions in a condition:
if (Meteor.status().connected) {
    /* my subscriptions */
}
After that I am able to see my DOM, and if I visit a page while I am online and then go offline, I can still see its data.
Now let me explain my problems.
1) When I call my methods while offline, my ground collections are not updated. I used the code below to resume my methods:
if (Meteor.isClient) {
    Ground.methodResume([
        'addProfie',
        'editProfile',
        'deleteProfile'
    ]);
}
It works fine when I come back online after being offline; it syncs my data to the server, but I don't see an immediate effect while I am offline.
2) If I want the full application available offline, I currently need to visit every page of my mobile application so that its data becomes available offline, which is not practical. I want one centralised place where I press a button and all the data I want offline gets grounded, something like the sketch below.
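For example, I imagine one central place roughly like this (the extra collection and publication names are just placeholders, not real code from my project):
// Hypothetical central setup: ground every collection once and, while
// connected, subscribe to everything that should be available offline.
if (Meteor.isClient) {
    SmtGroundCollections.Users = Ground.Collection(Meteor.users);
    SmtGroundCollections.Profiles = Ground.Collection(Profiles); // placeholder collection

    Tracker.autorun(function () {
        if (Meteor.status().connected) {
            Meteor.subscribe('allProfiles'); // placeholder publication names
            Meteor.subscribe('allSettings');
        }
    });
}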
So can anyone help me solve the above problems?
Thanks in advance.
I have a number of pages in a WebMatrix Razor ASP.NET site where I have added one line of code:
Response.OutputCache(600);
From reading about it, I had assumed that this meant that IIS would create a cache of the HTML produced by the page, serve that HTML for the next 10 minutes, and after 10 minutes, when the next request came in, it would run the code again.
Now the page is being fetched as part of a timed jQuery call. The timer code in the client runs every minute. The code there is very simple:
function wknTimer4() {
    $.get('PerfPanel', function(data) {
        $('#perfPanel').html(data);
    });
}
It occasionally appears to cache, but when I look at the number of database queries done during the 10-minute period, I might have well over 100 database queries. I know the caching isn't working the way I expect. Does the cache only work for a single session? Is there some other limitation?
Update: it really shouldn't matter what the client does, whether it fetches the page through a jQuery call or as straight HTML. If the server is caching, it doesn't matter what the client does.
Update 2: complete code dumped here. Boring stuff:
@{
    var db = Database.Open("LOS");
    var selectQueryString = "SELECT * FROM LXD_funding ORDER BY LXDOrder";

    // cache the results of this page for 600 seconds
    Response.OutputCache(600);
}
@foreach (var row in db.Query(selectQueryString)) {
    <h1>
        @row.quotes Loans @row.NALStatus, oldest @(NALWorkTime.WorkDays(row.StatusChange, DateTime.Now)) days
    </h1>
}
Your assumptions about how OutputCache works are correct. Can you check Firebug or the Chrome dev tools to look at the outgoing requests hitting your page? If you're using jQuery, people sometimes set the cache property on $.get or $.ajax to false, which causes the request to the page to have a funky trailing querystring. I've made the mistake of setting this up globally to fix some issues with jQuery and IE:
http://api.jquery.com/jQuery.ajaxSetup/
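For example, a global setting like this (a sketch of the mistake described above, not code from the question) makes jQuery append a cache-busting _=<timestamp> parameter to every GET request, so each one can look like a new URL to the server-side cache:
// Global setup that disables client caching for every jQuery request.
// jQuery appends "_=<timestamp>" to the querystring of GET requests.
$.ajaxSetup({ cache: false });

// The periodic call then goes out as e.g. /PerfPanel?_=1631536800000
$.get('PerfPanel', function (data) {
    $('#perfPanel').html(data);
});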
The other thing to look at here is the grouping of DB calls. Are you making a lot of calls within one request? Are you executing a DB command in a loop, within another reader? Code would be helpful in this case.
Good luck, I hope this helps!
I have a shopping-cart-based application in ASP.NET 2.0 which uses the concept of group buying. My requirement is that when a user checks out a particular product, he should do it with the latest price of that item at that time.
Now here is a scenario.
I have a product with price 50. I did a checkout, and 50 is displayed in my cart. At the same time some other user is accessing the product, and based on some business logic we calculate the price; the second user did some activity which reduced the price to 45. I have a trigger which updates all shopping cart items with this new price.
I want to show this updated price on the frontend for the first user without a postback, or I want to give him a message that the price has changed so he can refresh the page.
I have the following options.
1) The Repeater control which shows the cart should be put inside an UpdatePanel, and that UpdatePanel should be refreshed at some interval using a Timer.
2) Use SQL Server Notification Services and invalidate the cache as soon as the data changes in the database.
I do not want to use Notification Services, as I am not caching the data. Secondly, the problem with the UpdatePanel and Timer control is that refreshing it adds overhead, and a good refresh interval is hard to find.
Please tell me a way to accomplish this scenario.
I would generate JavaScript that updates all Repeater items (HTML fields with prices) and uses setTimeout() to periodically check with a web service (WCF, ASMX or ASHX) whether the price has changed. If it has, I would retrieve all prices and update the HTML fields. The user doesn't need to press anything; you can show him a notice that the price has changed.
With jQuery and JSON for object serialisation, this could easily be accomplished.
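A minimal sketch of that idea, assuming a hypothetical GetPrices.ashx handler that returns the cart item IDs and current prices as JSON (the URL, response shape and element IDs are assumptions):
// Hypothetical polling; the price_<itemId> elements are assumed to be rendered by the Repeater.
function checkPrices() {
    $.getJSON('GetPrices.ashx', function (items) {
        $.each(items, function (i, item) {
            var field = $('#price_' + item.Id);
            if (field.text() != item.Price) {
                field.text(item.Price);   // update the displayed price
                $('#priceNotice').show(); // tell the user something changed
            }
        });
        setTimeout(checkPrices, 30000);   // check again in 30 seconds
    });
}
setTimeout(checkPrices, 30000);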
This is usually done via Ajax from the client (browser) side of things. Perhaps this approach will work for your requirements as well?
Using Ext JS Core, you can make an Ajax call as follows:
Ext.Ajax.request({
    url: 'ajax_demo/sample.aspx',
    params: { 'requestType': 'check-sql-prices' },
    success: function(response, opts) {
        var obj = Ext.decode(response.responseText);
        // process the response JSON object
    },
    failure: function(response, opts) {
        // this writes to the Firebug console
        console.log('server-side failure with status code ' + response.status);
    }
});
Ext Core also handles "Timed Code Execution", so you could run the check, say, every 30 seconds. Round trip timing is usually less than 100 milliseconds, but that would definitely depend on your "SQL price checking" logic.
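For instance, a plain setInterval (used here rather than assuming the exact API of Ext's timed-execution helpers) could drive the check every 30 seconds:
// Wrap the Ext.Ajax.request above in a function and run it every 30 seconds.
function checkSqlPrices() {
    Ext.Ajax.request({
        url: 'ajax_demo/sample.aspx',
        params: { 'requestType': 'check-sql-prices' },
        success: function(response) {
            var obj = Ext.decode(response.responseText);
            // update the displayed prices from obj here
        }
    });
}
setInterval(checkSqlPrices, 30000);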