I've been using OpenStack on and off for a while now. After about a year I installed Mitaka (with Keystone + Swift) and noticed that the openstack command (CLI client), as well as the swift command, take a really long time to respond - around 5 to 7 seconds. It didn't use to be like that.
Has there been a significant change in the design that is causing this? Is something wrong with my setup? Any thoughts on how to debug it?
Thanks
I think the best way to debug this is to use curl to check whether the slowness is in Keystone, Swift, your machine, or the openstack client itself.
Since you didn't say which authentication method you are using, I'll recommend this awesome post from SwiftStack to help you authenticate:
https://www.swiftstack.com/docs/cookbooks/swift_usage/auth.html
Once you have more info, it will be easier to find the bottleneck.
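For example, something along these lines (just a sketch - the controller URL, user, domain, and password are placeholders for your own values, and it assumes the Keystone v3 API):

# time a token request sent straight to Keystone
time curl -s -o /dev/null -w "HTTP %{http_code} in %{time_total}s\n" \
  -H "Content-Type: application/json" \
  -d '{"auth": {"identity": {"methods": ["password"], "password": {"user": {"name": "admin", "domain": {"id": "default"}, "password": "secret"}}}}' \
  http://controller:5000/v3/auth/tokens

# compare with the clients themselves; --debug shows where the time goes
time openstack --debug token issue
time swift --debug stat

If curl answers quickly but the clients are slow, the problem is probably on the client side (e.g. name resolution or plugin loading) rather than in Keystone or Swift.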
Is there a way to log something to the server console with Jetty? Specifically, when someone connects, is there a way to print that to the console?
Does anyone have a clue how to do this, or can you link me to websites explaining it?
I haven't found much about it on Google at all.
Thanks!
Which version of Jetty? You can simply enable debug logging on the connectors, or on whatever it is that you actually want/need logged.
However if you're simply looking for a request log have a look at this:
http://www.eclipse.org/jetty/documentation/current/configuring-logging.html#configuring-jetty-request-logs
I am sending out a nightly email through Rules Scheduler. When I execute it manually, it sends out one email as it should; however, when it runs on the schedule, it sends me 10 duplicate emails. I've looked all over and can't seem to find any solution to the problem.
Thanks in advance for any suggestions
Use the Job Scheduler module. With this module you first insert the data into job_schedule and create a queue item for each schedule. When cron runs, it starts processing each queue item and sends the mails, then deletes the entry from the job_scheduler table, so it will not send the same mail to the same person again and again. There is proper documentation for the Job Scheduler module in Drupal 7; just go through it.
This sounds like a bug in the Rules module; it has its quirks. I see you have reported this issue in the Rules issue queue: http://drupal.org/node/1314916, which is what I was first going to suggest. So now I know your issue is for Rules 7.x-2.x dev integration with Views 7... both of which have more than a few bugs. I strongly suspect this issue has as much to do with Views as with Rules. (The 10x repetition seems unlikely to be a coincidence since 10 is a default value for results-per-page in Views, etc)
When you report an issue, it's helpful to include all pertinent information (Drupal version, steps to replicate, what's written to the log, etc). I'd personally suggest seeing if you can replicate your issue in a clean installation of Drupal with just the modules necessary to run your test. If you can replicate it that way, it's easier to provide enough information for the developers to identify the issue and resolve it. (e.g. use Devel generate to create some nodes and dummy users, then create a very simple view, e.g. just the titles of the five most recent nodes, and use that view as the source for your email content. Does it send 5 copies? You may need to configure a localhost mail server to test this.)
I'm looking to create a webpage that will reflect the status of one of my company's servers automatically. Frequently there will be a minor error that only lasts 2-3 minutes, and it would be great to have this reflected on a self-generated page, which might prevent 50-60 unhappy clients from calling in simultaneously and asking what's wrong.
I'm not quite sure where to begin - would anyone have suggestions for good resources to study? Programming examples? I'm not referring to the basics of writing an ASP.NET page, of course, but rather process interaction in Windows.
Thanks.
To pull this off, you'd need a separate page that essentially runs server diagnostics; otherwise the page wouldn't know whether the server was up or down. Also, the page would need to be isolated from the sorts of problems that kill other people's requests, such as cache-hit problems, memory starvation, high CPU usage, or insufficient bandwidth. So ideally the diagnostics would run in a separate app pool, a separate virtual directory, or even a separate machine.
Many of the interesting diagnostics would require a WMI call, but some you can get from the My.Computer namespace.
Also, are you going to do this on every server, or do you want one web server to display the status of several different servers?
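To give a flavour of the WMI side of it, here's a small sketch (a console program rather than a page, and the queries shown are just examples of standard WMI classes - adapt them to whatever diagnostics you actually care about):

using System;
using System.Management; // add a reference to System.Management.dll

class ServerDiagnostics
{
    static void Main()
    {
        // free / total physical memory (reported in KB)
        var os = new ManagementObjectSearcher(
            "SELECT FreePhysicalMemory, TotalVisibleMemorySize FROM Win32_OperatingSystem");
        foreach (ManagementObject mo in os.Get())
        {
            Console.WriteLine("Memory: {0} KB free of {1} KB",
                mo["FreePhysicalMemory"], mo["TotalVisibleMemorySize"]);
        }

        // current CPU load per processor
        var cpu = new ManagementObjectSearcher(
            "SELECT LoadPercentage FROM Win32_Processor");
        foreach (ManagementObject mo in cpu.Get())
        {
            Console.WriteLine("CPU load: {0}%", mo["LoadPercentage"]);
        }
    }
}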
It also depends on the type of errors your servers are encountering.
If they are going down completely, or losing their internet connection, then pinging them at regular intervals will tell you whether they are up or not.
If a specific process running on a server becomes unavailable, that can be a little trickier.
Your best bet is to find a way to make a simple request to the important services/applications and see whether you get a response: if you do, the server is likely up; if not, it likely isn't.
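A minimal sketch of both checks (the host name and URL below are placeholders):

using System;
using System.Net;
using System.Net.NetworkInformation;

class UpChecker
{
    static void Main()
    {
        // is the machine reachable at all?
        using (var ping = new Ping())
        {
            PingReply reply = ping.Send("myserver.example.com", 2000); // 2 second timeout
            Console.WriteLine("Ping: " + reply.Status);
        }

        // is the specific service answering?
        try
        {
            var request = (HttpWebRequest)WebRequest.Create("http://myserver.example.com/status");
            request.Timeout = 5000;
            using (var response = (HttpWebResponse)request.GetResponse())
            {
                Console.WriteLine("HTTP: " + (int)response.StatusCode);
            }
        }
        catch (WebException ex)
        {
            Console.WriteLine("Service looks down: " + ex.Message);
        }
    }
}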
Anything you can do to reduce the number of support calls you get is a good idea, but I'd also focus some time and try to figure out why your servers are going down so often.
Also, telling your users that the server is down, but not giving a reason why may not give the effect you are looking for. Users will still be confused and frustrated when they can't get their work done.
I know you were looking to build a webpage to display the server diagnostics, but there are plenty of server monitoring tools that produce webpages for an easy dashboard view of the history.
A quick google returned the following link:
http://www.webdesignbooth.com/10-really-useful-server-monitoring-tools/
I know that similar questions have been asked all over the place, but I'm having trouble finding one that relates directly to what I'm after.
I have a website where a user uploads a data file, then that file is transformed and imported into SQL. The file could be up to 50 MB in size, and sometimes this process can take 30 minutes or even longer.
I realise I need to hand off the actual work to another process and poll that process from the web page. I'm wondering what the best approach would be, though. Being a web developer by trade, I'm finding all this Windows Service stuff a bit confusing, and I just want somewhere to start.
So:
Can I / should I be doing this with a Windows service? If so, how?
Should I use WCF? If this runs under IIS, will I have problems with aspnet_wp.exe recycling and timing out my process?
Clarifications:
The data is imported into SQL; there's no file distribution taking place.
If there is a failure, it absolutely MUST be reported to the user. The web page will poll every, let's say, 5 seconds from the time the async task begins to get the 'status' of the import. Once it's finished, another response will tell the page to stop polling for status updates.
Queries on the final decision:
OK, so as I thought, it seems that a Windows service is the best idea. As to HOW to get it to work, the 'put the file there and wait for the service to pick it up' idea seems to be the generally accepted way. Is there a way I can start a process run by the service without it having to constantly check a database table / folder? As I said earlier, I don't have any experience with Windows services - I wondered, if I put a public method in the service, can I call it somehow?
well ...
// requires using System.Threading;
var thread = new Thread(() =>
{
    // your action, e.g. the transform + import
});
thread.Start();
but you will have problems with that:
what if the import to sql fails? should there be any response to the client?
if it fails, how do you make sure the file is still available for a later attempt?
what if the application shuts down? this newly created and started thread will be killed as well
...
it's not always a good idea to store everything in sql (especially files...). if you want to make the file available to several servers, why not distribute it via ftp ...?
i believe that your whole concept is a bit messed up (sorry for assuming this), and it might be helpful if you elaborate and give us more information about your intentions!
edit:
Can I / should I be doing this with a Windows service? If so, how?
you can :) i'd advise you to create a simple console program and convert it into a service with srvany and sc. you can get a rough how-to overview here (note: insert a blank after each = ... that's a silly pitfall)
the term should is relative, because you did not answer the most important question:
what if a record is persisted to the database, telling a consumer that file test.img should be persisted, but your service hasn't picked it up or transformed it yet?
so ... next on
Should I use WCF? If this runs under IIS, will I have problems with aspnet_wp.exe recycling and timing out my process?
you probably could create a WCF service which receives some binary data and then stores it to a database. this request could be async, yes. but what for?
once again:
please give us more insight into your workflow: what exactly are you trying to achieve? which "environmental conditions" do you have (eg. app A polls the db and expects file records which are referenced in table x to be persisted) ...
edit:
so you want to import a .csv-file. well that changes everything :)
but i won't advise you to use a wcf-service (there could be a usage: eg. a wcf-service which has a method to insert a single row, then your iteration through the file would be implemented in another app... not that good, though).
i would suggest the following:
at first do everything in your webapp (as you've already done), but use some sort of bulk insert and do your transformation/logic on the database (see the sketch at the end of this answer).
if you then have some sort of bottleneck, i would suggest something like a small job service, eg:
the webapp uploads the file and inserts a row into a job table. the job service continuously polls the table / or gets informed via wcf by the webapp (hey, hey, finally some sort of usage for WCF in your scenario... :) ) and then does the import job, writing a finish note to a table / or setting the state of the job to finished ...
but this is a bit overkill :)
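To illustrate the bulk-insert idea, a rough sketch (the staging table and stored procedure names are made up, and it assumes you've already parsed the csv into a DataTable):

using System.Data;
using System.Data.SqlClient;

class CsvImporter
{
    // bulk-load the parsed csv into a staging table, then let the
    // database do the transformation via a stored procedure
    public static void Import(DataTable csvRows, string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            using (var bulk = new SqlBulkCopy(connection))
            {
                bulk.DestinationTableName = "ImportStaging"; // hypothetical staging table
                bulk.BatchSize = 5000;
                bulk.WriteToServer(csvRows);
            }

            using (var transform = new SqlCommand("EXEC dbo.TransformImportStaging", connection))
            {
                transform.CommandTimeout = 0; // the transformation can take a while
                transform.ExecuteNonQuery();
            }
        }
    }
}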
Please see if my comments below help you resolve your issue:
• Can I / should I be doing this with a Windows service? If so, how?
Yes, you can do this with a Windows service, and I think that is the way you should be doing it. You can implement your own service to process your requests, or you can use the open source Job Processor code.
Basically the idea is (a rough sketch of the service side follows at the end of this answer):
You submit a request to process the csv file into a database table, with a status of "not started".
Then your Windows service picks up the not-started requests from the database table and updates them to an "in progress" status.
Once the processing completes successfully or unsuccessfully, your service updates the database table with a status of "Completed" or "Failed".
And your ASP.NET page can poll the database table for the current status every 5 seconds or so.
• Should I use WCF? If this runs under IIS, will I have problems with aspnet_wp.exe recycling and timing out my process?
you should not be using WCF for this purpose.
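To make the workflow above a bit more concrete, here is a rough sketch of the service side (the ImportJobs table, its columns, and the status values are all hypothetical - adjust to your own schema; overlapping timer callbacks and detailed error handling are glossed over):

using System;
using System.Data.SqlClient;
using System.ServiceProcess;
using System.Threading;

public class ImportJobService : ServiceBase
{
    private const string ConnectionString = "..."; // your connection string
    private Timer _timer;

    static void Main()
    {
        ServiceBase.Run(new ImportJobService());
    }

    protected override void OnStart(string[] args)
    {
        // poll the job table every 30 seconds
        _timer = new Timer(ProcessPendingJobs, null, TimeSpan.Zero, TimeSpan.FromSeconds(30));
    }

    protected override void OnStop()
    {
        _timer.Dispose();
    }

    private void ProcessPendingJobs(object state)
    {
        using (var connection = new SqlConnection(ConnectionString))
        {
            connection.Open();

            // claim one not-started job and mark it in progress in a single statement
            var claim = new SqlCommand(
                "UPDATE TOP (1) ImportJobs SET Status = 'InProgress' " +
                "OUTPUT inserted.JobId, inserted.FileName " +
                "WHERE Status = 'NotStarted'", connection);

            int jobId;
            string fileName;
            using (var reader = claim.ExecuteReader())
            {
                if (!reader.Read()) return; // nothing to do
                jobId = reader.GetInt32(0);
                fileName = reader.GetString(1);
            }

            try
            {
                // ... parse fileName and import it here ...
                SetStatus(connection, jobId, "Completed");
            }
            catch (Exception)
            {
                SetStatus(connection, jobId, "Failed"); // the page's polling will pick this up
            }
        }
    }

    private static void SetStatus(SqlConnection connection, int jobId, string status)
    {
        var update = new SqlCommand(
            "UPDATE ImportJobs SET Status = @status WHERE JobId = @jobId", connection);
        update.Parameters.AddWithValue("@status", status);
        update.Parameters.AddWithValue("@jobId", jobId);
        update.ExecuteNonQuery();
    }
}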
I'm running .NET on a Windows box and I would like to have a function run every night at midnight. Of course, since HTTP is stateless and Windows doesn't have a "cron job" type function (that I know of), I will either have to visit my site myself every night at midnight, or just wait for a user to visit the site and rely on that to trigger the update.
Is there an alternative to this that I can create where something will automatically run at a certain time?
I'm pretty sure that Windows' Task Scheduler can do most things that cron can do. But I might be missing something.
Edit: it's reached via Settings -> Control Panel -> Scheduled Tasks.
If none of the other answers work for you, here's an option:
There are a bunch of server monitoring services out there that will make an HTTP call to your site at regular intervals (every minute if you like). You can get 5-minute intervals for free on some of them.
Create a password-protected page that performs your function (if it hasn't run yet today) and point that service at it.
At least this way you won't have to write anything additional, and you can rest easy knowing it doesn't rely on your home machine.
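A sketch of what that page could look like - here as a plain HTTP handler, with a shared-secret query-string parameter standing in for the "password protection" (the key, URL, and job method are all placeholders):

using System;
using System.Web;

// map this to something like /nightly.ashx and give the monitoring service that URL
public class NightlyJobHandler : IHttpHandler
{
    private static readonly object Sync = new object();
    private static DateTime _lastRun = DateTime.MinValue; // resets when the app recycles

    public void ProcessRequest(HttpContext context)
    {
        if (context.Request.QueryString["key"] != "some-long-secret")
        {
            context.Response.StatusCode = 403;
            return;
        }

        lock (Sync)
        {
            if (_lastRun.Date < DateTime.Today)
            {
                RunNightlyJob(); // your actual work goes here
                _lastRun = DateTime.Now;
                context.Response.Write("ran");
            }
            else
            {
                context.Response.Write("already ran today");
            }
        }
    }

    private void RunNightlyJob() { /* ... */ }

    public bool IsReusable { get { return true; } }
}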
Jeff Atwood mentioned at some point on the podcast a dirty hack that uses the cache expiration callback to fake this.
He'd insert an item into .NET's Cache, with an expiration set to 2 hours and a callback that gets called once the item expires, and that was his cron.
I think this was the article:
http://www.codeproject.com/KB/aspnet/ASPNETService.aspx?display=Print
It sucks if you ask me, but for a shared hosting solution, I can't think of anything much better.
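The trick looks roughly like this (a sketch, not the article's exact code - the 2-hour interval and method names are arbitrary):

using System;
using System.Web;
using System.Web.Caching;

public static class FakeCron
{
    private const string CacheKey = "FakeCronItem";

    // call this once, e.g. from Application_Start in Global.asax
    public static void Start()
    {
        HttpRuntime.Cache.Insert(
            CacheKey,
            DateTime.Now,                    // the value itself doesn't matter
            null,
            DateTime.Now.AddHours(2),        // "tick" every 2 hours
            Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable,
            OnCacheItemRemoved);
    }

    private static void OnCacheItemRemoved(string key, object value, CacheItemRemovedReason reason)
    {
        // do the scheduled work, then re-insert the item so it fires again
        DoWorkIfDue();
        Start();
    }

    private static void DoWorkIfDue() { /* check the date/time and run the job if needed */ }
}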
Also, there are external cron services that you give a URL to, and they will "ping" it regularly, like these (they are not free):
http://webcron.org/
http://www.webbasedcron.com/
Here's a starting point to programmatically add/delete and manage tasks in the Task Scheduler.
http://www.codeproject.com/KB/system/taskscheduler.aspx
If you have command-line access you could try the "at" command, which is like an ultra-light cron:
http://support.microsoft.com/kb/313565
You can also take a look at Quartz.NET (http://quartznet.sourceforge.net/), which is a scheduler.
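With a recent Quartz.NET (this assumes the 2.x-style fluent API; in 3.x the job's Execute returns a Task), a nightly job looks roughly like this sketch:

using Quartz;
using Quartz.Impl;

public class MidnightJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // the work to run every night
    }
}

public static class SchedulerSetup
{
    // call once at application start
    public static void Start()
    {
        IScheduler scheduler = StdSchedulerFactory.GetDefaultScheduler();
        scheduler.Start();

        IJobDetail job = JobBuilder.Create<MidnightJob>().Build();
        ITrigger trigger = TriggerBuilder.Create()
            .WithCronSchedule("0 0 0 * * ?") // every day at midnight
            .Build();

        scheduler.ScheduleJob(job, trigger);
    }
}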
The Windows equivalent of cron is at, if you have access to the machine.
Use the Timer class to create a timer that periodically calls a method to be executed.
A static timer can be started in the Application_Start event in the Global class. Because the timer uses an interval rather than an absolute time, you'll have to calculate the time interval until midnight and set the Interval property accordingly.
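A sketch of that approach in Global.asax.cs (the usual caveat applies: if IIS recycles or unloads the application, the timer dies with it):

using System;
using System.Timers;

public class Global : System.Web.HttpApplication
{
    private static Timer _midnightTimer;

    protected void Application_Start(object sender, EventArgs e)
    {
        // first interval: from now until the next midnight
        double untilMidnight = (DateTime.Today.AddDays(1) - DateTime.Now).TotalMilliseconds;

        _midnightTimer = new Timer(untilMidnight);
        _midnightTimer.AutoReset = true;
        _midnightTimer.Elapsed += OnMidnight;
        _midnightTimer.Start();
    }

    private static void OnMidnight(object sender, ElapsedEventArgs e)
    {
        // after the first tick, switch to a fixed 24-hour interval
        _midnightTimer.Interval = TimeSpan.FromHours(24).TotalMilliseconds;

        RunNightlyTask(); // your function
    }

    private static void RunNightlyTask() { /* ... */ }
}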
It looks like GoDaddy has provisions for this, but There Is More Than One Way To Do It:
When you install Drupal it asks you to set up a cron job, and I've found that the project members have documented this step thoroughly. Go to http://drupal.org/cron for more information, and remember to read http://drupal.org/node/31506 for Windows-specific information.
If everything else fails, google for "web cron job" and use a commercial "cron job" service. Choose carefully, don't get ripped off.
I'm also facing the same issue. I want to run ASP.NET with MSSQL at GoDaddy, but they don't have scheduled tasks for Windows hosting. After reading this post, I did my own Google search and found this free web cron job scheduler:
I just tried it and it works perfectly... well, almost. The job expires after 1 year.
http://www.setcronjob.com/
Hope it helps.
Searching for an answer to the same question, I found this post (quartz-net-with-asp-net) with setup instructions for Quartz.NET ("Enterprise Job Scheduler for .NET Platform", from their website), even from inside an ASP.NET application.
Added it here mainly for reference.
Why not install Cygwin and use cron itself?
Here is a PDF guide on setting it up:
http://csc.csudh.edu/kleyba/cygwin-cron.pdf