I have a Streamlit app deployed on Streamlit Community Cloud. I got an email saying the app hasn't seen traffic in the past few days, so it will go to sleep and visitors will need to wake it, which can take a minute or two. Is there any way to keep the app alive?
Do I need to write scripts to maintain traffic to the Streamlit app?
It's not possible to keep Streamlit apps awake at all times if they're hosted on Streamlit Community Cloud. If you need the app to be awake at all times regardless of traffic, I'd recommend looking into paid hosting options.
Related
I have a site on a shared hosting provider. My site times out when idle and can take up to 40 seconds to start up again, so I want to increase the idle timeout. Under Manage - Dedicated IIS Application Pool, the idle timeout is set to 5 minutes and I want to increase it. I called my provider and they said I am unable to change the setting on a shared hosting account. Is there another way, such as the web.config file, to increase the timeout?
The app pool is likely being recycled. There's nothing you can do about this directly on a shared hosting service. What you can do is send pings to the web server every N minutes. If GoDaddy recycles the app pool after 5 minutes of inactivity, send a ping to your website every 4 minutes; each ping resets the idle timer. If you keep doing this, the pool should not be recycled unless recycling is explicitly triggered (or the host has some other scheduled recycle in place).
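If you want to roll your own, a minimal sketch of such a pinger could be a small console program like the one below; the URL and the 4-minute interval are placeholders, not anything specific to your host:

```csharp
// KeepAlivePinger: issues an HTTP GET every 4 minutes so the app pool's
// idle timer keeps getting reset. The URL is a placeholder.
using System;
using System.Net.Http;
using System.Threading.Tasks;

class KeepAlivePinger
{
    static async Task Main()
    {
        var url = "https://www.example.com/";       // replace with your site
        var interval = TimeSpan.FromMinutes(4);     // just under the 5-minute idle timeout
        using var client = new HttpClient();

        while (true)
        {
            try
            {
                var response = await client.GetAsync(url);
                Console.WriteLine($"{DateTime.Now:u} -> {(int)response.StatusCode}");
            }
            catch (Exception ex)
            {
                Console.WriteLine($"{DateTime.Now:u} -> ping failed: {ex.Message}");
            }
            await Task.Delay(interval);
        }
    }
}
```

Run it from a machine that stays on (or trigger the same request from Task Scheduler or cron) and the pool's idle timer never reaches 5 minutes.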
Optionally, you can use a monitoring service that pings and reports on your server. Here are two that may be of use to you: https://uptimerobot.com/ https://www.pingdom.com/
Uptime services that I've tried didn't actually send HTTP requests to the web app, so they didn't help keep it alive. However, I found a nice feature in Application Insights called Availability which lets you create recurring tests that actually send GET requests to your website, thus preventing it from recycling.
I explain it more in my blog post here.
Open Task Scheduler on your computer
Create a scheduled task to run every 4 minutes
Have it run the PowerShell command Invoke-WebRequest against your site.
It'll have the effect of someone visiting your site every 4 minutes.
I have a problem with my website and I think it's related to IIS recycling the app pool or shutting down the app after 20 minutes of inactivity. As it is a low-traffic site, when I first browse to the site the initial load can take up to 30 seconds; after that it is very fast. If I then come back to the site a few hours later, I get the slow load time again. I'm fairly sure it's something to do with the app pool shutting down after 20 minutes of inactivity.
Another problem is that I am hosting via a hosting company, so I have no direct access to manipulate IIS at all. Does anyone know how I can keep my app alive so I don't suffer from the slow initial load?
You could set up a service to issue a request to your site every few minutes, which can double as monitoring to ensure the site is still running. I assume you don't have access to the application pool configuration.
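As a rough sketch, such a check could be a one-shot console program that any scheduler or monitoring tool runs every few minutes; the URL is a placeholder, and returning a non-zero exit code on failure is just one way to let the scheduler surface problems:

```csharp
// One-shot health check: fetches the site (keeping the app pool warm) and
// reports failure via the exit code so a scheduler or monitor can alert on it.
using System;
using System.Net.Http;
using System.Threading.Tasks;

class SiteHealthCheck
{
    static async Task<int> Main()
    {
        var url = "https://www.example.com/";   // placeholder for your site
        using var client = new HttpClient { Timeout = TimeSpan.FromSeconds(30) };

        try
        {
            var response = await client.GetAsync(url);
            Console.WriteLine($"{DateTime.Now:u} {(int)response.StatusCode} {url}");
            return response.IsSuccessStatusCode ? 0 : 1;   // non-zero = site unhealthy
        }
        catch (Exception ex)
        {
            Console.WriteLine($"{DateTime.Now:u} request failed: {ex.Message}");
            return 1;
        }
    }
}
```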
I'm trying out the Windows Azure free trial. As I understand it, you can host up to 10 websites on the free account.
My question is: is there any way of hosting a website along with some kind of background processing or scheduled task on the free Azure account? I'm almost sure there isn't, because Web Roles support that and Web Sites do not.
Is there any other alternative to host an ASP.NET MVC website with some kind of background processing on Azure for free? Everything would be purely for educational or personal purposes.
The free sites in Windows Azure Web Sites can technically run some background operations, because you can spin up a background thread during application startup (a rough sketch follows the issues below); however, there are several issues with this approach:
Idle sites will be shut down. This means that if the site isn't seeing a lot of traffic, the process can be shut down. I'm not sure the background processing would keep it alive, or, even if it did, how reliably it would kick off. It will depend heavily on the type of background processing, I would think.
The web sites at the free level have CPU and memory quotas. Running something in the background a lot may cause you to hit these quotas more often than if the site were mostly idle. Hitting a quota will shut down the site until a specific time period has passed. Be very aware of these quotas if you are using the free or shared levels. If you were planning to have this background processing work a queue, for instance, it likely won't work out well.
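For reference, the startup-thread approach mentioned above looks roughly like this; the interval and DoBackgroundWork are made-up placeholders, and all of the caveats above still apply:

```csharp
// Global.asax.cs - spins up a background thread when the site starts.
// The work loop dies whenever the site is idled or recycled, which is
// exactly the reliability problem described above.
using System;
using System.Threading;
using System.Web;

public class Global : HttpApplication
{
    private static Thread _worker;

    protected void Application_Start(object sender, EventArgs e)
    {
        _worker = new Thread(() =>
        {
            while (true)
            {
                try
                {
                    DoBackgroundWork();   // placeholder for your own logic
                }
                catch (Exception)
                {
                    // swallow and keep looping; real code should log this
                }
                Thread.Sleep(TimeSpan.FromMinutes(5));
            }
        });
        _worker.IsBackground = true;   // don't block app shutdown
        _worker.Start();
    }

    private static void DoBackgroundWork()
    {
        // hypothetical work item, e.g. processing a queue or sending emails
    }
}
```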
You could use something like the free level of the Scheduler in the Windows Azure Store to kick off some work by having it call in to your web site and asynchronously start the background work. This might work, and it avoids the CPU quota if the amount of background work is pretty small and is completed quickly with little resource use. Note that there is also a scheduler available with Mobile Services. For educational purposes this may be just fine.
Don't forget that with the free trial you get $200 worth of Azure for 30 days. If you are really just trying things out you can easily spin up full cloud services, VMs or even shared web sites during the trial. If you shut them down when not actively working on them the $200 can give you a decent amount of time to try things out.
I have a web app that will run forever (at least for a few days) on my local machine using the technique (hack?) described in Jeff Atwood's post: https://blog.stackoverflow.com/2008/07/easy-background-tasks-in-aspnet/
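(Roughly, the technique is to insert a dummy item into the ASP.NET cache with an absolute expiration and do the work in its removal callback, re-inserting the item each time. A simplified sketch, with the key name, interval and DoWork as placeholders rather than my real code:)

```csharp
// Global.asax.cs - the cache-expiration background task trick.
// When the dummy cache item expires, the callback runs the work and
// re-inserts the item, so the work repeats while the app stays alive.
using System;
using System.Web;
using System.Web.Caching;

public class Global : HttpApplication
{
    private const string CacheKey = "background-task";            // arbitrary key
    private static readonly TimeSpan Interval = TimeSpan.FromMinutes(1);

    protected void Application_Start(object sender, EventArgs e)
    {
        ScheduleNextRun();
    }

    private static void ScheduleNextRun()
    {
        HttpRuntime.Cache.Insert(
            CacheKey,
            DateTime.UtcNow,                      // the value itself doesn't matter
            null,
            DateTime.UtcNow.Add(Interval),        // absolute expiration
            Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable,
            OnCacheItemRemoved);
    }

    private static void OnCacheItemRemoved(string key, object value, CacheItemRemovedReason reason)
    {
        DoWork();             // placeholder for the actual background job
        ScheduleNextRun();    // schedule the next tick
    }

    private static void DoWork()
    {
        // hypothetical periodic work, e.g. polling a web service
    }
}
```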
However, when I run it on AppHarbor, my app doesn't run for more than an hour or so (I'm not sure exactly when it dies). As long as I keep hitting the site it stays up, so I'm assuming it is being killed after an idle period, but I'm not sure why.
My app doesn't save any state or persist anything. It makes web service calls and survives errors in any calls.
I turned on a ping service to keep my app alive, but I'm curious why this works on my local machine and not on AppHarbor.
The guys behind AppHarbor pay for EC2 instances for all running apps, so they naturally want to limit CPU usage as much as possible. One way to achieve this is to shut down unused applications very quickly and only restart them when someone actually tries to access them. Paid hosting should not be limited in this way.
(As far as I have been informed, they are able to host around 100k sites on fewer than twenty medium instances, which is certainly impressive and calls for very economical use of resources.)
To overcome the limitation you would need a cron job to ping your AppHarbor site. But this is of course a somewhat recursive problem, since you need AppHarbor to act as the cron job ;)
AppHarbor recycles the application pool frequently to keep sleeping websites from using idle CPU time. This is simply the price you pay for using a shared website hosting plan.
If you really want to run a background job then you should be using AppHarbor's background workers, since this is exactly the type of task they were built to run.
http://support.appharbor.com/kb/getting-started/background-workers
Simply build a new console application that runs your logic and include it in your solution. When you push the code, the workers will be started automatically. If you happen to already have other exes in your solution, make sure to edit the app.config and set the 'deploy background worker' value to false.
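A minimal worker of that shape might look something like the following; the loop body and the one-minute pause are placeholders for whatever your job actually needs:

```csharp
// Minimal console worker: per the AppHarbor docs above, console projects in
// the solution are picked up and run as background workers.
using System;
using System.Threading;

class Worker
{
    static void Main()
    {
        Console.WriteLine("Worker started at {0:u}", DateTime.UtcNow);

        while (true)
        {
            try
            {
                DoWork();   // placeholder for your actual job logic
            }
            catch (Exception ex)
            {
                Console.WriteLine("Work item failed: {0}", ex.Message);
            }
            Thread.Sleep(TimeSpan.FromMinutes(1));   // pause between runs
        }
    }

    static void DoWork()
    {
        // hypothetical work, e.g. calling the web services mentioned above
    }
}
```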
I manage an ASP.NET intranet site that uses Integrated Windows authentication. It is deployed in many different countries for the staff in those countries. This week I have deployed to three countries that have slow connections to the rest of the internet. That said, the local network here is quite quick.
The site is relatively quick when a user continually makes requests (clicks) to the server. But if the user is inactive for one or two minutes, the next request is very slow. After that, requests are quick again until there is another minute-long pause between requests.
What could cause this?
Also, I was wondering whether the Integrated Windows domain login could affect the initial load time of pages. Does it query the Active Directory server or something?