Bulk Import in WordPress on a shared hosting environment

Can anyone share a strategy for performing a bulk import of posts into WordPress that may take around 10-15 minutes? I have tried doing the inserts in a loop, but it gets interrupted halfway through with a 500 Internal Server Error.
I have tried the same script on our local host and it works fine.
The problem is the shared hosting, where the provider has put limits on resources.
I am looking for a strategy that does batch processing from a single click, where each batch is processed after a specified time gap so that it does not put continuous load on the server's resources. If you have a better idea, please share.
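One way to get that time gap without keeping a single request open is to let WP-Cron process the import in small batches. Below is a minimal sketch of that idea; the hook name, batch size and gap are placeholders to tune for your host, and my_bulk_import_get_rows( $offset, $limit ) is a hypothetical helper that returns the raw post data.

    <?php
    /*
     * Minimal sketch: batch the import through WP-Cron so no single request
     * runs long enough to hit shared-hosting limits.
     * Assumption: my_bulk_import_get_rows( $offset, $limit ) is a hypothetical
     * helper returning rows like array( 'title' => ..., 'content' => ... ).
     */

    define( 'MY_IMPORT_BATCH_SIZE', 25 );  // posts per batch
    define( 'MY_IMPORT_BATCH_GAP', 60 );   // seconds between batches

    // 1. Kick this off from your "single click" (e.g. an admin button handler).
    function my_bulk_import_start() {
        update_option( 'my_import_offset', 0 );
        wp_schedule_single_event( time(), 'my_bulk_import_batch' );
    }

    // 2. Each cron run imports one batch, then schedules the next one.
    add_action( 'my_bulk_import_batch', 'my_bulk_import_run_batch' );
    function my_bulk_import_run_batch() {
        $offset = (int) get_option( 'my_import_offset', 0 );
        $rows   = my_bulk_import_get_rows( $offset, MY_IMPORT_BATCH_SIZE );

        foreach ( $rows as $row ) {
            wp_insert_post( array(
                'post_title'   => $row['title'],
                'post_content' => $row['content'],
                'post_status'  => 'publish',
            ) );
        }

        update_option( 'my_import_offset', $offset + count( $rows ) );

        // More rows left? Schedule the next batch after the time gap.
        if ( count( $rows ) === MY_IMPORT_BATCH_SIZE ) {
            wp_schedule_single_event( time() + MY_IMPORT_BATCH_GAP, 'my_bulk_import_batch' );
        }
    }

Keep in mind that WP-Cron only fires when the site gets traffic, so on a quiet site you may want a real cron job hitting wp-cron.php to keep the batches moving.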

Related

Is it possible to fetch and use a file from cloud storage when deploying a cloud function

I have a firebase function that makes use of a SQLite database (read-only) which is currently uploaded along with the function.
The problem is that the db file is quite large and gets uploaded every time the function is changed. Is there a way to fetch this file from cloud storage during the installation process (during firebase deploy) - without hard-coding the URL in the source files?
What you're trying to do is problematic because your code running in Cloud Functions may actually be running on any number of server instances, determined by the load on your project. As such, downloading a file once at deployment time isn't naturally going to reach all the instances that may be created or destroyed at any given moment.
It's far better to keep doing what you're doing, and include your extra data during deployment. When a new instance is spun up to handle events for your function, the file will be immediately ready to help service requests.

Updating code on production server when using Go

When I develop and update files on a production server with PHP, I just copy the files on the fly and everything seems to work without interrupting the server.
But to update the code of a Go server and application I would need to kill the server, copy the src files over, run go install, and then start the server again. This would interrupt the service, and if I do it quite often it is going to look very bad to the users of the service.
How can I update files without downtime when using Go with Go's http server?
PHP is an interpreted language, which means you provide your code in source format and the PHP interpreter will read it and execute it (it may create a more compact binary form so that it doesn't have to analyze the source again when needed).
Go is a compiled language; it compiles into a native executable binary. Going further, that binary is statically linked, which means all the code and libraries your app refers to are compiled and linked in when the executable is created. This implies you can't just "drop in" new Go modules into a running application.
You have to stop your running application and start the new version. You can however minimize the downtime: only stop the running application when the new version of the executable is already created and ready to be run. You may choose to compile it on a remote machine and upload the binary to the server, or upload the source and compile it on the server, it doesn't matter.
With this you can decrease the downtime to a maximum of a few seconds, which your users won't notice. Also, you shouldn't update every hour; you can't really achieve significant updates in just an hour of coding. You could schedule updates daily (or even less frequently), and you could schedule them for hours when your traffic is low.
If even a few seconds of downtime is not acceptable to you, then you should look for platforms that handle this for you automatically without any downtime. Check out Google App Engine - Go for example.
The grace library will allow you to do graceful restarts without annoyance for your users: https://github.com/facebookgo/grace
Yet in my experience restarting Go applications is so quick that, unless you have a high-traffic website, it won't cause any trouble.
First of all, don't do it in that order. Copy and install first. Then you could stop the old process and run the new one.
If you run multiple instances of your app, you can do a rolling update, so that while you bounce one server the others are still serving. A similar approach is a blue-green deployment, which has the advantage that the code your active cluster is running is always homogeneous (whereas during a rolling deploy you'll have a mixture until all instances have rolled), and it also works when you normally run only one instance of your app (whereas rolling requires more than one). It does, however, require you to have double the instances during the blue-green switch.
One thing you'll want to take into consideration is in-flight requests: you may want to make sure that requests already in flight continue to go to old-code servers until they're finished.
You can also look into Platform-as-a-Service solutions, that can automate a lot of this stuff for you, plus a whole lot more. That way you're not ssh'ing into production servers and copying files around manually. The 12 Factor App principles are always a good place to start when thinking about ops.

Updating a Classic ASP website without interrupting service

A couple of questions:
1) How can I update a Classic ASP website/page without interrupting service (users getting an error or service unavailable message) or shutting the website down temporarily?
2) When updating/restoring a MSSQL DB via SQL Server Management Studio, will the website users get an error message?
Thanks in advance.
A smart practice is to use at least one separate development environment with the same setup as your production environment and to debug all changes there to ensure that they work. Once your entire site is running and tested on that identical environment, you should be able to simply move the files to production and they should work. This model is only effective if you can actually keep the environments as close to identical as possible.
When updating/restoring a MSSQL DB
Be careful with your terminology; UPDATE and RESTORE are two very different commands.
If the database is locked by the changes being made, then it will be inaccessible to users and may cause error messages depending on your IIS and code setup. Scheduling a maintenance period and blocking user access to any pages that access the database will help avoid messy errors and revealing any information about your infrastructure while the changes are being made.
It seems like you might want to do some basic research on development and databases both in order to make sure you understand what you're doing and can cover all of your bases. Looking up commands like RESTORE and UPDATE and using them correctly is crucial.
For example, when you rewrite one or more of your website files via FTP, in the very moment the rewriting is taking place, users will get a 500 Service Unavailable error. How can I avoid this?
This really shouldn't happen, although you could upload the files to a different folder, avoiding any delay there, and sync the files with a diff tool such as Winmerge (also helping you keep track of changes and revert quickly) when done uploading.

Transferring files from old dedicated server to a new one

Using Classic ASP (stop tutting), I need to build an application that transfers high resolution photos from one server to another, around 360,000 including the thumbnails to be exact. The application will be called via a Windows schedule and will run as a background process.
What is the best way to achieve this, keeping performance in mind? The last time I built a monster script like this was transferring and converting database tables with over one million rows; the application started really fast, but after 25,000 records it became really, really slow! So I want to avoid that.
Obviously it will be a cross-domain transfer, so I was thinking about using an ASP/FTP component and, one by one, grabbing a file, sending it, and recording its success in a DB table so the script knows what it has done so far.
Is it best to process one file at a time and refresh, so it doesn't abuse the server's resources, or should I process 1000 at a time, or more? I want it to be as quick as possible but without clogging up the server.
Any help/suggestions would be gratefully received.
I think it is best to do one file at a time, because if the connection goes down for a brief period you don't lose the files that you have already sent.
Even though you are using Classic ASP, you can take advantage of .NET for uploading the files, using the FTP client classes in .NET, and avoid purchasing/installing a third-party component. Surely .NET is already installed on the server.
My process will look like this:
Upload 1 file using FTP (better performance)
If successful call an ASP page that records the action in the remote DB
Wait a second and retry up to 3 times if error uploading
Proceed to next file
If the process is clogging the server, you can put a brief pause between each upload.
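A rough sketch of that loop, shown with PHP's built-in FTP functions purely to illustrate the retry-and-record pattern (the answer suggests the .NET FTP classes instead). The host, credentials, pdo_sqlsrv connection string and transfer_log table are made-up placeholders, and it logs straight to the DB rather than calling a separate ASP page, just to keep the sketch self-contained.

    <?php
    // Per-file transfer with up to 3 attempts and a record of each success.
    // Host, credentials, DSN and the transfer_log table are assumptions.

    $files = glob( '/photos/full/*.jpg' );            // files still to send

    $ftp = ftp_connect( 'ftp.example.com' );          // placeholder host
    ftp_login( $ftp, 'user', 'pass' );
    ftp_pasv( $ftp, true );

    $db  = new PDO( 'sqlsrv:Server=localhost;Database=photos', 'user', 'pass' );
    $log = $db->prepare( 'INSERT INTO transfer_log (filename, sent_at) VALUES (?, GETDATE())' );

    foreach ( $files as $file ) {
        $ok = false;
        for ( $attempt = 1; $attempt <= 3 && ! $ok; $attempt++ ) {
            $ok = ftp_put( $ftp, basename( $file ), $file, FTP_BINARY );
            if ( ! $ok ) {
                sleep( 1 );                           // wait a second before retrying
            }
        }
        if ( $ok ) {
            $log->execute( array( basename( $file ) ) );
        }
        usleep( 200000 );                             // brief pause so the server isn't clogged
    }

    ftp_close( $ftp );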
I have something like that running in Classic ASP; it handles tens of thousands of images without problems.
On the server that houses the images I run a (VBS) script that, for each image:
Makes a text-file with metadata
Makes a thumbnail and a mid-sized image copy on the second (web)server
The script runs continuously and simply checks, per folder and file, whether the files are present on the web server, and creates them if not. No need for a DB.
Between every check it sleeps for a second. That way the load on the server is only 2%. I use iPhoto in command-line mode to extract the metadata and images, but you could use a library for that.
So these three files are stored on the web server in a copy of the folder structure from the first server, but without the full-sized images.
On the webserver you only need to be able to browse the thumbnails and visualize the metadata and mid-size images.
If the user needs the full-size image, he clicks the mid-sized one, whose URL points at the file on the first server.
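For what it's worth, here is the same check-and-create loop sketched in PHP with GD rather than VBScript, just to make the idea concrete; the paths, sizes and the metadata format are invented, and $dst stands for the share on the web server.

    <?php
    // One pass of the idempotent sync loop: create the thumbnail, mid-size copy
    // and metadata text file on the web server only if they are missing, and
    // sleep between checks so the load stays low. Paths and sizes are placeholders.

    $src = '/photos/full';
    $dst = '/webserver/share/photos';   // copy of the folder structure, minus full-size images

    foreach ( glob( $src . '/*.jpg' ) as $file ) {
        $name  = basename( $file, '.jpg' );
        $thumb = "$dst/$name.thumb.jpg";
        $mid   = "$dst/$name.mid.jpg";
        $meta  = "$dst/$name.txt";

        if ( ! file_exists( $thumb ) || ! file_exists( $mid ) || ! file_exists( $meta ) ) {
            $img = imagecreatefromjpeg( $file );

            imagejpeg( imagescale( $img, 150 ), $thumb, 80 );    // thumbnail
            imagejpeg( imagescale( $img, 1024 ), $mid, 85 );     // mid-size copy
            imagedestroy( $img );

            // Minimal metadata text file (the real script extracts much more).
            list( $width, $height ) = getimagesize( $file );
            file_put_contents( $meta, "$name\t{$width}x{$height}\t" . filesize( $file ) );
        }

        sleep( 1 );   // keep the load on the server low between checks
    }

Run under a scheduler, or wrapped in an outer loop, this behaves like the continuously running script described above.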
Upload all the files via FTP
Create a CSV file with all your data
Pull it into the DB in one go
The amount of network handshake over 360,000 individual transactions would be the bottleneck.
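To make the "one CSV, one load" part concrete: the transfer script appends one line per file with fputcsv() and the whole file is then loaded in a single statement. The sketch below uses MySQL's LOAD DATA LOCAL INFILE purely for illustration (on SQL Server the equivalent would be BULK INSERT), and the table and column names are invented.

    <?php
    // Log every transferred file to a local CSV instead of one DB hit per file,
    // then load the whole CSV in one statement. Table/column names and the
    // choice of MySQL are assumptions made for the sake of the example.

    $transferred = glob( '/photos/full/*.jpg' );      // stand-in for the list of sent files

    $csvPath = '/tmp/transferred.csv';
    $csv = fopen( $csvPath, 'w' );
    foreach ( $transferred as $file ) {
        fputcsv( $csv, array( basename( $file ), filesize( $file ), date( 'c' ) ) );
    }
    fclose( $csv );

    // One round trip instead of 360,000 individual INSERTs.
    $db = new PDO( 'mysql:host=localhost;dbname=photos', 'user', 'pass',
                   array( PDO::MYSQL_ATTR_LOCAL_INFILE => true ) );
    $db->exec( "LOAD DATA LOCAL INFILE '/tmp/transferred.csv'
                INTO TABLE transfer_log
                FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
                (filename, bytes, sent_at)" );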

Is there a good way to create a recurring import for an ASP.NET site?

The site I'm working on is running Windows Server 2003 and SQL Server 8 (2000?), and ASP.NET 3.5.
I need to have some sort of script or application run to import data from an FTP'd text file, into the database. There is already a site running on the machine, that uses the current database. Can I use a scheduled task to reliably kick off some sort of .aspx page that will import the data? Or is there a better approach?
What about making sure that no one else can access the page that runs the import? I don't want random users running the import!
Thanks in advance!
P.S. Some processing needs to occur on the data before it's inserted (lookups, conditionals, etc.), so the DB tools aren't robust enough (I think). I hate DTS, and SSIS is not available in this version, I think.
If you want to have a C# app handle your import, I would suggest a Windows application (exe) without a form (better than a console app because it does not pop up any UI whenever it runs). Have it run every so often (e.g. every minute) via a scheduled task.
Why would you use ASP.NET? Depending on the complexity of the job you could either load it directly to the database (BULK LOAD) or use DTS (SQL Server 2000) or SSIS (SQL Server 2005/2008) if more complex processing is needed.
DTS and stored procedures in a job.
BCP and stored procedures in a job.
You say you need to do a lot of lookups and conversions? SQL is good at that - and good at doing it fast. It can seem a little intimidating at first, but it's not hard.
Run a BULK INSERT or bcp to import the data instead; see here: http://msdn.microsoft.com/en-us/library/aa173839(SQL.80).aspx
I'll echo other people here - you don't want to have a scheduled task hit a web page. SQL Server provides some good data import options, or you could just write a simple windows program and run it as a scheduled task.
Another option would be to write a windows service that watches your FTP directory and does the import.
As others have said, probably a separate console application (triggered by a scheduled task) or a windows service would be the best option for this scenario.
On the other hand, if you already have all the required functionality available in the web app running on the server, then you could probably set up a scheduled task that starts a script (VBScript, JScript), which in turn calls a page of the web app.
To have some sort of security (e.g. to prevent any user from calling that page), you could add some code to the page that checks whether it was called via http://localhost. This would at least prevent the page from being called from a remote client.
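The guard itself is only a few lines in any stack. It is sketched in PHP below purely to show the shape of the check (the web app in this question is ASP.NET, which exposes the same information through its request object); comparing the client address against the loopback addresses is slightly more robust than checking for a "localhost" host name, since the Host header can be set by the client.

    <?php
    // Allow the import page to be run only from the machine itself
    // (IPv4 and IPv6 loopback). Illustration only; not ASP.NET code.

    $remote = isset( $_SERVER['REMOTE_ADDR'] ) ? $_SERVER['REMOTE_ADDR'] : '';

    if ( ! in_array( $remote, array( '127.0.0.1', '::1' ), true ) ) {
        header( 'HTTP/1.1 403 Forbidden' );
        exit( 'This page can only be run from the server itself.' );
    }

    // ...import logic continues here for local callers...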
