How to decrease the POST and GET queue at high-load times? - wordpress

I have created a WordPress website to collect students' attendance. To do so, I've installed a plugin, and all the data are sent to a Google Sheet.
The problem is that when all the students log in and try to submit their attendance, around a hundred users hit the site at the same moment, which puts a very high load on my website; most of them get a 503 error, or sometimes a 500.
To solve this problem, a few solutions come to mind:
Of course, I could upgrade my server and hardware resources. However, I'm on shared hosting and that would be very costly.
I installed another plugin and tried to handle the situation with two separate plugins on one page. However, as far as I know, they both go through the same GET and POST handling in WordPress core, so it doesn't matter if I use different plugins simultaneously; the requests still have to wait. Am I right?
I created two mirror pages for my attendance form and direct users to one of them at random, hoping this reduces the page load. However, for form submission the scenario is still the same, since the forms on the different pages also go through the same POST and GET handling.
Please give me some advice if there is any other solution. For now, I have just embedded a Google Form as an alternative. However, I suspect there may be a way to handle this inside the site without relying on an external form provider.
Here is the site: Attendance website

Can you use a 'disconnected' architecture?
Ideally, all the attendance submissions would be sent to a high-performance queue, and your app would then read from it at its own pace.
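One way to sketch that disconnected approach inside WordPress itself (a rough illustration, not the plugin's actual mechanism; the queue table and the push_queue_to_google_sheet() helper are hypothetical): the submission handler only does a single local INSERT, and a WP-Cron task drains the queue to Google Sheets in the background.

    <?php
    // Fast submission handler: one INSERT into a local queue table, no
    // external API call while the student is waiting. Table name and the
    // push_queue_to_google_sheet() helper are hypothetical.
    add_action( 'admin_post_nopriv_submit_attendance', 'queue_attendance_submission' );
    add_action( 'admin_post_submit_attendance', 'queue_attendance_submission' );

    function queue_attendance_submission() {
        global $wpdb;

        $wpdb->insert(
            $wpdb->prefix . 'attendance_queue',                    // hypothetical custom table
            array(
                'student_id' => sanitize_text_field( $_POST['student_id'] ?? '' ),
                'created_at' => current_time( 'mysql' ),
            )
        );

        wp_safe_redirect( home_url( '/attendance-received/' ) );   // instant confirmation page
        exit;
    }

    // Drain the queue in the background with WP-Cron and push batches to Google Sheets.
    add_action( 'drain_attendance_queue', 'drain_attendance_queue' );

    if ( ! wp_next_scheduled( 'drain_attendance_queue' ) ) {
        // 'hourly' is built in; a shorter interval can be added via the 'cron_schedules' filter.
        wp_schedule_event( time(), 'hourly', 'drain_attendance_queue' );
    }

    function drain_attendance_queue() {
        global $wpdb;

        $table = $wpdb->prefix . 'attendance_queue';
        $rows  = $wpdb->get_results( "SELECT * FROM {$table} ORDER BY id LIMIT 100" );

        if ( $rows && push_queue_to_google_sheet( $rows ) ) {      // hypothetical helper around the Sheets API
            $ids = array_map( 'intval', wp_list_pluck( $rows, 'id' ) );
            $wpdb->query( "DELETE FROM {$table} WHERE id IN (" . implode( ',', $ids ) . ')' );
        }
    }

The attendance form would post to admin-post.php with a hidden action=submit_attendance field; students get an immediate confirmation, and the slow Google Sheets call happens later, off the request path.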

Related

Can WordPress handle more than 100 pages?

I'm thinking of hosting over 100 pages on WordPress and I'm worried about performance. It's very easy to create a website with WordPress, but can it handle that many pages? What do you think?
It's true WordPress doesn't handle Pages as well as Posts, but with 100 you should be fine.
This is mostly because Pages use a different mechanism for handling URLs and are hierarchical; as reported here, that is enough to make a measurable difference in performance.
WordPress has documentation about performance, but it doesn't state exactly at what point things start to lag, because that depends on the hardware your website runs on.
If by pages you mean posts, I recently worked with a site that had over 21k posts and there were no problems on that end.
Hierarchical post types (like Pages) can cause memory issues; here is the relevant Trac ticket. See also this blog post.
WordPress can handle any number of pages. On each request, only one page is served in response, not all of them, so the total page count by itself is not a performance concern. But if a single page does complex work and one user (or many users) hits that page again and again, throughput will drop, and only for that page, not for the whole site.
Don't judge website performance by the number of pages.
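If the content in question is really a large, flat collection rather than a true page hierarchy, one workaround for the memory issue mentioned above is a non-hierarchical custom post type. A minimal sketch, assuming a small custom plugin or the theme's functions.php; the post type name and slug are just examples:

    <?php
    // Register a flat (non-hierarchical) custom post type instead of creating
    // hundreds of Pages; it scales like Posts rather than like Pages.
    add_action( 'init', function () {
        register_post_type( 'doc_page', array(
            'label'        => 'Docs',
            'public'       => true,
            'hierarchical' => false,                // flat, like Posts
            'has_archive'  => true,
            'supports'     => array( 'title', 'editor', 'thumbnail' ),
            'rewrite'      => array( 'slug' => 'docs' ),
        ) );
    } );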

How to know the number of users of a particular WordPress plugin

By checking the WordPress plugin stats, we can see how many times a plugin has been downloaded. But that isn't the number of users of that plugin, right? The same user will download the plugin again whenever a new version is released.
So are there any tools or stats that give the total number of unique users of a WordPress plugin?
I did some research on the matter, and the answer is no.
Quotes from Otto's comments in this 2010 article about the stats chart on every plugin's page.
[...] the download count includes direct downloads as well
There is no “raw count” anywhere on that version number chart. The raw count is not data that will be made available.
For your own plugin, you can use tracking, as @PeterVanDerDoes points out.
Curiously, the plugin I used as an example in my research, WordPress SEO by Yoast, is itself one that does this kind of tracking. And here's a nice discussion about it.
I'll reproduce the relevant part of the official plugin development guidelines:
7. No "phoning home" without user's informed consent. This seemingly simple rule actually covers several different aspects:
No unauthorized collection of user data. For example, sending the admin's email address back to your own servers without permission of the user is not allowed; but asking the user for an email address and collecting if they choose to submit it is fine. All actions taken in this respect MUST be of the user's doing, not automatically done by the plugin.
All images and scripts shown should be part of the plugin. These should be loaded locally. If the plugin does require that data is loaded from an external site (such as blocklists) this should be made clear in the plugin's admin screens or description. The point is that the user must be informed of what information is being sent where.
In general, things like banner or text link advertising should not be anywhere in a plugin, including on its settings screen. Advertising on settings screens is generally ineffective anyway, as ideally users rarely visit these screens, and the advertising is low quality because the advertising systems cannot see the page content to determine good ads. So they're best just left off entirely. Putting links back to your own site or to your social-network of choice is fine. If the plugin does include advertising from a third party service, then it must default to completely disabled, in order to prevent tracking information from being collected from the user without their consent. This is the method commonly known as "opt-in".
Note that if you do include what we consider to be "advertising spam", or attempt to game somebody else's advertising system, then we will not only remove your plugin, but also report your code to the advertising system's abuse mechanism as well. We do not react kindly to spam. Don't try it.
The only way I can think of to track something like this is to have the plugin phone home with some stats to your own server.
Just make sure users can choose to opt out of the tracking.
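For completeness, here is a minimal opt-in sketch of such a ping. Only the WordPress scheduling, options, and HTTP APIs used here are real; the option name, hook name, and stats endpoint are hypothetical, and the 'weekly' interval requires WordPress 5.4 or later:

    <?php
    // Opt-in "phone home" sketch: nothing is sent unless the admin has enabled
    // the (hypothetical) 'myplugin_allow_tracking' option.
    register_activation_hook( __FILE__, function () {
        if ( ! wp_next_scheduled( 'myplugin_weekly_ping' ) ) {
            wp_schedule_event( time(), 'weekly', 'myplugin_weekly_ping' );
        }
    } );

    register_deactivation_hook( __FILE__, function () {
        wp_clear_scheduled_hook( 'myplugin_weekly_ping' );
    } );

    add_action( 'myplugin_weekly_ping', 'myplugin_send_usage_ping' );

    function myplugin_send_usage_ping() {
        if ( ! get_option( 'myplugin_allow_tracking' ) ) {          // user has not opted in: do nothing
            return;
        }

        wp_remote_post( 'https://stats.example.com/ping', array(   // placeholder endpoint
            'timeout'  => 5,
            'blocking' => false,                                    // fire and forget
            'body'     => array(
                'site_hash'      => md5( home_url() ),              // anonymised site identifier
                'plugin_version' => '1.2.3',
            ),
        ) );
    }

Counting distinct site_hash values on the receiving end then gives a rough, privacy-friendly estimate of active installs.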

Sharing a page across websites in ASP.NET

Here at our company we are trying to figure out how to create one single page and share it across domains in ASP.NET.
We would like to create a simple "cart" page that is the same for all of our clients websites, so that we can include the page from a central location (such as http://ourwebsite.com/thecart.aspx) without duplicating code, and still be able to apply the CSS styles and branding for each client to the page.
How can we share a single page across websites in ASP.NET?
Each of our clients' websites is on a different domain, and in some cases they may be on different servers as well.
I think what you want is to manage one page and have it automatically update on each of your clients' sites, essentially "sharing a resource", no? For that you don't necessarily need to share a page so much as you need an easy process for deploying that single page to multiple sites. In any case, short of loading the page via an iframe, or creating a central spot like "cart.somedomain.net" and pushing the information back and forth (I assume you'll have shopping cart items), you'd need a way to automate publishing the page to the different sites.
Even if you were to make the "cart" page its own solution and then just include it in the individual sites, you'd still have the deployment issue. I think you have a few options, some of them previously mentioned:
Create an iframe that loads the page from an external source.
Create a central location for all the domains to push information to for their checkout process (store.somedomain.net or somedomain.net/cart.aspx) and handle it accordingly (a rough sketch of this option follows below).
Create an application or script that automates the deployment of the updated resource to multiple sites (I don't know of a tool that does this or I would offer up the name to you, I apologize).
Anyway, I hope that helps, best of luck.
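The "central cart location" option from the list above is language-agnostic. The question is about ASP.NET, but the shape of the idea fits in a few lines of PHP; the client registry, query parameter, and stylesheet paths below are all hypothetical:

    <?php
    // thecart.php: a sketch of the "central cart location" option. Each client
    // site links or iframes https://ourwebsite.com/thecart.php?client=acme and
    // the page pulls in that client's stylesheet so the branding matches.
    $clients = array(
        'acme'   => array( 'name' => 'Acme Inc.', 'css' => '/branding/acme.css' ),
        'globex' => array( 'name' => 'Globex',    'css' => '/branding/globex.css' ),
    );

    $key    = isset( $_GET['client'] ) ? $_GET['client'] : '';
    $client = isset( $clients[ $key ] )
        ? $clients[ $key ]
        : array( 'name' => 'Our Store', 'css' => '/branding/default.css' );
    ?>
    <!DOCTYPE html>
    <html>
    <head>
        <title><?php echo htmlspecialchars( $client['name'] ); ?> - Cart</title>
        <link rel="stylesheet" href="<?php echo htmlspecialchars( $client['css'] ); ?>">
    </head>
    <body>
        <!-- one cart implementation, shared by every client, branded per request -->
        <h1><?php echo htmlspecialchars( $client['name'] ); ?> shopping cart</h1>
    </body>
    </html>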
Inherit from and create the page as just another server control in a custom library. You'll of course have it in source control.
In fact, it doesn't need to be a "page" at all, but rather a custom server shopping cart control.

Automatically send a newsletter every time a page is updated

I have this page:
http://www.thedome.it/cmsms/index.php?page=alla-spina
I'd like to automatically send a newsletter to subscribers every time this page is updated. I don't have admin privileges for this CMS, so I can't install modules; I was thinking of a service such as FeedBurner, if only this page had its own RSS feed (but AFAIK it doesn't, right?).
Do you know of any service/software that will allow me to solve this particular problem?
Thanks.
To do this without access to the server, you will need another server with cron jobs enabled.
If I were you, I would create a simple scraping script that also checks whether the content of the page has changed. Using the print version of the page may make it much easier to process. The last step is to set up a cron job that runs the script at the desired interval.
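A minimal version of that script in PHP; the recipient address is a placeholder and the state file location is arbitrary:

    <?php
    // check-page.php: run from cron, e.g.
    //   */30 * * * * php /path/to/check-page.php
    // Fetch the page, hash its text content, compare with the previously stored
    // hash, and send a notification when it changes.

    $url       = 'http://www.thedome.it/cmsms/index.php?page=alla-spina';
    $stateFile = __DIR__ . '/last-hash.txt';

    $html = file_get_contents( $url );
    if ( $html === false ) {
        exit( 1 );                                   // fetch failed; try again next run
    }

    // Strip tags and collapse whitespace so cosmetic markup changes don't trigger a send.
    $hash     = md5( preg_replace( '/\s+/', ' ', strip_tags( $html ) ) );
    $previous = is_readable( $stateFile ) ? trim( file_get_contents( $stateFile ) ) : '';

    if ( $hash !== $previous ) {
        file_put_contents( $stateFile, $hash );
        if ( $previous !== '' ) {                    // skip the very first run
            mail(
                'subscribers@example.com',           // placeholder recipient / list address
                'Page updated: alla-spina',
                "The page has changed:\n" . $url
            );
        }
    }

Run it from cron at whatever interval suits the newsletter, e.g. every 30 minutes as in the comment above.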

What does it mean when I see some IPs look at hundreds of pages on my website?

What should I do when I see an IP in my logs scrolling through hundreds of pages on my site? I have a WordPress blog, and it seems like this isn't a real person. This happens almost daily, with different IPs.
UPDATE: Oh, I forgot to mention, I'm pretty sure it's not a search engine spider. The hostname doesn't belong to a search engine; it's some random host in India (it ends in '.in').
What I'm concerned about is: if it is a scraper, is there anything I can do? Or could it possibly be something worse than a scraper, e.g. a hacker?
It's a spider/crawler. Search engines use these to compile their listings, researchers use them to figure out the structure of the internet, the Internet Archive uses them to download the contents of the Internet for future generations, spammers use them to search for e-mail addresses, and many more such situations.
Checking out the user agent string in your logs may give you more information on what they're doing. Well-behaved bots will generally indicate who/what they are - Google's search bots, for example, are called Googlebot.
If you're concerned about script kiddies, I suggest checking your error logs. The scripts often look for things you may not have; e.g. on one system I run, I don't have ASP, but I can tell when a script kiddie has probed the site because I see lots of attempts to find ASP pages in my error logs.
Probably some script kiddie looking to take advantage of an exploit in your blog (or server). That, or some web crawler.
It's probably a spider bot indexing your site; the "User-Agent" might give it away. It's easy for a dynamically generated WordPress site to rack up hundreds of GET requests if the count isn't just blog pages but also includes things like CSS, JS, and images.
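If you want to dig into one of those IPs, here is a rough command-line sketch in PHP, assuming the standard "combined" access-log format and read access to the log file:

    <?php
    // summarise-ip.php: run from the command line, e.g.
    //   php summarise-ip.php /var/log/apache2/access.log 203.0.113.7
    // Counts requests from one IP and tallies the user-agent strings it sent.

    list( , $logFile, $ip ) = $argv + array( null, 'access.log', '' );

    $total  = 0;
    $agents = array();

    $handle = fopen( $logFile, 'r' );
    while ( ( $line = fgets( $handle ) ) !== false ) {
        if ( strpos( $line, $ip . ' ' ) !== 0 ) {
            continue;                                 // request came from a different client IP
        }
        $total++;

        // In the combined format the user agent is the last quoted field on the line.
        if ( preg_match( '/"([^"]*)"\s*$/', trim( $line ), $m ) ) {
            $ua            = $m[1];
            $agents[ $ua ] = isset( $agents[ $ua ] ) ? $agents[ $ua ] + 1 : 1;
        }
    }
    fclose( $handle );

    arsort( $agents );
    echo "{$total} requests from {$ip}\n";
    foreach ( $agents as $ua => $count ) {
        echo str_pad( $count, 6 ) . $ua . "\n";
    }

Seeing a single bot user agent (or an empty one) across hundreds of hits usually settles the crawler-versus-person question quickly.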
