Waiting for available socket... (Chrome + WordPress) [duplicate]

I am accessing my PHP/Apache website from Chrome, and Chrome refuses to load the page, saying "Waiting for available socket".
Other pages on this same virtual host also have the problem.
Other virtual hosts on the same server load perfectly fine.
Please advise how I can fix this.

According to various sources, Chrome opens a maximum of 6 simultaneous connections to the same host, and further requests wait until one of those sockets frees up (a small test server for observing this limit is sketched after the list). Possible solutions include:
Spread content across different hostnames (virtual hosts) on the same machine, so the per-host limit applies to each hostname separately.
Consolidate content so that fewer connections are needed (for example, pack everything into a single .data file for Emscripten).
Make sure you don't have too many <video> or <audio> tags with the attribute preload="auto", since each one can hold a socket open.
Adjust Chrome's enterprise policy to permit up to 32 connections per host (see the Chrome Policy List).
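Here is that test-server sketch (assuming Python 3.7+ for ThreadingHTTPServer; the port is an arbitrary choice). It holds every response open for a few seconds and logs how many requests are in flight, so you can watch the per-host limit fill up:

    import threading
    import time
    from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

    active = 0
    lock = threading.Lock()

    class SlowHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            global active
            with lock:
                active += 1
                print(f"requests in flight: {active}")
            time.sleep(5)  # hold the connection so sockets pile up
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"ok\n")
            with lock:
                active -= 1

    ThreadingHTTPServer(("localhost", 8000), SlowHandler).serve_forever()

Open a dozen tabs pointing at http://localhost:8000 (vary a query string, e.g. /?1, /?2, so the requests are distinct): the in-flight count should plateau at the browser's per-host limit while the remaining tabs sit in "Waiting for available socket".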
Related question:
Chrome hangs after certain amount of data transferred - waiting for available socket

Extremely slow initial connection to nginx on Vagrant?

I am running an Ubuntu VM via Vagrant on a Windows 10 host. On the Vagrant machine I am running a fairly standard PHP/nginx app.
Whenever I try to access the web app, it takes forever to load. Chrome network inspector shows this:
(screenshot: Chrome network timeline, with nearly all the time spent in "Initial connection")
This huge latency is completely gone on subsequent requests, but whenever I pop back into the browser and try again after a while, it crops up yet again.
I am using NFS.
I have disabled firewalls on both guest and host machines.
I increased keepalive_timeout in nginx, which helped hide the problem by widening the window in which subsequent requests stay latency-free.
This latency occurs even when accessing static files, so I don't think it's a PHP-FPM/MySQL problem.
I successfully figured out what my problem was!
After looking at my Windows hosts file, it turned out my vagrant-hostmanager plugin had not been clearing out older IP entries (i.e. I had three separate IP entries for myapp.dev even though only one IP was active), probably because I'd forgotten to vagrant halt properly before shutting down my PC a few times.
Windows was clearly spending ages on the two stale entries before settling on the 'real' one.
It's weird: you'd think this problem would cause the latency to show up in the DNS Lookup portion of the Chrome network timeline rather than Initial connection. But a hosts-file lookup returns instantly; presumably the browser then tried to open TCP connections to the dead addresses and had to time out on each before reaching the live one, which is exactly what Initial connection measures.
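For anyone hitting the same thing, here is a quick sketch (Python; the hosts-file path below is the standard Windows location, so adjust it for other systems) that flags hostnames with more than one IP entry:

    from collections import defaultdict

    HOSTS = r"C:\Windows\System32\drivers\etc\hosts"  # standard Windows location

    entries = defaultdict(set)
    with open(HOSTS) as f:
        for line in f:
            line = line.split("#")[0].strip()  # drop comments and blank lines
            if not line:
                continue
            ip, *names = line.split()
            for name in names:
                entries[name].add(ip)

    for name, ips in sorted(entries.items()):
        if len(ips) > 1:
            print(f"{name} has {len(ips)} entries: {sorted(ips)}")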

Browser and server: visiting a web page vs saving it

When we enter a URL that points to a song, it starts playing (streams) in Chrome, but when we save that page it is saved as an .mp3, i.e. the song is downloaded. Is something different happening between the two cases that the browser handles, or are they actually the same underneath?
It is basically the same thing, if the transfer is over HTTP, except for some metadata that indicates the file type.
For a browser to show a file, it downloads it as a regular file and then interprets it, so displaying a page is basically a file download.
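You can inspect that metadata yourself. A minimal sketch (the URL is hypothetical; substitute any direct link to an mp3):

    import urllib.request

    # Hypothetical URL; substitute any direct link to an mp3.
    req = urllib.request.Request("https://example.com/song.mp3", method="HEAD")
    with urllib.request.urlopen(req) as resp:
        print(resp.headers.get("Content-Type"))         # e.g. audio/mpeg: the browser plays it inline
        print(resp.headers.get("Content-Disposition"))  # "attachment" would force a download instead

The body bytes are identical whether you stream or save the file; the browser just decides what to do with them based on headers like these.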
The transfer may use another protocol, like FTP (which is used just for files).
A protocol is a way of communicating between a client (a program on your computer) and a server (a program on another computer). Note that 'client' and 'server' are also sometimes used to describe the entire machine.
Web browsers are HTTP clients, so in order to use another protocol (like FTP, described above) you need to use a client for that protocol.

How can a single Mozilla process manage TCP ports...?

On my PC, if I open different tabs in IE or Chrome, different processes are created, so each process can listen on a separate TCP port and manage its session successfully.
But in the case of Mozilla, a single process is created in the operating system even when many tabs are open! Since Mozilla creates multiple threads for multiple tabs, it has only a single process ID.
When a packet arrives from a web server, my OS will hand it to a particular process using the TCP destination port number!
In the case of IE and Chrome there is no issue, as they have different processes for each tab.
But how is Mozilla able to manage different sessions with a single process?
In fact it's much simpler when you use threads: tabs are not standalone entities. Each page is rendered in the Firefox core, and what you see in the tabs is only a representation of the data, the view :)
It's possible to wait for responses on several connections within one process. A socket is identified by its (local IP, local port, remote IP, remote port) tuple, and each outgoing connection gets its own ephemeral local port, so the OS knows exactly which socket (and therefore which process) each incoming packet belongs to; no per-tab process is needed.
Handling multiple tabs with one or several processes is a design choice with pros and cons.
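A minimal sketch illustrating the point (example.com is just a placeholder; any reachable host works):

    import socket

    # One process, three TCP connections: each socket gets its own
    # ephemeral local port, and the OS routes every incoming packet by
    # the (local addr, local port, remote addr, remote port) tuple.
    conns = [socket.create_connection(("example.com", 80)) for _ in range(3)]
    for c in conns:
        print("local", c.getsockname(), "-> remote", c.getpeername())
    for c in conns:
        c.close()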

Simulating a remote website locally for testing

I am developing a browser extension. The extension works on external websites we have no control over.
I would like to be able to test the extension. One of the major problems I'm facing is displaying a website 'as-is' locally.
Is it possible to display a website 'as-is' locally?
I want to be able to serve the website exactly as-is locally for testing. This means I want to simulate the exact same HTTP data, including iframe ads, etc.
Is there an easy way to do this?
More info:
I'd like my system to act as closely to the remote website as possible. I'd like to run a command (fetch, for example) that would then let me go to the site in my browser (with the internet off) and get exactly the same thing I would otherwise (including content that doesn't come from a single domain: Google ads, etc.).
I don't mind using a virtual machine if this helps.
I figure this would be quite a useful thing for testing, especially when I have a bug I need to reliably reproduce on sites that have many random factors (which ads show, etc.).
As was already mentioned, caching proxies should do the trick for you (BTW, this is the simplest solution). There are quite a lot of different implementations, so you just need to spend some time selecting a proper one (in my experience Squid is a good choice). Anyway, I would like to highlight two other interesting options:
Option 1: Betamax
Betamax is a tool for mocking external HTTP resources such as web services and REST APIs in your tests. The project was inspired by the VCR library for Ruby. Betamax aims to solve these problems by intercepting HTTP connections initiated by your application and replaying previously recorded responses.
Betamax comes in two flavors. The first is an HTTP and HTTPS proxy that can intercept traffic made in any way that respects Java’s http.proxyHost and http.proxyPort system properties. The second is a simple wrapper for Apache HttpClient.
BTW, Betamax has a very interesting feature for you:
Betamax is a testing tool and not a spec-compliant HTTP proxy. It ignores any and all headers that would normally be used to prevent a proxy caching or storing HTTP traffic.
Option 2: Wireshark and replay proxy
Grab all the traffic you are interested in using Wireshark and replay it. Implementing the required replaying tool yourself would not be that hard, but you can also use an available solution called replayproxy, which:
parses HTTP streams from .pcap files,
opens a TCP socket on port 3128 and listens as an HTTP proxy, using the extracted HTTP responses as a cache while refusing all requests for unknown URLs.
Such an approach provides you with full control and a bit-for-bit precise simulation.
I don't know if there is an easy way, but there is a way.
You can set up a local webserver, something like IIS, Apache, or minihttpd.
Then you can grab the website contents using wget (it has an option for mirroring), and many browsers have a "save whole web page" option that will grab everything, like images.
Ads will most likely come from remote sites, so you may have to manually edit those lines in the HTML to either not reference the actual ad servers or to point at a mock ad you set up yourself (like a banner image).
Then you can navigate your browser to http://localhost to visit your local copy, assuming port 80, which is the default.
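As a concrete example, once the content is mirrored (say with wget --mirror --convert-links --page-requisites), a minimal sketch can serve it locally (Python 3.7+; the directory name and port are assumptions):

    from functools import partial
    from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

    # Serve the mirrored copy in ./mirror at http://localhost:8000
    # (binding port 80 would need elevated privileges on most systems).
    handler = partial(SimpleHTTPRequestHandler, directory="mirror")
    ThreadingHTTPServer(("localhost", 8000), handler).serve_forever()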
Hope this helps!
I assume you want to serve a remote site that's not under your control. In that case you can use a proxy server and have it cache every response aggressively. However, this has its limits: first, you will have to visit every page you intend to use through the proxy (with a browser, for example); second, you will not be able to emulate form processing.
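To make the idea concrete, here is a bare-bones record-and-replay proxy sketch (assumptions: plain HTTP and GET only, no HTTPS, and it deliberately ignores caching headers, much as Betamax does; a real proxy such as Squid handles far more):

    import hashlib
    import pathlib
    import urllib.request
    from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

    CACHE = pathlib.Path("proxy-cache")
    CACHE.mkdir(exist_ok=True)

    class CachingProxy(BaseHTTPRequestHandler):
        # A browser configured to use an HTTP proxy puts the absolute
        # URL in the request line, so self.path is the full URL here.
        def do_GET(self):
            key = CACHE / hashlib.sha256(self.path.encode()).hexdigest()
            if not key.exists():  # first visit: fetch upstream and record
                with urllib.request.urlopen(self.path) as upstream:
                    key.write_bytes(upstream.read())
            body = key.read_bytes()  # replay from disk from now on
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    ThreadingHTTPServer(("localhost", 3128), CachingProxy).serve_forever()

Point the browser's HTTP proxy setting at localhost:3128; any page visited once while online will afterwards load from the cache, even offline.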
Alternatively you could use a spider to download all the content of the website. Depending on the spider software, it may even be able to follow JavaScript-built links. You can then use a webserver to serve that content.
The service http://www.json-gen.com provides mocks for HTML, JSON and XML via REST. This way you can test your frontend separately from the backend.

HTTP requests / concurrency?

Say a website on my localhost takes about 3 seconds to do each request. This is fine, and as expected (as it is doing some fancy networking behind the scenes).
However, if I open the same URL in several tabs (in Firefox) and then reload them all at the same time, the pages appear to load sequentially rather than all at once. What is this all about?
I have tried it on Windows Server 2008 IIS and Windows 7 IIS.
It really depends on the web browser you are using and how tab support in it has been programmed.
It is probably using a single thread to load each tab in turn, which would explain your observation.
Edit:
As others have mentioned, it is also a very real possibility that the webserver running on your localhost is single-threaded.
If I remember correctly, the old HTTP/1.1 spec (RFC 2616) recommended limiting the number of concurrent connections to the same host to 2; modern browsers allow around 6, and RFC 7230 later dropped the fixed number. This per-host cap is one reason high-load websites spread their assets across CDNs (content delivery networks) and additional hostnames.
    network.http.max-connections 60
    network.http.max-connections-per-server 30
The above two values determine how many connections Firefox makes to a server. Once the per-server threshold is reached, further requests are queued until an existing connection frees up.
Each browser implements this in its own way, making requests so as to maximize performance. Moreover, it also depends on the server (and a localhost server may itself be slow).
Your local web server configuration might allow only one worker thread, so each request waits for the previous one to finish.
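To check, take the browser out of the equation and time the requests directly. A small sketch (the URL is an assumption; point it at your slow page):

    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    URL = "http://localhost:8000/"  # assumption: your slow local endpoint

    def fetch(_):
        with urllib.request.urlopen(URL) as r:
            r.read()

    start = time.time()
    for i in range(5):
        fetch(i)
    print(f"sequential: {time.time() - start:.1f}s")

    start = time.time()
    with ThreadPoolExecutor(max_workers=5) as pool:
        list(pool.map(fetch, range(5)))
    print(f"parallel:   {time.time() - start:.1f}s")

If the parallel batch takes about as long as five sequential requests, the serialization is happening in the server, not the browser.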
