HTTP POST to a PHP script hosted on an unauthenticated FTP server

Is it possible to create an HTTP POST request that posts to a PHP script hosted on a separate, unauthenticated FTP server?

No.
POST is an HTTP method, and HTTP requests have a specific protocol structure. FTP, being an entirely different protocol, has an entirely different structure.
FTP servers don't understand HTTP requests, and HTTP servers don't understand FTP requests. (One "server" can handle both, and in such a case would be acting as two distinct services from the perspective of any consuming client.)
If the target page is hosted as a file on an FTP server, then there is no HTTP endpoint to receive the request. There's just a file.
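To make the protocol mismatch concrete, here is a small illustrative sketch (in Python; the host and file names are invented) contrasting what an HTTP POST looks like on the wire with the command-style dialogue FTP speaks:

```python
# Illustrative only: the raw bytes of an HTTP POST vs. the command-style
# dialogue of an FTP session. The host and file names are made up.

# What a browser actually sends for an HTTP POST -- a request line,
# headers, a blank line, then the body:
http_post = (
    "POST /script.php HTTP/1.1\r\n"
    "Host: files.example.com\r\n"
    "Content-Type: application/x-www-form-urlencoded\r\n"
    "Content-Length: 9\r\n"
    "\r\n"
    "name=demo"
)

# An FTP session is a sequence of short commands on a control channel;
# nothing here executes a script or accepts a request body:
ftp_session = ["USER anonymous", "PASS guest@", "RETR script.php"]

# An FTP server receiving http_post would treat "POST" as an unknown
# command; at best, RETR-ing script.php returns its raw source, unexecuted.
```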

Related

What exactly does HTTP do?

I understand that HTTP is a protocol that allows information to be transferred between a client and a server. At the moment, this protocol is used everywhere: when we open a web page, download music, videos, applications...
MDN
HTTP is a protocol for fetching resources such as HTML documents. It is the foundation of any data exchange on the Web and it is a client-server protocol, which means requests are initiated by the recipient, usually the Web browser. A complete document is reconstructed from the different sub-documents fetched, for instance, text, layout description, images, videos, scripts, and more.
But it's not entirely clear to me what exactly HTTP does during this information transfer. If, as I've read, a protocol is essentially a set of rules, does that mean HTTP simply sets the rules for passing information between server and client? If so, what are these rules, and what are they for?
Hypertext Transfer Protocol (HTTP) is a communications protocol used to send and receive web pages and files on the internet. Its specifications are maintained by the IETF. HTTP/1.1 is the most commonly used version.
HTTP works by using a user agent to connect to a server. The user agent could be a web browser or a spider. The server is located using a URL (or URI), which for plain HTTP starts with http://. The connection is normally made to port 80 on the server.
A more secure version of HTTP is called HTTPS (Hypertext Transfer Protocol Secure); its URLs start with https://. It encrypts everything that is sent and received, which can stop malicious users from stealing the information, and it is often used on payment websites. HTTPS uses port 443 for communication instead of port 80.
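As a concrete sketch of these request/response rules, the following self-contained Python example (standard library only) starts a throwaway server on a free local port and performs one exchange against it:

```python
# Minimal demonstration of the HTTP client-server exchange: the client sends
# a request line plus headers; the server replies with a status line,
# headers, and a body.
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html><body>Hello</body></html>"
        self.send_response(200)                        # status line: 200 OK
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):                      # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), HelloHandler)    # port 0: any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_address[1])
conn.request("GET", "/")                               # request line + headers
resp = conn.getresponse()                              # status line + headers + body
print(resp.status, resp.reason)                        # 200 OK
print(resp.getheader("Content-Type"))                  # text/html
print(resp.read())
server.shutdown()
```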

HTTP on an HTTPS website

I was just wondering about this small question. I know it is irrelevant to coding, but I just had to know quickly.
If you type http:// for an https:// site, will it still take you to the correct place?
That is mostly dependent on the server configuration. The server has to accept the initial HTTP request and be configured to redirect the client to an appropriate HTTPS URL.
That being said, there are some Internet standards related to automating HTTP-to-HTTPS upgrades. HTTP Strict Transport Security and Upgrade Insecure Requests allow an HTTP/S server to tell clients that it wants them to automatically use HTTPS for all subsequent requests. If a client visits an HSTS/UIR-enabled server, it will receive a normal HTTP response with additional HSTS/UIR-related headers. If the client supports HSTS/UIR, it will then know to automatically send all subsequent HTTP requests to that same server using HTTPS, and in the case of UIR also treat any received HTTP URLs as if they were HTTPS URLs.
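As a sketch, an nginx configuration implementing both behaviours (the redirect and the HSTS opt-in) might look like the following; the server name and certificate paths are placeholders:

```nginx
# HTTP listener: redirect every plain-HTTP request to HTTPS.
server {
    listen 80;
    server_name example.com;
    return 301 https://$host$request_uri;
}

# HTTPS listener: serve the site and opt in to HSTS so compliant
# browsers upgrade future requests on their own.
server {
    listen 443 ssl;
    server_name example.com;
    ssl_certificate     /etc/ssl/example.com.pem;   # placeholder paths
    ssl_certificate_key /etc/ssl/example.com.key;
    add_header Strict-Transport-Security "max-age=31536000" always;
}
```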

How can I configure nginx to send multiple POST requests over one connection?

I am developing an upload application.
I use Google Chrome to upload a big file (gigabytes) and use nginx to pass the file to my backend application.
Using Wireshark, I found that Chrome sends the file in one connection with multiple POST requests.
But nginx splits the POST requests and sends each one over a different connection to the backend application.
How can I configure nginx to send all the POST requests over one connection, rather than opening one connection per POST request?
It turned out to be embarrassingly simple: the solution is just to enable nginx upstream keepalive for connections to the backend.
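A sketch of the relevant configuration, assuming a backend listening on 127.0.0.1:8080 (a placeholder): `keepalive` in the upstream block, plus HTTP/1.1 and a cleared `Connection` header on the proxied requests, both of which nginx requires for upstream keepalive to take effect.

```nginx
upstream backend {
    server 127.0.0.1:8080;        # placeholder backend address
    keepalive 32;                 # pool of idle keepalive connections per worker
}

server {
    listen 80;
    location /upload {
        proxy_pass http://backend;
        proxy_http_version 1.1;           # upstream keepalive requires HTTP/1.1
        proxy_set_header Connection "";   # clear "Connection: close" from the client
    }
}
```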

Can I whitelist a domain for unencrypted traffic from a page served over HTTPS?

I've got an internal web application that's designed to work in concert with a server running locally on the client machine. (For the curious: the local server is used to decrypt data retrieved from the server using the client machine's GPG key.)
The internal web app is served over HTTPS while the local app is accessible via localhost. It used to be that I could make unencrypted AJAX requests from the page to localhost without any issues; but it seems that recently Chrome was updated to disallow HTTP requests to any destination from pages served over HTTPS.
I understand that in the vast majority of cases, HTTP requests from a page served via HTTPS constitute a security hole. However, since I have complete control over the endpoint in this case (i.e., localhost), it seems to me that it should still be perfectly safe to make HTTP requests to that one destination even when the host page has been served via HTTPS.
Is this possible? To whitelist localhost somehow?
Since you are in control of both the client and the server, it sounds like a good candidate for Cross-Origin Resource Sharing (CORS). The server will have to set a few response headers to give access to the client. You can learn more here: http://www.html5rocks.com/en/tutorials/cors/
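As a hedged sketch of what that looks like on the localhost server, using Python's standard library (the origin string is a placeholder for the internal app's HTTPS origin, and the response payload is invented):

```python
# Sketch: a localhost HTTP server that grants CORS access to one specific
# HTTPS origin by setting Access-Control-Allow-* response headers.
from http.server import BaseHTTPRequestHandler, HTTPServer

ALLOWED_ORIGIN = "https://internal.example.com"  # placeholder: your app's origin

class CORSHandler(BaseHTTPRequestHandler):
    def _send_cors_headers(self):
        self.send_header("Access-Control-Allow-Origin", ALLOWED_ORIGIN)
        self.send_header("Access-Control-Allow-Methods", "GET, POST, OPTIONS")
        self.send_header("Access-Control-Allow-Headers", "Content-Type")

    def do_OPTIONS(self):                 # CORS preflight request from the browser
        self.send_response(204)
        self._send_cors_headers()
        self.end_headers()

    def do_GET(self):
        body = b'{"decrypted": "..."}'    # placeholder payload
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self._send_cors_headers()
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):         # keep the demo quiet
        pass

# To serve: HTTPServer(("127.0.0.1", 8800), CORSHandler).serve_forever()
```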

How to create an HTTP server that handles HTTP requests

I am aware of the Apache web server, and I can host web pages with it.
How do I create an HTTP server that handles HTTP POST requests and responds to them?
If you just want something simple, try node.js; you can write the server code in JavaScript.
Otherwise, you can use PHP or another scripting language with the Apache web server. Just make sure you enable the PHP module (or whichever module you need).
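Whichever stack you choose, the shape is the same: read the POST body, do something with it, write a response. As a neutral, self-contained sketch using Python's standard library (no Apache or PHP needed):

```python
# Minimal HTTP server that accepts POST requests and echoes the body back.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class PostHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = self.rfile.read(length)             # raw POST body
        reply = b"received: " + payload
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(reply)))
        self.end_headers()
        self.wfile.write(reply)

    def log_message(self, *args):                     # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), PostHandler)    # port 0: any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = "http://127.0.0.1:%d/" % server.server_address[1]
resp = urllib.request.urlopen(url, data=b"hello")     # urlopen with data => POST
answer = resp.read()
print(answer)                                          # b'received: hello'
server.shutdown()
```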
