I have downloaded a full WordPress website with the command below:
wget -r --convert-links --no-parent website_url
I want to serve this site with nginx, but there is a problem:
some of the files have names that include a version query string, like js_composer_front.min.js?ver=6.0.5, and nginx throws a 404 for them.
How can I solve this problem?
As far as I know, you can't fix the mentioned issue with nginx and wget alone; you would have to write a script to modify the generated links and filenames, maybe with sed or a small Python script.
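For example, a rough sketch of such a cleanup pass (untested, and it assumes the mirrored filenames only carry a numeric ?ver= suffix), run from the root of the mirror:

# rename files like js_composer_front.min.js?ver=6.0.5 by cutting
# everything from the first "?" onwards
find . -type f -name '*\?*' | while read -r f; do mv "$f" "${f%%\?*}"; done
# strip the matching ?ver=... query strings from the converted links
find . -type f -name '*.html' -exec sed -i 's/?ver=[0-9.]*//g' {} +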
That said, I think the better solution is to use the simply-static plugin.
I am using consul-template v0.19.0 for Windows to render an nginx config.
The nginx config is rendered fine by consul-template, but it is not restarting nginx.
This is the command I am using:
consul-template -consul-address="xx" -template="in:out:{{pathfornginx}}\nginx.exe -s reload"
where "in" is the ctmpl path and "out" is the final nginx config path.
I have tried different path formats, but no luck.
Could anyone offer some input on this?
Thanks in advance.
Have you tried creating a config.json file that contains the path to the ctmpl file and the command itself?
You would need to use the -config flag instead of -template.
If you want an example of the config.json I can provide one; I'm using a similar setup and it works fine for me.
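For reference, a minimal sketch of what such a config.json could look like (the consul address and the Windows paths are placeholders; adjust them to your setup):

{
  "consul": {
    "address": "xx"
  },
  "template": {
    "source": "C:\\consul-template\\nginx.conf.ctmpl",
    "destination": "C:\\nginx\\conf\\nginx.conf",
    "command": "C:\\nginx\\nginx.exe -s reload"
  }
}

You would then run consul-template -config "C:\consul-template\config.json" instead of passing -template on the command line.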
Provide me a better solution?
Hi, and thanks for reading my question.
It's my first, so please be gentle, as I'm not a programmer but a barge captain, and curious ))
The situation:
I run a small website and we want to serve more languages; the website is based on WordPress multisite.
After trying various translation solutions, we decided to go with Transposh.
That plugin enables us to translate content easily, and it becomes available at domain.com/en, domain.com/fr, etcetera; those directories are virtual.
As we have different domain names for different languages, I needed a solution to have the content of domain.com/en on domain2.com.
What I did was set the cache directory (static HTML) of domain.com as the webroot of domain2.com. A fairly simple solution, and it works like a charm.
The only problem I face is that the menu items link back to domain.com and not domain2.com.
I tried to make the URLs relative via WP core and two plugins, but as the trailing /en or /fr is virtual, making the URLs relative just links back to domain.com.
I have spent two days googling and I'm really out of ideas. I tried different PHP scripts for search and replace, CGI scripts, Perl scripts, but none seem to do the job. I don't have shell access.
I was wondering if it's possible to do something like that with mod_rewrite, and if possible, then how?
Simply put: a static HTML site with wrong links.
Can I change the links via .htaccess or some other method that is (relatively) easy to understand and maintain?
The cache gets rebuilt now and then, of course.
This was solved by another approach:
instead of using the cache, I created a cronjob with wget:
/usr/bin/wget -np -P /destination-eg-yourwebroot/ --html-extension -nH -p -k -r http://domain.com
This creates an HTML copy of your website in the webroot of the new domain.
For some reason wget does not always update the links in the copy, so they will still point to the original domain.
We can run the command again, this time adding -nc -k:
/usr/bin/wget -nc -k -np -P /destination-eg-yourwebroot/ --html-extension -nH -p -k -r http://domain.com
It will update the links correctly, including CSS URLs.
You now have a clean copy of your website on a different domain.
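For completeness, a hypothetical crontab entry for the rebuild (the schedule is just an example); it runs the first pass and then the link-fixing second pass:

0 3 * * * /usr/bin/wget -np -P /destination-eg-yourwebroot/ --html-extension -nH -p -k -r http://domain.com && /usr/bin/wget -nc -k -np -P /destination-eg-yourwebroot/ --html-extension -nH -p -k -r http://domain.com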
I've created an nginx server in a chroot at /srv/http with php-fpm. Both services run as the http user and it works fine. The problem comes when I try to run an exec command such as
echo shell_exec('/usr/bin/ls');
There is no output at all on the web page or in the error log. I've also tried
error_log(shell_exec('/usr/bin/ls'));
and still nothing.
Things I've Tried or Know:
safe mode off
exec enabled
user is http (using phpinfo())
display_errors = on
error_reporting = E_ALL
sudo /usr/bin/chroot --userspec=http:http /srv/http ls works fine
Can create a file and read from it using file_put_contents and fopen/fread
tried shell_exec, exec, system, and passthru - nothing worked
tried appending 2>&1 to the end of the command and nothing
I've copied all the executables and libraries necessary over
all libraries, binaries, and everything under /srv/http/www (where the webpages are) have executable and read permissions
doc_root is www
As far as I know, everything works in the chroot, except shell commands through php-fpm. Anyone have any idea where I went wrong and how to fix it?
This may sound stupid, but you just need to copy /bin/sh (not /bin/bash!) into your chroot.
For example, see this question: How do I change the shell for php's exec()
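A minimal sketch of that copy, assuming a typical Linux layout (check with ldd which libraries your /bin/sh actually needs, and mirror them under the chroot):

# shell_exec() and friends run commands via /bin/sh -c,
# so sh must exist inside the chroot
mkdir -p /srv/http/bin
cp /bin/sh /srv/http/bin/
# sh is usually dynamically linked; list its libraries and copy
# each one into the matching path under /srv/http
ldd /bin/sh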
If you chroot to some directory, then this directory becomes the root for all your PHP scripts. That means that if you execute /usr/bin/ls from within PHP, it will try to execute /srv/http/usr/bin/ls instead.
You can copy the executable to that directory - but be aware of the security implications. If you copy critical system executables into the chrooted directory you basically bypass the positive effects of chroot.
I get no output for
echo shell_exec('/usr/bin/ls');
either. Presumably because ls isn't a file but a built-in command. Running:
echo shell_exec('ls');
outputs:
css demos favicon.ico images js path.php robots.txt routing.php test
which is the list of files in my root directory for the site.
I thought it was simple, or I might have missed something. I wanted to save the output of wget and serve it as the index page, but for some reason it always serves the PHP version instead of the HTML.
I thought of changing "DirectoryIndex" to serve index.html first, and restarted Apache.
Now the issue is that when I do wget www.mysite.com/index.php -O index.html, it always serves the HTML version even though I specified index.php.
Maybe it's WordPress related, or some misconfiguration I made.
Thanks in advance.
Not sure I understand what you mean. :/
Do you want to use wget to fetch content from another website and serve it within WP?
----- Some type of cache -----
You have to call wget using the URL of your site directly, not index.php. For instance:
wget -O index.html http://yoursite.com
Then you get your index.html with its content: CSS, text, images, JS... You might use a redirection rule to serve that file as the default page.
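For instance, a hypothetical combination, assuming Apache with mod_dir (the URL and webroot path are placeholders):

# take the snapshot from the public URL so WordPress renders it fully
wget -O /var/www/html/index.html http://yoursite.com
# .htaccess in the webroot: prefer the static copy over the PHP version
DirectoryIndex index.html index.php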
I am trying to mirror a website (with permission, of course), but the files download into foo.com.
wget -mkp http://foo.com
gives me a folder called foo.com with all the files.
How do I get it to download the files to the current directory?
EDIT: I still want the file hierarchy, but I want the root of the hierarchy to be the current directory.
-nH is what'll help you here; it stops wget from creating the top-level host-named directory.
wget -nH --mirror http://example.com
Try wget with the -nd parameter; it saves all files into the current directory without recreating any hierarchy:
wget -nd --mirror http://example.com
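For what it's worth, the two options behave differently; assuming a file at http://example.com/css/style.css:

# -nH drops only the host directory: the file is saved as ./css/style.css
# (hierarchy kept, which matches the EDIT in the question)
wget -nH --mirror http://example.com
# -nd creates no directories at all: the file is saved as ./style.css
wget -nd --mirror http://example.com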