I have finished installing CKAN in a virtual machine, but there is a problem with the CKAN interface when I access the site by its IP address. The CSS styles cannot be loaded, so only the bare HTML is displayed.
When I use Chrome to look at the page source, a warning shows up in the console: "Resource interpreted as Stylesheet but transferred with XXX MIME type text/plain: XXX(CSS link)."
In the Linux terminal, whenever I click a link on the CKAN site, Python error messages appear: [Errno 32] Broken pipe
Also, my CKAN link is set to http://localhost:8773; I am not sure if port 8773 is a problem. (Port 5000 is used for login in the virtual machine.)
Other installation information: CentOS 7, CKAN 2.4.1, Tomcat 7.0.69, Solr 1.4.0, PostgreSQL 9.2.18
Thanks a lot!
(Screenshot: my CKAN interface problem)
You probably created a file similar to /etc/ckan/default/development.ini while installing CKAN, right?
Try setting ckan.site_url = http://localhost:5000/ there.
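For reference, the relevant part of the config would look roughly like this (a sketch, assuming the default development.ini layout; adjust the URL to whatever address and port you actually browse to):
[app:main]
# ckan.site_url must match the scheme, host and port used in the browser,
# otherwise asset links (CSS/JS) are generated against the wrong base URL
ckan.site_url = http://localhost:5000/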
You could also cross-check different CKAN setup tutorials for stuff that is missing in your setup:
https://yorkhuang-au.github.io/2016/01/08/Install-CKAN-On-Centos7/
https://github.com/ckan/ckan/wiki/How-to-install-CKAN-2.x-on-CentOS-7
Also, what I usually do in these cases is search for a Vagrant box with a finished setup and use that: https://app.vagrantup.com/boxes/search?utf8=%E2%9C%93&sort=downloads&provider=&q=ckan
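Once you have picked a box from that search, bringing it up is just (a sketch; the box name below is a placeholder for whichever CKAN box you choose):
vagrant init some-user/ckan-box   # placeholder box name from the search above
vagrant up
vagrant ssh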
I moved a WordPress site from a cPanel server to a Plesk server. Then I manually upgraded the site from version 3.5.1 to 4.8.3. Afterwards I tried to upgrade plugins (Easy FancyBox) as well as install new plugins (Contact Form 7).
The issue I have is that I get the following error message: "Update Failed: Download failed. Destination directory for file streaming does not exist or is not writable."
In the server's log file I can see a few warnings like the following one:
mod_fcgid: stderr: PHP Warning: file_exists(): open_basedir restriction in effect. File(/home/dentist/domains/dentist.com.gr/public_html/newsite/wp-content/uploads//easy-fancybox.1.6.2-Vlaovu.tmp) is not within the allowed path(s): (/var/www/vhosts/ggeorgiou.gr/ggeorgiou.work/:/tmp/) in /var/www/vhosts/ggeorgiou.gr/ggeorgiou.work/wd/dentist.com.gr/wp-includes/functions.php on line 2085, referer: http://www.ggeorgiou.work/wd/dentist.com.gr/wp-admin/plugins.php
Finally, note that in the "Settings --> Media" menu, in the "Store uploads in this folder" field, I have put the following path on the current server: "/var/www/vhosts/ggeorgiou.gr/ggeorgiou.work/wd/dentist.com.gr/wp-content/uploads".
Any idea what is wrong?
Thank you
From what you posted, your exact error message is "open_basedir restriction in effect". You can read more about how to solve it here: How can I relax PHP's open_basedir restriction?
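In Plesk this is usually adjusted per domain (in the domain's PHP Settings) or in php.ini; a sketch of the kind of value involved — the placeholders below are Plesk's defaults and are shown only for illustration:
; per-domain PHP settings in Plesk (or php.ini)
; list every directory PHP is allowed to touch, separated by ":" on Linux
open_basedir = {WEBSPACEROOT}{/}{:}{TMP}{/}
; or disable the restriction for this domain entirely (less secure):
; open_basedir = none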
Also, assuming you have a backup of the previous version, I would start by restoring that.
Secondly, there are many versions between 3.5.1 and 4.8.3. It is advisable to upgrade in increments of one version at a time. It takes longer but is safer.
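If WP-CLI is available on the server, stepping through the versions can be scripted rather than done by hand (a sketch; it assumes the wp command is installed and is run from the site's document root):
wp core update --version=3.6   # repeat, bumping the version one release at a time
wp core update-db              # run the database upgrade after each core update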
I've been trying to install and run keras in RStudio (Windows) in vain.
I installed keras using the normal "keras" package (I didn't use the GitHub version).
I've installed the latest Python (3.6) and Anaconda.
Then I run:
> library(keras)
> install.keras()
and I get this error:
Creating r-tensorflow conda environment for TensorFlow installation...
Fetching package metadata ... CondaHTTPError: HTTP 000 CONNECTION
FAILED for url
https://repo.continuum.io/pkgs/main/win-64/repodata.json.bz2
Elapsed: -
An HTTP error occurred when trying to retrieve this URL. HTTP errors
are often intermittent, and a simple retry will get you on your way.
ConnectTimeout(MaxRetryError("HTTPSConnectionPool(host='repo.continuum.io',
port=443): Max retries exceeded with url:
/pkgs/main/win-64/repodata.json.bz2 (Caused by
ConnectTimeoutError(, 'Connection to repo.continuum.io timed out.
(connect timeout=9.15)'))",),)
Error: Error 1 occurred creating conda environment r-tensorflow In
addition: Warning message: running command
'"C:\PROGRA~3\ANACON~1\Scripts\conda.exe" "create" "--yes" "--name"
"r-tensorflow" "python=3.6"' had status 1
I've looked everywhere on the web and can't figure out how to install keras and tensorflow properly. I'm using the latest version of R (3.4.2).
Every method fails somewhere.
Just to add to the misery, I've also tried:
> devtools::install_github("rstudio/keras")
and I get this error:
Installation failed: Timeout was reached: Connection timed out after
10015 milliseconds
I am not behind any authenticated proxy. So, after multiple failures, I just downloaded the zip file from GitHub and installed it manually from the zip file.
I also tried install.packages("keras") and that didn't give me any error either.
When I load the library I don't get any errors (as shown above).
UPDATE: I was able to install and use the package very easily on another computer that doesn't have Python/Anaconda installed on it already.
UPDATE 2: My proxy does not need authentication and there is no https_proxy either.
OK, I FINALLY found a solution.
It turns out RStudio uses a lot of default proxy settings, so I needed to change all that and set up my own proxy settings.
First step:
RStudio --> Tools --> Global Options --> Packages --> uncheck both "Use secure download method for HTTP" and "Use Internet Explorer library/proxy for HTTP"
Second step, in RStudio type:
> file.edit('./.Renviron')
Either an empty file or a file with already existing proxy settings will open (mine was empty). Then I added the following two lines:
http_proxy=http://myusername:password@proxy.server.com:port/
https_proxy=http://myusername:password@proxy.server.com:port/
(A few notes: I didn't have an https_proxy setting, but I still needed to use the http_proxy details for my https_proxy setting; this was one of the culprits for my issue. Also, I needed to include the username:password even though my proxy doesn't need authentication. The same goes for the port: the port number had to be included, otherwise it wouldn't work.)
Step 3:
I saved the changes to the .Renviron file and restarted RStudio.
I checked my proxy settings in RStudio after restart by typing:
> Sys.getenv("http_proxy")
> Sys.getenv("https_proxy")
The first few times I did this, I realised that the proxy settings were not being changed in RStudio because I was editing the wrong .Renviron file. So it's best to use file.edit('~/.Renviron') in step 2 to make sure you get the right file.
After all this, when I ran install.keras(), it installed successfully, including installing TensorFlow. Again, initially I had skipped step 1, so keras started being installed but it failed at installing TensorFlow.
It was only by going through all the steps that I was able to install both keras and TensorFlow successfully over a proxy. Hope this helps.
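As a final check that both keras and TensorFlow are actually usable, a quick test from the R console (a sketch; is_keras_available() is exported by the keras package):
> library(keras)
> is_keras_available()   # should return TRUE once TensorFlow can be loaded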
Uninstalling Anaconda3 and installing Anaconda2 (i.e. Python 2.7) did the trick for me: https://www.anaconda.com/download/
I have just upgraded Jupyter to version 4.3.1.
While I can open previously created ipynb files, I cannot create new ones.
When I try to create a new notebook file, I get a pop-up window saying:
Creating Notebook Failed
An error occurred while creating a new notebook
Forbidden
In the terminal I notice this output:
[W 12:53:23.375 NotebookApp] 403 POST /api/contents (::1): '_xsrf' argument missing from POST
[W 12:53:23.383 NotebookApp] 403 POST /api/contents (::1) 8.92ms referer=http://localhost:8888/tree?token=e7fbbb58516dc1359fcc26a1079093166a1f713ee5b94ccd
I use Jupyter with Python 3.5.2 and IPython 5.1.0
Another way to confirm the issue is to open your Jupyter session in another browser: you might be redirected to the Jupyter login screen asking for a token.
If you open a new console and type
jupyter notebook list
you'll see your running notebook server, and its URL will contain a token. Open that URL in a new tab and the problem is solved.
The command output should look like this:
Currently running servers:
http://localhost:8888/?token=cbad1a6ce77ae284725a5e43a7db48f2e9bf3b6458e577bb :: <path to notebook>
I had to enable cookies in the browser (which I had intentionally disabled). Then the "Forbidden" error disappeared, everything is OK now.
The generally accepted solution to prevent XSRF is to cookie every user with an unpredictable value and include that value as an additional argument with every form submission on your site.
From: http://tornado.readthedocs.io/en/latest/guide/security.html#cross-site-request-forgery-protection
Jupyter blocks non-local requests. To access Jupyter from an external address we can execute it with the following parameters:
jupyter notebook --NotebookApp.allow_origin='*' --NotebookApp.allow_remote_access=1
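The same settings can also be made permanent in the notebook configuration file (a sketch; the file can be created with jupyter notebook --generate-config and normally lives under ~/.jupyter/):
# ~/.jupyter/jupyter_notebook_config.py
c.NotebookApp.allow_origin = '*'          # allow cross-origin requests
c.NotebookApp.allow_remote_access = True  # accept connections from non-local addresses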
I had this problem just now, but I noticed that it worked in Edge. Deleting all browser cache, including cookies, in Chrome solved it in my case.
Issue description:
I am installing a package on CentOS 7 with the command below:
yum install <package_name>
error:
# yum install httpd
Loaded plugins: fastestmirror
http://centos-distro.1gservers.com/7.2.1511/os/x86_64/repodata/repomd.xml: [Errno 14] HTTP Error 403 - Forbidden
Trying other mirror.
To address this issue, please refer to the below knowledge base article
https://access.redhat.com/solutions/69319
If above article doesn't help to resolve this issue, please create a bug on https://bugs.centos.org/
Other info:
* I can be sure I did not have a proxy setting in /etc/yum.conf.
* The firewall is disabled.
* I have already tried yum clean all.
Something that may be related:
I have added the following two settings to /etc/yum.conf:
timeout=9999
minrate=0
I added them because I sometimes hit "too slow" or timeout errors. If I remove these two settings, the error is replaced by:
# yum install httpd
Loaded plugins: fastestmirror
http://centos-distro.1gservers.com/7.2.1511/os/x86_64/repodata/repomd.xml: [Errno 12] Timeout on http://centos-distro.1gservers.com/7.2.1511/os/x86_64/repodata/repomd.xml: (28, 'Operation too slow. Less than 1000 bytes/sec transferred the last 30 seconds')
Trying other mirror.
The error message did give a link, but that link can only be read with a Red Hat support account. I tried googling this, but the results were either unrelated or did not solve my problem. Stack Overflow also had one post about this, but it did not help either.
Can anyone give some advice?
I solved this issue. The cause was an external, third-party firewall at my lab that was blocking me.
Although the root cause is not very interesting, the way I troubleshot it is worth mentioning for reference.
Troubleshooting
The output says 403 for some URL, so I accessed that URL from my computer with the command below:
curl -i <url>
The output contains the HTTP headers, confirming the 403. Then I copied the HTML body to a text file and opened it in a web browser. I found information along the lines of:
"your organization's firewall has blocked you"
So... it is a network problem at my lab.
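Spelled out as commands, the check looks roughly like this (a sketch; substitute the repomd.xml URL from your own yum output):
curl -i http://centos-distro.1gservers.com/7.2.1511/os/x86_64/repodata/repomd.xml
# save just the body and open it in a browser to read the block page
curl -s -o block-page.html http://centos-distro.1gservers.com/7.2.1511/os/x86_64/repodata/repomd.xml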
I'm trying to get R (running on Windows) to download some packages from the Internet, but the download fails because I can't get it to correctly use the necessary proxy server. The output text when I try the Windows menu option Packages > Install package(s)... and select a CRAN mirror is:
> utils:::menuInstallPkgs()
--- Please select a CRAN mirror for use in this session ---
Warning: unable to access index for repository http://cran.opensourceresources.org/bin/windows/contrib/2.12
Warning: unable to access index for repository http://www.stats.ox.ac.uk/pub/RWin/bin/windows/contrib/2.12
Error in install.packages(NULL, .libPaths()[1L], dependencies = NA, type = type) :
no packages were specified
In addition: Warning message:
In open.connection(con, "r") :
cannot open: HTTP status was '407 Proxy Authentication Required'
I know the address and port of the proxy, and I also know the address of the automatic configuration script. I don't know what the authentication is called, but when using the proxy (in a browser and some other applications), I enter a username and password in a dialog window that pops up.
To set the proxy, I tried each of the following:
Sys.setenv(http_proxy="http://proxy.example.com:8080")
Sys.setenv("http_proxy"="http://proxy.example.com:8080")
Sys.setenv(HTTP_PROXY="http://proxy.example.com:8080")
Sys.setenv("HTTP_PROXY"="http://proxy.example.com:8080")
For authentication, I similarly tried setting the http_proxy_user environment variable to:
ask
user:passwd
Leaving it untouched
Am I using the right commands in the right way?
You have two options:
Use --internet2 or setInternet2(TRUE), and set the proxy details in the Control Panel, under Internet Options.
Do not use --internet2 (or call setInternet2(FALSE)), and instead specify the proxy through environment variables.
EDIT: One trick is that you cannot change your mind between 1 and 2 after you have tried one of them in a session. That is, if you run setInternet2(TRUE) and then try to use it, e.g. install.packages('reshape2'), and this fails, you cannot then call setInternet2(FALSE). You have to restart the R session.
As of R version 3.2.0, the setInternet2 function can set internet connection settings and change them within the same R session. No need to restart.
When using option 2, one way (which is nice and compact) to specify the username and password is http_proxy="http://user:password@proxy.example.com:8080/"
In the past, I have had the most luck with option 2.
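For option 2, the environment variables can also be set from within R before installing (a sketch; user, password, host, and port are placeholders for your own values):
Sys.setenv(http_proxy = "http://user:password@proxy.example.com:8080/")
Sys.setenv(https_proxy = "http://user:password@proxy.example.com:8080/")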
If you want internet2 to be used every time you use R, you can add the following line to the Rprofile.site file, which is located at R.x.x\etc\Rprofile.site:
utils::setInternet2(TRUE)
I solved my problem by editing the .Renviron file as documented in Proxy setting for R.
EDITED
The solutions based on the setInternet2 statement do not work with recent R versions, because setInternet2 has been declared defunct.
I'm using R 4.2.1 (on Windows 11 Pro), while I never had any problems with previous versions.
So, to solve the problem, you need to modify some config files in order to fix the proxy issue, not only for package installation but, in general, also for access to remote resources (i.e. boundary maps in my case).
The question "Proxy setting for R" collects a lot of solutions. I've found that this one solved both of my problems (package installation and remote resources), explaining step by step how to edit the .Renviron file.
Other solutions based on customizing the Renviron.site file did not work for me.
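For reference, the lines that end up in .Renviron look roughly like this (a sketch; host, port, and credentials are placeholders, and on Windows the file normally lives in your home or Documents folder):
http_proxy=http://user:password@proxy.example.com:8080/
https_proxy=http://user:password@proxy.example.com:8080/
no_proxy=localhost,127.0.0.1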
install.packages("RCurl")
That will solve your problem.