How do I use the WinZip command line to include full path information? I know I can do this in the WinZip GUI, but how do I do it from cmd? Also, is there a way to zip only selected, specific folders? Thanks
I tried the GUI and it is working very slowly.
The WinZip command line doesn't seem to zip selected folders - either parent folders or specific subfolders.
I am using the WinZip 27 command line, and this is the syntax I am using:
wzzip -a -e0 -k -P -r -yx "C:\Users\source\to\save\zipfile.zip" "C:\example"
This stores the timestamps of the files and folders underneath C:\example. But since I have enabled -P -r, I also want to store the timestamp of the top-level folder, C:\example, itself. How can I do that? Does anyone have suggestions?
Also, how do I specify the path for a mapped network drive? Thanks!
jq: command not found after adding the jq executable
installing jq on git bash
My use case is similar to the references shared above. I tried to execute a hook that needs to parse a JSON file. When the hook gets executed, it throws a bash: jq: command not found error. So I downloaded the jq-win64.exe file and copied it to /usr/bin in the Git folder. Then from Git Bash I ran export PATH=$PATH:"/C/Program Files/Git/usr/bin/jq-win64.exe" and there was no error, but when I check jq --version it still shows the bash: jq: command not found error.
Am I missing something? I even tried in the Windows cmd but it was of no use. Hope someone can help me.
Thanks in advance!!!
PATH contains directories, not individual files. That means what you should do is:
Rename jq-win64.exe to jq.exe or just jq (e.g. cp ~/Downloads/jq-win64.exe /usr/bin/jq).
You don't have to export your PATH; /usr/bin is already part of it.
If you don't rename the file to jq (or jq.exe), then you have to run it as jq-win64 in your console.
You could also put the binary into the ~/bin folder, which should be part of PATH too. If it isn't, you can add it. That way you don't need to mess with your global binaries folder.
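If it helps, a minimal sketch of the whole fix (assuming the download landed in ~/Downloads; adjust the paths to taste):
mkdir -p ~/bin
cp ~/Downloads/jq-win64.exe ~/bin/jq.exe
export PATH="$HOME/bin:$PATH"   # only needed if ~/bin is not already on PATH
jq --version
After that, jq --version should print a version string such as jq-1.x instead of the command-not-found error.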
I am trying to run a programme on a Debian dedicated server, using the following command:
java -cp bin:lib/* rs.Server false 43594
However, it gives me a file not found error (even though the files are present). I fixed this error in IntelliJ by picking the $MODULE_DIR$ option. Is there an equivalent to this in Unix terminals?
The problem looks to be that the directory you are in when you run the command is wrong. You either need to cd to the directory containing the bin and lib directories, or specify the full path to those directories on the command line.
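For example, something along these lines should work (the project path below is just a placeholder; quoting the classpath lets the JVM expand the lib/* wildcard itself instead of the shell):
cd /path/to/your/project        # the directory that contains bin/ and lib/
java -cp "bin:lib/*" rs.Server false 43594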
Can you help me a little bit with the Terminal?
I would like to use these command lines:
cd /path/to/Qt
./configure -static <other parameters>
make sub-src
So I open the Terminal and write cd, then I drag QtSDK into the window and it shows me this path: /Developer/Applications/Qt
EDIT :
So I guess the 3 lines must be entered one after the other...
I entered those lines in a row, but this time I got no error message, nor any response when I try to write -help just after ./configure, as if it wasn't working. I'm following this tutorial from the doc: http://doc.qt.io/qt-5/osx-deployment.html
Any idea?
I'm not sure what you're trying to do. Write a script to do it for you. Here is a little bit of help.
If you type
pwd
it will return your current location. This way you can find out where you are.
/Users/Paul/QtSDK
if the above is your goal,
pwd
should return
/Users/Paul
This means, to change directory (cd), all you have to do is
cd QtSDK
This is assuming QtSDK is located in your user folder.
You can do
ls
to find out.
The following is the output of ls in my home directory, /Users/cy:
#:~ cy$ pwd
/Users/cy
#:~ cy$ ls
Desktop Downloads Movies Pictures Sites
Documents Library Music Public
You should also see your QtSDK folder there.
To reset your location, go back to where it should begin. Type:
cd
with nothing else.
This will put you back in your home folder.
Last but not least,
<other parameters>
should be replaced with the actual parameters and not left as shown in your code.
PS: Capitalization is important.
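Putting it together, a minimal sketch of the whole sequence (assuming your Qt sources really are at the /Developer/Applications/Qt path you dragged into the Terminal):
cd /Developer/Applications/Qt
./configure -static             # add your other parameters after -static
make sub-src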
When attempting to run R, I get this error:
Fatal error: cannot mkdir R_TempDir
I found two possible fixes for this problem by googling around. The first was to ensure my tmp directory didn't contain a load of subdirectories - it doesn't, and it's virtually empty. The second fix was to ensure that TMP, TMPDIR, and R_USER in my environment weren't set to non-existent paths - I didn't even have these set. Therefore, I created a tmp directory in my home directory and added its path to TMP in my environment. I was able to run R once, and then I got the fatal error again. Nothing was in the TMP directory that I had set in my environment. Does anyone know what else I can try? Thanks.
Dirk is right, but misses a point: If /tmp is full, you can't create subdirectories there. Try
df /tmp
I just hit this on a shared server where /tmp is mounted on its own partition and is shared by many users. In this particular case, you can't really see whose fault it is, because permissions stop you from seeing who is filling up the tmp partition. You basically have to ask the sysadmins to figure it out.
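A quick sketch of that check (df -i is worth running too, because running out of inodes gives the same symptom even when free space remains):
df -h /tmp    # free space on the partition holding /tmp
df -i /tmp    # free inodes on that partition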
Your default temporary directory appears to have the wrong permissions. Here I have
$ ls -ld /tmp
drwxrwxrwt 22 root root 4096 2011-06-10 09:17 /tmp
The key part is 'everybody' can read or write. You need that too. It certainly can contain subdirectories.
Are you running something like AppArmor or SE Linux?
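If the permissions on /tmp really are off, restoring the standard mode (world-writable with the sticky bit, matching the listing above) is one command as root; only do this if your own ls -ld /tmp differs from the output shown:
sudo chmod 1777 /tmp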
Edit 2011-07-21: As someone just deemed it necessary to downvote this answer -- help(tempfile) is very clear on what values tmpdir (the default directory for temporary files or directories) tries:
By default, 'tmpdir' will be the directory given by 'tempdir()'. This
will be a subdirectory of the temporary directory found by the
following rule. The environment variables 'TMPDIR', 'TMP' and 'TEMP'
are checked in turn and the first found which points to a writable
directory is used: if none succeeds '/tmp' is used.
So my money is on checking those three environment variables. But AppArmor and SELinux have shown to be an issue too on some distributions.
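A quick way to check those three variables and the fallback directory from a shell (just a sketch):
env | grep -E '^(TMPDIR|TMP|TEMP)='   # see whether any of them is set, and to what
ls -ld /tmp                           # the fallback if none of them points to a writable directory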
Go to your user directory and create a file called .Renviron, add the following line, save it, and reopen RStudio, Rgui, or Rterm:
TMP = '<path to folder where Everyone has full control>'
This worked for me on Windows 7.
If you are running one of the rocker docker images (e.g., rocker/verse), you need to map a local directory to the /tmp directory in the container. For example,
docker run --rm -v ${PWD}/tmp:/tmp -p 8787:8787 -e PASSWORD=password rocker/verse:4.0.4
where ${PWD} for me is ~/devProjs/r, and I created a /tmp directory inside it, so that the container's /tmp is mapped to my ~/devProjs/r/tmp directory.
Just had this issue and finally solved it. It was simply a Windows permission issue. Go to the environment variables and find the location of the temp folders. Then right-click on the folder > Properties > Security > Advanced > change Everyone to Full control > tick "Replace all child object permission entries with inheritable permission entries from this object" > OK > OK.
This will also happen when your computer is completely, utterly out of space. Currently, my Mac has 0 kB free and it's causing this error. Freeing up some space solved the problem.
Check the user account with which you are launching RStudio, then check the TMP system environment variable for its location. If the user launching RStudio has write access to those directories, you will not face this issue. Since you are facing it, all you have to do is change the permissions so that the user has write access to those directories.
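As a sketch, from an elevated Command Prompt something like this grants that write access (both the path and the account name are placeholders; point it at the folder your TMP variable actually names):
icacls "C:\Users\yourname\AppData\Local\Temp" /grant "yourname:(OI)(CI)F"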
Running R on a CentOS system, I had the same issue. I had to remove all the R folders from the /tmp directory; they are usually named /tmp/Rtmp*****.
So I deleted those folders from /tmp by running the commands below.
cd into the /tmp directory and run rm -rf Rtmp*
The R shell worked for me afterwards.
I had this issue; the solution was slightly different. I run R on a Linux server - it turned out that R had made a whole load of tempdirs when running cron jobs that had hung and not been cleaned up, clogging up the root /tmp directory with ~300 RtmpXXXXXX folders.
Using terminal access, I navigated to the /tmp folder and did a recursive find/rm, deleting all of them with this command:
find . -type d -name 'Rtmp*' -exec rm -r -v {} \;
After this, Rstudio took a while to load up, but was once again happy and my scripts began to run again.
You will need the appropriate admin rights for this solution. And always be careful when running rm -r, especially with a find command, as it's easy to remove things unexpectedly.
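One way to stay on the safe side is to list the matches first and only delete once they look right (a sketch of the same cleanup):
find /tmp -maxdepth 1 -type d -name 'Rtmp*'                     # preview what would be removed
find /tmp -maxdepth 1 -type d -name 'Rtmp*' -exec rm -rv {} +   # then actually delete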
When it comes to deleting tmp files, first check whether the tmp files are on the server or on your local machine.
If they are on the remote server, first run df /tmp there to see what is using the most storage.
Then use rm <file_name> to remove the files that are causing the blockage.
If they are on the remote server, use rm /tmp/<file_name>.
Moreover, you can also refer to https://support.rstudio.com/hc/en-us/articles/218730228-Resetting-a-user-s-state-on-RStudio-Server