Do not recurse into all of the root's subdirectories

I have a huge "Downloads" directory that I would like watchman to watch, but I am only interested in changes directly in the directory root, i.e., when a new file appears there.
When I run watchman watch /path/to/dir, the whole directory tree is scanned, which takes more than a minute to finish. What I am looking for is something like a --no-recurse option that would prevent watchman from descending into subdirectories at all.
I checked the documentation for the watch command but found nothing that solves this problem.
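Watchman has no --no-recurse flag, but as a partial mitigation it does honor an ignore_dirs list in a .watchmanconfig file at the watch root, and the initial crawl skips those subtrees entirely. A sketch (the subdirectory names below are placeholders, not anything from the question):

```shell
# Write a .watchmanconfig that prunes the heaviest subtrees from the crawl
# (directory names here are examples only)
cat > /path/to/dir/.watchmanconfig <<'EOF'
{"ignore_dirs": ["big-subdir-1", "big-subdir-2"]}
EOF
watchman watch-del /path/to/dir   # drop any existing watch so the config is re-read
watchman watch /path/to/dir
```

This still watches recursively outside the ignored directories, so it only helps if a few large subtrees dominate the crawl time.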

Related

Change JupyterLab startup folder (macOS: Post Hurricane Catalina)

After updating to Catalina and reinstalling anaconda3, my new startup folder is my user folder, with no way to navigate outside of it. I'm just trying to reach the directory root so I can navigate to my code files.
I've tried navigating via path (File -> Open from Path, adding Desktop/), which won't go there. I've tried navigating up and out (cd ~/), which also won't work. I found a post about this, but it's for Windows.
Please help -- this update is ruining my week. Half of my paths are broken, and I'm ready to Time Machine or eBay this garbage.
I created an alias file and navigated through that to the desktop, which works as a somewhat strange solution: basically I alias a folder inside the startup folder and place the alias on the desktop. It's the only solution I've been able to find, aside from possibly reinstalling from scratch.
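As a more permanent fix than the alias trick, Jupyter's own configuration can pin the startup directory. This is a sketch of the relevant line in the generated config file; the path is an example, and on newer Jupyter releases the key is c.ServerApp.root_dir instead:

```python
# Sketch: edit ~/.jupyter/jupyter_notebook_config.py
# (create it first with `jupyter notebook --generate-config` if it doesn't exist).
# The path below is an example -- use whatever directory you want as the root.
c.NotebookApp.notebook_dir = '/Users/yourname/Desktop'
```

Alternatively, launching Jupyter from the desired directory (cd ~/Desktop && jupyter lab) makes that directory the root for that session.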

Git: Ignored a file, now says there's unmerged files and can't pull

Previously, I had WordPress' wp-config.php committed; since then, I've now ignored this file in .gitignore, which is fine.
Unfortunately, now when pulling on the live server, it gives this error:
M .htaccess
U wp-config.php
Pull is not possible because you have unmerged files.
Please, fix them up in the work tree, and then use 'git add/rm <file>'
as appropriate to mark resolution, or use 'git commit -a'.
Before this issue happened, I had an issue with pulling where it said I had uncommitted changes with these files - so I let it overwrite them, then I uploaded the actual files manually via FTP, which caused this new error.
I want to keep the files in their current state on the live server, but get around this error now and in the future.
What is the process for pulling the latest changes in this situation?
UPDATE - I'm adding an answer to a question posed in comments, and clarifying a related point
First let's clear up a little confusion: This error has nothing to do with putting the file in .gitignore. In fact, since the file is present in the index, .gitignore has no effect whatsoever on the file.
Before this issue happened, I had an issue with pulling where it said I had uncommitted changes with these files - so I let it overwrite them
You may need to be more clear about what commands you issued and what output you saw, because this is where your trouble is coming from.
When you say you "let it overwrite them", what does that mean? git resists overwriting local changes (especially uncommitted changes)...
I'm guessing what really happened is that git tried to merge changes from the remote repo into your local changes for those files, but had merge conflicts on wp-config.php.
And most likely it's still in a merging state, which you'll have to resolve before you can move on. If you say git status it will likely tell you that you're merging, with some changes "to be committed" (likely including the .htaccess file) and some "unmerged paths" (likely including the .php file).
If you have the .php file looking the way you want it in the work tree, then you can say git add wp-config.php and then git commit which should cause the merge to complete. (More generally, you have to get the file looking how you want it in the index, and do this in a way that tells git the conflict is resolved; and then you can commit to get out of the merging state.)
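For the case described above (the merge stopped with wp-config.php unmerged, and the work-tree version is the one you want to keep), the resolution can be sketched as:

```shell
# Sketch of resolving the stalled merge; assumes the work-tree copy of
# wp-config.php is already the version you want to keep.
git status                # should list wp-config.php under "Unmerged paths"
git add wp-config.php     # marks the conflict resolved with the work-tree version
git commit                # completes the merge (an editor opens with a default message)
```

After the commit, git status should report a clean working tree and the merge state is gone.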
Now in comments you ask about whether this will put the file "back" into git. And that comes down to what it means to "have the .php file looking the way you want it".
If you never want git to provide the .php file (even during a fresh clone), then you need to remove it from the index and from subsequent commits.
You could (at least temporarily) remove the file from the working tree and then do a git add (as noted above). Or, if you don't want to affect your working tree version, you can
git rm --cached wp-config.php
directly making the index look "how you want it". At this point it becomes possible for your .gitignore entry to help you avoid accidentally reintroducing the file.
If what you mean is that the file should be there, but only a default version should be in the repo (not taking any changes that might be made in the working tree), git won't do that. You'll have to get where you're going a different way. For example you could put the file in the repo as wp-config.php.default and ignore the wp-config.php path. After cloning the repo you would then copy the default file, and any local changes would be made only to the ignored copy.

ZFS folder duplication and deletion

I had a folder encrypted by encfs, stored on an ext4 partition. Because I didn't have enough space there, I decided to move it to a compression-enabled zfs partition on another hard disk.

During the move, along with the destination folder 'td', another folder '.shutdowntd' was also created. I don't know why; I didn't create it. Maybe zfs itself created it, or maybe the encfs manager did. Last night I looked into it: the number of files in both directories was the same, and for every file in 'td' there was a corresponding file in '.shutdowntd'. File sizes were not exactly the same, but nearly so. When I deleted a file in one directory, the corresponding file in the other was also deleted! The names in '.shutdowntd' were different and appeared to be hash-coded.

My computer stayed on during the night. Today I saw that the folder 'td' had been removed and a zero-byte file with the same name had been created! After restarting Ubuntu (16.04), the file changed back into a folder named 'td' with no content, while '.shutdowntd' still exists with its files intact.

I can't explain this behavior. Essentially: why was '.shutdowntd' created? Why was 'td's content emptied? How can I recover it? What's happening?!

Brackets, remembering previous session

Every time I open Brackets, it points to a start-up folder instead of the previously opened files and folder. Is there a way to retain the previous session?
Brackets should remember which folder you had open, so it's hard to say exactly why it isn't. Here are some things to try, though:
If you have any extensions installed, try uninstalling them. (If that fixes it, you can reinstall them one by one to see which one was the problem).
If you select Debug > Show Developer Tools in the menu, are there any errors listed in the Console tab?
If you select Help > Show Extensions Folder and then go up one level, is there a state.json file there? If so, try deleting/renaming it to see if that fixes things. If not, make sure the permissions on that folder are ok.

DirCopy() Not Working

I'm working in AutoIt to script a basic task I'll have to repeat on 50-ish workstations.
I need to copy a directory and its subdirectories and files (recursively) to a network share as a backup. For some reason, DirCopy() does not work at all.
I've tried running it on several different directories (thinking it might be a permissions issue; I'm on a Domain Admin account), tried a RunAs (again suspecting permissions), and added the #RequireAdmin directive to force the program to run elevated. Nothing has worked. I can't even get it to copy empty directories.
DirCopy(@DesktopDir & "\SAMPLE\TEST1", @DesktopDir & "\SAMPLE\TEST2", 0)
Please advise!
Just figured this one out.
It turns out DirCopy() cannot cope with the destination directory already existing (it wants to create it for you). So if you delete the destination directory and then run the line above, everything works as expected. But if you then add a new file to the source directory (TEST1 in my example), it breaks again and does nothing.
Go figure...
Now to find a workaround using something like xcopy...
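Given that behavior, a defensive sketch in AutoIt is to remove the destination before copying (flag values per the AutoIt function reference: DirRemove's 1 means recurse, DirCopy's 1 means overwrite existing files; the paths are the ones from the question):

```autoit
Local Const $sSrc = @DesktopDir & "\SAMPLE\TEST1"
Local Const $sDst = @DesktopDir & "\SAMPLE\TEST2"

; DirCopy() wants to create the destination itself, so clear it first
If FileExists($sDst) Then DirRemove($sDst, 1) ; 1 = remove recursively

If Not DirCopy($sSrc, $sDst, 1) Then ; 1 = overwrite existing files
    MsgBox(16, "Backup", "DirCopy failed for " & $sSrc)
EndIf
```

Deleting the destination loses any files that exist only there, so for a true incremental backup, shelling out to xcopy or robocopy is probably still the better route.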
