How to scan/list encrypted files? [closed] - encryption

The infamous CryptoWall has encrypted a large number of my files/folders.
While I have restored most of my files from backup, I am now looking for a way to scan the remaining encrypted files scattered across my local and network drives.
Is there a way of generating a list of those encrypted files (by scanning the file header or verifying file integrity)? Is it possible from the command line or with specific software?
Cheers,
Florian

You can try to generate a list of encrypted files with ListCWall:
http://www.bleepingcomputer.com/download/listcwall/
You can also try this link to decrypt files; it is better than nothing:
https://www.fireeye.com/blog/executive-perspective/2014/08/your-locker-of-information-for-cryptolocker-decryption.html
The original post is here:
http://www.bleepingcomputer.com/virus-removal/cryptowall-ransomware-information

CryptoWall stores the list of all encrypted files in the Windows registry.
Even after restoring from backup, some files may have been missed and might still be encrypted.
Looking at the modified-date attribute gives a short list of files that were likely encrypted and remain so.
Using Recuva (a Windows recovery tool), I have noticed that the magic numbers of encrypted files are random, while for a normal file those magic numbers (the first four bytes) are the same per file type. For example:
JPEG : FF D8 FF E0
EDIT: I have found the handy Unix command named "file". It is available on Linux, Cygwin, and OS X.
With a quick script that scans every file on the system, the files of unknown type are likely to be the remaining encrypted ones.
Comparing those magic numbers against the file extension is scriptable and should make it possible to determine what is encrypted/corrupted. Yet I have not found a tool that performs this check against a well-known database of magic numbers (first four bytes).
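A minimal sketch of that idea, assuming the "file" command is available; the scan root below is only an example, and files whose content "file" cannot identify (reported as plain "data") are flagged as probably still encrypted:

#!/bin/sh
# Walk the tree and flag files whose magic number is unrecognized.
# CryptoWall-encrypted files typically show up as generic "data"
# instead of "JPEG image data", "PDF document", etc.
find /path/to/scan -type f | while IFS= read -r f; do
    type=$(file -b "$f")
    case "$type" in
        data) printf '%s\n' "$f" ;;   # unidentified content: likely encrypted
    esac
done > suspected-encrypted.txt

Files whose extension suggests a known type (JPEG, PDF, DOCX, ...) but which end up in suspected-encrypted.txt are the ones to treat as still encrypted.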

Related

How does UNIX handle a move of a file between two disk file systems? [closed]

I have three directories on a UNIX box, as described below:
/tmp mapped on /dev/mapper/data-tmpVol1
/var mapped on /dev/mapper/data-varVol1
/opt mapped on /dev/mapper/data-optVol1
If I perform a move operation from /tmp to /var, will UNIX in fact do a copy, since there are two different file systems behind the scenes?
If I want an instant move, is it better to copy the file first into /var/staging and then perform a move from /var/staging to /var/input?
Context around the issue: I have a process which scans for files in /var/input, and I've seen cases when it picked up half-copied files (when moving directly from /tmp to /var/input).
Regards,
Cristi
When moving across file systems, you may want to create the file in the destination directory with a temporary filename, e.g. my-file.txt~. The scanning process must ignore such temporary filenames. When the file is complete, you rename it to its final name. This way, when the file (with its final name) exists it is complete; otherwise it does not exist at all.
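A minimal sketch of that pattern, assuming a hypothetical file report.csv and the directories from the question (the trailing ~ is just a naming convention the scanner must be configured to ignore):

cp /tmp/report.csv /var/input/report.csv~
mv /var/input/report.csv~ /var/input/report.csv
rm /tmp/report.csv

The cp crosses file systems and may take a while; the final mv stays inside /var, so it is an atomic rename and the scanner never sees a half-written report.csv.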

tar/zip AND transfer to another server with limited space remaining [closed]

I have a folder which is 60 GB in size on a server I need to destroy, but I only have 6 GB of space remaining on the server.
Besides the size of the folder, there are literally hundreds of thousands of small files in it, so a simple scp would take forever. I really want to tar czf the folder, but again, I don't have the space.
Is there any way to do this?
Another method that you can use (and which fulfills the "transfer to another server" part of the request):
tar cz sourcedir/ | ssh somewhere 'cat > dest.tar.gz'
Unlike scp, it's not doing individual operations, with separate round-trips, for every little file, so it will go just as fast as you can gzip (or just as fast as your network can transfer, if that's slower). Since the archive is getting written to a remote server, you don't have to worry about disk space. And since it isn't deleting as it goes, you can ^C it without being left with half of your files in their original locations and the other half in the tarball.
You can also get a live filesystem (instead of an archive) on the destination end just by changing to
tar cz sourcedir/ | ssh somewhere 'tar xC destdir/'
which operates a bit like rsync without the "sync". Add a v to the right-hand tar command to list files as they're received by the destination server.
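For example (somewhere, sourcedir/ and destdir/ are placeholders for your own host and paths):

tar cz sourcedir/ | ssh somewhere 'tar xvC destdir/'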
I discovered a solution and wanted to share it: --remove-files is the way to achieve this.
So my command is this:
tar --remove-files -czf cpm006.tar.gz cpm006/
In a second terminal window, running du -sh /home/cpm006 several times confirmed that the files are deleted as soon as they are added to the tar archive.
The obvious benefit is that you can create archives to free disk space even when that space is limited.
Reference:
https://linux.die.net/man/1/tar
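If local space is too tight even to hold the archive, the two answers above can be combined; a sketch with placeholder host and directory names:

tar --remove-files -czf - cpm006/ | ssh somewhere 'cat > cpm006.tar.gz'

This streams the compressed archive straight to the remote server and deletes each local file as soon as it has been archived, so the local copy never grows. The trade-off is that an interrupted run leaves some files only in the partial remote archive.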

Download files with specific date from SFTP server using PSFTP [closed]

How can I download files of a specific time period through PSFTP?
When I do mget *.*, it downloads all the files into the local folder. I'm not allowed to delete these files from the SFTP server or move them, so every time I download, it has to download the complete set.
Is there a way to download, through mget, only those files which are a week old?
PSFTP does not support time-based file selection.
You can use the scripting interface of WinSCP instead.
It supports time constraints in file masks.
To download all week old (7 days old) files, use the following command:
get *<7D
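A complete script might look something like this (the hostname, credentials, host key, and local path are placeholders you would replace with your own values):

winscp.com /ini=nul /script=download.txt

where download.txt contains:

open sftp://user:password@sftp.example.com/ -hostkey="ssh-rsa 2048 ..."
lcd C:\Downloads\weekly
get *<7D
exit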
See the guide for converting PSFTP script to WinSCP script.
For general introduction to WinSCP scripting see:
https://winscp.net/eng/docs/guide_automation
See also similar question WinSCP time based file download.
(I'm the author of WinSCP)

Microsoft Word suddenly won't save files; "word could not create the work file - check the temp environment variable" [closed]

My brand-new install of Word 2007, which had been working just fine, suddenly refused to save any files. I'd hit Ctrl-S, and it wouldn't complain but it wasn't saving. Then upon exiting, Word would ask if I wanted to save. I'd click on Yes, and the same pop-up would appear, endlessly, until I chose Cancel. Also, on opening files, Word gave an error message about not being able to access a TEMP file. Exact wording: "word could not create the work file - check the temp environment variable".
I searched all over, including Stack Overflow, and none of the suggested fixes addressed the problem. But they did lead me to look at the folder C:\Users\MyUserName\AppData\Local\Microsoft\Office (where of course MyUserName is my user name). The folder was encrypted. I decrypted it, and -- bingo! -- Word worked perfectly again.
This was on a 3-year-old laptop with a fresh disk re-image (OS=Win7) from the helpdesk at work. The weird thing is, that folder is encrypted on my desktop at work, and everything works fine there.
Since I couldn't find this solution anywhere, I figured I'd post it to stackoverflow. This fix is so easy that people may as well try it before any of the other proposed fixes.
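For reference, the same check and fix can be done from the command line with the built-in cipher tool (the path is the one from the post):

cipher "C:\Users\MyUserName\AppData\Local\Microsoft\Office"
cipher /d /s:"C:\Users\MyUserName\AppData\Local\Microsoft\Office"

The first command displays the encryption status; the second decrypts the folder and everything inside it.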

recover disconnected mapped drive [closed]

Is there any way to recover a mapped drive that was disconnected, without knowing the server address or name? I do not want to browse through over 85 server IPs to find the correct one.
From googling "windows history of mapped drives", I came upon this.
Doing a quick search of my registry, I found that the following keys contain drive-map history:
HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\Map Network Drive MRU
HKEY_USERS\<user SID>\Software\Microsoft\Windows\CurrentVersion\Explorer\Map Network Drive MRU
I've just checked my own registry and it seems to work.
Lots of others will be findable by searching for 'MRU' apparently.
My old mount point didn't appear in the registry when I searched for "MRU" on my Windows 7 laptop. I'm guessing I re-logged in too often after losing the mount point, and thus the history was lost.
But I was able to find my old mount path listed under HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\MountPoints2. You might find your old mount listed as one of the keys under that key. Just replace any hash marks (#) with backslashes (\).
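If you prefer the command line, both keys can be dumped with reg query (the exact output format varies by Windows version):

reg query "HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\Map Network Drive MRU"
reg query "HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\MountPoints2"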
I have a suggestion that I think a lot of people overlooked; it took me a while to find it. Since it was a server I was having issues with, I pressed Ctrl-Alt-Del and looked at the logged-in users. There I noticed another login: the same user name, logged off, but still listed. I logged that session off again using the log-off option, and suddenly my mapped drives were fine, showing the amount of storage used, etc. So it looks like this solved my issue with losing mapped drives. Hope this helps others still scratching their heads.
You may use the wmic command to find out the mapped-drive history:
wmic path win32_mappedlogicaldisk get DeviceID, ProviderName
