Having read this past question for git, I would like to ask if something similar exists that:
can be done programmatically (from a file list) on each machine;
works for Mercurial.
The reason for this is that I would like to include in my public dotfiles repository some configuration files that store passwords in plaintext. I know I could write a wrapper script around hg(1), but I would like to know if there are alternative approaches, just for the sake of curiosity.
Thank you.
You could use a pair of pre-commit and post-update hooks to encrypt/decrypt as necessary. See http://hgbook.red-bean.com/read/handling-repository-events-with-hooks.html for more details.
However, it's worth pointing out that if you're storing encrypted text in your repo you'll be unable to create meaningful diffs -- essentially everything will be like a binary file but also poorly compressible.
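In hgrc terms, such a hook pair might be wired up like this. This is only a sketch: the two scripts are hypothetical placeholders you would supply yourself, and the hook names are the standard precommit/update events.

```ini
[hooks]
# before each commit is recorded: encrypt the sensitive files in the working copy
precommit.encrypt = ./encrypt-secrets.sh
# after each working-directory update: decrypt them again for local use
update.decrypt = ./decrypt-secrets.sh
```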
Mercurial has a filter system that lets you mangle files when they are read from the repository or written back. If you have a program like the SSH agent running that lets you do non-interactive encryption and decryption, then this might just be workable.
As Ryan points out, this will necessarily lead to a bigger repository since each encrypted version of your files will look completely different from the previous version. Mercurial detects this and stores the versions uncompressed (encrypted files cannot be compressed anyway). Since you will use this for dotfiles, you can ignore the space overhead, but it's something to take into consideration if you will be versioning bigger files in encrypted form.
Please post a mail to the Mercurial mailing list with your experiences so that other users can benefit from them too.
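For reference, the filter system mentioned above is configured via [encode] and [decode] sections in an hgrc: encode filters run when files are committed into the repository, decode filters when they are written back to the working directory. A sketch using gpg might look like the following; the **.secret pattern and the key id are assumptions for illustration only:

```ini
[encode]
# encrypt matching files as they are committed into the repository
**.secret = pipe: gpg --batch --encrypt --recipient YOUR-KEY-ID

[decode]
# decrypt them as they are checked out into the working directory
**.secret = pipe: gpg --batch --decrypt
```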
I want to upload COBOL files to Nexus. These COBOL files can be given a groupId, artifactId and version. For the packaging/type I would use cobol.
Is there any harm in doing this (these files are not zipped like jar, war, ear or zip files)?
At the moment we are using Nexus 2.14, but I would like to use this method in the future with Nexus 3.x or Artifactory as well.
The reason: our developers often have to release WARs/EARs and COBOL files simultaneously, and I would like to handle them in a similar way.
Repository managers are at their core somewhat like hard drives. There's no harm in putting anything in them, but problems can arise at usage time, which is generally the bigger concern. Without knowing the specific usage, it's really impossible to say.
NXRM3 does have a "raw" format which is intended for "everything else" use, so as not to tangle with your actual repository formats. Documentation here: https://help.sonatype.com/display/NXRM3/Raw+Repositories+and+Maven+Sites
That being said, there are use cases where formats need to intermingle with things outside their norm, and it is then helpful to store them side by side. The default security of an NXRM repository prevents this, but it can be lowered to accept anything (temporarily or permanently). See the "Strict Content Validation Type" field in the repository configuration. This is at your own risk. =)
My computer has been infected by the "Cerber Ransomware", and now I can't use my important files. How can I decrypt these files without paying the demanded ransom?
Before suggesting a sure shot, I would be interested to know which Cerber version you are talking about. If it's Cerber V1, you can easily remove it using a tool developed by TrendMicro, but if it is a later version (anywhere from V2 to V6), you must wait for a tool to be released. There's no solution as of now to remove Cerber ransomware from your PC.
Actually, I went through another ransomware with "no solution yet". To solve it, I used HxD to inspect the bytes of the encrypted files and compared them against unencrypted files with the same extension (for example, two video files from the same camera, one encrypted and one not). By doing that I was able to recognize what was wrong and copy the bytes from the unencrypted file into the encrypted one. In this way I recovered mainly video and audio files, and it's a good way to recover big files because the ransomware usually encrypts only the header of each file. In any case, remember to always keep copies of the encrypted files before manipulating them as I did, otherwise you may lose them!
Is there a directory-encryption variant similar to VIM's "vim -x file"? I am looking for something like "mkdir -encrypt folder".
There is no "general" way to encrypt directories (that is, one that works across all filesystems and operating systems); see below.
You can, however (as Dante mentioned) use TrueCrypt to create an encrypted filesystem in a file, then mount ("attach", in Windows terminology?) that file.
If you're using Linux, you can even mount that file at a particular directory, to make it appear that the directory is encrypted.
If you want to know how to use TrueCrypt, checkout the docs for Windows here: http://www.truecrypt.org/docs/?s=tutorial and for Linux here: http://www.howtoforge.com/truecrypt_data_encryption (scroll down to the "TrueCrypt Download" heading).
So, a quick explanation why you can encrypt files but not directories:
As far as the "computer" (that is, the hardware, operating system, filesystem drivers, etc.) is concerned, "files" are just "a bunch of bits on disk" (in the same way a book is "just a bunch of ink on paper"). When a program reads from or writes to a file, it can read or write whatever the heck it wants -- so if that program wants to encrypt some data before writing it to the file, or read a file then decrypt the data that it reads, great.
Directories are a different story, though: to read (i.e., list) or write (i.e., create) directories, the program (be it mkdir, ls, Windows Explorer or Finder) has to ask the operating system, and the operating system asks the filesystem driver: "Hey, can you make the directory /foo/bar?" or "Hey, can you tell me what's in /bar/baz?" All the program or operating system sees (basically) is a function to make directories and a function to list the contents of a directory.
So, to encrypt a directory, you can see that it would have to be the filesystem driver doing the encryption, not the program creating/listing the directories... and mainstream filesystems generally do not support per-directory encryption.
On Linux, the simplest way is probably to use EncFS:
"EncFS provides an encrypted filesystem in user-space. It runs without any special permissions and uses the FUSE library and Linux kernel module to provide the filesystem interface."
It basically mounts an encrypted folder as a plain one.
More info on Wikipedia.
TrueCrypt. It's open source and supports multiple types of encryption. What operating system do you want to know about?
Edit: Windows Vista/XP, Mac OS X, and Linux are all supported.
I would recommend the Enterprise Cryptographic Filesystem, i.e. ecryptfs (found via apt-get as ecryptfs-utils in Debian/Ubuntu), because it is more flexible than TrueCrypt.
It is probably one of the strongest ways listed here to encrypt a directory.
It can be used with two passphrases (a login passphrase and a mount passphrase), making it a kind of double-password system.
It also has a POSIX-compliant implementation.
The limitation of this system, like many other encryption systems, is that it supports filenames/directory names only up to 144 characters, in contrast to the Linux standard of 255.
It has been maintained for four years and last updated 4 months ago, which bodes well for its future.
A comparison between TrueCrypt and ecryptfs, from this blog post:
Truecrypt is simulated hardware encryption. It creates a virtual encrypted hard disk, which your operating system can more or less treat like an ordinary hard disk, but for the kernel hooks Truecrypt adds to lock and unlock the disk. EcryptFS is an encrypted filesystem. Unlike Truecrypt, which encrypts individual disk blocks, systems like EcryptFS encrypt and decrypt whole files.
And more comparison between the two systems here:
Those complications (and the fact that ecryptfs is slower) are part of why people like block-level encryption like TrueCrypt, but I do appreciate the flexibility of ecryptfs.
I am a Mac/Ubuntu user. I have folders such as "AWK", "awk", "awk_tip" and "awk_notes". I need to archive them, but the variety of utilities confuse me. I had a look at Tar, cpio and pax, but Git has started to fascinate me. I occasionally need encryption and backups.
Please, list the pros and cons of different archiving utilities.
Tar, cpio and pax are ancient Unix utilities. For instance, tar (which is probably the most common of these) was originally intended for making backups on tapes (hence the name, tar = tape archive).
The most commonly used archive formats today are:
tar (in Unix/Linux environments)
tar.gz or tgz (a gzip compressed tar file)
zip (in Windows environments)
If you want just one simple tool, take zip. It works right out of the box on most platforms, and it can be password protected (although the protection is technically weak).
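For concreteness, the tar.gz workflow mentioned above can be sketched as follows; the folder names are stand-ins for the question's awk folders (the case-colliding "awk" is omitted so the sketch also runs on case-insensitive Mac filesystems):

```shell
# sample folders to archive (stand-ins for the question's awk folders)
mkdir -p AWK awk_tip awk_notes
echo 'sample' > AWK/notes.txt

# pack them into one gzip-compressed tar archive
tar czf awk-backup.tar.gz AWK awk_tip awk_notes

# list the archive's contents without extracting
tar tzf awk-backup.tar.gz

# extract into a separate directory
mkdir -p restore
tar xzf awk-backup.tar.gz -C restore
```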
If you need stronger protection (encryption), check out TrueCrypt. It is very good.
Under what OS / toolchain are you working? This might limit the range of existing solutions. Your name suggests Unix, but which one? Further, do you need portability or not?
The standard Linux solution (at least to a newbie like me) might be to tar and gzip or bzip2 the folders, then encrypt them with gnupg if you really have to (encrypting awk tutorials seems a bit of overkill to me). You can also use full-fledged backup solutions like bacula, or sync to a different location with rsync (perhaps to a backup server?).
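The tar-plus-gnupg combination might look like this; the folder name and passphrase are placeholders, and --pinentry-mode loopback is needed on newer gpg versions for non-interactive symmetric encryption:

```shell
# a sample folder to protect
mkdir -p myfolder
echo 'secret notes' > myfolder/notes.txt

# archive and compress the folder, then encrypt the stream symmetrically
tar czf - myfolder | gpg --batch --yes --symmetric \
    --passphrase 'correct horse' --pinentry-mode loopback \
    -o myfolder.tar.gz.gpg

# later: decrypt and unpack into a fresh location
mkdir -p restored
gpg --batch --passphrase 'correct horse' --pinentry-mode loopback \
    -d myfolder.tar.gz.gpg | tar xzf - -C restored
```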
If you're backing up directories from an ext2/ext3 filesystem, you may want to consider using dump. Some nice features:
it can back up a directory or a whole partition
it saves permissions and timestamps
it allows you to do incremental backups
it can compress (gzip or bzip2)
it will automatically split the archive into multiple parts based on a size limit if you want
it will back up over a network or to a tape as well as to a file
It doesn't support encryption, but you can always encrypt the dump files afterwards.
Like most *nix people, I tend to play with my tools and get them configured just the way that I like them. This was all well and good until recently. As I do more and more work, I tend to log onto more and more machines, and have more and more stuff that's configured great on my home machine, but not necessarily on my work machine, or my web server, or any of my work servers...
How do you keep these config files updated? Do you just manually copy them over? Do you have them stored somewhere public?
I've had pretty good luck keeping my files under a revision control system. It's not for everyone, but most programmers should be able to appreciate the benefits.
Read
Keeping Your Life in Subversion
for an excellent description, including how to handle non-dotfile configuration (like cron jobs via the svnfix script) on multiple machines.
I also use subversion to manage my dotfiles. When I login to a box my confs are automagically updated for me. I also use github to store my confs publicly. I use git-svn to keep the two in sync.
Getting up and running on a new server is just a matter of running a few commands. The create_links script just creates the symlinks from the .dotfiles folder items into my $HOME, and also touches some files that don't need to be checked in.
$ cd
# checkout the files
$ svn co https://path/to/my/dotfiles/trunk .dotfiles
# remove any files that might be in the way
$ .dotfiles/create_links.sh unlink
# create the symlinks and other random tasks needed for setup
$ .dotfiles/create_links.sh
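The create_links.sh script itself isn't shown above; a minimal sketch of just the symlinking step might look like this (it assumes the repository stores files without their leading dots, and the demo runs in a scratch HOME so it is self-contained):

```shell
#!/bin/sh
# demo in a scratch HOME so the sketch is self-contained
HOME="$(mktemp -d)"
mkdir -p "$HOME/.dotfiles"
echo 'alias ll="ls -l"' > "$HOME/.dotfiles/bashrc"

# symlink every top-level entry of ~/.dotfiles into $HOME as a dotfile
# (e.g. .dotfiles/bashrc becomes ~/.bashrc)
for f in "$HOME"/.dotfiles/*; do
    ln -sf "$f" "$HOME/.$(basename "$f")"
done

ls -l "$HOME/.bashrc"
```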
It seems like everywhere I look these days I find a new thing that makes me say "Hey, that'd be a good thing to use DropBox for"
Rsync is about your best solution. Examples can be found here:
http://troy.jdmz.net/rsync/index.html
I use git for this.
There is a wiki/mailing list dedicated to the topic.
vcs-home
I would definitely recommend homesick. It uses git and automatically symlinks your files. homesick track tracks a new dotfile, while homesick symlink symlinks dotfiles from the repository into your home folder. This way you can even have more than one repository.
You could use rsync. It works through ssh which I've found useful since I only setup new servers with ssh access.
Or, create a tar file that you move around everywhere and unpack.
I store them in my version control system.
I use svn, with a public and a private repository, so as soon as I get onto a server I just run
svn co http://my.rep/home/public
and have all my dotfiles.
I store mine in a git repository, which allows me to easily merge beyond system dependent changes, yet share changes that I want as well.
I keep master versions of the files under CM control on my main machine, and where I need to, arrange to copy the updates around. Fortunately, we have NFS mounts for home directories on most of our machines, so I actually don't have to copy all that often. My profile, on the other hand, is rather complex - and has provision for different PATH settings, etc, on different machines. Roughly, the machines I have administrative control over tend to have more open source software installed than machines I use occasionally without administrative control.
So, I have a random mix of manual and semi-automatic process.
There is netskel where you put your common files on a web server, and then the client program maintains the dot-files on any number of client machines. It's designed to run on any level of client machine, so the shell scripts are proper sh scripts and have a minimal amount of dependencies.
Svn here, too. Rsync or unison would be a good idea, except that sometimes stuff stops working and I wonder what was in my .bashrc file last week. Svn is a life saver in that case.
Now I use Live Mesh which keeps all my files synchronized across multiple machines.
I put all my dotfiles in to a folder on Dropbox and then symlink them to each machine. Changes made on one machine are available to all the others almost immediately. It just works.
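A sketch of that symlink setup, using scratch directories to stand in for ~/Dropbox and ~ so the example is self-contained:

```shell
# scratch dirs standing in for ~/Dropbox and the home directory
dropbox="$(mktemp -d)"
home="$(mktemp -d)"
mkdir -p "$dropbox/dotfiles"
echo 'export EDITOR=vim' > "$dropbox/dotfiles/.profile"

# keep the real file in Dropbox; leave only a symlink in the home dir
ln -s "$dropbox/dotfiles/.profile" "$home/.profile"

# edits made through the symlink land in Dropbox and sync everywhere
echo 'export PAGER=less' >> "$home/.profile"
```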
Depending on your environment you can also use (fully backed-up) NFS shares...
Speaking of storing dotfiles publicly, there are
http://www.dotfiles.com/
and
http://dotfiles.org/
But it would be really painful to manually update your files, as (AFAIK) neither of these services provides any API.
The latter is really minimalistic (no contact form, no information about who made/owns it, etc.).
briefcase is a tool to facilitate keeping dotfiles in git, including those with private information (such as .gitconfig).
By keeping your configuration files in a public git repository, you can share your settings with others. Any secret information is kept in a single file outside the repository (it's up to you to back up and transport this file).
-- http://jim.github.com/briefcase
mackup
https://github.com/lra/mackup
lra/mackup is a utility for Linux & Mac systems that will sync application preferences using almost any popular shared storage provider (Dropbox, iCloud, Google Drive). It works by replacing the dotfiles with symlinks.
It also has a large library of hundreds of supported applications: https://github.com/lra/mackup/tree/master/mackup/applications