What to include when teaching a UNIX course? [closed] - unix

I was asked to teach UNIX to a group of people in my company who probably don't know much about UNIX and come from a total Windows background. Help me decide the course contents. I don't want to go in and teach them a set of commands; I want it to be more along the lines of the UNIX architecture, the file system, pipes, how everything is a file, process creation and handling, virtual memory management, etc. What do you all think? Help me in covering these topics.

You must read The UNIX Philosophy by Mike Gancarz. It might be worthwhile using it as a text, but it will definitely give you a lot of the reasons why UNIX is good and how to leverage its power best.

Unix topics in order of importance (a short shell example follows the list):
Pipes
Tool philosophy (do one thing well)
The permissions model
Shell syntax
Interacting with processes
Picking and using an editor
Basic C programming
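To make the first couple of items concrete, a short demo like the following works well as an opener; the file name in the permissions example is a placeholder:
# Each command does one small job; the pipe composes them into a tool
# that reports the five largest files under the current directory.
du -a . | sort -rn | head -n 5
# The permissions model in two commands: inspect, then change.
ls -l notes.txt
chmod u=rw,go=r notes.txt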
An ideal way for Windows folks to learn how to function in a Unix environment is to have them use Cygwin on their Windows box.

Both Unix and Windows share most of their basic OS concepts: file descriptors, processes, virtual memory, etc... The only main difference you will need to address immediately is the different path tree structure: single root plus mount points vs drive letters.
I think you have to distinguish between several widely different topics:
using the shell:
You will need to get into concepts like process structure, file descriptors, basic commands.
programming under Unix:
You will need to address IDEs, compiling tools, build tools, and dynamic linking (see the small build example after this list).
using the Unix desktop:
Modern Unices all have fairly comprehensive desktop environments that work in a pretty similar way to Windows... no big learning curve there.
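For the compiling and dynamic linking item, a short hands-on sequence is worth walking through; this is a rough sketch with example file names, assuming a cc compiler is available:
# Build a tiny shared library, link a program against it, and watch the
# dynamic loader at work.
cc -fPIC -shared -o libgreet.so greet.c
cc -o hello main.c -L. -lgreet
LD_LIBRARY_PATH=. ./hello        # the loader resolves libgreet.so at run time
ldd ./hello                      # list the shared libraries the binary needs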

You should include information about the shell. Explain the standard old method of using the output of one command as the input for the next, using the pipe.
Also show how output redirection is powerful, and how error redirection works (2> and 2>&1).
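A handful of lines covers pipes, output redirection and error redirection; the file names below are placeholders:
# Pipe: a word-frequency counter built from single-purpose tools.
tr -cs '[:alpha:]' '\n' < report.txt | sort | uniq -c | sort -rn | head
# Output redirection: overwrite, then append.
ls -l > listing.txt
ls -l >> listing.txt
# Error redirection: stderr to its own file, or merged into stdout.
make 2> build-errors.log
make > build.log 2>&1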
Have your "Students" install Cygwin on their workstations to give them the opportunity to run "Unix-Commands" right inside Windows.

Underlying theory is always good; mention why UNIX is designed the way it is. Eric Raymond's The Art of UNIX Programming is good for that.
If they're going to be developing for UNIX, some of the standards won't go amiss, the Filesystem Hierarchy Standard and POSIX for example.

Sounds to me like you want to take a basic OS course and make it UNIX-specific. If you're designing the course for developer types, I'd think that would work well - they'd be familiar with basic OS constructs and would appreciate knowing the UNIX-specific flavors and then the commands that interact with each construct.
If you're designing the course for regular people, though, they might get lost in the OS theory. Even with a simple OS example, the whole thing gets very complicated.
My favorite UNIX book of all time is "A Student's Guide to UNIX". I'm sure there are many great competitors out there. But what I liked was that it combined commands with basic theory and bundled each section with a bit of history on why given parts of the OS were designed a certain way and/or a bit of history on who the designers were. So much of UNIX is the commands; it was nice to have all those little blurbs, and they were often nice memory joggers.

I'd start with fundamentals and compare each concept to its Windows counterpart. Kernel, driver, memory, process, daemon, file, user, a shell (vs. the command prompt), a filesystem etc.

Let them run some UNIX-like system, e.g. from a live CD (Ubuntu or Knoppix, and maybe some other live UNIX systems as well).
If they are power windows users, compare bash to powershell.
Most Windows users also don't get the concept of init scripts vs. Windows services, so I would explain that as well.
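A skeleton SysV init script makes the comparison concrete; the daemon name and paths below are placeholders, not a real service:
#!/bin/sh
# Minimal /etc/init.d-style script: the Unix counterpart of a Windows
# service's start/stop/status controls.
case "$1" in
  start)  /usr/sbin/mydaemon & echo $! > /var/run/mydaemon.pid ;;
  stop)   kill "$(cat /var/run/mydaemon.pid)" ;;
  status) kill -0 "$(cat /var/run/mydaemon.pid)" && echo "running" ;;
  *)      echo "Usage: $0 {start|stop|status}"; exit 1 ;;
esac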

General directory structure.
Sockets and other various IPC mechanisms. Unix lets you treat them as files, which makes programming easier (see the sketch after this list).
pthread library and concurrency concepts.
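To illustrate the sockets-as-file-descriptors point, Bash can even expose a TCP connection through ordinary redirection (this relies on Bash's built-in /dev/tcp emulation, so it is shell-specific):
# Open a TCP connection as file descriptor 3 and talk to it with plain
# read/write redirection; no dedicated networking command is involved.
exec 3<>/dev/tcp/example.com/80
printf 'HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n' >&3
cat <&3
exec 3>&-    # close the descriptor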

I would go through the Linux Administration Handbook and look at the chapters in the book and focus on those concepts that are important to a user as opposed to an administrator.

In addition to all of the other great suggestions, I would recommend discussing regular expressions in detail with examples in sed, awk, perl, vi, etc. REs are used in so many places, they really deserve their own place in the discussion. Add in a discussion of the common text processing utilities - cut, paste, grep, etc.
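For example, a couple of pipelines over /etc/passwd tie several of these together (the UID cutoff in the second line is a Linux convention and varies by system):
# Extract the login-shell field, then count how often each shell is used.
cut -d: -f7 /etc/passwd | sort | uniq -c | sort -rn
# The same flavour with grep and awk: skip comments, list regular users only.
grep -v '^#' /etc/passwd | awk -F: '$3 >= 1000 {print $1}'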

Related

Encrypting R script under MS-Windows

I have a bunch of R scripts which I am running on a Windows machine and want to ensure that the code remains unread by those not intended to see it. On a Linux box, I could wrap the R code in a bash script #! and make an encrypted (and perhaps even a limited-life) executable shell script. What are my options to do something on similar lines under Windows?
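(For context, the Linux-side wrapper alluded to above looks roughly like this; a minimal, unhardened sketch assuming openssl and R are installed, with payload.enc produced from the real script beforehand.)
#!/bin/sh
# Decrypt the R code to a pipe at run time, so the plain text is never
# written to disk. payload.enc was created earlier with:
#   openssl enc -aes-256-cbc -salt -in script.R -out payload.enc
openssl enc -d -aes-256-cbc -in payload.enc | R --slave --no-save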
My answer is a bit late, but I believe this is a good question. Unfortunately, I don't believe that there is a solution, or at least an easy one, at the present time.
The difficulty is common because, for most interpreted languages, including R, it is often possible to turn on logging and inspection of all commands being run. This can negate many tricks to obfuscate the code.
For those who prefer to think of code being open == good, one should know that a common reason to obfuscate the code is if one is consulting with a client that hires multiple vendors. It is not uncommon for a client to take scripts from vendor A and ask vendor B why it doesn't work with their system. (This may be done by a low-level IT flunkie, rather than someone responsible for the NDA contracts.) If A & B are competitors, A's code has just been handed to B. When scripts == serious programs, then serious code has been given away.
The ways I've seen this addressed are:
Make a call to a compiled language, and use standard protections available there.
Host the executable on a different server, and use calls to the server to execute the calculations. (In R, there are multiple server-side options.)
Use compiled (preprocessed / bytecode) code within the language.
Option 2 is actually easier and better when the code may be widely distributed, not just for IP reasons. A major advantage is that it lets you upgrade the code without having to go through the pain of a site-wide release process. If new libraries are needed, no problem - update the server.
Option 3 is done in Matlab with .p files, and can be done with py2exe for Python on Windows. In R, the new bytecode compilation may be analogous, but I am not familiar enough with it to address any differences between .Rc files in the R context and .p files in the Matlab context. For more info on the compiler, see: http://www.inside-r.org/r-doc/compiler/compile
Hosting computations on the server is great for working with unsophisticated users, because it is easier to iterate quickly in response to bugs or feature requests. The IP protection is simply a benefit.
This is not a specifically R-oriented strategy. (And it's a bit unclear what your constraints or goals really are anyway.) If you want a cross-platform encryption method, you should look into the open-source program TrueCrypt. It supports creating encrypted files that can be mounted as volumes on any machine that supports the volume formatting method. I have tested this across the Mac-PC divide, since the Mac can read FAT files, but have no experience with how it might work across the Linux-PC chasm.
(Their TODO list for Windows includes: "Command line options for volume creation (already implemented in Linux and Mac OS X versions)". So I don't see any clear way to use this from within R without running the program from the OS.)
I don't think this is possible because the R interpreter has to be able to decrypt and read the code in order to execute it which means that whoever is using that interpreter will also be able to decrypt and read the code.
I am by no means an expert, so I reserve the right to be 100% wrong about that statement.
I believe the best solution is to ensure value comes from the expertise and services provided by your company and its employees---not from keeping secrets.
Failing that, you could try separating the code into a client/server model. That way the client just sends data and receives results---they never have access to the code that runs on the server.
However, the scientist in me just said "that solution sucks and I would never trust results provided under such conditions".

Which DVCS is most conducive to experimenting?

I was wondering which DVCS is most conducive to experimentation i.e. branching, etc. I want something where anyone can quickly launch smaller projects and refactor code quickly. I want to create an environment where experimenting is cheap and can be discarded/merged easily.
Git is known for very cheap branching, they made it so that branching was something trivial, so that, like you said, you could create branches for any little thing. I don't have experience with the other DVCSes, but I imagine they're pretty similar given their similar nature. I just know that cheap branching is one of Git's reasons for creation, or something like that. Sorry if I misunderstood your question.
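A typical throwaway-branch cycle is only a few commands, which is exactly what makes experimenting so cheap (branch names here are arbitrary):
git checkout -b experiment     # create and switch to a new branch
# ...hack, commit, repeat...
git checkout master
git merge experiment           # keep the experiment, or:
git branch -D experiment       # throw it away entirely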
Here's a section of a popular article/site giving details about git over other version control systems.
In response to your comment: on Windows, I imagine? I've been fine using msysgit; get msysGit-fullinstall-1.6.4-preview20090729. For a detailed walkthrough with screenshots that helped out some friends, I recommend the Git for Windows Developers series.
You could also try Mercurial; it's fast, it's distributed, and it's easier to use. If you like working with a GUI, try TortoiseHg.
Here is an analysis done by google before they integrated mercurial into google code.
Your requirements match Darcs or Git.
If you're a GUI user, why don't you take a look at Plastic SCM? http://codicesoftware.blogspot.com/2010/03/distributed-development-for-windows.html. It's one of the few commercial DVCSs out there and it's focused on ease of use but it has all the features you're looking for:
Excellent branching and merging support (full merge tracking, rename support and all that)
Distributed (and easy to use)
Subtractive merge support (you can do it from the GUI)
Besides:
Very good visualization
Excellent Windows GUI (check it)
Excellent VStudio integration

Fitnesse vs any other subsystem testing tool [closed]

We are currently using FitNesse for subsystem testing.
We are having a lot of issues using the tool; a few to mention:
Development time for writing fixtures is more than writing the actual code
Issues around check-in of the DLLs so that QA can test them
Issues in running FitNesse for a project which uses NHibernate
Limited help online
We are planning to use some other tool to do the testing
A few options which we know of are:
SOAP UI
Story teller
I am not sure whether we will have similar problems with these tools.
It would be great to know if someone has experience using these tools and could guide us.
In our project we have adopted TDD, so we have NUnit for unit testing.
It would be great if anyone is aware of tools/ideas which could extend NUnit for subsystem testing as well.
Component testing tools are all about calling functions. Your tests cause functions to be called in "fixtures" that then call into the SUT. Any tool based on this premise will encounter the problems you reference above.
However, most of those problems are manageable. For example, you should not be writing lots of fixtures. If you are, something is wrong. Secondly, your fixtures ought to be little more than wiring code to call the APIs in your application. If your fixtures are doing significant work, then something is wrong.
In most FitNesse environments the number of fixtures is rather small. For example, there are over two hundred acceptance tests for FitNesse itself, but the number of fixtures is on the order of a dozen, and they are all relatively simple.
Get help on the fitnesse#yahoogroups.com site. The folks there are usually very responsive to questions.
If you can communicate with your software using text, then I have had success on past projects rolling my own framework using expect.
The framework I cooked up stored tests as XML files, using a simple xUnit style markup. The xml files were then transformed into executable tests using a stylesheet. I ended up transforming the tests into Tcl/Expect, but you could transform them into anything. In fact, if you wanted, you could transform them into multiple languages, depending on your needs.
Several people have kindly reminded me (in the same way you remind your poor doddering grandfather about the drool on his chin) that we are in the 21st century when they inquire why I would choose Tcl over some more modern language. As it turns out, for the purposes of this kind of testing, I haven't yet found a better choice. The Tcl language still kicks butt in this area. Trust me, I didn't wake up one day and say to myself "self, what I need is a test framework implemented in a scripting language everyone will hate!"
Believe it or not, I really was looking for a tool, any tool, that had the following characteristics:
Cross platform. This was non-negotiable. We do a lot of cross platform development and we already use WAY too many tools that don't support cross platform development.
Simple syntax. Say what you want about Tcl, but the syntax is very regular. I knew that some native code would probably creep even into the XML files (and originally it was Tcl only, no XML) and I wanted the syntax to be comprehensible to a non-programmer. This simplicity is a core strength of Tcl. As it turns out, it also made transforming the XML easier too.
Free. My favorite price ;-)
Writing tests as simple xml files allowed non-programmers to write customer acceptance level tests - no programming required.
Easily extended.
I did not set out to home grow this to the extent I have. Initially, I looked at established test frameworks like DejaGnu and android. Mostly they had way too many features. They were so feature laden that I didn't think they would be easy for a project to start using without a lot of up front training. Looking at DejaGnu, got me interested in Tcl in general, and after a brief look at tcltest, I almost gave up. Both DejaGnu and tcltest assume you are an advanced Tcl scripter, which I didn't think anyone at my company ever would be. In addition, I wanted the test framework (if possible) to support an xUnit type of test framework and neither of these tools did.
Eventually I found TclTkUnit, a Tcl based testing framework that is designed along xUnit lines. It was only a short leap of logic to realize I could run TclTkUnit in Expect instead of tclsh and get everything I needed.
As it ended up getting used more, I added another stylesheet to render the XML files nicely in a web browser. The test framework generated its own documentation.
On another project we needed a very basic sim/stim environment to emulate a person throwing switches and pushing buttons on a piece of hardware we didn't have. It only took a few hours to hack the test framework to function as a simulator. Creating the framework took some work, but we felt that it did pay benefits in the long run. I really believe that these types of unforeseen consequences of creating your own tools are why people in the agile community & XP in particular have always been such strong advocates.
We have adopted a Fitnesse-based but practically-code-free approach using GenericFixture (google for Anubhava to find his wordpress site) for Fitnesse.
What this allows us to do is to create "executable test narratives" using a language that is friendly to the business-side (as opposed to the technical-side). This language, which is very easily defined, practically without coding, in Generic Fixture, is called a DSL (domain specific language). So we can write our test narratives using e.g. medical terms or even in a language other than English. Basically what we get is transforming our Use Cases into executable narratives.
We are starting to use it in a large project (15 ppl for 2 years) and it seems (so far) to have a good future.
It easily allows Test Driven Development or test-creation after development (traditional approach).
It is wiki-based (Fitnesse) and its versioning and refactoring functionality has proven sufficient so far.
I can give more info if anyone is interested.
best regards,
Aristotelis.
We use unit-testing frameworks like NUnit to drive our subsystem tests as well - the tests don't care how they are run. It doesn't have FitNesse's document-based approach, though.

Is PowerShell ready to replace my Cygwin shell on Windows? [closed]

I'm debating whether I should learn PowerShell, or just stick with Cygwin/Perl scripts/Unix shell scripts, etc.
The benefit of PowerShell would be that the scripts could be more easily used by teammates that don't have Cygwin; however, I don't know if I'd really be writing that many general purpose scripts, or if people would even use them.
Unix scripting is so powerful, does PowerShell come close enough to warrant switching over?
Here are some of the specific things (or equivalents) I would be looking for in PowerShell:
grep
sort
uniq
Perl (how close does PowerShell come to Perl's capabilities?)
AWK
sed
file (the command that gives file information)
etc.
Tools are just tools.
They help or they don't.
You need help or you don't.
If you know Unix and those tools do what you need them to do on Windows - then you are a happy guy and there is no need to learn PowerShell (unless you want to explore).
My original intent was to include a set of Unix tools in Windows and be done with it (a number of us on the team have deep Unix backgrounds and a healthy dose of respect for that community.)
What I found was that this didn't really help much. The reason for that is that AWK/grep/sed don't work against COM, WMI, ADSI, the Registry, the certificate store, etc., etc.
In other words, UNIX is an entire ecosystem self-tuned around text files. As such, text processing tools are effectively management tools. Windows is a completely different ecosystem self-tuned around APIs and Objects. That's why we invented PowerShell.
What I think you'll find is that there will be lots of occasions when text-processing won't get you what you want on Windows. At that point, you'll want to pick up PowerShell. NOTE - it is not an all or nothing deal. Within PowerShell, you can call out to your Unix tools (and use their text processing or PowerShell's text processing). Also you can call PowerShell from your Unix tools and get text.
Again - there is no religion here - our focus is on giving you the tools you need to succeed. That is why we are so passionate about feedback. Let us know where we are falling down on the job or where you don't have a tool you need and we'll put it on the list and get to it.
In all honesty, we are digging ourselves out of a 30-year hole, so it is going to take a while. That said, if you pick up the beta of Windows Server 2008/R2 and/or the betas of our server products, I think you'll be shocked at how quickly that hole is getting filled.
With regard to usage - we've had > 3.5 million downloads to date. That does not include the people using it in Windows Server 2008, because it is included as an optional component and does not need a download.
V2 will ship in all versions of Windows. It will be on-by-default for all editions except Server core where it is an optional component. Shortly after Windows 7/Windows Server 2008 R2 ships, we'll make V2 available on all platforms, Windows XP and above. In other words - your investment in learning will be applicable to a very large number of machines/environments.
One last comment. If/when you start to learn PowerShell, I think you'll be pretty happy. Much of the design is heavily influenced by our Unix backgrounds, so while we are quite different, you'll pick it up very quickly (after you get over cussing that it isn't Unix :-) ).
We know that people have a very limited budget for learning - that is why we are super hard-core about consistency. You are going to learn something, and then you'll use it over and over and over again.
Experiment! Enjoy! Engage!
grep
Select-String cmdlet and -match operator work with regexes. Also you can directly make use of .NET's regex support for more advanced functionality.
sort
Sort-Object is more powerful (than I remember *nix's sort), allowing multi-level sorting on arbitrary expressions. Here PowerShell's maintenance of the underlying type helps; e.g. a DateTime property will be sorted as a DateTime without having to ensure formatting into a sortable format.
uniq
Select-Object -Unique
Perl (how close does PowerShell come to Perl capabilities?)
In terms of Perl's breadth of domain specific support libraries: nowhere close (yet).
For general programming, PowerShell is certainly more cohesive and consistent, and easier to extend. The one gap for text munging is something equivalent to Perl's .. operator.
AWK
It has been long enough since I used AWK (must be >18 years, since later I just used Perl), so I can't really comment.
sed
[See above]
file (the command that gives file information)
PowerShell's strength here isn't so much what it can do with filesystem objects (and it gets full information here; dir returns FileInfo or DirectoryInfo objects as appropriate) as the whole provider model.
You can treat the registry, certificate store, SQL Server, Internet Explorer's RSS cache, etc. as an object space navigable by the same cmdlets as the filesystem.
PowerShell is definitely the way forward on Windows. Microsoft has made it part of their requirements for future non-home products. Hence rich support in Exchange, support in SQL Server. This is only going to expand.
A recent example of this is the TFS PowerToys. Many TFS client operations can be done without having to start up tf.exe each time (which requires a new TFS server connection, etc.), and it is notably easier to further process the data. It also allows wide access to the whole TFS client API, in greater detail than is exposed in either Team Explorer or tf.exe.
As someone whose career focused on Windows enterprise development from 1997 - 2010, the obvious answer would be PowerShell for all the good reasons given previously (e.g., it is part of Microsoft's enterprise strategy; it integrates well with Windows/COM/.NET; and using objects instead of files provides for a "richer" coding model). For that reason I'd been using and promoting PowerShell for the last two years or so, with the express belief I was following the "Word of Bill."
However, as a pragmatist I'm no longer sure PowerShell is such a great answer. While it's an excellent Windows tool and provides a much needed step towards filling the historic hole that is the Window command line, as we all watch Microsoft's grip on consumer computing slip it seems increasingly likely that Microsoft has a massive battle ahead to keep its OS as important to the enterprise of the future.
Indeed, given I find my work is increasingly in heterogeneous environments, I'm finding it much more useful to use Bash scripts at the moment, as they not only work on Linux, Solaris and Mac OS X, but they also work—with the help of Cygwin—on Windows.
So if you buy into the belief that the future of the OS is commoditized rather than a monopolized, then it seems to make sense to opt for an agile development tool strategy that keeps away from proprietary tools where feasible. If however you see your future being dominated by all-that-is-Redmond then go for PowerShell.
I have used a bit of PowerShell for script automation. While it is very nice that the environment seems to have been thought out much more than Unix shells, in practice the use of objects instead of text streams is much more clunky, and a lot of the Unix facilities that have been developed in the last 30 years are still missing.
Cygwin is still my scripting environment of choice for Windows hosts. It certainly beats the alternatives in terms of getting things done.
There are lots of great answers here, and here is my take. PowerShell is ready if you are... Examples:
grep = "Select-String -Pattern"
sort = "Sort-Object"
uniq = "Get-Unique"
file = "Get-Item"
cat = "Get-Content"
Perl/AWK/sed are not single commands but utilities, and hence hard to compare directly, but you can do almost everything in PowerShell.
I have only recently started dabbling in PowerShell with any degree of seriousness. Although for the past seven years I've worked in an almost exclusively Windows-based environment, I come from a Unix background and find myself constantly trying to "Unix-fy" my interaction experience on Windows. It's frustrating to say the least.
It's only fair to compare PowerShell to something like Bash, tcsh, or zsh since utilities like grep, sed, awk, find, etc. are not, strictly speaking, part of the shell; they will always, however, be part of any Unix environment. That said, a PowerShell command like Select-String has a very similar function to grep and is bundled as a core module in PowerShell ... so the lines can be a little blurred.
I think the key thing is culture, and the fact that the respective tool-sets will embody their respective cultures:
Unix is a file-based, (in general, non Unicode) text-based culture. Configuration files are almost exclusively text files. Windows, on the other hand has always been far more structured in respect of configuration formats--configurations are generally kept in proprietary databases (e.g., the Windows registry) which require specialised tools for their management.
The Unix administrative (and, for many years, development) interface has traditionally been the command line and the virtual terminal. Windows started off as a GUI and administrative functions have only recently started moving away from being exclusively GUI-based. We can expect the Unix experience on the command line to be a richer, more mature one given the significant lead it has on PowerShell, and my experience matches this. On this, in my experience:
The Unix administrative experience is geared towards making things easy to do in a minimal amount of key strokes; this is probably as a result of the historical situation of having to administer a server over a slow 9600 baud dial-up connection. Now PowerShell does have aliases which go a long way to getting around the rather verbose Verb-Noun standard, but getting to know those aliases is a bit of a pain (anyone know of something better than: alias | where {$_.ResolvedCommandName -eq "<command>"}?).
An example of the rich way in which history can be manipulated:
iptables commands are often long-winded and repeating them with slight differences would be a pain if it weren't for just one of many neat features of history manipulation built into Bash, so inserting an iptables rule like the following:
iptables -I camera-1-internet -s 192.168.0.50 -m state --state NEW -j ACCEPT
a second time for another camera ("camera-2"), is just a case of issuing:
!!:s/-1-/-2-/:s/50/51
which means "perform the previous command, but substitute -1- with -2- and 50 with 51.
The Unix experience is optimised for touch-typists; one can pretty much do everything without leaving the "home" position. For example, in Bash, using the Emacs key bindings (yes, Bash also supports vi bindings), cycling through the history is done using Ctrl-P and Ctrl-N whilst moving to the start and end of a line is done using Ctrl-A and Ctrl-E respectively ... and it definitely doesn't end there. Try even the simplest of navigation in the PowerShell console without moving from the home position and you're in trouble.
Simple things like versatile paging (a la less) on Unix don't seem to be available out-of-the-box in PowerShell which is a little frustrating, and a rich editor experience doesn't exist either. Of course, one can always download third-party tools that will fill those gaps, but it sure would be nice if these things were just "there" like they are on pretty much any flavour of Unix.
The Windows culture, at least in terms of system APIs, is largely driven by the supporting frameworks, viz., COM and .NET, both of which are highly structured and object-based. On the other hand, access to Unix APIs has traditionally been through a file interface (/dev and /proc) or (non-object-oriented) C-style library calls. It's no surprise then that the scripting experiences match their respective OS paradigms. PowerShell is by nature structured (everything is an object) and Bash-and-friends file-based. The structured API which is at the disposal of a PowerShell programmer is vast (essentially matching the vastness of the existing set of standard COM and .NET interfaces).
In short, although the scripting capabilities of PowerShell are arguably more powerful than Bash (especially when you consider the availability of the .NET BCL), the interactive experience is significantly weaker, particularly if you're coming at it from an entirely keyboard-driven, console-based perspective (as many Unix-heads are).
I am not a very experienced PowerShell user by any means, but the little bit of it that I was exposed to impressed me a great deal. You can chain the built-in cmdlets together to do just about anything that you could do at a Unix prompt, and there's some additional goodness for doing things like exporting to CSV, HTML tables, and for more in-depth system administration types of jobs.
And if you really needed something like sed, there's always UnixUtils or GnuWin32, which you could integrate with PowerShell fairly easily.
As a longtime Unix user, I did however have a bit of trouble getting used to the command naming scheme, and I certainly would have benefitted more from it if I knew more .NET.
So essentially, I say it's well worth learning it if the Windows-only-ness of it doesn't pose a problem.
If you like shell scripting you will love PowerShell!
Start at A guided tour of the Microsoft Command Shell (Ars Technica).
As my recent experiments led me into depths of PowerShell and .NET calls, I must say that PowerShell can replace Cygwin and Unix shell.
I'm not sure about Perl, but since both PowerShell and Perl are Turing complete as programming languages, I give this as a yes to replacing Perl too.
One thing that PowerShell has above Cygwin and ordinary Bash under *nix, is its ability to perform sandboxed DLL calls, manipulating the operating system via direct API calls, WMI methods and even COM objects. How about launching Internet Explorer via code, then doing whatever you want with its displayed document, effectively emulating a back-end for a Web server?
How about gathering data from SQL servers and other data providers, parse them and export as CSV, mail messages, text and actually any kind of existing and non-existing file formats? (With proper skills of creating a valid file out of data received, of course, but CSV are readily available).
And there is extra security available via signed cmdlets and scripts, group policies, and execution policies that help prevent malicious code from running on your system even if you run it as administrator.
As for which commands are implemented - the answer by Richard lists them and PowerShell's ability to emulate their functionality.
As for whether PowerShell is strong enough to warrant switching over - this is more a matter of personal preference, although as more and more Windows services provide PowerShell cmdlets to control them, not using PowerShell where these services are present is a hindrance. (Hyper-V server is the primary such service, and it also provides the ability to do more with PowerShell cmdlets than with the GUI!)
Probably this answer is five years late, but still, if someone performs administrative tasks or general scripting of various stuff on Windows, they should definitely try harnessing PowerShell for their purposes.
When you compare PowerShell to the combination Cygwin/Perl/Shell, be aware that PowerShell only represents the "Shell" part of that combination.
You can however invoke any command from PowerShell just as you do from cmd.exe or Cygwin. It does not re-implement the specified functions, and it is certainly not comparable to Perl.
It's "just" a shell, but it makes programming easier providing a comfortable interface to the .NET universe.
Also keep in mind that PowerShell requires Windows XP, Windows Server 2003 or higher, which may pose a problem depending on your IT infrastructure.
Update:
I had no idea what kind of philosophical debate my answer would spark.
I posted my answer in the context of the question: Compare PowerShell to Cygwin and Perl and Bash.
PowerShell is a shell, as it makes no syntactic difference between built-in commands, commandlets, user functions, and external commands (.exe, .bat, .cmd). Only invoking .NET methods differ by adding a namespace or an object in the call.
Its programmability derives from the .NET framework, not from anything specific to the PowerShell "language".
I'd say I believe PowerShell is a "scripting language" as soon as Bugzilla or MediaWiki are implemented as PowerShell scripts running on a web server ;)
Until then, enjoy the comparisons.
TL;DR -- I don't hate Windows or PowerShell. I just can't do anything in Windows or on PowerShell.
I personally still find PowerShell underwhelming at best.
Tab completion of directory paths does not compound, requiring the user to enter a path separator after every name completion.
I still feel like Windows doesn't even have the concept of a path or of what a path is, with no accessible user home indicator ~/ short of some #environment://somejibberish/%user_home%
NTFS is still a mess and seemingly always will be. Good luck navigating.
The cmd-esque interface: the dinosaur cmd.exe is still visible in PowerShell, with Edit → Mark still being the only way to copy information (and only in the form of rectangular blocks of visible terminal space), and Edit → Paste still being the only way to paste strings into the terminal.
Painting it blue doesn't make it any more attractive. I don't mind Microsoft developers having a taste in color though.
Windows always opens at top left corner of screen. For somebody who uses vertical task bars this is incredibly annoying, especially considering that the Windows task bar will cover the only corner of the window that gives access to copy/paste functionality.
I can't speak much on the grounds of the tools Windows includes. Given that there is a whole set of open-source, freely licensed CLI tools, the fact that PowerShell ships with, to my knowledge, none of them is an utter disappointment.
PowerShell's wget takes seemingly incomparable arguments to GNU wget. Thanks, glimmer of hope portably-useless.
PowerShell is not POSIX- or Bash-compatible; in particular, the && operator is not handled, making even the simplest conditional command chaining not a thing.
I don't know man; I gave it a shot, I really did; I still try to give it a shot in the hopes that the next time I open it it will be any less useless. I cannot do anything in PowerShell, and I can barely do things with a real project to bring GNU tools to Windows.
MySysGit gives me the dinosaur cmd.exe prompt with a couple of GNU tools, and it is still very underwhelming, but at last path completion works. And the Git command will run in Git Bash.
Mintty for MySysGit gives the Cygwin interface over mysysgit's environment, making copy and paste a thing (select to copy (mouse), Shift+Ins to paste, how modern...). However, things like git push are broken in Mintty.
I don't mean to rant, but I still see huge problems with command-line usability on Windows even given tools like Cygwin.
P.S.: Just because something can be done in PowerShell, doesn't make it usable. Usability is deeper than ability and is what I tend to focus on when trying to use a product as a consumer.
The cmdlets in PowerShell are very nice and work reliably. Their object-orientedness appeals to me a lot since I'm a Java/C# developer, but it's not at all a complete set. Since it's object oriented, it's missed out on a lot of the text stream maturity of the POSIX tool set (awk and sed to name a few).
The best answer I've found to the dilemma of loving OO techniques and loving the maturity in the POSIX tools is to use both! One great aspect of PowerShell is that it does an excellent job piping objects to standard streams. PowerShell by default uses an object pipeline to transport its objects around. These aren't the standard streams (standard out, standard error, and standard in). When PowerShell needs to pass output to a standard process that doesn't have an object pipeline, it first converts the objects to a text stream. Since it does this so well, PowerShell makes an excellent place to host POSIX tools!
The best POSIX tool set is GnuWin32. It does take more than 5 seconds to install, but it's worth the trouble, and as far as I can tell, it doesn't modify your system (registry, c:\windows\* folders, etc.) except copying files to the directories you specify. This is extra nice because if you put the tools in a shared directory, many people can access them concurrently.
GnuWin32 Installation Instructions
Download and execute the exe (it's from the SourceForge site) pointing it to a suitable directory (I'll be using C:\bin). It will create a GetGnuWin32 directory there in which you will run download.bat, then install.bat (without parameters), after which, there will be a C:\bin\GetGnuWin32\gnuwin32\bin directory that is the most useful folder that has ever existed on a Windows machine. Add that directory to your path, and you're ready to go.
I haven't seen that PowerShell has really taken off, at least not yet. So it might not be worth the effort of learning it unless others on your team already know it.
For your predicament you might be better off with a scripting language that others could get behind: Perl, like you mentioned, or others like Ruby or Python.
I think a lot of it depends on what you need to do. Personally I've been using Python for my own personal scripts, but I know when I start writing something that I'll never be able to pass it on - so I try not to do anything too revolutionary.
Why not use both? Call PowerShell scripts in Cygwin just like any other interpreted scripts like Perl, etc.
I do this enough that I wrote https://bitbucket.org/jbianchi/powershell for a Bash wrapper to call powershell.exe in Cygwin. It can be used as a shebang as the first line of a powershell.exe .ps1 script (since PowerShell also uses "#" as a comment). See https://bitbucket.org/jbianchi/powershell/wiki/Home for examples
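The core of such a wrapper is tiny; here is a rough sketch of the idea (not the actual script from the repository above), assuming Cygwin's cygpath and powershell.exe are on the PATH:
#!/bin/bash
# Translate the Cygwin path of the .ps1 script into a Windows path and
# hand it to powershell.exe, forwarding any remaining arguments.
script="$1"; shift
exec powershell.exe -ExecutionPolicy Bypass -File "$(cygpath -w "$script")" "$@"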
In a couple of lines, Cygwin and PowerShell are different tools however if you have Cygwin installed you can run the Cygwin executables within a PowerShell session. I've gotten so used to PowerShell that now I no longer use grep, sort, awk, etc. There are pretty much built-in alternatives in PowerShell, and if not you can find a cmdlet out there.
The main tool I find myself using is ssh.exe, but within a PowerShell session.
It works great.
I found PowerShell programming to be not worth the effort.
I have several years of experience with shell scripting under Unix, but I found it enormously difficult to do much of anything with PowerShell.
It seems like many functions require you to interrogate the Windows Management Interface and issue SQL-like commands to get the information you need.
For example, I wanted to write a script to remove all files with a specific suffix from a directory tree. Under Unix, this would be a simple ...
find . -name \*.xyz -exec rm {} \;
After a couple of hours dicking around with Scripting.FileSystemObject and WScript.Shell and issuing "SELECT * FROM Win32_ShortcutFile WHERE Drive = '" & drive & "' AND Path = '" & searchFolder & "'", I finally gave up, settled for Windows Explorer's Search command, and just did it manually. There's probably some way to do what I wanted, but I didn't see anything obvious and all the examples on the MSDN site were so trivial as to be worthless.
EDIT Heh, of course as soon as I wrote this I poked around some more and found what I had been missing: the -recurse option to the remove-item command is faulty (revealed if you use get-help remove-item -detailed).
I had been trying "remove-item -filter '* .xyz' -recurse" and it wasn't working, so I gave up on it.
Turns out you need to use get-childitem -filter '*.xyz' -recurse | remove-item
You can also try running Bash scripts on Windows using BashWin at
https://github.com/skanga/BashWin.
PowerShell is very powerful, more powerful than the standard built-ins of the Unix shells (but only because it includes much of the functionality usually shelled out to subprograms). Also, consider that you can write applets in any .NET language, including IronPython, IronRuby, PerlNet, etc.. or you can simply call your Cygwin commands from PowerShell, ignoring all the extra functionality and it will work similarly to Bash, KornShell, or whatever...

What are some online solutions for easily accessing my source code from anywhere? [closed]

I'm a college student and at any given time I have 4-5 programs I'm writing in various languages for various classes/projects.
At any given hour of the day I might be in the library, at home, in any of our different computer lab classrooms, etc.
Right now my current modus operandi is that at the end of each class period or coding session, I Gmail myself the current state of whatever I'm working on with an appropriate subject line (i.e., "MIPS Assembly Lab 2, Revision #3").
However, this is becoming cumbersome and I'm looking for other solutions.
Restrictions:
No Thumbdrive. I'm about as absent minded as possible while still somehow managing to function. I'll lose it.
Portable or Web Apps only. I can't install non-portable executables. So, if a tool requires an installation wizard or administration privileges, I can't use it. I can use portable executables however, stored in our network drive space given to each student. So, that might open some possibilities.
I'm looking for some kind of online storage that I can easily download the latest files for my project or update those in the online storage, with as little friction as possible.
I've considered using some free version control repository and trying to find a portable executable or web-tool I can use to integrate with it, but I wonder if it might be overkill. I'm not really looking for keeping a revision history.
I've seen videos of things like dropbox and it seems like it is a step in the right direction.
Any suggestions?
A good solution would be to get an account at some hosting provider that offers shells (eg. Dreamhost) and do your work remotely. That way you always have a consistent environment that you can just ssh into from anywhere.
It's far easier to find a run-anywhere SSH client than a run-anywhere filesharing or revision control system.
www.github.com?
Git binaries should be usable without any installation process (I do not get the 'portable' part there, as you do not mention anything about your work environment).
Or, alternatively, a thumbdrive git repository, although you said that you do not want to use a thumbdrive.
The simplest solution (if you can share the code with the world) is to create a project on Google Code. This gives you a Subversion repository, plus a wiki to sort your ideas and an issue tracker for your TODO list, too.
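The day-to-day round trip is short; the project URL below is just a placeholder:
svn checkout https://myproject.googlecode.com/svn/trunk myproject
cd myproject
# ...edit files...
svn add newlab.c
svn commit -m "Lab 2, revision 3"
# later, from any other machine:
svn update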
Today, I prefer Subversion in your situation for two reasons:
There is a command line standalone client (just a couple of files which need no install) for Windows. Git would need either Cygwin or MinGW and a Unix environment of some kind. Too much hassle.
It's a bit more simple to use than Git. Git asks a paradigm shift from your brain and unless you get that right, Git will feel "weird".
For professional work on large projects, I prefer Git :)
CVS, Subversion and Git all allow you to create the repository on a network share. All of them discourage this because, in the case of a network outage, the repository may become corrupted.
So if you have frequent network outages, this might not be an option, but frankly, most networks are pretty stable today. And in the 15 years I've been using a VCS, I have never had one corrupt a repo on a share. Most network file systems will try their very best to commit pending writes, so unless the server completely dies, the data will be saved when the hiccup is over.
But if you're still worried, use Git, because it allows you to restore the main repository with minimal data loss from your local copy (see this question for details).
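If you do go that route with Git, the setup is only a few commands (paths below are examples); the bare repository lives on the share and every clone is a full backup you could recover from:
git init --bare /netshare/student42/repo.git      # the copy on the share
git clone /netshare/student42/repo.git work       # local working copy
cd work
# ...edit, git add, git commit...
git push origin master                            # publish back to the share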
We use CVSDude, who do CVS and SVN; it's a pay service ($6/month for 250 MB) and works really well, although maybe you're after something free?
You can get a free account on drivehq to store up to 1GB of data. Nothing fancy, but if you're looking for some place to put software it might do the trick.
You might like to check out Bespin:
Bespin is a Mozilla Labs experiment that proposes an open, extensible web-based framework for code editing.
I use Dreamhost's integrated SVN. They have an interface for setting up repositories and user accounts. I work on a mac which comes with SVN installed so the whole thing for me was completely painless. Couple of clicks, point SVN at my server and I was good to go.
