Console Application Structure - standards

I've written several .NET Console Applications over the past 6 months, and we have many more throughout different projects in our organization. I generally stick to the same standard format/structure for my Console Applications. Unfortunately, many of our console applications do not.
I have been looking into ways of standardizing the structure of these Console Applications. I would also like to provide a framework for the basic structure of a Console Application and provide easy access to standard ways of handling things such as argument passing, logging, etc.
Can anyone suggest Best Practices for addressing these concerns? I have been reading this MSDN article on Console Applications in .NET, which suggests a design pattern for console apps. The example uses the Template Method pattern to handle some of the concerns I listed earlier.
Two negatives of using this approach are listed in the article.
Ending up with twice as many classes
Having many simple, similar classes
Can anyone suggest better, or more standard, ways of handling this? Can anyone list additional negatives of this approach?
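For context, the article's approach looks roughly like this. This is a minimal sketch of the Template Method shape it describes, not the article's actual code; the class and method names are my own.

using System;

public abstract class ConsoleJob
{
    // The template method: fixes the skeleton of every console run
    // and defers the individual steps to subclasses.
    public int Run(string[] args)
    {
        try
        {
            ParseArguments(args);
            Execute();
            return 0;
        }
        catch (Exception ex)
        {
            Console.Error.WriteLine(ex.Message);
            return 1;
        }
    }

    protected abstract void ParseArguments(string[] args);
    protected abstract void Execute();
}

public class ExportJob : ConsoleJob
{
    private string _outputPath;

    protected override void ParseArguments(string[] args)
    {
        _outputPath = args.Length > 0 ? args[0] : "export.csv";
    }

    protected override void Execute()
    {
        Console.WriteLine("Exporting to " + _outputPath);
    }
}

Every new tool becomes another ConsoleJob subclass, which is exactly where the "twice as many classes" drawback comes from.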

To answer part of my own question...
I did spend quite a bit of time looking for a standard way to handle the parsing of arguments passed to my console applications. I was specifically looking for something similar to getopt for Python. With that in mind, the solution I settled on is NDesk.Options. It covers all of our needs and seems to handle arguments in a standard fashion. I thought this might help someone who stumbles on this question in their search.
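For illustration, typical usage looks something like this. This is a sketch based on the NDesk.Options documentation; the option names and messages are mine.

using System;
using System.Collections.Generic;
using NDesk.Options;

class Program
{
    static int Main(string[] args)
    {
        string name = null;
        bool verbose = false;

        var options = new OptionSet {
            { "n|name=", "name of the target to process", v => name = v },
            { "v|verbose", "enable verbose output", v => verbose = v != null },
        };

        List<string> extra;
        try
        {
            extra = options.Parse(args); // everything that isn't an option
        }
        catch (OptionException e)
        {
            Console.Error.WriteLine(e.Message);
            options.WriteOptionDescriptions(Console.Error);
            return 1;
        }

        // From here, hand the parsed values off to the "back end" logic.
        Console.WriteLine("name={0}, verbose={1}, extra={2}", name, verbose, extra.Count);
        return 0;
    }
}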

I tend to keep my console applications as simple as possible, with as much logic as possible in another layer so that it can be more easily unit and acceptance tested. I also generally keep my console applications simple/single-tasked, so I don't often have many possible paths through things like command-line arguments. This allows me to just take the arguments, if any, parse them, and pass them along to the "back end" logic.


SOA: Use SDO (Service Data Object)? [duplicate]

I've been programming in Delphi with Midas/DataSnap for quite a long time and have been quite happy with it. Moving to .NET, I'm more than happy with the ADO.NET DataSet. For CRUD applications, I'm highly uncomfortable with any kind of ORM. A generic data structure with automatic diff/delta handling gets the job done better for me, an average database application developer.
I tried to study Java years ago and could not find a similar idea implemented. The closest I could find is SDO (Service Data Object). I thought it would be widely adopted when I saw it, but I was wrong. Even though the spec is rather old now, I can hardly find people discussing it or using it extensively. Judging from the information I can find on the internet, SDO adoption is very limited.
I'm wondering if it's dying. Any experience with SDO you want to share? Is manual DTO coding always better?
Ok. I see. The answer is "no"
;)
Same for me when I tried SDO for the first time. An old spec, passive feedback... definitely no.
I wouldn't recommend using SDO unless it's imposed on you by some other part of the project.
WebSphere Process Server uses SDO. It's not really a bad API once you learn it, but the spec and the documentation are vague. They don't spell out what happens if you ask for a field that doesn't exist, or whether type conversions happen while getting or setting fields, to name two gripes.
I don't think the API specifies how to define new types, so that part will be implementation-specific. Type definitions are based on XSD, so you'll be working with XSD and all of its associated standards.
As others have implied, the API isn't widely used. This means it'll be hard to find people experienced with it, or help using it.

Restrict violation of architecture - asp.net MVP

Suppose we have a defined hierarchy in an application, for example a 3-tier architecture. How do we restrict subsequent developers from violating the norms?
For example, in the case of MVP (not ASP.NET MVC), the presenter should always bind the model and the view, which makes it possible to write proper unit tests. However, we have had instances where people imported the model directly into the view and called its functions, violating the norms, and hence the test cases couldn't be written properly.
Is there a way we can restrict which classes are allowed to inherit from a given set of classes? I am looking at various possibilities, including adopting a different design pattern, but a new approach would have to be worth the code change involved.
I'm afraid this is not possible. We tried to achieve this with the help of attributes, and we didn't succeed. You may want to refer to my past post on SO.
The best you can do is keep checking your assemblies with NDepend. NDepend shows you a dependency diagram of the assemblies in your project, so you can immediately spot violations and take action.
It's been almost 3 years since I posted this question. I must say that I have kept exploring this despite the brilliant answers here. Some of the lessons I've learnt so far:
More code smells come out by looking at the consumers (unit tests are the best place to look, if you have them).
The number of parameters in a constructor is a direct indication of the number of dependencies. Too many dependencies => the class is doing too much (see the sketch after this list).
The same goes for the number of (public) methods in a class.
The setup of unit tests will almost always give this away.
Code deteriorates over time unless there is a focused effort to clear technical debt and refactor. This is true irrespective of the language.
Tools can help only to an extent. But a combination of tools and tests often gives enough hints about the various smells. It takes a bit of experience to catch them in a timely fashion, and particularly to understand each smell's significance and impact.
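As an illustration of the constructor point above (all of the interfaces here are hypothetical placeholders):

// Five constructor parameters, five dependencies: a strong hint that the class
// has more than one responsibility. The unit-test setup gives it away, because
// five fakes must be constructed before a single assertion can be written.
public interface IOrderRepository { }
public interface IPaymentGateway { }
public interface IInventoryService { }
public interface IEmailSender { }
public interface IAuditLog { }

public class OrderProcessor
{
    public OrderProcessor(IOrderRepository orders, IPaymentGateway payments,
        IInventoryService inventory, IEmailSender email, IAuditLog audit)
    {
        // ...
    }
}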
You want to solve a people problem with software? Prepare for a world of pain!
The way to solve the problem is to make sure that you have ways of working with people so that you don't end up with those kinds of problems: pair programming, reviews, induction of people when they first come onto the project, etc.
Having said that, you can write tools that analyse the software and look for common problems. But people are pretty creative and can find all sorts of bizarre ways of doing things.
Just as soon as everything gets locked down to your satisfaction, new requirements will arrive and you'll have to break through the side of it.
Enforcing such stringency at the programming level in .NET is almost impossible, considering that a programmer can access all private members through reflection anyway.
Do yourself a favour and schedule regular code reviews, provide education, and implement proper training. And, as you said, it will quickly become evident when you can't write unit tests against something.
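To illustrate the reflection point, here is a minimal sketch (the class and field are hypothetical):

using System;
using System.Reflection;

class Guarded
{
    private string _secret = "not really private";
}

class Program
{
    static void Main()
    {
        // Access modifiers are a compile-time convention, not a security boundary:
        var value = typeof(Guarded)
            .GetField("_secret", BindingFlags.NonPublic | BindingFlags.Instance)
            .GetValue(new Guarded());
        Console.WriteLine(value); // prints "not really private"
    }
}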
What about NetArchTest, which is inspired by ArchUnit?
Example:
// Classes in the presentation layer should not directly reference repositories
var result = Types.InCurrentDomain()
    .That()
    .ResideInNamespace("NetArchTest.SampleLibrary.Presentation")
    .ShouldNot()
    .HaveDependencyOn("NetArchTest.SampleLibrary.Data")
    .GetResult()
    .IsSuccessful;

// Classes that depend on System.Data should reside in the data namespace
result = Types.InCurrentDomain()
    .That().HaveDependencyOn("System.Data")
    .And().ResideInNamespace("ArchTest")
    .Should().ResideInNamespace("NetArchTest.SampleLibrary.Data")
    .GetResult()
    .IsSuccessful;
"This project allows you create tests that enforce conventions for class design, naming and dependency in .Net code bases. These can be used with any unit test framework and incorporated into a build pipeline. "

The Model-Repository-Service-Validator-View-ViewModel-Controller design pattern (?)

When I first heard about ASP.NET MVC, I was thinking that would mean applications with three parts: model, view, and controller.
Then I read NerdDinner and learned the ways of repositories and view-models. Next, I read this tutorial and soon became sold on the virtues of a service layer. Finally, I read the Fluent Validation documentation, and I'll be darned if I didn't end up writing a bunch of validators.
Tonight, I took a step back and thought about what had become of my project. It seems to have become the victim of the design pattern equivalent of "feature creep". Somehow I'd gone from Model-View-Controller to Model-Repository-Service-Validator-View-ViewModel-Controller. You want loosely coupled and DRY? We got your loosely coupled and DRY right here! But I'm wondering if this could be a case of too much of a good thing.
Am I right to be concerned? Or is this actually not as crazy as it sounds? On one hand, it seems crazy to have so many layers. On the other hand, every layer has a clearly defined purpose that makes sense to me. Have your MVC applications turned into MRSVVVMC apps too? If not, what do they look like? Where's that right balance?
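For concreteness, one read request through the chain described above looks something like this. This is a condensed sketch, loosely in the spirit of NerdDinner; all names are illustrative.

using System.Web.Mvc;

// Model
public class Dinner { public int Id { get; set; } public string Title { get; set; } }

// Repository: data access behind an interface
public interface IDinnerRepository { Dinner GetById(int id); }

// Service: business logic, depending only on the repository abstraction
public class DinnerService
{
    private readonly IDinnerRepository _repository;
    public DinnerService(IDinnerRepository repository) { _repository = repository; }
    public Dinner GetDinner(int id) { return _repository.GetById(id); }
}

// ViewModel: exactly what the view needs, nothing more
public class DinnerViewModel { public string Title { get; set; } }

// Controller: coordinates the chain and maps model to view-model
public class DinnersController : Controller
{
    private readonly DinnerService _service;
    public DinnersController(DinnerService service) { _service = service; }

    public ActionResult Details(int id)
    {
        var dinner = _service.GetDinner(id);
        // (a validator would sit in front of any write operation)
        return View(new DinnerViewModel { Title = dinner.Title });
    }
}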
If you have one form with three attributes, this is overkill.
But if you have a 'real' application, and the responsibilities of each layer are well defined, I'd consider it pretty reasonable.
It sounds to me like you found a pattern and went looking for a problem. You should find a problem, and use the appropriate tool from your toolbox... not all the tools. Unless this is an academic exercise of course.

Abstraction or not?

The other day I stumbled onto a rather old Usenet post by Linus Torvalds. It is the infamous "You are full of bull****" post, where he defends his choice of plain C for Git over something more modern.
In particular, this post made me think about the enormous number of abstraction layers that accumulate one on top of the other where I work. Mine is a Windows .NET environment. I must say that I like C# and the .NET environment; it really makes most things easy.
Now, I come from a very different background made of Unix technologies like C and a plethora of scripting languages; to me, also, OOP is just one, and not always the best, programming paradigm. I often struggle (in a working kind of way, of course!) with my colleagues (one in particular), because they appear to belong to the "any problem can be solved with an additional level of abstraction" church, while I'm more of the "keep it simple" school. I think there is a very different mental approach to problems here that maybe comes from exposure to different cultures.
As a very simple example, for the first project I did here I needed some configuration for an application. I wrote a 10-line class to load and parse a text file, located in the program's root dir, containing colon-separated key/value pairs, one per row. It worked.
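Roughly like this, reconstructed from memory rather than taken from the actual code:

using System.Collections.Generic;
using System.IO;

public static class SimpleConfig
{
    // Reads "key: value" pairs, one per line, from a text file.
    public static Dictionary<string, string> Load(string path)
    {
        var settings = new Dictionary<string, string>();
        foreach (var line in File.ReadAllLines(path))
        {
            var parts = line.Split(new[] { ':' }, 2);
            if (parts.Length == 2)
                settings[parts[0].Trim()] = parts[1].Trim();
        }
        return settings;
    }
}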
In the end, to standardize the approach to the configuration problem, we now have a library, installed on every machine running each configured program, that calls a service that, at startup, loads an XML file containing references to other XML files, one per application, which in turn contain the configurations themselves.
Now, it is extensible and made up of fancy reusable abstractions, providers and all, but I still think that, if we one day really happen to reuse part of it, we could write the needed code from scratch, or copy/paste and modify the old code, in less time than it took to build.
What are your thoughts about it? Can you point out some interesting reference dealing with the problem?
Thanks
Abstraction makes it easier to construct software and understand how it is put together, but it complicates fully understanding certain issues around performance and security, because the abstraction layers introduce certain kinds of complexity.
Torvalds' position is not absurd, but he is an extremist.
Simple answer: programming languages provide data structures and ways to combine them. Use these directly at first, do not abstract. If you find you have representation invariants to maintain that are at a high risk of being broken due to a large number of usage sites possibly outside your control, then consider abstraction.
To implement this, first provide functions and convert the call sites to use them without hiding the representation. Hide the data representation only when you're satisfied your functional representation is sufficient. Make sure at this time to document the invariant being protected.
An "extreme programming" version of this: do not abstract until you have test cases that break your program. If you think the invariant can be breached, write the case that breaks it first.
Here's a similar question: https://stackoverflow.com/questions/1992279/abstraction-in-todays-languages-excited-or-sad.
I agree with @Steve Emmerson - 'Coders at Work' would give you some excellent perspective on this issue.

Automating paper forms and process flow in the office

I have been tasked with automating some of the paper forms in HR. This might turn into "automate all forms" eventually, so I want to approach this in a way which will be best for the long term and will be a good framework as this project grows.
The first things that come to mind were:
-InfoPath/SharePoint (we don't currently use SharePoint, and it wouldn't be an option for the next two years)
-Workflow Foundation (I've looked into this, and it does not seem very attractive or appropriate)
Options I'm considering at this point:
-Custom ASP.NET (VB.NET) & SQL Server, which is what my team mostly writes their apps with.
-Leverage InfoPath for creating the forms electronically. I'm wondering if there is a good approach to integrating this with a custom-built ASP.NET app.
-Considering creating the app as an MVC web app.
My questions are these:
-Are there other options I might want to consider?
-Are there any starter kits or VB.NET-based open source projects out there which would be a starting point or could be used as a good reference? Here I'm mostly concerned with the workflow processing.
-Any comments from those who have gone down this path?
This is going to sound really dumb, but the biggest lesson from my many years of helping companies automate paper-form-based processes is: understand the process first. You will most likely find that no single person understands the whole thing. You will need to role-play the many paths through the process to get your head around it. And once you present your findings, everyone will be shocked, because they had no idea it was that complex. Use that as an opportunity to streamline.
Automating a broken process only makes it screw up faster and tell a lot of people.
As far as tools go, my experience dates me, but try to go with something with these properties:
EASY to change. You WILL be changing it. So don't hard-code anything.
Possible revision control - changes to a process may or may not affect documents already en route.
Visual workflow editing. Everyone wants this but they'll all ask you to drive it. Still, nice tools.
Not sure if this helps or not - but 80% of success in automating processes is not technology.
This is slightly off topic, but related: defect tracking systems generally have workflow engines/state. (In fact, I think Joel or some other Fog Creek employee posted something about using FogBugz for managing the initial emails and resume process.)
I second the other advice about modeling the workflow before doing any coding or making any technology choices. You will also want this to be flexible.
As n8owl reminded us, automating a mess yields an automated mess, which is not an improvement. Many paper-form systems have evolved over decades and can be quite redundant and unruly. Some may view "messing with the forms" as a violation of their personal fiefdoms, so watch your back ;-)
model the workflow in terms of the forms used by whom in what roles for what purposes; this documents the current process as a baseline. Get estimates of how long each step takes, both in terms of man-hours and calendar time
understand the workflow in terms of the information gathered, generated, and transmitted
consolidate the information on the forms into a new set of forms for minimal workflow
be prepared to be told "This is the way we've always done it and we're not going to change", and to gently (a) validate their feelings, (b) explain how less work is more efficient, and (c) show concrete benefits [vs. the baseline from step 1]
soft-code when possible; use processing rules when possible; web services and HTML forms (esp. w/jQuery) will go a long way if you have an intranet (see the sketch after this list)
beware of canned packages (including sharepoint) unless you are absolutely certain they encompass your organization's current and future needs
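As a tiny illustration of rule-driven routing, sketched in C#: the states and the transition table here are hypothetical, and in practice the table would live in a database or a config file rather than in code.

using System;
using System.Collections.Generic;

public class FormWorkflow
{
    // The route is data, not code, so HR can change it without a redeploy.
    private readonly Dictionary<string, string[]> _transitions =
        new Dictionary<string, string[]>
        {
            { "Submitted",     new[] { "ManagerReview" } },
            { "ManagerReview", new[] { "HrReview", "Rejected" } },
            { "HrReview",      new[] { "Approved", "Rejected" } },
        };

    public bool CanMove(string fromState, string toState)
    {
        string[] next;
        return _transitions.TryGetValue(fromState, out next)
            && Array.IndexOf(next, toState) >= 0;
    }
}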
good luck!
--S
I detect here a general tone of caution with regard to a workflow-based approach, and I must agree. Be advised of the caveats of most workflow technologies, which sacrifice usability for flexibility.
