What's the difference between the AspNetCore.Mvc and AspNetCore.Mvc.Core NuGet packages? Is Mvc.Core just bare bones stuff while Mvc is an all-inclusive package? That's what I would guess from looking at the descriptions here and here but it's not totally clear.
see https://github.com/aspnet/Mvc/issues/4785
AspNetCore.Mvc has all the basic pieces already set up for you.
If you want to use AspNetCore.Mvc.Core, you will have to configure those pieces yourself.
It seems wise to use AspNetCore.Mvc unless you know you need AspNetCore.Mvc.Core.
If you use .AddMvc() then you get a lot of "opinionated" defaults, e.g. assumptions about what kind of app you are building, which formatters are registered and in what order, and which application conventions are in place by default.
If you use .AddMvcCore() (and you know what you're doing), then the behavior of your application is decided by your own opinions rather than by the built-in default opinions.
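To make the contrast concrete, here is a minimal sketch (assuming ASP.NET Core 2.x; the exact extension-method names differ in later versions) of the two approaches in Startup.ConfigureServices:

```csharp
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    // Pick one of the two approaches below; they are shown together only for contrast.
    public void ConfigureServices(IServiceCollection services)
    {
        // Full, opinionated setup (the Microsoft.AspNetCore.Mvc package):
        // formatters, data annotations, Razor views, CORS, etc. are registered for you.
        services.AddMvc();

        // Bare-bones setup (the Microsoft.AspNetCore.Mvc.Core package):
        // you opt in to each feature yourself; anything you don't add is simply absent.
        services.AddMvcCore()
            .AddJsonFormatters()     // opt in to JSON input/output
            .AddAuthorization()      // opt in to [Authorize]
            .AddDataAnnotations();   // opt in to model validation attributes
    }
}
```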
It seems that xUnit.net no longer supports extending TraitAttribute; they have sealed the class.
There are also some other issues with AutoFixture's plugin for AutoData(), which lets us inject randomly created data through an attribute. There are a few workarounds for this, but I am trying to evaluate this for a larger product overall. I liked the demos, since they could do small things like SQL, Excel, and custom attributes for categories.
It seems there was more functionality before the changes. I have looked at the site; some of the features appear to be returning, but there isn't much information.
Is there a new set of functionality coming out? Or possibly a change that will allow us to recreate the older functionality in a new way? It seems the SQL and Excel cases have workarounds, but I can't find any information about when the latest version will be compatible with the "AutoFixture with xUnit.net data theories" NuGet package. I really like what I have seen, though I don't like breaking changes when I look at enterprise solutions. I cringe a little when I imagine having this in place in an enterprise, with a lot of custom attributes, or with Moq and AutoFixture used to populate data, and now all my tests broken. So I guess the other question is: does xUnit.net tend to introduce a lot of breaking changes? There is the other option of moving xUnit.net back a version, though at some point I would need to know whether these things will be fixed or have been permanently removed, since I wouldn't want to spend time on functionality that is being removed.
Another issue is AutoFixtureMoqAutoDataAttribute, which doesn't load without that companion NuGet package, and the companion packages are not being updated.
I guess the end question may be: does anyone know of any plans to get these features working with the current version of xUnit.net, so that I can start implementing now and expect to do mass replaces later? Or are these permanently breaking changes, so that we shouldn't implement anything that is currently missing?
Thank you in advance.
Short answer
If you want to use xUnit.net 1.x with AutoFixture, use AutoFixture.Xunit.
If you want to use xUnit.net 2.x with AutoFixture, use AutoFixture.Xunit2.
Explanation
xUnit.net 2.0 introduced breaking changes, compared to xUnit.net 1.x (e.g. 1.9.2). For AutoFixture, we wanted to make sure that AutoFixture supports both. There are people who want to upgrade to xUnit.net 2.x as soon as possible, but there are also people who, for various reasons, will need to stay with xUnit.net 1.x for a while longer.
For the people who wanted or needed to stay with xUnit.net 1.x for the time being, we wanted to make sure that they'd still get all the benefits of various bug fixes and new features in the AutoFixture core, so we're maintaining two parallel (but feature-complete) glue libraries for AutoFixture and xUnit.net.
As an example, we've just released AutoFixture 3.30.3, which addresses a defect in AutoFixture itself. This bug fix thus becomes available for both xUnit.net 1.x and 2.x users.
Thus, when you need to migrate from xUnit.net 1.x to xUnit.net 2.x, you should uninstall AutoFixture.Xunit and instead install AutoFixture.Xunit2. As far as I know, there should be feature parity between the two.
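For reference, this is roughly what a data theory looks like once AutoFixture.Xunit2 is installed. A minimal sketch, assuming the 3.x-era package (where the namespace was Ploeh.AutoFixture.Xunit2; in 4.x it is AutoFixture.Xunit2):

```csharp
using Ploeh.AutoFixture.Xunit2;
using Xunit;

public class ConcatTests
{
    [Theory, AutoData]
    public void ConcatenationStartsWithFirstArgument(string first, string second)
    {
        // first and second are populated with anonymous values by AutoFixture.
        var result = first + second;
        Assert.StartsWith(first, result);
    }
}
```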
Traits
AutoFixture.Xunit and AutoFixture.Xunit2 don't use the [Trait] attribute, so I don't know exactly what you have in mind regarding this.
AutoMoq
Again, when it comes to AutoFixture.AutoMoq, it doesn't depend on xUnit.net, so I don't understand the question here either. It sounds like a separate concern, so you may want to consider asking a separate question.
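That said, the usual way people combine the two is by deriving their own data attribute. This is only a conventional sketch, not something the library ships; the AutoMoqDataAttribute name is a common community convention, and it assumes AutoFixture 3.x, where AutoDataAttribute has a constructor taking an IFixture (AutoFixture 4.x switched to a Func&lt;IFixture&gt; factory overload instead):

```csharp
using Ploeh.AutoFixture;
using Ploeh.AutoFixture.AutoMoq;
using Ploeh.AutoFixture.Xunit2;

// Hypothetical, conventionally named attribute combining AutoFixture.AutoMoq with xUnit.net 2.x.
public class AutoMoqDataAttribute : AutoDataAttribute
{
    public AutoMoqDataAttribute()
        : base(new Fixture().Customize(new AutoMoqCustomization()))
    {
    }
}
```

Interface or abstract-class parameters in a [Theory, AutoMoqData] test method are then supplied as Moq mocks.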
I'd like to get some information about using MySQL alongside ASP.NET (particularly MVC 3). From what I've found and experienced, it doesn't seem as customizable in terms of the Membership and User classes that come with ASP.NET, especially when it comes to validation or registration.
For example, after configuring my web.config file to use MySQL, I found myself realizing that, although a fair amount of tables were auto-generated for me to use, I wasn't able to change the names of them. Because of this, it seemed as though if I were to change a column name, or add a column to the table, it wouldn't quite work with the system, since everything has been pre-built.
Yet, with ADO.NET/Entity Framework, it appears that I might actually have more freedom in how I go about creating my websites using MS SQL. Is this true? Is MySQL just not meant for ASP.NET, despite the fact that you can install and use it at your leisure? Or does it just require more work to get everything working, so that you kind of have to reinvent the wheel by creating your own database classes and validation tools?
I'm not trying to bash either MySQL or MS SQL; I'm simply looking for a good analysis of the topic, as Google hasn't helped me much in this area.
This is more an issue with the default providers, and one of the many reasons why the first thing I did when I learned about them was to try to make my own. (To be clear, creating your own from scratch does require a fair amount of work; there are a few good tutorials out there that can give you a quick start.)
[It'd make all our lives easier if the .NET Framework used interfaces for the providers rather than the base class...]
To be clear, the big catch with the auto-generated providers is that the stored procedures they use require the specified names; if you want to change the table names, then you'll also have to update all of the stored procedures. (This is true for any custom provider you may choose to build or use.)
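Once you roll your own provider you control the schema, so table and column names are yours to choose. A hedged sketch of the kind of query you would wrap inside your own ValidateUser override, assuming the MySql.Data (Connector/NET) package; the `users` table and its columns are hypothetical:

```csharp
using System;
using MySql.Data.MySqlClient;

public static class CustomAuth
{
    // Checks credentials against your own table instead of the auto-generated schema.
    public static bool ValidateUser(string connectionString, string userName, string passwordHash)
    {
        using (var conn = new MySqlConnection(connectionString))
        using (var cmd = new MySqlCommand(
            "SELECT COUNT(*) FROM users WHERE username = @user AND password_hash = @hash", conn))
        {
            cmd.Parameters.AddWithValue("@user", userName);
            cmd.Parameters.AddWithValue("@hash", passwordHash);
            conn.Open();
            return Convert.ToInt64(cmd.ExecuteScalar()) > 0;
        }
    }
}
```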
I need to develop an e-store application in C#/.NET. There are a number of open-source packages already available, like nopCommerce, dotnetcart and so on. I went through the source code of some of them and found them very tedious, or rather very deeply functional. My requirements are pretty straightforward: I need just one level of categorization and a simple, clean front end. Therefore, I am a bit sceptical about using such a big solution for a simple e-store.
What do you think? Should I use one of the existing solutions, or develop one tailored to my requirements?
Use Ecwid. It's a simple, free, easily embedded shopping cart for any site: http://www.ecwid.com
Consider Orchard CMS plus its e-commerce module.
I once saw a nice tool called .NET Reflector. It can show the entire object hierarchy of .NET binaries/apps (sorry if the term is wrong).
Is there something like this for Qt? As Qt has very good QMetaObject abilities, it should be possible to traverse object-trees, call methods(slots), change properties, etc.
I am currently refactoring a Qt project. The naming of variables is very domain-specific and I am not an expert in this domain, so it is difficult for me to map a widget variable to the widget on the screen. Such a tool would be a great help for me in understanding the code.
Thank you very much in advance!
For simple uses you might want to take a look at QObject::dumpObjectTree()
If you need something more advanced, there's kspy:
kspy: examines the internal state of a Qt/KDE app. KSpy is a tiny library which can be used to graphically display the QObjects in use by a Qt/KDE app. In addition to the object tree, you can also view the properties, signals and slots of any QObject. Basically it provides much the same info as QObject::dumpObjectTree() and QObject::dumpObjectInfo(), but in a much more convenient form. KSpy has minimal overhead for the application, because the kspy library is loaded dynamically using KLibLoader. See /usr/share/doc/kspy/README for usage instructions. This package is part of the KDE Software Development Kit.
It depends on KDE's KLibLoader, so if you are not running KDE you will have to modify it, but that should be rather easy. Sources are here.
There's the QSpy project. It inspects all the QWidgets of a running application. I'm not sure how well it works, because I couldn't use it on Mac OS X; maybe it works better on Windows. https://github.com/sashao/martlet
http://qt-apps.org/content/show.php/QSpy?content=102287
I am working on a design spec for a new application that will be heavily workflow driven.
Before I re-invent the wheel, is there a decent lightweight workflow engine that plugs into ASP.NET already around?
Basically, I'm looking for something that handles moving through a defined set of workflow pages while handling state management automatically.
If this isn't around already, I'll definitely try to abstract the engine from my app and put it on codeplex, as it would be really handy.
Any suggestions?
Note: this is .NET 2.0, so no Windows Workflow Foundation, though I think WF would be overkill for my needs anyway.
EDIT: It seems like there is a legitimate need for this, and there isn't a product out there... so I might build this.
Here is what I'm picturing (a rough sketch in code follows the list):
A custom Page class called WebFlowPage.
All WebFlowPages are registered in a workflow mapper.
Each WebFlowPage has some form of state object.
An HttpHandler picks the appropriate WebFlowPage based upon the workflow and populates it from the state object.
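A rough, hypothetical sketch of those pieces. WebFlowPage, WorkflowMapper and WebFlowHandler are the names proposed above, not an existing library; this is written against plain .NET 2.0 / System.Web:

```csharp
using System;
using System.Collections.Generic;
using System.Web;
using System.Web.SessionState;

public abstract class WebFlowPage
{
    // Each page knows how to hydrate itself from, and persist itself to, a state bag.
    public abstract void LoadState(IDictionary<string, object> state);
    public abstract void SaveState(IDictionary<string, object> state);
}

public class WorkflowMapper
{
    private readonly List<Type> steps = new List<Type>();

    public void Register(Type pageType) { steps.Add(pageType); }
    public Type GetStep(int index) { return steps[index]; }
    public int StepCount { get { return steps.Count; } }
}

public class WebFlowHandler : IHttpHandler, IRequiresSessionState
{
    private static readonly WorkflowMapper Mapper = new WorkflowMapper();

    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // Pick the appropriate WebFlowPage for the step the user is on
        // (stored in session here purely for illustration), then populate it.
        object stored = context.Session["flow.step"];
        int step = stored == null ? 0 : (int)stored;
        WebFlowPage page = (WebFlowPage)Activator.CreateInstance(Mapper.GetStep(step));
        // ... load state, render the page, save state, advance the step ...
    }
}
```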
Is the workflow dynamic, or static?
If the workflows are simple, you could roll your own workflow engine.
In certain situations, it can be fairly simple, and just a couple of data tables to handle the rules, processing and state.
A lot of workflow engines are built for large-scale processing (credit card applications, for example). For small scale, you should at least consider rolling your own, which would eliminate the overhead of, and dependency on, an engine.
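A hedged sketch of the "couple of data tables" idea: one table of allowed transitions (the rules) and one holding each item's current state. The table and column names are hypothetical:

```csharp
// Row in a hypothetical `workflow_transitions` table: the rules.
public class WorkflowTransition
{
    public string FromState;    // e.g. "Draft"
    public string Action;       // e.g. "Submit"
    public string ToState;      // e.g. "PendingReview"
}

// Row in a hypothetical `workflow_items` table: the state.
public class WorkflowItem
{
    public int ItemId;
    public string CurrentState; // advanced by matching a WorkflowTransition
}
```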
Not sure exactly what you wish to do here, but Ra-Ajax can easily keep state, at least if you want your solution Ajaxified...
For reference purposes you might want to check out the Ajax Calendar sample or even the (very simplistically implemented) Ajax Wizard sample. It surely beats the hell out of doing it with JavaScript...
And every time you "do something" you're in "server-land" which means you can store temporaries all the time as you wish...
The project is LGPL. (PS: yes, I do work with it.)
Building a custom workflow engine is not trivial, although it may seem simple at first. We've tried that. It depends a lot on the complexity of the logic you need it to cover.
Given the current state of the Windows Workflow Foundation and the lack of another framework that abstracts the workflow concepts, I would choose WF if you need complex logic, asynchronous handling or branches in your workflows.
Tracking your state through the workflow can be accomplished by carrying some kind of XML payload or by storing the state in a database.
If your workflow is actually a sequential set of forms that need to be filled in by the user, tracking the steps and guiding the user to the next step can be accomplished with some simple custom solution.
You could take a look at the InRule engine too.
Also, there is nxBRE.
These too are mostly used for business rules.
InRule is proprietary, whereas nxBRE supports RuleML (the de facto standard).
You might need to make your own implementation for the pages, and use the rule engine as the "structure".
At this moment, I know that SharePoint 2007 supports page workflows (using WF), but this would imply using .NET Framework 3.0 and deploying SharePoint.
My suggestion would be to use whatever you find more light and easier to use.
I think the term "workflow" is very open to interpretation. I have been working lately with a type of workflow that is very different from what you seem to be describing. Mine is a state machine based workflow where the state of a particular record determines what actions a user can take to move the record to the next step in the business process. So "workflow" in this instance means how the record flows from one state to another until it is finally completed.
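For concreteness, here is a small illustrative sketch (not from any particular library) of that state-machine style: a record's current state determines which actions a user may take next. The state and action names are made up:

```csharp
using System.Collections.Generic;

public static class RecordWorkflow
{
    // The transition rules: which actions are legal from each state.
    private static readonly Dictionary<string, string[]> Allowed = new Dictionary<string, string[]>
    {
        { "Submitted", new[] { "Approve", "Reject" } },
        { "Approved",  new[] { "Ship" } },
        { "Shipped",   new string[0] },
        { "Rejected",  new string[0] }
    };

    // Returns the actions a user can take on a record in the given state.
    public static string[] ActionsFor(string currentState)
    {
        string[] actions;
        return Allowed.TryGetValue(currentState, out actions) ? actions : new string[0];
    }
}
```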
Your usage of workflow seems to have more to do with moving a user from one page to another in a linear multi-step process, which is a completely different use case (correct me if I'm wrong). So before coming up with a general purpose "workflow" engine that anyone could use, I would recommend defining a little bit better exactly what types of situations this system would handle.
I've been using this for a few months: http://objectflow.codeplex.com. It's not ASP.NET-specific, but it may fit your needs.
While browsing the web for workflow & BPM resources, I found the following project: NetBPM. Unfortunately, the project seems to have stalled.
I don't think there is a workflow engine that will automatically handle state for you, but if you are moving through a set of pages as part of a process, such as checkout on an e-commerce site, perhaps the ASP.NET Wizard control could help you?
There are a few workflow options. "Aspose" and "Skelta" are the offerings I'm evaluating.
Fábio
You can use Workflow Engine; just read the documentation and run the demo.
All of the features you need for a dynamic workflow engine have been added there.