I'm doing some refactoring and am trying to reuse my generated entity models. My application has a few assemblies, one containing my outward-facing public types (the API) and one containing implementations of providers (such as the log).
I'd like to split the generation of the models and the entity container so that the models end up in the API assembly and the container ends up in the implementation assembly. Is this possible?
It is possible. This is how I did it.
Assembly A
  Database.EDMX
  Models.TT
  Models.cs

Assembly B
  Database.EDMX (added as a link to the real file in Assembly A)
  EntityContainer.TT
  EntityContainer.cs
That's how everything is laid out. These are the rough steps:
Right-clicked the EDMX in A (the public API assembly) and chose Add Code Generation Item.
That adds a TT to the project. I called it Models, as it will contain the models only.
Edited the TT and removed the code generation for entity containers.
In assembly B (the internal implementations), added Database.EDMX as a link.
Opened it in assembly B, right-clicked, and chose Add Code Generation Item.
That adds a TT to project B. I called it EntityContainer, as it will contain that only.
Edited that TT to do the following (a sketch of the path change follows these steps):
Removed the entity creation steps.
Changed the path to Database.EDMX to a relative path pointing at the original copy in A.
Added a using for my models.
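For reference, a sketch of the path change inside EntityContainer.tt; the variable name and the relative path below are assumptions (they vary by template version and project layout), not the exact template code:

// Inside the <# ... #> code block near the top of EntityContainer.tt,
// point the template at the original EDMX in assembly A instead of a local copy:
string inputFile = @"..\AssemblyA\Database.edmx";

// The generated container also needs to see the model types from assembly A,
// so the template should emit a using directive such as:
// using AssemblyA.Models;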
Hopefully this will all compile and work correctly (I'm still far from getting everything compiled and tested). Looks good so far.
Additional change:
In my entity container TT, I had to modify the definition of EscapeEndTypeName as follows:
string EscapeEndTypeName(AssociationType association, int index, CodeGenerationTools code)
{
    EntityType entity = association.AssociationEndMembers[index].GetEntityType();
    return code.CreateFullName(
        code.EscapeNamespace(association.NamespaceName),
        code.Escape(entity));
}
I'm using association.NamespaceName as it contains the correct namespace from the other assembly.
I don't know the answer, but I think that your question is essentially equivalent to "Is it possible to cause a T4 template in one project to emit code into a different project?" If you can do that, then you can do what you want. Note, though, that this is substantially easier in EF 4.
So I think you might get useful feedback if you asked that question directly.
I would like to use a shared resx file to hold all the translatable strings (both for translator convenience and, more importantly, to avoid having dozens of separate resx files, which clashes with the DRY principle). I got it working with IStringLocalizer for controllers and views, but I just can't figure out how to implement it for a model's data annotations.
It works by using separate files like Models.AccountViewModels.LoginViewModel.en.resx, but how would I go about using a shared resource file for data annotations instead of specific ones? Can anyone share an example implementation?
P.S. The environment is .NET Core 1.1, so both validation and display annotations should be available for localization in that version.
Step 1: Create a simple class named ValidationMessages.cs and leave it empty. I will assume that the class is located in the /Validation folder.
Step 2: Modify the provider for the data annotations localizer in your Startup.cs file to look like this:
mvcBuilder.AddDataAnnotationsLocalization(options =>
{
    options.DataAnnotationLocalizerProvider = (type, factory) =>
    {
        return factory.Create(typeof(ValidationMessages));
    };
});
Step 3: Create a /Validation folder in /Resources (I assume you keep all resource files in that folder) and then add a ValidationMessages.fr-FR.resx file there (for the French culture, for example).
Step 4: Add entries to the resource files with keys of your liking. I assume that you will have keys like RequiredError, MaxLengthError, etc.
Step 5: Decorate properties on your model class with [Required(ErrorMessage="RequiredError")].
Next time property validation fails, validation messages will be pulled from ValidationMessages.{culture}.resx files.
Keep in mind, though, that not only validation messages will be looked up there, but also property display names if you use DisplayAttribute.
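As a minimal sketch of steps 1 and 5 together (the view model and the resource keys are illustrative assumptions, not from the question):

using System.ComponentModel.DataAnnotations;

// /Validation/ValidationMessages.cs - empty marker class; the localizer factory
// only needs its type to locate Resources/Validation/ValidationMessages.{culture}.resx.
public class ValidationMessages
{
}

// Hypothetical view model: the attribute arguments are resource keys, not literal text.
public class LoginViewModel
{
    [Display(Name = "EmailLabel")]
    [Required(ErrorMessage = "RequiredError")]
    [EmailAddress(ErrorMessage = "InvalidEmailError")]
    public string Email { get; set; }

    [Display(Name = "PasswordLabel")]
    [Required(ErrorMessage = "RequiredError")]
    public string Password { get; set; }
}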
I'm using Visual Studio 2013 and ASP.Net MVC 5. I've created a bunch of views for my models and then I've changed them. I want to run scaffolding on some models and create some views automatically and then change the automatically-generated views. Is there another way other than re-naming some files or creating another solution and copying stuff?
Yes, you can re-scaffold by scaffolding the same model again, using the same model class and controller names as before. Your existing controller and views will be replaced.
Details:
Right-click on your project or controller folder,
Add > New Scaffolded Item...,
MVC 5 Controller with views, using Entity Framework,
Add,
choose your model class and data context class,
and ensure your controller name is the same as the one to replace.
I use version control (Git) to do this quickly and safely, via the Git Extensions user interface for Git (http://code.google.com/p/gitextensions/).
Commit your code before re-scaffolding. Then re-scaffold the views and go to staging (the Commit button in Git Extensions). It shows every change the re-scaffold made and highlights the added and deleted lines. From there you can stage only the selected new lines that changed in the controller. After staging the selected lines, reset the remaining unstaged changes.
There you have it: your already-modified code with the newly scaffolded parts. Do any edits and testing necessary and commit.
This is a new answer to an old question. It's somewhat similar to the existing answers, but I think different enough and easy enough to be of value.
1) Commit the existing project/solution to version control, just as good practice.
2) When re-scaffolding, use a different controller name. This will create a controller class and its five attendant views, but it won't overwrite anything that exists, preserving all your existing work.
3) Extract the appropriate methods from the re-scaffolded controller. Bindings for create/edit will likely change when the model changes, so capture those. Then delete the re-scaffolded controller.
4) That leaves the views in place to copy and paste the appropriate UI code for any new or redefined model properties. Once all the code needed has been copied, simply delete the re-scaffolded views.
It was a great question because we often have to change a model, and it's nice to have all the basic UI stuff automatically created for us.
I am using LINQ to SQL with C#.
Is there a method through which we can generate entity class files from the table schema?
By dragging tables onto the graphical designer, classes are generated, but they are not real class files (I mean actual files with the .cs extension).
I am aware that we can code the class files first and then create the schema manually or programmatically, but I wanted to know if the reverse is possible, maybe using some third-party tools. I feel it would be very convenient to use LINQ that way.
Thanks in advance.
I'm not as familiar with LINQ to SQL as I am with Entity Framework (v4), but EF certainly would fit your requirements. You can download the POCO templates for EF from Microsoft, right through VS2010 in the Extension Manager (Tools > Extension Manager, click on Online Gallery, and search for POCO). The link is not just the download for the template, but a walkthrough on how to get started.
I also have started a series of blog posts that include some nice T4 templates for an Entity Framework EDMX model that auto generate DTO classes for all of your entity classes, whether you're using the default code generation model, or Microsoft's POCO template. The auto generated DTOs are handy for use in UI or service layers, and save you from having to bring in dependencies on Entity Framework in consuming layers. It's also very easy to get DTOs from your entity objects.
var people = from p in context.People select p;
return people.ToDtos();
Might be worth a look (shameless self promotion).
If you need/want to stick with LINQ to SQL, do a Google search for "linq to sql POCO"; it seems some people have had a degree of success with this, but most of the search results seem to be from 2008 and earlier, so I'm not sure how current or relevant they are.
All the classes generated when you drag tables onto the designer are created as partial classes, so there is no reason you can't just create a file for each one and put the necessary modifications there.
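For example, a minimal sketch, assuming the designer generated a Customer entity with FirstName and LastName columns (the names are illustrative, not from the question):

// Customer.Partial.cs - sits alongside the designer-generated code, which already
// declares "public partial class Customer" in the same namespace.
public partial class Customer
{
    // Members added here survive regeneration of the designer file.
    public string FullName
    {
        get { return FirstName + " " + LastName; }
    }
}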
Absolutely you can, if you use the T4 template for L2S - http://l2st4.codeplex.com/
You still use the .DBML file, but you need to set the "build action" to "none" on the file to turn off the compilation of the default code that gets generated. Then you add the .tt file and the .ttinclude file from your codeplex download.
The T4 template has a line of code in it that you can modify to suit your purposes:
FilePerEntity = false, // Put each class into a separate file
Oddly, Entity Framework 4 uses this approach too, with the same dual methods of generating code from the model file, but with EF the T4 template is included with VS2010, whereas with LINQ to SQL you have to download the T4 template separately. The nice part about using T4 is that you can add other customizations as you go. Initially, however, the generated code is identical to what you get from the .DBML designer.
I was going through Unity 2.0 to check if it has an effective use in our new application. My application is a Windows Forms application and uses a traditional bar menu (at the top), currently.
My UIs (Windows Forms) more or less support the Dependency Injection pattern, since they all work with a class (a Presentation Model class) supplied to them via the constructor. The form then binds to the properties of the supplied P Model class and calls methods on it to perform its duties. Pretty simple and straightforward.
How P Model reacts to the UI actions and responds to them by co-ordinating with the Domain Class (Business Logic/Model) is irrelevant here and thus not mentioned.
The object-creation sequence to show one UI from the menu then goes like this:
Create Business Model instance
Create Presentation Model instance with Business Model instance passed to P Model constructor.
Create UI instance with Presentation Model instance passed to UI constructor.
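In code, that sequence looks roughly like this (the type names are illustrative, not from my actual modules):

// Manual composition of one module's object chain.
var businessModel = new CustomerBusinessModel();
var presentationModel = new CustomerPresentationModel(businessModel);
var view = new CustomersForm(presentationModel);
view.ShowDialog();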
My present solution:
To show a UI this way from my menu, I would have to reference all the assemblies (Business, PModel, UI) from my Menu class. Considering I have split the modules into a number of physical assemblies, adding references to about 60 different assemblies would be a difficult task. The approach is also not very scalable, since I will certainly need to release more modules, and with this approach I would have to change the source code every time I release a new one.
So, primarily to avoid referencing so many assemblies from my Menu class (assembly), I did the following:
Stored all the dependencies described above in a database table (SQL Server), e.g.
ModuleShortCode | BModelAssembly | BModelFullTypeName | PModelAssembly | PModelFullTypeName | UIAssembly | UIFullTypeName
Then used a static class named "Launcher" with a method "Launch", as below:
Launcher.Launch("Discount");
Launcher.Launch("Customers");
The Launcher internally uses data from the dependency table and calls Activator.CreateInstance() to create each of the objects, passing each instance as a constructor parameter to the next object being created, until the UI is built. The UI is then shown as a modal dialog. The code inside Launcher is something like this:
Form frm = ResolveForm("Discount");
frm.ShowDialog();
The ResolveForm does the trick of building the chain of objects.
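A rough sketch of what ResolveForm does internally (the helper and property names are illustrative, not the actual code):

private static Form ResolveForm(string moduleShortCode)
{
    // Hypothetical helper that reads the row for this module from the dependency table.
    ModuleInfo info = LoadModuleInfo(moduleShortCode);

    // Build the chain BModel -> PModel -> Form, each instance passed to the next constructor.
    object bModel = Activator.CreateInstance(
        Type.GetType(info.BModelFullTypeName + ", " + info.BModelAssembly, true));

    object pModel = Activator.CreateInstance(
        Type.GetType(info.PModelFullTypeName + ", " + info.PModelAssembly, true), bModel);

    return (Form)Activator.CreateInstance(
        Type.GetType(info.UIFullTypeName + ", " + info.UIAssembly, true), pModel);
}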
Can Unity help me here?
When I did that, I did not have enough information on Unity; now that I have studied it, I think I have been doing more or less the same thing. So I tried to replace my code with Unity.
However, as soon as I started I hit a block. If I try to resolve UI forms in my Menu as
Form customers = myUnityContainer.Resolve<Customers>();
or
Form customers = myUnityContainer.Resolve(typeof(Customers));
Then either way I need to reference my UI assembly from my Menu assembly, since the target type "Customers" needs to be known for Unity to resolve it. So I am back in the same place, having to reference all the UI assemblies from the Menu assembly. I understand that with Unity I would have to reference fewer assemblies (only the UI assemblies), but those references are still needed, which defeats my objectives below:
Create the chain of objects dynamically, without any assembly references from the Menu assembly. This is to avoid changing the Menu source code every time I release a new module. My menu is also built dynamically from a table.
Be able to supply new modules just by shipping the new assemblies and inserting the new dependency row into the table via a database patch.
At this stage, I have a feeling that I have to do it the way I was doing it, i.e. with Activator.CreateInstance(), to fulfil all my objectives. I need to verify whether the community thinks the same way as me or has a better suggestion to solve the problem.
The post is really long, and I sincerely thank you if you have read this far. Waiting for your valuable suggestions.
Rajarshi
As I can see from this code
Form customers = myUnityContainer.Resolve<Customers>();
all your code needs to know about Customers is that it is a Form. So if you use XML configuration for Unity, you can do the following:
<type type="Form" mapTo="Customers" name="Customer" />
And then you'll be able to resolve it like this:
Form customers = myUnityContainer.Resolve<Form>("Customer");
and there is no need to reference your UI assembly. Of course, it should be present in the bin directory or the GAC. In that case, if you develop a new assembly, all you need to do is change the config and put the assembly in bin or the GAC.
If you want to build the Unity configuration from the database, then you'll have to add a reference to your UI, because you'll have to call the registration for "Customer" in code.
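For illustration, a sketch of what that in-code registration could look like; one way to avoid the compile-time reference even then is to load the form type by an assembly-qualified name read from the dependency table, so the UI assembly only needs to sit in bin or the GAC (all names below are assumptions, not from the post):

using System;
using System.Windows.Forms;
using Microsoft.Practices.Unity;

IUnityContainer container = new UnityContainer();

// Assembly-qualified name, e.g. read from the dependency table row.
string uiTypeName = "MyApp.UI.Customers, MyApp.UI";
Type formType = Type.GetType(uiTypeName, true);

// Map Form -> Customers under the name "Customer".
container.RegisterType(typeof(Form), formType, "Customer");

// The menu then resolves purely by the registered name:
Form customers = container.Resolve<Form>("Customer");
customers.ShowDialog();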
When I compile some .py code files with no class definitions into DLLs, the compiled DLL is created with a "DLRCachedCode" class inside. Why is that?
When you compile IronPython code it doesn't get compiled to normal .NET code where you'd have a class at the IL level for each class you have at the source level. Instead it gets compiled into the same form that we compile to internally using the DLR.
For user code this is just a bunch of executable methods. There's one method for each module, function definition, and class definition. When the module code runs it executes against a dictionary. Depending on what you do in the module, the .NET method may publish into the dictionary:
a PythonType for new-style classes
an OldClass for old-style classes
a PythonFunction object for function definitions
any values that you assign to (e.g. Foo = 42)
any side effects of doing exec without providing a dictionary (e.g. exec "x=42")
etc.
The final piece of the puzzle is where this dictionary is stored and how you get at it. The dictionary is stored in a PythonModule object, and we create it when the user imports the pre-compiled module and then execute the module against it. Therefore this code is only available via Python's import statement (or the ImportModule extension method on ScriptEngine, which is exposed via the IronPython.Hosting.Python class).
So all of the layout of the code is considered an internal implementation detail which we reserve the right to change at any point in time.
Finally, the name DLRCachedCode comes from the fact that the DLR (the outer layer) saves this code for us. Multiple languages can actually be saved into a single DLL if someone really wanted to.
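As a hedged illustration of that import path from the hosting API (module and member names are hypothetical, assuming a pre-compiled MyModule.dll produced by pyc.py):

using System.IO;
using System.Reflection;
using IronPython.Hosting;
using Microsoft.Scripting.Hosting;

class Program
{
    static void Main()
    {
        ScriptEngine engine = Python.CreateEngine();

        // Make the pre-compiled module's assembly visible to the engine.
        engine.Runtime.LoadAssembly(Assembly.LoadFile(Path.GetFullPath("MyModule.dll")));

        // Importing executes the module code against its dictionary and returns it as a scope.
        dynamic module = Python.ImportModule(engine, "MyModule");

        // Anything the module published (classes, functions, values) is now reachable.
        dynamic instance = module.MyClass();
        instance.DoWork();
    }
}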
This link answers the question of how to access an IronPython class from C#: http://www.ironpython.info/index.php/Using_Compiled_Python_Classes_from_.NET/CSharp_IP_2.6
Manual compilation with \IronPython 2.7\Tools\Scripts> ipy pyc.py /out:MyClass /target:dll MyClass.py did not work for me. It only worked, as described in the post, when I used SharpDevelop with IronPython.