How to use the project templates? - xamarin.forms

I am new to Xamarin. I have done a couple of smaller projects (2-3 pages, one table). I have a new project that is a great candidate for a shell app. It will have 20 pages and will consume data from a transactional database (cloud hosted), but it will also have an offline data store (SQLite). Right now, I just want to get the local version up and running. The Shell App template generates an IDataStore interface and a MockDataStore. That is a great place to start - but how do I have more than one table? I am a little confused about how I would use that. What I would love to see is a template-generated shell app that just adds another table (and the corresponding list/detail views along with view models). For example, the simple "todo" sample with a contact table added, so todo tasks can be assigned to contacts, would be perfect. Thanks in advance for your help.

I hope this helps others new to Xamarin. When starting a new project and choosing anything other than the blank template, the template generates a model (Item) and a Services folder containing an interface (IDataStore) and a MockDataStore. Being new to XAML in general, I spent a lot of time getting the UI to look the way I wanted, learning about Shell navigation and similar topics. Finally it was time to work on the data part of my project. Where I got stuck was trying to make sense of the boilerplate code. My understanding of DependencyService was that it was for platform-specific code (e.g. Android, iOS) and NOT for data-service dependencies. Further, the templated code is a typed interface (IDataStore). The solution was fundamental: all that interface does is ensure CRUD operations are available in whatever you use as a data store. For me, simply not tying IDataStore to Item solved everything. It allowed me to keep the database layer abstracted away. In my little project, I kept building out my MockDataStore, adding additional CRUD operations until I was ready for my real data operations. NOTE: if you generate the Web API project from the template, it will make more sense - you can flip between your MockDataStore and your actual data store.
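For anyone who wants to see what that looks like in practice, here is a minimal sketch of "one store per table". The IDataStore<T> shape roughly follows the interface the template generates; Contact, ContactDataStore and the registration lines for it are hypothetical names added purely for illustration.

using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Xamarin.Forms;

public interface IDataStore<T>
{
    Task<bool> AddItemAsync(T item);
    Task<bool> UpdateItemAsync(T item);
    Task<bool> DeleteItemAsync(string id);
    Task<T> GetItemAsync(string id);
    Task<IEnumerable<T>> GetItemsAsync(bool forceRefresh = false);
}

// A second model alongside the template's Item (hypothetical).
public class Contact
{
    public string Id { get; set; }
    public string Name { get; set; }
}

// A second store - mock here, SQLite later - one per table.
public class ContactDataStore : IDataStore<Contact>
{
    readonly List<Contact> contacts = new List<Contact>();

    public async Task<bool> AddItemAsync(Contact item)
    {
        contacts.Add(item);
        return await Task.FromResult(true);
    }

    public async Task<bool> UpdateItemAsync(Contact item)
    {
        var old = contacts.FirstOrDefault(c => c.Id == item.Id);
        if (old != null) { contacts.Remove(old); contacts.Add(item); }
        return await Task.FromResult(old != null);
    }

    public async Task<bool> DeleteItemAsync(string id)
    {
        var old = contacts.FirstOrDefault(c => c.Id == id);
        return await Task.FromResult(old != null && contacts.Remove(old));
    }

    public async Task<Contact> GetItemAsync(string id) =>
        await Task.FromResult(contacts.FirstOrDefault(c => c.Id == id));

    public async Task<IEnumerable<Contact>> GetItemsAsync(bool forceRefresh = false) =>
        await Task.FromResult(contacts);
}

// In App.xaml.cs, register one store per model next to the template's existing registration:
//     DependencyService.Register<MockDataStore>();
//     DependencyService.Register<ContactDataStore>();
// A view model then resolves whichever store it needs:
//     IDataStore<Contact> contactStore = DependencyService.Get<IDataStore<Contact>>();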

Related

Duplicate column issue in migration scripts in a .NET microservice module. I have to resolve a duplicate column in a migration that has already been executed.

In our .NET Core microservices, another team works on open source modules, and I extend their modules in my project. I added a column to an entity, and then the same column was added by the open source team. Now a duplicate column error is thrown.
I cannot alter the open source migration files, and my column is already in production.
How can I resolve this issue? Please suggest an approach.
I think we are dealing with multiple smells here.
Each microservice should have its own data realm; by the sound of it, you extended a table of an object that was generated by the open source service.
But that observation does not help you. The real question is whether you can merge the data that is in this column. If so, you might get away with a simple copy/update.
What you need to do is (see the sketch after this list):
Create a new column with a different name
Copy your existing data into the new column
Drop your column
Execute the migration
Copy your data back into the column
Test your application very carefully, checking that the logic you implemented around that data still works as intended
Drop your backup column
Depending on the amount of data, this will lead to some downtime, so plan ahead and have a rollback strategy ready in case something goes wrong.
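A rough sketch of steps 1-3 as an EF Core migration is below, assuming the project uses EF Core migrations (the question only says .NET Core); the table and column names are hypothetical. A second migration, ordered after the open source one, would copy the data back and drop the backup column.

using Microsoft.EntityFrameworkCore.Migrations;

public partial class BackUpExtendedColumn : Migration
{
    protected override void Up(MigrationBuilder migrationBuilder)
    {
        // 1. Create a backup column with a different name.
        migrationBuilder.AddColumn<string>(
            name: "MyColumn_Backup", table: "Orders", nullable: true);

        // 2. Copy the existing data into it.
        migrationBuilder.Sql("UPDATE Orders SET MyColumn_Backup = MyColumn");

        // 3. Drop your column so the open source migration can add it without conflict.
        migrationBuilder.DropColumn(name: "MyColumn", table: "Orders");
    }

    protected override void Down(MigrationBuilder migrationBuilder)
    {
        // Reverse the steps in case a rollback is needed.
        migrationBuilder.AddColumn<string>(name: "MyColumn", table: "Orders", nullable: true);
        migrationBuilder.Sql("UPDATE Orders SET MyColumn = MyColumn_Backup");
        migrationBuilder.DropColumn(name: "MyColumn_Backup", table: "Orders");
    }
}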
Personal opinion on preventing those smells
Every time I have needed an open source project in one of my own projects, I have written a wrapper around it. This has multiple benefits.
For one, if the API of the project changes you only have to update it in exactly one place, which improves maintainability.
Because it sits behind a wrapper it automatically gets its own database, and if I need to extend an entity that comes from one of those open source projects, I usually do it via foreign keys to a separate table, which then gets linked via a view (see the sketch below).
Yes, this costs some performance, but in the end it has been worth it every time.
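As a rough illustration of that foreign-key approach (the entity and property names are hypothetical, and the open source module's Person table stays untouched):

// The open source module owns the Person table; we never add columns to it.
// Our extra fields live in a separate table in our own data realm, keyed by PersonId.
public class PersonExtension
{
    public int Id { get; set; }
    public int PersonId { get; set; }        // FK back to the open source Person row
    public string CostCenter { get; set; }   // the column we would otherwise have added to Person
}

// A database view (or a simple join in queries) then presents both as one logical entity:
//   CREATE VIEW PersonWithExtension AS
//   SELECT p.*, e.CostCenter
//   FROM Person p LEFT JOIN PersonExtension e ON e.PersonId = p.Id;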

Is ADO.NET Entity Framework (with ASP.NET MVC v2) a viable option when writing custom and constantly updated websites?

I've just finished going through the MvcMusicStore tutorial found here. It's an excellent tutorial with working source code. One of my favorite MVC v2 tutorials so far.
That tutorial is my first introduction to using ADO.NET Entity Framework and I must admit that most of it was really quick and straight-forward. However, I am worried about maintainability. How customizable is this framework when the customer requests additional features to their site that require new fields, tables and relationships?
I am very concerned about not being able to efficiently execute customers' change orders, because the Entity models are basically drag-and-drop, computer-generated code. My experience with code generators is not good. What if something goes haywire in the guts of the model and I'm unable to put Humpty Dumpty back together?
In the long run, I wonder if using hand-typed models which human beings can read and edit is a more efficient course than using Entity Framework.
Has anyone worked enough with entity framework to say that they are comfortable using it in a very fluid development environment?
I have been using Entity Framework (v1.0) for about a year in my current project. We have hundreds of tables, all added to the edmx.
Problems we faced (though I'm not sure whether the new Entity Framework resolves these issues):
If you are used to the VS.NET IDE, you will be used to doing all drag/drop operations from the IDE. The problem is, once your edmx hosts hundreds of tables, the IDE really stalls and you have to wait 3-4 minutes before it becomes responsive.
With so many tables, any edits you make to the edmx take a long time.
When you use version control, comparing a 10,000-line XML file is quite painful. Think about merging two branches, each with a 10,000-line edmx, plus new tables, new associations between tables and deleted associations, going back and forth comparing XML. You will need a good XML comparison tool if you are serious about merging two big edmx files.
For performance reasons we had to make the csdl, msl and ssdl embedded resources.
Your edmx has to be in sync with your DB all the time; or at least, when you try to update the edmx, it will try to sync and might throw some obscure errors if they are out of sync.
Be aware that your entities (tables/views) should always have a primary key, or else you will get obscure errors. See my other question here.
Things we did, or that I might consider in the future when using EF:
Use multiple edmx files, with one edmx for tables that are logically grouped/linked together. Be aware that if you do this, each edmx should live in its own namespace. If you try to add two related tables (say Person and Address) to two edmx files in the same namespace, you will get a compiler error stating that the foreign key relationship is already defined. (Tip: create a folder and create the edmx under this folder. If you try to alter the namespace in the edmx without having the folder, it does not save the namespace properly the next time you open/edit it.)
Fewer tables per edmx => a less heavy container => good.
Fewer tables per edmx => easier merges when merging two branches.
Be aware that the object context is not thread safe.
Your repository (or whatever DAO you use) should be responsible for creating and disposing the container it creates. Using DI frameworks, especially in a web app, complicated things for us. Web requests are served from the thread pool, and the container was not disposed properly after the web request was served because the thread itself was not disposed. The container got reused (when the thread was reused) and created a lot of concurrency issues.
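A minimal sketch of that "create and dispose per operation" pattern, assuming an EF 1.0-style generated ObjectContext; the MyEntities, Orders and Order names are hypothetical:

using System.Collections.Generic;
using System.Linq;

public class OrderRepository
{
    // Each call creates and disposes its own context, so nothing is shared
    // between thread-pool threads and the concurrency issues described above go away.
    public List<Order> GetOrdersForCustomer(int customerId)
    {
        using (var context = new MyEntities())
        {
            return (from o in context.Orders
                    where o.CustomerID == customerId
                    select o).ToList();   // materialize before the context is disposed
        }
    }
}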
Don't trust your VS IDE. Get a good XML editor and know how to edit the edmx file (though you don't need to edit the designer). Get your hands dirty.
ALWAYS, ALWAYS, ALWAYS (I just cannot emphasize this enough) run a SQL profiler (and I mean on each and every step of your code) when you execute your queries. As complex as the query might look, you will be surprised to find how many times you hit the DB. Example:
var myOrders = from t in context.Orders
               where t.CustomerID == 123
               select t;                    // query not yet executed

if (myOrders.Count() > 0)                   // DB query to get the count
{
    var firstOrder = myOrders.First();      // another DB query to get the first result
}

Better approach:

// Query materialized, just one hit to the DB because we call ToList()
var myOrders = (from t in context.Orders
                where t.CustomerID == 123
                select t).ToList();

if (myOrders.Count > 0)                     // no DB hit
{
    // do something
    var myOrder = myOrders[0];              // no DB hit
}
Know when to use tracking and when to use no-tracking (for read-only data); web apps do a lot more reads than writes. Set these properly when you initialize your container.
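In EF 1.0 that typically means setting the MergeOption on the generated ObjectQuery properties. A small sketch, with hypothetical context and entity names:

using System.Collections.Generic;
using System.Data.Objects;   // MergeOption lives here in EF 1.0/4.0
using System.Linq;

public static class ReadOnlyQueries
{
    public static List<Order> GetOrdersForCustomer(int customerId)
    {
        using (var context = new MyEntities())
        {
            // Read-only query: skip change tracking so the context does far less bookkeeping.
            context.Orders.MergeOption = MergeOption.NoTracking;

            return (from o in context.Orders
                    where o.CustomerID == customerId
                    select o).ToList();
        }
    }
}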
Did I forget compiled queries? Look here for more goodies.
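For reference, the compiled query pattern in EF 1.0/4.0 looks roughly like this; CompiledQuery.Compile is the real API, while the context and entity names are hypothetical:

using System;
using System.Data.Objects;   // CompiledQuery
using System.Linq;

public static class OrderQueries
{
    // Compiled once, then reused; saves the repeated LINQ-to-Entities translation cost.
    public static readonly Func<MyEntities, int, IQueryable<Order>> ByCustomer =
        CompiledQuery.Compile((MyEntities context, int customerId) =>
            context.Orders.Where(o => o.CustomerID == customerId));
}

// Usage:
//   using (var context = new MyEntities())
//   {
//       var orders = OrderQueries.ByCustomer(context, 123).ToList();
//   }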
When getting thousands of rows back from your DB, make sure you use IQueryable and detach the objectContext so that you don't run out of memory.
Update:
Julie Lerman addresses the same problem with a similar solution. Her post also points to Ward's work on dealing with a huge number of tables.
I'm not too familiar with Entity Framework, but I believe it simply generates an EDM file which can be hand-edited. I know I've done this quite frequently with the Linq-to-SQL DBML files that the designer generates (it's often faster to hand-edit them than use the designer for small tweaks).
You know I'd be interested if any developers can provide some insight into this.
Any Entity Framework examples seem to only consist of about ten to twenty tables, which is small scale really.
What about using the EF on a database with hundreds or even a thousand tables?
Personally, I know several developers and organisations that were burned by LINQ-to-SQL and are holding off for a year or so to see what direction EF takes.
Starting from Entity Framework 4 (with Visual Studio 2010), the generated code is outputted from T4 (Text Template Transformation Toolkit) files which you can edit so you have full control over what is generated. See Oleg Sych's blog which is a mine of information about T4. Code generation is not a problem and T4 opens so many perspectives that I can't live without anymore.
I'm currently working on a project where we use Entity Framework 4 for the data access layer, and Scrum as the agile project management method. From one sprint to another, there are several tables added, other modified, new requirements added. When you have run once into each potential EF problem (like knowing that default values from database are not persisted by default in the .edmx file, or changing a nullable column to a non-nullable and updating the designer doesn't change the mapped property state), you're good to go.
Edit: to answer your question, it's EF 4 whose code generation is based on T4, rather than T4 supporting EF. On EF 3.5 (or EF 1.0 if you prefer), you could in theory use T4 by writing the templates from scratch, parsing the EDMX file in the T4 code and generating your entities. It would be quite a lot of work considering all of this is already done by EF 4. Plus, Entity Framework 3.5 only supports one type of entity, whereas EF 4 has built-in or downloadable templates for POCO entities (which don't know anything about persistence), Self-Tracking Entities, and so on.
Considering Entity Framework itself, I think it was lacking many features in its first release, and while usable, was quite frustrating to use. EF4 is much more improved. It still lacks some basic features (like enum support), but it has become my data access layer of choice now.

ASP.NET Scaffolding/Templating CRUD Solutions

I've been looking into ASP.NET Dynamic Data and how it does scaffolding and routing. I've only scratched the surface, but it looks like I'd have to create a template for each table where I don't want to display all the columns the same way.
My first impression after looking at Dynamic Data is that it would take less of the programmer's time to edit one-time generated user controls than to build a template for each table that doesn't have a uniform display behavior.
What proven solutions are people currently using that help ease the laborious tasks of creating ASP.NET CRUD type user controls?
Thanks
In ASP.NET WebForms we use CodeSmith. From a single entity we generate admin pages, code-behinds, service layers, data layers and DB stored procedures, all in a matter of seconds. I'd recommend you check it out for quickly building the CRUD in your apps.
We're actually working on our own code generation tool. It has already proven to work perfectly on the lower layers and now we're on the way to extend it for the presentation layer, that is for generating user controls.
I've not looked into Dynamic Data (although I'd like to when I have some time), but my biggest fear is always losing flexibility. The problem is that these front-ends may be regenerated dynamically each time based on some template, and editing them, especially to bring in special customer wishes, becomes quite difficult. For small standard apps it may work perfectly though.
What we're therefore doing is to "generate" these user controls based on a set of standard custom server controls we've developed, but we generate them just once, the first time, from some static information about the entities in our application. Then you can continue customizing.
Such systems should help developers by improving their development speed and doing the initial awkward work, but then they should give them maximum flexibility to modify the result. They should not add additional complexity...
I used the .netTiers CodeSmith templates a long time ago (years) and they proved very strong, so they must be even better now.
I know a (big) company that built a customization engine (providing a GUI for internal company options) around those templates so they could use them in most of their applications, and they were very successful with it.
I've used http://www.ironspeed.com/ in the past which has been great. Saved us MONTHS of time on our last project which has a big DB, so the cost is worth it. But it looks a bit ugly and can be tricky to update the DB schema once you've generated.
Obviously there's not much in widespread use out there other than what's provided in Visual Studio.
Have a look at Blinq.

ASP.NET based Workflow Engine

I am working on a design spec for a new application that will be heavily workflow driven.
Before I re-invent the wheel, is there a decent lightweight workflow engine that plugs into ASP.NET already around?
Basically, I'm looking for something that handles moving through a defined set of workflow pages while handling state management automatically.
If this isn't around already, I'll definitely try to abstract the engine from my app and put it on codeplex, as it would be really handy.
Any suggestions?
Note: .NET 2.0, so no WWF, though I think WWF is overkill for my needs.
EDIT: Seems like there is a legitimate need for this, and there isn't a product out there...So I might build this.
Here is what I'm picturing:
Custom Page class called WebFlowPage
All WebFlowPages are registered in a workflow mapper.
Each WebFlowPage has some form of state object.
A HttpHandler handles picking the appropriate WebFlowPage based upon the workflow, and populating it from the state object.
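A bare-bones sketch of those pieces in .NET 2.0-era C#: the WebFlowPage and mapper names come from the outline above, while the wiring, the Checkout flow and the page paths are hypothetical.

using System.Collections.Generic;
using System.Web.UI;

// Base class every step page derives from; each page exposes its own state object
// so the engine can persist and restore it between steps.
public abstract class WebFlowPage : Page
{
    public abstract object FlowState { get; set; }
}

// The workflow mapper: maps a workflow name and step index to the page for that step.
public static class WorkflowMapper
{
    private static readonly Dictionary<string, string[]> steps = new Dictionary<string, string[]>();

    static WorkflowMapper()
    {
        steps["Checkout"] = new string[] { "~/Flows/Cart.aspx", "~/Flows/Shipping.aspx", "~/Flows/Confirm.aspx" };
    }

    public static string Resolve(string workflow, int stepIndex)
    {
        return steps[workflow][stepIndex];
    }
}

// An IHttpHandler (or HttpModule) would look up the current workflow and step from
// session state, call WorkflowMapper.Resolve, transfer to that page, and hand it its
// saved state object through the FlowState property.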
Is the workflow dynamic, or static?
If the workflows are simple, you could roll your own workflow engine.
In certain situations, it can be fairly simple, and just a couple of data tables to handle the rules, processing and state.
A lot of workflow engines are built for large-scale processing (credit card applications, for example). For small-scale needs, you should at least consider rolling your own, which would eliminate the overhead of and dependency on an engine.
Not sure exactly what you wish to do here, but Ra-Ajax can easily keep state at least if you want your solution ajaxified...
For reference purposes you might want to check out the Ajax Calendar sample or even the (banalistically implemented) Ajax Wizard sample. It surely beats the hell out of doing it with JavaScript...
And every time you "do something" you're in "server-land" which means you can store temporaries all the time as you wish...
The project is LGPL
(PS!
Yes I do work with it)
Building a custom workflow engine is not trivial, although it may seem simple at first. We've tried that. It depends a lot on the complexity of the logic you need it to cover.
Given the current state of the Windows Workflow Foundation and the lack of another framework that abstracts the workflow concepts, I would choose WF if you need complex logic, asynchronous handling or branches in your workflows.
Tracking your state through the workflow can be accomplished by carrying some kind of XML payload or by storing the state in a database.
If your workflow is actually a sequential set of forms that need to be filled in by the user, tracking the steps and guiding the user to the next step can be accomplished with some simple custom solution.
You could take a look at the InRule engine too.
Also, there is nxBRE.
These too are mostly used for business rules.
InRule is proprietary, whereas nxBRE supports RuleML (the de facto standard).
You might need to make your own implementation for the pages, and use the rule engine as the "structure".
At this moment, I know that SharePoint 2007 supports page workflows (using WF), but this would imply using .NET Framework 3.0 and deploying SharePoint.
My suggestion would be to use whatever you find more light and easier to use.
I think the term "workflow" is very open to interpretation. I have been working lately with a type of workflow that is very different from what you seem to be describing. Mine is a state machine based workflow where the state of a particular record determines what actions a user can take to move the record to the next step in the business process. So "workflow" in this instance means how the record flows from one state to another until it is finally completed.
Your usage of workflow seems to have more to do with moving a user from one page to another in a linear multi-step process, which is a completely different use case (correct me if I'm wrong). So before coming up with a general purpose "workflow" engine that anyone could use, I would recommend defining a little bit better exactly what types of situations this system would handle.
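As a concrete illustration of that state-machine flavour of workflow, here is a minimal sketch; the record type, states and transitions are all hypothetical:

using System;
using System.Collections.Generic;

public enum OrderState { Draft, Submitted, Approved, Rejected, Completed }

// The state of a record determines which transitions (user actions) are allowed next.
public static class OrderWorkflow
{
    private static readonly Dictionary<OrderState, OrderState[]> allowed =
        new Dictionary<OrderState, OrderState[]>
        {
            { OrderState.Draft,     new[] { OrderState.Submitted } },
            { OrderState.Submitted, new[] { OrderState.Approved, OrderState.Rejected } },
            { OrderState.Approved,  new[] { OrderState.Completed } },
        };

    public static bool CanMove(OrderState from, OrderState to)
    {
        OrderState[] next;
        return allowed.TryGetValue(from, out next) && Array.IndexOf(next, to) >= 0;
    }
}

// Usage: only render the "Approve" action when
// OrderWorkflow.CanMove(order.State, OrderState.Approved) returns true.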
I've been using this for a few months: http://objectflow.codeplex.com. It's not ASP-specific, but it may fit your needs.
While browsing the web for some workflow & BPM resources, I found the following project: NetBPM. Unfortunately, the project seems to be stopped.
I don't think there is a workflow engine that will automatically handle state for you, but if you are moving through a set of pages like a process such as checkout on an ecommerce site, perhaps the ASP.NET wizard control could help you?
There are a few workflow options. "Aspose" and "Skelta" are the offerings I'm evaluating.
Fábio
You can use Workflow Engine; just read the documentation and run the demo.
All of the features you need for a dynamic workflow engine have been added there.

Does anyone use Iron speed designer for rapid asp.net development? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 4 years ago.
Visual Studio is pretty good but doesn't create stored procedures automatically. Iron Speed Designer supposedly does. But is it any good?
I have used Ironspeed extensively for the past two years for most of our ASP.NET forms over data projects.
It works. Does several things well: stored procs, fast layout of table browse and CRUD screens, fast layout of single record CRUD screens. It manages the round-trip (or half-round trip) process decently, detecting changes in your back end db schema and updating its data access layer, then making the changed columns available for you to alter your UI (in record or table control panels). ISD (as they call it) does an excellent job in making security management for your app pretty painless, even down to the control level (if you use ISD's subclassed versions of asp.net controls). Final plus, not a small one, is the CSS-based theme control (easy to change to a variety of themes, easy to customize a particular theme, and not even too bad to build your own theme variant by forking an existing one you like). Depending upon whether you let ISD create your stored procs in the code base or the database, changing DB's at run time can be a piece of cake.
Fairly active forum with a core group of helpful contributors. You can probably avoid the paid tech support through the forum.
Okay, the down sides. Creates fairly large code conglomerations, being a three tiered architecture. As Galwegian says, like any framework, you've got the velvet handcuffs (get your mind out of the gutter if you are thinking about anything other than code limitations and conventions!). The velvet handcuffs are the page and control model, the data layer, lack of a business object/class capability per se, the postback model, and the temptation to make your user GUI look like THEIR user GUI that comes out of the box because it is so darned easy and convenient.
ISD builds a basic page by combining an HTML template (into which you place ISD-specific code generation tags and any other tags, etc., either using the ISD GUI or by hand). The page model relies upon a code-behind page created from a piece of code template. The base classes are almost completely overridable, so you can override all of the default functions, regenerate the application and not lose your overrides. The database controls live in the page container, but have their own class definitions (i.e., their code-behind) in specific /app_code files. Again, each control type has its own base class with pretty completely overridable methods. A single record control (showing a single DB record) is pretty simple. A table, showing several records, has a table class and a table row class. The ISD website (www.ironspeed.com/support) has good documentation of the ISD model as a whole.
So, where are the problems in this model?
1. Easy and tempting to live with their out-of-the-box GUI. Point ISD at your database, pick the tables you want it to turn into pages, tell it the kinds of pages, give it a thematic style, and five minutes later you're viewing the application. Cool. But it is very easy to forget that their user GUI is probably not what your user wants to see. So, be prepared to think for yourself and tinker with the GUI thus created. Not hard to do, and you can use VS 2005 to help you.
2. Business objects. You could put together your own business objects, but it would be difficult and you would get no help from ISD. ISD does a LOT of building of simple validation and checking (appropriate look-up values, ranges, lengths, etc.). ISD lets you build custom queries, but these are read-only. It is smart enough (and you can override the write from a page in any case) to let you take a one-to-many view and write it back to the database (you'd probably override the default base method, but it isn't that hard to do). However, when you get into serious dependency checking, ISD is still really about tables and not business objects. So, you're going to write some code.
If you are smart, you'll write it once, store it in app_code somewhere, and use it by calling it from an overridden method in your table or record controls. If you are like most of us, you'll first spaghetti it into one of the code-behind classes above, and then forget you did so, or have a copy in each of the 10 pages that manipulate customer data. In my world, that has usually meant 5 identical functions and 5 that are all different (even though they are all supposed to be the same). ISD makes it tempting to order marinara, because the model lends itself to spaghetti code. Of course, you can completely prevent this, but you gotta learn the ISD model to determine the best way to do it on your project.
3. Page state and postbacks. Although ISD is quite open about this problem and tells users not to just take the defaults of returning the whole ASP.NET page state in the postback stream (cache on the server instead), the default is to return the whole page. This can make for some BIG pages, which makes users think S L O W. As I said, you can manipulate this. But what newbie is going to get this when it is SO tempting to just point, click, and boom - instant application. Your manager is now off your back because her product inventory table is "on the web" with a cool search and edit GUI (of 400KB state pages, if you've gone a bit nuts and have just taken the default behaviors of ISD). Great in-house, but the customers in the real world....
Again, knowledge is the key. You can fix this, but you need to know you SHOULD.
4. Database read/write postbacks. No big problem here, but you also need to know that the model is to fetch only the data used at the moment. If your table shows 1000 records in 50-record increments, when you go from records 1-50 to 51-100, you will post back and hit the database again. This keeps data current, but increases server traffic.
Overall: Try the demo version. Point it at something simple that you really want to turn in to an asp.net application. Build maybe three tables. Then dissect it using the above as a guide. See what YOU think and post back to this question.
I have used it for convenience for a very small project. It did what I wanted and saved me a couple of days work.
The main problem I found was when it came to customising or extending the generated project. You have to spend quite a bit of time trying to understand Ironspeed's way of doing things which, I'll admit, is not my way.
I'd use it again for a small project if I knew in advance I wouldn't have to customise it much after.
If stored procedure generation is all you are after, CodeSmith is a decent option at a fraction of the cost of IronSpeed. There are several sproc templates available, and you can create your own or tweak an existing one if that is what you need. You can also generate .NET code to your heart's content with CodeSmith. Tons of business class templates already exist for this.
IronSpeed's value is not in the sproc generation, but in the RAD features. I agree with #Galwegian... IronSpeed is OK for mock ups or very simple apps, not so good at all if you need to do any customization.
You may want to check out Evolutility CRUD framework. It provides some of the same features (limited to CRUD) and is open source.
IronSpeed has been great (out-of-the-box) at helping me develop data-driven corporate Intranet applications. While the code model takes a little getting used to, it is effective at maintaining a nice three-tier app. While the page templates can appear garish compared to 2010's web-design, it gets the job done, when you need function over form.
Iron Speed Designer is great for simple CRUD type web applications. You can find some useful information on our web site http://www.dotnetarchitect.co.uk/
