EF Core custom results from stored procedure - asp.net

I'm using EF Core, which I believe is also known as EF 7. Anyway, I have a stored procedure that returns custom results that don't correspond to any specific table. How am I supposed to access those results, and how should I call the SQL command?
Normally we would use .FromSql, but that is only available on entities, e.g. _context.User.FromSql(). I don't have an entity for it.
So I tried building a DbSet/entity for the results, but again, there is no associated table, and there is also no key. How am I supposed to parse the custom results, then?

You can create a fake entity for the result of your stored procedure. You can set any property as the Key, even if the key values are not unique in the stored procedure's results.
For example, if you have a table like the following:
CREATE TABLE [dbo].[banana_hoard]
(
    [id] INT NOT NULL PRIMARY KEY IDENTITY (1,1),
    [owner] NVARCHAR(64) NOT NULL,
    [bananas] BIGINT NOT NULL
)
You can have a query that does not return the row id, like this:
public class Program
{
    public static void Main(string[] args)
    {
        using (var db = new MonkeyDbContext())
        {
            // each returned row is mapped onto the StoredProcRow entity
            var sp_results = db.search.FromSql(@"execute <YOUR_STORED_PROC>");

            // JsonConvert comes from Newtonsoft.Json
            var str_result = String.Join("\n", sp_results.Select(a => JsonConvert.SerializeObject(a)));
            Console.WriteLine("stored proc result :\n" + str_result);
        }
    }
}
public class MonkeyDbContext : DbContext
{
    public DbSet<StoredProcRow> search { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder builder)
    {
        builder.UseSqlServer(@"Server=(localdb)\monkey_db;Database=monkey_db;Trusted_Connection=True;");
    }
}

public class StoredProcRow
{
    [Key]
    public string Owner { get; set; }

    public long Bananas { get; set; }
}

Related

How to diagnose slow Entity Framework stored procedure call?

Problem: I'm calling a stored procedure through EF Core. When I run the stored procedure directly (via 'debug procedure'), it runs quickly, but it runs VERY slowly when called by EF's FromSqlRaw. So the problem appears to be when converting the returned data-table to a list of objects.
Setup:
Simple application with a list of blog posts. The stored procedure gets a hierarchical list of posts and associated users from a TPH table of posts, plus a table of users.
// Code is simplified, actually 8 parameters
SqlParameter depth_p = new SqlParameter("@depth", depth);
SqlParameter authorizedUserID_p = new SqlParameter("@authorizedUserID", authorizedUser.ID);

IQueryable<PostUser> query = postContext.PostUsers
    .FromSqlRaw("Post.USP_ReadDebate @depth, @authorizedUserID",
        parameters: new[] { depth_p, authorizedUserID_p });

List<PostUser> postUsers = query.ToList(); // This hangs.
26 columns are returned and converted by EF into the PostUser class.
PostUser holds 26 "ordinary" properties. No navigation properties, custom classes or any getters or setters that do any work.
public class PostUser
{
    // Post fields
    public Int32? ID { get; set; }       // Primary Key
    public String Text { get; set; }
    public Guid OwnerID { get; set; }
    public int? ParentID { get; set; }   // nullable
    public bool IsDisabled { get; set; }
    public DateTime TimeCreated { get; set; }
    public bool User_IsBanned { get; set; } = false;
    // some others...

    public PostUser() { }
}
Note: the stored procedure is very complex. It calls another stored procedure which fills a #spid temp table, then inserts the contents of that #spid table into a table variable and returns that.
But again, when debugged directly it returns quickly, so I think the problem is in how EF Core converts the returned data into PostUser objects.
Bottom line: is there any way to get visibility into what EF Core is doing on the conversion to PostUser, to find the problem?
Thank you!
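One way to get that visibility is EF Core's built-in logging, which shows the generated SQL and how long the database command itself takes; that separates slow SQL from slow materialization into PostUser. A minimal sketch, assuming EF Core 5.0 or later; the PostContext name and the connection-string placeholder here are hypothetical:
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Logging;

public class PostContext : DbContext
{
    public DbSet<PostUser> PostUsers { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    {
        optionsBuilder
            .UseSqlServer(@"<YOUR_CONNECTION_STRING>")
            // Writes generated SQL and per-command timings to the console
            .LogTo(Console.WriteLine, LogLevel.Information)
            // Includes parameter values in the log output (development only)
            .EnableSensitiveDataLogging();
    }
}
If the "Executed DbCommand" entry reports a short elapsed time but ToList() still hangs, the time is being spent reading and materializing the result set rather than in SQL Server.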

Updating database keys, clustered index error

I'm working with VS2013 on an ASP.NET Web API project with Entity Framework, deployed on Azure.
I'm trying to alter the keys on one of my tables so that, instead of being composed of 2 foreign keys, it's composed of 3 keys. Using the Add-Migration command, it generates the following migration:
public partial class ChangeKeys_v4 : DbMigration
{
    public override void Up()
    {
        DropPrimaryKey("dbo.MyTable");
        AddPrimaryKey("dbo.MyTable", new[] { "ClientId", "Order", "ZoneId" });
    }

    public override void Down()
    {
        DropPrimaryKey("dbo.MyTable");
        AddPrimaryKey("dbo.MyTable", new[] { "ClientId", "ZoneId" });
    }
}
This is the entity class that was changed to include order as a key:
public class MyTable
{
    [Key, Column(Order = 1), ForeignKey("Client")]
    public int ClientId { get; set; }

    [Key, Column(Order = 2)]
    public int Order { get; set; }

    [Key, Column(Order = 3), ForeignKey("Zone")]
    public string ZoneId { get; set; }

    public virtual Client Client { get; set; }
    public virtual Zone Zone { get; set; }

    public MyTable() { }

    public MyTable(Client c, int o, Zone z)
    {
        Client = c;
        Order = o;
        Zone = z;
    }
}
I've successfully run Update-Database on my development environment, but when doing so on the test environment I get the following error:
Tables without a clustered index are not supported in this version of SQL Server. Please create a clustered index and try again.
Could not drop constraint. See previous errors.
The statement has been terminated.
What changes can I make to my migration class so it works?
I've seen some solutions, and most say to drop the table; that's not really an option when I have to push to the production environment.
You can't do the EF code-to-database upgrade for this change with Azure hosting the DB.
First, you have to have downtime. Then you run a script in SSMS connecting to the Azure DB. The script will:
1. Rename MyTable to e.g. MyTableOld.
2. Create MyTable with the new PK, FKs, etc.
3. insert into MyTable(col names) select (col names) from MyTableOld;
4. Run some checks: row counts in both tables, other queries whose results you know. Basically check everything works.
5. Drop table MyTableOld.
Maybe delay step 5 until you're sure it's all OK after the users have given it a test drive. A rough sketch of wrapping these steps in a migration follows below.
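Those steps could also be wrapped in a hand-written EF migration so they stay in source control. This is a rough sketch only, using raw Sql() calls; the NVARCHAR length for ZoneId and the omitted foreign keys are assumptions to adjust against your real schema:
public partial class RebuildMyTablePk : DbMigration
{
    public override void Up()
    {
        // 1. Keep the existing data to one side
        Sql("EXEC sp_rename 'dbo.MyTable', 'MyTableOld';");

        // 2. Recreate the table with the new three-column clustered PK
        //    (foreign keys to Client and Zone omitted here for brevity)
        Sql(@"CREATE TABLE dbo.MyTable (
                  ClientId INT NOT NULL,
                  [Order]  INT NOT NULL,
                  ZoneId   NVARCHAR(128) NOT NULL,
                  CONSTRAINT PK_MyTable PRIMARY KEY CLUSTERED (ClientId, [Order], ZoneId)
              );");

        // 3. Copy the data back
        Sql(@"INSERT INTO dbo.MyTable (ClientId, [Order], ZoneId)
              SELECT ClientId, [Order], ZoneId FROM dbo.MyTableOld;");

        // 5. Drop the old table once the checks in step 4 have passed
        Sql("DROP TABLE dbo.MyTableOld;");
    }

    public override void Down()
    {
        // Reversing a PK rebuild safely is easier from a backup than from code
        throw new NotSupportedException();
    }
}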

Inject or Bind "Alias" in a ServiceStack entity

I have 3 tables which contain the same set of columns. Do I need to create 3 entities for all the DB tables? Is there a way to avoid creating 3 entities and have only one in ServiceStack?
Yes, there is one way of doing it, like below:
List<EntityA> list = db.SqlList<EntityA>("SELECT COL_A,COL_B FROM TableA");
Entity without an [Alias] on the class itself:
public class EntityA
{
    [Alias("COL_A")]
    public string ColumnA { get; set; }

    [Alias("COL_B")]
    public string ColumnB { get; set; }
}
In this way I can change the table name (TableA/TableB/TableC) provided in the query,
but I want something like injecting or passing the alias while retrieving the results from the database. I am not sure if this is possible with ServiceStack.
Edited
Let me rephrase the question: instead of returning different objects like EntityTableA/EntityTableB/EntityTableC as the result, I want
return db.Select<GenericEntity>(w => w.OrderBy(o => o.ColumnA));
where GenericEntity can be any table's result.
You can just use inheritance to reduce boilerplate:
public class EntityBase
{
    [Alias("COL_A")]
    public string ColumnA { get; set; }

    [Alias("COL_B")]
    public string ColumnB { get; set; }
}
Then inherit properties from the shared entity, e.g:
public class TableA : EntityBase {}
public class TableB : EntityBase {}
Then query it as normal:
var results = db.Select<TableA>(q => q.ColumnA == "A");
Otherwise, yeah, using any of the raw SQL APIs will work as well.
Modifying SqlExpression
You can also override the SqlExpression's FromExpression to include your own table, e.g:
var q = db.From<GenericEntity>().OrderBy(o => o.ColumnA);
q.From("TableA");
List<GenericEntity> results = db.Select(q);
This will change the SQL to SELECT from TableA instead.
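Building on that, the table name can be passed in as a parameter, which is effectively the "injected alias" the question asks for. A short sketch, reusing the From(string) call shown above; the helper name is hypothetical and GenericEntity is assumed to be the shared entity with the column [Alias] attributes and no class-level [Alias]:
public static List<GenericEntity> SelectFromTable(IDbConnection db, string tableName)
{
    // Same GenericEntity for TableA, TableB or TableC; only the FROM changes
    var q = db.From<GenericEntity>().OrderBy(o => o.ColumnA);
    q.From(tableName);
    return db.Select(q);
}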

How do I name a many-many table for EF6 and do I need to add special mapping for this?

I am using EF 6 and trying to map a many to many relationship. So far I have:
public partial class ObjectiveDetail
{
    public ObjectiveDetail()
    {
        this.SubTopics = new List<SubTopic>();
    }

    public int ObjectiveDetailId { get; set; }
    public string Text { get; set; }

    public virtual ICollection<SubTopic> SubTopics { get; set; }
}

public partial class SubTopic
{
    public SubTopic()
    {
        this.ObjectiveDetails = new List<ObjectiveDetail>();
    }

    public int SubTopicId { get; set; }
    public int Number { get; set; }
    public string Name { get; set; }

    public virtual ICollection<ObjectiveDetail> ObjectiveDetails { get; set; }
}
Our DBA is going to write the code for the many-to-many table. Should this be as follows, with a table name of ObjectiveDetailSubTopic, or something completely different?
CREATE TABLE [dbo].[ObjectiveDetailSubTopic] (
    [ObjectiveDetailId] INT NOT NULL,
    [SubTopicId] INT NOT NULL
);
Can someone tell me if this is the correct way to create the table? Also, do I have to add some code to map the ObjectiveDetail and SubTopic classes to the new join class so EF will know what to do?
Our DBA is going to write the code for the many-to-many table. Should this be as follows, with a table name of ObjectiveDetailSubTopic, or something completely different?
As long as you follow the SQL database table naming conventions, the table name can be anything. I usually name the join table as you did, by joining the two table names.
To create the join table using SQL, see below:
CREATE TABLE [dbo].[ObjectiveDetailSubTopic](
    ObjectiveDetailSubTopicId int identity primary key,
    ObjectiveDetailId INT NOT NULL,
    SubTopicId INT NOT NULL,
    foreign key(ObjectiveDetailId) references ObjectiveDetail(ObjectiveDetailId),
    foreign key(SubTopicId) references SubTopic(SubTopicId)
);
But you don't need to create the join table on your own; Entity Framework will create it for you. You just need to map the relationship with the Fluent API in your DbContext class, like below:
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    modelBuilder.Entity<ObjectiveDetail>()
        .HasMany(c => c.SubTopics)
        .WithMany(p => p.ObjectiveDetails)
        .Map(m =>
        {
            m.MapLeftKey("ObjectiveDetailId");
            m.MapRightKey("SubTopicId");
            m.ToTable("ObjectiveDetailSubTopic");
        });
}
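With that mapping in place the join table is maintained for you: linking the navigation collections and saving is enough. A small usage sketch, assuming a DbContext that exposes DbSet<ObjectiveDetail>; the SchoolContext name and the sample values are hypothetical:
using (var context = new SchoolContext())
{
    var subTopic = new SubTopic { Number = 1, Name = "Fractions" };
    var objective = new ObjectiveDetail { Text = "Understand fractions" };

    // Adding to the navigation collection is all that is needed; EF inserts
    // the ObjectiveDetailSubTopic row itself during SaveChanges.
    objective.SubTopics.Add(subTopic);

    context.ObjectiveDetails.Add(objective);
    context.SaveChanges();
}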

EF Migrations: Move Table from 2 Column PK to Single Column causes ALTER before DROP and fails

I'm using EF 4.3.1 Code First Migrations. I have a table like:
public class Product
{
    [Key]
    [Column(Order = 0)]
    [MaxLength(100)]
    public string Store { get; set; }

    [Key]
    [Column(Order = 1)]
    [MaxLength(100)]
    public string Sku { get; set; }
}
I have an existing table created with the above code. I then moved it to a single-column Primary Key:
public class Product
{
    [MaxLength(100)]
    public string Store { get; set; }

    [Key]
    [MaxLength(100)]
    public string Sku { get; set; }
}
This causes EF to fail in the next automatic migration, complaining:
ALTER TABLE [Product] ALTER COLUMN [Store] nvarchar
The object 'PK_Product' is dependent on column 'Store'. ALTER TABLE ALTER COLUMN Store failed because one or more objects access this column.
Clearly the PK_Product needs to be dropped before attempting to fire this ALTER statement (why is it altering the column at all?), but instead the migration fails.
Am I doing something wrong or is this a bug? Workarounds?
You won't be able to do this with an automatic migration. You'll have to create a migration using Add-Migration and then change it so it only modifies the PK.
The migration can be as simple as:
public partial class TheMigration : DbMigration
{
    public override void Up()
    {
        DropPrimaryKey("Products", new[] { "Store", "Sku" });
        AddPrimaryKey("Products", "Sku");
    }

    public override void Down()
    {
        DropPrimaryKey("Products", new[] { "Sku" });
        AddPrimaryKey("Products", new[] { "Store", "Sku" });
    }
}
EF is altering the column because, when it's part of a Key, it's implicitly NOT NULL.
You can leave it as-is, add a [Required] attribute, or allow EF to alter the column after dropping the PK.
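If you go the [Required] route, the idea is simply to keep Store declared NOT NULL even though it is no longer part of the key. A brief sketch of the second Product class from above with that attribute added:
public class Product
{
    [Required]       // keeps the column NOT NULL after it leaves the primary key
    [MaxLength(100)]
    public string Store { get; set; }

    [Key]
    [MaxLength(100)]
    public string Sku { get; set; }
}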
