Adding/updating child and parent record same time - asp.net

Can someone please show me the easiest way to create/update a parent and child record at the same time (like a customer with multiple addresses) with as little code as possible? Both in Web Forms and in MVC.

The basic idea would be to create/update the parent record and return the new ID (key). Then use that key to create the related child records. For example, say you have an Events table and a related EventDates table:
public static int CreateEvent(
    DateTime datePosted,
    string title,
    string venue,
    string street1,
    string city,
    string state,
    string zipCode)
{
    // ... insert the Events row and return the newly generated EventID
}
public static void AddEventDates(
    int eventDateID,
    int eventID,
    DateTime startDate,
    DateTime endDate)
{
    // ... insert an EventDates row for the given eventID
}
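For the Web Forms/ADO.NET case, CreateEvent could be filled in roughly like this (a sketch only; the table name, column names and connection string key are assumptions, not part of the question):

using System;
using System.Configuration;
using System.Data.SqlClient;

public static int CreateEvent(
    DateTime datePosted, string title, string venue,
    string street1, string city, string state, string zipCode)
{
    string connectionString =
        ConfigurationManager.ConnectionStrings["ConnectionString"].ConnectionString;
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(
        @"INSERT INTO Events (DatePosted, Title, Venue, Street1, City, State, ZipCode)
          VALUES (@datePosted, @title, @venue, @street1, @city, @state, @zipCode);
          SELECT CAST(SCOPE_IDENTITY() AS int);", connection))
    {
        command.Parameters.AddWithValue("@datePosted", datePosted);
        command.Parameters.AddWithValue("@title", title);
        command.Parameters.AddWithValue("@venue", venue);
        command.Parameters.AddWithValue("@street1", street1);
        command.Parameters.AddWithValue("@city", city);
        command.Parameters.AddWithValue("@state", state);
        command.Parameters.AddWithValue("@zipCode", zipCode);
        connection.Open();
        // SCOPE_IDENTITY() returns the identity value generated by the INSERT above
        return (int)command.ExecuteScalar();
    }
}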
It's important to maintain data integrity here: if one of the inserts fails, both need to be rolled back to their original state. You could implement that rollback logic yourself, or simply wrap both operations in a transaction:
http://msdn.microsoft.com/en-us/library/z80z94hz%28VS.90%29.aspx
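A minimal sketch of tying the two calls together with System.Transactions, assuming the two methods above (the CreateEventWithDates wrapper is hypothetical, purely for illustration):

using System;
using System.Transactions;

// Both inserts commit together or roll back together: TransactionScope enlists
// every connection opened inside the block in one ambient transaction.
public static void CreateEventWithDates(
    DateTime datePosted, string title, string venue, string street1,
    string city, string state, string zipCode,
    DateTime startDate, DateTime endDate)
{
    using (var scope = new TransactionScope())
    {
        int eventId = CreateEvent(datePosted, title, venue, street1, city, state, zipCode);
        AddEventDates(0, eventId, startDate, endDate); // 0 assumes EventDateID is generated by the DB
        scope.Complete(); // if an exception is thrown before this line, nothing is committed
    }
}

Note that if each helper opens its own SqlConnection, the scope may be promoted to a distributed transaction; passing a single open connection and SqlTransaction into both methods is a common alternative.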

Related

java DynamoDBMapper - partially mapped entities - amount of Read Capacity Units

Does the Java DynamoDBMapper load whole items when the @DynamoDBTable-annotated class maps only a subset of their attributes?
example: "Product" table, holding items with these attributes:
id, name, description. I would like to get the names of several products, without loading the description (which would be a huge amount of data).
Does this code load description from DynamoDB?
@DynamoDBTable(tableName = "Product")
public class ProductName {
    private UUID id;
    private String name;

    @DynamoDBHashKey
    @DynamoDBTyped(DynamoDBAttributeType.S)
    public UUID getId() { return id; }
    public void setId(UUID id) { this.id = id; }

    @DynamoDBAttribute
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}
...
DynamoDBMapper dynamoDBMapper = ...
dynamoDBMapper.batchLoad(products); // TODO is description loaded? what is the amount of Consumed Read Capacity Units?
As their docs say:
DynamoDB calculates the number of read capacity units consumed based on item size, not on the amount of data that is returned to an application. For this reason, the number of capacity units consumed will be the same whether you request all of the attributes (the default behavior) or just some of them (using a projection expression). The number will also be the same whether or not you use a filter expression.
As you can see, projections do not impact the number of capacity units consumed.
By the way, in your case the description attribute will be returned anyway: you do not need to annotate every field with a DynamoDB annotation, only those that are keys, are stored under a different name, or need custom converters. All non-annotated properties are populated from the corresponding table attributes automatically.

Audit.net data models example

Does anyone have a working example of how to add audit models to an existing project, for Audit.NET?
It is a fantastic component to use, and up until now my team and I have gotten by with the standard JSON files; however, we'd like to migrate our current solution to our Xamarin application and store the auditing in the local SQLite database on the device.
Unfortunately, the documentation for this project is somewhat lacking and there are no concise examples of how to get custom auditing working with Entity Framework.
We have worked through the MD files on the github repo, but we still cannot get auditing to work.
A similar question has been asked HERE, but there is no definitive example of what the Audit_{entity} table should look like, what fields it MUST contain, and how to set up relationships for it.
We tried to reverse engineer the JSON files into a relational structure, but at the time of asking this question, we have not gotten any auditing to write to the SQLite database.
Sorry the documentation isn't helping much; I hope I (or anybody else) can provide better documentation in the future.
I am assuming you are using Entity Framework to map your entities to a SQLite database, and that you want to use the EF data provider to store the audit events in the same database, in Audit_{entity} tables.
There is no constraint on the schema you use for your Audit_{entity} tables, as long as there is a one-to-one relation between each {entity} table and its Audit_{entity} table. The mapping can then be configured in several ways.
The recommendation for the Audit_{entity} tables is to have the same columns as the audited {entity} table, plus any common additional columns you need, like a User and a Date, defined on an interface.
So, if all your Audit_{entity} tables have the same columns/properties as their {entity}, and you add some common columns (defined on an interface), the configuration can be set like this:
public class User
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class Audit_User : IAudit
{
    public int Id { get; set; }
    public string Name { get; set; }
    // IAudit members:
    public string AuditUser { get; set; }
    public DateTime AuditDate { get; set; }
    public string Action { get; set; } // "Insert", "Update" or "Delete"
}
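The IAudit interface referenced above could be as simple as this (a sketch; its members just mirror what the configuration below assigns):

public interface IAudit
{
    string AuditUser { get; set; }
    DateTime AuditDate { get; set; }
    string Action { get; set; } // "Insert", "Update" or "Delete"
}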
Audit.Core.Configuration.Setup()
    .UseEntityFramework(x => x
        .AuditTypeNameMapper(typeName => "Audit_" + typeName)
        .AuditEntityAction<IAudit>((ev, ent, auditEntity) =>
        {
            auditEntity.AuditDate = DateTime.UtcNow;
            auditEntity.AuditUser = ev.Environment.UserName;
            auditEntity.Action = ent.Action;
        }));
Note the interface is not mandatory, but using it makes the configuration cleaner. Also note that you can make your Audit_{entity} classes inherit from their {entity} classes if you want to.
Update
Maybe my initial assumption is incorrect and you are not auditing EF entities but some other kind of operation. If that's the case, what you are looking for is a data provider that stores the audit events into your SQLite database.
At the moment there is no built-in data provider that stores to SQLite, and if there were one, it would store just the JSON representation of the event in one column (like the SQL Server/MySQL providers do). But it looks like you want a custom schema, so you will need to implement your own data provider.
Check the documentation here.
Here is a sample skeleton of a data provider:
public class SQLiteDataProvider : AuditDataProvider
{
    public override object InsertEvent(AuditEvent auditEvent)
    {
        // Insert the event into SQLite and return its ID
    }

    public override void ReplaceEvent(object eventId, AuditEvent auditEvent)
    {
        // Replace the event given its ID (only used for the creation policies
        // InsertOnStartReplaceOnEnd and Manual)
    }

    // async implementation:
    public override async Task<object> InsertEventAsync(AuditEvent auditEvent)
    {
        // Asynchronously insert the event into SQLite and return its ID
    }

    public override async Task ReplaceEventAsync(object eventId, AuditEvent auditEvent)
    {
        // Asynchronously replace the event given its ID
    }
}
Then you just set it up with:
Audit.Core.Configuration.Setup()
    .UseCustomProvider(new SQLiteDataProvider());
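Purely as an illustration, InsertEvent could be filled in with any SQLite client. This sketch assumes Microsoft.Data.Sqlite and a single AuditEvent table with an autoincrement id and a Json text column; the table and column names are assumptions, not something Audit.NET prescribes:

public override object InsertEvent(AuditEvent auditEvent)
{
    using (var connection = new SqliteConnection("Data Source=audit.db"))
    {
        connection.Open();

        var insert = connection.CreateCommand();
        insert.CommandText = "INSERT INTO AuditEvent (Json) VALUES ($json)";
        insert.Parameters.AddWithValue("$json", auditEvent.ToJson()); // Audit.NET's own JSON serialization
        insert.ExecuteNonQuery();

        var getId = connection.CreateCommand();
        getId.CommandText = "SELECT last_insert_rowid()";
        return (long)getId.ExecuteScalar(); // the generated row id becomes the event id
    }
}

Storing auditEvent.ToJson() keeps the full event intact even if you later decide to break it out into separate columns.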

Gridview with Dictionary

I've created a Dictionary like
Dictionary<Department, bool> dict = new Dictionary<Department, bool>();
Here Department is a class; I have Id, Name and Code for the departments, and in the bool I'm storing whether the person is HOD or not.
I'm adding the records to this Dictionary like
dict.Add(department,chkHOD.checked);
Here the records are successfully added to the Dictionary, and after this I'm binding the Dictionary to a GridView like
gridDept.Datasource=dict;
gridDept.Databind();
Now the inserted records are displayed fine in the GridView. After this I'm storing these records in the 'StaffDepartments' table in my database. I have 3 columns in the 'StaffDepartments' table:
1. StaffId (PK - has a link to the Staff table)
2. DepartmentId (PK - has a link to the Department table)
3. IsHOD
Here the records are stored fine in the database; no problem adding the records into the database.
I have some questions here:
1. How can I check whether the DepartmentId is already in the Dictionary before adding to it?
2. When I'm editing the staff detail, how can I delete the selected Department from the Dictionary by checking the checkbox in the GridView row? (Here the records are coming from the database, so when I click the delete button the record should be deleted in the database as well.)
If it were a List instead of a Dictionary, I could get the DepartmentId by
int departmentId = (int)gridDept.DataKeys[row.RowIndex].Values["DepartmentId"];
but with a Dictionary I don't know how to do the same with key and value pairs... can anyone help me here?
How can I check whether the DepartmentId is already there in the Dictionary before adding to it?
You could use this:
if (!dict.Keys.Any(d => d.DepartmentId == department.DepartmentId))
    dict.Add(department, chkHOD.Checked);
But something is wrong here. If your real key is the DepartmentId and not the Department (object identity) you should make it the key in the dictionary. For example, you could define a helper class:
public class DepartmentBindingHelper
{
    public int DepartmentId { get; set; }
    public Department Department { get; set; }
    public bool Checked { get; set; }
}
And then define a dictionary like this:
var dict = new Dictionary<int, DepartmentBindingHelper>();
And add the objects this way to the dictionary:
if (!dict.ContainsKey(department.DepartmentId))
    dict.Add(department.DepartmentId, new DepartmentBindingHelper
    {
        DepartmentId = department.DepartmentId,
        Department = department,
        Checked = chkHOD.Checked
    });
Then you can bind only the value collection to the grid:
gridDept.DataSource = dict.Values; // it's an IEnumerable<DepartmentBindingHelper>
gridDept.DataBind();
And your code to retrieve the DepartmentId from a row would work without changes:
int departmentId = (int)gridDept.DataKeys[row.RowIndex].Values["DepartmentId"];
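And for the second part of the question, once you have the id from the row, removing the entry from the dictionary is a plain key lookup. A sketch only; the matching DELETE against StaffDepartments would go in the same place:

int departmentId = (int)gridDept.DataKeys[row.RowIndex].Values["DepartmentId"];
if (dict.Remove(departmentId))           // Remove returns false if the key wasn't there
{
    // also delete the (StaffId, DepartmentId) row from StaffDepartments here
    gridDept.DataSource = dict.Values;   // rebind so the grid reflects the removal
    gridDept.DataBind();
}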

ASP.NET MVC2 LINQ - Repository pattern, where should the pagination code go?

I'm working on adding an HtmlHelper for pagination, but I am unsure where the proper and/or most beneficial place to put certain parts of the pagination code is, from a performance and maintainability standpoint.
I am unsure if the Skip(), Take() and Count() portions of Linq to SQL data manipulation should live within the repository or the controller.
I am also unsure if their order and where they are used affects performance in any way.
If they live within the repository from my understanding this is how it would work:
1. I would pass the pageIndex and pageSize as arguments to the repository's method that grabs the data from the database.
2. Then grab the full data set from the database.
3. Then store the count of TotalItems of that full data set in a variable.
4. Then apply the Skip() and Take() so the data set retains only the page I need.
5. Display the partial data set as a single page in the view.
If they live in the controller from my understanding this is how it would work:
1. I would grab the full data set from the repository and store it into a variable inside of the controller.
2. Then get the count of TotalItems for the full data set.
3. Then apply the Skip() and Take() so the data set retains only the page I need.
4. Display the partial data set as a single page in the view.
Inside the controller (I realize I will incorrectly get the page count here and not TotalItems):
Character[] charactersToShow = charactersRepository.GetCharactersByRank(this.PageIndex, this.PageSize);
RankViewModel viewModel = new RankViewModel
{
    Characters = charactersToShow,
    PaginationInfo = new PaginationInfo
    {
        CurrentPage = this.PageIndex,
        ItemsPerPage = this.PageSize,
        TotalItems = charactersToShow.Count()
    }
};
Inside the repository:
public Character[] GetCharactersByRank(int PageIndex, int PageSize)
{
    IQueryable<Character> characters = from c in db.Characters
                                       orderby c.Kill descending
                                       select new Character
                                       {
                                           CharID = c.CharID,
                                           CharName = c.CharName,
                                           Level = c.Level
                                       };

    characters = PageIndex > 1
        ? characters.Skip((PageIndex - 1) * PageSize).Take(PageSize)
        : characters.Take(PageSize);

    return characters.ToArray();
}
This is a partial example of how I was implementing the Skip(), Take() and Count() code living in the repository. I didn't actually implement getting and returning TotalItems, because that was the point where I realized I didn't know the proper place to put it.
Part of the reason I am unsure where to put these is that I don't know how LINQ to SQL works under the hood, so I don't know how to optimize for performance, or whether performance is even an issue in this case.
Does it have to grab ALL the records from the database when you do a .Count() on the Linq to SQL?
Does it have to make separate queries if I do a .Count(), then later do a .Skip() and .Take()?
Are there any possible performance problems with using .Count() prior to a .Skip() and .Take()?
This is my first time using an ORM so I'm not sure what to expect. I know I can view the queries Linq to SQL is running, however I feel that listening to someone with experience in this case would be better use of my time.
I would like to understand this more in depth, any insight would be appreciated.
I keep a generic PaginatedList class inside my Helpers folder where I also put other Helper classes.
The PaginatedList is straight out of NerdDinner, and it looks like this.
public class PaginatedList<T> : List<T>
{
    public int PageIndex { get; private set; }
    public int PageSize { get; private set; }
    public int TotalCount { get; private set; }
    public int TotalPages { get; private set; }

    public PaginatedList(IQueryable<T> source, int pageIndex, int pageSize)
    {
        PageIndex = pageIndex;
        PageSize = pageSize;
        TotalCount = source.Count();
        TotalPages = (int)Math.Ceiling(TotalCount / (double)PageSize);
        this.AddRange(source.Skip(PageIndex * PageSize).Take(PageSize));
    }

    public bool HasPreviousPage
    {
        get { return PageIndex > 0; }
    }

    public bool HasNextPage
    {
        get { return PageIndex + 1 < TotalPages; }
    }
}
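Hooked up to the repository from the question, the usage could look roughly like this. A sketch only: it assumes GetCharactersByRank is changed to return a composable IQueryable<Character>, and the Rank action, page parameter and charactersRepository field are illustrative names:

// Repository: return the query, not the materialized results.
public IQueryable<Character> GetCharactersByRank()
{
    return from c in db.Characters
           orderby c.Kill descending
           select new Character
           {
               CharID = c.CharID,
               CharName = c.CharName,
               Level = c.Level
           };
}

// Controller: PaginatedList runs Count() once and the paged query once.
// page is zero-based here, matching PaginatedList's Skip(PageIndex * PageSize).
public ActionResult Rank(int page)
{
    var characters = new PaginatedList<Character>(
        charactersRepository.GetCharactersByRank(), page, this.PageSize);
    return View(characters);
}

Because the source is an IQueryable<Character>, Count() translates to a SELECT COUNT(*) and Skip/Take to a single paged query, so only one page of rows is ever materialized.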
I found this on the NerdDinner site that Marko mentioned above and it answered a lot of my questions.
From NerdDinner on the bottom of page 8:
IQueryable is a very powerful feature that enables a variety of interesting deferred execution scenarios (like paging and composition based queries). As with all powerful features, you want to be careful with how you use it and make sure it is not abused.
It is important to recognize that returning an IQueryable result from your repository enables calling code to append on chained operator methods to it, and so participate in the ultimate query execution. If you do not want to provide calling code this ability, then you should return back IList or IEnumerable results - which contain the results of a query that has already executed.
For pagination scenarios this would require you to push the actual data pagination logic into the repository method being called. In this scenario we might update our FindUpcomingDinners() finder method to have a signature that either returned a PaginatedList:
PaginatedList<Dinner> FindUpcomingDinners(int pageIndex, int pageSize) { }
Or return back an IList<Dinner>, and use a "totalCount" out param to return the total count of Dinners:
IList<Dinner> FindUpcomingDinners(int pageIndex, int pageSize, out int totalCount) { }
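Applied to the repository in the question, the out-parameter flavour might look like this (a sketch; Count() runs as its own SELECT COUNT(*) before the paged query, and pageIndex is 1-based as in the original code):

public Character[] GetCharactersByRank(int pageIndex, int pageSize, out int totalCount)
{
    IQueryable<Character> characters =
        from c in db.Characters
        orderby c.Kill descending
        select new Character
        {
            CharID = c.CharID,
            CharName = c.CharName,
            Level = c.Level
        };

    totalCount = characters.Count();          // separate SELECT COUNT(*) query
    return characters
        .Skip((pageIndex - 1) * pageSize)     // Skip(0) for the first page, so no ternary needed
        .Take(pageSize)
        .ToArray();                           // single paged query
}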

problem with asp.net gridview

I have a problem with GridView deleting. I have a table named Doctor with Id, Name, Address, Phone. Id is an auto-generated field. After adding data I display it in a GridView; if I delete a row from the GridView and then add a new doctor from the form, the Id continues from the next number. I mean, if I delete the last Id (5) and then add a new doctor, it takes Id 6, not 5. What I want is for it to start again from 5. Here is my code. Please help me.
public class Doctor
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Address { get; set; }
    public string Phone { get; set; }
}

public static class DoctorDataLayer
{
    public static void AddDoctor(Doctor doctor)
    {
        string connectionString = ConfigurationManager.ConnectionStrings["ConnectionString"].ConnectionString; // JohannesH: Changed from .ToString() to .ConnectionString
        using (var connection = new SqlConnection(connectionString))
        {
            using (var command = new SqlCommand("insert into doctor values(@name, @address, @phone)", connection))
            {
                command.Parameters.AddWithValue("@name", doctor.Name);
                command.Parameters.AddWithValue("@address", doctor.Address);
                command.Parameters.AddWithValue("@phone", doctor.Phone);
                connection.Open();
                command.ExecuteNonQuery();
                connection.Close();
            }
        }
    }
}

public static class DoctorBusinessLayer
{
    public static void CreateDoctor(string name, string address, string phone)
    {
        DoctorDataLayer.AddDoctor(new Doctor { Name = name, Address = address, Phone = phone });
    }
}
This is perfectly normal database behaviour and has nothing to do with your GridView. If you have an issue with gaps in autogenerated (identity) columns, either use your own logic to generate unique IDs or use custom SQL scripts to check for gaps in identity values and fill those gaps.
Example B in the Transact-SQL reference shows a way to do just this.
So the Id is created by the database (autonumber). When id 5 is used it's used up. This is normal behavior.
As others have noted, if this is an autogenerated ID from the DB, then once it is used it will not be regenerated; each ID is unique regardless of whether the data still exists. If IDs were recycled, you could run into issues with foreign references that pointed to the old item with that ID and would now point to a new, different record with the reused ID.
Typically you don't expose the IDs to the user anyway, so it is a non-issue.
You shouldn't depend on autogenerated ids sequences being ordered or not having gaps. As others have noted, the behavior you are seeing is perfectly normal behavior for an autogenerated id and to make it otherwise you'll need to jump through a lot of hoops. If you need the ids to be ordered by the insertion sequence, you should put in an autogenerated date/time field and then select the data ordered by that field (and index it). That way if you ever decide to switch from a numeric id to a GUID or some other id format in which the sort order is different than the insertion order your data will still be ordered correctly. If you need to have a "place order" for each, generate that automatically (say a rownumber) as you are selecting ordered by date. That way you will still have strict numerical ordering even if records get deleted later.
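Following that advice, a display-only position number can be generated at query time rather than stored. A sketch only: it assumes a LINQ to SQL (or EF) data context db with a Doctors table, a CreatedDate column added as suggested, and a hypothetical gridDoctors GridView:

var doctors = db.Doctors
    .OrderBy(d => d.CreatedDate)      // stable insertion order, independent of Id gaps
    .AsEnumerable()                   // the indexed Select below can't be translated to SQL
    .Select((d, i) => new
    {
        Position = i + 1,             // gap-free display number
        d.Id,
        d.Name,
        d.Address,
        d.Phone
    })
    .ToList();

gridDoctors.DataSource = doctors;
gridDoctors.DataBind();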

Resources