DevExpress XAF: changing the Oid key name in a many-to-many relationship

I want to set the Oid key name in a many-to-many relationship. The Oid of the intermediate table is created automatically, but on the database side I want to change its name to a custom one.
For example, when I create a many-to-many relation between Makine and KomutTanim, the intermediate table has the attributes below:
KomutTanim (FK to KomutTanim)
Makine (FK to Makine)
OID (PK, guid) (this is the key name I want to set)
How can I do this? I added sample code below.
[Association("Relation.KomutListesi_Makine",typeof(KomutTanim),UseAssociationNameAsIntermediateTableName = true),XafDisplayName("Makine Komutları")]
public XPCollection<KomutTanim> Komutlar
{
get
{
return GetCollection<KomutTanim>(nameof(Komutlar));
}
}
[Association("Relation.KomutListesi_Makine", typeof(Makine), UseAssociationNameAsIntermediateTableName = true), XafDisplayName("Makineler")]
public XPCollection<Makine> MasterId
{
get
{
return GetCollection<Makine>(nameof(MasterId));
}
}

You can customize XPO metadata or manually create a persistent class for your intermediate table. Both approaches are illustrated in the How to implement a many-to-many relationship with an intermediate table ticket.
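For reference, a rough sketch of the manual approach, assuming an explicit intermediate class is acceptable; the class and column names here are illustrative, not taken from the ticket:

using System;
using DevExpress.Xpo;

// Hypothetical explicit intermediate class, mapped to the same table name
// the association would otherwise generate automatically.
[Persistent("Relation.KomutListesi_Makine")]
public class KomutTanimMakine : XPBaseObject
{
    public KomutTanimMakine(Session session) : base(session) { }

    Guid oid;
    // Declaring the key yourself lets you pick its column name.
    [Key(true), Persistent("MyName")]
    public Guid Oid
    {
        get { return oid; }
        set { SetPropertyValue(nameof(Oid), ref oid, value); }
    }

    KomutTanim komutTanim;
    [Association("KomutTanimMakine-KomutTanim")]
    public KomutTanim KomutTanim
    {
        get { return komutTanim; }
        set { SetPropertyValue(nameof(KomutTanim), ref komutTanim, value); }
    }

    Makine makine;
    [Association("KomutTanimMakine-Makine")]
    public Makine Makine
    {
        get { return makine; }
        set { SetPropertyValue(nameof(Makine), ref makine, value); }
    }
}

With an explicit class like this, the many-to-many becomes two one-to-many associations, and Makine and KomutTanim expose collections of KomutTanimMakine objects instead of the implicit many-to-many collections.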
The solution based on customizing XPO metadata uses XAF APIs to access an XPClassInfo instance via the XPDictionary property. You can also access XPDictionary using only XPO methods, as illustrated in How to get an XPClassInfo instance, or manually create a ReflectionDictionary instance (ReflectionDictionary is an XPDictionary descendant), as described in the How to create persistent metadata on the fly and load data from an arbitrary table article.
// Build the metadata dictionary and locate the auto-generated intermediate
// class behind the many-to-many collection.
XPDictionary dictionary = new ReflectionDictionary();
XPClassInfo intermediateClassInfo = dictionary.GetClassInfo(typeof(KomutTanim)).FindMember(nameof(KomutTanim.MasterId)).IntermediateClass;
// Map the intermediate class's "Oid" member to a custom column name.
intermediateClassInfo.FindMember("Oid").AddAttribute(new PersistentAttribute("MyName"));
// Create the data layer from the customized dictionary before any data access occurs.
string conn = "My connection string";
IDataStore store = XpoDefault.GetConnectionProvider(conn, AutoCreateOption.SchemaAlreadyExists);
IDataLayer dl = new SimpleDataLayer(dictionary, store);
XpoDefault.DataLayer = dl;

Related

Handling reads of Cosmos DB container with multiple types?

I'd like to store several different object types in a single Cosmos DB container, as they are all logically grouped and make sense to read together by timestamp to avoid extra HTTP calls.
However, the Cosmos DB client API doesn't seem to provide an easy way of doing the reads with multiple types. The best solution I've found so far is to write your own CosmosSerializer and JsonConverter, but that feels clunky: https://thomaslevesque.com/2019/10/15/handling-type-hierarchies-in-cosmos-db-part-2/
Is there a more graceful way to read items of different types to a shared base class so I can cast them later, or do I have to take the hit?
Thanks!
The way I do this is to create the ItemQueryIterator and FeedResponse objects as dynamic and initially read them untyped so I can inspect a "type" property that tells me what type of object to deserialize into.
In this example I have a single container that contains both my customer data as well as all their sales orders. The code looks like this.
string sql = "SELECT * from c WHERE c.customerId = @customerId";

FeedIterator<dynamic> resultSet = container.GetItemQueryIterator<dynamic>(
    new QueryDefinition(sql)
        .WithParameter("@customerId", customerId),
    requestOptions: new QueryRequestOptions
    {
        PartitionKey = new PartitionKey(customerId)
    });

CustomerV4 customer = new CustomerV4();
List<SalesOrder> orders = new List<SalesOrder>();

while (resultSet.HasMoreResults)
{
    // Dynamic response: deserialize into POCOs based on the "type" property.
    FeedResponse<dynamic> response = await resultSet.ReadNextAsync();

    foreach (var item in response)
    {
        if (item.type == "customer")
        {
            customer = JsonConvert.DeserializeObject<CustomerV4>(item.ToString());
        }
        else if (item.type == "salesOrder")
        {
            orders.Add(JsonConvert.DeserializeObject<SalesOrder>(item.ToString()));
        }
    }
}
Update:
You do not have to use dynamic types if you would rather create a "base document" class and derive your POCOs from it. Deserialize into the base document class, then check its type property to decide which class to deserialize the full payload into.
You can also extend this pattern when you evolve your data models over time by adding a docVersion property.
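A rough sketch of that base-class variant, assuming Json.NET; the DocumentBase shape and the two-pass helper are illustrative, not from the original answer:

using Newtonsoft.Json;

// Hypothetical base class carrying the discriminator properties.
public class DocumentBase
{
    [JsonProperty("type")]
    public string Type { get; set; }

    [JsonProperty("docVersion")]
    public int DocVersion { get; set; }
}

public class CustomerV4 : DocumentBase { /* customer properties */ }
public class SalesOrder : DocumentBase { /* order properties */ }

public static class DocumentReader
{
    // First pass reads only the discriminator; the second pass
    // deserializes the full payload into the concrete type.
    public static DocumentBase Deserialize(string json)
    {
        var probe = JsonConvert.DeserializeObject<DocumentBase>(json);
        switch (probe.Type)
        {
            case "customer": return JsonConvert.DeserializeObject<CustomerV4>(json);
            case "salesOrder": return JsonConvert.DeserializeObject<SalesOrder>(json);
            default: throw new JsonException("Unknown type: " + probe.Type);
        }
    }
}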

How to migrate dynamodb data on major table change?

During development, structures and requirements change. Key and index settings may need to change, which can break an incremental table update. My solution so far is to delete the table and recreate it from the CloudFormation stack.
But how do you solve this problem in a production deployment? Is it possible to automate a DynamoDB deployment as follows?
Create new table
Migrate data from old table to new table
Delete old table
Yes, it is perfectly possible to automate such a deployment. As long as you have code to create a table, it should be fairly straightforward to read all of the data from the old table, change it, and upload it all to a new table without any drop in up-time. If you say which language you would like to do this in, I can help a bit more.
I've done this before and I've added below a small generified code-sample on how you could do this in Java.
Java method for creating a table given the class of the object type stored in dynamo:
/**
 * Creates a single table with its appropriate configuration (CreateTableRequest).
 */
public void createTable(Class tableClass) {
    DynamoDBMapper mapper = createMapper(); // you'll need your own function to do this
    ProvisionedThroughput pt = new ProvisionedThroughput(1L, 1L);
    CreateTableRequest ctr = mapper.generateCreateTableRequest(tableClass);
    ctr.withProvisionedThroughput(pt);
    // Provision throughput and configure projection for secondary indexes.
    if (ctr.getGlobalSecondaryIndexes() != null) {
        for (GlobalSecondaryIndex idx : ctr.getGlobalSecondaryIndexes()) {
            if (idx != null) {
                idx.withProvisionedThroughput(pt).withProjection(new Projection().withProjectionType("ALL"));
            }
        }
    }
    TableUtils.createTableIfNotExists(client, ctr);
}
Java method to delete table:
private static void deleteTable(String tableName) {
    AmazonDynamoDB client = AmazonDynamoDBClientBuilder.standard().build();
    DynamoDB dynamoDB = new DynamoDB(client);
    Table table = dynamoDB.getTable(tableName);
    try {
        System.out.println("Issuing DeleteTable request for " + tableName);
        table.delete();
        System.out.println("Waiting for " + tableName + " to be deleted...this may take a while...");
        table.waitForDelete();
    } catch (Exception e) {
        System.err.println("DeleteTable request failed for " + tableName);
        System.err.println(e.getMessage());
    }
}
I would scan the whole table, collect all of the content into a list, map over that list converting each object into your new type, create a new table of that type with a different name, push all of the new objects, and then delete the old table after switching any references from the old table to the new one (a sketch of the copy step follows below). Unfortunately, this does mean that everything consuming your tables needs to be able to switch between the two staging tables.
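For illustration, here is a rough sketch of that copy step using the AWS SDK for .NET rather than Java; the table names and the Transform helper are assumptions:

using System.Collections.Generic;
using System.Threading.Tasks;
using Amazon.DynamoDBv2;
using Amazon.DynamoDBv2.Model;

class TableMigrator
{
    // Copies every item from oldTable to newTable, transforming along the way.
    static async Task MigrateAsync(IAmazonDynamoDB client, string oldTable, string newTable)
    {
        Dictionary<string, AttributeValue> lastKey = null;
        do
        {
            // Paginate through the old table one scan page at a time.
            var page = await client.ScanAsync(new ScanRequest
            {
                TableName = oldTable,
                ExclusiveStartKey = lastKey
            });

            foreach (var item in page.Items)
            {
                var transformed = Transform(item); // your schema change goes here
                await client.PutItemAsync(new PutItemRequest
                {
                    TableName = newTable,
                    Item = transformed
                });
            }
            lastKey = page.LastEvaluatedKey;
        } while (lastKey != null && lastKey.Count > 0);
    }

    // Hypothetical transform: adjust keys/attributes to the new schema.
    static Dictionary<string, AttributeValue> Transform(Dictionary<string, AttributeValue> item) => item;
}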

How To Use The RelatedEntities Collection To Create RelatedEntities

I'm attempting to follow the logic described here: https://msdn.microsoft.com/en-us/library/gg309282.aspx for creating associated entities using the RelatedEntities property. The problem is, no associated entities are getting created. I'm attempting to perform this action from within a Pre-Operation plugin... Is it not supported within a Pre-Operation plugin? What am I doing wrong?
Here is the code:
var collection = new EntityCollection();
collection.Entities.AddRange(incentives.Select(e => e.ToSdkEntity()));
target.RelatedEntities.Add(new Relationship(new_LeadProduct.Fields.new_lead_new_leadproduct_LeadId), collection);
Since a pre-create plugin executes before the target entity has been created in the database, you will not be able to create related entities referencing the target. You should execute related-entity logic in a post-create plugin.
Edit:
This answer applies if you are trying to create related records associated with the Target in a plugin operation. Your question did not specify otherwise, but based on the code in your answer it looks like this is not what you are trying to do.
Here is the code from the MSDN Example:
// Define the account for which we will add letters.
Account accountToCreate = new Account
{
    Name = "Example Account"
};

// Define the IDs of the related letters we will create.
_letterIds = new[] { Guid.NewGuid(), Guid.NewGuid(), Guid.NewGuid() };

// This acts as a container for each letter we create. Note that we haven't
// defined the relationship between the letter and account yet.
EntityCollection relatedLettersToCreate = new EntityCollection
{
    EntityName = Letter.EntityLogicalName,
    Entities =
    {
        new Letter { Subject = "Letter 1", ActivityId = _letterIds[0] },
        new Letter { Subject = "Letter 2", ActivityId = _letterIds[1] },
        new Letter { Subject = "Letter 3", ActivityId = _letterIds[2] }
    }
};

// Create a reference to the relationship between Letter and Account
// that we would like to use.
Relationship letterRelationship = new Relationship("Account_Letters");

// Add the letters to the account under the specified relationship.
accountToCreate.RelatedEntities.Add(letterRelationship, relatedLettersToCreate);

// Pass the Account (which contains the letters) to Create.
_accountId = _service.Create(accountToCreate);
After some additional testing: the RelatedEntities collection must be populated before the PreOperation stage, so registering the plugin to run on PreValidation works as expected (see the sketch below).
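A rough sketch of what that PreValidation plugin might look like; the relationship name is taken from the question, while the child entity name and everything else here is illustrative:

using System;
using Microsoft.Xrm.Sdk;

// Hypothetical plugin registered on the PreValidation stage of the lead's Create message.
public class AddRelatedEntitiesPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var target = (Entity)context.InputParameters["Target"];

        // Attach the related records to the Target before the main operation
        // runs; the platform then creates the parent and children together.
        var collection = new EntityCollection();
        collection.Entities.Add(new Entity("new_leadproduct")); // illustrative child record
        target.RelatedEntities.Add(new Relationship("new_lead_new_leadproduct_LeadId"), collection);
    }
}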

WCF Transaction with multiple inserts

When creating a user, entries are required in multiple tables. I am trying to create a transaction that creates a new entry in one table and then passes the new entity id into the parent table, and so on. The error I am getting is
The transaction manager has disabled its support for remote/network
transactions. (Exception from HRESULT: 0x8004D024)
I believe this is caused by creating multiple connections within a single TransactionScope, but I am unsure what the best/most efficient way of doing this is.
[OperationBehavior(TransactionScopeRequired = true)]
public int CreateUser(CreateUserData createData)
{
    // Create a new family group and get the ID
    var familyGroupId = createData.FamilyGroupId ?? CreateFamilyGroup();

    // Create the APUser and get the Id
    var apUserId = CreateAPUser(createData.UserId, familyGroupId);

    // Create the institution user and get the Id
    var institutionUserId = CreateInsUser(apUserId, createData.AlternateId, createData.InstitutionId);

    // Create the investigator group user and return the Id
    return AddUserToGroup(createData.InvestigatorGroupId, institutionUserId);
}
This is an example of one of the function calls; all the other ones follow the same format.
public int CreateFamilyGroup()
{
    var familyRepo = _FamilyRepo ?? new FamilyGroupRepository();
    var familyGroup = new FamilyGroup { CreationDate = DateTime.Now };
    return familyRepo.AddFamilyGroup(familyGroup);
}
And the repository call for this is as follows
public int AddFamilyGroup(FamilyGroup familyGroup)
{
    using (var context = new GameDbContext())
    {
        var newGroup = context.FamilyGroups.Add(familyGroup);
        context.SaveChanges();
        return newGroup.FamilyGroupId;
    }
}
I believe this is caused by creating multiple connections within a single TransactionScope
Yes, that is the problem. It does not really matter how you avoid it, as long as you avoid it. A common approach is to have one connection and one EF context per WCF request. You need to find a way to pass that EF context along.
The AddFamilyGroup method illustrates a common anti-pattern with EF: using EF as a CRUD facility. It is supposed to be more like a live object graph connected to the database. The entire WCF request should share the same EF context. If you move in that direction, the problem goes away.
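A rough sketch of that direction, assuming the helper methods can be changed to accept the shared context; the signatures here are illustrative:

[OperationBehavior(TransactionScopeRequired = true)]
public int CreateUser(CreateUserData createData)
{
    // One context (and one connection) for the whole request, so the
    // TransactionScope does not escalate to a distributed transaction.
    using (var context = new GameDbContext())
    {
        var familyGroupId = createData.FamilyGroupId ?? CreateFamilyGroup(context);
        var apUserId = CreateAPUser(context, createData.UserId, familyGroupId);
        var institutionUserId = CreateInsUser(context, apUserId, createData.AlternateId, createData.InstitutionId);
        return AddUserToGroup(context, createData.InvestigatorGroupId, institutionUserId);
    }
}

// Each helper now receives the shared context instead of opening its own connection.
public int CreateFamilyGroup(GameDbContext context)
{
    var familyGroup = new FamilyGroup { CreationDate = DateTime.Now };
    context.FamilyGroups.Add(familyGroup);
    context.SaveChanges();
    return familyGroup.FamilyGroupId;
}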

Assign Business entity to hidden variable

Say, for example, I have a business entity Customer, which has customerId, customerName and customerType. I have created a hidden field hdnCustomer with runat="server".
If I wanted to serialize the value of the Customer business entity (in the code-behind) into hdnCustomer, how would I do that? And once it is serialized, how would I deserialize it?
// Pseudo code
Collection<Customer> customerList = BusinessAccess.GetCustomerList();
hdnCustomer = serialize and assign the value of 'customerList' to hdnCustomer;
...
...
// Later, on a selected-index change of one of the drop-down lists
Inside the event handler for the drop-down list:
{
    Collection<Customer> customerList = deserialize the value from hdnCustomer;
    int a = Convert.ToInt32(ddlDropDown.SelectedValue);
    foreach (Customer customer in customerList)
    {
        // Do something
    }
}
You can serialise to and from XML using XmlSerializer:
http://support.microsoft.com/kb/815813
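A rough sketch of that round trip, assuming hdnCustomer is an asp:HiddenField and Customer is public with a parameterless constructor:

using System.Collections.ObjectModel;
using System.IO;
using System.Xml.Serialization;

// Serialize the list into the hidden field.
var serializer = new XmlSerializer(typeof(Collection<Customer>));
using (var writer = new StringWriter())
{
    serializer.Serialize(writer, customerList);
    hdnCustomer.Value = writer.ToString();
}

// Later (e.g. in the drop-down's SelectedIndexChanged handler), read it back.
Collection<Customer> restored;
using (var reader = new StringReader(hdnCustomer.Value))
{
    restored = (Collection<Customer>)serializer.Deserialize(reader);
}

Keep in mind that the serialized XML travels to the client as plain, human-readable text, which is one reason the ViewState approach below can be preferable.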
However, if you just store the object in the ViewState collection, that should work better:
ViewState["Customer"] = customerList;
It does the same thing: it stores the serialisable object in the page, hidden from the user, but not in a human-readable format.
(edit: To deserialise, just get the value of ViewState["Customer"], checking for null before using it!)
edit 2: a useful link about storing objects in ViewState:
http://www.beansoftware.com/ASP.NET-Tutorials/ViewState-In-ASP.NET.aspx
Hope that helps.
I think .NET already provides some classes to do this; look at this example.
