.NET object-to-object declarative mapping with the rules in a file - .net-core

I need to map objects from a similar domain but coming from different systems with different internal representations.
I would prefer to keep the mapping rules in an external file and be able to update them without recompiling the mapping tool.
Internally it's XML-to-JSON mapping, so I was thinking about XSLT. But I'm not sure it's a good choice, mostly because not many developers can read and maintain it easily nowadays.
I could generate C# domain objects for both systems and use AutoMapper, but that means a lot of C# code, and every change would require a new build and release.
I found the following: https://github.com/camous/acmemapper
It's close to what I need, but it's too JSON-centric.
To give an example, I would need to convert this:
<Statement name="SendEmail" continueOnFail="false">
  <Parameter xsi:type="StringParameter" name="To" value="someone@fake.com" />
  <Parameter xsi:type="StringParameter" name="Subject" value="test subj" />
  <Parameter xsi:type="MultiLineStringParameter" name="Body" value="test body" />
</Statement>
into this:
{
  "workflowStepType": 0,
  "actionType": 9,
  "parameters": {
    "recipients": [
      "someone@fake.com"
    ],
    "subject": "test subj",
    "body": "test body",
    "variables": []
  },
  "outcome": false,
  "id": 1674239989193,
  "displayName": "Send Email"
}
I was not able to find a library that allows keeping the mapping rules in an external file and isn't as hard to read as XSLT.
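For illustration, here is a minimal sketch of the kind of rule-driven mapping I have in mind. The rules file format is purely hypothetical (not an existing library): each rule maps an XPath expression over the source XML to a dotted JSON property path, so a rule change only means editing the file, not rebuilding the tool.

// Hypothetical rules file (mapping.rules.json):
// {
//   "rules": [
//     { "xpath": "Parameter[@name='Subject']/@value", "target": "parameters.subject" },
//     { "xpath": "Parameter[@name='Body']/@value",    "target": "parameters.body" }
//   ]
// }
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Xml.Linq;
using System.Xml.XPath;
using Newtonsoft.Json.Linq;

public static class RuleMapper
{
    // Applies the XPath -> JSON-path rules from an external file to one source element.
    public static JObject Map(XElement source, string rulesFile)
    {
        var rules = JObject.Parse(File.ReadAllText(rulesFile));
        var output = new JObject();

        foreach (var rule in rules["rules"])
        {
            // Evaluate the XPath; attributes and elements are reduced to their text value.
            var result = source.XPathEvaluate((string)rule["xpath"]);
            var text = result is IEnumerable<object> nodes
                ? string.Concat(nodes.Select(n => n is XAttribute a ? a.Value : ((XElement)n).Value))
                : result?.ToString();

            // Write the value at the dotted target path, creating parent objects as needed.
            var path = ((string)rule["target"]).Split('.');
            var current = output;
            for (var i = 0; i < path.Length - 1; i++)
            {
                if (current[path[i]] is not JObject next)
                    current[path[i]] = next = new JObject();
                current = next;
            }
            current[path[path.Length - 1]] = text;
        }
        return output;
    }
}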

Related

Windchill REST API endpoint to fill BOM from file

We are developing an internal project that uses the Windchill OData REST API to fill the eBOM for a given part. The idea is to read the BOM data from another piece of software we have and send it to the part in Windchill, but we cannot find an endpoint in servlet/odata to do it.
We assume the idea is to replicate the manual process. We already know how to create, check out and check in a part, but we still cannot find an endpoint to modify the part and add the eBOM.
We know about PartList, PartListItem and GetPartStructure in the PTC Product Management domain, but these are GET endpoints and only useful for retrieving data, including the BOM; we cannot use them to modify the content.
I've found the solution.
The endpoint to use is:
POST /ProdMgmt/Parts('VR:wt.part.WTPart:xxxxxxxxx')/Uses
The body of the request must contain:
{
  "Quantity": 1,
  "Unit": {
    "Value": "ea",
    "Display": "Each"
  },
  "TraceCode": {
    "Value": "0",
    "Display": "Untraced"
  },
  "Uses@odata.bind": "Parts('OR:wt.part.WTPart:yyyyyyyyy')"
}
Here Uses@odata.bind contains the ID of the part we want to link.
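As a rough usage sketch (the server URL, credentials and both part IDs are placeholders, and depending on your Windchill version you may also have to obtain and send a CSRF nonce header before modifying requests):

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class AddBomLink
{
    static async Task Main()
    {
        var client = new HttpClient
        {
            BaseAddress = new Uri("https://windchill.example.com/Windchill/servlet/odata/")
        };
        var credentials = Convert.ToBase64String(Encoding.ASCII.GetBytes("user:password"));
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", credentials);

        var body = @"{
            ""Quantity"": 1,
            ""Unit"": { ""Value"": ""ea"", ""Display"": ""Each"" },
            ""TraceCode"": { ""Value"": ""0"", ""Display"": ""Untraced"" },
            ""Uses@odata.bind"": ""Parts('OR:wt.part.WTPart:yyyyyyyyy')""
        }";

        // POST the Uses link to the parent part; the body binds the child part.
        var response = await client.PostAsync(
            "ProdMgmt/Parts('VR:wt.part.WTPart:xxxxxxxxx')/Uses",
            new StringContent(body, Encoding.UTF8, "application/json"));

        Console.WriteLine(response.StatusCode);
    }
}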

IPFS URI format: https://ipfs.io/ipfs/<CID> vs. ipfs://<CID>?

Here is my test tokenURI.json file with the imageURI I pass to my token contract's setTokenURI():
{
  "attributes": [
    {
      "trait_type": "location",
      "value": "West Awesomeville"
    },
    {
      "display_type": "date",
      "trait_type": "created",
      "value": 1535250800
    }
  ],
  "description": "My awesome NFT.",
  "image": "https://ipfs.io/ipfs/QmaUXii41ESnUMxLJUoVcrEeXowz7RHcdTiumvrBmUvcwG?filename=test4.png",
  "name": "NFT 1"
}
Which is the best IPFS URI form to use, especially if I want to load this NFT into Opensea?
The docs in IPFS recommend:
https://ipfs.io/ipfs/<CID>
but the docs in Opensea recommend:
ipfs://<CID>
Which form is better and why?
In the above JSON I'm using the first form, recommended by IPFS. It works, but loading into Opensea is slow and somewhat unpredictable.
The form Opensea recommends is shorter, with no gateway. Would the image load faster in Opensea if I used the second form?
IPFS docs: Address IPFS on the Web
Opensea docs:
If you use IPFS to host your metadata, your URL should be in the format ipfs://CID. For example, ipfs://QmTy8w65yBXgyfG2ZBg5TrfB2hPjrDQH3RCQFJGkARStJb.
The ipfs:// URL is the better way, because gateways can go down. Keep in mind that the IPFS pinner you're using (pinata.cloud?) can also go down, or you can stop paying them and they will make your content disappear.
Opensea is not likely to care: as long as they can find your metadata and images from the URI returned by the contract, they will list your item, and there is a way to trigger a metadata refresh (if you do a reveal).
If I may also suggest, it is probably a good idea to include a way to update the baseURI in the contract, just in case.
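If a client or marketplace can't resolve ipfs:// natively, converting the URI to whichever gateway you prefer is trivial; a rough helper (the gateway host is just an example):

using System;

static class IpfsUri
{
    // Turns "ipfs://<CID>[/path]" into a URL on a gateway of your choice.
    public static string ToGatewayUrl(string uri, string gateway = "https://ipfs.io/ipfs/")
    {
        const string scheme = "ipfs://";
        return uri.StartsWith(scheme, StringComparison.OrdinalIgnoreCase)
            ? gateway + uri.Substring(scheme.Length)
            : uri; // already an http(s) URL, leave as-is
    }
}

// IpfsUri.ToGatewayUrl("ipfs://QmTy8w65yBXgyfG2ZBg5TrfB2hPjrDQH3RCQFJGkARStJb")
// -> "https://ipfs.io/ipfs/QmTy8w65yBXgyfG2ZBg5TrfB2hPjrDQH3RCQFJGkARStJb"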

Using webhooks with Google Analytics

I'm trying to integrate my CRM with Google Analytics to monitor lead changes (from lead to sale) and so on. As I understand it, I need to use the Google Measurement Protocol: receive webhooks from the CRM and translate them into Analytics conversions.
But in fact, I don't really understand how to do it. I need to write some script to translate the webhook payload for Analytics, but where do I place that script? Are there any templates, and so on?
So, if you know of any tutorials/courses/freelancers that could help me with integrating webhooks with Analytics, I'd appreciate your advice.
Example of webhook from CRM:
{
  "leads": {
    "status": {
      "id": "25399013",
      "name": "Lead title",
      "old_status_id": "7039101",
      "status_id": "142",
      "price": "0",
      "responsible_user_id": "102525",
      "last_modified": "1413554372",
      "modified_user_id": "102525",
      "created_user_id": "102525",
      "date_create": "1413554349",
      "account_id": "7039099",
      "custom_fields": [
        {
          "id": "427183",
          "name": "Checkbox custom field",
          "values": ["1"]
        },
        {
          "id": "427271",
          "name": "Date custom field",
          "values": ["1412380800"]
        },
        {
          "id": "1069602",
          "name": "Checkbox custom field",
          "values": ["0"]
        },
        {
          "id": "427661",
          "name": "Text custom field",
          "values": ["Валера"]
        },
        {
          "id": "1075272",
          "name": "Date custom field",
          "values": ["1413331200"]
        }
      ]
    }
  }
}
"Webhook" is a fancy way of saying that your CRM can call a web based service whenever something interesting happens (i.e. the CRM can "hook" into a web based application). E.g. if a new lead is created you can call an url with the lead details as parameters.
Specifics depend on your CRM, but when you set up a webhook there should be a field to set a url; the script that evaluates the CRM data is located at the URL.
You have that big JSON thing as your example - No real way to tell without knowing your system, but I assume that is sent as request body. So in your script you evaluate the request body, extract the parameters you want to send to analytics (be mindful that you are not allowed to store personally identifiable information) and sent it via the measurement protocol as described in the documentation linked in the other answer.
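A minimal sketch of such a script, assuming an ASP.NET Core minimal API and the field names from the example payload above (the route and which fields you extract are entirely up to you):

using System.Text.Json;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;

var app = WebApplication.CreateBuilder(args).Build();

// The CRM is configured to call this URL whenever a lead changes.
app.MapPost("/crm-webhook", async (HttpRequest request) =>
{
    using var doc = await JsonDocument.ParseAsync(request.Body);
    var status = doc.RootElement.GetProperty("leads").GetProperty("status");

    var leadId = status.GetProperty("id").GetString();
    var statusId = status.GetProperty("status_id").GetString();
    var price = status.GetProperty("price").GetString();

    // ...turn these values into a Measurement Protocol hit here (see the other answer)...
    return Results.Ok();
});

app.Run();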
Depending on the system, you might even be able to call the Measurement Protocol without a custom script in between (after all, a Measurement Protocol hit is just a URL with a few parameters).
This is an awfully generic answer, but then the question is really broad.
I've done just this in my line of work.
You first need to decide on your data model: how you would like the CRM data to look within Google Analytics. This could be as simple as mapping Google Analytics' event category, event label and event action to your data, or perhaps using custom dimensions and metrics.
Then, to make it most useful, you want to be able to link a customer's CRM activity to their online activity. You can do this if they log in online: in that case, you can set the cid and/or uid of the user to your CRM id.
Then, if you send in a GA hit with the same cid/uid in your Measurement Protocol hit, you will link the online sessions with your offline CRM activity.
To actually record the hit in Google Analytics, you will need to program something that takes the CRM data and turns it into a Measurement Protocol hit, which is essentially just a URL with the correct parameters. Look here for reference: https://developers.google.com/analytics/devguides/collection/protocol/v1/reference
An example could be: http://www.google-analytics.com/collect?v=1&tid=UA-123456-1&cid=5555&t=pageview&dp=%2FpageA
We usually run this as a separate process that fires when the CRM data is written to its database (the webhook in your example). If it's a lot of data, you should probably implement checks to see whether each hit was successful, plus caching in case the service is not online - the optional queue time parameter gives you four hours of leeway in sending data.
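A rough sketch of that piece; the tid is a placeholder for your own property, and cid is whatever client id you stored against the CRM record:

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

static class MeasurementProtocol
{
    static readonly HttpClient Http = new HttpClient();

    // Sends one event hit; values are URL-encoded so labels can contain spaces etc.
    public static Task SendEventAsync(string cid, string category, string action, string label)
    {
        var payload = $"v=1&tid=UA-123456-1&cid={Uri.EscapeDataString(cid)}" +
                      $"&t=event&ec={Uri.EscapeDataString(category)}" +
                      $"&ea={Uri.EscapeDataString(action)}&el={Uri.EscapeDataString(label)}";
        return Http.PostAsync("https://www.google-analytics.com/collect",
            new StringContent(payload, Encoding.UTF8, "application/x-www-form-urlencoded"));
    }
}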
Hope this gets you at least started.

Usergrid: How to delete add-on metadata from Usergrid entities

I created an entity in Usergrid, but I find that Usergrid tacks additional data onto the JSON that I really don't want appearing in the API layer. For example, here is my entity:
{
  "uuid": "7cd5c98a-7b16-11e4-9085-b5397738dcd5",
  "type": "summaries",
  "created": 1417629724184,
  "modified": 1417629993800,
  "accountId": "123123",
  "accounts": [
    {
      "id": "123123",
      "type": "Individual",
      "category": "Prepaid",
The fields uuid/type/created/modified are not what I want to pull, although Usergrid tacks them on. I can write logic on the receiving side that parses this out, but we don't want to put any kind of business logic in the proxy. How can I suppress this behaviour?
Unfortunately Usergrid isn't really suited to an open API, and you should put it behind a management layer like Apigee Edge. Log into your Apigee account and click on Create and Manage APIs.
There you can manipulate the JSON payload by extracting blocks or individual elements (like below, where I grab either all accounts or just the ID of the first account in the list):
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<ExtractVariables async="false" continueOnError="false" enabled="true" name="Extract-Account-Response">
  <DisplayName>Extract Account Response</DisplayName>
  <FaultRules/>
  <Properties/>
  <IgnoreUnresolvedVariables>true</IgnoreUnresolvedVariables>
  <JSONPayload>
    <Variable name="allComments">
      <JSONPath>$.accounts</JSONPath>
    </Variable>
    <Variable name="account0">
      <JSONPath>$.accounts[0].id</JSONPath>
    </Variable>
  </JSONPayload>
  <Source clearPayload="false">commentResponse</Source>
</ExtractVariables>
If you're not using Apigee you'll still need to put some kind of programmatic facade in front of Usergrid to manipulate the responses.
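If that facade is your own code, stripping the metadata is straightforward; a rough sketch with Newtonsoft.Json, using the field names from the entity above:

using Newtonsoft.Json.Linq;

static class UsergridResponseCleaner
{
    // Removes the metadata Usergrid adds to every entity before passing the JSON on.
    public static string StripMetadata(string usergridJson)
    {
        var entity = JObject.Parse(usergridJson);
        foreach (var field in new[] { "uuid", "type", "created", "modified" })
            entity.Remove(field);
        return entity.ToString();
    }
}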

Is it possible to use computed keys with KeyValueMaps?

I would like to use KeyValueMaps to store some simple values, but the keys I need to use would be computed at runtime. For example, in my InitialEntries I want to do something like this:
<KeyValueMapOperations async="false" continueOnError="false" enabled="true" name="Sandbox-Read-Count">
  <DisplayName>Sandbox - Read Count</DisplayName>
  <FaultRules/>
  <Properties/>
  <ExclusiveCache>false</ExclusiveCache>
  <ExpiryTimeInSecs>-1</ExpiryTimeInSecs>
  <InitialEntries>
    <Entry>
      <Key>
        <Parameter>{variable}.sandbox.calls</Parameter>
      </Key>
      <Value>0</Value>
    </Entry>
  </InitialEntries>
  <Scope>apiproxy</Scope>
</KeyValueMapOperations>
However, when doing this I get an error when I try to save the policy:
Error while Uploading file for API Test.
messaging.config.beans.InvalidBundle. Errors:[Entity : policy-Sandbox-Read-Count, Invalid Key Names For Entries: [{apikey}.sandbox.calls];]
Is it possible to use computed values in the KeyValueMap policy? Is there a different syntax that I should be using?
I've investigated this. What happens is that when you save a proxy with InitialEntries in an apiproxy-scoped KVM, the KVM is immediately created with those initial entries. Therefore there is no way to use runtime variables, because the priming of the KVM happens before the proxy ever runs.
You didn't use the mapIdentifier field in your KeyValueMapOperations element (look at the KeyValueMap PUT sample in the Apigee docs), so the KVM you would create would be named kvmap.
You can use the following management API call to get a list of the KVMs and their contents for a given apiproxy:
GET https://api.enterprise.apigee.com/v1/o/{org}/apis/{apiname}/keyvaluemaps?expand=true
Authorization: Basic {base64 username:password}
Since the InitialEntries section is only used when the proxy is first loaded successfully (even if you change the InitialEntries section and redeploy, no changes are made if a KVM of that name already exists), I think its usefulness is rather limited. I'd recommend priming your KVMs manually, using the management API to initialize the KVM:
PUT https://api.enterprise.apigee.com/v1/o/{org}/apis/{apiname}/keyvaluemaps
Authorization: Basic {base64 username:password}
Content-Type: application/json

{
  "entry" : [ {
    "name" : "key",
    "value" : "0"
  } ],
  "name" : "{kvmName}"
}
