D365 DIXF - How to overwrite document attachment - axapta

While using DIXF to bulk import product images, I'm stuck at the point where an existing attachment (a product image in this case) has to be overwritten. The process terminates with the error "operation not allowed".
When I first manually delete the existing product image and then restart the import, the process runs without errors.
I have already tried unchecking the "call validate field" option in the entity's "modify target mapping".

Related

Delete Associated Workflows after document has been deleted

When a document is deleted, by default its workflow is left in a hanging state and the document's reference is removed from the workflow side (bpm_package).
I want to change this as follows: if a document has been deleted in the repository, then all workflows associated with it should be deleted too (each workflow package will always contain a single document).
I tried to implement this using a rule/action ("items are deleted or leave this folder"). I was able to find the workflows in JavaScript and cancel them, but that deletes neither the document nor the workflow. On inspecting the XHR request, I found that a concurrency exception occurs between the action and the onDelete policy.
How do I delete/cancel/close the workflows associated with a document?
I'm using Alfresco Community 5.2.
You need to create a behaviour/policy to achieve this.
http://docs.alfresco.com/6.0/references/dev-extension-points-behaviors.html
You can use the beforeDeleteNode/onDeleteNode behaviour and put the workflow-deletion logic there, as in the sketch below.
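A minimal sketch of such a behaviour in Java, assuming a standard repository module. The class name and the binding to cm:content are illustrative, and the Spring bean wiring (injecting policyComponent and workflowService, calling init()) is not shown:

import java.util.List;
import org.alfresco.model.ContentModel;
import org.alfresco.repo.node.NodeServicePolicies;
import org.alfresco.repo.policy.Behaviour;
import org.alfresco.repo.policy.JavaBehaviour;
import org.alfresco.repo.policy.PolicyComponent;
import org.alfresco.service.cmr.repository.NodeRef;
import org.alfresco.service.cmr.workflow.WorkflowInstance;
import org.alfresco.service.cmr.workflow.WorkflowService;

// Hypothetical class name; register it as a Spring bean with init-method="init"
public class DeleteWorkflowOnNodeDeletion implements NodeServicePolicies.BeforeDeleteNodePolicy {

    private PolicyComponent policyComponent;
    private WorkflowService workflowService;

    // Bind the behaviour to deletions of cm:content nodes
    public void init() {
        policyComponent.bindClassBehaviour(
                NodeServicePolicies.BeforeDeleteNodePolicy.QNAME,
                ContentModel.TYPE_CONTENT,
                new JavaBehaviour(this, "beforeDeleteNode",
                        Behaviour.NotificationFrequency.FIRST_EVENT));
    }

    @Override
    public void beforeDeleteNode(NodeRef nodeRef) {
        // Find all active workflows this node participates in and cancel them
        List<WorkflowInstance> workflows = workflowService.getWorkflowsForContent(nodeRef, true);
        for (WorkflowInstance wf : workflows) {
            workflowService.cancelWorkflow(wf.getId());
            // workflowService.deleteWorkflow(wf.getId()) removes it entirely instead
        }
    }

    public void setPolicyComponent(PolicyComponent policyComponent) { this.policyComponent = policyComponent; }
    public void setWorkflowService(WorkflowService workflowService) { this.workflowService = workflowService; }
}

Whether you call cancelWorkflow or deleteWorkflow depends on whether you want the workflow history kept for audit; running the cancellation before the node deletion (beforeDeleteNode) should also avoid the concurrency exception you saw between the rule action and the onDelete policy.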

How to programmatically get events from Sterling File Gateway?

We have Sterling File Gateway with its UI and everything, and we also have Control Center, where we see the file transfers from SFG. I'm trying to find out how I can subscribe to the events from File Gateway (SFG) programmatically. The documentation is not clear on whether there is a way to do this.
The database tables FG_EVENT and FG_EVENTATTR contain the details of File Gateway events.
Example of SQL query:
select *
from fg_event t1, fg_eventattr t2
where t1.event_key = t2.event_key
and event_code = 'FG_0422'
You can add different criteria to the SQL query to filter on filename, type of delivery, date, etc.
Then you can run such queries against the database from any SQL client, or programmatically, as in the sketch below.
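Since the question asks for a programmatic subscription, here is a minimal Java/JDBC polling sketch built on the tables above. The connection URL and credentials are placeholders, and the event_time, attr_name, and attr_value column names are assumptions to verify against your SFG schema:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Timestamp;

public class FgEventPoller {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:db2://sfg-db-host:50000/SFGDB"; // hypothetical connection string
        try (Connection con = DriverManager.getConnection(url, "user", "password")) {
            Timestamp lastSeen = new Timestamp(System.currentTimeMillis());
            String sql = "select t1.event_key, t1.event_code, t2.attr_name, t2.attr_value"
                    + " from fg_event t1, fg_eventattr t2"
                    + " where t1.event_key = t2.event_key and t1.event_time > ?";
            while (true) {
                Timestamp pollStart = new Timestamp(System.currentTimeMillis());
                try (PreparedStatement ps = con.prepareStatement(sql)) {
                    ps.setTimestamp(1, lastSeen);
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            // React to the event here: send a notification, enqueue it, etc.
                            System.out.printf("%s %s %s=%s%n",
                                    rs.getString("event_key"), rs.getString("event_code"),
                                    rs.getString("attr_name"), rs.getString("attr_value"));
                        }
                    }
                }
                lastSeen = pollStart; // naive cursor; a real poller should handle overlaps
                Thread.sleep(60_000); // poll once a minute
            }
        }
    }
}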
Sterling Control Center can monitor the following events:
• Arrived File events - every Sterling File Gateway Arrived File status code is recorded as a successful (FG_0411 - Arrived File Routed) or a failed (FG_0455 - Arrived File Failed) file transfer
• Route events
• Delivery events
More information is available in the IBM Control Center documentation.
There is also another way to do this: invoke business processes from certain events.
Edit the listenerStartup.properties and listenerStartup.properties.in files to include the line:
Listener.Class.xx=com.sterlingcommerce.server1.dmi.visibility.event.XpathBPLauncherEventListener
Where xx is the next available number according to how many listeners are already enabled in the file.
Edit the visibility.properties and visibility.properties.in files to add the necessary information to configure the listener to launch the proper business processes based on the correct events. The pattern for registering the events with the listener is:
bp_event_trigger.X=eventPreFilter,xPathExpression,bpname,userId
There is an example on this page:
https://www.ibm.com/support/knowledgecenter/SS3JSW_5.2.0/com.ibm.help.aft.doc/SI_AFT_InternalEvent.html
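For illustration only, an entry following that pattern might look like the line below; the pre-filter, XPath expression, business process name, and user ID here are all hypothetical and depend on your own events and processes:

bp_event_trigger.1=FG_0455,//Event[EventCode='FG_0455'],NotifyArrivedFileFailedBP,admin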

Simplest possible BAM Scenario

I'm trying to set up a very simple BAM scenario within BizTalk Server 2013 R2 on which to build, tracking just the receive time of all incoming messages processed by a port.
To this end I have:
Within Excel, created an Activity Definition (called SimpleReceiveTest) containing a single Item called ReceiveTime of type milestone (date/time), and a View Definition (also called SimpleReceiveTest) containing just this Activity Definition and Item.
Imported this BAM definition spreadsheet using bm.exe
Added view rights to SimpleReceiveTest, again using bm.exe (example commands for both steps are sketched after this list)
Launched the Tracking Profile Editor, imported the BAM Activity Definition, and mapped ActivityID = MessageID and ReceiveTime = PortStartTime by drag and drop from the Messaging Property Schema.
Set the Port Mappings for MessageID and PortStartTime to the test Receive Port ReceivePort1 that I am using, which uses a pass-through pipeline.
Saved and applied the above Tracking Profile
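For reference, the two bm.exe steps above look roughly like this when run from the BizTalk Tracking folder (the spreadsheet filename and account name are illustrative):

bm.exe deploy-all -DefinitionFile:"SimpleReceiveTest.xlsx"
bm.exe add-account -AccountName:"MYDOMAIN\bamuser" -View:"SimpleReceiveTest"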
It is my understanding that a tracking activity should now be created for any message received on port ReceivePort1. However, this is not happening: there are no records in any of the BAM tables/views and no data is available within the BAM Portal.
I have tried restarting the hosts, and have verified that the TDDS_FailedTrackingData table is empty, that there is nothing relevant in the event log, that a tracking host is running, and that the SQL Agent jobs are running. I have also tried running these jobs manually.
Have I missed something, and am I correct in my expectation that this simple scenario should create tracked activities for any messages passing through the Receive Port? If so what can I try to further diagnose this?
Now fixed: it's actually a bug in vanilla BizTalk 2013 R2 when using a standard pipeline, which has been fixed in CU2.
FIX: BAM tracking doesn’t work when you use the XMLReceive or a custom pipeline in BizTalk Server
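As a follow-up, once tracking works you can check for activity records directly in SQL Server: BAM writes them to the BAMPrimaryImport database using the bam_<ActivityName>_Active and bam_<ActivityName>_Completed naming convention (the database name here assumes a default installation):

USE BAMPrimaryImport
SELECT * FROM dbo.bam_SimpleReceiveTest_Completed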

Can't import LDAP query feed data

I've set up an LDAP query that successfully pulls all data from the physicaldeliveryofficename field of our Windows Active Directory.
I also set up a View that uses the query to further refine the list, so I am confident that the query itself is working.
The problem occurs when I try to use Feeds Importers to grab that data and add it to my Offices content type. Here are my settings:
Basic settings
Name: Offices
Attach to content type: Use standalone form
Periodic import: 1 day
Import on submission: checked
Fetcher
LDAP Query Fetcher is selected
LDAP Query Fetcher
LDAP Query: HPSI Offices (that's the right query)
Parser
LDAP Entry Parser for Feeds is selected
Processor
Node processor is selected
Node processor
Bundle: Office
Language: Language neutral
Insert new nodes: Insert new nodes
Update existing nodes: Update existing nodes
Text format: plain text (also tried HTML)
Action to take when previously imported nodes are missing in the feed: Delete non-existent nodes
Author: Currently using user 1, but also tried anonymous
Authorize: checked
Expire nodes: never
Mapping
Source: [physicaldeliveryofficename]
Target: Title (title)
Used as unique
When I run this feed importer, the only thing that happens is that a single Office node is created with a blank title. Can anyone tell me why this importer isn't working when both the LDAP query and a View that depends on it are working?
UPDATE: I opened an issue against LDAP Feeds at Drupal.org and it appears I'm not the only one with this problem. https://www.drupal.org/node/2685009
Sounds a great deal like the issue reported on the LDAP module's issue queue. Try applying the patch from comment 11.
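If you haven't applied a Drupal patch before, a typical sequence is sketched below; the patch filename is hypothetical, so substitute the actual file attached to comment 11 of that issue:

cd sites/all/modules/ldap
patch -p1 < ldap_feeds-blank-title-2685009-11.patch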

Drupal Content Access Issues

I just recently tried installing the Content Access module on Drupal 5. The module didn't work correctly, and I cannot uninstall it without getting an Internal Server Error.
Since I'm stuck with the module, I now have to try to use it.
Now I'm getting this error when an anonymous user views a page with a specific field_name that is connected to a content type I can't grant full access to:
user warning: Column 'nid' in where clause is ambiguous query: SELECT title FROM node INNER JOIN node_access na ON na.nid = n.nid WHERE (na.grant_view >= 1 AND ((na.gid = 0 AND na.realm = 'all') OR (na.gid = 1 AND na.realm = 'content_access_rid'))) AND ( nid=7626) in /includes/database.mysql.inc on line 174.
The nid=7626 refers to a field_name that is connected to the content type.
When I try and grant access to the content type I get an Internal Server Error.
Now my logs are saying that my database schema is not up to date.
I may have accidentally placed a later version of this module on the server.
Any ideas?
You can manually turn off the module in the system table of your Drupal MySQL database.
Browse the table and you should see the module names and an enabled flag; see the example query below. Remember to run update.php after you turn the module off.
Even if you choose to keep the module, running update.php will apply any module schema changes that have not yet been applied, and thus fix the issue.
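A minimal sketch of that manual switch, assuming the standard Drupal 5 system table and the module's machine name content_access (back up the database first):

UPDATE system SET status = 0 WHERE name = 'content_access';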
