Workflows to update data in iMIS Enterprise cloud CRM

I am new to iMIS, which is an engagement management system with built-in membership handling, and I need some input about business process automation in it. I see that iMIS does not allow clients to create stored procedures on iMIS cloud hosting. This is a concern, as every business has simple to complex processes, and many of those processes update data. I am surprised that iMIS does not have a proper GUI-based workflow builder where one can specify that if some business object is updated/created, or has some value X, then "update" some other business object/data, i.e. update the database table with specific values.
The best example: if I want to use custom membership numbers (auto-numbers), I cannot do that. I cannot update table data via iMIS "process automation". The iMIS "process builder" offers very limited functionality. It does allow calling a stored procedure, but what is the use of that when we cannot create a stored procedure in iMIS? iMIS does also allow calling a third-party REST API from process automation for a particular trigger. Within iMIS there is no way to add custom code handling, custom process handling, or a UI-based workflow with if/else (a BPM, which is available in Salesforce, Zoho, and other CRMs).
So I want to know: if there are any iMIS consultants here, please advise how you suggest handling complex business process automation. I know iMIS offers a REST API, but that means I need to write custom code to handle the process automation, and it also means I need to do it outside of the iMIS cloud. I just need confirmation that business processes need to be handled via customization outside iMIS, through an external system. The external system would get data from iMIS, make the data changes, and push them back to iMIS via the API. I can build it, but before doing so I am trying to make sure I am not missing something inside iMIS (I am 99% sure it cannot be done inside iMIS).
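For what it's worth, the external service described above can be quite small. Here is a minimal Python sketch of the pull-change-push loop, using the custom membership-number example; the tenant URL, the `/Party/{id}` endpoint path, and the `MembershipNumber` field are illustrative assumptions for the sketch, not confirmed iMIS endpoints, so check them against your tenant's API documentation:

```python
import json
import urllib.request

BASE_URL = "https://example.imiscloud.com/api"  # assumption: your iMIS tenant URL
TOKEN = "your-bearer-token"  # obtained by POSTing credentials to the token endpoint

def next_membership_number(last_number: int, prefix: str = "MEM") -> str:
    """Compute the next custom auto-number, e.g. 123 -> 'MEM-000124'."""
    return f"{prefix}-{last_number + 1:06d}"

def assign_membership_number(party_id: str, last_number: int) -> None:
    """Pull a record from iMIS, set a membership number, push it back.

    The '/Party/{id}' path and 'MembershipNumber' field are placeholders
    for whatever object and property your tenant actually exposes.
    """
    headers = {"Authorization": f"Bearer {TOKEN}"}
    req = urllib.request.Request(f"{BASE_URL}/Party/{party_id}", headers=headers)
    with urllib.request.urlopen(req) as resp:
        record = json.load(resp)
    record["MembershipNumber"] = next_membership_number(last_number)
    body = json.dumps(record).encode()
    put = urllib.request.Request(
        f"{BASE_URL}/Party/{party_id}",
        data=body,
        headers={**headers, "Content-Type": "application/json"},
        method="PUT",
    )
    urllib.request.urlopen(put)
```

Since there are no webhooks, a scheduler or polling loop over a query endpoint would call `assign_membership_number` whenever a new member record appears.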
I also believe iMIS does not have webhooks, as there is no menu to configure them in iMIS. Please advise on this as well.
Please do not point me back to the "process automation" menu/feature of iMIS, as I am aware of it and of its limitations; I have explained those limitations above after checking with the iMIS support desk.

Nope, you can't do complex workflows in iMIS cloud (yet, as of July 2022), and there are no webhooks either. Both of these requirements can be covered with a cloud-based RPA solution.

Related

Accessing data in Google Drive tables with Google App Maker

How can I access the data contained in Google Drive tables?
I created some basic apps after reading the tutorials, but I'm unable to see the sample data I entered.
Option 1: Don't use Drive Tables
App Maker gives you the power to connect to a Cloud SQL instance, which you can access through Cloud Shell or connect to with tools like MySQL Workbench.
Pros
Works nicely if you have your own Cloud SQL instance
Cons
This approach falls short in environments with shared Cloud SQL instances
Cloud SQL is not free
Option 2: Export data to spreadsheet
Regardless of what data backend you are using, you can dump all your data to a spreadsheet.
Cons
Basically it is one-way (read-only) data access, unless you want to mess with importing the edited data back into your deployment...
Option 3: Drag'n'drop way
There is an option to keep a Debug page and drop onto it tables/forms for the models you want to look at.
Pros
Seamless access to data directly from your development environment (browser).
Cons
It is hard to maintain such a Debug page, especially when you are actively working on your database structure.
Need to think about security and hide the page from your end users.
Need to keep dedicated datasource(s) for the debug tables/forms that have no filters applied (at this time there is no client-side analogue of the app.models.MyModel.newQuery() server-side API).
Option 4: Drag'n'drop way (advanced edition!)
The basic idea is to create a page with a dynamic table (to view/edit data) and a form (to add records). Using this highly dynamic page eliminates shortcoming #1 of option #3.
This approach has similar pros and cons to the previous one, plus there are some App Maker limitations that make it hard to implement (though it is doable to some extent).

Validate data before insertion in Firebase

I'm building an app which uses user contributed content.
The contribution by each user should be available to all others in real time.
I was looking into firebase Realtime database for this.
However, when a user contributes content, there are quite heavy validations and calculations (read: server-side) to be done on the data before making it available to others.
Is it possible to have server-side validation in Firebase? Or should I look for alternatives?
Initially, Firebase did not have a feature for server-side processing/calculations; all your processing had to be done on the client side.
Now they've introduced a new feature called Cloud Functions for Firebase. It's a really useful addition, where you can write server-side code without the hassle of managing servers or instances. Read more about it at the above link.
Also, this YouTube playlist by Jen Person is a great start, and you can find examples similar to your use case here.
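Worth noting alongside Cloud Functions: the Realtime Database also supports declarative server-side validation via security rules. A minimal sketch, where the contributions path and the text/uid field names are assumptions about your data model:

```json
{
  "rules": {
    "contributions": {
      "$contributionId": {
        ".write": "auth != null",
        ".validate": "newData.hasChildren(['text', 'uid']) && newData.child('text').isString() && newData.child('text').val().length < 500"
      }
    }
  }
}
```

Rules can only enforce structure and simple constraints, though; the heavier calculations you mention are a better fit for Cloud Functions, e.g. having clients write to a pending path and letting a function validate the data and copy it to the publicly readable path.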

Create opportunities in Marketo

I want to integrate the Marketo API with my .NET project.
My client has given me a username and password for Marketo. I want to retrieve "opportunities" from Marketo and have written code for that. Currently there aren't any opportunities, so I'm not able to test my code. Does anyone have an idea how to create opportunities in Marketo, so I can check whether my code is retrieving those records or not?
Opportunities are not visible as standalone entities in Marketo as they are in SFDC or other CRMs. Rather, as @joev said, you have to find a lead that has an opportunity and view the opportunity details within the context of that lead's detail view, on the Opportunity tab.
If you want to use the GUI to create an opportunity, the right place to do that would be in the CRM — not in Marketo.
You need create/update rights via the API; ask your client for create/update rights and then you will be able to create some data through the API itself.
Like other people have mentioned, there isn't a direct GUI inside Marketo for creating opportunities. These are typically created via the CRM (e.g. Salesforce, NetSuite, Microsoft Dynamics, etc.) and then synced to Marketo.
If you want to test it and have no CRM available, technically you can just use the REST API to sync your own opportunity to a lead. Once done, you can log in to Marketo to visually validate it. Then, now that you have an opportunity created, you can use the API to GET the opportunities and further validate your integration.
Here is a link to Marketo's documentation on how to sync an opportunity to Marketo. Also on that page is the documentation on the other REST API options.
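As a rough illustration of that flow, here is a Python sketch against Marketo's REST API. The instance URL is a placeholder, and the field set is minimal; the token and opportunity endpoints follow Marketo's documented REST API, but verify the details against the documentation linked above:

```python
import json
import urllib.parse
import urllib.request

MUNCHKIN = "https://123-ABC-456.mktorest.com"  # placeholder: your instance URL

def token_url(client_id: str, client_secret: str) -> str:
    """Build the OAuth token request URL (client-credentials grant)."""
    query = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    })
    return f"{MUNCHKIN}/identity/oauth/token?{query}"

def opportunity_payload(external_id: str, name: str) -> dict:
    """Request body for POST /rest/v1/opportunities.json (createOrUpdate)."""
    return {
        "action": "createOrUpdate",
        "dedupeBy": "dedupeFields",
        "input": [{"externalOpportunityId": external_id, "name": name}],
    }

def sync_opportunity(access_token: str, external_id: str, name: str) -> dict:
    """Create or update one opportunity; afterwards it can be linked to a
    lead via POST /rest/v1/opportunities/roles.json."""
    body = json.dumps(opportunity_payload(external_id, name)).encode()
    req = urllib.request.Request(
        f"{MUNCHKIN}/rest/v1/opportunities.json?access_token={access_token}",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

After syncing, a GET on /rest/v1/opportunities.json (filtering by externalOpportunityId) lets you confirm that your retrieval code sees the record.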

Is there a direct way to query and update App data from within a proxy or do I have to use the management API?

I need to change attributes of an App, and I understand I can do it with management server API calls.
The two issues with using the management server APIs are:
Performance: it's making calls to the management server when it might be possible to do this directly in the message processor. Performance issues can probably be mitigated with caching.
Availability: having to use the management server APIs means that the system is dependent on the management server being available, while if it were done directly in the proxy itself, it would reduce the number of failure points.
Any recommended alternatives?
Ultimately, all entities are stored in Cassandra (for the runtime).
Your best choice is using an Access Entity policy to get info about an entity; that would not hit the management server. But just for your information: most of the time you do not even need an Access Entity policy. When you use a verify-API-key or validate-access-token policy, all the related entity details are made available as flow variables by the message processor, so no additional Access Entity calls should be required.
When you are updating any entity (like a developer or an application), I really assume it is a management-type use case and not a runtime use case; hence using the management APIs should be fine.
If your use case requires a runtime API call that in turn updates an attribute on the application, then possibly that attribute should not be part of the application. Think about how you can move it out to a cache, a KVM, or some other place where you can access it from the message processor (just a thought, without completely knowing the use cases).
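For reference, the Access Entity approach mentioned above is a small policy configuration. A sketch, where the policy name and the source of the app ID are assumptions:

```xml
<AccessEntity name="GetDeveloperApp">
  <!-- Look up the app entity from the runtime store, without a management server call -->
  <EntityType value="app"/>
  <EntityIdentifier ref="request.queryparam.app_id" type="appId"/>
</AccessEntity>
```

The lookup result is exposed to subsequent policies in a flow variable named after the policy (here, AccessEntity.GetDeveloperApp), which you can then parse for the attributes you need.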
The design of the system is that all entity editing goes through the Management Server, which in turn is responsible for performing the edits in a performant and scalable way. The Management Server is also responsible for knowing which message processors need to be informed of the changes via zookeeper registration. This also ensures that if a given Message Processor is unavailable because it, for example, is being upgraded, it will get the updates whenever it becomes available. The Management Server is the source of truth.
In the case of Developer App Attributes (or really any App metadata), the values are cached for 3 minutes (I think), so the Message Processor may not see the new values for up to 3 minutes.
As far as availability, the Management Server is designed to be highly available, relying on the same underlying architecture as the message processor design.

Looking for guidance on WF4

We have a rather large document routing framework that's currently implemented in SharePoint (with a large set of cumbersome SP workflows), and it's running into the edge of what SP can do easily. It's slated for a rewrite into .NET.
I've spent the past week or so reading and watching WF4 discussions and demonstrations to get an idea of WF4, because I think it's the right solution. I'm having difficulty envisioning how the system will be configured, though, so I need guidance on a few points from people with experience:
Let's say I have an approval that has to be made on a document. When the workflow starts, it'll decide who should approve and send that person an email notification. Inside the notification, the user would have an option to load an ASP.NET page to approve or reject. The workflow would then have to be resumed from the send-email step. If I'm planning on running this as a WCF WF service, how do I get back into the correct instance of the paused service (considering I've configured AppFabric and persistence)? I somewhat understand the idea of a correlation handle, but I don't think it's meant for this case.
Logging and auditing will be key for this system. I see that AppFabric keeps event logs of this data, but I haven't cracked open the underlying database: is it simple to use for reporting, or should I create custom logging activities to put around my actions? From experience, which would you suggest?
Thanks for any guidance you can provide. I'm happy to give further examples if necessary.
To send messages to a specific workflow instance you need to set up message correlation between your different Receive activities. In order to do that you need some unique value as part of your message data.
The AppFabric logging works well, but if you want to create a custom logging solution you don't need to add activities to your workflow. Instead, you create a custom TrackingParticipant to do the work for you; how you store the data is then up to you.
Your scenario is very similar to the one I used for the Introduction to Workflow Services Hands On Lab in the Visual Studio 2010 Training Kit. I suggest you take a look at the hands on lab or the Windows Server AppFabric / Workflow Services Demo - Contoso HR sample code.
