How can we create a Redux store in DvaJs?

I have a project with umi and dva and I can't create a global store...
My project is based on this template: https://ant.design/docs/react/practical-projects

The newest documentation, which hasn't been translated yet: https://umijs.org/plugins/plugin-dva
Create global model files in the src/models folder.
If a model file defines a valid model, it can be listed by running the command umi dva list model.
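As a minimal sketch (the file name, namespace and state shape below are just examples), a global model in src/models/count.js could look like this:

// src/models/count.js - every file under src/models is registered as a global model
export default {
  namespace: 'count',              // this slice lives under state.count in the store
  state: { value: 0 },
  reducers: {
    add(state) {
      return { ...state, value: state.value + 1 };
    },
  },
  effects: {
    *addAsync(action, { put }) {
      yield put({ type: 'add' });  // dispatches the 'add' reducer of this model
    },
  },
};

A component can then read and dispatch this state via connect, e.g. connect(({ count }) => ({ count }))(Counter), importing connect from 'dva' (or from 'umi' when the dva plugin is enabled).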

System Templates version 1.31.0 and higher implementation

I have upgraded my cloud Artifactory to "7.52.0".
Prior to the upgrade I was using System Templates to deploy my pipelines.
Although there is still backward compatibility after the upgrade, the new way of deploying and using System Templates to create new pipelines is not working for me.
From the release notes I got to this link to configure System Templates in the new way.
https://www.jfrog.com/confluence/display/JFROG/System+Templates
So in my repository A I have 2 files, 'pipelines.yml' and 'values.yml'.
pipelines.yml is configured as follows:
valuesFilePath: ./values.yml
Include:
template: myTemplates/TestTemplate/1.0.0
My values file contains values for the TestTemplate.
Then I go to https://example.jfrog.io/ui/admin/pipelines/pipelineSources and I try to create a new pipeline from repository A.
Looking at https://example.jfrog.io/ui/pipelines/myPipelines/myPipelines I don't see any pipeline created from the template.
Is that the right way to implement the new System Template?
I have also made sure that the templates are in the Artifactory by checking:
https://example.jfrog.io/ui/pipelines/templates
and also in the Artifactory directory tree.
Currently I am using the REST API to CRUD my Template Sources (https://example.jfrog.io/ui/pipelines/sources) and also to create new pipeline sources from a system template (apparently this is the old way), because after the upgrade, creating a source pipeline neither syncs the old/new templates nor creates a new pipeline from a system template located in the Artifactory.
You need to use the syntax documented in the Global template link.
Using the "jfrog/PublishTemplate" global template documentation
https://www.jfrog.com/confluence/display/JFROG/Global+Templates . I have noticed that in order to create and upload a system template you need to use the following syntax:
valuesFilePath: ./values.yml
include:
template: jfrog/<global_template_name>/<template_version>
According to the system template documentation this is the syntax that got me confused:
valuesFilePath: ./values.yml
Include:
template: jfrog/PublishTemplate/1.0.0
So I had used a capital "I" instead of a lowercase "i", plus bad indentation, when trying to create a new pipeline from my system template, which is why it failed.
You use the global template "PublishTemplate" to upload your system template into your Artifactory.
Then you use the uploaded templates to create your new pipelines.
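With the lowercase key and proper indentation, the pipelines.yml from the question should therefore look roughly like this (template path taken from the question):

valuesFilePath: ./values.yml
include:
  template: myTemplates/TestTemplate/1.0.0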

How to import a Resource file within Test cases in Robot Framework?

I have 2 test cases within a Robot suite. This suite acts as an initialization suite which has a dependency on an underlying framework (UF).
The UF has a different folder structure for the main initialization suite, the functional suites and a few other tools, and calls them with separate robot commands. So I cannot store variables with Set Global Variable during initialization, but have to create resource files which I will import in the functional suites.
TC1: Parses a JSON file and creates a variables.txt file.
TC2: Uses a few variables stored in variables.txt, logs into the server, gets the node details and stores them in hostname.txt.
Is there a way to import/source the variables.txt within TC2?
I am looking for this because there are Common User Keywords (CUKW) which will also need this variables.txt. As it is dynamically generated, I cannot define it as a Resource in the Settings section of the CUKW.
I am new to Robot Framework, apologies for any misunderstanding. Any better implementation suggestions are most welcome.
Thanks in advance!
You can use the Import Resource keyword:
Imports a resource file with the given path.
Resources imported with this keyword are set into the test suite scope similarly as when importing them in the Setting table using the Resource setting.
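A minimal sketch of TC2, assuming variables.txt is written in resource-file format (i.e. it contains a *** Variables *** section); the path and the ${HOSTNAME} variable name below are illustrative only:

*** Test Cases ***
TC2
    # Point the path at wherever TC1 writes the file.
    Import Resource    ${CURDIR}/variables.txt
    Log    ${HOSTNAME}

If variables.txt is instead a plain variable file, the BuiltIn Import Variables keyword can be used in the same way.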

Form Recognizer - model for each environment?

On our test environment we created a custom Form Recognizer model. Is there a way to reuse this model in the PROD environment? The PROD environment is under a different subscription.
I cannot find a way to somehow "export" the model and move it to the other environment. Do I need to create a new model from scratch?
You can use the Copy Model API to copy a model between regions and subscriptions. See here for more details -
How to Copy API - https://learn.microsoft.com/en-us/azure/cognitive-services/form-recognizer/disaster-recovery#copy-api-overview
Copy Model API reference - https://westcentralus.dev.cognitive.microsoft.com/docs/services/form-recognizer-api-v2-1-preview-2/operations/CopyCustomFormModel
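As a rough sketch of the two-step copy flow (Node 18+, using the global fetch; the endpoint paths and the v2.1 API version are assumptions based on the linked reference, so verify them against the docs before use):

// 1. Ask the TARGET (PROD) resource for a copy authorization.
// 2. Tell the SOURCE (test) resource to copy the model into the target.
// 3. Poll the returned Operation-Location URL until the copy completes.
const source = { endpoint: "https://<source>.cognitiveservices.azure.com", key: "<source-key>" };
const target = { endpoint: "https://<target>.cognitiveservices.azure.com", key: "<target-key>" };

async function copyModel(modelId, targetResourceId, targetResourceRegion) {
  const authResp = await fetch(
    `${target.endpoint}/formrecognizer/v2.1/custom/models/copyAuthorization`,
    { method: "POST", headers: { "Ocp-Apim-Subscription-Key": target.key } }
  );
  const copyAuthorization = await authResp.json();

  const copyResp = await fetch(
    `${source.endpoint}/formrecognizer/v2.1/custom/models/${modelId}/copy`,
    {
      method: "POST",
      headers: {
        "Ocp-Apim-Subscription-Key": source.key,
        "Content-Type": "application/json",
      },
      // targetResourceId is the Azure resource ID of the PROD Form Recognizer resource,
      // targetResourceRegion is its region.
      body: JSON.stringify({ targetResourceId, targetResourceRegion, copyAuthorization }),
    }
  );
  return copyResp.headers.get("operation-location"); // poll this until the copy succeeds
}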

Alfresco share UI form for custom model

I'm using Alfresco One 5.1 Enterprise Edition. I've created a custom content model using the Model Manager in Alfresco & it has some custom properties (ds:priority, ds:action, ds:actionText, ds:linkURL, etc.) associated with it. I would like to customize the Share UI to include these custom properties alongside the default cm:content properties (cm:content, cm:description, cm:title, etc.). I'm referring to Jeff Potts' post on ecmarchitect about using a custom model & Share UI customization.
Now my question is: can I use the model created in the Alfresco Model Manager & create a customized Share form with these custom properties? In all the examples I see for this process, the content model definition is done in an Alfresco repo AMP and the Share form customization is done in an Alfresco Share AMP. Can I create the Share AMP alone (for my Share UI customization) & still refer to the model which I've already created in the Alfresco Model Manager?
You can use the model console to list the created & deployed models.
http://IP:Port/alfresco/s/enterprise/admin/admin-repoconsole
Command: show models
If you can see your model with the loaded (isLoaded) status as "Yes", then deploying only the Share AMP should be fine.
Have you created this model on your development machine or on the production machine?
If it is the development machine, you need the repo AMP to deploy the model on the production machine.
##
## Model Admin Commands
##
ok> show models
Show deployed models - that are stored in the repository data dictionary.
ok> deploy model
Upload model to repository and load into runtime data dictionary. This will also
set the repository model as active.
If a model is already deployed then it will be updated and re-deployed.
e.g. deploy model alfresco/extension/exampleModel.xml
ok> undeploy model
Permanently delete model from repository (all versions) and unload from runtime data dictionary.
e.g. undeploy model exampleModel.xml
ok> activate model
Set repository model to active and load into runtime data dictionary.
e.g. activate model exampleModel.xml
ok> deactivate model
Set repository model to inactive and unload from runtime data dictionary.
e.g. deactivate model exampleModel.xml
As you're using the Enterprise edition, you can also get in touch with Alfresco support.
Hope this helps you.
As pointed out by Murali, once the model is active we can create a Share AMP as described in Jeff Potts' tutorial: build the Share AMP archive using the Maven package target (mvn package), deploy it to the amps_share folder (/alfresco_one/amps_share), apply it with the apply_amps.sh command in /bin/apply_amps.sh, and then restart Alfresco.
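In shell terms those deployment steps amount to roughly the following (the project and AMP file names are illustrative):

cd my-share-customization                 # the Share AMP project
mvn package                               # builds target/my-share-customization.amp
cp target/my-share-customization.amp /alfresco_one/amps_share/
/alfresco_one/bin/apply_amps.sh
# restart Alfresco afterwards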
Note that for my requirement I only needed the custom properties on the inline-edit screen in Share, so I added the following configuration:
<config evaluator="node-type" condition="<my model>">
   <forms>
      <form id="doclib-inline-edit">
         <field-visibility>
            ...
            <show id="my:property" force="true" />
            ...
         </field-visibility>
      </form>
   </forms>
</config>
Initially I had cm:content in the condition of the evaluator & it didn't work; only after updating it to my model type did the changes start to show.
Note: the changes are not reflected without restarting Alfresco.

What is best way to export and import security permissions across environments?

We have a large number of publications, and currently we apply the CMS permissions manually across multiple environments (mainly UAT and PROD). This is tedious and often error prone.
We are trying to export and import the CMS permissions across multiple environments, so this could be done once manually and ported to other environments using some sort of tool.
Environment: Tridion 2011 SP1 + IIS 7.5 + SQL Server 2008 r2
In the old PowerTools (VBScript) there used to be a tool for access management, which could be handy but is still error prone. We are not interested in using the old PowerTools for obvious reasons, and since this is a recurring operational task, the database option is ruled out as well.
We are considering building a tool using the Core Service that could export and import the permissions. We have the same groups, publications and folder structure across these environments.
Has anyone tried this before? What are the experiences or practices that other fellow Tridioneers have used in large implementations?
Any view points are greatly appreciated.
I once wrote a tool that allowed you to describe your desired permissions settings as JSON, and apply them via the API. To be honest, if you were to write a DTAP-security tool, I'd follow a similar approach. Start by being able to express your desired settings in an open, text-based format, then write a tool that imports them. Once you have this, you can easily build a tool that exports them.
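Purely as an illustration (the schema and every name below are made up), such a JSON description could hold one entry per organizational item, with trustees referenced by name rather than by TCM ID:

{
  "publication": "020 Content",
  "folders": [
    {
      "path": "/webdav/020 Content/Building Blocks/Content",
      "permissions": [
        { "trustee": "Editors", "allow": ["Read", "Write"] },
        { "trustee": "Publishers", "allow": ["Read"] }
      ]
    }
  ]
}

The import side then only has to walk this structure, resolve each trustee and path in the target environment, and apply the corresponding access control entries through the API.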
I created a security migration tool in Tridion 5.2, but the solution approach still applies to current versions of Tridion.
Summary
The solution used a set of simple VBScript export page templates to extract the security information as XML and store it in a set of components.
I then used Tridion Content Porter to move these security components, page templates and TBBs to the target CMSs.
A set of simple import page templates then opens the security XML components and applies the security settings to the target CMS.
The TCM IDs will be different in the target CMS, so the import functions must use WebDAV URLs and build dictionaries of TCM IDs for trustees etc.
Details
Export Security Groups
    iterate selected Groups
    append group xml
    save xml in component
Export Publication Rights
    getlistpublications
    iterate list of publications
    get each publication xml
    remove "//tcm:CategoriesXSD" node
    appendChild publication xml
    remove unwanted "//tcm:Trustee" nodes
    save xml in component
Export Folder Permissions
    recursively iterate folders
    append Folder XML
    remove trustee nodes that are inherited ("IsInheritanceRoot")
    save xml in component
Export Structure Group Permissions
    recursively iterate Structure groups
    append structure group XML
    remove metadata node "//tcm:Metadata"
    filter out unwanted Trustees
    save xml in component
Import Security Groups
    load xml from security component
    iterate group nodes
    create groups if they don't already exist
Import Publication Rights
    load xml from security component
    update xml tcmid's
    iterate publications
    load publication xml
    build xml updategram
    update publication xml
Import Folder Permissions
    load xml from security component
    update xml tcmid's
    for each folder node
        build updategram xml
        open folder
        update folder xml
Import Structure Group Permissions
    load xml from security component
    update xml tcmid's
    for each structure group node
        build updategram xml
        open structure group
        update structure group
