How does the Tridion GUI Extension config map the names to the JS file? For example, I am using Jaime's HelloWorld post with its example files. The most important part seems to be the CommandSet section.
<cfg:commandset id="HelloWorldCM.Interface">
    <cfg:command name="HelloWorldCM" implementation="Extensions.HW"/>
    <cfg:dependencies>
        <cfg:dependency>HelloWorldCM.Commandset</cfg:dependency>
    </cfg:dependencies>
</cfg:commandset>
Can someone please help me understand the following attributes and how they map to the underlying .js file for the extension?
name
implementation
cfg:dependency
I have tried changing these names in both the config and the JS file, but how they are mapped is still a mystery to me.
The three attributes you mention are really all pointers that tie the whole extension together. If you look higher up in Jaime's sample, you will see this:
<ext:contextmenus>
    <ext:add>
        <ext:extension name="HelloWorldCMExtension" assignid="" insertbefore="cm_refresh">
            <ext:menudeclaration>
                <cmenu:ContextMenuItem id="ext_HelloWorldCM" name="Hello World!" command="HelloWorldCM"/>
            </ext:menudeclaration>
            <ext:dependencies>
                <cfg:dependency>HelloWorldCM.Example</cfg:dependency>
            </ext:dependencies>
            <ext:apply>
                <ext:view name="DashboardView"/>
            </ext:apply>
        </ext:extension>
    </ext:add>
</ext:contextmenus>
This XML adds a button to the CME's context menu.
command="HelloWorldCM" refers to the command with the matching name attribute in the commandset.
implementation="Extensions.HW" in the commandset refers to the JavaScript namespace in the accompanying HelloWorldCM.js file.
cfg:dependency points back to the top of the config file, to the <cfg:group name="HelloWorldCM.Commandset" merger="Tridion.Web.UI.Core.Configuration.Resources.CommandGroupProcessor" merge="always"> node, so the GUI knows which CSS and JS files to include.
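To make that mapping concrete, here is a hedged, framework-free sketch of what HelloWorldCM.js contains. A real extension registers the namespace through Tridion's Anguilla framework (Type.registerNamespace, Tridion.Cme.Command); the plain-object version below only illustrates how the config names line up with the JS names.

```javascript
// Hypothetical sketch of HelloWorldCM.js. A real Tridion extension would use
// Type.registerNamespace("Extensions") and extend Tridion.Cme.Command; this
// simplified version only shows how the config attributes map to JS names.
var Extensions = Extensions || {};

// implementation="Extensions.HW" in the commandset points at this constructor.
Extensions.HW = function () {
  // In Anguilla the command name from the config is bound here, e.g.:
  //   Type.enableInterface(this, "Extensions.HW");
  //   this.addInterface("Tridion.Cme.Command", ["HelloWorldCM"]);
};

// The CME calls these to decide whether to show and enable the menu item.
Extensions.HW.prototype._isAvailable = function (selection) { return true; };
Extensions.HW.prototype._isEnabled = function (selection) { return true; };

// Runs when the "Hello World!" context-menu item is clicked.
Extensions.HW.prototype._execute = function (selection) {
  alert("Hello World!");
};
```

If any of the three names disagree (command name, namespace, or dependency group), the button either never appears or silently does nothing, which is why changing them in only one place seems mysterious.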
I am trying to reference Newtonsoft.Json.xml as part of documenting my own code that is dependent on it.
When I try to compile my help file in Sandcastle I get the following error:
BuildAssembler : error : [...\BuildTopics.proj]
BuildAssembler : error : CodeBlockComponent: [N:Newtonsoft.Json] Unable to load source file '...\Visual Studio 2017\Projects\CBUSAlexa\trunk\Src\Newtonsoft.Json.Tests\Documentation\SerializationTests.cs'
Error: Could not find a part of the path '...src\Newtonsoft.Json.Tests\Documentation\SerializationTests.cs'. [...\BuildTopics.proj]
(some paths have been obfuscated)
Looking in Newtonsoft.Json.xml I see:
<member name="T:Newtonsoft.Json.DefaultValueHandling">
    <summary>
    Specifies default value handling options for the <see cref="T:Newtonsoft.Json.JsonSerializer"/>.
    </summary>
    <example>
        <code lang="cs" source="..\Src\Newtonsoft.Json.Tests\Documentation\SerializationTests.cs" region="ReducingSerializedJsonSizeDefaultValueHandlingObject" title="DefaultValueHandling Class" />
        <code lang="cs" source="..\Src\Newtonsoft.Json.Tests\Documentation\SerializationTests.cs" region="ReducingSerializedJsonSizeDefaultValueHandlingExample" title="DefaultValueHandling Ignore Example" />
    </example>
</member>
This code is not shipped as part of the Json.NET install, so the src path is missing (which makes sense, as these are test files), and yet Sandcastle should not simply abandon the build because they are missing.
I would like the build to complete and ignore these linked files - or should there be guidance around shipping any files referenced in the documentation XML file?
In the Components tab of the Sandcastle Help File Builder project properties, select the 'Code Block Component' and add it to your project. When configuring it, make sure 'Allow missing source code files/regions' is checked.
This is somewhat misleading: you might expect that if the component is not added, the build wouldn't attempt to look for these files at all. Note also that 'Disable the custom code block component' on the Build tab is a different setting, so checking it will have no effect on this build error.
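For reference, the saved component configuration in the .shfbproj ends up looking roughly like this - element names besides allowMissingSource are abbreviated from memory, so treat this as a sketch rather than the exact schema:

```xml
<!-- Sketch of the Code Block Component entry in the project's
     ComponentConfigurations; only allowMissingSource matters here. -->
<component id="Code Block Component">
    <basePath value="{@HtmlEncProjectFolder}" />
    <!-- The setting toggled by 'Allow missing source code files/regions': -->
    <allowMissingSource value="true" />
    <removeRegionMarkers value="false" />
</component>
```

With allowMissingSource set to true, a missing source file produces a placeholder in the topic instead of failing the build.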
I am modifying some existing projects that use Qt (version 5.10), updating them to use better, more concise MSBuild syntax (targeting Visual Studio 2015) on Windows.
One project has about 170 header files, of which about 135 need to be run through moc.exe.
Therefore I wrote a custom target to send those 135 files to moc.exe, but the MSBuild syntax for telling moc which files to process is quite long.
i.e.
<QtMocs Include="A.h;
                 B.h;
                 C.h;
                 D.h;
                 etc..." />
I tried sending ALL of the header files through moc.exe, but if a header file doesn't contain Q_OBJECT, moc.exe emits a warning about not needing to moc that header. And to add insult to injury, it still emits a .cpp file, even though nothing needed to be moc'd.
So I'd like a nice, short (one line?) concise way to tell Qt to moc only the headers that need it.
Is this possible?
So after two days with no response, I decided to write my own solution, which works really well. I wrote a custom task for MSBuild. It takes an array of ITaskItems that should point to all the header files in your project. For each file in the array, it opens the file, searches for Q_OBJECT, and, if found, saves the item into an output array. That output array is then queried later on and sent to moc.exe.
<!-- Task to automatically discover header files that need to be run through Qt's moc.exe compiler.
     It does this by examining each file and checking whether 'Q_OBJECT' appears in it. -->
<UsingTask TaskName="FindFilesForQtMoc" TaskFactory="CodeTaskFactory" AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll" >
  <ParameterGroup>
    <Files ParameterType="Microsoft.Build.Framework.ITaskItem[]" Required="true" />
    <MocFiles ParameterType="Microsoft.Build.Framework.ITaskItem[]" Output="true" />
  </ParameterGroup>
  <Task>
    <Using Namespace="System" />
    <Using Namespace="System.Collections.Generic" />
    <Using Namespace="System.IO" />
    <Using Namespace="System.Diagnostics" />
    <Code Type="Fragment" Language="cs">
      <![CDATA[
        // Collect only the headers that actually contain Q_OBJECT.
        var result = new List<Microsoft.Build.Framework.ITaskItem>();
        foreach (var item in Files)
        {
            String filePath = item.GetMetadata("FullPath");
            var content = File.ReadAllText(filePath);
            if (content.Contains("Q_OBJECT"))
            {
                result.Add(item);
            }
        }
        MocFiles = result.ToArray();
      ]]>
    </Code>
  </Task>
</UsingTask>
I call the task like this:
<FindFilesForQtMoc Files="@(ClInclude)" >
  <Output ItemName="FileForMoc" TaskParameter="MocFiles" />
</FindFilesForQtMoc>
<Message Text="Moc: %(FileForMoc.Identity)" />
Therefore I only have to declare all my header files in my .vcxproj like this:
<ClInclude Include="*.h" />
Which is way better than explicitly declaring each and every file that needs moc.exe.
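Wiring the task's output back into the moc step is one line as well, assuming the item group keeping moc's inputs is still named QtMocs as in the long-hand example above:

```xml
<!-- Replace the long hand-maintained list with the discovered set: -->
<ItemGroup>
    <QtMocs Include="@(FileForMoc)" />
</ItemGroup>
```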
I am trying to set up, through the zcml, the engine and session for connecting to a database with the use of z3c.saconfig. I am using Plone 4.3.2.
I was following along with the book Professional Plone 4 Development, and under [instance] in buildout.cfg it says to place zcml-additional, which sets the engine and session.
Here is what the instance portion of the buildout.cfg looks like:
[instance]
<= instance_base
recipe = plone.recipe.zope2instance
http-address = 8080
zcml-additional =
    <configure xmlns="http://namespaces.zope.org/zope"
               xmlns="http://namespaces.zope.org/db"
               >
        <include package="z3c.saconfig" file="meta.zcml" />
        <db:engine name="testA" url="mysql://uName:uPass@localhost/GPCL_Asset_Tracker"/>
        <db:session engine="testA" />
    </configure>
Also, I have a package called gpcl.calibration, and in its setup.py I added 'MySQL-Python' and 'z3c.saconfig' under install_requires; these work and do not cause a problem in the buildout.
Unfortunately I am getting this error:
ZopeSAXParseException: File "/home/pjdowney/Plone/GPCLAssetTrackerD/parts/instance/etc/package-includes/999-additional-overrides.zcml", line 2.0, duplicate attribute
Is zcml-additional defined somewhere other than buildout.cfg? In the book, I did notice that [instance] contains http-address and user, which seem to have since moved underneath [buildout] instead.
This is a typo: you cannot have two attributes both named xmlns on your configure element. Going by the <db:engine that follows, it probably should read
<configure xmlns="http://namespaces.zope.org/zope"
           xmlns:db="http://namespaces.zope.org/db"
           >
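Applied to the snippet from the question, the whole zcml-additional section then becomes (I have also written the engine URL in the usual user:password@host form):

```xml
[instance]
<= instance_base
recipe = plone.recipe.zope2instance
http-address = 8080
zcml-additional =
    <configure xmlns="http://namespaces.zope.org/zope"
               xmlns:db="http://namespaces.zope.org/db"
               >
        <include package="z3c.saconfig" file="meta.zcml" />
        <db:engine name="testA" url="mysql://uName:uPass@localhost/GPCL_Asset_Tracker" />
        <db:session engine="testA" />
    </configure>
```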
I have been struggling to figure out how to conditionally include Flex libraries in an Ant build based on a property set on the command line. I have tried a number of approaches with the <condition/> task, but so far have not gotten it to work. Here is where I am currently.
I have an init target that includes condition tasks like this:
<condition property="automation.libs" value="automation.qtp">
    <equals arg1="${automation}" arg2="qtp" casesensitive="false" trim="true"/>
</condition>
The purpose of this task is to set a property that determines the name of the patternset used when declaring the implicit fileset on an mxmlc or compc task. The patternset referenced above is defined as:
<patternset id="automation.qtp">
    <include name="automation*.swc"/>
    <include name="qtp.swc"/>
</patternset>
The named patternset is then referenced by the mxmlc or compc task like this:
<compc>
    <compiler.include-libraries dir="${FLEX_HOME}/frameworks/libs" append="true">
        <patternset refid="${automation.libs}"/>
    </compiler.include-libraries>
</compc>
This doesn't appear to work - at least the SWC size does not indicate that the additional automation libraries have been compiled in. I want to be able to specify a command-line property that determines which patternset to use for various types of builds.
Does anyone have any ideas about how to accomplish this? Thanks!
If you can't get <patternset> to work correctly, you might want to take a look at the <if>, <then>, and <else> tasks provided by ant-contrib. We ended up doing something like this:
<target name="build">
    <if>
        <equals arg1="${automation.qtp}" arg2="true"/>
        <then>
            <!--
              - Build with QTP support.
              -->
        </then>
        <else>
            <!--
              - Build without QTP support.
              -->
        </else>
    </if>
</target>
There is some duplication of build logic between the if and else branches, but you can factor some of it out by wrapping <mxmlc> in a macrodef.
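A hedged sketch of what that macrodef might look like - the macro name, attribute name, and output file below are illustrative, not from our actual build:

```xml
<!-- Illustrative only: wraps compc so each branch just passes a patternset id. -->
<macrodef name="build-swc">
    <attribute name="libs"/>
    <sequential>
        <compc output="MyLibrary.swc">
            <compiler.include-libraries dir="${FLEX_HOME}/frameworks/libs" append="true">
                <patternset refid="@{libs}"/>
            </compiler.include-libraries>
        </compc>
    </sequential>
</macrodef>

<!-- Then inside the <then> branch: -->
<build-swc libs="automation.qtp"/>
```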
The mxmlc task supports loading configuration files via <load-config filename="path/to/flex-config.xml" />. So generate the config XML on the fly by combining the echoxml task with if-then-else.
<echoxml file="path/to/flex-config.xml">
    <flex-config>
        <compiler>
            <library-path append="true">
                <path-element>${lib.qtp}</path-element>
            </library-path>
        </compiler>
    </flex-config>
</echoxml>
If your needs are more complicated, you could even generate several xml configs and <load-config ... /> them all.
Personally, I find any logic tedious and ugly to write using Ant's conditions or if-then-else; XML is not a pretty language to program in. Luckily, a more flexible approach is possible: write a script that produces the config XML before calling mxmlc, e.g. using the script task with your favorite scripting language:
<script language="javascript">
    <![CDATA[
        // Build the config XML dynamically; Ant's <script> task exposes the
        // running project as the 'project' variable.
        var xml = '<flex-config><compiler><library-path append="true">'
                + '<path-element>' + project.getProperty("lib.qtp") + '</path-element>'
                + '</library-path></compiler></flex-config>';
        // Write it out, then feed the file to mxmlc via <load-config ... />.
        var out = new java.io.PrintWriter("path/to/flex-config.xml");
        out.print(xml);
        out.close();
    ]]>
</script>
I have created a Configuration Section Designer project to represent the nodes of a custom section that my web application needs to read and save. I am able to successfully create instances of the configuration elements and collections; however, when I save the configuration using the referenced System.Configuration.Configuration object and issue Save, the elements get merged into their parents as attributes. An example of the issue is outlined below.
After calling Save on the referenced Configuration object, the output is as follows:
<savedReports xmlns="SavedReportSchema.xsd">
    <resultsSets dataViewId="1" id="4203bb88-b0c4-4d57-8708-18e48f0a1d2d">
        <selects keyId="1" sortOrder="1" />
    </resultsSets>
</savedReports>
As defined in my Configuration Section Designer project (and confirmed by the resulting XSD), the output should match the following:
<savedReports xmlns="SavedReportSchema.xsd">
    <resultsSets>
        <savedReport id="1">
            <selects>
                <select keyId="1" sortOrder="1"/>
            </selects>
        </savedReport>
    </resultsSets>
</savedReports>
Any ideas? The element collection types are set to BasicMapAlternate. When I set them to AddRemoveClearMapAlternate instead, the elements are not merged, but they are prefixed with "add" rather than "select" or "savedReport", which breaks the validation.
It turns out AddRemoveClearMapAlternate was the option I needed to correct the problem described in the question.
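For anyone who also needs the collection entries named something other than "add": in hand-written System.Configuration code the entry element name can be overridden with ConfigurationCollectionAttribute's AddItemName, and Configuration Section Designer exposes a comparable item-name property on its collection elements. A hedged sketch with illustrative type names (SelectElement and KeyId are hypothetical, not from the project in the question):

```csharp
// Illustrative sketch only; SelectElement and KeyId are hypothetical names.
// AddItemName makes entries serialize as <select> instead of <add>.
[ConfigurationCollection(typeof(SelectElement),
    AddItemName = "select",
    CollectionType = ConfigurationElementCollectionType.AddRemoveClearMap)]
public class SelectsCollection : ConfigurationElementCollection
{
    // Called by the runtime when a <select> entry is deserialized.
    protected override ConfigurationElement CreateNewElement()
    {
        return new SelectElement();
    }

    // Uniquely identifies each entry in the collection.
    protected override object GetElementKey(ConfigurationElement element)
    {
        return ((SelectElement)element).KeyId;
    }
}
```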