We have installed OpenStack on CentOS as per the procedures here...
http://docs.openstack.org/icehouse/install-guide/install/yum/content/index.html
I've been trying to access the administrative endpoints to list users (and then add users and tenants).
However, the administrative extensions don't seem to be available, because when I do a GET to...
http://horizonip:5000/v2.0/users
I get...
<error message="The resource could not be found." code="404" title="Not Found"/>
and when I do...
http://horizonip:5000/v2.0/extensions
I get...
<extensions>
<extension updated="2013-12-17T12:00:0-00:00" name="OpenStack Federation APIs" namespace="http://docs.openstack.org/identity/api/ext/OS-FEDERATION/v1.0" alias="OS-FEDERATION">
<links>
<link href="https://github.com/openstack/identity-api" type="text/html" rel="describedby"/>
</links>
<description>OpenStack Identity Providers Mechanism.</description>
</extension>
<extension updated="2013-07-07T12:00:0-00:00" name="OpenStack Keystone User CRUD" namespace="http://docs.openstack.org/identity/api/ext/OS-KSCRUD/v1.0" alias="OS-KSCRUD">
<links>
<link href="https://github.com/openstack/identity-api" type="text/html" rel="describedby"/>
</links>
<description>OpenStack extensions to Keystone v2.0 API enabling User Operations.</description>
</extension>
<extension updated="2013-07-07T12:00:0-00:00" name="OpenStack EC2 API" namespace="http://docs.openstack.org/identity/api/ext/OS-EC2/v1.0" alias="OS-EC2">
<links>
<link href="https://github.com/openstack/identity-api" type="text/html" rel="describedby"/>
</links>
<description>OpenStack EC2 Credentials backend.</description>
</extension>
<extension updated="2014-01-20T12:00:0-00:00" name="OpenStack Simple Certificate API" namespace="http://docs.openstack.org/identity/api/ext/OS-SIMPLE-CERT/v1.0" alias="OS-SIMPLE-CERT">
<links>
<link href="https://github.com/openstack/identity-api" type="text/html" rel="describedby"/>
</links>
<description>OpenStack simple certificate retrieval extension</description>
</extension>
</extensions>
I can't find any documentation on this side of things, but I've been rooting around and found a folder named admin_crud on the server which may have what's needed; however, I have no idea what to put in keystone.conf.
Any help appreciated.
I discovered that these are inbuilt extensions and don't appear in the output from...
http://horizonip:5000/v2.0/extensions
All the administrative extensions are accessed through a different port; the default is 35357, but it may differ and can be checked in the keystone.conf file by searching for admin_port.
Just use...
http://horizonip:admin_port/
as the base URL for any administrative API calls.
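For example, a minimal sketch of a user listing call, assuming the admin service token configured as admin_token in keystone.conf (written here as ADMIN_TOKEN) and the default admin port of 35357:
curl -H "X-Auth-Token: ADMIN_TOKEN" http://horizonip:35357/v2.0/users
curl -H "X-Auth-Token: ADMIN_TOKEN" http://horizonip:35357/v2.0/tenants
The same requests against port 5000 return the 404 shown above, since the public endpoint does not expose the administrative part of the v2.0 API.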
I'm facing the following issue:
In one of my content projects there is a file "filter.xml". It contains the following entries:
<?xml version="1.0" encoding="UTF-8"?>
<workspaceFilter version="1.0">
<filter root="/content/sites/de/produktpartner/a/rep:policy" mode="merge" />
...
<filter root="/content/sites/de/produktpartner/z/rep:policy" mode="merge" />
</workspaceFilter>
My question is: can I use wildcards to limit the entries in filter.xml? If so, how?
I tried
<filter root="/content/sites/de/produktpartner/*/rep:policy" mode="merge" />
but it did not seem to work.
The root must be a path but you can further specify filters that allow regular expressions.
<filter root="/content/sites" mode="merge" />
<include pattern="/content/sites/[a-z]{2}/produktpartner/(.*)/rep:policy"/>
</filter>
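Putting the two together, a complete filter.xml could look roughly like this (a sketch assuming the two-letter language folders and partner folders from your example; adjust the pattern to your structure):
<?xml version="1.0" encoding="UTF-8"?>
<workspaceFilter version="1.0">
    <!-- narrow root, then include only the rep:policy nodes of the partner folders -->
    <filter root="/content/sites" mode="merge">
        <include pattern="/content/sites/[a-z]{2}/produktpartner/(.*)/rep:policy"/>
    </filter>
</workspaceFilter>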
However, managing permissions using CRX packages can be very cumbersome. Check out AEM Permission Management
It's a tool that supports a permission management DSL that makes the whole ordeal a lot easier. I work for the company that developed it and we use it on the vast majority of our projects.
The Access Control Tool for Adobe Experience Manager is another option that has worked for me in the past.
I have multiple ASP.NET MVC and Web API apps hosted on Azure.
I use two different stages for deployment: stage and PROD. Basically, I want two different folders for logs: logs-stage and logs-PROD. This works well, based on my web.config and properties set directly in Azure.
The issue is that each time I deploy, all the previous logs are deleted. How can I avoid that?
My NLog config looks like the following:
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<targets>
<target xsi:type="File" name="f" fileName="${basedir}/logs-${appsetting:name=Version:default=DEV}/website.${shortdate}.log"
layout="${longdate} ${uppercase:${level}} ${logger} - ${message}" />
<target name="email-Errors" xsi:type="Mail"
smtpServer="smtp.sendgrid.net"
smtpPort="myPort"
enableSsl="false"
smtpUsername="myUsername"
smtpPassword="myPassword"
smtpAuthentication="Basic"
from="myEmail"
to="${appsetting:name=WEBSITE_EMAIL_DEVELOPERS:default=myEmail}"
subject="[${appsetting:name=Version:default=DEV}][WEB][${uppercase:${level}}]"
layout="${longdate} ${uppercase:${level}} ${logger} - ${message}"
html="false" />
</targets>
<rules>
<logger name="*" minlevel="Trace" writeTo="f" />
<logger name="*" minlevel="Error" writeTo="email-Errors" />
</rules>
</nlog>
According to your description, it seems that publishing the WebApp removes the additional files in the Azure WebApp. If that is the case, please try unchecking the [Remove additional files at destination] option when publishing the WebApp.
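If you publish from a profile or a build server rather than the Visual Studio dialog, that checkbox corresponds, as far as I know, to the SkipExtraFilesOnServer property in the .pubxml publish profile. A minimal sketch (profile shortened to the relevant property):
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <!-- True = do NOT remove additional files at the destination, so existing log files survive a deploy -->
    <SkipExtraFilesOnServer>True</SkipExtraFilesOnServer>
  </PropertyGroup>
</Project>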
Updated:
From the blog, we can tell that when we swap slots, only the DNS pointers change. So if you see the NLog files in production after the swap, the log files are now in the staging slot; they were not deleted. The following is the snippet from the blog:
In short, the Swap operation will exchange the website's content between the 2 deployment slots.
It explains what is swapped and what is not, but note that swap is not about copying the content of the website but more about swapping DNS pointers.
I am new to using NLog with ASP.NET Core, so I have followed the guide here:
https://github.com/NLog/NLog.Web/wiki/Getting-started-with-ASP.NET-Core-(project.json)
I have created the following nlog.config file at the root of the project directory:
<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
autoReload="true"
internalLogLevel="Warn"
internalLogFile="c:\temp\internal-nlog.txt">
<extensions>
<add assembly="NLog.Web.AspNetCore"/>
</extensions>
<!-- define various log targets -->
<targets>
<!-- write logs to file -->
<target xsi:type="File" name="allfile" fileName="${basedir}\nlog-all-${shortdate}.log"
layout="${longdate}|${event-properties:item=EventId.Id}|${logger}|${uppercase:${level}}|${message} ${exception}" />
<target xsi:type="File" name="ownFile-web" fileName="${basedir}\nlog-own-${shortdate}.log"
layout="${longdate}|${event-properties:item=EventId.Id}|${logger}|${uppercase:${level}}| ${message} ${exception}|url: ${aspnet-request-url}|action: ${aspnet-mvc-action}" />
<target xsi:type="Null" name="blackhole" />
</targets>
<rules>
<!--All logs, including from Microsoft-->
<logger name="*" minlevel="Trace" writeTo="allfile" />
<!--Skip Microsoft logs and so log only own logs-->
<logger name="Microsoft.*" minlevel="Trace" writeTo="blackhole" final="true" />
<logger name="*" minlevel="Trace" writeTo="ownFile-web" />
</rules>
</nlog>
Inside a controller, I call a line like this one:
_logger.LogInformation("Entered CustomerRange method");
which returns the following in the output window in Visual Studio:
CustomerMgmtAPI.Controllers.CustomerController:Information: Entered CustomerRange method
However, the actual log files are never created by NLog. I was wondering if someone could point out the error in the NLog configuration here, since I have been reviewing the NLog documentation for ASP.NET Core projects and I can't find the error myself.
So the actual fix to the problem was to remove the first line from the nlog.config file:
<?xml version="1.0" encoding="utf-8" ?>
I removed this line and everything started working as expected. I also noticed that Visual Studio was giving me these errors when that line was present:
Invalid token 'Text' at root level of document.
Unexpected XML declaration. The XML declaration must be the first node in the document and no white space characters are allowed to appear before it.
It seems in this case that the NLog tutorial is broken, as I just took over this file from the sample for ASP.NET Core. I am using VS2017, so perhaps there is an incompatibility with this version of VS?
The dirty little secret about using NLog with ASP.NET Core is that you can configure and create logs just as you did in ASP.NET and ASP.NET MVC. You just use the regular NLog NuGet package like you normally would.
Just create an NLog.config in your root, etc. You don't even have to make any extra configurations in the config or elsewhere to get it to work. You just reference NLog in your class and then create a logger with the LogManager.
What this means is that you don't have all of the wireup in Program.cs etc.
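A minimal sketch of what that looks like in a controller, assuming the regular NLog package is installed and an NLog.config sits in the application root (the controller and action names are only illustrative):
using Microsoft.AspNetCore.Mvc;
using NLog;

public class CustomerController : Controller
{
    // Plain NLog logger, independent of the ASP.NET Core ILogger wire-up
    private static readonly Logger Logger = LogManager.GetCurrentClassLogger();

    [HttpGet]
    public IActionResult CustomerRange()
    {
        Logger.Info("Entered CustomerRange method");
        return Ok();
    }
}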
I used the following command to get the policies/rules from a deployed application on the BizTalk server.
BTSTask.exe ListApp -ApplicationName:"EAISolution" -ResourceSpec:"c:\EAISolution.PolicyInfo.xml" /Server:VHYDTRBELSUP-02 /Database:BizTalkMgmtDb
From the above command I got the following output:
<?xml version="1.0" encoding="utf-16" ?>
<ResourceSpec xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" ApplicationName="EAISolution" xmlns="http://schemas.microsoft.com/BizTalk/ApplicationDeployment/ResourceSpec/2004/12">
<Resources>
<Resource Type="System.BizTalk:BizTalkAssembly" Luid="EAIOrchestration, Version=1.0.0.0, Culture=neutral, PublicKeyToken=97e0f507fd7fd10d" />
<Resource Type="System.BizTalk:BizTalkAssembly" Luid="EAIServices, Version=1.0.0.0, Culture=neutral, PublicKeyToken=97e0f507fd7fd10d" />
<Resource Type="System.BizTalk:BizTalkAssembly" Luid="FFSchemasTest, Version=1.0.0.0, Culture=neutral, PublicKeyToken=97e0f507fd7fd10d" />
<Resource Type="System.BizTalk:Rules" Luid="RULE/ProcessPurchaseOrder/1.0" />
<Resource Type="System.BizTalk:BizTalkBinding" Luid="Application/EAISolution" />
</Resources>
</ResourceSpec>
and from the BizTalk server I got the output below using the policy export in BizTalk Server Administration:
<?xml version="1.0" encoding="utf-8" ?>
<brl xmlns="http://schemas.microsoft.com/businessruleslanguage/2002">
<ruleset name="ProcessPurchaseOrder">
<version major="1" minor="0" description="" modifiedby="username" date="2013-05- 27T12:04:55.6121122+05:30" />
<configuration />
<bindings>
<xmldocument ref="xml_31" doctype="RuleTest.PO" instances="16" selectivity="1" instance="0">
<selector>/*[local-name()='PurchaseOrder' and namespace-uri() ='http://EAISolution.PurchaseOrder']/*[local-name()='Item' and namespace-uri()='']</selector>
<selectoralias>/PurchaseOrder/Item</selectoralias>
<schema>....\PO.xsd</schema>
</xmldocument>
<xmldocument ref="xml_32" doctype="RuleTest.PO" instances="16" selectivity="1" instance="0">
<selector>/*[local-name()='PurchaseOrder' and namespace-uri()='http://EAISolution.PurchaseOrder']
</selector>
<selectoralias>/PurchaseOrder</selectoralias>
<schema>....\PO.xsd</schema>
</xmldocument>
</bindings>
<rule name="ApprovalRule" priority="0" active="true">
<if>
<compare operator="less than or equal to">
<vocabularylink uri="3f0e9bcc-6212-4e6a-853c-e517f157a626" element="d4eb2deb-06d3-42c4-af49-ceb21331b1cc" />
<lhs>
<function>
<xmldocumentmember xmldocumentref="xml_31" type="int" sideeffects="false">
<field>*[local-name()='Quantity' and namespace-uri()='']</field>
<fieldalias>Quantity</fieldalias>
</xmldocumentmember>
</function>
</lhs>
<rhs>
<constant>
<int>500</int>
</constant>
</rhs>
</compare>
</if>
<then>
<function>
<xmldocumentmember xmldocumentref="xml_32" type="string" sideeffects="true">
<field>*[local-name()='Status' and namespace-uri()='']</field>
<fieldalias>Status</fieldalias>
<argument>
<constant>
<string>Approved</string>
</constant>
</argument>
</xmldocumentmember>
</function>
</then>
</rule>
</ruleset>
</brl>
So please let me know how to get the second output using the command line.
BTSTask will only export the policy as part of an MSI (see below).
You could then extract the MSI (see How to extract msu/msp/msi files from the command line) to get the policy file.
From How to Import a Policy
BTSTask does not provide a specific command for importing (or exporting) policies; however you can use the ExportApp command of BTSTask to selectively export only the policies in an application that you want, including no other application artifacts. Then you can use the ImportApp command to import the .msi file into an application in a different BizTalk group. This is the approach described in this topic. When you do this, the policy is automatically imported and published in the BizTalk group and added to the specified application.
The steps below will export the policy, but as part of an MSI.
From How to Export a Policy
Use the BTSTask ListApp command with the /ResourceSpec option to generate an XML file that lists the artifacts in the BizTalk application from which you want to export a policy, as described in ListApp Command.
Edit the XML file generated in the previous step, deleting all of the artifacts except for the policy or policies that you want to export.
Use the BTSTask ExportApp command, and specify the modified XML file for the /ResourceSpec parameter. For more information, see ExportApp Command.
BTSTask exports the specified policies and all of their associated vocabularies into an application .msi file.
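For example, a sketch using the values from your ListApp call: first edit c:\EAISolution.PolicyInfo.xml so that the Resources element contains only the policy entry,
<Resource Type="System.BizTalk:Rules" Luid="RULE/ProcessPurchaseOrder/1.0" />
and then export just that resource into an MSI:
BTSTask.exe ExportApp -ApplicationName:"EAISolution" -Package:"c:\EAISolution.Policies.msi" -ResourceSpec:"c:\EAISolution.PolicyInfo.xml" /Server:VHYDTRBELSUP-02 /Database:BizTalkMgmtDb
The resulting MSI can then be brought into another BizTalk group with the ImportApp command mentioned above.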
I have installed Tridion UI and am getting the warning below in the log file. Apart from this warning, nothing wrong is logged in the log files, even in debug mode.
“WARN AmbientDataContext - There is no current ambient data context -
the ambient data framework is not properly initialised”
This warning gets logged in my Session Preview web service cd_core.2012-11-11.log file, and also in the same log file of the staging web application.
I suspect that due to this warning I am not getting an updated preview of the page in the UI interface. Please see the attached screenshot below for the UI error.
I also tried to update the preview by clicking the "update the page preview" button, but no luck.
To resolve this error I have followed almost all the related answers on Stack Overflow.
For example, referring to the answer in the question below:
Tridion UI - Preview Not Updating
we are correct on that point, as content always gets published in the right place. I would also like to explore a similar point: the setting in the cd_dynamic_conf.xml of the Session Preview service is something like the below –
<URLMappings>
<StaticMappings>
<Publications>
<Publication Id="241">
<Host Domain="xyz" Port="80" Protocol="http" Path="/" />
</Publication>
<Publication Id="121">
<Host Domain="xyz" Port="80" Protocol="http" Path="/" />
</Publication>
</Publications>
</StaticMappings>
<StorageMapping IdentifyPublicationByProperty="publicationUrl"/>
</URLMappings>
And in that of the staging application:
<URLMappings>
<StaticMappings>
<Publications>
<Publication Id="241">
<Host Domain="xyz" Port="80" Protocol="http" Path="/" />
</Publication>
</Publications>
</StaticMappings>
<StorageMapping IdentifyPublicationByProperty="publicationUrl"/>
</URLMappings>
And the SiteEdit markup gets created something like this –
For the component presentation:
<!-- Start SiteEdit Component Presentation: {"ID":"cp_5", "ComponentID":"tcm:240-22393", "ComponentVersion":19, "ComponentTemplateID":"tcm:240-23899-32", "IsQueryBased":false} -->
For the page settings:
<!-- SiteEdit Settings: {"PageID":"tcm:240-22507-64", "PageVersion":49, "TargetTypeID":"tcm:0-1-65438", "ComponentPresentationLocation":1} -->
Here I want to highlight a few important points:
1. I have a website in 2 languages, one with /en and one with /fr, under the same IIS directory.
In the above setting, publication ID 241 is for the /en version of my website, so I also tried the setting below:
<Publication Id="240">
<Host Domain="xyz" Port="80" Protocol="http" Path="/en" />
</Publication>
But again, no luck.
I can provide more information, such as log files, if that is required to investigate the issue.
Please help me get rid of this very irritating issue as soon as possible.
Edit 1: Please also find below the config files for the same.
cd_ambient_conf.xml for the Session Preview web service
<?xml version="1.0" encoding="UTF-8" standalone="no" ?>
<Configuration xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" Version="6.1" xsi:noNamespaceSchemaLocation="schemas/cd_ambient_conf.xsd">
<!-- Cookies settings -->
<!-- <Cookies> <Cookie Type="Tracking" Name="myTrackingCookie" Path="/"/> <Cookie Type="Session" Name="mySessionCookie" Path="/"/> </Cookies> -->
<Cartridges>
<!-- Example cartridge definition -->
<!--
<Cartridge File="cd_ambient_cartridge_conf.xml"/>
-->
<Cartridge File="cd_webservice_preview_cartridge.xml"/>
</Cartridges>
</Configuration>
cd_ambient_conf.xml for the staging website
<?xml version="1.0" encoding="UTF-8"?>
<Configuration Version="6.1"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="schemas/cd_ambient_conf.xsd">
<Cartridges>
<Cartridge File="cd_webservice_preview_cartridge.xml"/>
<Cartridge File="footprint_cartridge_conf.xml"/>
</Cartridges>
<ClaimStoreProvider>com.tridion.siteedit.preview.PreviewClaimStoreProvider</ClaimStoreProvider>
</Configuration>
cd_ambient_cartridge_conf.xml for the staging website
<CartridgeDefinition>
<ClaimProcessorDefinitions>
<ClaimProcessorDefinition Uri="tcd:claimprocessor:example:userdetails" ImplementationClass="com.tridion.ambientdata.processing.ExampleClaimProcessor1"
Description="Example claim processor that gets user details.">
<RequestStart>
<InputClaims>
<ClaimDefinition Uri="tcd:claim:userid" />
</InputClaims>
<OutputClaims>
<ClaimDefinition Uri="tcd:claim:username" />
<ClaimDefinition Uri="tcd:claim:usersex" />
<ClaimDefinition Uri="tcd:claim:userage" />
</OutputClaims>
</RequestStart>
</ClaimProcessorDefinition>
<ClaimProcessorDefinition Uri="tcd:claimprocessor:example:example2"
ConfigProviderClass="com.tridion.ambientdata.processing.ExampleClaimProcessorConfigProvider" />
</ClaimProcessorDefinitions>
</CartridgeDefinition>
NOTE: The reason why we have cd_ambient_cartridge_conf.xml for the staging website and not for the Session Preview website is that, while setting up the UI, the staging website was throwing an error saying it expected a cd_ambient_cartridge_conf.xml file, so we put a sample file in this website, even though having this file is nowhere mentioned in the documentation. In the case of the Session Preview website, it was not expecting any such file.
You can safely ignore the WARN message in the log. The "Preview is not up to date" message is unrelated to this WARN message.
If you are using virtual paths for your websites (like /en, /fr, etc.), then you need to have the hotfix "CD_2011.1.1.81686" installed on the preview application. You do not need to add the virtual paths to the cd_dynamic_conf.xml file; you should keep the path as just "/".
Lastly, related to "preview is not up to date": you also need to add the cd_ambient_cartridge_conf.xml to your web service, in addition to your preview. I don't believe this is documented, but AFAIR you need to add this; I don't have my VM readily accessible, but I can confirm this later. Please make sure you comment out all the example ClaimProcessors, for example as sketched below.
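A minimal sketch, based on the file you posted, with the example claim processors commented out (I am assuming the surrounding CartridgeDefinition root element; verify it against your staging copy):
<?xml version="1.0" encoding="UTF-8"?>
<CartridgeDefinition>
    <ClaimProcessorDefinitions>
        <!-- example claim processors disabled:
        <ClaimProcessorDefinition Uri="tcd:claimprocessor:example:userdetails" ...>
            ...
        </ClaimProcessorDefinition>
        <ClaimProcessorDefinition Uri="tcd:claimprocessor:example:example2" ... />
        -->
    </ClaimProcessorDefinitions>
</CartridgeDefinition>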
Also, make sure you have Session Preview enabled in the CMS, in the Inline Editing settings (Disable Session Preview should be NO).
Hope this helps.