SuluArticleBundle installed, but Articles option not available in Admin menu - symfony

I started working with Sulu and tried to implement the optional ArticleBundle. I followed this documentation, and Composer installed elasticsearch 5.0.6 along with the bundle. All configuration files are set up as described in the documentation, and although the server starts, there is still no "Articles" option like in the Sulu demo (you can log in with admin / admin and check it out).
I also encountered the error "No alive nodes in your cluster" while running the last commands, which are related to Elasticsearch. What am I missing?

Have you noticed that the SuluArticleBundle has a dependency on Elasticsearch (which is a third-party application)? Elasticsearch has to be started before running Sulu with the ArticleBundle. Additionally, you have to add the security context to the user role: open the User Role of your user and simply add it.
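The "No alive nodes in your cluster" error typically means the Elasticsearch daemon is not reachable at all. A minimal sketch of checking this before running the bundle's setup commands, assuming a default install listening on localhost:9200 (service name, port, and the final console command are examples that depend on your setup):

```shell
# Start Elasticsearch first (service name depends on your install),
# e.g. via systemd:
sudo systemctl start elasticsearch

# Verify the cluster answers before running Sulu's article commands;
# a JSON response with "status": "green" or "yellow" means nodes are alive:
curl -s http://localhost:9200/_cluster/health

# Only then run the bundle's index setup, for example:
# php bin/console ongr:es:index:create
```

If the curl call hangs or is refused, fix Elasticsearch first; no Sulu-side configuration will help until the cluster responds.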

Related

Alfresco share repository section not working

I installed a fresh Alfresco 7.1.1. Everything works fine except that the repository section in Share is not working properly. I am able to create sites and users, but when I go to the repository or a site's document library, it shows the following.
I sent a document to the repository and it was created successfully. I can even access the document using the Admin Console node browser and see all the folders, such as company_home.
There are no documents or icons to create new content. I tried to drag and drop the document, but that did not work either.
My Tomcat is running on 8081, and I have replaced all the localhost:8080 ports in share-config-custom.xml with localhost:8081.
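As a side note on that port change, the replacement in share-config-custom.xml can be done in one pass. A minimal sketch, operating on a local stand-in file since the real location under Tomcat's shared/classes/alfresco/web-extension is install-specific:

```shell
# Create a tiny stand-in for share-config-custom.xml (in a real install
# this file lives under Tomcat's shared/classes/alfresco/web-extension).
cat > share-config-custom.xml <<'EOF'
<config evaluator="string-compare" condition="Remote">
  <remote>
    <endpoint>
      <endpoint-url>http://localhost:8080/alfresco</endpoint-url>
    </endpoint>
  </remote>
</config>
EOF

# Point every endpoint at the non-default Tomcat port (GNU sed):
sed -i 's/localhost:8080/localhost:8081/g' share-config-custom.xml

# Confirm nothing still references the old port:
grep -c 'localhost:8081' share-config-custom.xml
```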
It turned out that I had not installed Alfresco-Share-Services.amp. However, the logs showed no warnings about Alfresco-Share-Services.amp being missing.
Thank you, everyone, for your time.
This will also occur if you accidentally install Alfresco-Share-Services.amp into share.war. It needs to be installed into the alfresco.war file. The module will show as installed, but it will not allow you to see the repositories shown in the screenshot in this question. It took me a little while to debug my misstep during installation, so I am trying to save others some time if they make the same mistake.
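For reference, AMPs are applied with Alfresco's Module Management Tool. A sketch of installing into the correct WAR, with all paths as placeholders for your install (stop Tomcat before running this):

```shell
# Install the AMP into alfresco.war -- not share.war.
# Paths are examples; alfresco-mmt.jar ships with the Alfresco install.
java -jar bin/alfresco-mmt.jar install \
    amps/alfresco-share-services.amp \
    tomcat/webapps/alfresco.war

# Verify which WAR the module actually ended up in:
java -jar bin/alfresco-mmt.jar list tomcat/webapps/alfresco.war
```

The list command is the quick way to catch the share.war mistake described above: if the module shows up under share.war but not alfresco.war, it was installed in the wrong place.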

Azure Devops publishing to own feed suddenly results in 403 forbidden

I have been using Azure DevOps for a project for quite some time, but suddenly publishing to my own organisation/collection feed results in a 403.
I created a feed, and I can select it in the NuGet push build step, but it does not work. I created a new feed to publish the NuGet packages to, and that works perfectly again. It seems to me like a token expired, but I never created one or used one to authenticate. I also do not want to switch my NuGet feed to the new one, as I want to use the older packages as well.
This is the buildpipeline:
And this is the stack trace:
Active code page: 65001 SYSTEMVSSCONNECTION exists true
SYSTEMVSSCONNECTION exists true SYSTEMVSSCONNECTION exists true
[warning]Could not create provenance session: {"statusCode":500,"result":{"$id":"1","innerException":null,"message":"User
'a831bb9f-aef5-4b63-91cd-4027b16710cf' lacks permission to complete
this action. You need to have
'ReadPackages'.","typeName":"Microsoft.VisualStudio.Services.Feed.WebApi.FeedNeedsPermissionsException,
Microsoft.VisualStudio.Services.Feed.WebApi","typeKey":"FeedNeedsPermissionsException","errorCode":0,"eventId":3000}}
Saving NuGet.config to a temporary config file. Saving NuGet.config to
a temporary config file. [command]"C:\Program Files\dotnet\dotnet.exe"
nuget push d:\a\1\a\Microwave.0.13.3.2019072215-beta.nupkg --source
https://simonheiss87.pkgs.visualstudio.com/_packaging/5f0802e1-99c5-450f-b02d-6d5f1c946cff/nuget/v3/index.json
--api-key VSTS error: Unable to load the service index for source https://simonheiss87.pkgs.visualstudio.com/_packaging/5f0802e1-99c5-450f-b02d-6d5f1c946cff/nuget/v3/index.json.
error: Response status code does not indicate success: 403
(Forbidden - User 'a831bb9f-aef5-4b63-91cd-4027b16710cf' lacks
permission to complete this action. You need to have 'ReadPackages'.
(DevOps Activity ID: 2D81C262-96A3-457B-B792-0B73514AAB5E)).
[error]Error: The process 'C:\Program Files\dotnet\dotnet.exe' failed with exit code 1
[error]Packages failed to publish
[section]Finishing: dotnet push to own feed
Is there an option I am overlooking where I have to authenticate myself somehow? It is just so weird.
"message":"User 'a831bb9f-aef5-4b63-91cd-4027b16710cf' lacks
permission to complete this action. You need to have 'ReadPackages'.
According to this error message, the error you received is caused by the user (a831bb9f-aef5-4b63-91cd-4027b16710cf) not having access permission to your feed.
Also, as I checked from the backend, a831bb9f-aef5-4b63-91cd-4027b16710cf is the VSID of your Build Service account. So please try adding this user (Micxxxave Build Service (sixxxxss87)) to your target feed, and assign this user the Contributor role or higher permissions on the feed.
In addition, here is the doc you can refer to:
There is a new UI in the Feed Permissions:
To further expand on Merlin's solution and the related links (specifically the one about scope): if your solution has only ONE project in it, Azure Pipelines seems to automatically restrict the scope of the job agent to the agent itself. As a result, it has no visibility of any services outside it, including your own private NuGet repositories held in Pipelines.
Solutions with multiple projects automatically have their scope unlocked, giving build agents visibility of your private NuGet feeds held in Pipelines.
I've found the easiest way to remove the scope restrictions on single-project builds is to:
In the pipelines project, click the "Settings" cog at the bottom left of the screen.
Go to Pipelines > Settings
Uncheck "Limit job authorization scope to current project"
Hey presto, your 403 error during your builds involving private NuGet feeds should now disappear!
I want to add a bit more information in case somebody ends up having the same kind of problem. All the information shared by the other users is correct, but there is one more caveat to take into consideration.
The project settings are superseded by the organization settings. If you find yourself unable to modify the settings, or they are grayed out, click the "Azure DevOps" logo at the top left of the screen.
Click on Organization Settings at the bottom left.
Go to Pipeline --> Settings and verify the current configuration.
When I created my organization, it was limiting the scope at the organization level. It took me a while to realize that this was superseding the project settings.
Still wondering where that "Limit job authorization scope to current project" setting is? It took me a while to find it: it is in the project settings. The screenshot below should help.
It may not be immediately obvious or intuitive, but this error will also occur when the project your pipeline is running under is public, but the feed it is accessing is not. That might be the case, for instance, when accessing an organization-level feed.
In that scenario, there are three possible resolutions:
Make the feed public, in which case authentication isn't required; or
Make the project private, thus forcing the service to authenticate; or
Enable Allow project-scoped builds under your feed permissions.
The instructions for the last option are included in Merlin Liang - MSFT's excellent answer, but the other options might be preferable depending on your requirements.
At minimum, this hopefully provides additional insight into the types of circumstances that can lead to this error.
Another thing to check, if you are using a YAML file for the pipeline, is whether the feed name is correct.
I know this might seem like a moot point, but I spent a long time debugging the "..lacks permission to complete this action. You need to have 'AddPackage'." error, only to find I had referenced the wrong feed in my azure-pipelines.yaml file.
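One way to double-check the feed name outside the UI is the Azure Artifacts Feed Management REST API. A sketch assuming a personal access token in AZURE_DEVOPS_PAT and a placeholder organization name (the api-version value may need adjusting for your tenant):

```shell
# List all feed names visible to this PAT, then compare them against
# the feed name referenced in azure-pipelines.yaml. YOUR_ORG is a
# placeholder for your organization name.
curl -s -u ":$AZURE_DEVOPS_PAT" \
  "https://feeds.dev.azure.com/YOUR_ORG/_apis/packaging/feeds?api-version=6.0-preview.1" \
  | python3 -c 'import json, sys; [print(f["name"]) for f in json.load(sys.stdin)["value"]]'
```

If the feed in your YAML is not in that list, the 403/permission error is about the name, not about the build service account's permissions.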
If you don't want to or cannot change the project-level settings as described here, you can set this per feed by clicking 'Allow Project-scoped builds' (greyed out for me, as it is already enabled).
That is different from the accepted answer, as you don't have to explicitly add the user and set its permissions.
Adding these two permissions solved my issue.
Project Collection Build Service (PROJECT_NAME)
[PROJECT_NAME]\Project Collection Build Service Accounts
https://learn.microsoft.com/en-us/answers/questions/723164/granting-read-privileges-to-azure-artifact-feed.html
If I clone an existing pipeline that works and modify it for a new project the build works fine.
But if I try to create a new pipeline I get the 403 forbidden error.
This may not be a solution, but I have tried everything else suggested here and elsewhere, and I still cannot get it to work.
Cloning worked for me.

Build Kaa from source code - missing panels from Admin UI

I have built a Kaa server from the source code using the guide at the following link. After that, I followed the guide at the following link to install the Kaa node service.
I am deploying the build on Ubuntu 16.04 and using MariaDB and MongoDB as the SQL and NoSQL databases respectively. I followed every step from both the guides and the server starts successfully.
I am able to navigate to the Kaa admin page using the link http://YOUR_SERVER_HOST:8080/kaaAdmin, and have created an admin user as well.
However, when I log in, I don't see all the panels. I only see the following panels.
I don't see the user, tenant, or application management panels. The tenant section only allows me to add a new tenant, where I can enter only the tenant name and nothing else.
Is there an additional step to be performed to enable the other panels? I am not sure what I am missing.
Any help is appreciated.

How to check if Drupal modules are up to date?

I am using a monitoring tool (Sensu) to execute multiple checks to know if a server has problems.
I have already written a Ruby script to check whether a WordPress installation is up to date. To do that, I connect to the server through an SSH tunnel, connect to its WordPress database, and check a table where I parse some data. For example, if response=latest, the core is up to date.
I want to do the same for Drupal, but I can't find useful data in the Drupal database that tells me whether a module or the core is up to date; I only find the version number in the system table.
Do you have an idea how I can check whether Drupal modules are up to date, if possible from a different server than the one where Drupal is installed?
Thanks.
There is a module called nagios (https://www.drupal.org/project/nagios) that will allow you to visit a "check page" and it will check the status of a number of different things that you can monitor.
I would only caution if you are using a Drupal Distribution, not all the modules get updated in a timely fashion, but if you are using the standard Drupal installation you should be fine.
There is one Nagios plugin I found which does not require any modules to be installed in Drupal. However, it requires Drush (http://www.drush.org) on the server which hosts the Drupal site:
https://github.com/cytopia/check_drupal
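If you end up scripting the check yourself, the comparison between an installed version (e.g. read from the system table) and the latest release is the simple part. A minimal sketch of that comparison step only, since fetching the two values (over SSH, via Drush, or from the update XML feed) is setup-specific:

```shell
#!/bin/sh
# Compare an installed version against the latest release using GNU
# sort's version ordering. In a real Sensu check, the two values would
# be fetched first, e.g. from the Drupal database over an SSH tunnel;
# here only the comparison itself is shown.
is_up_to_date() {
    installed="$1"
    latest="$2"
    # The highest of the two versions, by version-aware ordering:
    highest=$(printf '%s\n%s\n' "$installed" "$latest" | sort -V | tail -n 1)
    [ "$highest" = "$installed" ]
}

if is_up_to_date "7.58" "7.59"; then
    echo "OK: up to date"
else
    echo "WARNING: update available"
fi
```

Using sort -V rather than a plain string comparison matters for versions like 8.9.10 vs 8.9.2, where lexicographic ordering gives the wrong answer.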

Installing third-party Drupal modules on Azure

I've just started playing around with the new "Website" feature in Azure that allows you to create websites with just one step - and also allows you to create websites from a "Gallery", including Drupal. And I can get my Drupal site up and running, no problem. But if I try to add a third-party module (for instance, Mindtree's ODataDrupal), then I get this error message:
Installation failed! See the log below for more information.
odata_support
Error installing / updating
File Transfer failed, reason: Cannot chmod /DWASFiles/Sites/theparentsunion/VirtualDirectory0/site/wwwroot/sites/all/modules/odata_support.
More or less the same thing happens if I try to update some of the existing modules (which Drupal warns, in big red flashing letters, are out of date), except that afterwards my Drupal install is left crippled, with no way to fix it that I have been able to find.
Is this as-designed, or some limitation of the beta website integration? (Because a Drupal installation is kinda worthless if you can't add new modules to it, or update existing ones.) Or am I doing something wrong?
If you are trying to use plugins and third-party modules with a Drupal-based Windows Azure Website, the results may vary from person to person. This is mainly because the configuration needed by a specific module or plugin may or may not be supported by the Windows Azure Websites model; not every kind of custom configuration will work on Windows Azure Websites, and you may need to move to Windows Azure Virtual Machines.
As for the application-specific structure: you can open the website's FTP folder, and whatever you see there is user-configurable, so you can configure it the way you want. However, if your application tries to make changes outside this limited scope, you will hit errors like the one above.
Here is a case study where an Azure VM was used for a Drupal-based migration; it shows that for a complex application you may need to use an Azure VM rather than Azure Websites.
