How to run custom codeunit when user uninstalls custom extension - dynamics-business-central

We have an extension that creates a Job Queue Entry to ensure that custom table values based on certain rolling dates stay in sync. The extension also creates some PermissionSets. When the extension is uninstalled, the Job Queue Entry needs to be deleted and the PermissionSets need to be removed from any users to which they were assigned.
The Codeunit object exposes a Subtype property, but the non-test values are only Normal, Install, and Upgrade. I was expecting an Uninstall and/or Unpublish subtype as well.
How do I detect that a user has uninstalled the extension so I can remove the items in question?

The feature you are looking for does not exist at the moment. It is on the product backlog as far as I know.
There are several issues on GitHub where people request the feature:
Issue 191
Issue 1271

Related

How can I update a DynamoDB GSI when the projection type is changed?

I am using the Serverless Framework to manage AWS infrastructure, and I have defined a DynamoDB table with a GSI. Now I need to update the projection type of the GSI, and I know that this is not allowed. So I manually deleted the GSI from the DynamoDB table, but I still get the error below when running sls deploy:
An error occurred: DeviceTable - Cannot update GSI's properties other than Provisioned Throughput. You can create a new GSI with a different name..
What I don't understand is why it still complains about the GSI's properties when the GSI has already been removed. Is there some place that remembers the GSI? What is the right way to do this?
I had the same error. I work with the CDK but I think the problem is the same with the Serverless Framework. I solved it as follows:
Delete the affected indexes from your configuration file (serverless.yaml or CDK files).
Deploy the complete stack, not only one function
Add the new or changed index again
Deploy the complete stack, not only one function
This is how it worked for me.
PS: you can't replace the index with a new one in a single update; as mentioned above, you need to delete the old one first and then add the new one. Otherwise you will end up with an error message like this one: "Cannot perform more than one GSI creation or deletion in a single update"
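For reference, here is a minimal sketch of where such an index typically lives in a serverless.yaml (the table name follows the DeviceTable from the error above; the attribute and index names are hypothetical). The GlobalSecondaryIndexes block is what you would delete in the first deploy and re-add, with the new ProjectionType, in the second:

resources:
  Resources:
    DeviceTable:
      Type: AWS::DynamoDB::Table
      Properties:
        TableName: DeviceTable
        BillingMode: PAY_PER_REQUEST
        AttributeDefinitions:
          - AttributeName: deviceId
            AttributeType: S
          - AttributeName: ownerId
            AttributeType: S
        KeySchema:
          - AttributeName: deviceId
            KeyType: HASH
        GlobalSecondaryIndexes:        # delete this whole block, deploy, then re-add it with the new projection
          - IndexName: byOwner
            KeySchema:
              - AttributeName: ownerId
                KeyType: HASH
            Projection:
              ProjectionType: ALL      # the property that cannot be changed in place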
I have faced the same issue. I tried to change my ProjectionType, but after that my Serverless deployment started failing with the same error.
As mentioned in the question, I also deleted my GSI from DynamoDB, but that did not help either.
After a lot of Googling I did not find a proper answer, but this is how I finally resolved it.
When you get this error, you also get a link to the CloudFormation stack which you are trying to update with your GSI changes.
Follow this link to your CloudFormation stack, then click on the Template tab.
After that, click the View in Designer button.
It will load your stack's template in the Designer.
In the window at the bottom it will ask you to choose the template language; choose YAML.
Locate the DynamoDB table to which you are trying to apply the GSI. Here you will see the previous GSI settings.
Now carefully remove only the GSI settings, without removing anything else from the DynamoDB table definition.
At the top left you will see options such as Redo, Undo, and Create Stack; there you will also find a Save link.
Save the template to a local file on your computer.
Now go back to the Template tab and this time click the Update button.
Choose the option Replace current template and then Upload a template file. Upload the template file you saved earlier and click Next to apply it.
Now wait a while for CloudFormation to update the stack with the new template.
That's it. Now if you run your yarn sls command again, it will no longer stop you, and your new DynamoDB GSI projection settings will be deployed.
One more tip: it takes some time to create a new GSI in the cloud, so be patient and wait for the status Active on your DynamoDB table under the Indexes tab.
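If you prefer to check the index status from the command line rather than the console, something along these lines should work (using the table name from the error above):

aws dynamodb describe-table --table-name DeviceTable --query "Table.GlobalSecondaryIndexes[].[IndexName,IndexStatus]"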

Azure DevOps publishing to own feed suddenly results in 403 Forbidden

I have been using Azure DevOps for a project for quite some time, but suddenly publishing to my own organisation/collection feed results in a 403.
I created a feed and I can select it on the NuGet push build step, but it does not work. I created a new feed to publish the NuGet packages to, and this works perfectly again. It seems to me like a token expired, but I never created one or used it to authenticate. I also do not want to change my NuGet feed to the new one, as I want to use older packages as well.
This is the build pipeline:
And this is the stack trace:
Active code page: 65001 SYSTEMVSSCONNECTION exists true
SYSTEMVSSCONNECTION exists true SYSTEMVSSCONNECTION exists true
[warning]Could not create provenance session: {"statusCode":500,"result":{"$id":"1","innerException":null,"message":"User
'a831bb9f-aef5-4b63-91cd-4027b16710cf' lacks permission to complete
this action. You need to have
'ReadPackages'.","typeName":"Microsoft.VisualStudio.Services.Feed.WebApi.FeedNeedsPermissionsException,
Microsoft.VisualStudio.Services.Feed.WebApi","typeKey":"FeedNeedsPermissionsException","errorCode":0,"eventId":3000}}
Saving NuGet.config to a temporary config file. Saving NuGet.config to
a temporary config file. [command]"C:\Program Files\dotnet\dotnet.exe"
nuget push d:\a\1\a\Microwave.0.13.3.2019072215-beta.nupkg --source
https://simonheiss87.pkgs.visualstudio.com/_packaging/5f0802e1-99c5-450f-b02d-6d5f1c946cff/nuget/v3/index.json
--api-key VSTS error: Unable to load the service index for source https://simonheiss87.pkgs.visualstudio.com/_packaging/5f0802e1-99c5-450f-b02d-6d5f1c946cff/nuget/v3/index.json.
error: Response status code does not indicate success: 403
(Forbidden - User 'a831bb9f-aef5-4b63-91cd-4027b16710cf' lacks
permission to complete this action. You need to have 'ReadPackages'.
(DevOps Activity ID: 2D81C262-96A3-457B-B792-0B73514AAB5E)).
[error]Error: The process 'C:\Program Files\dotnet\dotnet.exe' failed with exit code 1
[error]Packages failed to publish
[section]Finishing: dotnet push to own feed
Is there an option I am overlooking where I have to authenticate myself somehow? It is just so weird.
"message":"User 'a831bb9f-aef5-4b63-91cd-4027b16710cf' lacks
permission to complete this action. You need to have 'ReadPackages'.
According to this error message, the error you received is caused by the fact that the user (a831bb9f-aef5-4b63-91cd-4027b16710cf) does not have access permission to your feed.
Also, as I checked from the backend, a831bb9f-aef5-4b63-91cd-4027b16710cf is the VSID of your Build Service account. So please try adding this user (Micxxxave Build Service (sixxxxss87)) to your target feed, and assign this user the Contributor role or higher permissions on the feed.
In addition, here is the doc you can refer to:
There is a new UI in the Feed Permissions:
To further expand on Merlin's solution and the related links (specifically the one about scope): if your solution has only ONE project within it, Azure Pipelines seems to automatically restrict the job authorization scope to that project. As a result, the build has no visibility of any services outside of it, including your own private NuGet feeds.
Solutions with multiple projects automatically have their scope unlocked, giving build agents visibility of your private NuGet feeds.
I've found the easiest way to remove the scope restrictions on single project builds is to:
In the pipelines project, click the "Settings" cog at the bottom left of the screen.
Go to Pipelines > Settings
Uncheck "Limit job authorization scope to current project"
Hey presto, your 403 error during your builds involving private NuGet feeds should now disappear!
I want to add a bit more information just in case somebody ends up having the same kind of problem. All the information shared by the other users is correct, but there is one more caveat to take into consideration.
The project-level settings are superseded by the organization settings. If you find yourself unable to modify the settings, or they are grayed out, click the "Azure DevOps" logo at the top left of the screen.
Click on Organization Settings at the bottom left.
Go to Pipelines --> Settings and verify the current configuration.
When I created my organization it was limiting the scope at the organization level. It took me a while to realize it was superseding the project.
Still wondering where that "Limit job authorization scope to current project" setting is? It took me a while to find it: it's in the project settings, under Pipelines > Settings.
It may not be immediately obvious or intuitive, but this error will also occur when the project your pipeline is running under is public, but the feed it is accessing is not. That might be the case, for instance, when accessing an organization-level feed.
In that scenario, there are three possible resolutions:
Make the feed public, in which case authentication isn't required; or
Make the project private, thus forcing the service to authenticate; or
Enable the Allow project-scoped builds option under your feed permissions.
The instructions for the last option are included in Merlin Liang - MSFT's excellent answer, but the other options might be preferable depending on your requirements.
At minimum, this hopefully provides additional insight into the types of circumstances that can lead to this error.
Another thing to check, if you are using a YAML file for your pipelines, is whether the feed name is correct.
I know this might seem like a minor point, but I spent a long time debugging the "...lacks permission to complete this action. You need to have 'AddPackage'." error, only to find I had referenced the wrong feed in my azure-pipelines.yaml file.
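For illustration, in a YAML pipeline the feed is referenced in the push step roughly like this (a sketch only; the feed and project names here are hypothetical), so it is worth double-checking that publishVstsFeed points at the feed you actually granted permissions on:

- task: DotNetCoreCLI@2
  displayName: dotnet push to own feed
  inputs:
    command: push
    packagesToPush: '$(Build.ArtifactStagingDirectory)/*.nupkg'
    nuGetFeedType: internal
    publishVstsFeed: 'MyProject/my-feed'   # hypothetical feed; must match the feed the build service has access to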
If you don't want to or cannot change the project-level settings as described above,
you can set this per feed by clicking 'Allow project-scoped builds' (for me it was greyed out as it was already enabled).
That's different from the accepted answer, as you don't have to explicitly add the user and set the permissions.
Adding these two permissions solved my issue.
Project Collection Build Service (PROJECT_NAME)
[PROJECT_NAME]\Project Collection Build Service Accounts
https://learn.microsoft.com/en-us/answers/questions/723164/granting-read-privileges-to-azure-artifact-feed.html
If I clone an existing pipeline that works and modify it for a new project the build works fine.
But if I try to create a new pipeline I get the 403 forbidden error.
This may not be a solution, but I have tried everything else suggested here and elsewhere and I still cannot get it to work.
Cloning worked for me.

Telepat API Doc - Create App

I probably have multiple newbie questions, but I am unsure how to work with Telepat based on just the documentation.
While creating an app, we are expected to give a key; however, the field name is keys. Is there a reason for this? I am assuming that it would have to be unique, but the documentation does not mention whether that is the case, or what error we should expect if the rule is violated.
Referring to http://docs.telepat.io/api.html#api-Admin-AdminCreateContext: Admin Create does not seem to require authentication, even when doing it from the API. It also omits the response on success. Just a 200 may be sufficient, but..
There is no way to get the App ID. What am I missing?
First of all, what version of Telepat are you using? Changes to the infrastructure happen often. The latest version is 0.2.5 (although I'd try to download from the develop branch, since improvements and bug fixes appear on a day-by-day basis).
You can add multiple API keys for an application and distribute them in whatever way you want. At the moment the system does not complain if you add a key that already exists.
That may be because of an old Telepat build; I can't go into detail on this.
admin/app/create returns the application object, including its ID. Also, /admin/apps returns a list of all the applications you have.

Phabricator restrict git push

I want my team, including myself, to review each other's commits. No commit, including mine, should be pushed to the repo until it has been reviewed by another team member. I got kind of lost in the Phabricator documentation, so I'm asking here: is there any way to set up that kind of workflow?
You can only restrict pushes to repositories hosted by Phabricator. If your repository is hosted elsewhere (like GitHub), Phabricator obviously can't prevent users from pushing to it.
To restrict pushes, create a new Herald rule (in the Herald application), like this:
Create a new "Commit Hook: Commit Content" rule.
Select "Global" as the rule type.
Then configure the rule like this:
When [all of] these conditions are met:
[Accepted Differential revision][does not exist]
Take these actions every time this rule matches:
[Block change with message][Review is required for all changes.]
You may want to use additional conditions like this, to run the rule only in certain repositories:
[Repository][is any of][ ... list of review-required repositories ... ]
Or a condition like this, to let users bypass the rule by writing some string like "#bypass-review" in a message in an emergency:
[Body][does not contain][#bypass-review]
If you add a bypass like this, you can mention it in the rejection message.
It seems that you want a Pre-Commit Code Review. We set this up by doing the following (we use Git repos. If you use some other type, these steps may be different):
Setup a Herald rule
New rule for: Commit Hook: Commit Content
If you only want one repo, you can use Rule Type: Object, however, we used Global
For the conditions: we selected Accepted Differential revision and does not exist
Action: Block change with message. For the message, we refer them to an article that walks them through using Arcanist.
Each project will need a .arcconfig with at least this line:
{
"phabricator.uri": "http://your.phabricator.url"
}
Day-to-day, developers are going to have to use Arcanist.
Developer creates a local branch.
Changes and commits code to the local branch.
When finished, run arc diff [base_branch_name]
This will create a Differential revision that will allow another developer to code review.
If changes are needed, the developer checks out his local branch, makes changes, makes commits, and re-runs arc diff [base_branch_name] to update the diff.
After all revisions are done, run arc land [local_branch_name] --onto [base_branch_name]
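Put together, a rough sketch of that day-to-day loop might look like this (the branch names are hypothetical):

git checkout -b my-feature            # developer creates a local branch
git commit -am "Implement my feature" # changes and commits code to the local branch
arc diff master                       # creates the Differential revision for review
# if changes are requested: edit, commit again, and re-run arc diff master to update the diff
arc land my-feature --onto master     # after the revision is accepted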
I hope this helps. Also, Phabricator developers hang out in the freenode.net IRC channel called #phabricator. Come join the community; they have always been very helpful to me.

Fossil DVCS: difference between the update and checkout commands

After reading the built-in help, it seems to me that both commands can be used to modify the workspace to match a certain revision, but I don't understand the differences between update and checkout. Please include some trivial workflows in your answer which show when update/checkout are appropriate.
The first major difference is that if you have a remote URL set, update will first pull the latest artifacts from the remote repository.
Another difference is that if you have uncommitted changes, checkout will not run (unless you force it), whereas update will retain your changes and reapply them. With update you can therefore integrate changes from other users before committing.
So:
Update is what you need when you collaborate on a project, in order to prevent forks.
Checkout lets you deploy a particular version.
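As a rough illustration (the version names are hypothetical):

fossil update trunk          # collaborating: pull from the remote (if a URL is set) and merge into your uncommitted work
fossil checkout release-1.2  # deploying: switch the workspace to an exact version; refuses if you have local changes, unless forced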
