Wakanda 2.2.1 Enterprise Server DataBrowser - console

Regarding the 2.2.1 Wakanda Enterprise Server, I'm looking for basic guidance on some features that were more easily accessed (via buttons) in prior Studio versions:
Data Browser - using the simple CRUD example from Wakanda. The following address does not yield a Data Browser: http://127.0.0.1:8080/walib/dataBrowser/index.html. I tried port 8081 as well; the server is published on 8080.
Administration Console - possibly due to a license key issue, I am seeing only a small subset of features on the admin console. Should I get the full administration console functionality, as with prior versions?
Thanks

I found an easy workaround for the Data Browser. I copied the whole walib directory into a folder (e.g. WebFolder) inside the backend folder. In the app.waProject in the backend I have:
<type name="backend" />
<folder path="./WebFolder/">
    <tag name="webFolder"/>
</folder>
Now I can call it at 127.0.0.1:8081/walib/dataBrowser/index.html.
I don't know if this is a good solution, but it works, and it's only for debugging...

The Data Browser was built with WAF in Wakanda V1 and has been removed, since V2 no longer supports WAF. A V2 CRUD example using Angular is available in the latest Wakanda documentation.
The Administration Console, aka the Admin Dashboard in v2, has received an update in v2 with more features, including:
Display the running solution's basic information.
Run WakandaDB maintenance actions: run, verify and restore.
Schedule tasks (CRON jobs): create, update, schedule and remove.
Analyze your tasks' execution (calendar, output, statistics...).
You will need an Enterprise license to access the Admin Dashboard at http://127.0.0.1:8080/admin/.

Related

With valid App Registration, v4 BotFramework SDK Returning Unauthorized

I have been working with Kyle H. in Azure Support Chat and he recommended that I post this here.
Please be understanding, as I have read every relevant Stack Overflow post and have exercised due diligence.
I have completely deleted and recreated both my Azure resources AND source code multiple times, using this guide: https://learn.microsoft.com/en-us/azure/bot-service/bot-builder-howto-deploy-azure?view=azure-bot-service-4.0
The bot project is using the v4 template 'Echo'.
In Azure Portal, Test in Web Chat: "There was an error sending this message to your bot: HTTP status code Forbidden"
In Bot Emulator with ngrok configured: Cannot post activity. Unauthorized.
Following instructions here: https://learn.microsoft.com/en-us/azure/bot-service/bot-service-troubleshoot-authentication-problems?view=azure-bot-service-4.0#step-2
I ran: curl -k -X POST https://login.microsoftonline.com/botframework.com/oauth2/v2.0/token -d "grant_type=client_credentials&client_id=b7404e9f-0e74-4174-aa4f-447fdd96d7f0&client_secret=REMOVED&scope=https%3A%2F%2Fapi.botframework.com%2F.default" and I am returned a valid access token.
I have confirmed multiple times that my app registration is using https, as well as the /api/messages route, and that my app service has the two keys MicrosoftAppId and MicrosoftAppPassword set appropriately, including after stopping and starting the service and using the restart functionality. I have also ensured that during deployment (VS 2017 Publish), existing files are removed.
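For reference, the pair looks like this in the App Service application settings (shown in the portal's "Advanced edit" JSON shape; the password value is redacted here):

[
  { "name": "MicrosoftAppId", "value": "b7404e9f-0e74-4174-aa4f-447fdd96d7f0", "slotSetting": false },
  { "name": "MicrosoftAppPassword", "value": "<redacted>", "slotSetting": false }
]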
My last attempt was to upgrade the template 'Echo' bot to .NET Core 2.2, which deployed successfully. I again verified the app ID and app password were set as described above, and unfortunately both the Bot Emulator and Test in Web Chat failed with the same errors.
Edit 1:
I used 'Create new resource' and picked the 'Web App Bot' template. I chose the 'Echo' bot. I chose 'Automatically Generate Application ID and Password' as well. This resulted in a bot deployment that did not have authorization errors. However, I noticed that instead of a 'Bot Channel Registration' resource - it instead created a 'Web App Bot'. When inspecting 'Application Settings', application ID and password aren't present, yet it functions just fine.
Edit 2:
I researched deeper, and learned that the 'Web App Bot' created in Azure uses the botFilePath and botFileSecret application settings and likely keeps the app ID and app password there.
Edit 3:
There is a huge difference between the v4 BotBuilder EchoBot template you use in Visual Studio when creating a new project and the EchoBot template used in Azure when creating a new resource. Narrowing it down now.
Edit 4:
I was able to use the Azure-created web bot and modify it to continue my work. This was not possible with the VSIX templates, even after ensuring the app config and .bot file config were correct.
Edit 5:
I also learned that v4 doesn't support Microsoft Teams - and that was the entire purpose of my endeavor. v3 documentation is near nonexistent. So I think I'll be using an entirely different framework to integrate with Teams. I even attempted to implement: https://github.com/OfficeDev/BotBuilder-MicrosoftTeams-dotnet but my bot only responds with "Sorry, there was a problem encountered with your request" in Teams.
Edit 6:
I managed to get my bot functioning with Microsoft Teams, solving the unauthorized error in the process.
I created the Echo bot in Azure - and used App Studio in Microsoft Teams to add a manifest and add the bot to our team.
After that, I imported the project located here: https://github.com/OfficeDev/BotBuilder-MicrosoftTeams-dotnet/tree/master/CSharp/Samples/Microsoft.Bot.Builder.Teams.TeamEchoBot
Then I modified Startup.cs to work with the encrypted .bot file, based on the source code that created the Echo bot in Azure.
The VSIX templates generate Startup code that uses the .bot file for the appId/secret and, when deployed to Azure, will look for an endpoint named "production", which you would have had to add yourself to the default .bot file, with the appId and secret from your provisioned bot.
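For illustration, a .bot file with the extra "production" endpoint has roughly this shape (the endpoint URL, appId, and appPassword below are placeholders for your provisioned bot's values):

{
  "name": "EchoBot",
  "version": "2.0",
  "padlock": "",
  "services": [
    {
      "type": "endpoint",
      "name": "development",
      "endpoint": "http://localhost:3978/api/messages",
      "appId": "",
      "appPassword": "",
      "id": "1"
    },
    {
      "type": "endpoint",
      "name": "production",
      "endpoint": "https://your-bot.azurewebsites.net/api/messages",
      "appId": "<MicrosoftAppId from your provisioned bot>",
      "appPassword": "<MicrosoftAppPassword from your provisioned bot>",
      "id": "2"
    }
  ]
}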
For more details, check out this previous answer I gave on this topic.
If you are struggling to deploy your bot, I would recommend looking at this guide. The steps will walk you through creating a bot on Azure, and then you can download the source code, develop your bot, and redeploy it.
Glad I could help!

IBM BPM 8.6 upgrade to IBM Business Automation Workflow is not working?

I have upgraded IBM BPM version 8.6.0 to IBM Business Automation Workflow V18.0.0.2 by following the documentation below:
IBM BPM upgrade to IBM Business Automation Workflow V18.0.0.2
From that documentation I executed all the commands; only one command, createProcedure_ProcessServer.sql, was not successful, and I did not execute the optional commands.
After doing all of this, IBM BPM was upgraded: I can see that the Process Portal/Admin/Center login page name has changed, and the additional REST APIs for sharing "saved searches" and RPA tasks are available. But when I try to access the Case Builder, it gives me the error below.
You mentioned skipping the optional steps in the upgrade instructions when performing your upgrade.
However, several of the optional steps specifically mention that they are needed in order to enable the new case management functionality:
Optional: To use case management, follow these instructions to enable it.
Note: Steps 22-24 are about configuring case management. If you have a Db2 for z/OS database or an AdvancedOnly deployment environment, or you want to configure case management later, or you do not intend to use case management, skip these steps.
Thus, following the optional steps 22-24 would most likely solve your issue.
As Jan said,
BAW uses an additional database/schema, the CPEDB. So if you have upgraded, you must export the current Dmgr profile configuration, check whether it has the option to set the CPEDB database and credentials, fill that in, and then update the Dmgr profile.
If the exported file doesn't have the CPEDB options, open the sample config files, look for the differences, and add them to the exported file.

Installing third-party Drupal modules on Azure

I've just started playing around with the new "Website" feature in Azure that allows you to create websites with just one step - and also allows you to create websites from a "Gallery", including Drupal. And I can get my Drupal site up and running, no problem. But if I try to add a third-party module (for instance, Mindtree's ODataDrupal), then I get this error message:
Installation failed! See the log below for more information.
odata_support
Error installing / updating
File Transfer failed, reason: Cannot chmod /DWASFiles/Sites/theparentsunion/VirtualDirectory0/site/wwwroot/sites/all/modules/odata_support.
More-or-less the same thing happens if I try to update some of the existing modules (which Drupal warns, with big red flashing letters, are out of date), except then my Drupal install is left crippled, with no way to fix it that I've been able to find.
Is this as-designed, or some limitation of the beta website integration? (Because a Drupal installation is kinda worthless if you can't add new modules to it, or update existing ones.) Or am I doing something wrong?
If you are trying to add plugins and third-party modules to a Drupal-based Windows Azure Website, the results may vary from person to person. This is mainly because the configuration needed by a specific module or plugin may or may not be supported by the Windows Azure Websites model; not every kind of custom configuration will work on Windows Azure Websites, and in those cases you would need to move to Windows Azure Virtual Machines.
As for application-specific structure, you can open the website's FTP folder, and whatever you see there is user-configurable, so you can set it up the way you want. However, if your application tries to make changes outside that limited scope, you will hit errors like the one above.
Here is a case study where an Azure VM was used for a Drupal-based migration, which shows that for a complex application you may need an Azure VM rather than Azure Websites.

How to use 51Degrees via NuGet with Azure?

I'm trying to use 51Degrees in a .NET project that I deploy to Azure. In August 2011, they released v1.2.1.3, marked as "Azure Compatible":
Foundation can now be deployed on to the Windows Azure Cloud service. See the release note for full details on requirements and how to set up. Azure-related changes include:
Instead of a log file, log entries are written to a log table.
Instead of a devices file, previous device requests are written to a device table.
A new conditional compilation symbol - 'AZURE'. AZURE-enabled builds will not work in traditional ASP.NET.
Since then there have been a dozen releases, and they are up to v2.1.4.9. However, their documentation is super light on how to use it with Azure. In fact, there was a bug originally, because v1.2.1.3 stated:
To make use of the changes you must create a storage account called 'fiftyonedegrees'. The foundation will then create two tables, one for previous devices, and another for logs.
This isn't possible, because Azure storage account names need to be unique across all of Azure, so not everyone can create one named 'fiftyonedegrees'.
Their response was:
After rereading the blog it seems I've made an oversight in this regard, and will update shortly.
The storage account that the Foundation looks for can be changed in the Foundation source code. Go to Foundation/Properties/Constants.cs and change the string 'AZURE_STORAGE_NAME' to the name of your storage account.
However, I'm still at a loss as to how to utilize it within my project. Here are my issues:
I'm not clear whether v1.2.1.3 is the only Azure-compatible release, or whether every release after it is Azure compatible. Their documentation doesn't say.
When I install 51Degrees via NuGet, my project doesn't get an App_Data folder created, which contradicts their documentation. The web.config file even has entries in it that reference the App_Data folder, such as <log logFile="~/App_Data/Log.txt" logLevel="Info"/>.
Based on the response to the Azure storage account bug I quoted earlier, they are saying I need to edit the file Foundation/Properties/Constants.cs. However, since I'm installing via NuGet and it's a DLL, NuGet is presumably the wrong approach? Do I need to download the source, compile it myself, and wire it up to my project manually?
I'm generally new to .NET, NuGet, VS, etc so appreciate the help.
All versions are Azure compatible from 1.2.1.3 onwards. I'm assuming this is the blog post you were talking about. After you've created your Azure storage account, you'll have to edit the Constants.cs file in the source code and add in your account name. It's my understanding that this means you'll have to get access to the source code and edit it directly. Once you have done this, you'll need to recompile for the software to work correctly. I'm not sure if there is a way to perform the same task using NuGet, but I'll look into it. Hope this helps.
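For illustration, the change would look roughly like this (the class layout below is a sketch for orientation; only the file path and constant name come from their response above):

// Foundation/Properties/Constants.cs in the 51Degrees Foundation source.
// Sketch only: the real file contains many other constants.
namespace FiftyOne.Foundation.Properties
{
    internal static class Constants
    {
        // Replace the default "fiftyonedegrees" with the name of your own,
        // globally unique Azure storage account.
        internal const string AZURE_STORAGE_NAME = "yourstorageaccount";
    }
}

After making the change, rebuild the Foundation project and reference the recompiled assembly from your web project in place of the NuGet-delivered DLL.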

Step-By-Step ASP.NET Automated Build/Deploy

Seems like there are so many different ways of automating one's build/deployment that it becomes difficult to parse through all the different scenarios that people support in tutorials on the web. So I wanted to present the question to the Stack Overflow crowd... what would be the best way to set up an automated build and deployment system using the following configuration:
Visual Studio 2008
Web Application Project
CruiseControl.NET
One of the first things I tried was to have CCNet automatically zip the output and copy it to the server, but that requires manual work to unzip at the destination. However, if we try to copy all the files individually, it could potentially take a long time for a large application (the build server lives outside of the datacenter, in our office... I know).
Also of particular interest is how we would support multiple environments as we have dev, qa, uat, and then of course prod.
MSDeploy seems really interesting, but unless I'm interpreting the literature incorrectly, it doesn't help in the scenario of deploying the output of a build server. If anything, it seems like it'll be useful for deploying one build across a web farm... but even for deploying from one environment to another, one would have to manually change config settings, web service URLs, etc.
I recently spent a few days working on automating deployments at my company.
We use a combination of CruiseControl, NAnt, and MSBuild to generate a release version of the app. Then a separate script uses MSDeploy and XCopy to back up the live site and transfer the new files over.
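A rough sketch of such a script (every path, server, and site name below is a placeholder, not our actual setup):

rem Back up the live site, then push the new release output over it.
set LIVE=C:\inetpub\wwwroot\MyApp
set DROP=C:\ccnet\artifacts\MyApp\Release
set BACKUP=D:\backups\MyApp-last-good

rem 1. Copy the current live site aside (/E = include subdirectories, /I = treat target as a directory, /Y = overwrite without prompting).
xcopy "%LIVE%" "%BACKUP%" /E /I /Y

rem 2. Sync the build output onto the live site with MSDeploy.
msdeploy -verb:sync -source:contentPath="%DROP%" -dest:contentPath="%LIVE%"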
Our solution is briefly described in an answer to this question Automate Deployment for Web Applications?
You might be interested in MSDeploy. Here's a Scott Hanselman post on this. It's only available as a technical preview at the moment (September 2008) but is worth evaluating against your requirements.
There is another new build tool (a very intelligent wrapper) called NUBuild. It's lightweight, open source, and extremely easy to set up, and it provides almost no-touch maintenance. I really like this new tool, and we have made it the standard tool for the continuous build and integration process of our projects (we have about 400 projects across 75 developers). Try it out.
http://nubuild.codeplex.com/
Easy-to-use command line interface
Ability to target all .NET Framework versions, i.e. 1.1, 2.0, 3.0 and 3.5
Supports XML-based configuration
Supports both project and file references
Automatically generates the “complete ordered build list” for a given project - no-touch maintenance
Ability to detect and display circular dependencies
Performs parallel builds - automatically decides which of the projects in the generated build list can be built independently
Ability to handle proxy assemblies
Provides visual cues on the build process, e.g. showing “% completed”, “current status”, etc.
Generates a detailed execution log in both XML and text format
Easily integrated with the CruiseControl.NET continuous integration system
Can use a custom logger like XMLLogger when targeting .NET 2.0+
Ability to parse error logs
Ability to deploy built assemblies to a user-specified location
Ability to synchronize source code with the source control system
Version management capability
Do you have the ability to run commands remotely? The PsExec utility from Sysinternals would let you run a command-line unzip program on the remote machine. If you have a script that copies the build as a .zip file to the remote site, you would just need one more line for the PsExec call to unzip the files.
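For example (the server name, account, paths, and the choice of 7-Zip as the unzipper are all placeholders):

rem Copy the zipped build to the server, then unzip it in place via PsExec.
copy "C:\ccnet\artifacts\MyApp.zip" "\\webserver01\d$\drops\MyApp.zip"
psexec \\webserver01 -u DOMAIN\deployuser -p secret "D:\tools\7z.exe" x "D:\drops\MyApp.zip" -o"D:\sites\MyApp" -y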
I had a related question about getting a deployable set of files from an automated build. I found Web Deployment Projects (links and all in the old question) did what I needed - they're a VS and MSBuild add-on.
This is a common problem (and I wish I had read this question sooner) for all development, not just ASP.NET. As one of its developers, I can say my team naturally uses BuildMaster internally for the entire release process, and for most scenarios it's free. Within the tool, we are able to perform all the standard CI builds to create artifacts and then set up an automation process to deploy these artifacts to any one of the 40+ servers we have internally or externally hosted, depending on the specific application or environment.
Since you specifically mentioned deployment to different testing environments, this is a fundamental aspect of the tool. The idea is to model the environment workflow (e.g. Integration -> QA -> Production) you already have in place and essentially promote a build all the way from source control to production. Most times, it's as simple as adding a deployment action that deploys an artifact to the environment, other times it can be much more complex.
You also casually mentioned configuration file changes are part of deployment, which is another built-in component to BuildMaster. The idea we had was to use the tool itself as the central hub for all configuration files and deployments, thus ensuring the latest changes are applied automatically with a simple "deploy configuration files" action in your deployment plan.
One thing you didn't mention with regard to this process is the database deployment aspect. Most ASP.NET applications require an associated database, otherwise they could just be static HTML files. It is crucial that the database schema gets updated to the appropriate database version with every deployment. There is, not surprisingly, a module within BuildMaster that handles this for you as well. The idea is to store DDL-DML scripts within the tool itself, and by executing scripts only once per environment, it ensures that all of your databases across each environment are up-to-date as your builds are deployed through them. Other scripts (e.g. stored procedures, views, triggers, etc.) are essentially code files and therefore belong in source control. These DROP-CREATE-CONFIGURE type scripts can be run each and every time in most cases with a simple deployment action.
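As a generic illustration of the two kinds of scripts (the table and procedure names are invented):

-- Run-once change script: executed exactly once per environment, in version order.
ALTER TABLE Orders ADD ShippedDate DATETIME NULL;

-- Re-runnable DROP-CREATE script: safe to execute on every deployment.
IF OBJECT_ID('dbo.GetOrder', 'P') IS NOT NULL
    DROP PROCEDURE dbo.GetOrder;
GO
CREATE PROCEDURE dbo.GetOrder @OrderId INT
AS
    SELECT OrderId, ShippedDate FROM Orders WHERE OrderId = @OrderId;
GO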
Another piece of the deployment puzzle that most developers do not think about is process automation. Many developers need to perform sign-offs or fill out change request forms as part of these processes. Again, this is all available as part of the automated workflow setup within BuildMaster. You can set up blockers that do not allow promotion to, say, the QA environment unless all unit tests have passed, or that block promotion to the Staging environment unless someone from the QA team approves the build and all issues in your issue tracking tool are resolved/closed for that particular release.
While I realize I left CC.NET out of the answer, our applications are all built and deployed through BuildMaster, so we no longer need it; we could, however, just as easily pick up the artifacts from a drop location and deploy them in later environments.
I see that many people use CC.NET for their .NET projects, but why not use Jenkins and SonarQube? They have all you need. I set all this up in 3 days, on a Windows Server 2008 R2 with MSSQL, Jenkins, VisualSVN and SonarQube.
It all works great, and you get all the metrics on your project. SonarQube uses Gallio, Gendarme, FxCop, StyleCop, NDepths and PartCover to get your metrics, and all this is pretty straightforward since SonarQube does it automatically without much configuration.
I'll post some pictures for you to get a feeling for it. Here is Jenkins, which builds and gathers the Sonar metrics, and another job for deploying automatically to IIS.
And SonarQube, with all the metrics for my project. This is a simple MVC4 app, but it works great!
If you want more information I can be more specific, but I think you should at least consider Jenkins. If CC.NET suits you better, at least you looked at a good alternative before you chose.
This whole setup uses MSBuild to build and deploy the apps.
