Changing the OData service of a Web IDE archive (Fiori app) - data binding

For security reasons I cannot use the Cloud Connector in conjunction with the on-premise system which we have. Nevertheless, I am very comfortable using Web IDE to create the necessary applications. I create the applications in Web IDE using a .edmx file which represents the on-premise OData service that I plan to use.
After the application is made, I import the archive and then make the necessary modifications to it in HANA Studio (inside the Java EE perspective).
I am doing things according to this guide: https://www.sap.com/developer/tutorials/hcp-webide-switch-live-odata.html
My understanding is that the manifest.json and the neo-app.json both have to be modified to point to the real OData service.
In the manifest.json, the data source has to be updated. According to the guide above, within the neo-app.json, I need to point it to the gateway.
Here is the information for our on-premise system:
System Name: sapewp01.xxxxx.com
localURI: /sap/opu/odata/sap/zbw_odata_q3_srv/
Port: 8012 (I assume; the OData service link works on my end)
SAP Gateway (sapgw12)
Any other required information can be provided on request.
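Based on my reading of the guide, I'd expect the route in the neo-app.json to look something like this - just a sketch, where the destination name "sapewp01_gateway" is hypothetical and would have to exist in the platform cockpit:

{
    "routes": [
        {
            "path": "/sap/opu/odata",
            "target": {
                "type": "destination",
                "name": "sapewp01_gateway",
                "entryPath": "/sap/opu/odata"
            },
            "description": "On-premise Gateway OData services"
        }
    ]
}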
How does one change a Web IDE app to be able to link to an on-premise OData service, WITHOUT disrupting all the work done in Web IDE?

In the sap.app part of the manifest.json, can't you just specify this?
"dataSources": {
"myService": {
"uri": "/sap/opu/odata/sap/zbw_odata_q3_srv/",
"type": "OData",
"settings": {
"odataVersion": "2.0"
}
}
}
At least in my experience working in Eclipse, all I had to do was provide the URI to create the model:
oModel = new sap.ui.model.odata.v2.ODataModel(sServiceUrl, oConfig);
But that was a legacy app; maybe Web IDE does it a different way?
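As far as I know, Web IDE-generated apps usually don't instantiate the model manually at all; the manifest wires the data source to a default model in the sap.ui5 section. A minimal sketch, assuming the "myService" data source from above:

"sap.ui5": {
    "models": {
        "": {
            "dataSource": "myService"
        }
    }
}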

Related

Handling development-time web.config conflicts

I am looking for a way to handle this challenge: we are a geographically dispersed dev team using ASP.NET Web API and Angular to build a web app.
The thing that causes the grief is the fact that not all team members use the same database setup for their dev work. Yes, I know - I can use web.config transforms to set the proper connection strings for test, staging and production (and I'm already doing this) - but this is not what I'm talking about.
Due to reasons beyond our control at this time, we have
some developers working on a local SQL Server instance using server=(local);database=OurDB as their connection string
other developers using a central developer SQL Server in their location, using something like server=someserver.mycorp.com;database=OurDB
and a few exotic cases with yet other settings
Now every time someone commits a change to the Git repo and happens to also change something in the web.config, his connection string is committed to the repo. So when I then pull that latest commit, my settings for my local DB server are overwritten by the other guy's settings.
I am looking for a way to handle this - I was hoping I might be able to
hook into the Git pull process and automagically update the web.config connection string to my local needs whenever I pull something
somehow reference a connection string (or external config file) based on e.g. my currently logged in user's name or something like that
But I can't seem to find any way of doing this. I was wondering if I need to build a VS extension to handle this - any starters for that? Has anyone done something like this before and could share their code? (or has it up on GitHub)
The web.config configuration system used in ASP.NET is not flexible enough to support the more advanced scenario you have described. So, why use it? You could store the configuration in files within the repository, one per developer. Or they could be stored outside the repository or otherwise ignored.
The real trick is that most older applications don't have a single root that retrieves the configuration, so you have to refactor your application to use a flexible configuration system. For your staging/production environments you probably still want to use the config in web.config. The following code can give you a basic idea of one way to structure it:
using System;
using System.Configuration;

public class MyApplicationConfiguration
{
    public string MainConnectionString { get; set; }
}

public class ConfigurationRetriever
{
    public MyApplicationConfiguration GetConfiguration()
    {
        bool isLocalDevelopment = IsApplicationLocalDevelopment();

        var config = new MyApplicationConfiguration();
        if (isLocalDevelopment)
        {
            config.MainConnectionString = Environment.GetEnvironmentVariable("MyApplication_MainConnectionString");
            // ...or get it from a JSON file, XML file, or config database.
        }
        else
        {
            config.MainConnectionString = ConfigurationManager.ConnectionStrings["MainConnectionString"].ConnectionString;
        }
        return config;
    }

    private static bool IsApplicationLocalDevelopment()
    {
        // You might look for the absence or presence of an
        // environment variable to determine this.
        return Environment.GetEnvironmentVariable("MyApplication_IsLocalDevelopment") != null;
    }
}
Rather than rolling your own config-building logic, you might refactor your application to leverage Microsoft.Extensions.Configuration. It's not just for .NET Core - it targets .NET Standard, so you can use it even in legacy ASP.NET applications. For reading the web.config, you could probably use Microsoft.Extensions.Configuration.Xml, or you could write your own adapter that pulls values out of ConfigurationManager. I did a basic test, and this worked as expected.
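As a sketch of what that could look like (the per-developer file naming convention here is just an assumption - the idea is that each developer keeps an override file that Git ignores):

using System;
using Microsoft.Extensions.Configuration;

public static class ConfigurationFactory
{
    public static IConfiguration Build()
    {
        // Shared defaults first, then an optional per-developer override
        // (e.g. appsettings.jdoe.json) that stays out of source control.
        return new ConfigurationBuilder()
            .SetBasePath(AppDomain.CurrentDomain.BaseDirectory)
            .AddJsonFile("appsettings.json", optional: false)
            .AddJsonFile("appsettings." + Environment.UserName + ".json", optional: true)
            .Build();
    }
}

// Usage:
// var connectionString = ConfigurationFactory.Build()
//     .GetConnectionString("MainConnectionString");

Because the developer-specific file is optional and keyed to the logged-in user's name, pulling someone else's commit can no longer clobber your local connection string.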

How to deploy to Azure Resource Group using VSTS release management

I am new to Visual Studio Team Services Release Management. My goal is to automate a deployment of an ASP.NET MVC application to the Azure App Service.
Trying different approaches, I created one Service Endpoint that is certificate-based and one that uses a service principal (SPN). My build definition already builds a web deploy package, and the release definition is linked against it and can use this artifact.
Success 1:
A deployment of the app using the Azure Web App Deployment Task already succeeded - almost.
Shortcoming 1: I do not understand how I can specify the correct Resource Group using this task. It uses the certificate-based endpoint, and for this task I cannot use the other (SPN) endpoint.
Success 2:
Using the Azure Resource Group Deployment task, I was able to use a JSON ARM template to create a new resource group with a web app in it. This way I can specify the resource group, addressing Shortcoming 1.
Shortcoming 2: But now I don't understand how I can actually deploy the binaries of the build definition that has been linked against my release definition. The web application that gets created by the resource group deployment is empty, and a subsequent Web App Deployment Task seemingly cannot target this newly created web app, since it is probably not ARM based.
I get the feeling that I am missing something obvious here - any help is appreciated.
Update 1
Thanks to @bmoore-msft, I got a deployment working using the child-resource extension example he linked to. Essentially, the corresponding snippet of my ARM template now looks like this:
"resources": [
{
"apiVersion": "2015-08-01",
"type": "Microsoft.Web/sites",
"name": "[variables('fullEnvName')]",
"location": "[parameters('siteLocation')]",
"properties": {
"name": "[variables('fullEnvName')]"
},
"resources": [
{
"apiVersion": "2014-06-01",
"name": "MSDeploy",
"type": "Extensions",
"dependsOn": [
"[concat('Microsoft.Web/Sites/', variables('fullEnvName'))]"
],
"properties": {
"packageUri": "https://dl.dropboxusercontent.com/u/<myId>/<WebDeploymentPackage>.zip",
"dbType": "None",
"connectionString": "",
"mode": "Complete"
}
}
]
}
]
But the problem is that this places a static link in my template - as you can see, I used Dropbox as a temporary solution. Of course I don't want to upload my web deployment package to Dropbox, neither manually nor automatically. I want to link to the artifact created by my build definition, which unfortunately is dynamic, and I can't find any information on how to construct that link. For example, build 1 is located at the following path:
https://<tenant>.visualstudio.com/DefaultCollection/_apis/resources/Containers/800850?itemPath=<PathToWebDeploymentPackage>.zip
while build 2 is available here:
https://<tenant>.visualstudio.com/DefaultCollection/_apis/resources/Containers/801968?itemPath=<PathToWebDeploymentPackage>.zip
So there is a number changing inside the link, which means the link I refer to in my template must be dynamic, which means I need to understand where that number comes from - and I don't.
Maybe there is another way of referencing artifact uploads?
Take a look at this sample:
https://github.com/Azure/azure-quickstart-templates/blob/75d0588fbd2702288bd35ed24cb00e43dcf980c2/wordpress-mysql-replication/website.json
The website in that template resource has a child resource extension named "MSDeploy". This will deploy a package to the web site during deployment. So in the task that does the deployment, you can create the web app and deploy the package all in one deployment task in RM.
You will need to use user or SPN authn for anything using ARM (no certs).
Update: Staging the Package
Ok, usually what I do here is "stage" my artifacts in Azure Storage (secured with a sasToken). The URI you provide in the template must be accessible to AzureRM. Your VSTS build output is likely secured, so even though you can access it interactively, AzureRM cannot.
Essentially what you need is a task in RM (or build) that will 1) copy the artifacts to Azure (securely) and then 2) tell the next task where those artifacts are... Here's one option:
https://azure.microsoft.com/en-us/documentation/articles/vs-azure-tools-resource-groups-ci-in-vsts/
This doc is using VSTS build, but RM works the same way. The other difference is that the doc uses the PS script that Visual Studio uses in its Azure Resource Group projects. There's nothing special about that script (it will work anywhere, just like any other PS script), but that's the example. It doesn't use the Azure Resource Group Deployment task because that task cannot stage the artifacts.
Essentially what you need to do is:
parameterize that URI property (see example & repo below)
copy the webdeploy package to Azure (PowerShell in this case)
deploy the template and pass in the uri of the package
e.g.
"packageUri": "[concat(parameters('artifactsLocation'), webdeploy.zip, parameters('sasToken')]"
That doc shows you how VS does it, and you should be able to adapt that for your scenario. If you go this route, you would use the Azure PowerShell task and no longer need the Azure Resource Group Deployment Task.
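To make that concrete, here is a hedged sketch of the parameterized pieces (the parameter names are illustrative):

"parameters": {
    "artifactsLocation": { "type": "string" },
    "sasToken": { "type": "securestring" }
}

and, in the MSDeploy resource:

"packageUri": "[concat(parameters('artifactsLocation'), '/<WebDeploymentPackage>.zip', parameters('sasToken'))]"

The staging step uploads the package to a storage container and then passes the container URI and sasToken in as those parameters when deploying the template.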
Another way to do this is with the Azure File Copy task, but currently that task does not output the URI or sasToken, so you couldn't pass it in to the deployment task (there's a PR in the queue to make that work).
Another option if you don't have access to Visual Studio is this repo:
https://github.com/Azure/azure-xplat-arm-tooling/tree/master/PowerShell
It has the same PS script that VS uses, and the templates show an example of the parameterized URL (for a dsc.zip file in that example), but it would work the same way for MSDeploy.
You've actually hit on one of the more sophisticated scenarios, and at the moment it's not documented very well, but it's pretty cool when it works. LMK if you need more help here.

Setting project url in VS2015 ASP.NET 5 Web API application

I'm trying to create a Web API project and a client-side web project, where the web project can access the API via ajax. Currently my project looks like this:
I saw this answer on here: Setting up a separate Web API project and ASP.NET app, which explains how the project URL can be set to localhost:[port]/api.
But for ASP.NET 5 projects, the properties only have 3 tabs (as opposed to the several found in ASP.NET 4 projects):
What I'm wondering is:
Do I have to set this option somewhere else (i.e. in project.json)?
How would this work when I publish? Ideally I'd want [websiteURL]/api to serve up my API, whereas that link explicitly put localhost:8080.
Is having these as two projects a good idea? I could easily put API and web in the same project, but I like the separation of client-side and server-side logic.
Any help would be appreciated!
First Point:
Generally speaking, in ASP.NET 5 the routing defaults are very good and should work out of the box without much in the way of configuration. You can use configuration- and/or attribute-based routing in your application (with a detailed overview of both here), although my personal preference is the attributed approach. Provided you have the following line in your Startup.cs file (which you should have in a new project):
app.UseMvc();
you should be able to route requests to your API controllers in the fashion required (i.e. "/api/...") simply by using [Route] attributes, as below (example taken from a standard generated ASP.NET 5 Web API application):
[Route("api/[controller]")]
public class ValuesController : Controller
{
[HttpGet]
public IEnumerable<string> Get()
{
return new string[] { "value1", "value2" };
}
}
The above example will route any GET request made to "/api/values".
While this approach can be used to handle requests made to your API, in order to deliver the files needed for your front-end JavaScript application/single-page app, you will need to enable static file serving. Add the following to the Configure method in your Startup.cs class:
app.UseStaticFiles();
This will allow your application to serve those static files - by default, these are served from the 'wwwroot' folder, although this can be changed in the project.json file if required. The files needed for your front-end app should then be added to this folder. A tutorial on serving static files can be found here.
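For example, the relevant project.json entry in the beta-era schema is just this (shown with the default value; change it to point elsewhere):

{
    "webroot": "wwwroot"
}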
Second Point:
In my experience this will not be an issue when you publish your website - provided your server is set up correctly, you will not need to include the port when making a request - navigating to [yourwebsitename]/api/... will suffice.
Third Point:
In my opinion this entirely depends on how large the project is likely to grow, although preference and opinion will vary from developer to developer. Generally speaking, if the project will remain small in scope then keeping both in a single project is perfectly OK, as unnecessary complexity is reduced. However, as you have pointed out, it is also very useful to maintain a separation of concerns between projects. So aside from the organisational advantage of your approach, the respective dependencies of the two projects are (and will be) kept separate as well.

Swagger w/ ASP.NET v5 Azure Api App

I'm attempting to set up a Api App (Azure) with Swagger + Swashbuckle as demonstrated by Scott Hanselman at the //Build conference here: http://channel9.msdn.com/Events/Build/2015/2-628.
I have installed (using NuGet) the packages Swagger.Api and Swashbuckle.Core. It hasn't added any controller or settings that I would expect in order to have a swagger page. When I navigate to {baseUrl}\swagger, I get a 404 error.
I would think that since it has a UI it would require a Web App in addition to the Api App, but I've rewatched the demo and Scott clearly says you can add Swagger + Swashbuckle to any Api App. In a 2nd app though I'd think there may be issues with Api discovery. Has anyone set this up yet successfully?
Time rolls on and now Swashbuckle works for vNext (beta8 for me, probably other versions too) - thank you to the team and contributors!
In project.json add the package:
"Swashbuckle": "6.0.0-*",
In Startup.cs, in ConfigureServices():
services.AddSwaggerGen();
services.ConfigureSwaggerDocument(options =>
{
    options.SingleApiVersion(new Info
    {
        Version = "v1",
        Title = "My Super API",
        Description = "A Super API using Swagger and Swashbuckle",
        TermsOfService = ""
    });
});
services.ConfigureSwaggerSchema(options =>
{
    options.DescribeAllEnumsAsStrings = true;
});
In Startup.cs, in Configure():
app.UseSwaggerGen();
app.UseSwaggerUi();
Now you can access your API doco - http://domain:port/swagger/ui/index.html
Access your Swagger definition - http://domain:port/swagger/v1/swagger.json
Then, assuming you have at least one internet-facing dev/uat/staging/prod environment, grab the definition URL and do File -> Import URI at http://editor.swagger.io/ - now you have code-gen for about 20 clients too :)
You can also set up your own code-gen server if you are so inclined (https://github.com/swagger-api/swagger-codegen); however, I leveraged the existing online generator. The online editor also has full model and relationship definitions, at least in my case, where I fully defined my model using EF7 (I know, ick... but it's much better than <= EF6, and ServiceStack doesn't support CoreCLR yet). Depending on the size of your project this could save you a few hours to a few weeks of documentation work, plus it dynamically updates itself as you code more. And you can use it to test your endpoints too, but I still prefer Postman.
Full sample project at https://github.com/mrsheepuk/ASPNETSelfCreatedTokenAuthExample/tree/beta8
Big ups to MrSheepUK
HTH
This answer is now outdated. See the other answer.
There is a nice blog post describing the problem you have: https://alexanderzeitler.com/articles/Deploying-a-ASP-NET-MVC-6-API-as-Azure-API-App-in-Azure-App-Services/
It describes how you can add the Ahoy! package to an ASP.NET v6 Web API project and add it as an API app to Azure.
Here is another source: http://devmeetsbi.ghost.io/help-and-test-page-for-asp-net-web-api-asp-net-5-and-mvc-6/
You did all the right steps, but unfortunately for ASP.NET 5, Swashbuckle doesn't work yet.
You can get Ahoy! - the next version of Swashbuckle, with ASP.NET v6 support - here. That should make everything work.

How to do mocks for Web tests?

I want to write a few web tests (over WatiN/Selenium + the CassiniDev web server) for my ASP.NET web application.
The problem I've encountered is that I don't know what to do in situations like this:
there is a page where the user can click a button that calls some third-party service. In my web test I want to create a mock of this service which will always return a static value (one value in this test case, another value in other test cases).
How can I do that?
Currently I use the Microsoft Unity IoC/DI container, and my pages get their dependencies in the manner described in http://msdn.microsoft.com/en-us/library/ff664622%28v=pandp.50%29.aspx.
The only solution that comes to mind is to place the dependencies in a separate web.config for each test case and copy the necessary web.config in the SetUp of each test. That solution is completely painful!
Any ideas?
I use WatiN and Cassini-dev in my integration tests as well and have had to deal with similar issues. In my setup fixture I deploy my ASP.NET web application to a temporary folder in my test folder, which allows me to play around with the configuration before starting up Cassini-dev. I use Windsor for my IoC, which allows me to change injected components at the configuration level. You may also be able to achieve this with Unity.
If the service you are referring to is a web service, you can simply mock out the web service using the interface you have been coding to.
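As a sketch (the interface and class names here are invented), the page codes against an interface and the test deployment registers a canned stub instead of the real client:

// Hypothetical interface the page already depends on.
public interface IThirdPartyService
{
    string GetQuote(string symbol);
}

// Stub registered only for web tests; always returns a static value.
public class StubThirdPartyService : IThirdPartyService
{
    public string GetQuote(string symbol)
    {
        return "static-test-value";
    }
}

// In the test configuration (code or the unity config section):
// container.RegisterType<IThirdPartyService, StubThirdPartyService>();

Because the registration lives in configuration, swapping the real service for the stub becomes part of the web.config update step in the list below.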
Here are the steps that I take when running my integration tests:
Create a temp web directory
Publish the ASP.NET web application to the temp directory (I use MSBuild to do this)
Deploy a temp database (using MSBuild and database projects, but it could be done a number of ways)
Deploy a temp membership database (see my blog post on how to do this in code)
Update the web.config of the deployed ASP.NET web application to point to the temp databases, and change any other settings relevant for testing
Start up the website using Cassini-dev. I also hit the site with an HTTP request so that I can verify the site is up before running any tests
Run the tests.
After running the tests you should clean up:
Stop Cassini-dev
Delete the temp hosting folder
Delete the temp databases. I use SQL Server SMO objects to query the SQL Server and delete any old databases that have been left lying around by previously failed test runs.
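For illustration, a minimal sketch of that SMO cleanup (the server name and the database naming convention are assumptions):

using System.Collections.Generic;
using Microsoft.SqlServer.Management.Smo;

public static class TestDatabaseJanitor
{
    public static void DropLeftoverTestDatabases(string serverName)
    {
        var server = new Server(serverName);

        // Collect the names first; dropping while enumerating would
        // modify the collection we are iterating over.
        var leftovers = new List<string>();
        foreach (Database db in server.Databases)
        {
            if (db.Name.StartsWith("IntegrationTest_"))
            {
                leftovers.Add(db.Name);
            }
        }

        foreach (var name in leftovers)
        {
            // KillDatabase drops the database and kills open connections.
            server.KillDatabase(name);
        }
    }
}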
How to deploy a website using MSBuild in code:
var properties = new Dictionary<string, string>
{
    {"Configuration", isDebug ? "Debug" : "Release"},
    {"WebProjectOutputDir", tempHostingDirectory.FullName},
    {"DeployToDatabase", "true"},
    {"OutDir", Path.Combine(tempHostingDirectory.FullName, "bin\\")}
};

using (var engine = new ProjectCollection(properties))
{
    engine
        .LoadProject(<web project path>, "4.0")
        .Build(new[] {"Build", "ResolveReferences", "_CopyWebApplication"});
}
Unity configuration section usage: http://www.pnpguidance.net/Post/UnityContainerUnityConfigurationSectionAppConfigWebConfig.aspx
Generating asp.net membership database in code: http://bronumski.blogspot.com/2011/06/generating-creating-aspnet-application.html
Msbuild ProjectCollection on MSDN: http://msdn.microsoft.com/en-us/library/microsoft.build.evaluation.projectcollection.aspx
It sounds like you are trying to mock a web service.
Web services usually inherit from MarshalByRefObject, which means you can create a mock by inheriting from RealProxy to create a transparent proxy that pretends to be the web service:
using System;
using System.Runtime.Remoting.Messaging;
using System.Runtime.Remoting.Proxies;

class Mock : RealProxy
{
    public Mock()
        : base(typeof(IStuff)) { }

    public IStuff GetStuff()
    {
        return (IStuff)GetTransparentProxy();
    }

    public override IMessage Invoke(IMessage msg)
    {
        IMethodCallMessage message = (IMethodCallMessage)msg;
        // The message object provides the MethodInfo that was called
        // as well as the arguments.
        // <Insert logic here>
        return new ReturnMessage(new NotImplementedException("coming soon to a test near you ..."), message);
    }
}
I believe NMock2 uses RealProxy for its mocks, so you should be able to use it to mock the web service instead.
