In our current on-prem setup we have 20+ .NET Core 3.1 API apps (separate ASP.NET Core API apps). We have started migrating 2 of these API apps to Azure App Service, wired up to a single Application Insights instance.
On-prem, we use a different logging framework, as does the rest of the 18 apps. All these API apps talk to each other, and all the logs are tied to a unique_id on-prem.
Now, for the APIs that are in Azure, we need to leverage the same unique_id and correlate everything.
To achieve this, I started exploring setting the same Operation Id for the 2 apps that are hosted in Azure.
I created a TelemetryInitializer in both APIs, and if I set the Operation Id as shown below in both of them, it works. All the logs are tied to the single Operation Id "12345":
telemetry.Context.Operation.Id = "12345";
However, since the Operation Id obviously needs to be dynamic, I changed it to the following in my first API:
telemetry.Context.Operation.Id = "CR" + Guid.NewGuid().ToString();
So, the next challenge is that I need to tie this new Operation Id into my second API's TelemetryInitializer. To achieve that, I tried to grab the Request-Id header in the TelemetryInitializer of the 2nd API, but it is always null.
Is there a way to achieve this?
Thanks,
Praveen Sreeram.
tl;dr: This is possible by disabling the built-in dependency tracking in .NET Core and App Insights and handling it on your own. In most cases, the best thing to do is to let .NET Core and App Insights do the tracking.
I uploaded a simple WebAPI app with the code I'm going to go over to Github: https://github.com/SamaraSoucy-MSFT/customoperationid
There are two things that need to be overridden to get both the headers and App Insights to use the custom operation Id. The first is the Activity that wraps the HttpClient, as that controls the correlation headers. The second is the dependency tracking in App Insights.
It is possible to disable Activities completely in your HttpClients, but to minimize side effects you can just clear the one in the client by setting Activity.Current = null:
var operationId = "CR" + Guid.NewGuid().ToString();
var url = "https://www.microsoft.com";
using (var client = new HttpClient())
{
using (var requestMessage =
new HttpRequestMessage(HttpMethod.Get, url))
{
//Makes the headers configurable
Activity.Current = null;
//set correlation header manually
requestMessage.Headers.Add("Request-Id", operationId);
await client.SendAsync(requestMessage);
}
}
The next step is to remove the App Insights default tracking for this request. Again, you can disable dependency tracking completely, or you can filter out the default telemetry for this request. Processors are registered inside the Startup class just like initializers.
services.AddApplicationInsightsTelemetryProcessor<CustomFilter>();
public class CustomFilter : ITelemetryProcessor
{
    private ITelemetryProcessor Next { get; set; }

    // next will point to the next TelemetryProcessor in the chain.
    public CustomFilter(ITelemetryProcessor next)
    {
        this.Next = next;
    }

    public void Process(ITelemetry item)
    {
        // To filter out an item, return without calling the next processor.
        if (!OKtoSend(item)) { return; }
        this.Next.Process(item);
    }

    // Example: replace with your own criteria.
    private bool OKtoSend(ITelemetry item)
    {
        var dependency = item as DependencyTelemetry;
        if (dependency == null) return true;
        if (dependency.Type == "Http"
            && dependency.Data.Contains("microsoft.com")
            //This key is just there to help identify the custom tracking
            && !dependency.Context.GlobalProperties.ContainsKey("keep"))
        {
            return false;
        }
        return true;
    }
}
Finally, in the method that makes the remote call, you need to inject a telemetry client and call TelemetryClient.TrackDependency()
var operationId = "CR" + Guid.NewGuid().ToString();
//setup telemetry client
telemetry.Context.Operation.Id = operationId;
if (!telemetry.Context.GlobalProperties.ContainsKey("keep"))
{
telemetry.Context.GlobalProperties.Add("keep", "true");
}
var startTime = DateTime.UtcNow;
var timer = System.Diagnostics.Stopwatch.StartNew();
//continue setting up context if needed
var url = "https:microsoft.com";
using (var client = new HttpClient())
{
//Makes the headers configurable
Activity.Current = null;
using (var requestMessage =
new HttpRequestMessage(HttpMethod.Get, url))
{
//Makes the headers configurable
Activity.Current = null;
//set header manually
requestMessage.Headers.Add("Request-Id", operationId);
await client.SendAsync(requestMessage);
}
}
//send custom telemetry
telemetry.TrackDependency("Http", url, "myCall", startTime, timer.Elapsed, true);
Related
I am trying to use Serilog with the Application Insights sink for logging purposes. I can see the logs in the Search bar in the Azure portal (Application Insights), but the same logs are not visible in the timeline of events in the Failures or Performance tab. Thanks
Below is the code I am using for registering the logger in the FunctionsStartup, which then gets injected into the function for logging:
var logger = new LoggerConfiguration()
    .Enrich.FromLogContext()
    .Enrich.WithProperty("ApplicationName", "testApp")
    .Enrich.WithProperty("Environment", "Dev")
    .WriteTo.ApplicationInsights(GetTelemetryClient("Instrumentationkey"), TelemetryConverter.Traces)
    .CreateLogger();
builder.Services.AddSingleton<ILogger>(logger);
The TelemetryClient is fetched from a helper method:
public static TelemetryClient GetTelemetryClient(string key)
{
    var teleConfig = new TelemetryConfiguration { InstrumentationKey = key };
    var teleClient = new TelemetryClient(teleConfig);
    return teleClient;
}
host.json
{
  "version": "2.0",
  "logging": {
    "applicationInsights": {
      "samplingExcludedTypes": "Request",
      "samplingSettings": {
        "isEnabled": true
      }
    }
  }
}
I get what you mean, and please allow me to sum up my testing results here.
First, the Failures blade is not designed to provide a timeline used to trace the details (what happened before the exception took place), but to show all the exceptions, how often each error happened, how many users were affected, etc. It is more of a high-level view of the whole application.
To achieve your goal, I think you can use this KQL query in the Logs blade, or watch it in the transaction search blade:
union traces, requests,exceptions
| where operation_Id == "178845c426975d4eb96ba5f7b5f376e1"
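As a small addition of my own (not part of the original query), you can also append an order clause so the whole chain reads in time order:

union traces, requests, exceptions
| where operation_Id == "178845c426975d4eb96ba5f7b5f376e1"
| order by timestamp asc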
Basically, we may add many logs along the execution chain, e.g. in the controller: log the input parameter, then log the result of data combining or formatting, and log the exception information in the catch block. So here's my testing code. Like you, I can't see any extra information in the Failures blade, but in the transaction blade I can see the timeline.
public class HelloController : Controller
{
    public string greet(string name)
    {
        Log.Verbose("come to greet function");
        Log.Debug("serilog_debug_info");
        Log.Information("greet name input " + name);
        int count = int.Parse(name);
        Log.Warning("enter greet name is : {0}", count);
        return "hello " + name;
    }
}
We can easily see that the whole chain shares the same operation_Id, and via all these logs we can pinpoint the offending line of code. By the way, if I surround the code with try/catch, the exception won't be captured in the Failures blade.
==================================
When integrating Serilog with Application Insights, we send the Serilog output to Application Insights, and we will see lots of traces in transaction search, so it's better to set the MinimumLevel to Information or higher. We can also use a KQL query by operation_Id to see the whole chain.
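For instance, a minimal sketch of raising the minimum level in the logger configuration from the question (same sink setup, only the MinimumLevel line added):

var logger = new LoggerConfiguration()
    .MinimumLevel.Information()   // drop Verbose/Debug noise before it reaches Application Insights
    .Enrich.FromLogContext()
    .Enrich.WithProperty("ApplicationName", "testApp")
    .Enrich.WithProperty("Environment", "Dev")
    .WriteTo.ApplicationInsights(GetTelemetryClient("Instrumentationkey"), TelemetryConverter.Traces)
    .CreateLogger();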
You can solve this by following the solution provided by the Application Insights team on their GitHub repo. As per this GitHub issue, you can either use DI to configure the TelemetryConfiguration, i.e.:
services.Configure<TelemetryConfiguration>(
    (o) => {
        o.InstrumentationKey = "123";
        o.TelemetryInitializers.Add(new OperationCorrelationTelemetryInitializer());
    });
or you can configure it manually like this:
var config = TelemetryConfiguration.CreateDefault();
var client = new TelemetryClient(config);
So in your code, you have to change your GetTelemetryClient from
public static TelemetryClient GetTelemetryClient(string key)
{
    var teleConfig = new TelemetryConfiguration { InstrumentationKey = key };
    var teleClient = new TelemetryClient(teleConfig);
    return teleClient;
}
to this
public static TelemetryClient GetTelemetryClient(string key)
{
    var teleConfig = TelemetryConfiguration.CreateDefault();
    var teleClient = new TelemetryClient(teleConfig);
    return teleClient;
}
To use logging with TelemetryConfiguration as mentioned in the answer above for Azure Functions, we just need to update the method as in the snippet below, and on deployment it should fetch the instrumentation key itself:
public static TelemetryClient GetTelemetryClient()
{
    var teleConfig = TelemetryConfiguration.CreateDefault();
    var teleClient = new TelemetryClient(teleConfig);
    return teleClient;
}
But to run both locally and after deployment to Azure, we need to add something like this in the function Startup and get rid of the method above:
builder.Services.Configure<TelemetryConfiguration>((o) =>
{
    o.InstrumentationKey = "KEY";
    o.TelemetryInitializers.Add(new OperationCorrelationTelemetryInitializer());
});

builder.Services.AddSingleton<ILogger>(sp =>
{
    var logger = new LoggerConfiguration()
        .Enrich.FromLogContext()
        .Enrich.WithProperty("ApplicationName", "TEST")
        .Enrich.WithProperty("Environment", "DEV")
        .WriteTo.ApplicationInsights(
            sp.GetRequiredService<TelemetryConfiguration>(), TelemetryConverter.Traces)
        .CreateLogger();
    return logger;
});
Afterwards, we just need to use typical DI in our classes/Azure Functions to consume the ILogger:
public class Test
{
    public ILogger _log;

    public Test(ILogger log)
    {
        _log = log;
    }
}
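For example, any class resolved through DI (such as a service your function calls; the names below are illustrative only) can take the Serilog ILogger in its constructor and log with message templates:

using System;
using ILogger = Serilog.ILogger; // the Serilog logger registered above

// Hypothetical example class, not from the original post.
public class GreetingService
{
    private readonly ILogger _log;

    public GreetingService(ILogger log)
    {
        _log = log;
    }

    public string Greet(string name)
    {
        // This trace flows through the Serilog Application Insights sink configured in Startup.
        _log.Information("Greeting {Name} at {Time}", name, DateTime.UtcNow);
        return "hello " + name;
    }
}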
I am developing a multi-tenant application registered in my Azure AD that consumes the Office 365 APIs, Graph API, etc.
I followed this Microsoft sample to build my work, which uses the ADAL .NET library and OpenIdConnect: Microsoft.IdentityModel.Clients.ActiveDirectory, Version=2.19.0.0.
In ADAL.NET, we use an AuthenticationContext instance with a custom inherited class for the TokenCache (see the sample code here).
For each request to the authorized resources, depending on the API, we invoke one of these methods (see code below) to get the auth_token that will be put in the request's Bearer header. Is this the correct way to do it?
We never make use of the method AcquireTokenByRefreshTokenAsync; does that mean our application never uses the refresh_token? Does it mean that our user will have to log in again after one hour? Should we implement some kind of refresh procedure with AcquireTokenByRefreshTokenAsync in the catch statement? Can it be done without prompting the end user?
REMARK: I posted a question regarding the OpenIdConnect authentication ticket lifetime. To me these two questions are unrelated, but they may be.
string signInUserId = ClaimsPrincipal.Current.FindFirst(ClaimTypes.NameIdentifier).Value;
string userObjectId = ClaimsPrincipal.Current.FindFirst("http://schemas.microsoft.com/identity/claims/objectidentifier").Value;
string tenantId = ClaimsPrincipal.Current.FindFirst("http://schemas.microsoft.com/identity/claims/tenantid").Value;

public async Task<string> AcquireOutlook365TokenAsync()
{
    AuthenticationContext authContext = new AuthenticationContext(string.Format("{0}/{1}", SettingsHelper.AuthorizationUri, tenantId), new ADALTokenCache(signInUserId));
    try
    {
        var result = await authContext.AcquireTokenSilentAsync("https://outlook.office365.com/",
            new ClientCredential(SettingsHelper.ClientId, SettingsHelper.AppKey),
            new UserIdentifier(userObjectId, UserIdentifierType.UniqueId));
        return result.AccessToken;
    }
    catch (AdalException exception)
    {
        //handle token acquisition failure
        if (exception.ErrorCode == AdalError.FailedToAcquireTokenSilently)
        {
            authContext.TokenCache.Clear();
        }
        throw new HttpResponseException(new HttpResponseMessage(HttpStatusCode.Unauthorized));
    }
}

public async Task<string> AcquireAzureGraphTokenAsync()
{
    AuthenticationContext authContext = new AuthenticationContext(string.Format("{0}/{1}", SettingsHelper.AuthorizationUri, tenantId), new ADALTokenCache(signInUserId));
    try
    {
        var result = await authContext.AcquireTokenSilentAsync("https://graph.windows.net/",
            new ClientCredential(SettingsHelper.ClientId, SettingsHelper.AppKey),
            new UserIdentifier(userObjectId, UserIdentifierType.UniqueId));
        return result.AccessToken;
    }
    catch (AdalException exception)
    {
        //Same as other method
    }
}
ADAL uses the stored refresh tokens automatically and transparently; you aren't required to perform any explicit action. AcquireTokenByRefreshToken is in the ADAL surface for legacy reasons, and it has been removed from version 3.x. More background at http://www.cloudidentity.com/blog/2015/08/13/adal-3-didnt-return-refresh-tokens-for-5-months-and-nobody-noticed/
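In other words, the AcquireTokenSilentAsync calls in the question are fine as they are: when the access token has expired, ADAL silently redeems the cached refresh token for a new one. The only case you need to handle is when silent acquisition fails outright (e.g. the refresh token has expired or been revoked). One common refinement, sketched below on top of the question's catch block, is to trigger a fresh sign-in through the OWIN OpenID Connect middleware used in the sample rather than only returning 401 (this assumes the usual Microsoft.Owin.Security.OpenIdConnect setup and is not part of the original answer):

catch (AdalException exception)
{
    if (exception.ErrorCode == AdalError.FailedToAcquireTokenSilently)
    {
        // Nothing usable in the cache, so send the user through sign-in again.
        authContext.TokenCache.Clear();
        HttpContext.Current.GetOwinContext().Authentication.Challenge(
            new AuthenticationProperties { RedirectUri = "/" },
            OpenIdConnectAuthenticationDefaults.AuthenticationType);
    }
    throw new HttpResponseException(new HttpResponseMessage(HttpStatusCode.Unauthorized));
}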
I am trying to move from a Web API based REST service to one encompassing the new implementation of OData. I have the service working correctly, but am at a loss on how to create unit tests that will test the OData query options.
When unit testing Web API methods, I am used to building the HttpRequestMessage and injecting it in the constructor:
var request = new HttpRequestMessage();
request.Headers.Add("UserName", "TestUser");
request.Headers.Add("Password", password);
request.Headers.Add("OverRideToken", "false");
request.Headers.Add("AccessSystem", "Mobile");
request.Headers.Add("Seed", "testSeed");
var token = new Token();
var authController = new AuthorizationController(request);
try
{
var returnValue = authController.Get();
How would I go about injecting the OData request? I need to verify that $filter, $inlinecount, and other options are returning the proper records.
You can either test your controller or you can test against a running instance of your Web API (you should probably do both).
Testing your controller won't achieve what you are trying to do, so you will want to test by creating a self-hosted, in-memory instance of your Web API application. You can then either use HttpClient in your test classes (you will have to manually construct OData requests), or you can use the WCF Data Services client in your test classes (this will allow you to query via LINQ).
Here's an example using WCF Data Services Client:
public class ODataContainerFactory
{
    static HttpSelfHostServer server;

    public static MyApplicationServer.Acceptance.ODataService.Container Create(Uri baseAddress)
    {
        var config = new HttpSelfHostConfiguration(baseAddress);

        // Remove self host requirement to run with Admin privileges
        config.HostNameComparisonMode = System.ServiceModel.HostNameComparisonMode.Exact;

        // Register Web API and OData Configuration
        WebApiConfig.Register(config);

        // Configure IoC
        ConfigureIoC(dataSource, config);

        // Do whatever else, e.g. setup fake data sources etc.
        ...

        // Start server
        server = new HttpSelfHostServer(config);
        server.OpenAsync().Wait();

        // Create container
        var container = new MyApplicationServer.Acceptance.ODataService.Container(new Uri(baseAddress.ToString() + "odata/"));

        // Configure container
        container.IgnoreResourceNotFoundException = true;
        container.IgnoreMissingProperties = true;

        return container;
    }

    private static void ConfigureIoC(MockDatasource dataSource, HttpSelfHostConfiguration config)
    {
        var container = new UnityContainer();
        container.RegisterType<TypeA, TypeB>();
        ...
        ...
        config.DependencyResolver = new IoCContainer(container);
    }

    public static void Destroy()
    {
        server.CloseAsync().Wait();
        server.Dispose();
    }
}
The key here is the WebApiConfig.Register(HttpConfiguration config) method call, which is calling your Web API project.
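For reference, WebApiConfig.Register in such a project typically looks something like the sketch below for an OData v3 endpoint. The entity, property, and route names here are illustrative, not taken from the answer:

using System.ComponentModel.DataAnnotations;
using System.Web.Http;
using System.Web.Http.OData.Builder;

// Illustrative entity, loosely matching the queries used later in this answer.
public class MyEntity
{
    [Key]
    public int EntityKey { get; set; }
    public string Name { get; set; }
}

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        var builder = new ODataConventionModelBuilder();
        builder.EntitySet<MyEntity>("MyEntities");

        // Exposes /odata/MyEntities with support for $filter, $inlinecount, $expand, etc.
        config.Routes.MapODataRoute("ODataRoute", "odata", builder.GetEdmModel());
    }
}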
Note that before using the factory above you will need to:
1. Fire up your Web API project.
2. In your test class, add a Service Reference to your OData root path.
This will create a Container object (in the example above MyApplicationServer.Acceptance.ODataService.Container), which you can use to query your OData feed in your tests as follows:
var odataContainer = ODataContainerFactory.Create(new Uri("http://localhost:19194/"));
var result = odataContainer.MyEntities
    .Expand(s => s.ChildReferenceType)
    .Where(s => s.EntityKey == someValue)
    .SingleOrDefault();
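Alternatively, if you take the raw HttpClient route mentioned above, a test can hit the self-hosted endpoint with the query options directly and assert on the JSON payload. A minimal sketch, assuming an MSTest project, the "odata" route prefix, and illustrative entity/property names:

using System;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Newtonsoft.Json.Linq;

[TestClass]
public class ODataQueryOptionTests
{
    [TestMethod]
    public async Task Filter_And_InlineCount_Return_Expected_Records()
    {
        var baseAddress = new Uri("http://localhost:19194/");
        ODataContainerFactory.Create(baseAddress); // starts the self-hosted server
        try
        {
            using (var client = new HttpClient { BaseAddress = baseAddress })
            {
                // The OData v3 query options under test
                var response = await client.GetAsync(
                    "odata/MyEntities?$filter=Name eq 'TestUser'&$inlinecount=allpages&$top=10");
                response.EnsureSuccessStatusCode();

                var payload = JObject.Parse(await response.Content.ReadAsStringAsync());
                var items = (JArray)payload["value"];

                // $inlinecount=allpages surfaces the total count ("odata.count" in the JSON light format)
                Assert.IsTrue((int)payload["odata.count"] >= items.Count);
                Assert.IsTrue(items.All(e => (string)e["Name"] == "TestUser"));
            }
        }
        finally
        {
            ODataContainerFactory.Destroy();
        }
    }
}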
I am using MassTransit request and response with SignalR. The web site makes a request to a windows service that creates a file. When the file has been created the windows service will send a response message back to the web site. The web site will open the file and make it available for the users to see. I want to handle the scenario where the user closes the web page before the file is created. In that case I want the created file to be emailed to them.
Regardless of whether the user has closed the web page or not, the message handler for the response message will be run. What I want is some way of knowing, within the response message handler, that the web page has been closed. This is what I have done already. It doesn't work, but it does illustrate my thinking. On the web page I have:
$(window).unload(function () {
if (event.clientY < 0) {
// $.connection.hub.stop();
$.connection.exportcreate.setIsDisconnected();
}
});
exportcreate is my hub name. In setIsDisconnected, would I set a property on Caller? Let's say I successfully set a property to indicate that the web page has been closed. How do I find out that value in the response message handler? This is what it does now:
protected void BasicResponseHandler(BasicResponse message)
{
    string groupName = CorrelationIdGroupName(message.CorrelationId);
    GetClients()[groupName].display(message.ExportGuid);
}

private static dynamic GetClients()
{
    return AspNetHost.DependencyResolver.Resolve<IConnectionManager>().GetClients<ExportCreateHub>();
}
I am using the message correlation id as a group. Now, for me the ExportGuid on the message is very important; it is used to identify the file. So if I am going to email the created file, I have to do it within the response handler, because I need the ExportGuid value. If I did store a value on Caller in my hub for the web page close, how would I access it in the response handler?
Just in case you need to know, display is defined on the web page as:
exportCreate.display = function (guid) {
    setTimeout(function () {
        top.location.href = 'GetExport.ashx?guid=' + guid;
    }, 500);
};
GetExport.ashx opens the file and returns it as a response.
Thank you,
Regards Ben
I think a better bet would be to implement proper connection handling. Specifically, have your hub implement IDisconnect and IConnected. You would then have a mapping of connectionId to document Guid.
public Task Connect()
{
    connectionManager.MapConnectionToUser(Context.ConnectionId, Context.User.Name);
    return Task.FromResult(0); // completed task; IConnected requires a Task return
}

public Task Disconnect()
{
    var connectionId = Context.ConnectionId;
    var docId = connectionManager.LookupDocumentId(connectionId);
    if (docId != Guid.Empty)
    {
        var userName = connectionManager.GetUserFromConnectionId(connectionId);
        var user = userRepository.GetUserByUserName(userName);
        bus.Publish(new EmailDocumentToUserCommand(docId, user.Email));
    }
    return Task.FromResult(0);
}

// Call from client
public void GenerateDocument(ClientParameters docParameters)
{
    var docId = Guid.NewGuid();
    connectionManager.MapDocumentIdToConnection(Context.ConnectionId, docId);
    var command = new CreateDocumentCommand(docParameters);
    command.Correlationid = docId;
    bus.Publish(command);
    Caller.creatingDocument(docId);
}

// Acknowledge you got the doc.
// Call this from the display method on the client.
// If this is not called, the disconnect method will handle sending
// by email.
public void Ack(Guid docId)
{
    connectionManager.UnmapDocumentFromConnectionId(Context.ConnectionId, docId);
    Caller.sendMessage("ok");
}
Of course, this is off the top of my head.
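The connectionManager used above isn't defined in this answer; a minimal, hypothetical in-memory implementation of the shape it would need might look like this:

using System;
using System.Collections.Concurrent;

// Hypothetical in-memory mapping used by the hub sketch above.
public class ConnectionManager
{
    private readonly ConcurrentDictionary<string, string> connectionToUser =
        new ConcurrentDictionary<string, string>();
    private readonly ConcurrentDictionary<string, Guid> connectionToDocument =
        new ConcurrentDictionary<string, Guid>();

    public void MapConnectionToUser(string connectionId, string userName)
    {
        connectionToUser[connectionId] = userName;
    }

    public string GetUserFromConnectionId(string connectionId)
    {
        string userName;
        return connectionToUser.TryGetValue(connectionId, out userName) ? userName : null;
    }

    public void MapDocumentIdToConnection(string connectionId, Guid docId)
    {
        connectionToDocument[connectionId] = docId;
    }

    public Guid LookupDocumentId(string connectionId)
    {
        Guid docId;
        return connectionToDocument.TryGetValue(connectionId, out docId) ? docId : Guid.Empty;
    }

    public void UnmapDocumentFromConnectionId(string connectionId, Guid docId)
    {
        Guid removed;
        connectionToDocument.TryRemove(connectionId, out removed);
    }
}

In a real application this would likely be persisted or at least registered as a singleton, since hubs are created per request.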
I'm working with a programmatically configured WCF client (System.ServiceModel.ClientBase). This WCF client is configured using a CustomBinding, which has a TextMessageEncodingBindingElement by default.
Now when I try to switch to MTOM encoding, I change the client's Endpoint.Binding property, which appears to work: the Endpoint.Binding property shows that it has changed.
Unfortunately, when I execute one of the methods the WCF service exposes, it still uses text message encoding, and I can't figure out why.
I've got it working though, by constructing a new ClientBase and passing the new EndPointBinding in the constructor:
socialProxy = new SocialProxyClient(SocialProxyClientSettings.SocialProxyMTomEndPointBinding, new EndpointAddress(SocialProxyClientSettings.SocialProxyEndPointAddress));
But when I try this it doesn't work:
socialProxy.Endpoint.Binding = SocialProxyClientSettings.SocialProxyMTomEndPointBinding;
These are my definitions for the EndPointBindings:
public static TextMessageEncodingBindingElement TextMessageEncodingBindingElement
{
    get
    {
        if (_textMessageEncodingBindingElement == null)
        {
            _textMessageEncodingBindingElement = new TextMessageEncodingBindingElement() { MessageVersion = MessageVersion.Soap11 };
            _textMessageEncodingBindingElement.ReaderQuotas = new System.Xml.XmlDictionaryReaderQuotas()
            {
                MaxDepth = 32,
                MaxStringContentLength = 5242880,
                MaxArrayLength = 204800000,
                MaxBytesPerRead = 5242880,
                MaxNameTableCharCount = 5242880
            };
        }
        return _textMessageEncodingBindingElement;
    }
}

public static MtomMessageEncodingBindingElement MtomMessageEncodingBindingElement
{
    get
    {
        if (_mtomMessageEncodingBindingElement == null)
        {
            _mtomMessageEncodingBindingElement = new MtomMessageEncodingBindingElement();
            _mtomMessageEncodingBindingElement.MaxReadPoolSize = TextMessageEncodingBindingElement.MaxReadPoolSize;
            _mtomMessageEncodingBindingElement.MaxWritePoolSize = TextMessageEncodingBindingElement.MaxWritePoolSize;
            _mtomMessageEncodingBindingElement.MessageVersion = TextMessageEncodingBindingElement.MessageVersion;
            _mtomMessageEncodingBindingElement.ReaderQuotas.MaxDepth = TextMessageEncodingBindingElement.ReaderQuotas.MaxDepth;
            _mtomMessageEncodingBindingElement.ReaderQuotas.MaxStringContentLength = TextMessageEncodingBindingElement.ReaderQuotas.MaxStringContentLength;
            _mtomMessageEncodingBindingElement.ReaderQuotas.MaxArrayLength = TextMessageEncodingBindingElement.ReaderQuotas.MaxArrayLength;
            _mtomMessageEncodingBindingElement.ReaderQuotas.MaxBytesPerRead = TextMessageEncodingBindingElement.ReaderQuotas.MaxBytesPerRead;
            _mtomMessageEncodingBindingElement.ReaderQuotas.MaxNameTableCharCount = TextMessageEncodingBindingElement.ReaderQuotas.MaxNameTableCharCount;
        }
        return _mtomMessageEncodingBindingElement;
    }
}
Can someone explain why changing the Endpoint.Binding programmatically doesn't work?
I believe that during construction of the ClientBase, the original binding is used to create some helper objects (e.g. the channel factory). Changing the binding later does not change those helper objects.
To make any adjustments after construction, you likely need a custom binding behavior so that you can tweak the internals of the binding as you need. Use that in the construction so all helper objects are prepared for your later changes. As usual, all you want is one simple behavior change, but you will also need to write the ancillary helper classes to support that one change.
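As a reference point for what such a hook looks like, here is the bare skeleton of a custom endpoint behavior (hypothetical and only illustrative of where the extension points are; it is not a ready-made MTOM switch):

using System.ServiceModel.Channels;
using System.ServiceModel.Description;
using System.ServiceModel.Dispatcher;

public class EncodingTweakBehavior : IEndpointBehavior
{
    public void AddBindingParameters(ServiceEndpoint endpoint, BindingParameterCollection bindingParameters)
    {
        // Runs before the channel factory is built; binding-related parameters can be inspected or adjusted here.
    }

    public void ApplyClientBehavior(ServiceEndpoint endpoint, ClientRuntime clientRuntime)
    {
        // Client-side hook: add message inspectors, etc.
    }

    public void ApplyDispatchBehavior(ServiceEndpoint endpoint, EndpointDispatcher endpointDispatcher)
    {
        // Not used on the client side.
    }

    public void Validate(ServiceEndpoint endpoint)
    {
    }
}

It would be attached with client.Endpoint.Behaviors.Add(new EncodingTweakBehavior()); before the first call, since the channel machinery is only built when the client is first used.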
See the SO thread ONVIF Authentication in .NET 4.0 with Visual Studio 2010 for a discussion of CustomBinding issues.
See the blog post Supporting the WS-I Basic Profile Password Digest in a WCF Client Proxy for an example of a custom behavior that lets you change the Username Token on the fly.
Perhaps something similar can be done to let you control the local endpoint binding on the fly.
UPDATE: After more reading here on Stack Overflow, and the pages it links to, I believe I have found the answer you are looking for.
For PasswordDigestBehavior:
see: ONVIF Authentication in .NET 4.0 with Visual Studio 2010
and: http://benpowell.org/supporting-the-ws-i-basic-profile-password-digest-in-a-wcf-client-proxy/
For local NIC binding:
see: Specify the outgoing IP address to use with WCF client
// ASSUMPTIONS:
// 1: DeviceClient is generated by svcutil from your WSDL.
// 1.1: DeviceClient is derived from
//      System.ServiceModel.ClientBase<Your.Wsdl.Device>
// 2: serviceAddress is the Uri provided for your service.
//
private static DeviceClient CreateDeviceClient(IPAddress nicAddress,
                                               Uri serviceAddress,
                                               String username,
                                               String password)
{
    if (null == serviceAddress)
        throw new ArgumentNullException("serviceAddress");

    //////////////////////////////////////////////////////////////////////////////
    // I didn't know how to put a variable set of credentials into a static
    // app.config file.
    // But I found this article that talks about how to set up the right kind
    // of binding on the fly.
    // I also found the implementation of PasswordDigestBehavior to get it all to work.
    //
    // from: https://stackoverflow.com/questions/5638247/onvif-authentication-in-net-4-0-with-visual-studios-2010
    // see: http://benpowell.org/supporting-the-ws-i-basic-profile-password-digest-in-a-wcf-client-proxy/
    //
    EndpointAddress serviceEndpointAddress = new EndpointAddress(serviceAddress);

    HttpTransportBindingElement httpBinding = new HttpTransportBindingElement();
    if (!String.IsNullOrEmpty(username))
    {
        httpBinding.AuthenticationScheme = AuthenticationSchemes.Digest;
    }
    else
    {
        httpBinding.AuthenticationScheme = AuthenticationSchemes.Anonymous;
    }

    var messageElement = new TextMessageEncodingBindingElement();
    messageElement.MessageVersion =
        MessageVersion.CreateVersion(EnvelopeVersion.Soap12, AddressingVersion.None);

    CustomBinding bind = new CustomBinding(messageElement, httpBinding);

    ////////////////////////////////////////////////////////////////////////////////
    // from: https://stackoverflow.com/questions/3249846/specify-the-outgoing-ip-address-to-use-with-wcf-client
    // Adjust the serviceEndpointAddress to bind to the local NIC, if at all possible.
    //
    ServicePoint sPoint = ServicePointManager.FindServicePoint(serviceAddress);
    sPoint.BindIPEndPointDelegate = delegate(
        System.Net.ServicePoint servicePoint,
        System.Net.IPEndPoint remoteEndPoint,
        int retryCount)
    {
        // if we know our NIC local address, use it
        //
        if ((null != nicAddress)
            && (nicAddress.AddressFamily == remoteEndPoint.AddressFamily))
        {
            return new System.Net.IPEndPoint(nicAddress, 0);
        }
        else if (System.Net.Sockets.AddressFamily.InterNetworkV6 == remoteEndPoint.AddressFamily)
        {
            return new System.Net.IPEndPoint(System.Net.IPAddress.IPv6Any, 0);
        }
        else // if (System.Net.Sockets.AddressFamily.InterNetwork == remoteEndPoint.AddressFamily)
        {
            return new System.Net.IPEndPoint(System.Net.IPAddress.Any, 0);
        }
    };

    /////////////////////////////////////////////////////////////////////////////
    DeviceClient client = new DeviceClient(bind, serviceEndpointAddress);

    // Add our custom behavior
    // - this requires the Microsoft WSE 3.0 SDK file: Microsoft.Web.Services3.dll
    //
    PasswordDigestBehavior behavior = new PasswordDigestBehavior(username, password);
    client.Endpoint.Behaviors.Add(behavior);

    return client;
}