I was using the below code in Xamarin.Android:
GoogleApiClient client = new GoogleApiClient.Builder(CrossCurrentActivity.Current.Activity)
.AddApi(PlacesClass.GEO_DATA_API)
.Build();
client.Connect();
AutocompletePredictionBuffer buffer = await PlacesClass.GeoDataApi
.GetAutocompletePredictionsAsync(client, query, bounds, null);
The buffer used to return the required details. After the deprecation of Android.Gms.Location.Places, I enabled the Places API from the console and started using the new package Google.Places:
AutocompleteSessionToken token = AutocompleteSessionToken.NewInstance();
PlacesClient placesClient = Places.CreateClient(this);
var request = FindAutocompletePredictionsRequest.InvokeBuilder()
    .SetLocationBias(bounds1)
    .SetCountry("US")
    .SetSessionToken(token)
    .SetQuery(query)
    .Build();
var prediction = placesClient.FindAutocompletePredictions(request);
var resp = prediction.Result;
The application breaks on prediction.Result due to there being no result. What am I missing?
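For reference: in the Xamarin binding, FindAutocompletePredictions returns an Android.Gms.Tasks.Task, which has not completed at the point .Result is read. A minimal sketch of awaiting the task instead, assuming the AsAsync<TResult> extension from Android.Gms.Extensions is available:
// Sketch: await the Play Services task rather than reading .Result synchronously.
using Android.Gms.Extensions;
var response = await placesClient
    .FindAutocompletePredictions(request)
    .AsAsync<FindAutocompletePredictionsResponse>();
foreach (var p in response.AutocompletePredictions)
{
    Console.WriteLine(p.GetFullText(null));
}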
I'm trying to fetch messages on behalf of a user from Graph. I'm able to create a client using the usual Microsoft Graph, but I just can't get it to work with Aspose's version, IGraphClient. How do I gain access on behalf of the user?
Below is how I create a Microsoft Graph client. It works, but I need the Aspose version, IGraphClient.
var tenantId = tenantID;
var clientId = clientID;
var clientSecret = secret;
var options = new TokenCredentialOptions
{
AuthorityHost = AzureAuthorityHosts.AzurePublicCloud
};
var clientSecretCredential = new ClientSecretCredential(
tenantId, clientId, clientSecret, options);
var client = new GraphServiceClient(clientSecretCredential);
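That client uses app-only (client credentials) auth. For on-behalf-of access specifically, Azure.Identity also ships an OnBehalfOfCredential that exchanges the user's incoming token for a downstream Graph token. A minimal sketch (incomingUserToken is a placeholder for the bearer token your API received; this still leaves open how to plug the credential into Aspose's IGraphClient):
// Sketch: on-behalf-of flow via Azure.Identity.
var oboCredential = new OnBehalfOfCredential(
    tenantId, clientId, clientSecret, userAssertion: incomingUserToken);
var graphClient = new GraphServiceClient(oboCredential);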
I have a Xamarin app --> Azure Function --> Blob Storage. So far so good.
The Azure Function is set with AuthorizationLevel.Function.
I have set the Azure Function's managed identity to "On".
I have assigned a role on the Blob Storage (Storage Blob Data Contributor).
I can successfully call the function from Postman using the function key.
I would like to store the function key in Key Vault and call the function from my mobile app.
Question
Has anybody got a walkthrough and snippet showing how to integrate Key Vault with a function key and call it from a mobile app (Xamarin.Forms, C#)?
I do not want to hardcode any keys in my mobile app.
I would be very grateful. Lots of googling and nothing.
Thanks.
Supposing your requirement is to call the function from code, maybe you could refer to the below code:
// Authenticate to Key Vault with the function app's managed identity.
AzureServiceTokenProvider azureServiceTokenProvider = new AzureServiceTokenProvider();
KeyVaultClient keyVaultClient = new KeyVaultClient(new KeyVaultClient.AuthenticationCallback(azureServiceTokenProvider.KeyVaultTokenCallback));
// Read the function key stored as a Key Vault secret.
var secret = await keyVaultClient.GetSecretAsync("your Secret Identifier")
    .ConfigureAwait(false);
string functionkey = secret.Value;
// Pass the key to the function as the "code" query parameter.
string functionhost = "https://your function.azurewebsites.net/api/function name";
var param = new Dictionary<string, string> { { "code", functionkey }, { "name", "george" } };
Uri functionurl = new Uri(QueryHelpers.AddQueryString(functionhost, param));
var request = (HttpWebRequest)WebRequest.Create(functionurl);
var response = (HttpWebResponse)request.GetResponse();
string responseString;
using (var stream = response.GetResponseStream())
{
using (var reader = new StreamReader(stream))
{
responseString = reader.ReadToEnd();
Console.WriteLine(responseString);
}
}
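On the Xamarin side, the same call can be made with HttpClient, which tends to be more idiomatic in mobile code; a sketch reusing the functionhost and param values from above:
// Sketch: the same request with HttpClient instead of HttpWebRequest.
using (var httpClient = new HttpClient())
{
    var url = QueryHelpers.AddQueryString(functionhost, param);
    var responseString = await httpClient.GetStringAsync(url);
    Console.WriteLine(responseString);
}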
I wish to create a large (multi-GB) file in an AWS S3 bucket from an ASP.NET Core Web API. The file is sufficiently large that I wish not to load the Stream into memory prior to uploading it to AWS S3.
Using PutObjectAsync(), I'm forced to pre-populate the Stream prior to passing it on to the AWS SDK, as illustrated below:
var putObjectRequest = new PutObjectRequest
{
BucketName = "my-s3-bucket",
Key = "my-file-name.txt",
InputStream = stream
};
var putObjectResponse = await amazonS3Client.PutObjectAsync(putObjectRequest);
My ideal pattern would involve the AWS SDK returning a StreamWriter (of sorts) I could Write() to many times and then Finalise() when I'm done.
Two questions concerning my challenge:
Am I misinformed about having to pre-populate the Stream prior to calling on PutObjectAsync()?
How should I go about uploading my large (multi-GB) file?
For such situations the AWS docs provide two options:
Using the AWS .NET SDK for Multipart Upload (High-Level API)
Using the AWS .NET SDK for Multipart Upload (Low-Level API)
The high-level API simply has you create a TransferUtilityUploadRequest with a PartSize specified, so the TransferUtility class itself uploads the file without any need for you to manage the multipart upload yourself. You can track the progress of the multipart upload by subscribing to the StreamTransferProgress event. You can upload a file, a stream, or a directory, as sketched below.
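A minimal sketch of the high-level path (the bucket, key, and file path values are placeholders):
// Sketch: high-level multipart upload via TransferUtility.
var s3Client = new AmazonS3Client(Amazon.RegionEndpoint.USEast1);
var transferUtility = new TransferUtility(s3Client);
var uploadRequest = new TransferUtilityUploadRequest
{
    BucketName = existingBucketName,
    Key = keyName,
    FilePath = filePath,
    PartSize = 5242880 // 5 MB parts
};
// Report progress as parts are transferred.
uploadRequest.UploadProgressEvent += (sender, e) =>
    Console.WriteLine($"{e.TransferredBytes} of {e.TotalBytes} bytes");
await transferUtility.UploadAsync(uploadRequest);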
The low-level API, obviously, is more complicated but more flexible: you initiate the upload, and after that you upload each part of the file in a loop. Sample code from the documentation:
var s3Client = new AmazonS3Client(Amazon.RegionEndpoint.USEast1);
// List to store upload part responses.
var uploadResponses = new List<UploadPartResponse>();
// 1. Initialize.
var initiateRequest = new InitiateMultipartUploadRequest
{
BucketName = existingBucketName,
Key = keyName
};
var initResponse = s3Client.InitiateMultipartUpload(initiateRequest);
// 2. Upload Parts.
var contentLength = new FileInfo(filePath).Length;
var partSize = 5242880; // 5 MB
try
{
long filePosition = 0;
for (var i = 1; filePosition < contentLength; ++i)
{
// Create request to upload a part.
var uploadRequest = new UploadPartRequest
{
BucketName = existingBucketName,
Key = keyName,
UploadId = initResponse.UploadId,
PartNumber = i,
PartSize = partSize,
FilePosition = filePosition,
FilePath = filePath
};
// Upload part and add response to our list.
uploadResponses.Add(s3Client.UploadPart(uploadRequest));
filePosition += partSize;
}
// Step 3: complete.
var completeRequest = new CompleteMultipartUploadRequest
{
BucketName = existingBucketName,
Key = keyName,
UploadId = initResponse.UploadId,
};
// add ETags for uploaded files
completeRequest.AddPartETags(uploadResponses);
var completeUploadResponse = s3Client.CompleteMultipartUpload(completeRequest);
}
catch (Exception exception)
{
Console.WriteLine("Exception occurred: {0}", exception.ToString());
var abortMPURequest = new AbortMultipartUploadRequest
{
BucketName = existingBucketName,
Key = keyName,
UploadId = initResponse.UploadId
};
s3Client.AbortMultipartUpload(abortMPURequest);
}
An asynchronous version of UploadPart is available too, so you should investigate that path if you need full control over your uploads.
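For instance, the part-upload line above could become (a sketch):
// Sketch: the awaitable counterpart of s3Client.UploadPart.
var uploadPartResponse = await s3Client.UploadPartAsync(uploadRequest);
uploadResponses.Add(uploadPartResponse);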
I have two IdentityServers and one Web API.
What I'm trying to do is have the API authenticate against one or both of the IdentityServers and be able to switch if one goes down. If possible I would also like to be able to add a new IdentityServer at runtime.
Is there any best practice here?
As of now it looks like this.
app.UseIdentityServerAuthentication(new IdentityServerAuthenticationOptions
{
Authority = $"http://localhost:5000",
ScopeName = "my.scope",
RequireHttpsMetadata = false,
ScopeSecret = "secret",
});
If I shut down the IdentityServer on port 5000 I can't use the API anymore, which is to be expected.
I'm not sure if this is a good way to solve it, but it's one way.
I ask my routing service for the "first" IdentityServer to set the Authority in the options.
And then I add a custom IntrospectionBackChannelHandler:
app.UseIdentityServerAuthentication(new IdentityServerAuthenticationOptions
{
    Authority = $"http://{v.Address}:{v.Port}",
    IntrospectionBackChannelHandler = new CustomIntrospectionBackChannelHandler(consulService)
});
Since all my IdentityServers look the same but are on different addresses, I don't really have to bother with the Authority again.
Inside the custom IntrospectionBackChannelHandler I check each introspection request and send it to the "correct" IdentityServer. If that's not working I try another IdentityServer:
var qs = await request.Content.ReadAsStringAsync();
var queryDic = QueryHelpers.ParseQuery(qs);
var token = queryDic["token"];
var client_id = queryDic["client_id"];
var client_secret = queryDic["client_secret"];
var iRequest = new IntrospectionRequest
{
ClientId = client_id,
ClientSecret = client_secret,
TokenTypeHint = "access_token",
Token = token
};
IntrospectionResponse result = null;
var svc = await _Consul.GetService(OrbitServices.IdentityServer);
result = await TrySendAsync(iRequest, svc);
if (!result.IsActive && result.IsError)
{
svc = await _Consul.GetService(OrbitServices.IdentityServer, true);
result = await TrySendAsync(iRequest, svc);
}
var message = new HttpResponseMessage(HttpStatusCode.OK)
{
Content = new StringContent(result.Raw, Encoding.UTF8, "application/json")
};
return message;
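TrySendAsync isn't shown above; a minimal sketch of what it could look like with IdentityModel's IntrospectionClient (the /connect/introspect path and the Consul AgentService type are assumptions here):
// Sketch: forward the introspection request to the discovered IdentityServer.
private async Task<IntrospectionResponse> TrySendAsync(IntrospectionRequest request, AgentService svc)
{
    var endpoint = $"http://{svc.Address}:{svc.Port}/connect/introspect";
    var client = new IntrospectionClient(endpoint, request.ClientId, request.ClientSecret);
    return await client.SendAsync(request);
}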
I am using ASP.NET Web API and the Google.Apis.Drive.v2 Client Library for .NET to upload files to users' Drives.
All examples of using the Drive Client Library for .NET require an authentication flow. But how should I create the DriveService when I already know the access token?
Despite the fact that it has been 2 years since the question was asked, today I encountered the same situation, and my solution is:
var valid_token = "Pass_the_valid_token_here";
var token = new Google.Apis.Auth.OAuth2.Responses.TokenResponse()
{
AccessToken = valid_token,
ExpiresInSeconds = 3600,
Issued = DateTime.Now
};
var fakeflow = new GoogleAuthorizationCodeFlow(new GoogleAuthorizationCodeFlow.Initializer
{
ClientSecrets = new ClientSecrets
{
ClientId = "fakeClientId",
ClientSecret = "fakeClientSecret"
}
});
UserCredential credential = new UserCredential(fakeflow, "fakeUserId", token);
var serviceInitializer = new BaseClientService.Initializer()
{
//ApplicationName = "Storage Sample",
HttpClientInitializer = credential
};
DriveService service = new DriveService(serviceInitializer);
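From there the service can be used as usual; for example, a quick sanity check (a sketch, using Drive v2's MaxResults paging):
// Sketch: list a few files to verify the token works.
var listRequest = service.Files.List();
listRequest.MaxResults = 10;
var fileList = await listRequest.ExecuteAsync();
foreach (var file in fileList.Items)
{
    Console.WriteLine(file.Title);
}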
Update
You could create your own custom token, but the issue with this is going to be that the client library will not be able to refresh your access without the refresh token.
var token = new Google.Apis.Auth.OAuth2.Responses.TokenResponse()
{
AccessToken = valid_token,
ExpiresInSeconds = 3600,
Issued = DateTime.Now
};
var authorization = new GoogleAuthorizationCodeFlow(new GoogleAuthorizationCodeFlow.Initializer
{
ClientSecrets = new ClientSecrets
{
ClientId = "lientId",
ClientSecret = "ClientSecret"
}
});
var credential = new UserCredential(authorization, "user", token);
The issue you are going to have with this is that the client library is not going to be able to refresh the access token after it has expired. Since you are not supplying a refresh token, it's only going to work for an hour.
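If you can obtain a refresh token as well, supplying it in the TokenResponse should let the library refresh expired access tokens on its own (a sketch; valid_refresh_token is assumed to be available from your auth flow):
var token = new Google.Apis.Auth.OAuth2.Responses.TokenResponse()
{
    AccessToken = valid_token,
    RefreshToken = valid_refresh_token, // enables automatic refresh
    ExpiresInSeconds = 3600,
    Issued = DateTime.Now
};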
The answer from Svetoslav Georgiev has so far worked well for me - can't thank you enough. Google really doesn't help themselves with the lack of .NET (ASP.NET Core) samples etc. Anyway, one problem I did run into was a referer restriction, so an addition/slight modification to the answer: once you have the "service" and want to, say, upload a file, you need to set the referrer on a buried HttpClient property...
FilesResource.CreateMediaUpload uploadRequest;
byte[] byteArray = Encoding.UTF8.GetBytes(html);
using (var stream = new MemoryStream(byteArray))
{
uploadRequest = service.Files.Create(fileMetadata, stream, "text/html");
uploadRequest.Service.HttpClient.DefaultRequestHeaders.Referrer = new Uri($"{baseUrl}");
uploadRequest.Fields = "id";
var progress = uploadRequest.Upload();
if (progress.Exception != null)
{
throw progress.Exception;
}
var file = uploadRequest.ResponseBody;
// ... do what you will with file ...
}