Null response creating file using Google Drive .NET API - asp.net

I am trying to upload a file to my Drive using the Google Drive .NET API v3. My code is below:
static string[] Scopes = { DriveService.Scope.Drive,
DriveService.Scope.DriveAppdata,
DriveService.Scope.DriveFile,
DriveService.Scope.DriveMetadataReadonly,
DriveService.Scope.DriveReadonly,
DriveService.Scope.DriveScripts };
static string ApplicationName = "Drive API .NET Quickstart";
public ActionResult Index()
{
UserCredential credential;
using (var stream =
new FileStream("C:/Users/admin1/Documents/visual studio 2017/Projects/TryGoogleDrive/TryGoogleDrive/client_secret.json", FileMode.Open, FileAccess.Read))
{
string credPath = Environment.GetFolderPath(
System.Environment.SpecialFolder.Personal);
credPath = Path.Combine(credPath, ".credentials/drive-dotnet-quickstart.json");
credential = GoogleWebAuthorizationBroker.AuthorizeAsync(
GoogleClientSecrets.Load(stream).Secrets,
Scopes,
"user",
CancellationToken.None,
new FileDataStore(credPath, true)).Result;
Debug.WriteLine("Credential file saved to: " + credPath);
}
// Create Drive API service.
var service = new DriveService(new BaseClientService.Initializer()
{
HttpClientInitializer = credential,
ApplicationName = ApplicationName,
});
// Define parameters of request.
FilesResource.ListRequest listRequest = service.Files.List();
listRequest.PageSize = 10;
listRequest.Fields = "nextPageToken, files(id, name)";
// List files.
IList<Google.Apis.Drive.v3.Data.File> files = listRequest.Execute().Files;
Debug.WriteLine("Files:");
if (files != null && files.Count > 0)
{
foreach (var file in files)
{
Debug.WriteLine("{0} ({1})", file.Name, file.Id);
}
}
else
{
Debug.WriteLine("No files found.");
}
var fileMetadata = new Google.Apis.Drive.v3.Data.File()
{
Name = "report.csv",
MimeType = "text/csv",
};
FilesResource.CreateMediaUpload request;
using (var stream = new FileStream("C:/debugging/report.csv",
FileMode.Open))
{
request = service.Files.Create(
fileMetadata, stream, "text/csv");
request.Fields = "id";
request.Upload();
}
var response = request.ResponseBody;
Console.WriteLine("File ID: " + response.Id);
return View();
}
The problem I'm facing is that response is always null. Looking into it further, I found that the request returned a 403 result code. I also took a look at a couple of similar questions on SO, but neither was of any help.
Edit: I forgot to mention that the first part of the code works correctly: it lists all the files in my Drive. Only the second part (the file upload) is not working.

string[] Scopes = { DriveService.Scope.Drive };
Change the Drive scope as above, then delete the token.json file.
In VS2017, the token.json file appears in the token.json folder once the client_secret.json file is present.

Try visiting this post from the ASP.NET forum. It covers the same idea as what you want to do in your app, since you are dealing with uploading a file to Google Drive using .NET.
You may also try calling the REST API directly to achieve your requirement; the .NET quickstart will help you make requests to the Drive API.
Upload Files:
The Drive API allows you to upload file data when creating or
updating a File resource.
You can send upload requests in any of the following ways:
- Simple upload: uploadType=media. For quick transfer of a small file (5 MB or less). To perform a simple upload, refer to Performing a Simple Upload.
- Multipart upload: uploadType=multipart. For quick transfer of a small file (5 MB or less) and metadata describing the file, all in a single request. To perform a multipart upload, refer to Performing a Multipart Upload.
- Resumable upload: uploadType=resumable. For more reliable transfer, especially important with large files. Resumable uploads are a good choice for most applications, since they also work for small files at the cost of one additional HTTP request per upload. To perform a resumable upload, refer to Performing a Resumable Upload.
You may try this code from the documentation on uploading a sample file.
var fileMetadata = new File()
{
Name = "photo.jpg"
};
FilesResource.CreateMediaUpload request;
using (var stream = new System.IO.FileStream("files/photo.jpg",
System.IO.FileMode.Open))
{
request = driveService.Files.Create(
fileMetadata, stream, "image/jpeg");
request.Fields = "id";
request.Upload();
}
var file = request.ResponseBody;
Console.WriteLine("File ID: " + file.Id);
You may check the errors you may encounter in this documentation.

Have a look at what request.Upload() returns. For me, when I was having this issue, it returned:
Insufficient Permission Errors [Message[Insufficient Permission] Location[ - ]
I changed my scope from DriveService.Scope.DriveReadonly to DriveService.Scope.Drive and I was in business.

Change static string[] Scopes = { DriveService.Scope.DriveReadonly }; to static string[] Scopes = { DriveService.Scope.Drive };.
After the change, take a look into the token.json file and check whether its scope changed from DriveReadonly to Drive.
If you still see DriveReadonly, delete the token.json file and run the application again.
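To automate that check, a small sketch like the following could inspect the cached token before you decide to delete it. The JSON shape mirrors what the auth library's FileDataStore caches (a TokenResponse with a "scope" field); the class and helper names here are hypothetical, and the example payload is made up.

```csharp
// Hypothetical helper: decide whether the cached token.json was issued for
// the read-only scope and therefore needs deleting before the broader
// DriveService.Scope.Drive takes effect.
using System;
using System.Text.Json;

public static class TokenScopeCheck
{
    public static bool IsReadOnlyToken(string tokenJson)
    {
        using var doc = JsonDocument.Parse(tokenJson);
        // FileDataStore serializes the TokenResponse, including its "scope".
        var scope = doc.RootElement.GetProperty("scope").GetString() ?? "";
        return scope.Contains("drive.readonly");
    }

    public static void Main()
    {
        // Example payload shaped like a cached token issued for DriveReadonly.
        var stale = "{\"access_token\":\"ya29.dummy\",\"scope\":\"https://www.googleapis.com/auth/drive.readonly\"}";
        Console.WriteLine(IsReadOnlyToken(stale)
            ? "Stale read-only token: delete token.json and re-authorize"
            : "Token already carries the full Drive scope");
    }
}
```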

Related

Is it possible to connect Azure to google drive?

I am writing a system in ASP.NET Core and deploying it to Azure.
I'm looking for a free place to store the files that customers upload.
Azure Storage costs money, so I thought of connecting the system to Google Drive instead.
Is that possible? And can you explain to me how?
You could definitely store your files in Google Drive. Use the Google Drive API, and do it with a service account, since it sounds like you will only be uploading to an account you control.
GoogleCredential credential;
using (var stream = new FileStream(serviceAccountCredentialFilePath, FileMode.Open, FileAccess.Read))
{
credential = GoogleCredential.FromStream(stream)
.CreateScoped(scopes);
}
// Create the Drive service.
return new DriveService(new BaseClientService.Initializer()
{
HttpClientInitializer = credential,
ApplicationName = "Drive Service account Authentication Sample",
});
Uploading files to Google Drive API with a service account
As for the Azure hosting part, I recommend using the JSON key file as opposed to the p12 certificate file; I have had issues with the latter in Azure before.
Upload from memory
As the files your users upload may exist only as a memory stream rather than as files on your hard drive, this is also possible using a MemoryStream:
var uploadString = "Test";
var fileName = "ploadFileString.txt";
// Upload file Metadata
var fileMetadata = new Google.Apis.Drive.v3.Data.File()
{
Name = fileName,
Parents = new List<string>() { "1R_QjyKyvET838G6loFSRu27C-3ASMJJa" } // folder to upload the file to
};
var fsSource = new MemoryStream(Encoding.UTF8.GetBytes(uploadString ?? ""));
string uploadedFileId;
// Create a new file, with metadata and stream.
var request = service.Files.Create(fileMetadata, fsSource, "text/plain");
request.Fields = "*";
var results = await request.UploadAsync(CancellationToken.None);
if (results.Status == UploadStatus.Failed)
{
Console.WriteLine($"Error uploading file: {results.Exception.Message}");
}
// the file id of the new file we created
uploadedFileId = request.ResponseBody?.Id;
How to upload to Google Drive API from memory with C#

Sending an Excel file from Bot to User in Microsoft Bot framework V4

I am developing a bot application using the MS Bot Framework V4. I want to send an Excel file (.xlsx) from the bot to the user; it is a template which the user later fills in and sends back to the bot. With the following code I can see the file in the chat window, but the user is not able to download it. How can I make the file downloadable?
Note: please see the ELSE part below, where the user is supposed to download the Excel template.
private async Task<DialogTurnResult> UploadFileTemplateCheckStepAsync(WaterfallStepContext stepContext, CancellationToken cancellationToken)
{
string choice = ((FoundChoice)stepContext.Result).Value;
if (choice.ToLower().Equals("yes"))
{
stepContext.Values["IsFileTemplate"] = (string)stepContext.Result;
return await stepContext.PromptAsync(nameof(TextPrompt), new PromptOptions
{ Prompt = MessageFactory.Text("Please upload the file.") }, cancellationToken);
}
else
{
//Provide the template.
var ImagePath = Path.Combine(_env.WebRootPath, "TeeUpTemplate.xlsx");
var ImageData = Convert.ToBase64String(File.ReadAllBytes(ImagePath));
Attachment attachment = new Attachment()
{
Name = "TeeUpTemplate.xlsx", ContentType = "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
ThumbnailUrl = Path.Combine(_env.WebRootPath, "Excel.png"),
ContentUrl = $"data:application/vnd.ms-excel;base64,{ImageData}"
};
IMessageActivity reply = MessageFactory.Text("Use this template to fill the file.");
reply.Attachments = new List<Attachment>();
reply.Attachments.Add(attachment);
reply.From = stepContext.Context.Activity.From;
await stepContext.Context.SendActivityAsync(reply, cancellationToken);
return await stepContext.PromptAsync(nameof(TextPrompt), new PromptOptions
{ Prompt = MessageFactory.Text("Upload the file") }, cancellationToken);
}
}
Here is the solution I implemented. If your bot works with attachments, a OneDrive or SharePoint subscription is mandatory; I used a OneDrive subscription. Every attachment is first uploaded to or downloaded from your OneDrive account, from where it goes to the bot.
The major drawback for developers is that you cannot test this in the Bot Emulator; it has to be tested directly in Teams, because ContentType = application/vnd.microsoft.teams.file.download.info is only supported in Teams.
One more point: you will have to upload a manifest file with
"supportsFiles": true,
You may refer to this C# sample for more insight:
https://github.com/microsoft/BotBuilder-Samples/tree/main/samples/csharp_dotnetcore/56.teams-file-upload
Thank you.
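For reference, file support is declared on the bot entry in the Teams app manifest; a minimal fragment might look roughly like this (the botId is a placeholder):

```json
{
  "bots": [
    {
      "botId": "00000000-0000-0000-0000-000000000000",
      "scopes": [ "personal" ],
      "supportsFiles": true
    }
  ]
}
```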

Stream a multi-GB file to AWS S3 from ASP.NET Core Web API

I wish to create a large (multi-GB) file in an AWS S3 bucket from an ASP.NET Core Web API. The file is sufficiently large that I wish not to load the Stream into memory prior to uploading it to AWS S3.
Using PutObjectAsync() I'm forced to pre-populate the Stream prior to passing it on to the AWS SDK, illustrated below:
var putObjectRequest = new PutObjectRequest
{
BucketName = "my-s3-bucket",
Key = "my-file-name.txt",
InputStream = stream
};
var putObjectResponse = await amazonS3Client.PutObjectAsync(putObjectRequest);
My ideal pattern would involve the AWS SDK returning a StreamWriter (of sorts) I could Write() to many times and then Finalise() when I'm done.
Two questions concerning my challenge:
Am I misinformed about having to pre-populate the Stream prior to calling PutObjectAsync()?
How should I go about uploading my large (multi-GB) file?
For such situations, the AWS docs provide two options:
Using the AWS .NET SDK for Multipart Upload (High-Level API)
Using the AWS .NET SDK for Multipart Upload (Low-Level API)
The high-level API simply has you create a TransferUtilityUploadRequest with a PartSize specified, so the class itself can carry out the upload without you maintaining it yourself. In this case you can get the progress of the multipart upload by subscribing to the StreamTransferProgress event. You can upload a file, a stream, or a directory.
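As a rough sketch of that high-level path (assuming the AWSSDK.S3 NuGet package and configured AWS credentials; the bucket name, key, region, and part size below are placeholders, not values from the question):

```csharp
// Sketch only: stream a large upload through TransferUtility so the whole
// file never has to sit in memory at once.
using System;
using System.IO;
using System.Threading.Tasks;
using Amazon;
using Amazon.S3;
using Amazon.S3.Transfer;

public static class HighLevelUploadSketch
{
    public static async Task UploadLargeStreamAsync(Stream source)
    {
        var s3Client = new AmazonS3Client(RegionEndpoint.USEast1);
        var transferUtility = new TransferUtility(s3Client);

        var uploadRequest = new TransferUtilityUploadRequest
        {
            BucketName = "my-s3-bucket",   // placeholder
            Key = "my-file-name.txt",      // placeholder
            InputStream = source,          // read incrementally, part by part
            PartSize = 16 * 1024 * 1024    // 16 MB parts; tune as needed
        };

        // Progress can be observed as the transfer proceeds.
        uploadRequest.UploadProgressEvent += (sender, e) =>
            Console.WriteLine($"{e.TransferredBytes}/{e.TotalBytes} bytes");

        await transferUtility.UploadAsync(uploadRequest);
    }
}
```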
The low-level API, obviously, is more complicated but more flexible: you initiate the upload, and after that you upload each subsequent part of the file in a loop. Sample code from the documentation:
var s3Client = new AmazonS3Client(Amazon.RegionEndpoint.USEast1);
// List to store upload part responses.
var uploadResponses = new List<UploadPartResponse>();
// 1. Initialize.
var initiateRequest = new InitiateMultipartUploadRequest
{
BucketName = existingBucketName,
Key = keyName
};
var initResponse = s3Client.InitiateMultipartUpload(initiateRequest);
// 2. Upload Parts.
var contentLength = new FileInfo(filePath).Length;
var partSize = 5242880; // 5 MB
try
{
long filePosition = 0;
for (var i = 1; filePosition < contentLength; ++i)
{
// Create request to upload a part.
var uploadRequest = new UploadPartRequest
{
BucketName = existingBucketName,
Key = keyName,
UploadId = initResponse.UploadId,
PartNumber = i,
PartSize = partSize,
FilePosition = filePosition,
FilePath = filePath
};
// Upload part and add response to our list.
uploadResponses.Add(s3Client.UploadPart(uploadRequest));
filePosition += partSize;
}
// Step 3: complete.
var completeRequest = new CompleteMultipartUploadRequest
{
BucketName = existingBucketName,
Key = keyName,
UploadId = initResponse.UploadId,
};
// add ETags for uploaded files
completeRequest.AddPartETags(uploadResponses);
var completeUploadResponse = s3Client.CompleteMultipartUpload(completeRequest);
}
catch (Exception exception)
{
Console.WriteLine("Exception occurred: {0}", exception.ToString());
var abortMPURequest = new AbortMultipartUploadRequest
{
BucketName = existingBucketName,
Key = keyName,
UploadId = initResponse.UploadId
};
s3Client.AbortMultipartUpload(abortMPURequest);
}
An asynchronous version of UploadPart is available too, so you should investigate that path if you need full control over your uploads.

Google Drive Access: At least one client secrets (Installed or Web) should be set

We have tested the Google Drive demo code (for a console application) and it worked correctly. After that, we tried to implement the same in a web application, which at present is hosted at localhost. The application gives us this exception:
An exception of type 'System.InvalidOperationException' occurred in Google.Apis.Auth.dll but was not handled in user code. Additional
information: At least one client secrets (Installed or Web) should be
set
The code we are trying to run is this:
UserCredential credential;
GoogleClientSecrets s= new GoogleClientSecrets();
s.Secrets.ClientId="xxxxxxxxxx-xxxxxxxx.apps.googleusercontent.com";
s.Secrets.ClientSecret="yyyyyyyyyyyyyyyyyyyyy";
credential = GoogleWebAuthorizationBroker.AuthorizeAsync(s.Secrets,Scopes,"user",CancellationToken.None,null).Result;
var service = new DriveService(new BaseClientService.Initializer()
{
HttpClientInitializer = credential
//ApplicationName = ApplicationName,
});
Before this, we used a refresh token to obtain an access token. We are also not sure what to do with the access token stored in a string variable.
We are trying to have the Drive accessed unattended; we used the Google OAuth playground to get the first refresh token.
Please help us out; we have been trying to get this done for the last 7 days with no success.
Code for a service account:
string[] scopes = new string[] {DriveService.Scope.Drive}; // Full access
var keyFilePath = @"c:\file.p12" ; // Downloaded from https://console.developers.google.com
var serviceAccountEmail = "xx@developer.gserviceaccount.com"; // found https://console.developers.google.com
//loading the Key file
var certificate = new X509Certificate2(keyFilePath, "notasecret", X509KeyStorageFlags.Exportable);
var credential = new ServiceAccountCredential( new ServiceAccountCredential.Initializer(serviceAccountEmail) {
Scopes = scopes}.FromCertificate(certificate));
Code ripped from Google Drive Authentication C#, which also includes an example of using OAuth2.
Upload file
/// <summary>
/// Uploads a file
/// Documentation: https://developers.google.com/drive/v2/reference/files/insert
/// </summary>
/// <param name="_service">a Valid authenticated DriveService</param>
/// <param name="_uploadFile">path to the file to upload</param>
/// <param name="_parent">Collection of parent folders which contain this file.
/// Setting this field will put the file in all of the provided folders; defaults to the root folder.</param>
/// <returns>If upload succeeded returns the File resource of the uploaded file
/// If the upload fails returns null</returns>
public static File uploadFile(DriveService _service, string _uploadFile, string _parent) {
if (System.IO.File.Exists(_uploadFile))
{
File body = new File();
body.Title = System.IO.Path.GetFileName(_uploadFile);
body.Description = "File uploaded by Diamto Drive Sample";
body.MimeType = GetMimeType(_uploadFile);
body.Parents = new List<ParentReference>() { new ParentReference() { Id = _parent } };
// File's content.
byte[] byteArray = System.IO.File.ReadAllBytes(_uploadFile);
System.IO.MemoryStream stream = new System.IO.MemoryStream(byteArray);
try
{
FilesResource.InsertMediaUpload request = _service.Files.Insert(body, stream, GetMimeType(_uploadFile));
//request.Convert = true; // uncomment this line if you want files to be converted to Drive format
request.Upload();
return request.ResponseBody;
}
catch (Exception e)
{
Console.WriteLine("An error occurred: " + e.Message);
return null;
}
}
else {
Console.WriteLine("File does not exist: " + _uploadFile);
return null;
}
}
Code ripped from DaimtoGoogleDriveHelper.cs
Tip:
The thing to remember with a service account is that it is not you; it is a dummy user account. A service account has its own Google Drive account, so uploading files to it will upload to its account, not yours. What you can do is take the service account email address and add it as a user on a folder in YOUR personal Google Drive account, giving it write access. This will then allow the service account to upload a file to your personal Google Drive account. Remember to patch the file after upload, granting yourself permissions to it; otherwise the owner of the file will be the service account. There is currently a bug in the Google Drive API: you have to patch the permissions on the file after you upload it, you can't do it at upload time, so it is a two-step process. (Been there, done that.)
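With the v3 client library, the post-upload permission step described in the tip might look roughly like this. GrantWriterAccess is a hypothetical helper name, and the email address and file id are placeholders; it assumes a DriveService already authenticated with the service account credential shown above.

```csharp
// Hypothetical sketch: after the service account uploads a file, grant a
// regular user writer access so the file shows up for them.
using Google.Apis.Drive.v3;
using Google.Apis.Drive.v3.Data;

public static class ServiceAccountSharing
{
    public static void GrantWriterAccess(DriveService service, string fileId, string userEmail)
    {
        var permission = new Permission
        {
            Type = "user",
            Role = "writer",
            EmailAddress = userEmail   // placeholder, e.g. you@example.com
        };
        // Second step of the two-step process described above.
        service.Permissions.Create(permission, fileId).Execute();
    }
}
```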
Google drive sample project can be found on GitHub

OpenStack Rackspace Cloud Files .net SDK

I am trying to save an XML file to a non-CDN container in Sydney:
public void Save(XDocument document)
{
using (MemoryStream ms = new MemoryStream())
{
document.Save(ms);
ms.Position = 0;
RackspaceCloudIdentity identity = new RackspaceCloudIdentity { Username = "username", APIKey = "xxxxxxxxxxx", CloudInstance = CloudInstance.Default };
CloudFilesProvider provider = new CloudFilesProvider(identity);
provider.CreateObject("XMLFiles", ms, "xmlFile1.xml", region: "syd");
}
}
For a 1MB file, it takes about 50 seconds to upload (very long).
And, trying to download the file back, returns an empty result:
public void Read()
{
RackspaceCloudIdentity identity = new RackspaceCloudIdentity { Username = "username", APIKey = "xxxxxxxxxxx", CloudInstance = CloudInstance.Default };
CloudFilesProvider provider = new CloudFilesProvider(identity);
using (MemoryStream ms = new MemoryStream())
{
provider.GetObject("XMLFiles", "xmlFile1.xml", ms, region: "syd");
// ms.Length is 0
}
}
Am I doing something wrong?
Ugh. I introduced this bug in commit 799f37c (first released in v1.1.3.0). I'm looking into the best workaround right now.
Edit: There is no workaround, but I filed issue #116 for the issue, and after the pull request for it is merged, we'll release version 1.1.3.1 of the library to correct the problem.
Are you able to access your control panel at mycloud.rackspace.com?
I used my control panel to upload an XML file, then used your code, above, to download the XML file. It worked fine.
I'm now going to use the upload code you posted.
Just wanted you to know I'm looking into this.
