Upload a large file into a shared folder through the Microsoft Graph API (app-only, ASP.NET): Access Denied

I'm following the example in the aspnet-snippets-sample
and OneDriveTest.cs to upload a large file through the Microsoft Graph API into a shared folder with app-only credentials,
and the following exception is thrown:
"Access denied. You do not have permission to perform this action or access this resource."
To initialize a large file upload session I do the following:
var uploadSession = await _graph
    .Drives[DRIVEID]
    .Items[ITEMID]
    .ItemWithPath("newFile.docx")
    .CreateUploadSession()
    .Request()
    .PostAsync();
Both the driveId and itemId are valid.
I'm creating my GraphClient as follows:
public MicrosoftGraphHelper()
{
    _graph = new GraphServiceClient("https://graph.microsoft.com/beta",
        new DelegateAuthenticationProvider((requestMessage) =>
        {
            requestMessage.Headers.Authorization = new AuthenticationHeaderValue("bearer", AccessToken);
            return Task.FromResult(0);
        }));
}
The access token is acquired through my app credentials and secret. In Azure, my app has read/write permissions for files through Microsoft Graph.
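For context, an app-only token like this is typically acquired with the client-credentials flow. A minimal sketch with MSAL.NET follows; clientId, clientSecret, and tenantId are placeholders, since the original post does not show this step:

// using Microsoft.Identity.Client;
// Hedged sketch: client-credentials (app-only) token for Microsoft Graph.
// clientId, clientSecret, and tenantId are placeholders, not from the question.
var app = ConfidentialClientApplicationBuilder
    .Create(clientId)
    .WithClientSecret(clientSecret)
    .WithTenantId(tenantId)
    .Build();

var result = await app
    .AcquireTokenForClient(new[] { "https://graph.microsoft.com/.default" })
    .ExecuteAsync();

AccessToken = result.AccessToken; // assumed to be the value consumed by the DelegateAuthenticationProvider above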
Also, creating a new folder like
POST https://graph.microsoft.com/beta/drives/DRIVEID/items/ITEMID/children
with JSON
{
  "name": "NewFolderTest",
  "folder": { }
}
works as well.
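For comparison, the same folder creation through the SDK would look roughly like this; a sketch against the same pre-v5 Graph .NET SDK the question uses, not code from the original post:

// Hedged sketch: SDK equivalent of the REST call above (pre-v5 Microsoft Graph .NET SDK).
var newFolder = new DriveItem
{
    Name = "NewFolderTest",
    Folder = new Folder()
};

var created = await _graph
    .Drives[DRIVEID]
    .Items[ITEMID]
    .Children
    .Request()
    .AddAsync(newFolder);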
Here is the upload task:
internal async Task<string> UploadLargeFile(HttpPostedFileBase file)
{
    try
    {
        using (Stream fileStream = file.InputStream)
        {
            // Create the upload session. The access token is no longer required as you have session established for the upload.
            // POST /v1.0/drive/root:/UploadLargeFile.bmp:/microsoft.graph.createUploadSession
            var driveItems = await _graph.Drives[DRIVEID]
                .Items[ITEMID]
                .Children
                .Request()
                .GetAsync();

            var uploadSession = await _graph.Drives[DRIVEID]
                .Items[ITEMID].ItemWithPath("test.docx").CreateUploadSession()
                .Request().PostAsync();

            var maxChunkSize = 320 * 1024; // 320 KB - Change this to your chunk size. 5MB is the default.
            var provider = new ChunkedUploadProvider(uploadSession, _graph, fileStream, maxChunkSize);

            // Setup the chunk request necessities
            var chunkRequests = provider.GetUploadChunkRequests();
            var readBuffer = new byte[maxChunkSize];
            var trackedExceptions = new List<Exception>();
            DriveItem itemResult = null;

            // Upload the chunks
            foreach (var request in chunkRequests)
            {
                // Do your updates here: update progress bar, etc.
                // ...
                // Send chunk request
                var result = await provider.GetChunkRequestResponseAsync(request, readBuffer, trackedExceptions);
                if (result.UploadSucceeded)
                {
                    itemResult = result.ItemResponse;
                }
            }

            // Check that upload succeeded
            if (itemResult == null)
            {
                // Retry the upload
                // ...
            }

            return "Success";
        }
    }
    catch (ServiceException e)
    {
        // Debug.WriteLine("We could not upload the file: " + e.Error.Message);
        return null;
    }
}
The exception is thrown as soon as the code hits
var result = await provider.GetChunkRequestResponseAsync(request, readBuffer, trackedExceptions);
while trying to request
"https://myapp.sharepoint.com:443/sites/dev/_api/v2.0/drives/DRIVEID/items/ITEMID/uploadSession"
What permissions do I have to apply, and where? The documentation refers to Files.ReadWrite. Does SharePoint need additional permissions beyond the Microsoft Graph API?
Within the shared folder I can see a newly created tmp file with 0 bytes.

Related

EPPlus package works fine locally but returns an Internal Server Error when deployed to an Azure server

I have a web API that uploads and reads an Excel file from the client app and then saves the data into the database. The application works perfectly on the local server, but when it is deployed to the Azure server it returns a 500 Internal Server Error. I don't understand why this happens and don't know how I can track down the cause. Below are my code blocks.
My Interface Class
public interface UploadExcelInterface
{
    Task UploadMultipleClients(Client obj);
}
My Service Implementation
public class UploadExcelService : UploadExcelInterface
{
    private readonly DbContext _connect;
    private readonly IHttpContextAccessor httpContextAccessor;

    public UploadExcelService(DbContext _connect, IHttpContextAccessor httpContextAccessor)
    {
        this._connect = _connect;
        this.httpContextAccessor = httpContextAccessor;
    }

    public async Task UploadMultipleClients(Client obj)
    {
        var file = httpContextAccessor.HttpContext.Request.Form.Files[0];
        if (file != null && file.Length > 0)
        {
            var folderName = Path.Combine("Datas", "ClientUPloads");
            var pathToSave = Path.Combine(Directory.GetCurrentDirectory(), folderName);
            var fileName = Guid.NewGuid() + ContentDispositionHeaderValue.Parse(file.ContentDisposition).FileName.Trim('"');
            var fullPath = Path.Combine(pathToSave, fileName);
            var clientsList = new List<Client>();

            using (var fileStream = new FileStream(fullPath, FileMode.Create))
            {
                await file.CopyToAsync(fileStream);
                FileInfo excelFile = new FileInfo(Path.Combine(pathToSave, fileName));
                ExcelPackage.LicenseContext = LicenseContext.NonCommercial;
                using (ExcelPackage package = new ExcelPackage(excelFile))
                {
                    ExcelWorksheet worksheet = package.Workbook.Worksheets[0];
                    var rowcount = worksheet.Dimension.Rows;
                    for (int row = 2; row <= rowcount; row++)
                    {
                        var Names = (worksheet.Cells[row, 2].Value ?? string.Empty).ToString().Trim();
                        var Address = (worksheet.Cells[row, 3].Value ?? string.Empty).ToString().Trim();
                        var Title = (worksheet.Cells[row, 4].Value ?? string.Empty).ToString().Trim();
                        var Product = (worksheet.Cells[row, 5].Value ?? string.Empty).ToString().Trim();
                        var Order = (worksheet.Cells[row, 6].Value ?? string.Empty).ToString().Trim();
                        var Email = (worksheet.Cells[row, 7].Value ?? string.Empty).ToString().Trim();
                        var Price = (worksheet.Cells[row, 8].Value ?? string.Empty).ToString().Trim();

                        clientsList.Add(new Client
                        {
                            Names = Names,
                            Address = Address,
                            Title = Title,
                            Product = Product,
                            Order = Order,
                            Email = Email,
                            Price = Price,
                        });
                    }

                    // Adding clients into the database
                    foreach (Client client in clientsList)
                    {
                        var exist = _connect.client.Any(x => x.Email == client.Email);
                        if (!exist)
                        {
                            await _connect.client.AddAsync(client);
                        }
                    }
                    await _connect.SaveChangesAsync();
                }
            }
        }
    }
}
My Controller Class
[HttpPost]
public async Task UploadMultipleClients([FromForm] Client obj)
{
    await uploadExcelInterface.UploadMultipleClients(obj);
}
Please, any help regarding this error that I am getting from the server would be appreciated. In addition, is it possible to get the data from the Excel file without uploading it to the server, and if yes, how? I tried adding the file to a memory stream and reading it from memory, but it appears not to work. Any suggestions, thanks.
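On the side question of reading the Excel data without saving it to disk: EPPlus can open a workbook directly from a stream, so the file never has to be written to the file system. A minimal sketch, reusing the names from the service code above (not taken from the original post):

// Hedged sketch: read the uploaded Excel file entirely in memory.
var file = httpContextAccessor.HttpContext.Request.Form.Files[0];
using (var ms = new MemoryStream())
{
    await file.CopyToAsync(ms);
    ms.Position = 0; // rewind before handing the stream to EPPlus

    ExcelPackage.LicenseContext = LicenseContext.NonCommercial;
    using (var package = new ExcelPackage(ms))
    {
        ExcelWorksheet worksheet = package.Workbook.Worksheets[0];
        var rowcount = worksheet.Dimension.Rows;
        // ... read the rows exactly as in the service code above
    }
}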
My answer may not solve the problem directly, but it can help locate the error step by step. Once we find the error, we should be able to solve the problem in this thread.
Suggestions
Please make sure you have included the EPPlus library in your deployed content.
Enabling ASP.NET Core stdout log (Windows Server)
Azure App Service - Configure Detailed Error Logging
Why
After testing, I am sure an Azure Web App can support EPPlus. For the 500 error, as we don't have a more specific error message to refer to, we can't quickly locate the problem. Following the suggested methods, you will surely see some useful information.
E.g.:
The class library of EPPlus was not found.
Folders such as Datas are not created (see the sketch after this list).
The database connection string may differ between the test environment and the production environment.
...
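On the missing-folder case in particular: the service opens a FileStream under Datas/ClientUPloads but never creates that directory, which may not exist on a freshly deployed App Service. A hedged sketch of a guard, reusing the variable names from the question:

// Hedged sketch: create the upload folder before opening the FileStream.
// folderName/pathToSave are the same variables as in UploadMultipleClients.
var folderName = Path.Combine("Datas", "ClientUPloads");
var pathToSave = Path.Combine(Directory.GetCurrentDirectory(), folderName);
Directory.CreateDirectory(pathToSave); // no-op if the directory already exists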

Unable to upload file as FormData to ASP.NET from Angular

I am trying to upload a file from my Angular code to an ASP.NET backend.
My Angular code sends the object using FormData:
public uploadFiles(files) {
    console.log(files);
    if (files.length < 1) return;

    const formData = new FormData();
    files.forEach(file => {
        console.log(file);
        formData.append(file.name, file);
    });

    this._http.postFile('/order-processing/import-orders', 'application/x-www-form-urlencoded', formData)
        .pipe(finalize(() => {
            console.log("Finalized");
        }))
        .subscribe((val: any) => {
            console.log('ORDER SUBMITTED', val);
        }, error => {
            console.log(error);
        });
}
With the post file method looking like:
public postFile(path: string, contentType: string, body: FormData): Observable<any> {
    let headers = {
        'Content-Type': contentType,
        'Authorization': this.authToken
    };
    return this._http.post(environment.API_URL + path, body, {
        headers
    });
}
My ASP.NET endpoint looks like:
[HttpPost, Route("hospitality/order-processing/import-orders")]
[RequestSizeLimit(2147483648)]
[DisableRequestSizeLimit]
public IActionResult UploadFile()
{
    try
    {
        //var req = Request.Form.Files;
        var file = Request.Form.Files;
        string folderName = "Uploads";
        string webRootPath = _hostingEnvironment.WebRootPath;
        string newPath = Path.Combine(webRootPath, folderName);
        if (!Directory.Exists(newPath))
        {
            Directory.CreateDirectory(newPath);
        }
        return Json("Upload Successful.");
    }
    catch (Exception e)
    {
        return Json("Failed:" + e);
    }
}
If I check the network tab in my browser when I send the file, it shows that my object is in the call. Great. But for some reason it doesn't get picked up on the backend, and when I step through the code it is not there.
I get different errors when I modify this code slightly. The error for the code in its current state is "Form key or value length limit 2048 exceeded"; however, sometimes I get array-out-of-bounds errors or content boundary limit exceeded errors. It's enough to make you want to slam your face into your keyboard continually.
The whole point of this is to be able to upload an Excel file to ASP.NET code running in an AWS Lambda, which then inserts rows into an RDS database. Am I going about this the right way? Is there a better way to achieve what I am trying to do? If not, then what is wrong with my code that doesn't allow me to upload a file to a Web API?!
Thanks
It seems that you're trying to set the limit of the request body, but the message states that the problem is with the form key or value length.
Try setting RequestFormLimits and check if that helps.
[HttpPost, Route("hospitality/order-processing/import-orders")]
[RequestFormLimits(KeyLengthLimit = 8192, ValueLengthLimit = 8192)]
public IActionResult UploadFile()
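Alternatively, the same limits can be raised globally instead of per action. A minimal sketch for Startup.ConfigureServices (an assumption, not part of the original answer; values mirror the attribute above):

// needs: using Microsoft.AspNetCore.Http.Features;
public void ConfigureServices(IServiceCollection services)
{
    // Hedged sketch: raise the form limits app-wide instead of per action.
    services.Configure<FormOptions>(options =>
    {
        options.KeyLengthLimit = 8192;
        options.ValueLengthLimit = 8192;
        options.MultipartBodyLengthLimit = long.MaxValue; // or a sensible upper bound
    });

    services.AddControllers();
}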

ASP.NET Core RC-1 file upload

I am currently uploading a file via the Kendo file uploader to an API controller using ASP.NET Core RC-1. I am receiving a periodic error of "Object reference not set to an instance of an object" when attempting to read the stream after opening it with IFormFile.OpenReadStream().
My controller is:
[HttpPost]
[Route("api/{domain}/[controller]")]
public async Task<IActionResult> Post([FromRoute]string domain, [FromForm]IFormFile file, [FromForm]WebDocument document)
{
    if (ModelState.IsValid)
    {
        if (file.Length > 0)
        {
            var userName =
                Request.HttpContext.User.Claims
                    .FirstOrDefault(c => c.Type == ClaimTypesEx.FullName)?
                    .Value;
            var uploadedFileName =
                ContentDispositionHeaderValue.Parse(file.ContentDisposition).FileName.Trim('"');

            document.Domain = domain;
            document.MimeType = file.ContentType;
            document.SizeInBytes = file.Length;
            document.ChangedBy = userName;
            document.FileName = (string.IsNullOrEmpty(document.FileName)) ? uploadedFileName : document.FileName;

            try
            {
                document = await CommandStack.For<WebDocument>()
                    .AddOrUpdateAsync(document, file.OpenReadStream()).ConfigureAwait(false);
            }
            catch (Exception e)
            {
                return new HttpStatusCodeResult(500);
            }
            return Ok(document);
        }
    }
    return new BadRequestResult();
}
And the error is thrown when I actually try to read the stream as it goes into blob storage:
public async Task<Uri> CreateOrUpdateBlobAsync(string containerName, string fileName, string mimeType,
    Stream fileStream)
{
    var container = Client.GetContainerReference(containerName);
    var blob = container.GetBlockBlobReference(fileName);

    //Error HERE
    await blob.UploadFromStreamAsync(fileStream);

    blob.Properties.ContentType = mimeType;
    await blob.SetPropertiesAsync();
    return blob.Uri;
}
What I am having trouble with is that this is sporadic, and there seems to be no defined pattern of which files are accepted and which ones generate the error. At first I thought it might be a size issue, but that is not the case, as I have several larger files uploaded successfully and then one small file will throw the error. Images seem to work fine, and it is hit or miss on other file types with no rhyme or reason that I can figure out.

File Upload : ApiController

I have a file being uploaded via an HTTP POST request using multipart/form-data to a class that extends ApiController.
In a dummy project, I am able to use:
HttpPostedFileBase hpf = Request.Files[file] as HttpPostedFileBase
to get the file inside my controller method where my Request is of type System.Web.HttpRequestWrapper.
But inside another production app, where I am constrained from adding any libraries/DLLs, I don't see anything inside System.Web.HttpRequestWrapper.
My simple requirement is to get the posted file and convert it to a byte array so I can store it in a database.
Any thoughts?
This code sample is from an ASP.NET Web API project I did some time ago. It allowed uploading an image file. I removed the parts that were not relevant to your question.
public async Task<HttpResponseMessage> Post()
{
    if (!Request.Content.IsMimeMultipartContent())
        throw new HttpResponseException(HttpStatusCode.UnsupportedMediaType);

    try
    {
        var provider = await Request.Content.ReadAsMultipartAsync(new MultipartMemoryStreamProvider());
        var firstImage = provider.Contents.FirstOrDefault();

        if (firstImage == null || firstImage.Headers.ContentDisposition.FileName == null)
            return Request.CreateResponse(HttpStatusCode.BadRequest);

        using (var ms = new MemoryStream())
        {
            await firstImage.CopyToAsync(ms);
            var byteArray = ms.ToArray();
        }

        return Request.CreateResponse(HttpStatusCode.Created);
    }
    catch (Exception ex)
    {
        return Request.CreateErrorResponse(HttpStatusCode.InternalServerError, ex);
    }
}
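Since the stated requirement was to store the bytes in a database, here is a minimal sketch of persisting byteArray with plain ADO.NET (it would run inside the using block above, or with byteArray moved out of it); the connection string, table, and column names are hypothetical and not part of the original answer:

// requires: using System.Data; using System.Data.SqlClient;
// Hedged sketch: persist the uploaded bytes (hypothetical Uploads table).
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    "INSERT INTO Uploads (FileName, Content) VALUES (@name, @content)", conn))
{
    cmd.Parameters.AddWithValue("@name",
        firstImage.Headers.ContentDisposition.FileName.Trim('"'));
    cmd.Parameters.Add("@content", SqlDbType.VarBinary, -1).Value = byteArray;

    await conn.OpenAsync();
    await cmd.ExecuteNonQueryAsync();
}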

Windows Phone 8 Hanging on GetFolderAsync and OpenStreamForReadAsync

I am making a Windows Phone 8 application. Part of this application requires state to be saved. I am saving it as a JSON string. If I open the application, save some data, exit the application and then load it again, it hangs on either GetFolderAsync or OpenStreamForReadAsync. It does not happen every time, but once it starts hanging, I have to kill the whole emulator and create a new one to start the application again.
I have even tried just making an empty file with no data in it, and the problem still persists.
Below is the code I am using to save and load the data. It does not matter where I call the data load, whether on application start or on form load; it still breaks.
private async Task SaveLists()
{
    //XmlSerializer serializer = new XmlSerializer(typeof(ListHolder));

    // Get the local folder.
    StorageFolder local = Windows.Storage.ApplicationData.Current.LocalFolder;

    // Create a new folder named DataFolder.
    var dataFolder = await local.CreateFolderAsync("DataFolder",
        CreationCollisionOption.OpenIfExists);

    // Create a new file named Lists.json.
    var file = await dataFolder.CreateFileAsync("Lists.json",
        CreationCollisionOption.ReplaceExisting);

    string json = JsonConvert.SerializeObject(Lists, Formatting.Indented);
    byte[] fileBytes = System.Text.Encoding.UTF8.GetBytes(json.ToCharArray());

    using (var s = await file.OpenStreamForWriteAsync())
    {
        s.Write(fileBytes, 0, fileBytes.Length);
    }
}

private async Task LoadLists()
{
    // Get the local folder.
    StorageFolder local = Windows.Storage.ApplicationData.Current.LocalFolder;
    if (local != null)
    {
        try
        {
            // Get the DataFolder folder.
            var dataFolder = await local.GetFolderAsync("DataFolder");

            // Get the file.
            var files = dataFolder.GetFilesAsync();
            var file = await dataFolder.OpenStreamForReadAsync("Lists.json");

            string jsonString = "";

            // Read the data.
            using (StreamReader streamReader = new StreamReader(file))
            {
                jsonString = streamReader.ReadToEnd();
            }

            if (jsonString.Length > 0)
            {
                Lists = JsonConvert.DeserializeObject<List<ItemList>>(jsonString);
            }
            else
            {
                Lists = new List<ItemList>();
            }
        }
        catch (Exception ex)
        {
            Lists = new List<ItemList>();
        }
    }
}
You are causing a deadlock by calling Result. I explain this deadlock on my blog and in a recent MSDN article. In summary, await will (by default) attempt to resume execution within a context (the current SynchronizationContext unless it is null, in which case it uses the current TaskScheduler).
In your case, the current SynchronizationContext is the UI context, which is only used by the UI thread. So when you block the UI thread by calling Result, the async method cannot schedule back to the UI thread to complete.
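A minimal sketch of the fix this answer implies: await the load instead of blocking on it. The exact call site is not shown in the question, so the OnNavigatedTo placement here is an assumption:

// Hedged sketch: let the async call flow all the way up instead of blocking
// the UI thread with .Result or .Wait().
protected override async void OnNavigatedTo(System.Windows.Navigation.NavigationEventArgs e)
{
    base.OnNavigatedTo(e);
    await LoadLists(); // instead of LoadLists().Wait() / .Result
}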
