Download an externalReference from PlannerTask using the Graph API - .net-core

I'd like to download a file attached to a PlannerTask. I already have the external references but I can't figure out how to access the file.
An external reference is a JSON object like this:
{
    "https%3A//contoso%2Esharepoint%2Ecom/sites/GroupName/Shared%20Documents/AnnualReport%2Epptx":
    {
        // ... snip ...
    }
}
I've tried to use the following endpoint
GET /groups/{group-id}/drive/root:/sites/GroupName/Shared%20Documents/AnnualReport%2Epptx
but I get a 404 response. Indeed, when I use the query in Graph Explorer it gives me a warning about "Invalid whitespace in URL" (?).
A workaround that I've found is to use the search endpoint to look for files like this:
GET /groups/{group-id}/drive/root/search(q='AnnualReport.pptx')
and hope the file name is unique.
Anyway, with both methods I need extra information (i.e. the group-id) that may not be readily available by the time I have the external reference object.
What is the proper way to get a download url for a driveItem that is referenced by an external reference object in a PlannerTask?
Do I really need the group-id to access such file?

The keys in external references are webUrl instances, so they can be used with the /shares/ endpoint. See this answer for details on how to do it.
When you get a driveItem object, the download URL is available from AdditionalData: driveItem.AdditionalData["#microsoft.graph.downloadUrl"]. You can use WebClient.DownloadFile to save the file to the local machine.
Here is the final code:
var remoteUri = "https%3A//contoso%2Esharepoint%2Ecom/sites/GroupName/Shared%20Documents/AnnualReport%2Epptx";
var localFile = "/tmp/foo.pptx";

// Encode the URL into the "u!" share token format expected by the /shares endpoint:
// base64-encode, strip padding, and make it URL-safe.
string sharingUrl = remoteUri;
string base64Value = System.Convert.ToBase64String(System.Text.Encoding.UTF8.GetBytes(sharingUrl));
string encodedUrl = "u!" + base64Value.TrimEnd('=').Replace('/', '_').Replace('+', '-');

// Resolve the shared item to a driveItem
DriveItem driveItem = m_graphClient
    .Shares[encodedUrl]
    .DriveItem
    .Request()
    .GetAsync().Result;

// Download the file to the local machine using its pre-authenticated download URL
using (WebClient client = new WebClient())
{
    client.DownloadFile(driveItem.AdditionalData["#microsoft.graph.downloadUrl"].ToString(),
        localFile);
}
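Side note: on newer .NET versions, where WebClient is marked obsolete, the same download can be done with HttpClient. A minimal sketch, assuming an async context and reusing driveItem and localFile from the code above:

// Minimal HttpClient sketch; driveItem and localFile come from the code above,
// and this assumes it runs inside an async method.
var downloadUrl = driveItem.AdditionalData["#microsoft.graph.downloadUrl"].ToString();
using (var http = new HttpClient())
using (var source = await http.GetStreamAsync(downloadUrl))
using (var target = System.IO.File.Create(localFile))
{
    await source.CopyToAsync(target);
}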

Related

POST method to upload file to Azure storage - what to return

I am creating an app where the user can upload a text file; the app then finds the most used word, changes that word in the text, and shows the changed text to the user.
If possible, I would like to get the file's text content before uploading, when the POST method is called, and save that content, so I added the "DownloadTextAsync()" method inside the POST method, but it seems like I am calling this method on the wrong object?
[HttpPost("UploadText")]
public async Task<IActionResult> Post(List<IFormFile> files)
{
    string connectionString = Environment.GetEnvironmentVariable("mykeystringhere");

    // Create a BlobServiceClient object which will be used to create a container client
    BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);

    // Create a unique name for the container
    string containerName = "textdata" + Guid.NewGuid().ToString();

    // Create the container and return a container client object
    BlobContainerClient containerClient = await blobServiceClient.CreateBlobContainerAsync(containerName);

    // Create a local file in the ./data/ directory for uploading and downloading
    string localPath = "./data/";
    string fileName = "textfiledata" + Guid.NewGuid().ToString() + ".txt";
    string localFilePath = Path.Combine(localPath, fileName);

    // Get a reference to a blob
    BlobClient blobClient = containerClient.GetBlobClient(fileName);

    // Open the file and upload its data
    using FileStream uploadFileStream = System.IO.File.OpenRead(localFilePath);
    await blobClient.UploadAsync(uploadFileStream, true);
    uploadFileStream.Close();

    string downloadFilePath = localFilePath.Replace(".txt", "DOWNLOAD.txt");

    // Get the blob file as text
    string contents = blobClient.DownloadTextAsync().Result;

    // return the string
    return contents;

    //if (uploadSuccess)
    //    return View("UploadSuccess");
    //else
    //    return View("UploadError");
}
The issues I am having are:
I understood that 'blobClient' is the reference to the blob, where I can get the file's data, but this must be wrong?
Also, it seems like I cannot use "CloudBlobContainer" or "CloudBlockBlob blob". Is that because, inside the POST method, the blob has only just been initialized and does not exist when these two are executed?
Also, when I test the POST method, the console throws "Refused to load the font '' because it violates the following Content Security Policy directive: "default-src 'none'". Note that 'font-src' was not explicitly set, so 'default-src' is used as a fallback.", which I googled but have no idea what it means?
I have tried different ways but keep getting "Cannot POST /", but could not really find solid answers. Could this be related to my POST method?
I understood that 'blobClient' is the reference to the blob, where I can get the file's data, but this must be wrong?
That's correct, in the sense that you can use blobClient to perform operations on the blob, like upload/download etc. I am not sure why you say this must be wrong.
Also, it seems like I cannot use "CloudBlobContainer" or "CloudBlockBlob blob". Is that because, inside the POST method, the blob has only just been initialized and does not exist when these two are executed?
No, this is happening because you're using a newer version of the SDK (12.x.x), while CloudBlobContainer and CloudBlockBlob are available only in the older version of the SDK.
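For illustration only, here is a rough sketch (not your exact code) of how the upload-then-read-text flow could look with the 12.x client types; containerClient, fileName, and files are assumed to come from your POST method, and the text-read part is the v12 counterpart of the old DownloadTextAsync:

// Rough sketch using Azure.Storage.Blobs 12.x; containerClient, fileName and files
// are assumed to come from the POST method in the question.
BlobClient blobClient = containerClient.GetBlobClient(fileName);

// Upload the posted file's content directly instead of reading a local file
using (var stream = files[0].OpenReadStream())
{
    await blobClient.UploadAsync(stream, overwrite: true);
}

// Read the blob back as text (v12 has no DownloadTextAsync)
BlobDownloadInfo download = await blobClient.DownloadAsync();
using (var reader = new StreamReader(download.Content))
{
    string contents = await reader.ReadToEndAsync();
    return Ok(contents);
}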
Also, when I test the POST method, the console throws "Refused to load the font '' because it violates the following Content Security Policy directive: "default-src 'none'". Note that 'font-src' was not explicitly set, so 'default-src' is used as a fallback.", which I googled but have no idea what it means?
I have tried different ways but keep getting "Cannot POST /", but could not really find solid answers. Could this be related to my POST method?
Not sure why this is happening. You may want to ask a separate question for this and when you do, please include the HTML portion of your code as well.

Read (audio) file from subfolder in AndroidAssets in Xamarin.Forms

I am trying to enable audio in my Xamarin.Forms application. I want to have the audio files in a subfolder of the Assets folder, like this: Assets/Subfolder/Audio.mp3
I have found the plugin SimpleAudioPlayer, which provides an example.
The following code works as provided.
var player = Plugin.SimpleAudioPlayer.CrossSimpleAudioPlayer.Current;
player.Load("Audio.mp3");
Now I want to place the audio file in a subfolder and I tried to call
player.Load("Subfolder/Audio.mp3");
But I get a Java.IO.FileNotFoundException.
I then looked into the implementation of the Load function and found the following code:
public bool Load(string fileName)
{
    player.Reset();
    AssetFileDescriptor afd = Android.App.Application.Context.Assets.OpenFd(fileName);
    player?.SetDataSource(afd.FileDescriptor, afd.StartOffset, afd.Length);
    return PreparePlayer();
}
where the filename is passed to the Assets.OpenFd() function, which returns an AssetFileDescriptor.
The documentation, both from Microsoft and from the Android site, does not really provide any information.
My questions are:
How can I retrieve the file from the subfolder in the Android Assets?
What can I pass into the Assets.OpenFd() function (subfolders etc.)?
I would appreciate any advice, since after a long time trying to resolve this I still don't have an idea.
According to the docs of the package, you won't be able to do just that, because you have to use player.Load(GetStreamFromFile("mysound.wav")), where GetStreamFromFile is basically:
Stream GetStreamFromFile(string filename)
{
    var assembly = typeof(App).GetTypeInfo().Assembly;
    var stream = assembly.GetManifestResourceStream("App." + filename);
    // Obviously App in the line above should be replaced with your app
    return stream;
}
And secondly, you have to call player.Load(GetStreamFromFile("Subfolder.mysound.wav")), where Subfolder is the name of your subfolder.
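Put differently, a hypothetical usage could look like this, assuming Audio.mp3 lives under Subfolder in the shared project and its Build Action is set to EmbeddedResource rather than AndroidAsset:

// Hypothetical usage sketch; assumes Subfolder/Audio.mp3 is an EmbeddedResource
// in the shared project (embedded resource names use '.' as the path separator).
var player = Plugin.SimpleAudioPlayer.CrossSimpleAudioPlayer.Current;
player.Load(GetStreamFromFile("Subfolder.Audio.mp3"));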

How to get site details in the controller of a dashlet in Alfresco

How do I directly get SITE details (like ID and name) in main() of the controller JS of a dashlet in Alfresco?
I can use "Alfresco.constants.SITE" in the FTL file to read the site ID, but I need to know whether there is any key to read that data in the controller.
janaka
There isn't a service on the Share side which provides that information, because the information you want is only held on the repository. As such, you'll need to call one of the REST APIs on the Repo to get the information you need
Your code would want to look something like:
// Call the repository for the site profile
var json = remote.call("/api/sites/" + page.url.templateArgs.site);
if (json.status == 200)
{
    // Create javascript objects from the repo response
    var obj = eval('(' + json + ')');
    if (obj)
    {
        var siteTitle = obj.title;
        var siteShortName = obj.shortName;
    }
}
You can see a fuller example of this in various Alfresco dashlets, e.g. the Dynamic Welcome dashlet.

Alfresco Object is not available in extension module in alfresco share

I am trying to override the JavaScript controller node-header.js of components\node-details with an extension module of Alfresco Share.
This is my node-header.get.js
<import resource="classpath:/alfresco/templates/org/alfresco/import/alfresco-util.js">
for (var i = 0; i < model.widgets.length; i++)
{
    if (model.widgets[i].id == "NodeHeader")
    {
        if (model.widgets[i].options.nodeRef != null)
        {
            var jsNode = new Alfresco.util.Node(model.widgets[i].options.nodeRef);
            if (jsNode.hasAspect("custom:intranetFile"))
            {
                model.widgets[i].options.showFavourite = false;
                model.widgets[i].options.showLikes = false;
            }
        }
    }
}
I am getting this error
Error Message: 05270002 Failed to execute script
'classpath*:webscripts/custom/nodeheader/hidelikesync/node-header.get.js':
05270001 ReferenceError: "Alfresco" is not defined.
(jar:file:/C:/Alfresco/Alfresco42/tomcat/webapps/share/WEB-INF/lib/customshare.jar!/webscripts/custom/nodeheader/hidelikesync/node-header.get.js#1555)
The error lies in this line:
var jsNode = new Alfresco.util.Node(model.widgets[i].options.nodeRef);
As the Alfresco object is not available, how can I get it?
Based on my answer yesterday on the share-extras-devel list:
Your issue is that you are mixing up your web script JS with client-side JavaScript. Alfresco.util.Node is a client-side helper class and is therefore available to client-side JS running in the web browser, but not to your web script code which runs on the server.
If you look at the source of alfresco-util.js, which you are including, you will see that there is a helper class there but it is called AlfrescoUtil.
To get some information on this given node I would suggest that you want to use the static method AlfrescoUtil.getNodeDetails() from that class, e.g.
var jsNode = AlfrescoUtil.getNodeDetails(model.widgets[i].options.nodeRef);
The structure of the jsNode object will be as per the JSON returned by the doclist-v2 webscripts, so you should be able to check for the presence of your custom aspect in the aspects array property.
If you check the source of alfresco-util.js you will see that additional parameters are also supported by getNodeDetails(). It seems to me you can also pass in an optional site name, plus some options if you wish.

Upload files directly to Amazon S3 from ASP.NET application

My ASP.NET MVC application will take a lot of bandwidth and storage space. How can I setup an ASP.NET upload page so the file the user uploaded will go straight to Amazon S3 without using my web server's storage and bandwidth?
Update Feb 2016:
The AWS SDK can handle a lot more of this now. Check out how to build the form, and how to build the signature. That should prevent you from needing the bandwidth on your end, assuming you need to do no processing of the content yourself before sending it to S3.
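As a rough illustration of the idea (not the exact form-and-signature flow those links describe), one server-side sketch with the AWS SDK for .NET is to hand the browser a short-lived pre-signed PUT URL and let it upload straight to S3; the bucket name and key below are placeholders:

// Minimal pre-signed URL sketch; "my-bucket" and the key are placeholders,
// and credentials come from the usual SDK credential chain.
using System;
using Amazon;
using Amazon.S3;
using Amazon.S3.Model;

var s3 = new AmazonS3Client(RegionEndpoint.USEast1);
var presignRequest = new GetPreSignedUrlRequest
{
    BucketName = "my-bucket",
    Key = "uploads/" + Guid.NewGuid() + ".bin",
    Verb = HttpVerb.PUT,
    Expires = DateTime.UtcNow.AddMinutes(15)
};
string uploadUrl = s3.GetPreSignedURL(presignRequest);
// Return uploadUrl to the browser and have it PUT the file bytes to it,
// so the content never passes through your web server.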
If you need to upload large files and display a progress bar, you should consider the flajaxian component.
It uses Flash to upload files directly to Amazon S3, saving your bandwidth.
This is the best and easiest way to upload files to Amazon S3 via ASP.NET. Have a look at the following blog post by me; I think it will help. There I have explained everything from adding an S3 bucket and creating the API key to installing the Amazon SDK and writing the code to upload files. The following is sample code for uploading files to Amazon S3 with ASP.NET C#.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using Amazon;
using Amazon.S3;
using Amazon.S3.Transfer;

/// <summary>
/// Summary description for AmazonUploader
/// </summary>
public class AmazonUploader
{
    public bool sendMyFileToS3(System.IO.Stream localFilePath, string bucketName, string subDirectoryInBucket, string fileNameInS3)
    {
        // Input explained:
        // localFilePath = we will use a file stream, instead of a path
        // bucketName: the name of the bucket in S3; the bucket should already be created
        // subDirectoryInBucket: if this string is not empty the file will be uploaded to
        //     a subdirectory with this name
        // fileNameInS3 = the file name in S3

        // Create an instance of the IAmazonS3 class; in my case I chose RegionEndpoint.EUWest1.
        // You can change that to APNortheast1, APSoutheast1, APSoutheast2, CNNorth1,
        // SAEast1, USEast1, USGovCloudWest1, USWest1, USWest2. This choice will not
        // store your file in a different cloud storage, but (I think) it differs in performance
        // depending on your location.
        IAmazonS3 client = new AmazonS3Client("Your Access Key", "Your Secret Key", Amazon.RegionEndpoint.USWest2);

        // Create a TransferUtility instance, passing it the IAmazonS3 created in the first step
        TransferUtility utility = new TransferUtility(client);

        // Make a TransferUtilityUploadRequest instance
        TransferUtilityUploadRequest request = new TransferUtilityUploadRequest();

        if (string.IsNullOrEmpty(subDirectoryInBucket))
        {
            request.BucketName = bucketName; // no subdirectory, just the bucket name
        }
        else
        {
            // subdirectory and bucket name
            request.BucketName = bucketName + @"/" + subDirectoryInBucket;
        }
        request.Key = fileNameInS3;         // file name up in S3
        //request.FilePath = localFilePath; // local file name
        request.InputStream = localFilePath;
        request.CannedACL = S3CannedACL.PublicReadWrite;

        utility.Upload(request); // commencing the transfer

        return true; // indicate that the file was sent
    }
}
Here you can use the function sendMyFileToS3 to upload a file stream to Amazon S3.
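For instance, a hypothetical call site might look like this (the local path, bucket, and key names are placeholders):

// Hypothetical usage of the AmazonUploader class above; the bucket must already exist.
var uploader = new AmazonUploader();
using (var stream = System.IO.File.OpenRead(@"C:\temp\AnnualReport.pdf"))
{
    uploader.sendMyFileToS3(stream, "my-bucket", "reports", "AnnualReport.pdf");
}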
For more details check my blog at the following link:
Upload File to Amazon S3 via asp.net
I hope the above-mentioned link will help.
Look for a JavaScript library to handle the client-side upload of these files. I stumbled upon a JavaScript and PHP example; Dojo also seems to offer a client-side S3 file upload.
ThreeSharp is a library to facilitate interactions with Amazon S3 in a .NET environment.
You'll still need to host the logic to upload and send files to S3 in your MVC app, but you won't need to persist them on your server.
Save and GET data in an AWS S3 bucket in ASP.NET MVC:
To save plain text data to an Amazon S3 bucket:
1. First you need a bucket created on AWS.
2. Then you need your AWS credentials: a) AWS key, b) AWS secret key, c) region.
The code to save data to AWS follows. Note: you may get an access-denied error; to fix it, check your AWS account and grant read and write rights.
The namespaces below need to be added from the NuGet package:
using Amazon;
using Amazon.S3;
using Amazon.S3.Model;
var credentials = new Amazon.Runtime.BasicAWSCredentials(awsKey, awsSecretKey);
try
{
    AmazonS3Client client = new AmazonS3Client(credentials, RegionEndpoint.APSouth1);
    // simple object put
    PutObjectRequest request = new PutObjectRequest()
    {
        ContentBody = "put your plain text here",
        ContentType = "text/plain",
        BucketName = "put your bucket name here",
        // put a unique key to uniquely identify your data;
        // you can pass any data here with a unique id, like a primary key in a db
        Key = "1"
    };
    PutObjectResponse response = client.PutObject(request);
}
catch (Exception ex)
{
    //
}
Now go to your AWS account and check the bucket; you will find the data stored under the key "1" in the AWS S3 bucket.
Note: if you get any other issue, please ask me a question here and I will try to resolve it.
To get data from the AWS S3 bucket:
try
{
    var credentials = new Amazon.Runtime.BasicAWSCredentials(awsKey, awsSecretKey);
    AmazonS3Client client = new AmazonS3Client(credentials, RegionEndpoint.APSouth1);
    GetObjectRequest request = new GetObjectRequest()
    {
        BucketName = bucketName,
        // "1" because we passed 1 as the unique key while saving data to the S3 bucket
        Key = "1"
    };
    using (GetObjectResponse response = client.GetObject(request))
    {
        StreamReader reader = new StreamReader(response.ResponseStream);
        string vccEncryptedData = reader.ReadToEnd();
    }
}
catch (AmazonS3Exception)
{
    throw;
}
