How to see Log Outputs of Azure Data Factory Custom Activity - .net-core

I have a custom activity implemented in .NET in an Azure Data Factory pipeline and I want to see its console output. How can I read this output to debug my pipeline? Some predefined activities offer the option to export their logs to a storage account, but this checkbox does not exist for custom activities.
Below is my activity's code using Console.WriteLine, which I would like to check in a log.
private static void DeleteBlobFiles(string blobConnectionString, string adlsConnectionString, string blobContainerName, string blobFolderPath, string adlsContainerName, string adlsFolderPath, string adfCustName, string adfSourceSystem, string adfFeedName, string environment)
{
    try
    {
        Console.WriteLine("DeleteBlobFiles method called.");
        string sourceToDestination = "B2R";
        string transactionStatusSuccess = "S";
        string transactionStatusFailure = "F";
        string transactionDescriptionSuccess = "File has been copied to Raw Data Layer of ADLS";
        string transactionDescriptionFailure = "File has not been copied to Raw Data Layer of ADLS";
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(blobConnectionString);
        CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
        Console.WriteLine("blobContainerName = " + blobContainerName);
        Console.WriteLine("blobFolderPath = " + blobFolderPath);
        // Ensure the container exists.
        if (!String.IsNullOrEmpty(blobContainerName))
        {
            var blobContainer = cloudBlobClient.GetContainerReference(blobContainerName);
            blobContainer.CreateIfNotExists();
            CloudBlobDirectory cbd = blobContainer.GetDirectoryReference(blobFolderPath);
        }
    }
    ......
}
This is the logging option I would like to add to my custom activity, or something doing essentially the same thing.

I found the logs. The MS docs say:
"The stdout and stderr of your custom application are saved to the adfjobs container in the Azure Storage Linked Service you defined when creating Azure Batch Linked Service with a GUID of the task. You can get the detailed path from Activity Run output as shown in the following snippet."
So I checked which storage account was associated with the Batch account my custom activity was using, and found the corresponding stdout and stderr files in the adfjobs container.
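For quick inspection you can also download those files with the same storage SDK the activity already uses. A minimal sketch follows; the adfjobs/<taskGuid>/output/stdout.txt blob layout and the taskGuid value are assumptions based on the docs quoted above, so verify the exact path in your Activity Run output.

// Minimal sketch using WindowsAzure.Storage, the same package as the activity code above.
CloudStorageAccount account = CloudStorageAccount.Parse(blobConnectionString);
CloudBlobClient client = account.CreateCloudBlobClient();
CloudBlobContainer adfJobs = client.GetContainerReference("adfjobs");
// taskGuid comes from the Activity Run output in the ADF monitoring view (assumed layout).
CloudBlockBlob stdoutBlob = adfJobs.GetBlockBlobReference($"{taskGuid}/output/stdout.txt");
Console.WriteLine(stdoutBlob.DownloadText());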

Related

AWS AmazonSimpleSystemsManagementClient cannot read credentials in .NET Framework application

I have a .NET Framework application where I try to read data from the AWS Parameter Store using AmazonSimpleSystemsManagementClient in my local environment. I have credentials generated by the AWS CLI located in the
Users/MyUser/.aws
folder. When I try to connect to the Parameter Store from CMD using those creds, it works fine. However, when the AmazonSimpleSystemsManagementClient in the application is created with the default constructor, it throws the exception "Unable to get IAM security credentials from EC2 Instance Metadata Service." When I tried to pass BasicAWSParameters to the client with hardcoded working keys, I got another exception: "The security token included in the request is invalid".
I also tried installing EC2Config and initializing the AWS SDK Store from the Visual Studio AWS Toolkit, but that didn't change anything.
I want to avoid using environment variables or hardcoding the keys, since the keys are generated and valid for only one hour; I then have to regenerate them, so copying them somewhere every time is not convenient.
Please advise how to resolve the issue.
Some code:
_client = new AmazonSimpleSystemsManagementClient();

public string GetValue(string key)
{
    if (_client == null)
        return null;

    var request = new GetParameterRequest
    {
        Name = $"{_baseParameterPath}/{key}",
        WithDecryption = true,
    };
    try
    {
        var response = _client.GetParameterAsync(request).Result;
        return response.Parameter.Value;
    }
    catch (Exception exc)
    {
        return null;
    }
}
The credentials file looks as follows (I removed the key values so as not to expose them):
[default]
aws_access_key_id= KEY VALUE
aws_secret_access_key= KEY VALUE
aws_session_token= KEY VALUE
[MyProfile]
aws_access_key_id= KEY VALUE
aws_secret_access_key= KEY VALUE
aws_session_token= KEY VALUE
As long as you have your creds in .aws/credentials, you can create the Service client and the creds will be located and used. No need to create a BasicAWSParameters object.
Creds in a file named credentials:
[default]
aws_access_key_id=Axxxxxxxxxxxxxxxxxxxxxxxxxxx
aws_secret_access_key=/zxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
This .NET code works.
using System;
using System.Threading.Tasks;
using Amazon.SimpleSystemsManagement;
using Amazon.SimpleSystemsManagement.Model;

namespace ConsoleApp1
{
    class Program
    {
        static async Task Main(string[] args)
        {
            var client = new AmazonSimpleSystemsManagementClient();
            var request = new GetParameterRequest()
            {
                Name = "RDSConnection"
            };
            // Await directly instead of blocking with GetAwaiter().GetResult().
            var response = await client.GetParameterAsync(request);
            Console.WriteLine("Parameter value is " + response.Parameter.Value);
        }
    }
}
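Note that the default constructor only picks up the [default] profile. Since your one-hour keys also land in [MyProfile], one option is to load that profile explicitly. A minimal sketch using the SDK's CredentialProfileStoreChain (the profile name is the one from your credentials file):

using Amazon.Runtime.CredentialManagement;

// Load a named profile from ~/.aws/credentials instead of [default].
var chain = new CredentialProfileStoreChain();
if (chain.TryGetAWSCredentials("MyProfile", out var awsCredentials))
{
    var client = new AmazonSimpleSystemsManagementClient(awsCredentials);
    // ... use the client as above
}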

How to use "Azure storage blobs" for POST method in controller

I am creating an app where users can upload a text file and find out its most used word.
I have tried to follow this doc to get used to the idea of using Azure Storage Blobs - https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet
But I am a super newbie and having a hard time figuring out how to adapt those blob methods for my POST method.
This is my pseudocode - what I think I need in my controller and what needs to happen when the POST method is triggered.
a. No need for DELETE or PUT; I am not replacing or deleting data in this app.
b. Maybe a GET method is needed, but as soon as the POST method is triggered, it should pass the text content to the FE component.
POST method:
1. Connect with the Azure storage account.
2. If it is the first POST, create a container to store the text file.
   a. How can I connect to the existing container if the container has already been made? I found this, but it is for the old CloudBlobContainer, not the new SDK 12 version:
   .GetContainerReference($"{containerName}");
3. Upload the text file to the container.
4. Get the chosen file's text content and return it.
And here is my controller.
public class HomeController : Controller
{
    private IConfiguration _configuration;

    public HomeController(IConfiguration Configuration)
    {
        _configuration = Configuration;
    }

    public IActionResult Index()
    {
        return View();
    }

    [HttpPost("UploadText")]
    public async Task<IActionResult> Post(List<IFormFile> files)
    {
        if (files != null)
        {
            try
            {
                string connectionString = Environment.GetEnvironmentVariable("AZURE_STORAGE_CONNECTION_STRING");
                BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
                string containerName = "textdata" + Guid.NewGuid().ToString();
                BlobContainerClient containerClient = await blobServiceClient.CreateBlobContainerAsync(containerName);
                //Q. How to write an if condition here so that if the POST method has already been triggered and the container already created, it just uploads the data and does not create a new container?
                string fileName = //Q. how to get the chosen file name and replace it with a newly assigned name?
                string localFilePath = //Q. how to get the local file path so I can pass it on to the FileStream?
                BlobClient blobClient = containerClient.GetBlobClient(fileName);
                using FileStream uploadFileStream = System.IO.File.OpenRead(localFilePath);
                await blobClient.UploadAsync(uploadFileStream, true);
                uploadFileStream.Close();
                string data = System.IO.File.ReadAllText(localFilePath, Encoding.UTF8);
                //Q. If I use fetch('Home').then... from the FE component, will it receive this data? In which form will it receive it? JSON?
                return Content(data);
            }
            catch
            {
                //Q. how to use StorageException for the error messages
            }
            finally
            {
                //Q. what is suitable to execute in finally? return the Content(data) here?
                if (files != null)
                {
                    //files.Close();
                }
            }
        }
        //Q. what to pass inside of the Ok() in this scenario?
        return Ok();
    }
}
Q1. How can I check whether the POST method has already been triggered and created the container? If so, how can I get the container name and connect to it?
Q2. Should I assign a new name to the chosen file? How can I do so?
Q3. How can I get the chosen file's name so I can pass it on in order to process Q2?
Q4. How do I get the local file path so I can pass it on to the FileStream?
Q5. How do I return the Content data and pass it to the FE? By using fetch('Home').then... like this?
Q6. How can I use StorageException for the error messages?
Q7. What is suitable to execute in finally? Return the Content(data) there?
Q8. What do I pass inside of the Ok() in this scenario?
Any help is welcome! I know I asked a lot of Qs here. Thanks a lot!
Update: here is sample code; you can modify it as per your needs.
[HttpPost]
public async Task<IActionResult> SaveFile(List<IFormFile> files)
{
    if (files == null || files.Count == 0) return Content("file not selected");

    string connectionString = "xxxxxxxx";
    BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
    string containerName = "textdata" + Guid.NewGuid().ToString();
    BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName);
    containerClient.CreateIfNotExists();

    foreach (var file in files)
    {
        //use this line of code to get the file name
        string fileName = Path.GetFileName(file.FileName);
        BlobClient blobClient = containerClient.GetBlobClient(fileName);
        //directly read the file content
        using (var stream = file.OpenReadStream())
        {
            await blobClient.UploadAsync(stream);
        }
    }
    //other code
    return View();
}
Original answer:
When using List<IFormFile>, you should use a foreach block to iterate over each file in the list.
Q2. Should I assign a new name to the chosen file? How can I do so?
If you want to keep the file's original name, use Path.GetFileName in the foreach statement like below:
foreach (var file in myfiles)
{
    string fileName = Path.GetFileName(file.FileName);
    //other code
}
And if you want to assign a new file name when uploading to blob storage, you should define the new name in this line of code: BlobClient blobClient = containerClient.GetBlobClient("the new file name"), as sketched below.
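For example, a minimal sketch that assigns a fresh unique name while keeping the uploaded file's extension (the naming scheme itself is just an illustration; file and containerClient are from the sample above):

// Hypothetical naming scheme: a new GUID plus the original file extension.
BlobClient blobClient = containerClient.GetBlobClient($"{Guid.NewGuid()}{Path.GetExtension(file.FileName)}");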
Q3. How can I get the chosen file's name so I can pass it on in order to process Q2?
Refer to Q2.
Q4. How do I get the local file path so I can pass it on to the FileStream?
You can use code like this: string localFilePath = file.FileName; to get the path and then combine it with the file name. But there is a better way: you can directly use this line of code: Stream uploadFileStream = file.OpenReadStream(), as sketched below.
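A minimal sketch of that stream approach, which avoids needing any local file path at all:

// Upload straight from the request stream; no temp file on disk is needed.
using (Stream uploadFileStream = file.OpenReadStream())
{
    await blobClient.UploadAsync(uploadFileStream, overwrite: true);
}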
Q5. How do I return the Content data and pass it to the FE? By using fetch('Home').then... like this?
It's not clear what this means. Can you provide more details?
Q6. How can I use StorageException for the error messages?
StorageException does not exist in the latest version; you would have to install the older package.
You can refer to this link for more details.
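In the v12 SDK the equivalent is Azure.RequestFailedException, so a minimal sketch for surfacing storage errors (the exact handling here is an assumption, not from the linked doc):

try
{
    await blobClient.UploadAsync(stream);
}
catch (Azure.RequestFailedException ex)
{
    // ErrorCode carries the storage error string, e.g. "ContainerAlreadyExists".
    Console.WriteLine($"{ex.Status} {ex.ErrorCode}: {ex.Message}");
}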
@Ivan's answer is what the documentation seems to recommend; however, I was having a strange issue where my stream was always prematurely closed before the upload had time to complete. For anyone else who runs into this problem, going the BinaryData route helped me. Here's what that looks like:
await using var ms = new MemoryStream();
await file.CopyToAsync(ms);              // buffer the whole upload in memory
var data = new BinaryData(ms.ToArray());
await blobClient.UploadAsync(data);      // upload from BinaryData instead of a live stream

Google Drive Access: At least one client secrets (Installed or Web) should be set

We have tested the Google Drive demo code (for a console application) and that worked correctly. After that, we tried to implement the same in a web application, which at present is hosted at localhost. The application is giving us this exception:
An exception of type 'System.InvalidOperationException' occurred in Google.Apis.Auth.dll but was not handled in user code. Additional information: At least one client secrets (Installed or Web) should be set
The code we are trying to run is this:
UserCredential credential;
GoogleClientSecrets s = new GoogleClientSecrets();
s.Secrets.ClientId = "xxxxxxxxxx-xxxxxxxx.apps.googleusercontent.com";
s.Secrets.ClientSecret = "yyyyyyyyyyyyyyyyyyyyy";
credential = GoogleWebAuthorizationBroker.AuthorizeAsync(s.Secrets, Scopes, "user", CancellationToken.None, null).Result;
var service = new DriveService(new BaseClientService.Initializer()
{
    HttpClientInitializer = credential
    //ApplicationName = ApplicationName,
});
Before this, we used a refresh token to get an access token. We are also not sure what to do with this access token stored in a string variable.
We are trying to have it accessed unattended; we used the Google OAuth playground to get the first refresh token.
Please help us out; we have been trying to get this done for the last 7 days with no success.
Code for a service account:
string[] scopes = new string[] { DriveService.Scope.Drive }; // Full access
var keyFilePath = @"c:\file.p12"; // Downloaded from https://console.developers.google.com
var serviceAccountEmail = "xx@developer.gserviceaccount.com"; // found at https://console.developers.google.com

// Load the key file.
var certificate = new X509Certificate2(keyFilePath, "notasecret", X509KeyStorageFlags.Exportable);
var credential = new ServiceAccountCredential(
    new ServiceAccountCredential.Initializer(serviceAccountEmail)
    {
        Scopes = scopes
    }.FromCertificate(certificate));
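From there, a minimal sketch of building the Drive client from that credential, mirroring the initializer pattern used elsewhere in these answers (the application name is a placeholder):

var service = new DriveService(new BaseClientService.Initializer()
{
    HttpClientInitializer = credential,
    ApplicationName = "MyApplicationName", // placeholder name
});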
Code ripped from Google Drive Authentication C#, which also includes an example of using OAuth2.
Upload file:
/// <summary>
/// Uploads a file
/// Documentation: https://developers.google.com/drive/v2/reference/files/insert
/// </summary>
/// <param name="_service">a valid authenticated DriveService</param>
/// <param name="_uploadFile">path to the file to upload</param>
/// <param name="_parent">Collection of parent folders which contain this file.
/// Setting this field will put the file in all of the provided folders. root folder.</param>
/// <returns>If the upload succeeded, returns the File resource of the uploaded file.
/// If the upload fails, returns null.</returns>
public static File uploadFile(DriveService _service, string _uploadFile, string _parent)
{
    if (System.IO.File.Exists(_uploadFile))
    {
        File body = new File();
        body.Title = System.IO.Path.GetFileName(_uploadFile);
        body.Description = "File uploaded by Diamto Drive Sample";
        body.MimeType = GetMimeType(_uploadFile);
        body.Parents = new List<ParentReference>() { new ParentReference() { Id = _parent } };

        // File's content.
        byte[] byteArray = System.IO.File.ReadAllBytes(_uploadFile);
        System.IO.MemoryStream stream = new System.IO.MemoryStream(byteArray);
        try
        {
            FilesResource.InsertMediaUpload request = _service.Files.Insert(body, stream, GetMimeType(_uploadFile));
            //request.Convert = true; // uncomment this line if you want files to be converted to Drive format
            request.Upload();
            return request.ResponseBody;
        }
        catch (Exception e)
        {
            Console.WriteLine("An error occurred: " + e.Message);
            return null;
        }
    }
    else
    {
        Console.WriteLine("File does not exist: " + _uploadFile);
        return null;
    }
}
Code ripped from DaimtoGoogleDriveHelper.cs
Tip:
The thing to remember with a service account is that it is not you; it is a dummy user account. A service account has its own Google Drive account, so uploading files to it will upload to its account, not yours. What you can do is take the service account's email address and add it as a user on a folder in YOUR personal Google Drive account, giving it write access. This will then allow the service account to upload a file to your personal Google Drive account. Remember to patch the file after upload, granting yourself permissions to the file; otherwise the owner of the file will be the service account. There is currently a bug in the Google Drive API: you have to patch the permissions on the file after you upload it, you can't do it at upload time, so it's a two-step process. (Been there, done that.)
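A minimal sketch of that second, patch-permissions step, using the same v2 Permissions.Insert call shown in the next answer (the email address and the uploadedFile variable are placeholders):

// Grant your own account access to the file the service account just uploaded.
// uploadedFile is the File returned by uploadFile above; the email is your own account.
var permission = new Permission { Value = "me@example.com", Type = "user", Role = "writer" };
_service.Permissions.Insert(permission, uploadedFile.Id).Execute();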
A Google Drive sample project can be found on GitHub.

Create and share Google Drive folders dynamically

I have a list of emails.
For each email, I need to create a Google Drive folder and share it with the given email.
How can I do it programmatically?
I am using ASP.NET 4.0.
First of all, you need to make sure you have an application with a client ID/secret and the correct redirect URI configured. In my case, it's a desktop application. From the developer console you'll get the client ID and secret key.
Now it's time to write some code!
Step 1 - authorization:
private async static Task<UserCredential> Auth(ClientSecrets clientSecrets)
{
    return await GoogleWebAuthorizationBroker.AuthorizeAsync(clientSecrets, Scopes, "user", CancellationToken.None);
}
Step 2 - construct your client for google drive:
private static DriveService GetService(UserCredential credential)
{
    return new DriveService(new BaseClientService.Initializer()
    {
        HttpClientInitializer = credential,
        ApplicationName = "MyApplicationName",
    });
}
Step 3 - create folder (or any other content):
private static string CreateFolder(DriveService service, string folderName)
{
    var file = new File { Title = folderName, MimeType = "application/vnd.google-apps.folder" };
    var result = service.Files.Insert(file).Execute();
    return result.Id;
}
Step 4 - share it!
/// <summary>
/// Share content. Doc link: https://developers.google.com/drive/v2/reference/permissions/insert
/// </summary>
private static void Share(DriveService service, string fileId, string value, string type, string role)
{
    var permission = new Permission { Value = value, Type = type, Role = role };
    service.Permissions.Insert(permission, fileId).Execute();
}
And finally, the usage of the whole thing:
static void Main(string[] args)
{
    var ClientId = "MyClientId";
    var SecretKey = "MySecretKey";
    // Note: Auth above also references Scopes, so in a real program declare it as a static field.
    var Scopes = new[] { DriveService.Scope.DriveFile, DriveService.Scope.Drive };
    var secrets = new ClientSecrets { ClientId = ClientId, ClientSecret = SecretKey };
    var credentials = Auth(secrets).Result;
    var service = GetService(credentials);
    var folderId = CreateFolder(service, "folderName");
    Share(service, folderId, "user@gmail.com", "user", "reader");
}
For a list of emails, you can do much the same thing, creating and sharing content in a loop for every email you have, as sketched below.
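A minimal sketch of that loop, reusing the helpers above (the email list is a placeholder):

// Create one folder per email address and share it with that address.
var emails = new[] { "first@example.com", "second@example.com" }; // hypothetical list
foreach (var email in emails)
{
    var folderId = CreateFolder(service, $"Folder for {email}");
    Share(service, folderId, email, "user", "writer");
}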
Some useful links:
Creating files
Sharing files
You'll also need the Google.Apis.Drive.v2 NuGet package.
The steps to perform this hinge on authenticating with Google first. Once you've done that you'll be able to access the Drive API to perform the actions you'd like. The following links walk you through everything you need to do.
Step 1: Authenticate (server side in your case as you're using ASP.NET)
https://developers.google.com/drive/web/auth/web-server
Step 2: Create your folders
https://developers.google.com/drive/web/folder
Step 3: Share your folders
https://developers.google.com/drive/web/manage-sharing
Take a look at the link below; it has a full course on Google Drive:
https://www.codeschool.com/courses/discover-drive

Configuring Alfresco with James server to receive inbound mails with attachments that get uploaded into a folder

I am using Alfresco 4.2c Community Edition. My requirement is to send mail from a user configured in the James server so that an attachment sent to the mail ID of a particular folder is uploaded into that folder. I have written the following code:
public void sendAttachment(EmailVO emailVO)
{
    try {
        String host = "01HW342035";
        String from = "alfresco@example.com";
        String to = "inbox@example.com";
        String user = "alfresco";     // declared but not used below
        String password = "alfresco"; // declared but not used below

        // Get system properties
        Properties properties = System.getProperties();

        // Setup mail server
        properties.setProperty("mail.smtp.host", host);

        // Get the default Session object.
        Session session = Session.getDefaultInstance(properties);

        // Define message
        Message message = new MimeMessage(session);
        message.setFrom(new InternetAddress(from));
        message.addRecipient(Message.RecipientType.TO, new InternetAddress(to));
        message.setSubject("JavaMail Attachment");

        // Create the message part
        BodyPart messageBodyPart = new MimeBodyPart();

        // Fill the message
        messageBodyPart.setText("hi");
        Multipart multipart = new MimeMultipart();
        multipart.addBodyPart(messageBodyPart);

        // Part two is the attachment
        messageBodyPart = new MimeBodyPart();
        String filename = "C:\\Users\\594952\\Desktop\\Links.txt";
        DataSource source = new FileDataSource(filename);
        messageBodyPart.setDataHandler(new DataHandler(source));
        messageBodyPart.setFileName(filename);
        multipart.addBodyPart(messageBodyPart);

        // Put the parts in the message
        message.setContent(multipart);

        // Send the message
        Transport.send(message);
        System.out.println("Msg Send ....");
    }
    catch (Exception e)
    {
        e.printStackTrace();
    }
}
The code runs fine and no exception occurs. I have configured the James server and Alfresco properties as per https://wiki.alfresco.com/wiki/Configuring_Email_With_Apache_James
I have given the alias "inbox" to a folder in Alfresco. The attachment I sent from the Java code does not get uploaded into the repository. Kindly suggest the changes I should make for this to work correctly.
