Create Kusto client using federated identity credential without using any app secret - azure-data-explorer

I'm trying to connect to Azure Data Explorer (ADX/Kusto) from an Azure Kubernetes Service (AKS) pod.
# Sample code
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder
KCSB_DATA = KustoConnectionStringBuilder.with_aad_application_key_authentication(KUSTO_DATA_URI, CLIENT_ID, CLIENT_SECRET, AUTHORITY_ID)
ingest_data = KustoClient(KCSB_DATA)
Is there any way to create a Kusto client using AAD application authentication by providing a federated identity credential, but without using any app secret (CLIENT_SECRET), in Python?

In this scenario, there are two ways you can do this.
Without CLIENT_SECRET:
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder
kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(KUSTO_URI)
ingest_data = KustoClient(kcsb)
If you have not signed in with your AAD credentials, it will automatically prompt you to do so by opening a web browser to sign in.
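To address the federated-credential part of the question directly: on an AKS pod with workload identity configured, you can hand the Kusto SDK a token credential from azure-identity and never touch a client secret. This is a minimal sketch, assuming a recent azure-kusto-data that exposes KustoConnectionStringBuilder.with_azure_token_credential, the azure-identity package, and a pod where the workload-identity webhook has injected AZURE_CLIENT_ID, AZURE_TENANT_ID and AZURE_FEDERATED_TOKEN_FILE:
from azure.identity import WorkloadIdentityCredential  # DefaultAzureCredential also works here
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

KUSTO_URI = "https://<cluster>.<region>.kusto.windows.net"  # placeholder cluster URI

# Picks up the federated service-account token projected into the pod; no app secret involved.
credential = WorkloadIdentityCredential()

kcsb = KustoConnectionStringBuilder.with_azure_token_credential(KUSTO_URI, credential)
client = KustoClient(kcsb)
If your azure-kusto-data version does not have with_azure_token_credential, upgrading the package is the simplest route.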
With CLIENT_SECRET:
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder
from azure.kusto.data.exceptions import KustoServiceError
from azure.kusto.data.helpers import dataframe_from_result_table
KUSTO_DATABASE = "XXXX"
CLUSTER = "https://mynode.myregion.kusto.windows.net"
CLIENT_ID = "XXXX"
CLIENT_SECRET = "XXXX"
AUTHORITY_ID = "<insert here your tenant id>"
KCSB_DATA = KustoConnectionStringBuilder.with_aad_application_key_authentication(CLUSTER, CLIENT_ID, CLIENT_SECRET, AUTHORITY_ID)
KUSTO_CLIENT = KustoClient(KCSB_DATA)
CREATE_TABLE_COMMAND = "<your management command; see the link below>"
RESPONSE = KUSTO_CLIENT.execute_mgmt(KUSTO_DATABASE, CREATE_TABLE_COMMAND)
dataframe_from_result_table(RESPONSE.primary_results[0])
You can refer to this link:
https://learn.microsoft.com/en-us/azure/data-explorer/python-ingest-data#add-import-statements-and-constants

Related

Where do you get the Telegram session_hash

https://github.com/Eloise1988/OPENAI/blob/main/asyncV2/README.md
api_id: Telegram API ID and Hash (you can get it from my.telegram.org)
api_hash: Telegram API Hash (you can get it from my.telegram.org)
session_hash: Telegram Session Hash (you can get it from my.telegram.org)
I found api_id and api_hash, but I didn't find session_hash
import asyncio
from telethon import TelegramClient

api_id = 123456             # Your API ID
api_hash = 'your_api_hash'  # Your API Hash

async def main():
    # Create a new TelegramClient instance; entering the context manager
    # connects to the Telegram servers and logs in (prompting for your phone
    # number and login code on the first run).
    async with TelegramClient('session_file', api_id, api_hash) as client:
        # After logging in, the session is automatically generated and stored
        # in the 'session_file' session, which you can reuse in future runs to
        # log in with the same session:
        session_hash = client.session.save()
        print(str(session_hash))

asyncio.run(main())
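One caveat on the snippet above: with a file-based session like 'session_file', the session data lives in session_file.session on disk, and save() mainly persists it rather than returning a string. If what that README actually wants is a portable string, Telethon's StringSession produces one; a small sketch:
from telethon.sync import TelegramClient
from telethon.sessions import StringSession

api_id = 123456             # your API ID from my.telegram.org
api_hash = 'your_api_hash'  # your API hash from my.telegram.org

# Logging in with an empty StringSession prompts for phone/code once;
# save() then returns a string you can reuse instead of a .session file.
with TelegramClient(StringSession(), api_id, api_hash) as client:
    print(client.session.save())
I can't say for certain that this string is what that repository calls session_hash, but it is the usual portable form of a Telethon session.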

403 Error when using generated Sas token to display blobs from Azure blob storage

I've been trying to display images from Azure blob storage on my web app for a while now.
My storage account SAS token is:
?sv=2021-06-08&ss=bfqt&srt=sco&sp=rwdlacupiytfx&se=2022-12-09T08:03:09Z&st=2022-11-09T08:03:09Z&spr=https&sig=SIGNATURE_HERE
This SAS token includes all permissions and allows all resource types and services.
To generate a SAS token to view a blob, I go through the following steps:
1. Getting the blobService:
const blobService = new BlobServiceClient(`https://${storageAccountName}.blob.core.windows.net/?${storageAccountSasToken}`);
2. Creating a containerClient:
const containerClient = blobService.getContainerClient(containerName);
3. Creating a sasOptions object:
const sasOptions = {containerName: containerName, blobName: blobName, startsOn: sasStartTime, expiresOn: sasExpiryTime, permissions: "racwdt" as unknown as BlobSASPermissions};
4. Generating SAS token with the parameters:
generateBlobSASQueryParameters(sasOptions, sharedKeyCredential).toString();
5. Sending the blobURL (with the SAS token attached) back to the user:
const blobURL = containerClient.getBlockBlobClient(blobName).url;
The problem is, when using the blobURL as src for my Image tag, I get a 403 (forbidden) error:
Server failed to authenticate the request. Make sure the value of
Authorization header is formed correctly including the signature.
the faulty blobURL in question:
https://mywebsite.blob.core.windows.net/container/profilePictures%2Fpicture.png?sv=2021-06-08&ss=bfqt&srt=sco&sp=rwdlacupiytfx&se=2022-12-09T08:03:09Z&st=2022-11-09T08:03:09Z&spr=https&sig=CITlY0uPxBCGdBeMtIxxJafJM61HQlhooR5ZnDiPHuE%3D
The Error:
AuthenticationFailed
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. RequestId:df81f724-f01e-000e-593e-f41f7f000000 Time:2022-11-09T13:24:08.3305270Z
Signature did not match. String to sign used was STORAGE_ACCOUNT_NAME racwdt bfqt sc 2022-11-09T12:31:47Z 2022-12-09T20:31:47Z https 2021-06-08
Additional information:
The sasToken env variable includes "?" at the start of the string
All containers are PRIVATE.
My storage account is only accessible through a specific virtual network
My website's domain is listed on "Allowed Origins" in CORS tab, as well as localhost:3000
Uploading to Blob storage works, so it's safe to assume that the problem is solely related to the generated SAS token.
Any assistance would be greatly appreciated :)
I tried this in my environment and got the below results:
Code:
var storage = require("@azure/storage-blob");

const accountname = "storage13261";
const key = "< Account key >";
const cred = new storage.StorageSharedKeyCredential(accountname, key);
const blobServiceClient = new storage.BlobServiceClient(`https://${accountname}.blob.core.windows.net`, cred);

const containerName = "test";
const client = blobServiceClient.getContainerClient(containerName);
const blobName = "nature.png";
const blobClient = client.getBlobClient(blobName);

const blobSAS = storage.generateBlobSASQueryParameters({
    containerName,
    blobName,
    permissions: storage.BlobSASPermissions.parse("racwdt"),
    startsOn: new Date(),
    expiresOn: new Date(new Date().valueOf() + 86400 * 1000)  // one day from now (valueOf() is in milliseconds)
},
    cred
).toString();

const sasUrl = blobClient.url + "?" + blobSAS;
console.log(sasUrl);
Console:
The problem is in your SAS token: the string-to-sign in the error shows the permissions racwdt, but the SAS you are sending carries rwdlacupiytfx. That mismatch makes the signature check fail, which is why the image is not displayed.
I checked the URL + SAS token in the browser and it worked perfectly.
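If you would rather generate the blob-level SAS from Python instead of Node, a roughly equivalent sketch with the azure-storage-blob v12 package looks like this (account name and key are placeholders):
from datetime import datetime, timedelta
from azure.storage.blob import BlobSasPermissions, generate_blob_sas

account_name = "storage13261"     # placeholder storage account
account_key = "<account key>"     # placeholder account key
container_name = "test"
blob_name = "nature.png"

# Read-only blob SAS valid for one day, signed with the account key.
sas = generate_blob_sas(
    account_name=account_name,
    container_name=container_name,
    blob_name=blob_name,
    account_key=account_key,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.utcnow() + timedelta(days=1),
)

blob_url = f"https://{account_name}.blob.core.windows.net/{container_name}/{blob_name}?{sas}"
print(blob_url)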
Reference:
Grant limited access to data with shared access signatures (SAS) - Azure Storage | Microsoft Learn
Updated:
You can also get both the SAS and the SAS URL manually, checking the required permissions, as shown in the image below.

How to fetch the viewid from google analytics by giving the access_token using python

After a successful login from the consent screen, I am getting the access_token; now the next step is to fetch all the view IDs from the Google Analytics account. Please help me out.
Example: This is the access_token("ya29.A0ARrdaM8IvLg8jjVHWgxneSp_mxgFYHpKt4LwPGZEVqzOphMA2Cll6mjMxlQRFanbJHh1WrBEYVe2Y1BvBU6j7h_17nVeY4h-FWdUuv5bo0rzETTz_-xw4t5ZNBYpj26Cy3u4Y1trZnqVIA4")
You should check the Management API quickstart for Python:
"""A simple example of how to access the Google Analytics API."""
import argparse
from apiclient.discovery import build
import httplib2
from oauth2client import client
from oauth2client import file
from oauth2client import tools
def get_service(api_name, api_version, scope, client_secrets_path):
"""Get a service that communicates to a Google API.
Args:
api_name: string The name of the api to connect to.
api_version: string The api version to connect to.
scope: A list of strings representing the auth scopes to authorize for the
connection.
client_secrets_path: string A path to a valid client secrets file.
Returns:
A service that is connected to the specified API.
"""
# Parse command-line arguments.
parser = argparse.ArgumentParser(
formatter_class=argparse.RawDescriptionHelpFormatter,
parents=[tools.argparser])
flags = parser.parse_args([])
# Set up a Flow object to be used if we need to authenticate.
flow = client.flow_from_clientsecrets(
client_secrets_path, scope=scope,
message=tools.message_if_missing(client_secrets_path))
# Prepare credentials, and authorize HTTP object with them.
# If the credentials don't exist or are invalid run through the native client
# flow. The Storage object will ensure that if successful the good
# credentials will get written back to a file.
storage = file.Storage(api_name + '.dat')
credentials = storage.get()
if credentials is None or credentials.invalid:
credentials = tools.run_flow(flow, storage, flags)
http = credentials.authorize(http=httplib2.Http())
# Build the service object.
service = build(api_name, api_version, http=http)
return service
def get_first_profile_id(service):
# Use the Analytics service object to get the first profile id.
# Get a list of all Google Analytics accounts for the authorized user.
accounts = service.management().accounts().list().execute()
if accounts.get('items'):
# Get the first Google Analytics account.
account = accounts.get('items')[0].get('id')
# Get a list of all the properties for the first account.
properties = service.management().webproperties().list(
accountId=account).execute()
if properties.get('items'):
# Get the first property id.
property = properties.get('items')[0].get('id')
# Get a list of all views (profiles) for the first property.
profiles = service.management().profiles().list(
accountId=account,
webPropertyId=property).execute()
if profiles.get('items'):
# return the first view (profile) id.
return profiles.get('items')[0].get('id')
return None
def get_results(service, profile_id):
# Use the Analytics Service Object to query the Core Reporting API
# for the number of sessions in the past seven days.
return service.data().ga().get(
ids='ga:' + profile_id,
start_date='7daysAgo',
end_date='today',
metrics='ga:sessions').execute()
def print_results(results):
# Print data nicely for the user.
if results:
print 'View (Profile): %s' % results.get('profileInfo').get('profileName')
print 'Total Sessions: %s' % results.get('rows')[0][0]
else:
print 'No results found'
def main():
# Define the auth scopes to request.
scope = ['https://www.googleapis.com/auth/analytics.readonly']
# Authenticate and construct service.
service = get_service('analytics', 'v3', scope, 'client_secrets.json')
profile = get_first_profile_id(service)
print_results(get_results(service, profile))
if __name__ == '__main__':
main()
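Since you already have an access_token from the consent screen, you can also skip the client-secrets flow entirely and wrap the token in a google-auth credential. A minimal sketch, assuming google-api-python-client and google-auth are installed and the token was issued with the analytics.readonly scope:
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

access_token = "ya29...."  # the access token you obtained after login

# Wrap the raw access token and build the Analytics Management API (v3) service.
creds = Credentials(token=access_token)
analytics = build('analytics', 'v3', credentials=creds)

# List every view (profile) id the authorized user can see.
profiles = analytics.management().profiles().list(
    accountId='~all', webPropertyId='~all').execute()
for profile in profiles.get('items', []):
    print(profile['id'], profile.get('name'))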

REST API to stocktwits.com in R

I have been banging my head against this the whole day. I am trying to access the StockTwits API (https://api.stocktwits.com/developers) from an R session. I have previously accessed the Twitter API (via rtweet) without hassle.
I have created an app and got the client id and key (the below are just examples).
app_name = "some.name";
consumer_key = "my_client_id";
consumer_secret = "my_client_key";
uri = "http://iimb.ac.in" # this is my institute's homepage. It doesn't allow locahost OR 127.0.0.1
scope = "read,watch_lists,publish_messages,publish_watch_lists,direct_messages,follow_users,follow_stocks";
base_url = "https://api.stocktwits.com/api/2/oauth"; # see https://api.stocktwits.com/developers/docs/api
The procedure is to create an oauth2.0 app and endpoint. Then call oauth2.0_token.
oa = httr::oauth_app(app_name, key = consumer_key, secret = consumer_secret, redirect_uri = uri);
oe = httr::oauth_endpoint("stocktwits", "authorize", "token", base_url = base_url);
mytoken = httr::oauth2.0_token(oe, oa, user_params = list(resource = base_url), use_oob = F); # use_oob = T doesn't work.
After firing the above, it takes me to the browser for sign-in. I sign in and it asks me to connect. After that, I am taken back to my URI plus a code, i.e. https://www.iimb.ac.in/?code=295ea3114c3d8680a0ed295d52313d7092dd90ae&state=j9jXzEqri1
Is the code my access token or something else? The oauth2.0_token() call keeps waiting for the code since the callback is not localhost, and I can't seem to get the hang of that.
I then try to access the API using the above code as the access token, but I am thrown an "invalid access token" error. The format is described in https://api.stocktwits.com/developers/docs/api#search-index-docs
Can someone tell me what I have missed? If required I can share my app_name, consumer_key and consumer_secret for replication.

Sending async email with Flask-Security

I'm attempting to configure Flask-Security to send email asynchronously.
I have some code which sends async email via Flask-Mail, but I'm having trouble integrating it with my application factory function so that it works in conjunction with Flask-Security.
Application factory:
mail = Mail()
db = SQLAlchemy()
security = Security()
from app.models import User, Role
user_datastore = SQLAlchemyUserDatastore(db, User, Role)
def create_app(config_name):
    # Config
    app = Flask(__name__)
    app.config.from_object(config[config_name])
    config[config_name].init_app(app)

    # Initialize extensions
    mail.init_app(app)
    db.init_app(app)
    security.init_app(app, user_datastore)

    return app
In the Flask-Security documentation it says to use @security.send_mail_task to override the way the extension sends emails.
So where exactly do I implement this decorator? Seems like anywhere I put it inside the application factory, I get circular imports.
These are the async email functions I am trying to use, taken from this issue:
@async
def send_security_email(msg):
    with app.app_context():
        mail.send(msg)

@security.send_mail_task
def async_security_email(msg):
    send_security_email(msg)
Where does this code need to be put in order to work with the app factory?
Thanks in advance.
I was able to achieve this like so:
mail = Mail()
db = SQLAlchemy()
security = Security()
from app.models import User, Role
user_datastore = SQLAlchemyUserDatastore(db, User, Role)
def create_app(config_name):
    # Config
    app = Flask(__name__)
    app.config.from_object(config[config_name])
    config[config_name].init_app(app)

    # Initialize extensions
    mail.init_app(app)
    db.init_app(app)
    security_ctx = security.init_app(app, user_datastore)

    # Send Flask-Security emails asynchronously
    @security_ctx.send_mail_task
    def async_security_email(msg):
        send_security_email(app, mail, msg)

    return app
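For completeness, the @async decorator used above is not part of Flask or Flask-Security; it is typically a small threading helper from the linked issue. Below is a sketch of what it and the adjusted send_security_email(app, mail, msg) usually look like (note that async became a reserved word in Python 3.7, so the decorator needs a different name there, e.g. run_async):
from threading import Thread

def run_async(f):
    # Run the decorated function in a background thread so mail.send() does not block the request.
    def wrapper(*args, **kwargs):
        thread = Thread(target=f, args=args, kwargs=kwargs)
        thread.start()
        return thread
    return wrapper

@run_async
def send_security_email(app, mail, msg):
    # A worker thread has no application context of its own, so push one explicitly.
    with app.app_context():
        mail.send(msg)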
