Azure Application Insights: Custom data not recorded when Timestamp or StartTime is set

I found a strange issue while sending data using custom code. No data appeared in the portal until I figured out it had something to do with how I send it: if I manually set the StartTime or Timestamp of a telemetry item, the data never shows up.
In the code below, only the event with the name "None" shows up. Widening the time range of the filter in the portal dashboard does not help.
Is it not possible to send historical data to Application Insights with a custom datetime stamp?
I need this because my telemetry data originates on a single server and is then persisted to multiple destinations, such as Application Insights, SQL Server, and blob storage, for later processing.
// telemetry is an already-initialized TelemetryClient instance
var rt = new RequestTelemetry
{
    Name = "StartTime",
    Duration = TimeSpan.FromSeconds(8),
    StartTime = DateTime.Now.AddDays(-1)
};
telemetry.TrackRequest(rt);

rt = new RequestTelemetry
{
    Name = "Timestamp",
    Duration = TimeSpan.FromSeconds(8),
    Timestamp = DateTime.Now.AddDays(-1)
};
telemetry.TrackRequest(rt);

rt = new RequestTelemetry
{
    Name = "Both",
    Duration = TimeSpan.FromSeconds(8),
    Timestamp = DateTime.Now.AddDays(-1),
    StartTime = DateTime.Now.AddDays(-1)
};
telemetry.TrackRequest(rt);

rt = new RequestTelemetry
{
    Name = "None",
    Duration = TimeSpan.FromSeconds(8),
};
telemetry.TrackRequest(rt);
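One likely explanation, to be verified against current Azure documentation: Application Insights ingestion is documented to silently drop telemetry whose timestamp is too far in the past (on the order of 48 hours), which would make backdated items like the ones above disappear without an error. The sketch below illustrates that window with a hypothetical isIngestible check; the 48-hour limit is an assumption from memory, not part of any SDK.

```javascript
// Hypothetical check for whether a backdated telemetry timestamp would
// fall inside an assumed 48-hour ingestion window. The window size is an
// assumption to verify against current Azure Monitor documentation.
const MAX_AGE_MS = 48 * 60 * 60 * 1000;

function isIngestible(timestamp, now = new Date()) {
  const age = now.getTime() - timestamp.getTime();
  return age >= 0 && age <= MAX_AGE_MS; // reject future and too-old stamps
}

const now = new Date();
console.log(isIngestible(new Date(now.getTime() - 24 * 3600 * 1000), now)); // true  (1 day old)
console.log(isIngestible(new Date(now.getTime() - 72 * 3600 * 1000), now)); // false (3 days old)
```

Under that assumption, the "StartTime", "Timestamp", and "Both" requests above (all set to one day in the past) should be accepted, so a one-day offset vanishing also suggests checking how the portal filter interprets the backdated time.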


sqlite3 seems to be slow

I'm currently writing an Electron app and want to use a SQLite database to store some data. My code looks like this:
var startTime = performance.now()
let sql = `UPDATE entry
           SET
             title = "someData",
             url = "someData",
             username = "someData",
             password = "someData",
             email = "someData",
             note = "someData",
             dateOfLastModification = "someData"
           WHERE id = someId;`;
this.database.run(sql, (err, data) => {
    if (err) {
        console.log(err);
    } else {
        var endTime = performance.now()
        console.log(`Call to updateEntry took ${endTime - startTime} milliseconds`)
    }
});
Every time I call this code, it takes more than 200 ms. Isn't that too long for such a simple query? I have other queries that take nearly the same time, so if one action in my app executes three queries, the user has to wait almost a second.
Any tips on what I'm doing wrong?
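One plausible explanation for per-statement costs in this range: each standalone SQLite statement runs in its own implicit transaction, and every transaction commit forces a disk sync, which is the expensive part. Batching statements between BEGIN and COMMIT pays that cost once. The sketch below uses a hypothetical MockDb that merely counts syncs (it is not the npm sqlite3 API) to illustrate the arithmetic.

```javascript
// MockDb is a stand-in that counts "disk syncs": one per statement outside
// a transaction (SQLite's implicit-transaction behavior), one per COMMIT.
class MockDb {
  constructor() { this.syncs = 0; this.inTx = false; }
  run(sql) {
    if (sql === 'BEGIN') { this.inTx = true; return; }
    if (sql === 'COMMIT') { this.inTx = false; this.syncs += 1; return; }
    if (!this.inTx) this.syncs += 1; // implicit transaction around the statement
  }
}

// Three standalone UPDATEs: three commits, three syncs.
const naive = new MockDb();
for (let i = 0; i < 3; i++) naive.run(`UPDATE entry SET title = 'x' WHERE id = ${i}`);

// Same three UPDATEs inside one explicit transaction: a single sync.
const batched = new MockDb();
batched.run('BEGIN');
for (let i = 0; i < 3; i++) batched.run(`UPDATE entry SET title = 'x' WHERE id = ${i}`);
batched.run('COMMIT');

console.log(naive.syncs, batched.syncs); // 3 1
```

If the real cost does come from the commit sync, wrapping the three real queries in one transaction (or enabling SQLite's WAL journal mode) should collapse the roughly 1-second total to something close to a single query's cost.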

How to get transfer data from stripe using payment id and account id (.Net)

I am trying to get transfer-related data from Stripe:
TransferService service = new TransferService();
TransferListOptions stripeTransferList = new TransferListOptions
{
    Destination = accountId,
    Limit = 100
};
var list = await service.ListAsync(stripeTransferList);
var finalData = list.FirstOrDefault(x => x.DestinationPaymentId == paymentId);
When I search for paymentId in that list, I cannot find it, because the page limit is only 100 (Limit = 100).
How can I fetch all the data and then filter it?
The stripe-dotnet library supports automatic pagination, as documented here.
This lets you paginate through all the data in your account based on the criteria you pass as parameters. For example, you can list all Transfer objects made to a specific connected account this way:
TransferService service = new TransferService();
TransferListOptions listOptions = new TransferListOptions
{
    Destination = "acct_123456",
    Limit = 100
};
foreach (var transfer in service.ListAutoPaging(listOptions))
{
    // Do something with this Transfer
}
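The auto-paging loop above can be understood as cursor-based pagination: each Stripe list response carries a has_more flag, and the caller passes the last id it saw as starting_after to get the next page. A generic sketch of that loop, with a hypothetical listPage function standing in for the API (this is not the stripe-dotnet library, which does the same thing internally):

```javascript
// Seven fake transfers standing in for the account's data.
const all = Array.from({ length: 7 }, (_, i) => ({ id: `tr_${i}` }));

// Mock of a Stripe-style list endpoint: returns { data, has_more },
// resuming after the id given in starting_after.
function listPage({ limit, starting_after }) {
  const start = starting_after
    ? all.findIndex((t) => t.id === starting_after) + 1
    : 0;
  const data = all.slice(start, start + limit);
  return { data, has_more: start + limit < all.length };
}

// Keep fetching pages until has_more is false, cursoring on the last id.
function listAll(limit) {
  const results = [];
  let starting_after;
  let page;
  do {
    page = listPage({ limit, starting_after });
    results.push(...page.data);
    if (page.data.length > 0) starting_after = page.data[page.data.length - 1].id;
  } while (page.has_more);
  return results;
}

const transfers = listAll(3);
console.log(transfers.length); // 7
```

This is why a single call with Limit = 100 misses the payment: it is only the first page of the cursor, not the whole list.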
Now, this allows you to iterate over every Transfer, but if you have a lot of data it could be quite slow. An alternative is to start from the charge id, the py_123456 you have from your connected account. If you know which account the charge was created on, you can fetch it directly via the API. This is done using the Retrieve Charge API, passing the connected account id, as documented here.
The Charge resource has a source_transfer property, which is the id of the Transfer (tr_123) on the platform that created this charge. You can also use the expand feature, which lets you fetch the entire Transfer object back to get detailed information about it.
The code would look like this:
// Retrieve the Charge (note: ChargeService and ChargeGetOptions, since we
// are fetching a Charge and expanding its source_transfer)
var chargeOptions = new ChargeGetOptions();
chargeOptions.AddExpand("source_transfer");
var requestOptions = new RequestOptions();
requestOptions.StripeAccount = "acct_12345";
ChargeService service = new ChargeService();
Charge charge = service.Get("py_123456", chargeOptions, requestOptions);
// Access information about the charge or the associated transfer
var transferId = charge.SourceTransfer.Id;
var transferAmount = charge.SourceTransfer.Amount;

IngestFromStreamAsync method does not work

I managed to ingest data successfully using the code below:
var kcsbDM = new KustoConnectionStringBuilder(
        "https://test123.southeastasia.kusto.windows.net",
        "testdb")
    .WithAadApplicationTokenAuthentication(acquireTokenTask.AccessToken);
using (var ingestClient = KustoIngestFactory.CreateDirectIngestClient(kcsbDM))
{
    var ingestProps = new KustoQueuedIngestionProperties("testdb", "TraceLog");
    ingestProps.ReportLevel = IngestionReportLevel.FailuresOnly;
    ingestProps.ReportMethod = IngestionReportMethod.Queue;
    ingestProps.Format = DataSourceFormat.json;
    // generate data stream and column mapping
    ingestProps.IngestionMapping = new IngestionMapping
    {
        IngestionMappings = columnMappings
    };
    var ingestionResult = ingestClient.IngestFromStream(memStream, ingestProps);
}
When I try to use the queued client and IngestFromStreamAsync, the code executes successfully, but no data is ingested into the database, even after 30 minutes:
var kcsbDM = new KustoConnectionStringBuilder(
        "https://ingest-test123.southeastasia.kusto.windows.net",
        "testdb")
    .WithAadApplicationTokenAuthentication(acquireTokenTask.AccessToken);
using (var ingestClient = KustoIngestFactory.CreateQueuedIngestClient(kcsbDM))
{
    var ingestProps = new KustoQueuedIngestionProperties("testdb", "TraceLog");
    ingestProps.ReportLevel = IngestionReportLevel.FailuresOnly;
    ingestProps.ReportMethod = IngestionReportMethod.Queue;
    ingestProps.Format = DataSourceFormat.json;
    // generate data stream and column mapping
    ingestProps.IngestionMapping = new IngestionMapping
    {
        IngestionMappings = columnMappings
    };
    var ingestionResult = ingestClient.IngestFromStreamAsync(memStream, ingestProps);
}
Try running .show ingestion failures against the "https://test123.southeastasia.kusto.windows.net" endpoint to see whether there are ingestion errors.
Also, since you set the Queue reporting method, you can get the detailed result by reading from the queue:
ingestProps.ReportLevel = IngestionReportLevel.FailuresOnly;
ingestProps.ReportMethod = IngestionReportMethod.Queue;
(In the first example you used KustoQueuedIngestionProperties, but you should use KustoIngestionProperties: KustoQueuedIngestionProperties has additional properties, such as ReportLevel and ReportMethod, that are ignored by the direct ingest client.)
Could you please change the line to:
var ingestionResult = await ingestClient.IngestFromStreamAsync(memStream, ingestProps);
Also please note that queued ingestion has a batching stage of up to 5 minutes before the data is actually ingested (see the IngestionBatching policy):
.show table TraceLog policy ingestionbatching
I finally found the reason: stream ingestion needs to be enabled on the table:
.alter table TraceLog policy streamingingestion enable
See the Azure documentation for details.
Enabling the streamingingestion policy is actually only needed if:
stream ingestion is turned on in the cluster (Azure portal), and
the code is using CreateManagedStreamingIngestClient.
The ManagedStreamingIngestClient will first try stream-ingesting the data; only if that fails a few times does it fall back to the queued client. If the data being ingested is small (under 4 MB), this client is the recommended choice.
If you are using the queued client, you can try:
.show commands-and-queries | where StartedOn > ago(20m) and Text contains "{YourTableName}" and CommandType == "DataIngestPull"
This shows you the command that was executed; however, it can lag by more than 5 minutes.
Finally, with whichever client you use, you can check the status of an ingestion. Wrap the stream in a StreamDescription so you have a source id:
StreamDescription description = new StreamDescription
{
    SourceId = Guid.NewGuid(),
    Stream = dataStream
};
Then ingest by calling:
var checker = await client.IngestFromStreamAsync(description, ingestProps);
and afterwards query the status:
var statusCheck = checker.GetIngestionStatusBySourceId(description.SourceId.Value);
This tells you the status of that ingestion job. It is best done on a separate thread, so you can keep polling every few seconds.
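The status check above lends itself to a simple polling loop: ask for the status, wait a short interval if it is still pending, and stop on success, failure, or timeout. A generic sketch of that loop, with a hypothetical getStatus callback standing in for GetIngestionStatusBySourceId (this is not the Kusto SDK):

```javascript
// makeMockStatus simulates an ingestion that reports 'Pending' until the
// readyAfter-th status check, then 'Succeeded'.
function makeMockStatus(readyAfter) {
  let calls = 0;
  return () => (++calls >= readyAfter ? 'Succeeded' : 'Pending');
}

// Poll getStatus until it leaves 'Pending', sleeping intervalMs between
// checks and giving up after maxTries attempts.
async function waitForIngestion(getStatus, { intervalMs = 10, maxTries = 100 } = {}) {
  for (let i = 0; i < maxTries; i++) {
    const status = getStatus();
    if (status !== 'Pending') return status; // 'Succeeded' or 'Failed'
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error('Timed out waiting for ingestion status');
}

waitForIngestion(makeMockStatus(3)).then((s) => console.log(s)); // Succeeded
```

In the real SDK the callback body would call checker.GetIngestionStatusBySourceId(...) and inspect the returned status object; the loop structure stays the same.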

Get a value from a record by field name in a server script

I am working on an app with App Maker and want to get some values from records.
I use this:
var query = app.models.Device.newQuery();
query.filters.Projet._equals = "project name";
var records = query.run();
for (var i in records) {
    var data = records[i];
    console.log(data.Name);
}
I would like to get a value by field name, like this:
for (var i in records) {
    var data = records[i];
    console.log(data.getString("Name"));
}
I can do that with a JDBC object, but that is not a good approach.
Is there a way to do this with App Maker records?
Thank you

How to persist data throughout all pages in the application?

Hi, I am developing a cross-platform application. I have a register page where I collect user details such as email, password, and other information, and I save all of it under that particular user: I create a child keyed by the unique email the user provides, and all remaining details are stored under it.
Now I need the product and service data stored under that first user as well. My path must always be the user who is logged in to the application. Do I have to carry that data to every page? How can I do it? Can someone please help me?
Here is the code that creates the above structure in my database:
$(".subscribe1").click(function () {
    var t1 = document.getElementById("email").value;
    var t2 = document.getElementById("password").value;
    var t3 = document.getElementById("confirmpassword").value;
    var companyname = document.getElementById("companyname").value;
    var address = document.getElementById("address").value;
    var city = document.getElementById("city").value;
    var pincode = document.getElementById("pincode").value;
    var country = document.getElementById("country").value;
    var sname = document.getElementById("sname").value;
    var sid = document.getElementById("sid").value;
    var saddr = document.getElementById("saddr").value;
    var scity = document.getElementById("scity").value;
    var spincode = document.getElementById("spincode").value;
    var scountry = document.getElementById("scountry").value;
    // Firebase keys cannot contain ".", so replace it in the email
    var s = t1.replace('.', ',');
    console.log(s);
    var ref = new Firebase("https://sampleapp.firebaseio.com/");
    var account = ref.child(s + "/master_account");
    account.push({ "email": t1, "password": t2 });
    var cmp = ref.child(s + "/company_details");
    cmp.push({ "email": t1, "company_name": companyname, "address": address, "city": city, "pincode": pincode, "country": country });
    var store = ref.child(s + "/store_location");
    store.push({ "email": t1, "store_name": sname, "store_id": sid, "address": saddr, "city": scity, "pincode": spincode, "country": scountry });
});
To be clear on your point: you want a logged-in user's data to be available across your app. Here are possible measures you could take to achieve that.
Once a user is logged in, save the basic data (such as first_name, last_name, email) to localStorage, a cache, or sessionStorage.
Then, on other pages, you can query Firebase to load whatever additional data that page needs:
ref.child(email).child('master_account').once('value', () => {});
Once you get that data, you can keep it in a variable if you only need it on that page, or put it in the cache, localStorage, or sessionStorage if you want to use it across other pages.
When a user logs out, clear whichever store you chose to save the data in. You should also check whether the user's details still exist in that store, and log them out otherwise.
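The save-on-login, read-on-each-page, clear-on-logout flow described above can be sketched in plain JavaScript. A simple object stands in for window.localStorage so the sketch runs outside a browser; in the app itself you would use localStorage (or sessionStorage) directly.

```javascript
// Stand-in for window.localStorage with the same setItem/getItem/removeItem API.
const storage = {
  data: {},
  setItem(k, v) { this.data[k] = String(v); },
  getItem(k) { return k in this.data ? this.data[k] : null; },
  removeItem(k) { delete this.data[k]; },
};

// Called once after login: persist the basic user details.
function saveSession(user) {
  storage.setItem('session_user', JSON.stringify(user));
}

// Called on every page: null means "not logged in", so redirect to login.
function currentUser() {
  const raw = storage.getItem('session_user');
  return raw ? JSON.parse(raw) : null;
}

// Called on logout: remove the saved details.
function clearSession() {
  storage.removeItem('session_user');
}

saveSession({ email: 'user@example.com', first_name: 'Ada' });
console.log(currentUser().first_name); // Ada
clearSession();
console.log(currentUser()); // null
```

With this in place, each page checks currentUser() first and only then queries Firebase for the page-specific data under that user's key.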
I hope this helps.
