Cosmos DB Change Feed does not give the desired result - azure-cosmosdb

I am trying to get all the changes from my monitored database through the Cosmos DB change feed, using the code below. I have tried different durations: the changes happening today, the changes from the last 7 days, and all changes overall. In every case I get all the changes on the first run and no changes on the second run unless a new change comes in. I would like to know what I am doing wrong here, and if I want to get only the last week's changes on each run, how should I configure the change feed if my code below is not correct? Thanks in advance.
class Program
{
    private static CosmosClient Client { get; set; }

    static void Main(string[] args)
    {
        Task.Run(async () =>
        {
            Client = new CosmosClient("AccountEndpoint=https://test.documents.azure.com:443/;AccountKey=FW5yvDA==;");
            var database = Client.GetDatabase("twstDatabase");
            var container = database.GetContainer("TestContainer");
            var leaseContainer = database.GetContainer("leases");

            var cfp = container.GetChangeFeedProcessorBuilder<dynamic>("cfpLibraryDThird", ProcessChanges)
                .WithLeaseContainer(leaseContainer)
                .WithInstanceName("Change Feed Instance Demo")
                // I change the instance name when trying a different start time
                .WithStartTime(DateTime.Today.AddDays(-7).ToUniversalTime())
                //.WithStartTime(DateTime.MinValue.ToUniversalTime())
                .Build();

            await cfp.StartAsync();
            Console.WriteLine("Started change feed processor - press a key to stop");
            Console.ReadKey(true);
            await cfp.StopAsync();
        }).Wait();
    }

    static async Task ProcessChanges(IReadOnlyCollection<dynamic> docs, CancellationToken cancellationToken)
    {
        foreach (var doc in docs)
        {
            Console.WriteLine($"Document {doc.id} has changed");
        }
    }
}

This was fixed in 3.15.1. I updated the package and it fixed the problem.

This is a bug in 3.15; you can track it here: https://github.com/Azure/azure-cosmos-dotnet-v3/issues/2031.
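A note on the "only the last week's changes on each run" part: the change feed processor checkpoints its progress in the lease container, so WithStartTime only takes effect when no lease state exists yet for the given processor name; on subsequent runs the processor resumes from its checkpoint, which is why the second run is silent. Below is a minimal sketch of one way around this, assuming you are fine abandoning old lease documents (the date-stamped processor name is made up for illustration):

// hypothetical: a fresh processor name per run means no existing leases,
// so WithStartTime is applied again and the last 7 days are replayed
var processorName = $"cfpWeekly-{DateTime.UtcNow:yyyyMMddHHmmss}";
var cfp = container.GetChangeFeedProcessorBuilder<dynamic>(processorName, ProcessChanges)
    .WithLeaseContainer(leaseContainer)
    .WithInstanceName("Change Feed Instance Demo")
    .WithStartTime(DateTime.UtcNow.AddDays(-7))
    .Build();

Old lease documents for previous processor names will accumulate in the lease container unless you delete them.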


Azure Function integration of Serilog with Application Insights, logs visible in Search but not appearing in Failures events timeline

I am trying to use Serilog with the Application Insights sink for logging purposes. I can see the logs in the Search bar in the Azure Portal (Application Insights), but the same logs are not visible when viewing the timeline of events in the Failures or Performance tab. Thanks.
Below is the code I am using for registering the logger in the function Startup, which then gets injected into the function for logging:
var logger = new LoggerConfiguration()
    .Enrich.FromLogContext()
    .Enrich.WithProperty("ApplicationName", "testApp")
    .Enrich.WithProperty("Environment", "Dev")
    .WriteTo.ApplicationInsights(GetTelemetryClient("Instrumentationkey"), TelemetryConverter.Traces)
    .CreateLogger();

builder.Services.AddSingleton<ILogger>(logger);
The TelemetryClient is fetched from a helper method:
public static TelemetryClient GetTelemetryClient(string key)
{
    var teleConfig = new TelemetryConfiguration { InstrumentationKey = key };
    var teleClient = new TelemetryClient(teleConfig);
    return teleClient;
}
host.json
{
    "version": "2.0",
    "logging": {
        "applicationInsights": {
            "samplingExcludedTypes": "Request",
            "samplingSettings": {
                "isEnabled": true
            }
        }
    }
}
I see what you mean; allow me to sum up my testing results here.
First, the Failures blade is not designed to provide a timeline for tracing the details (what happened before the exception took place). It shows all the exceptions, how often each error happened, how many users were affected, and so on; it stands in a high place to view the whole program.
To achieve your goal, I think you can use this KQL query in the Logs blade, or watch it in the transaction blade:
union traces, requests, exceptions
| where operation_Id == "178845c426975d4eb96ba5f7b5f376e1"
Basically, we may add many logs along the executing chain: in the controller, log the input parameter, then log the result of data combining or formatting, and log the exception information in the catch block. Here is my testing code. Like you, I can't see any other information in the Failures blade, but in the transaction blade I can see the timeline.
public class HelloController : Controller
{
    public string greet(string name)
    {
        Log.Verbose("come to greet function");
        Log.Debug("serilog_debug_info");
        Log.Information("greet name input " + name);
        int count = int.Parse(name); // deliberately throws for non-numeric input, to test the Failures blade
        Log.Warning("enter greet name is : {0}", count);
        return "hello " + name;
    }
}
We can easily see that the whole chain shares the same operationId, and via all these logs we can pinpoint the offending line of code. By the way, if I surround the code with try/catch, the exception won't be captured in the Failures blade.
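If you do need a try/catch around such code, one option (a small sketch of my own, not from the testing above) is to log the exception through Serilog and re-throw it, so the hosting pipeline still records a failed request for the Failures blade:

public string greet(string name)
{
    try
    {
        int count = int.Parse(name);
        return "hello " + name;
    }
    catch (Exception ex)
    {
        // the sink turns this into exception telemetry tied to the same operationId
        Log.Error(ex, "greet failed for input {Name}", name);
        throw; // re-throw so the request still shows up as a failure
    }
}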
==================================
When using Serilog to integrate with Application Insights, we need to send the Serilog events to Application Insights, and we will see lots of traces in the transaction search, so it's better to set the MinimumLevel to Information or higher. We can also use a KQL query by operationId to see the whole chain.
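For reference, a minimal sketch of pinning the minimum level in the logger configuration; the sink wiring is carried over from the question, and telemetryClient stands in for however you obtain yours:

var logger = new LoggerConfiguration()
    .MinimumLevel.Information() // drop Verbose/Debug noise before it reaches Application Insights
    .Enrich.FromLogContext()
    .WriteTo.ApplicationInsights(telemetryClient, TelemetryConverter.Traces)
    .CreateLogger();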
You can solve this by following the solution provided by Azure Application Insights in their GitHub repo. As per this GitHub issue, you can either use DI to configure the TelemetryConfiguration, i.e.:
services.Configure<TelemetryConfiguration>((o) =>
{
    o.InstrumentationKey = "123";
    o.TelemetryInitializers.Add(new OperationCorrelationTelemetryInitializer());
});
or you can configure it manually like this:
var config = TelemetryConfiguration.CreateDefault();
var client = new TelemetryClient(config);
So in your code, you have to change your GetTelemetryClient from
public static TelemetryClient GetTelemetryClient(string key)
{
    var teleConfig = new TelemetryConfiguration { InstrumentationKey = key };
    var teleClient = new TelemetryClient(teleConfig);
    return teleClient;
}
to this
public static TelemetryClient GetTelemetryClient(string key)
{
    // the key parameter is no longer used; CreateDefault picks the
    // instrumentation key up from the active configuration instead
    var teleConfig = TelemetryConfiguration.CreateDefault();
    var teleClient = new TelemetryClient(teleConfig);
    return teleClient;
}
In order to use logging via TelemetryConfiguration as mentioned in the answer above for Azure Functions, we just need to update the function as in the snippet below, and on deployment it should fetch the instrumentation key itself:
public static TelemetryClient GetTelemetryClient()
{
    var teleConfig = TelemetryConfiguration.CreateDefault();
    var teleClient = new TelemetryClient(teleConfig);
    return teleClient;
}
But to run both locally and after deployment on Azure, we need to add something like this in the function Startup and get rid of the method above:
builder.Services.Configure<TelemetryConfiguration>((o) =>
{
    o.InstrumentationKey = "KEY";
    o.TelemetryInitializers.Add(new OperationCorrelationTelemetryInitializer());
});

builder.Services.AddSingleton<ILogger>(sp =>
{
    var logger = new LoggerConfiguration()
        .Enrich.FromLogContext()
        .Enrich.WithProperty("ApplicationName", "TEST")
        .Enrich.WithProperty("Environment", "DEV")
        .WriteTo.ApplicationInsights(
            sp.GetRequiredService<TelemetryConfiguration>(), TelemetryConverter.Traces)
        .CreateLogger();
    return logger;
});
Afterwards, we just need to use typical DI in our classes/Azure Functions to use the ILogger:
public class Test
{
    private readonly ILogger _log;

    public Test(ILogger log)
    {
        _log = log;
    }
}
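For completeness, a sketch of what consuming the injected logger from a function might look like (the function name and trigger here are made up for illustration):

public class GreetFunction
{
    private readonly ILogger _log;

    public GreetFunction(ILogger log) => _log = log;

    [FunctionName("GreetFunction")]
    public IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req)
    {
        // Serilog's ILogger, resolved from the container registration above
        _log.Information("GreetFunction invoked at {Time}", DateTime.UtcNow);
        return new OkResult();
    }
}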

Flutter moor insert hangs on isolate

When I create a database I want to initialize it with a ton of data.
I have the following initialization service.
// This needs to be a top-level method because it's run on a background isolate
DatabaseConnection _backgroundConnection() {
  // Construct the database. You can also wrap the VmDatabase in a "LazyDatabase"
  // if you need to run work before the database opens.
  final database = VmDatabase.memory();
  return DatabaseConnection.fromExecutor(database);
}

Future<void> _initDatabase(Map<String, dynamic> args) async {
  var moorIsolate = await MoorIsolate.spawn(_backgroundConnection);
  var connection = await moorIsolate.connect();
  var db = BillingDatabase.connect(connection);
  _initBillingSpecialties(db, args["specialties"]);
}

Future<void> _initBillingSpecialties(BillingDatabase db, String specialtiesJson) async {
  var json = jsonDecode(specialtiesJson);
  var jsonSpecialties = json["specialties"] as List<dynamic>;
  var specialities = jsonSpecialties
      .map((s) => DbSpecialtiesCompanion(
          name: Value(s["specialty_name"]),
          mohNumber: Value(s["moh_specialty"])))
      .toList();
  return db.specialtyDao.saveAllSpecialties(specialities);
}

#injectable
class InitDbService {
  Future<void> initDatabase() async {
    WidgetsFlutterBinding.ensureInitialized();
    var specialties = await rootBundle.loadString("lib/assets/billing_specialties.json");
    compute(_initDatabase, {"specialties": specialties});
    //initDbSync(specialties);
  }

  Future<void> initDbSync(String specialtiesJson) async {
    var json = jsonDecode(specialtiesJson);
    var jsonSpecialties = json["specialties"] as List<dynamic>;
    var specialities = jsonSpecialties
        .map((s) => DbSpecialtiesCompanion(
            name: Value(s["specialty_name"]),
            mohNumber: Value(s["moh_specialty"])))
        .toList();
    var dao = GetIt.instance.get<SpecialtyDao>();
    return dao.saveAllSpecialties(specialities);
  }
}
initDbSync runs and inserts just fine, while db.specialtyDao.saveAllSpecialties(specialities); never actually executes any SQL. I have it printing log statements for the moment so I can see what it's doing.
Update: I found out that VmDatabase.memory(logStatements: true); was needed to see the SQL, and I can see it printing the statements.
I'm running on a simulator, so I can look at the raw db file: there's nothing there. When I query in the app there's also nothing there.
What's not really clear in the documentation is that VmDatabase.memory(); opens a new, empty database in memory; it does not attach to your existing database. You want to take the reference to the file that you pass in the constructor and use
VmDatabase(File(dbFile));
Then it will actually run your SQL against the database on disk.

How do I read and update HttpResponse body using PipeWriter?

This is actually a two-part question relating directly to .NET Core 3.0 and specifically to PipeWriter: 1) How should I read in the HttpResponse body? 2) How can I update the HttpResponse? I'm asking both questions because I feel the solution will likely involve the same understanding and code.
Below is how I got this working in .NET Core 2.2. Note that this uses streams instead of PipeWriter and other "ugly" things associated with streams, e.g. MemoryStream, Seek, StreamReader, etc.
public class MyMiddleware
{
    private RequestDelegate Next { get; }

    public MyMiddleware(RequestDelegate next) => Next = next;

    public async Task Invoke(HttpContext context)
    {
        var httpResponse = context.Response;
        var originalBody = httpResponse.Body;
        var newBody = new MemoryStream();
        httpResponse.Body = newBody;

        try
        {
            await Next(context);
        }
        catch (Exception)
        {
            // In this scenario, I would log out the actual error and am returning this "nice" error
            httpResponse.StatusCode = StatusCodes.Status500InternalServerError;
            httpResponse.ContentType = "application/json"; // I'm setting this because I might have a serialized object instead of a plain string
            httpResponse.Body = originalBody;
            await httpResponse.WriteAsync("We're sorry, but something went wrong with your request.");
            return;
        }

        // If everything worked
        newBody.Seek(0, SeekOrigin.Begin);
        var response = new StreamReader(newBody).ReadToEnd(); // This is the only way to read the existing response body
        httpResponse.Body = originalBody;
        await context.Response.WriteAsync(response);
    }
}
How would this work using PipeWriter? E.g. it seems that working with pipes instead of the underlying stream is preferable, but I cannot yet find any examples of how to use this to replace my code above.
Also, is there a scenario where I need to wait for the stream/pipe to finish writing before I can read it back out and/or replace it with a new string? I've never personally done this, but the examples of PipeReader I've seen seem to read things in chunks and check for IsComplete.
To update the HttpResponse:
private async Task WriteDataToResponseBodyAsync(PipeWriter writer, string jsonValue)
{
    // ask for a buffer big enough for the encoded payload
    // (GetMemory() with no hint may return less than we need for long strings)
    Memory<byte> workspace = writer.GetMemory(Encoding.ASCII.GetByteCount(jsonValue));

    // write the data to the workspace
    int bytes = Encoding.ASCII.GetBytes(jsonValue, workspace.Span);

    // tell the pipe how much of the workspace we actually want to commit
    writer.Advance(bytes);

    // this is **not** the same as Stream.Flush!
    await writer.FlushAsync();
}
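In ASP.NET Core 3.0 the response exposes its PipeWriter directly as context.Response.BodyWriter, so the helper above can be called like this:

await WriteDataToResponseBodyAsync(context.Response.BodyWriter, jsonValue);

For the reading half of the question, a minimal PipeReader loop might look like the sketch below (my own illustration, with the caveat that multi-byte characters could straddle segment boundaries, so this assumes ASCII-safe content): it drains the buffer in chunks and stops when IsCompleted is set.

private static async Task<string> ReadAllAsync(PipeReader reader)
{
    var builder = new StringBuilder();
    while (true)
    {
        ReadResult result = await reader.ReadAsync();

        // the buffer may consist of multiple segments
        foreach (ReadOnlyMemory<byte> segment in result.Buffer)
        {
            builder.Append(Encoding.ASCII.GetString(segment.Span));
        }

        // mark everything as consumed before asking for more
        reader.AdvanceTo(result.Buffer.End);

        if (result.IsCompleted)
        {
            break;
        }
    }

    await reader.CompleteAsync();
    return builder.ToString();
}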

Parallel HTTP requests in UWP app

I'm creating an app that needs to do parallel HTTP requests; I'm using HttpClient for this.
I'm looping over the URLs, and for each URL I start a new Task to do the request. After the loop I wait until every task finishes.
However, when I check the calls being made with Fiddler, I see that the requests are being made synchronously. It's not like a bunch of requests are made at once, but one by one.
I've searched for a solution and found that other people have experienced this too, but not with UWP. The solution was to increase the DefaultConnectionLimit on the ServicePointManager.
The problem is that ServicePointManager does not exist for UWP. I've looked through the APIs and thought I could set the DefaultConnectionLimit on HttpClientHandler, but no.
So I have a few questions:
Is DefaultConnectionLimit still a property that can be set somewhere?
If so, where do I set it?
If not, how do I increase the connection limit?
Is there still a connection limit in UWP?
This is my code:
var requests = new List<Task>();
var client = GetHttpClient();

foreach (var show in shows)
{
    requests.Add(Task.Factory.StartNew((x) =>
    {
        ((Show)x).NextEpisode = GetEpisodeAsync(((Show)x).NextEpisodeUri, client).Result;
    }, show));
}

await Task.WhenAll(requests.ToArray());
and this is the request:
public async Task<Episode> GetEpisodeAsync(string nextEpisodeUri, HttpClient client)
{
    try
    {
        if (String.IsNullOrWhiteSpace(nextEpisodeUri)) return null;

        HttpResponseMessage content = await client.GetAsync(nextEpisodeUri);
        if (content.IsSuccessStatusCode)
        {
            return JsonConvert.DeserializeObject<EpisodeWrapper>(await content.Content.ReadAsStringAsync()).Episode;
        }
    }
    catch (Exception ex)
    {
        Debug.WriteLine(ex.Message);
    }
    return null;
}
OK, I have the solution. I do need to use async/await inside the task. The problem was that I was using StartNew instead of Run, but I have to use StartNew because I'm passing along a state.
With StartNew, the inner task is not awaited unless you call Unwrap, i.e. Task.Factory.StartNew(...).Unwrap(). This way the Task.WhenAll() will wait until the inner task is complete. When you use Task.Run() you don't have to do this.
Task.Run vs Task.StartNew
The Stack Overflow answer
var requests = new List<Task>();
var client = GetHttpClient();

foreach (var show in shows)
{
    requests.Add(Task.Factory.StartNew(async (x) =>
    {
        ((Show)x).NextEpisode = await GetEpisodeAsync(((Show)x).NextEpisodeUri, client);
    }, show)
    .Unwrap());
}

Task.WaitAll(requests.ToArray());
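For comparison, here is a minimal sketch of the Task.Run equivalent mentioned above: the async lambda is unwrapped automatically, and capturing a local copy of the loop variable replaces the explicit state argument.

foreach (var show in shows)
{
    var s = show; // local copy instead of the StartNew state parameter
    requests.Add(Task.Run(async () =>
    {
        s.NextEpisode = await GetEpisodeAsync(s.NextEpisodeUri, client);
    }));
}

await Task.WhenAll(requests);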
I think an easier way to solve this is not "manually" starting requests but instead using LINQ with an async delegate to query the episodes and then set them afterwards.
You basically make it a two-step process:
Get all next episodes
Set them in the foreach
This also has the benefit of decoupling your querying code from the side effect of setting the show.
var shows = Enumerable.Range(0, 10).Select(x => new Show());
var client = new HttpClient();

(Show, Episode)[] nextEpisodes = await Task.WhenAll(shows
    .Select(async show =>
        (show, await GetEpisodeAsync(show.NextEpisodeUri, client))));

foreach ((Show Show, Episode Episode) tuple in nextEpisodes)
{
    tuple.Show.NextEpisode = tuple.Episode;
}
Note that I am using the new tuple syntax of C# 7. Change to the old tuple syntax accordingly if it is not available.
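If C# 7 tuples are not available, the same idea works with the older System.Tuple, for example:

var nextEpisodes = await Task.WhenAll(shows
    .Select(async show =>
        Tuple.Create(show, await GetEpisodeAsync(show.NextEpisodeUri, client))));

foreach (var tuple in nextEpisodes)
{
    tuple.Item1.NextEpisode = tuple.Item2;
}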

Why does GattCharacteristic.ReadValueAsync() return 20 zero bytes?

I have the following code I'm using to connect to a TruConnect Bobcat bluetooth module (https://ack.me/FAQs/What_are_the_UUID_s_used_by_TruConnect):
public async void Start()
{
    Guid RX_UUID = Guid.Parse("1cce1ea8-bd34-4813-a00a-c76e028fadcb");
    Guid TX_UUID = Guid.Parse("cacc07ff-ffff-4c48-8fae-a9ef71b75e26");
    Guid ServiceGuid = Guid.Parse("175f8f23-a570-49bd-9627-815a6a27de2a");

    var devices = await DeviceInformation.FindAllAsync(GattDeviceService.GetDeviceSelectorFromUuid(ServiceGuid));
    var service = await GattDeviceService.FromIdAsync(devices[0].Id);
    var TXCharacteristic = service.GetCharacteristics(TX_UUID)[0];

    GattReadResult result = await TXCharacteristic.ReadValueAsync(BluetoothCacheMode.Uncached);
    byte[] buffer = result.Value.ToArray();
}
The problem is that the buffer at the end always contains 20 zero bytes, even though my module is not sending anything, and despite result.Status turning out to be Success.
This is what I've boiled it down to after trying to make it run in a bigger app and getting the same result.
Another interesting thing I've noticed is that I've tried the same approach in a WindowsHubApplication and it worked. Now, in a Universal App, it doesn't.
I've also tried both Cached and Uncached modes.
Thanks in advance.
Problem solved!
I initially tried subscribing to the Characteristic.ValueChanged event, but the event handler never got called. Apparently it only works if you subscribe to the event AFTER you write the GattClientCharacteristicConfigurationDescriptorValue. (I have absolutely no idea why; maybe someone can shed some light on this, but for now it works.)
public async void Start()
{
    Guid RX_UUID = Guid.Parse("1cce1ea8-bd34-4813-a00a-c76e028fadcb");
    Guid TX_UUID = Guid.Parse("cacc07ff-ffff-4c48-8fae-a9ef71b75e26");
    Guid ServiceGuid = Guid.Parse("175f8f23-a570-49bd-9627-815a6a27de2a");

    var devices = await DeviceInformation.FindAllAsync(GattDeviceService.GetDeviceSelectorFromUuid(ServiceGuid));
    var service = await GattDeviceService.FromIdAsync(devices[0].Id);
    var TXCharacteristic = service.GetCharacteristics(TX_UUID)[0];
    var RXCharacteristic = service.GetCharacteristics(RX_UUID)[0];

    await TXCharacteristic.WriteClientCharacteristicConfigurationDescriptorAsync(GattClientCharacteristicConfigurationDescriptorValue.Notify);

    // subscribe only after the descriptor write, as described above
    TXCharacteristic.ValueChanged += TXCharacteristic_ValueChanged;
}

private void TXCharacteristic_ValueChanged(GattCharacteristic sender, GattValueChangedEventArgs args)
{
    byte[] buffer = args.CharacteristicValue.ToArray();
}
Now I get very nice 20 byte buffers filled with actual valid data.
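As a defensive tweak (my own sketch, not part of the original fix), the descriptor write returns a GattCommunicationStatus, so you can confirm it succeeded before wiring up the handler:

var status = await TXCharacteristic.WriteClientCharacteristicConfigurationDescriptorAsync(
    GattClientCharacteristicConfigurationDescriptorValue.Notify);

if (status == GattCommunicationStatus.Success)
{
    TXCharacteristic.ValueChanged += TXCharacteristic_ValueChanged;
}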
