I'm starting out with Redis and the StackExchange.Redis client, and I'm wondering whether I'm getting the best performance when fetching multiple items at once from Redis.
Situation:
I have an ASP.NET MVC web application that shows a personal calendar on the user's dashboard. Because the dashboard is the landing page, it's heavily used.
To show the calendar items, I first get all calendar item IDs for that particular month:
RedisManager.RedisDb.StringGet("calendaritems_2016_8");
// this returns a JSON-serialized List<int>
Then, for each calendar item ID, I build the corresponding cache key:
"CalendarItemCache_1"
"CalendarItemCache_2"
"CalendarItemCache_3"
etc.
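For illustration, a minimal sketch of how that key list might be built (hedged; assumes Json.NET, and json here is the serialized List<int> returned by the StringGet call above):

// Deserialize the month's ID list, then project each ID to its cache key.
var ids = JsonConvert.DeserializeObject<List<int>>(json);
var cacheKeys = ids.Select(id => $"CalendarItemCache_{id}").ToList();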
With this collection I reach out to Redis with a generic function:
var multipleItems = CacheHelper.GetMultiple<CalendarItemCache>(cacheKeys);
That's implemented like:
public List<T> GetMultiple<T>(List<string> keys) where T : class
{
    var taskList = new List<Task>();
    var returnList = new ConcurrentBag<T>();

    foreach (var key in keys)
    {
        Task<T> stringGetAsync = RedisManager.RedisDb.StringGetAsync(key)
            .ContinueWith(task =>
            {
                if (!string.IsNullOrWhiteSpace(task.Result))
                {
                    var deserializeFromJson = CurrentSerializer.Serializer.DeserializeFromJson<T>(task.Result);
                    returnList.Add(deserializeFromJson);
                    return deserializeFromJson;
                }

                return null;
            });

        taskList.Add(stringGetAsync);
    }

    Task.WaitAll(taskList.ToArray());
    return returnList.ToList();
}
Am I implementing pipelining correctly? The redis-cli MONITOR output shows:
1472728336.718370 [0 127.0.0.1:50335] "GET" "CalendarItemCache_1"
1472728336.718389 [0 127.0.0.1:50335] "GET" "CalendarItemCache_2"
etc.
I'm expecting some kind of MGET command.
Many thanks in advance.
I noticed an overload of StringGet that accepts a RedisKey[]. Using this, I see an MGET command in the monitor.
public List<T> GetMultiple<T>(List<string> keys) where T : class
{
    var list = new List<RedisKey>(keys.Count);
    foreach (var key in keys)
    {
        list.Add(key);
    }

    RedisValue[] result = RedisManager.RedisDb.StringGet(list.ToArray());
    var redisValues = result.Where(x => x.HasValue);
    var multiple = redisValues.Select(x => DeserializeFromJson<T>(x)).ToList();
    return multiple;
}
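The same method can be tightened up with LINQ if you prefer (a hedged sketch; behavior unchanged, relying on the implicit string-to-RedisKey conversion):

public List<T> GetMultiple<T>(List<string> keys) where T : class
{
    // Convert the string keys and issue a single MGET.
    RedisKey[] redisKeys = keys.Select(k => (RedisKey)k).ToArray();
    RedisValue[] result = RedisManager.RedisDb.StringGet(redisKeys);

    // Skip missing keys, deserialize the rest.
    return result.Where(v => v.HasValue)
        .Select(v => DeserializeFromJson<T>(v))
        .ToList();
}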
We have a static class to store certain data at the application level. We set these property values when the user logs in, and they are used wherever they are required.
In Blazor WASM, which approach is suitable for my case above? I'd appreciate a sample.
Thanks.
I'm currently using a claims-based approach (with Azure B2C authentication), where I'm storing the values I need in claims:
string oid = "information to store";
(context.Principal.Identity as System.Security.Claims.ClaimsIdentity).AddClaim(new System.Security.Claims.Claim("claim_name", oid));
Set the info after login:
options.Events = new OpenIdConnectEvents
{
    OnTokenValidated = async context =>
    {
        // SET THE INFO HERE
    }
};
Retrieve it by:
public string get_info_in_claims(AuthenticationState authState)
{
    string ret = "";
    foreach (var cla in authState.User.Claims)
    {
        if (cla.Type == "claim_name")
        {
            ret = cla.Value;
        }
    }
    return ret;
}
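For completeness, a hedged sketch of calling this from a Blazor component, assuming the standard CascadingAuthenticationState setup so an AuthenticationState can be cascaded in:

// In a component's @code block (hypothetical usage).
[CascadingParameter]
private Task<AuthenticationState> AuthenticationStateTask { get; set; }

protected override async Task OnInitializedAsync()
{
    var authState = await AuthenticationStateTask;
    var info = get_info_in_claims(authState); // the helper shown above
}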
We have a web application written in .NET Core (currently 2.2) with Angular as the frontend.
If I make an AJAX call to one route in the backend, which in turn opens a DbContext to perform a query, all subsequent AJAX calls to any other route are held up until the query of the first controller route is done. (No, it's not a DB lock in the SQL Server; the queries hit different tables.)
Example of the code in the first route (which, for the purpose of the example, takes 20 seconds):
public IActionResult GetBusinessesByNaceAndAmount(int take)
{
    using (ConsumentContext consumentContext = new ConsumentContext())
    {
        var data = consumentContext.Businesses
            .AsNoTracking()
            .Where(b => b.Established_date != null)
            .GroupBy(b => new { Code = b.Business_code.Substring(0, 2) })
            .Select(b => new
            {
                BusinessName = b.First().Business_code.Substring(0, 2),
                Businesses = b.Where(bl => bl.Established_date != null)
                    .OrderBy(bl => bl.Established_date)
                    .Select(bl => new { BusinessName = bl.Name, Amount = 10 })
                    .Take(10).ToList(),
            })
            .Take(take)
            .ToList();

        return Ok(data);
    }
}
Then I perform another call to this endpoint, one millisecond later, from the frontend:
public IActionResult GetCustomers()
{
    using (ConsumentContext consumentContext = new ConsumentContext())
    {
        var customers = consumentContext.Customers.AsNoTracking().Take(5).ToList();
        return Ok(customers);
    }
}
Even though the query in the second endpoint only takes a few milliseconds, its TTFB is held up until the first one is done.
I don't know if it's relevant, but our backend currently runs in a Linux environment (Docker container) and communicates over TCP/IP with our MSSQL server (yes, it's locked down in the firewall).
It looks like either your server is running out of free threads to process the action, or your Angular application is not making the API calls simultaneously but sequentially.
To free up threads during a long-running DB call, you can try changing your first action to an async action so the thread is not blocked, e.g.:
public async Task<IActionResult> GetBusinessesByNaceAndAmount(int take, CancellationToken token)
{
using (ConsumentContext consumentContext = new ConsumentContext())
{
var data = await consumentContext.Businesses
.AsNoTracking()
.Where(b => b.Established_date != null)
.GroupBy(b => new { Code = b.Business_code.Substring(0, 2) })
.Select(b => new
{
BusinessName = b.First().Business_code.Substring(0, 2),
Businesses = b.Where(bl => bl.Established_date != null)
.OrderBy(bl => bl.Established_date)
.Select(bl => new { BusinessName = bl.Name, Amount = 10})
.Take(10).ToList(),
})
.Take(take)
.ToListAsync(token);
return Ok(data);
}
}
This could help if your server is running out of threads to process the action.
You should also verify your Angular code. If your application waits for the result of the first API call before issuing the next, the code above won't help: you should make all the calls simultaneously.
I'm setting up some tests to automatically check that our DI has been configured correctly. In particular, we want to ensure that the lifestyles of dependencies match (so we don't get transients injected into singletons) and to avoid using the service locator as much as possible, relying on constructor injection instead.
In the past we've used Castle.Windsor as our service provider, which comes with diagnostic classes and functions to help catch these problems. Are there similar functions for MS.DI, or is it something we'll have to roll ourselves?
While I agree with the advice from Steven & Chris, developers are not always in charge of the infrastructure they have to use. Where possible I will be pushing for Castle.Windsor, since it's what my team is most familiar with, but in this case we managed to cobble together the tests we wanted ourselves.
I'm presenting the test pattern here in case someone stumbles across this question and has the same struggles, but please do consider using a better DI provider, especially if you're still at the start of your project.
[Test]
public void Assert_Lifetimes_Are_Consistent()
{
    var missing = new List<string>();
    var errors = new HashSet<Tuple<string, string>>();
    foreach (var service in _serviceCollection.Where(s => IsInYourAssembly(s.ServiceType)))
    {
        var serviceLifetimeRanking = LifetimeRanking(service.Lifetime);
        foreach (var fieldInfo in ((System.Reflection.TypeInfo)service.ServiceType).DeclaredFields
            .Where(fi => fi.FieldType.IsAbstract && IsInYourAssembly(fi.FieldType)))
        {
            var dependencyLifetime = _serviceCollection.SingleOrDefault(s => s.ServiceType == fieldInfo.FieldType)?.Lifetime;
            if (dependencyLifetime == null)
            {
                missing.Add($"No service found for {fieldInfo.FieldType.FullName} as a dependency for {service.ServiceType.FullName}");
                continue;
            }

            var dependencyLifetimeRanking = LifetimeRanking(dependencyLifetime.Value);
            if (dependencyLifetimeRanking > serviceLifetimeRanking)
                errors.Add(
                    Tuple.Create(
                        $"{service.ServiceType.Name} ({service.Lifetime})",
                        $"{fieldInfo.FieldType.Name} ({dependencyLifetime})"
                    )
                );
        }
    }

    if (missing.Any() || errors.Any())
    {
        var sb = new StringBuilder();
        sb.AppendJoin(Environment.NewLine, missing);
        if (errors.Any())
        {
            sb.AppendLine();
            sb.AppendLine("The following dependency pairs have inconsistent lifestyles:");
            sb.AppendLine(string.Join(Environment.NewLine, errors.Select(err => $"{err.Item1} -> {err.Item2}")));
        }
        Assert.Fail(sb.ToString());
    }
}
private bool IsInYourAssembly(Type type)
{
    return (type.Assembly.FullName?.IndexOf("YOUR_PROJECT_ASSEMBLY_HERE") ?? -1) == 0;
}
private int LifetimeRanking(ServiceLifetime serviceLifetime)
{
    switch (serviceLifetime)
    {
        case ServiceLifetime.Singleton:
            return 1;
        case ServiceLifetime.Scoped:
            return 2;
        case ServiceLifetime.Transient:
            return 3;
        default:
            throw new ArgumentOutOfRangeException(nameof(serviceLifetime), serviceLifetime,
                "Value is not a known member of the ServiceLifetime enum");
    }
}
If the test fails, it returns a list of missing dependencies followed by a list of incompatible dependency lifetimes.
The _serviceCollection field needs to be populated and Startup(config, env).ConfigureServices(_serviceCollection); needs to be called before running the test, as sketched below.
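A minimal sketch of that setup (hedged; assumes an NUnit fixture and a Startup constructor that takes an IConfiguration, with the empty ConfigurationBuilder standing in for your real settings):

private IServiceCollection _serviceCollection;

[OneTimeSetUp]
public void BuildServiceCollection()
{
    // Empty configuration; substitute your real settings if ConfigureServices reads them.
    var config = new ConfigurationBuilder().Build();
    _serviceCollection = new ServiceCollection();

    // If your Startup also takes a hosting-environment argument, pass it here too.
    new Startup(config).ConfigureServices(_serviceCollection);
}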
IsInYourAssembly is an important function for filtering out all the generic types that are also returned in the _serviceCollection.
In general, I want to export data from an ASP.NET MVC application to Google Sheets, for example a list of people. I've already set up the connection and authenticated the app with my Google account (through OAuth2), but now I'm trying to send my list of objects to the API and then handle it in a script (by putting all the data in a new file), and I can't get my head around this.
Here is some sample code in my app that sends the request.
public async Task<ActionResult> SendTestData()
{
    var result = await new AuthorizationCodeMvcApp(this, new AppFlowMetadata())
        .AuthorizeAsync(CancellationToken.None);
    if (result.Credential != null)
    {
        string scriptId = "MY_SCRIPT_ID";
        var service = new ScriptService(new BaseClientService.Initializer
        {
            HttpClientInitializer = result.Credential,
            ApplicationName = "Test"
        });

        IList<object> parameters = new List<object>();
        var people = new List<Person>(); // next I'm selecting data from db.Person into this variable
        parameters.Add(people);

        ExecutionRequest req = new ExecutionRequest();
        req.Function = "testFunction";
        req.Parameters = parameters;

        ScriptsResource.RunRequest runReq = service.Scripts.Run(req, scriptId);
        try
        {
            Operation op = runReq.Execute();
            if (op.Error != null)
            {
                // The API executed, but the script returned an error.
                // Extract the first (and only) set of error details as an
                // IDictionary. The values of this dictionary are the script's
                // 'errorMessage' and 'errorType', and an array of stack trace
                // elements. Casting the array as a JSON JArray allows the
                // trace elements to be accessed directly.
                IDictionary<string, object> error = op.Error.Details[0];
                if (error["scriptStackTraceElements"] != null)
                {
                    // There may not be a stacktrace if the script didn't start executing.
                    Newtonsoft.Json.Linq.JArray st =
                        (Newtonsoft.Json.Linq.JArray)error["scriptStackTraceElements"];
                }
            }
            else
            {
                // The result provided by the API needs to be cast into the
                // correct type, based upon what types the Apps Script function
                // returns. Here, the function returns an Apps Script Object
                // with String keys and values. It is most convenient to cast
                // the return value as a JSON JObject (folderSet).
                Newtonsoft.Json.Linq.JObject folderSet =
                    (Newtonsoft.Json.Linq.JObject)op.Response["result"];
            }
        }
        catch (Google.GoogleApiException e)
        {
            // The API encountered a problem before the script started executing.
            AddAlert(Severity.error, e.Message);
        }

        return RedirectToAction("Index", "Controller");
    }
    else
    {
        return new RedirectResult(result.RedirectUri);
    }
}
The next question is how to handle this data in the script: is it serialized to JSON there?
The Execution API calls are essentially REST calls, so the payload should be serialized accordingly. Stringified JSON is typically fine. Your GAS function should then parse that payload to consume the encoded lists:
var data = JSON.parse(payload);
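On the C# side, that suggests serializing the list yourself before adding it to the request parameters. A hedged sketch, reusing the Newtonsoft.Json dependency already present in the question's code:

// Send the people list as a single stringified-JSON parameter;
// the Apps Script function then calls JSON.parse on it, as shown above.
IList<object> parameters = new List<object>
{
    Newtonsoft.Json.JsonConvert.SerializeObject(people)
};
req.Parameters = parameters;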
I'm creating an app that requires doing parallel HTTP requests, and I'm using HttpClient for this.
I'm looping over the URLs, and for each URL I start a new Task to do the request.
After the loop I wait until every task finishes.
However, when I check the calls being made with Fiddler, I see that the requests are being made sequentially. It's not like a bunch of requests are being made at once, but one by one.
I've searched for a solution and found that other people have experienced this too, but not with UWP. The solution was to increase the DefaultConnectionLimit on the ServicePointManager.
The problem is that ServicePointManager does not exist in UWP. I've looked through the APIs and thought I could set the DefaultConnectionLimit on HttpClientHandler, but no.
So I have a few questions:
Is DefaultConnectionLimit still a property that can be set somewhere?
If so, where do I set it?
If not, how do I increase the connection limit?
Is there still a connection limit in UWP?
This is my code:
var requests = new List<Task>();
var client = GetHttpClient();
foreach (var show in shows)
{
    requests.Add(Task.Factory.StartNew((x) =>
    {
        ((Show)x).NextEpisode = GetEpisodeAsync(((Show)x).NextEpisodeUri, client).Result;
    }, show));
}
await Task.WhenAll(requests.ToArray());
and this is the request:
public async Task<Episode> GetEpisodeAsync(string nextEpisodeUri, HttpClient client)
{
    try
    {
        if (String.IsNullOrWhiteSpace(nextEpisodeUri)) return null;
        HttpResponseMessage content = await client.GetAsync(nextEpisodeUri);
        if (content.IsSuccessStatusCode)
        {
            return JsonConvert.DeserializeObject<EpisodeWrapper>(await content.Content.ReadAsStringAsync()).Episode;
        }
    }
    catch (Exception ex)
    {
        Debug.WriteLine(ex.Message);
    }
    return null;
}
OK, I have the solution. I do need to use async/await inside the task. The problem was that I was using StartNew instead of Run, but I have to use StartNew because I'm passing along state.
With StartNew, the task inside the task is not awaited unless you call Unwrap, i.e. Task.Factory.StartNew(...).Unwrap(). This way Task.WhenAll() will wait until the inner task is complete.
When you use Task.Run() you don't have to do this.
Task.Run vs Task.StartNew
The Stack Overflow answer
var requests = new List<Task>();
var client = GetHttpClient();
foreach (var show in shows)
{
    requests.Add(Task.Factory.StartNew(async (x) =>
    {
        ((Show)x).NextEpisode = await GetEpisodeAsync(((Show)x).NextEpisodeUri, client);
    }, show)
    .Unwrap());
}
Task.WaitAll(requests.ToArray());
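For comparison, a hedged sketch of the Task.Run variant mentioned above: it closes over the loop variable instead of passing state, and needs no Unwrap because Task.Run unwraps the inner task automatically:

var requests = new List<Task>();
var client = GetHttpClient();
foreach (var show in shows)
{
    requests.Add(Task.Run(async () =>
    {
        // foreach loop variables are scoped per iteration in C# 5+,
        // so capturing show directly in the closure is safe.
        show.NextEpisode = await GetEpisodeAsync(show.NextEpisodeUri, client);
    }));
}
await Task.WhenAll(requests);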
I think an easier way to solve this is not to start the requests "manually" but instead to use LINQ with an async delegate to query the episodes and then set them afterwards.
You basically make it a two-step process:
Get all next episodes.
Set them in the foreach.
This also has the benefit of decoupling your querying code from the side effect of setting the show.
var shows = Enumerable.Range(0, 10).Select(x => new Show());
var client = new HttpClient();

(Show, Episode)[] nextEpisodes = await Task.WhenAll(shows
    .Select(async show =>
        (show, await GetEpisodeAsync(show.NextEpisodeUri, client))));

foreach ((Show Show, Episode Episode) tuple in nextEpisodes)
{
    tuple.Show.NextEpisode = tuple.Episode;
}
Note that I am using the new tuple syntax of C# 7. Change to the old tuple syntax accordingly if it is not available.