I've implemented the Server Broadcast with SignalR tutorial, and my next step is to hook it up to a SQL database via EF code first.
In the StockTicker class the authors write the following code:
foreach (var stock in _stocks.Values)
{
    if (TryUpdateStockPrice(stock))
    {
        BroadcastStockPrice(stock);
    }
}
The application I am working on needs a real-time push news feed with a small audience (around 300 users). What would be the disadvantages of simply doing something like this (pseudo):
foreach (var message in _db.Messages.Where(x => x.Status == "New"))
{
    BroadcastMessage(message);
}
and what would be the best way to update each message status in the DB to != New without totally compromising performance?
I think the best way to determine whether or not your simple solution compromises performance too much is to try it out.
Something like the following should work for updating each message status.
foreach (var message in _db.Messages.Where(x => x.Status == "New"))
{
    BroadcastMessage(message);
    message.Status = "Read";
}
_db.SaveChanges();
If you find this is too inefficient, you could always write a stored procedure that will select new messages and mark them as read.
It might be better to fine-tune performance by adjusting the rate at which you poll the database and by batching messages, so that you make a single SignalR broadcast per DB query even when the query returns multiple new messages.
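A rough sketch of that polling-and-batching idea follows. MessageTicker, MessagesContext, Message, and BroadcastMessages are placeholders standing in for your own EF context, entity, and SignalR broadcast call, and the 2-second interval is arbitrary:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;

public class MessageTicker
{
    // MessagesContext, Message, and the SignalR broadcast are placeholders for your own types.
    private readonly MessagesContext _db = new MessagesContext();
    private Timer _timer;

    public void StartPolling()
    {
        // Poll every 2 seconds; tune the interval to trade latency against DB load.
        _timer = new Timer(CheckForNewMessages, null, TimeSpan.Zero, TimeSpan.FromSeconds(2));
    }

    private void CheckForNewMessages(object state)
    {
        var newMessages = _db.Messages.Where(x => x.Status == "New").ToList();
        if (newMessages.Count == 0)
            return;

        // One hub call for the whole batch instead of one call per row.
        BroadcastMessages(newMessages);

        foreach (var message in newMessages)
        {
            message.Status = "Read";
        }
        _db.SaveChanges();
    }

    private void BroadcastMessages(List<Message> messages)
    {
        // e.g. GlobalHost.ConnectionManager.GetHubContext<MessagesHub>()
        //          .Clients.All.updateMessages(messages);
    }
}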
If you decide to go the stored proc route, here is a fairly in-depth article about using stored procedures with EF: http://msdn.microsoft.com/en-us/data/gg699321.aspx
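For reference, a minimal sketch of calling such a proc from an EF 6 DbContext; dbo.DequeueNewMessages is a hypothetical procedure that returns the rows with Status = 'New' and marks them 'Read' in the same round trip:
// dbo.DequeueNewMessages is a hypothetical proc that selects the 'New' rows and
// marks them 'Read' in one round trip (EF 6 DbContext.Database.SqlQuery shown).
var newMessages = _db.Database
                     .SqlQuery<Message>("EXEC dbo.DequeueNewMessages")
                     .ToList();
foreach (var message in newMessages)
{
    BroadcastMessage(message);
}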
I would like to ask a question. I want to call a service inside a foreach loop, but I think that when I call the service to fetch menu items from my database, the service may open a new connection each time. If so, what should I use instead? What happens once there are 1,000 items? Could the page slow down because of this?
@foreach (var parent in Model)
{
    var menuItem = _uow.Menu.GetById(parent.Id);
    if (menuItem != null)
    {
        <span>@menuItem.Title</span>
    }
}
The service interface:
T GetById(int id);
The Entity Framework implementation:
public T GetById(int id)
{
return _context.Set<T>().Find(id);
}
Thank you.
Individually retrieving 1,000 items performs 1,000 calls to the database, which is slower than a single query that fetches everything in one call (it's similar to performing 1,000 API calls versus a single one).
A solution to your issue would be to change your service so that you can query multiple ids at once:
var menuItems = _uow.Menu.GetByIds(Model.Select(parent => parent.Id));
and the service:
public IEnumerable<T> GetByIds(IEnumerable<int> ids)
{
    // Assumes T exposes an Id property; this translates to a single WHERE Id IN (...) query.
    return _context.Set<T>().Where(t => ids.Contains(t.Id));
}
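To keep the original view loop intact, one option is to fetch everything in a single query up front and index it by Id. This is only a sketch and assumes MenuItem exposes Id and Title as in the view above:
// One query via GetByIds, then O(1) in-memory lookups inside the loop.
var menuItems = _uow.Menu
                    .GetByIds(Model.Select(parent => parent.Id))
                    .ToDictionary(m => m.Id);
// Inside the Razor loop, look items up locally instead of hitting the database:
// if (menuItems.TryGetValue(parent.Id, out var menuItem)) { <span>@menuItem.Title</span> }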
From your code's point of view, each loop iteration accesses the database once; if there are 1,000 items in the model, you access the database 1,000 times.
Assuming each database call takes 0.01 seconds to return, 1,000 iterations spend 10 seconds in the database alone. This is obviously unreasonable, because the time grows in proportion to the number of items in the foreach.
Suggestion
If a certain kind of data needs to be accessed frequently, it is usually better to store it in a cache, such as a Redis cache or MemoryCache. This can optimize speed.
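For example, a minimal sketch using ASP.NET Core's IMemoryCache; MenuItem, IMenuRepository, the cache key, and the 5-minute expiry are assumptions standing in for your own types and refresh policy:
using System;
using System.Collections.Generic;
using Microsoft.Extensions.Caching.Memory;

// MenuItem and IMenuRepository stand in for your own entity and data-access types.
public class MenuItem
{
    public int Id { get; set; }
    public string Title { get; set; }
}

public interface IMenuRepository
{
    List<MenuItem> GetAll();
}

public class CachedMenuService
{
    private readonly IMemoryCache _cache;
    private readonly IMenuRepository _menu;

    public CachedMenuService(IMemoryCache cache, IMenuRepository menu)
    {
        _cache = cache;
        _menu = menu;
    }

    public List<MenuItem> GetAll()
    {
        // The first call fills the cache with one DB query; later calls are served
        // from memory until the entry expires.
        return _cache.GetOrCreate("menu-items", entry =>
        {
            entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
            return _menu.GetAll();
        });
    }
}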
I have a requirement where I receive a batch of records. I have to disassemble them and insert the data into a DB, which I have completed. But I don't want any message to come out of the pipeline except the last, custom-made message.
I have extended FFDasm and called Disassemble(); then GetNext() returns every debatched message, and they fail because there are no subscribers. I want GetNext() to send nothing out until the last message.
Please help if anyone has already implemented this requirement. Thanks!
If you want to send only one message from GetNext, you have to override the Disassemble method, call the base Disassemble, and retrieve all of the messages there (you can enqueue them to manage them later in GetNext):
public new void Disassemble(IPipelineContext pContext, IBaseMessage pInMsg)
{
    try
    {
        base.Disassemble(pContext, pInMsg);
        IBaseMessage message = base.GetNext(pContext);
        while (message != null)
        {
            // Only store one message
            if (this.messagesCount == 0)
            {
                // _messages is a Queue<IBaseMessage>
                this._messages.Enqueue(message);
                this.messagesCount++;
            }
            message = base.GetNext(pContext);
        }
    }
    catch (Exception ex)
    {
        // Manage errors
    }
}
Then, in the GetNext method, you have the queue and can return whatever you want (returning null when the queue is empty tells the pipeline there are no more messages):
public new IBaseMessage GetNext(IPipelineContext pContext)
{
    // Return null when the queue is empty to signal there are no more messages.
    return _messages.Count > 0 ? _messages.Dequeue() : null;
}
The recommended approach is to publish messages after the disassemble stage to the BizTalk MessageBox database and use a DB adapter to insert them into the database. Publishing messages to the MessageBox and using an adapter gives you more options for design/performance and decouples your DB insert from the receive logic. Also, if in the future you want to reuse the same message for something else, you would be able to do so.
Even then, if for any reason you have to insert from the pipeline component, do the following:
Please note, the GetNext() method of the disassembler interface is not invoked until the Disassemble() method is complete. Based on this, you can use the following approach, assuming you have encapsulated FFDASM within your own custom component:
Insert all disassembled messages in the Disassemble method itself, and enqueue only the last message in a Queue class variable. In GetNext(), return the dequeued message, and return null when the queue is empty. You can optimize the DB insert by inserting multiple rows at a time and saving them in batches, depending on volume. Please note this approach may encounter performance issues depending on the size of the file and the number of rows being inserted into the DB.
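A minimal sketch of that approach, assuming a custom component that wraps FFDasm as in the earlier answer; InsertIntoDb is a placeholder for your own ADO.NET insert (ideally batched):
private readonly Queue<IBaseMessage> _messages = new Queue<IBaseMessage>();

public new void Disassemble(IPipelineContext pContext, IBaseMessage pInMsg)
{
    base.Disassemble(pContext, pInMsg);

    IBaseMessage current = base.GetNext(pContext);
    IBaseMessage last = null;
    while (current != null)
    {
        InsertIntoDb(current);   // insert every debatched message into the DB
        last = current;          // remember only the most recent message
        current = base.GetNext(pContext);
    }

    if (last != null)
    {
        _messages.Enqueue(last); // only the last message leaves the pipeline
    }
}

public new IBaseMessage GetNext(IPipelineContext pContext)
{
    // Returning null tells the engine there are no more messages.
    return _messages.Count > 0 ? _messages.Dequeue() : null;
}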
I am calling DBInsert SP from GetNext()
Oh...so...sorry to say, but you're doing it wrong and actually creating a bunch of problems doing this. :(
This is a very basic scenario to cover with BizTalk Server. All you need is:
A Pipeline Component to Promote BTS.InterchangeID (a rough sketch of the promotion call is at the end of this answer)
A Sequential Convoy Orchestration Correlating on BTS.InterchangeID and using Ordered Delivery.
In the Orchestration, call the SP, transform to SOAP, call the SOAP endpoint, whatever you need.
As you process the Messages, check for BTS.LastInterchangeMessage, then perform your close-out logic.
To be 100% clear, there are no practical 'performance' issues here. By guessing about 'performance', you've actually created the problem you were trying to solve, and created a bunch of support issues for later on, sorry again. :( There is no reason not to use an Orchestration.
As noted, 25K records isn't a lot. Be sure to have the Receive Location and Orchestration in different Hosts.
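For the first bullet, a rough sketch of just the promotion step (the rest of the pipeline component plumbing is omitted); it reads the InterchangeID the engine has already written to the context and promotes it so the sequential convoy can correlate on it:
private const string BtsSystemPropertiesNs =
    "http://schemas.microsoft.com/BizTalk/2003/system-properties";

public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
{
    // Read the interchange id already written to the message context...
    object interchangeId = pInMsg.Context.Read("InterchangeID", BtsSystemPropertiesNs);
    if (interchangeId != null)
    {
        // ...and promote it so the orchestration can correlate on BTS.InterchangeID.
        pInMsg.Context.Promote("InterchangeID", BtsSystemPropertiesNs, interchangeId);
    }
    return pInMsg;
}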
I would like to get the gyroscope value of a smartphone and send it to another one. I managed to set a value and retrieve it; however, the result is very laggy. Is the following method correct?
If not, what can I change?
If yes, is there another way to set a value and retrieve it in real time in Unity?
// UPDATING THE VALUE
reference = FirebaseDatabase.DefaultInstance.RootReference;
Dictionary<string, object> gyro = new Dictionary<string, object>();
gyro["keyToUpdate"] = valueToUpdate;
reference.Child("parentKey").UpdateChildrenAsync(gyro);

// RETRIEVING THE VALUE
FirebaseDatabase.DefaultInstance
    .GetReference("parentKey")
    .Child("keyToUpdate")
    .GetValueAsync().ContinueWith(task => {
        if (task.IsFaulted) {
            Debug.Log("error");
        }
        else if (task.IsCompleted) {
            DataSnapshot snapshot = task.Result;
            float valueUpdated = float.Parse(snapshot.Value.ToString());
            Debug.Log(valueUpdated);
        }
    });
Firebase is fundamentally slower than you think it is. This is within its performance boundaries.
With any asynchronous calls, you can never be sure how fast or slow you may receive a response. Keep in mind that Firebase is routed through a system which is layered with elements for dealing with things like authentication and decentralized data.
If you continue to use Firebase, you'll need to make sure your code and UI is set up to allow for possibly long delays which are out of your control. Or you could spend lots of time building your own infrastructure, as DoctorPangloss mentioned.
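If you do stay with Firebase, one pattern worth noting (a sketch only, reusing the same hypothetical "parentKey"/"keyToUpdate" keys as above) is to subscribe to ValueChanged so the database pushes updates to you instead of you polling GetValueAsync:
using Firebase.Database;
using UnityEngine;

public class GyroListener : MonoBehaviour
{
    private DatabaseReference _gyroRef;

    void Start()
    {
        _gyroRef = FirebaseDatabase.DefaultInstance
            .GetReference("parentKey")
            .Child("keyToUpdate");
        _gyroRef.ValueChanged += HandleValueChanged;
    }

    void HandleValueChanged(object sender, ValueChangedEventArgs args)
    {
        if (args.DatabaseError != null)
        {
            Debug.LogError(args.DatabaseError.Message);
            return;
        }
        // Fires every time the value changes on the server.
        float valueUpdated = float.Parse(args.Snapshot.Value.ToString());
        Debug.Log(valueUpdated);
    }

    void OnDestroy()
    {
        if (_gyroRef != null)
        {
            _gyroRef.ValueChanged -= HandleValueChanged;
        }
    }
}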
I had to write a Web API to insert data into a custom on-premises DB and then call a stored procedure, for Logic Apps to use. The Logic Apps call times out when passing large amounts of data, so I'm trying to use this solution I found here:
LogicAppsAsyncResponseSample
So I would basically put all my code into the doWork like this:
foreach (var record in records)
{
    ...
    // Insert record
    cmd.ExecuteNonQuery();
}
...
// Call SP
cmd.ExecuteNonQuery();
runningTasks[id] = true;
My question is: should I make my code in doWork asynchronous? That is, use await as needed, ExecuteNonQueryAsync instead of ExecuteNonQuery, and add AsynchronousProcessing to my connection string?
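Roughly, the asynchronous version I have in mind looks like this; the table/proc names, connection string, and the records/runningTasks variables are placeholders from my own code, not from the sample:
// Requires: using System.Data; using System.Data.SqlClient; using System.Threading.Tasks;
private static async Task doWorkAsync(string id)
{
    using (var conn = new SqlConnection(connectionString))
    {
        await conn.OpenAsync();

        foreach (var record in records)
        {
            // Table and column names are placeholders.
            using (var cmd = new SqlCommand("INSERT INTO dbo.MyTable (Col1) VALUES (@Col1)", conn))
            {
                cmd.Parameters.AddWithValue("@Col1", record.Col1);
                await cmd.ExecuteNonQueryAsync();
            }
        }

        // Stored procedure name is a placeholder.
        using (var cmd = new SqlCommand("dbo.MyStoredProc", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            await cmd.ExecuteNonQueryAsync();
        }
    }

    runningTasks[id] = true;
}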
Alternatively, I was actually considering writing this as "fire and forget", meaning I would start a thread in my API to call doWork as in the sample and return OK instead of Accepted right away. Then I wouldn't need to store thread statuses or have the checkStatus method. This is OK for me since the API can send alerts if anything fails. The only advantage of the noted sample is that I could eventually return something to Logic Apps indicating success or failure and show it in my Logic Apps log (one place to see everything). Is "fire and forget" a sound practice?
FYI: the call to doWork in the sample is:
new Thread(() => doWork(id)).Start();
I'm using .Net backend Azure Mobile Services on a Windows Phone app. I've added the offline services that Azure Mobile Services provides to the code for using SQLite.
The app makes successful Push calls (I can see the data in my database and they exist in the local db created by Azure Mobile Offline Services).
On PullAsync calls, the client makes the right call to the service (the table controller), and the service calculates the results and returns multiple rows from the database. However, the results are lost on the way: my client app receives an empty JSON message (I double-checked it with Fiddler).
IMobileServiceSyncTable<Contact> _contactTable;
/* Table Initialization Code */
var contactQuery = _contactTable.Where(c => c.Owner == Service.UserId);
await _contactTable.PullAsync("c0", contactQuery);
var contacts = await _contactTable.Select(c => c).ToListAsync();
Any suggestion on how I can investigate this issue?
Update
The code above uses incremental sync by passing a query ID of "c0". Passing null as the first argument to PullAsync disables incremental sync and makes it return all the rows, which works as expected.
await _contactTable.PullAsync(null, contactQuery);
But I'm still not able to get incremental sync to return the rows when the app is reinstalled.
I had a similar issue with this; I found that the problem was caused by making different sync calls against the same table.
In my App I have a list of local users, and I only want to pull down the details for the users I have locally.
So I was issuing this call in a loop
userTbl.PullAsync("Updating User Table", userTbl.CreateQuery().Where(x => x.Id == grp1.UserId || x.Id == LocalUser.Id)
What I found is that by adding the user Id to the query ID, I overcame the issue:
userTbl.PullAsync("User-" + grp1.UserId, userTbl.CreateQuery().Where(x => x.Id == grp1.UserId || x.Id == LocalUser.Id)
As a side note, depending on your Id format, the query ID string you supply has to be less than 50 characters in length.
:-)
The .NET backend is set up to not send JSON for fields that have the default value (e.g., zero for integers). There's a bug in the interaction with the offline SDK. As a workaround, you should set up the default value handling in your Web API config:
config.Formatters.JsonFormatter.SerializerSettings.DefaultValueHandling = Newtonsoft.Json.DefaultValueHandling.Include;
You should also try doing similar queries using the online SDK (use IMobileServiceTable instead of the sync table); that will help narrow down the problem.
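For example, something along these lines, where client is your MobileServiceClient instance and Contact/Service.UserId are the same types used in the question:
// Same query against the live table via the online SDK; if this returns the rows
// but the incremental PullAsync does not, the issue is in the sync/serialization
// layer rather than in the backend query.
IMobileServiceTable<Contact> onlineTable = client.GetTable<Contact>();
var contacts = await onlineTable
    .Where(c => c.Owner == Service.UserId)
    .ToListAsync();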
This is a very useful tool for debugging the deserialisation: use a custom delegating handler in your MobileServiceClient instance.
public class MyHandler : DelegatingHandler
{
    protected override async Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
    {
        // Request happens here
        var response = await base.SendAsync(request, cancellationToken);
        // Read response content and try to deserialise it here...
        return response;
    }
}
// In your mobile client code:
var client = new MobileServiceClient("https://xxx.azurewebsites.net", new MyHandler());
This helped me to solve my issues. See https://blogs.msdn.microsoft.com/appserviceteam/2016/06/16/adjusting-the-http-call-with-azure-mobile-apps/ for more details.