I'm building a UI in Google App Maker to manage the Domain Shared Contacts API. I can read and display data from the API in App Maker using a calculated model, but I cannot save data back to the API.
I tried adding a form widget with the calculated model as its datasource, but the resulting form is view-only. That's when I found out that calculated data sources are always read-only in App Maker.
I then created a SQL datasource and an event that queries the Domain Shared Contacts API, creates records, and saves them to the SQL model. The idea was to use the standard form widget and then, in events such as beforeSave and beforeDelete, write the record back to the API.
However, this feels dirty to me: I now have to keep the SQL model synchronized with the API, keep a SQL server running, and so on.
Does anyone have any tips on the best way to write data from a form in App Maker to a REST API? I can't be the only one with this use case.
You have to write an Apps Script server function for this, using the UrlFetchApp service: https://developers.google.com/apps-script/reference/url-fetch/url-fetch-app
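For example, a server script along these lines could be called from the form's submit button. This is only a sketch: the feed URL, the Atom payload and the field names are assumptions, so check the Domain Shared Contacts API documentation for the exact format your domain needs.

```javascript
// Server-side Apps Script. A minimal sketch only: the feed URL, the Atom XML
// payload and the required OAuth scope are assumptions; adjust them to your
// domain and to the Domain Shared Contacts API schema.
function createSharedContact(record) {
  var url = 'https://www.google.com/m8/feeds/contacts/yourdomain.com/full'; // assumed feed URL
  var payload =
      '<atom:entry xmlns:atom="http://www.w3.org/2005/Atom" ' +
      'xmlns:gd="http://schemas.google.com/g/2005">' +
      '<gd:name><gd:fullName>' + record.fullName + '</gd:fullName></gd:name>' +
      '<gd:email rel="http://schemas.google.com/g/2005#work" address="' + record.email + '"/>' +
      '</atom:entry>';

  var response = UrlFetchApp.fetch(url, {
    method: 'post',
    contentType: 'application/atom+xml',
    headers: {
      Authorization: 'Bearer ' + ScriptApp.getOAuthToken(),
      'GData-Version': '3.0'
    },
    payload: payload,
    muteHttpExceptions: true
  });

  // Surface failures so the client-side callback can show an error.
  if (response.getResponseCode() >= 300) {
    throw new Error('Shared Contacts API returned ' + response.getResponseCode());
  }
  return response.getContentText();
}
```

You can then wire the form's submit button to call a server function like this (e.g. via google.script.run in a client script) instead of relying on the calculated model's save; the contacts feed's OAuth scope may also need to be added to the project's scopes.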
I am working on a personal project to recreate the news feed of Facebook. What I am trying to do is recreate the scenario where, when the user goes to the news feed, they only get posts from the people they follow. Is there any way to run a query like that against the Firebase Realtime Database using a list of "followings"?
I can already display a single user's posts in the Android Studio app using a snapshot and a RecyclerView.
If you're asking whether you can get posts from multiple userUID values with a single query, that is not possible.
If you're asking whether you can pass a list of postUID values to retrieve, that is also not possible.
In both cases the solution is to execute a separate query/read operation for each of the values and merge the results in your application code. This is not nearly as slow as you may think, since Firebase pipelines the requests over a single web socket connection, which is quite efficient. For more on this, see Speed up fetching posts for my social network app by using query instead of observing a single event repeatedly.
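In practice the fan-out looks roughly like this. It's only a sketch: the "posts"/"authorUid" structure, the Post class, the list of followed UIDs and the adapter are assumptions about your data model, and you would probably also sort the merged list by timestamp before displaying it.

```java
// A minimal sketch (Java, Firebase Realtime Database Android SDK). Assumes a
// flat "posts" node where each post has an "authorUid" child, plus an existing
// Post model class, a populated list of followed UIDs and a RecyclerView adapter.
private void loadFeed(final List<String> followedUids,
                      final List<Post> feed,
                      final RecyclerView.Adapter<?> adapter) {
    DatabaseReference postsRef = FirebaseDatabase.getInstance().getReference("posts");

    for (String uid : followedUids) {
        postsRef.orderByChild("authorUid").equalTo(uid)
                .addListenerForSingleValueEvent(new ValueEventListener() {
                    @Override
                    public void onDataChange(@NonNull DataSnapshot snapshot) {
                        for (DataSnapshot postSnap : snapshot.getChildren()) {
                            feed.add(postSnap.getValue(Post.class)); // merge into one list
                        }
                        adapter.notifyDataSetChanged(); // refresh the feed as results arrive
                    }

                    @Override
                    public void onCancelled(@NonNull DatabaseError error) {
                        Log.w("Feed", "Feed query cancelled", error.toException());
                    }
                });
    }
}
```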
tl;dr: I want to reference an external data source from a Kusto query in Application Insights.
My application is writing logs to Application Insights, and we're querying it using Kusto in the Azure portal. To give an example of what I'm trying to do:
We're currently looking at these logs to find an event that is logged when a visitor views a blog post on our site. This works well at the level of individual blog posts, but now we want to group the data by the categories those posts are in, or by the tags they have, and that information isn't in the logs.
The information we log contains unique details about each blog post (its URL, our internal ID, etc.) that I could use to look up the category/tag information in another data source (e.g. our SQL DB, where this relation is stored), but I have no idea if or how this is possible. So that's the question: is this possible? Can I query a SQL DB, or pull in data as JSON from a URL, or something similar?
Alternative solutions would be to move the reporting elsewhere (e.g. PowerBI) and just use AI as a data source, or to actually log all the category/tag info, but I really don't want to go down that route.
Kusto supports accessing external data (blobs, Azure SQL, Cosmos DB). However, Application Insights / Azure Monitor and other multi-tenant services block this functionality due to security and resource-governance concerns.
You could try setting-up your own Azure Data Explorer (Kusto) cluster, where this functionality will be available, and then access your Application Insights data using cross-cluster query, or by exporting the data from Application Insights and hooking up EventGrid ingestion into your Kusto cluster.
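To give an idea of what that looks like once you have your own cluster, a query along these lines could join Application Insights telemetry with an external table over your SQL DB. This is only a sketch: the resource path, the external table, the event name and the column names are all placeholders for your setup.

```kusto
// Run from your own Azure Data Explorer cluster, not from the Application
// Insights portal. BlogPostMetadata, BlogPostViewed, Url, Category and the
// resource path are placeholders.
let BlogMeta = external_table('BlogPostMetadata');  // external table over the SQL DB
cluster('https://ade.applicationinsights.io/subscriptions/<sub-id>/resourcegroups/<rg>/providers/microsoft.insights/components/<app-name>')
    .database('<app-name>')
    .customEvents
| where name == 'BlogPostViewed'
| extend PostUrl = tostring(customDimensions.url)
| join kind=inner BlogMeta on $left.PostUrl == $right.Url
| summarize Views = count() by Category
```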
Relevant links:
Kusto supporting external data:
https://learn.microsoft.com/en-us/azure/data-explorer/kusto/query/schema-entities/externaltables
Querying data inside Application Insights:
https://learn.microsoft.com/en-us/azure/data-explorer/query-monitor-data
Continuous export data from Application Insights:
https://learn.microsoft.com/en-us/azure/azure-monitor/app/export-telemetry
Data ingestion into Kusto from EventGrid:
https://learn.microsoft.com/en-us/azure/data-explorer/ingest-data-event-grid
I'm using grids in VB.NET to display database records stored in Microsoft Access; the grids allow editing and deleting records directly in the grid fields.
Is there a way I can monitor whenever a user deletes or edits a record? I want to be able to view the details of every update or deletion of certain records, such as the date and the user who made the change.
What you're describing is known as "auditing", and certain databases, such as MS SQL Server, have built-in support for it. MS Access does not include this feature. In the absence of built-in auditing, a common way to implement it is with update triggers; unfortunately, MS Access does not have triggers either. The only way you'll be able to do this is via a data-access API you write yourself to interact with your tables, plus the discipline to stick to that API.
What you want to do is hook into the save commands for your updates and deletes (or into the corresponding grid events) to capture the data. Either way, in the same place, execute an INSERT statement that writes the audit details to your log table.
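A minimal sketch of what that API could look like (VB.NET with OleDb against Access); the table and column names here ("Customers", "AuditLog", "Id", ...) are placeholders to adapt to your schema.

```vb
Imports System.Data.OleDb

Public Class AuditedData
    Private ReadOnly _connString As String

    Public Sub New(connString As String)
        _connString = connString
    End Sub

    ' Delete a record and write the audit row in the same transaction, so the
    ' log cannot be skipped by any code that goes through this API.
    Public Sub DeleteCustomer(customerId As Integer, userName As String)
        Using conn As New OleDbConnection(_connString)
            conn.Open()
            Using tx As OleDbTransaction = conn.BeginTransaction()
                Using del As New OleDbCommand("DELETE FROM Customers WHERE Id = ?", conn, tx)
                    del.Parameters.AddWithValue("?", customerId)
                    del.ExecuteNonQuery()
                End Using

                Dim logSql As String =
                    "INSERT INTO AuditLog (TableName, RecordId, ChangeType, UserName, ChangedAt) VALUES (?, ?, ?, ?, ?)"
                Using log As New OleDbCommand(logSql, conn, tx)
                    log.Parameters.AddWithValue("?", "Customers")
                    log.Parameters.AddWithValue("?", customerId)
                    log.Parameters.AddWithValue("?", "DELETE")
                    log.Parameters.AddWithValue("?", userName)
                    log.Parameters.AddWithValue("?", DateTime.Now)
                    log.ExecuteNonQuery()
                End Using

                tx.Commit()
            End Using
        End Using
    End Sub
End Class
```

If you also want to capture the old values, read the row inside the same method before deleting or updating it and include those values in the audit insert.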
I'm not really sure how best to word this. We have an ASP.NET web application whose backend services are accessible over a WCF service layer. We need to add some reporting/dashboard-type bits to the web application.
To make it scalable, the data needed for the reporting should be calculated on the backend. I'm just wondering if there is a recommended way to pass this data around. It doesn't make much sense to have a different service method for each bit of data; it feels like it should arrive already summarised.
I had a look at WCF Data Services, but that seems geared more towards retrieving full object trees. Maybe some sort of XML document, so extra items can be added to the summary without needing service-layer changes?
The data would be things like the number of orders today, the number of orders specific to the person running the report, open orders outstanding, etc.
Does anyone have any pointers?
Thanks for your time
You can look at something like ASP.NET Web API and use an XML formatter for your data. You can use ViewModels to flatten your data and send it over the wire to your web app, where you bind it to grids or whatever you need.
Basically, you would take a request (filters, keywords, etc.) from your web app, pass the parameters to your reporting back-end, retrieve the reporting data, map the values to your ViewModels and serialize them through Web API. Web API ships with formatters for XML and JSON, and you can plug in custom formatters for CSV, vCard, iCal, PDF and so on.
You can read more about it here: http://www.asp.net/web-api
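As a rough sketch of the shape this can take (Web API 2, C#): the view-model properties and the IReportingService interface below are placeholders for however your WCF back-end exposes the summarised figures.

```csharp
using System.Web.Http;

// Flattened summary the dashboard binds to; new figures can be added here
// without touching the WCF contracts the rest of the app uses.
public class DashboardSummaryViewModel
{
    public int OrdersToday { get; set; }
    public int MyOrdersToday { get; set; }
    public int OpenOrdersOutstanding { get; set; }
}

// Hypothetical abstraction over the existing WCF reporting calls.
public interface IReportingService
{
    DashboardSummaryViewModel GetDashboardSummary(string user);
}

public class DashboardController : ApiController
{
    private readonly IReportingService _reporting;

    public DashboardController(IReportingService reporting)
    {
        _reporting = reporting;
    }

    // GET api/dashboard?user=jsmith
    public DashboardSummaryViewModel Get(string user)
    {
        // Content negotiation serialises the result to XML or JSON
        // depending on the client's Accept header.
        return _reporting.GetDashboardSummary(user);
    }
}
```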
I am using the SqlProfileProvider to store my user profiles in an asp.net web application.
What I am looking for is a way to fetch all user profiles (I would prefer a search API, but there isn't one available) with reasonable performance.
Using the ProfileManager.GetAllProfiles kills the performance of my application.
I was thinking of using a SQL Cache Dependency on the object returned from this method, but I would still have a very slow site every time someone updates a profile (which could happen several times a day).
Does anyone have a suggestion to improve the performance? I am looking for things along these lines:
Caching efficiently (only the differences should be re-cached)
Optimizing the GetAllProfiles call
Being able to search profiles, instead of having to fetch them all and filtering later
The SqlProfileProvider does not provide an easy way to search profiles, because all profile data is serialised into a single column.
You should consider creating your own profile provider, or using something like the Table Profile Provider, which stores each profile property in its own database column, so you can easily write custom queries to search the data.
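With a table-per-property layout, a profile search then becomes an ordinary query. A sketch only: the table and column names below depend entirely on how you configure the provider.

```sql
-- Sketch: find profiles by a property value instead of loading them all.
-- "Profiles", "FirstName" and "City" are placeholders for your configured schema.
SELECT u.UserName, p.FirstName, p.City
FROM dbo.Profiles AS p
INNER JOIN dbo.aspnet_Users AS u ON u.UserId = p.UserId
WHERE p.City = 'Seattle';
```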