After online registration, my mobile app must remain usable without a network connection. It consists of a service and a UI, both accessing the same data.
I do not understand how to organize my application.
I know how to record data remotely (on my ROS) and locally, but what about the synchronization between the two .realm files?
Should my app manage the synchronization between these two databases itself, or is there another way?
When I create the user on my ROS, I get a .realm file on the device.
But it is unusable without User.LoginAsync (I get an "Incompatible histories" exception).
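With the (legacy) Realm Object Server .NET API, the usual pattern is to log in once while online and afterwards reopen the same synced Realm with the cached user; Realm Sync then merges local and server changes itself, so the app does not have to copy data between two .realm files by hand. A minimal sketch, assuming the Realms.Sync API of that era; the server URLs and realm path are placeholders:

    using System;
    using System.Threading.Tasks;
    using Realms;
    using Realms.Sync;

    public static class SyncedRealm
    {
        // First run (online): log in once; Realm caches the user token on the device.
        public static async Task<Realm> OpenAfterLoginAsync(string username, string password)
        {
            var credentials = Credentials.UsernamePassword(username, password, createUser: false);
            var user = await User.LoginAsync(credentials, new Uri("http://my-ros.example.com:9080"));
            return Open(user);
        }

        // Later runs (possibly offline): reuse the cached user instead of logging in again.
        // The local synced .realm stays readable and writable, and syncs when connectivity returns.
        public static Realm OpenCached() => Open(User.Current);

        private static Realm Open(User user)
        {
            var config = new SyncConfiguration(user, new Uri("realm://my-ros.example.com:9080/~/mydata"));
            return Realm.GetInstance(config);
        }
    }

Seen this way there is only one logical Realm: the app always opens the synced one, and Realm Sync keeps the on-device file and the ROS copy converged. The "Incompatible histories" error typically means a plain local Realm and a synced Realm were opened at the same path.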
I have made a WPF desktop application that fetches data (on every launch) from an API provided by the website over the Internet and updates its UI accordingly. Now I want to change that: I want to move the application's fetching functionality into the cloud, where it regularly gets data from the API and stores it in a SQL Server (or any other) database. The application on the client side should then get its data from that database rather than from the API itself (either on every launch or whenever the data changes).
WHAT'S THE PROBLEM THEN?
When I think of a desktop application running in the cloud on a Windows virtual machine, it feels fine and I understand how to implement it.
But with a web application, the server is supposed to respond to a client request and then run some code to produce a response. That is not what I need: I want the code to simply run weekly or monthly without any client asking.
WHAT I WANT?
I want the server to be up 24/7 and to update the database every week.
How is this kind of thing usually implemented? Take Stack Overflow, for example: don't they perform certain tasks on their servers independently of any client request?
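One common way to do this in the .NET world is a worker (hosted) service that the host keeps alive around the clock and that fires on a timer, completely independent of any web request. A minimal sketch, assuming a .NET 6+ Worker Service project; WeeklyApiSync and FetchAndStoreAsync are placeholder names:

    using System;
    using System.Threading;
    using System.Threading.Tasks;
    using Microsoft.Extensions.Hosting;

    // Lives for the lifetime of the host process and wakes up once a week.
    public class WeeklyApiSync : BackgroundService
    {
        protected override async Task ExecuteAsync(CancellationToken stoppingToken)
        {
            using var timer = new PeriodicTimer(TimeSpan.FromDays(7));
            do
            {
                // Placeholder: call the external API and upsert the result
                // into the SQL Server database the clients read from.
                await FetchAndStoreAsync(stoppingToken);
            }
            while (await timer.WaitForNextTickAsync(stoppingToken));
        }

        private Task FetchAndStoreAsync(CancellationToken ct) => Task.CompletedTask;
    }

You register it at startup with AddHostedService<WeeklyApiSync>() and host it as a Windows Service, an Azure WebJob, or a container; schedulers such as Hangfire or Quartz.NET (or a plain Windows Task Scheduler job) do the same thing with finer control over the schedule.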
I am developing an application which collects data into a local sqlite database, and (when the Internet is available) synchronises the data with a server.
I understand roughly how to create a background task (in different ways for each platform, unfortunately), but I read in Xamarin Background Tasks...
As a word of caution, be aware of what service or application is accessing an application, for example a SQLite DB. Two processes cannot access a file at the same time, unless read-only. Hence ensure only one process is performing actions on files or locked resources.
This worries me - I need both the background task and the app to be able to update a single database. Is it in fact possible for the background task and the main app to access the same database? If so, how does one go about avoiding "Database is locked" messages?
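SQLite itself can tolerate a second process on the same file if you enable WAL journaling and give every connection a busy timeout; the remaining rule is to keep write transactions short in both the app and the background task. A minimal sketch, assuming the sqlite-net-pcl SQLiteConnection API and that both processes resolve the same database path:

    using System;
    using SQLite; // sqlite-net-pcl

    public static class SharedDb
    {
        public static SQLiteConnection Open(string databasePath)
        {
            var conn = new SQLiteConnection(databasePath);

            // WAL lets readers and a single writer coexist instead of
            // locking the whole file for every write.
            conn.ExecuteScalar<string>("PRAGMA journal_mode=WAL;");

            // Wait a few seconds for a competing writer to finish rather than
            // failing immediately with "database is locked".
            conn.BusyTimeout = TimeSpan.FromSeconds(5);

            return conn;
        }
    }

Even then, wrap related writes in RunInTransaction(...) and keep them short so neither process holds the write lock for long.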
I have an application that needs to send data to a cloud database (DynamoDb).
The app runs on a computer that can lose internet connectivity or be switched off at any time, but I must ensure that all data eventually gets to the cloud database.
I can assume the application will eventually be switched on, and will eventually get internet access back.
The app is written in VB .NET
What are some schemes for achieving this, and are there any ready-made products that already achieve this?
You could implement a write-through cache using a local DynamoDB instance (or even SQLite). But without specific details about what kind of data you'd be storing in the database, and what data should be available "offline", it's hard to say exactly how you should structure your application. You'll definitely want to avoid keeping everything local, unless the overall volume of data is really small.
Then there is the problem of resolving conflicts that may occur during network partitions (i.e. a client goes offline and makes some database modifications while other clients also modify the database; these need to be reconciled, and it's up to you and your users to determine how).
It's not a simple problem to solve.
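One scheme that works well here is an "outbox": every record is first written to a durable local store (SQLite, or even flat files), and a background loop drains it to DynamoDB whenever connectivity is back, removing entries only after a confirmed write. A rough sketch in C# (the question's app is VB.NET, but the AWS SDK for .NET calls are the same), assuming the AWSSDK.DynamoDBv2 package; IOutbox, PendingItem and the table name are placeholders:

    using System;
    using System.Collections.Generic;
    using System.Threading;
    using System.Threading.Tasks;
    using Amazon.DynamoDBv2;
    using Amazon.DynamoDBv2.Model;

    public record PendingItem(string Id, Dictionary<string, AttributeValue> Attributes);

    public interface IOutbox
    {
        PendingItem? PeekOldest();        // next unsent record, or null if empty
        void MarkSent(PendingItem item);  // remove only after DynamoDB confirmed the write
    }

    public class OutboxDrainer
    {
        private readonly IAmazonDynamoDB _dynamo = new AmazonDynamoDBClient();
        private readonly IOutbox _outbox;

        public OutboxDrainer(IOutbox outbox) => _outbox = outbox;

        public async Task DrainForeverAsync(CancellationToken ct)
        {
            while (!ct.IsCancellationRequested)
            {
                var pending = _outbox.PeekOldest();
                if (pending is null)
                {
                    await Task.Delay(TimeSpan.FromSeconds(30), ct);
                    continue;
                }

                try
                {
                    // Re-sending the same item is harmless as long as its key is stable,
                    // so a crash between PutItemAsync and MarkSent loses nothing.
                    await _dynamo.PutItemAsync(new PutItemRequest
                    {
                        TableName = "SensorReadings", // placeholder table name
                        Item = pending.Attributes
                    }, ct);

                    _outbox.MarkSent(pending);
                }
                catch (Exception)
                {
                    // Offline, throttled, or shutting down: leave the record
                    // in the outbox and retry later.
                    await Task.Delay(TimeSpan.FromMinutes(1), ct);
                }
            }
        }
    }

The outbox is what gives the eventual-delivery guarantee; the transport on top of it (single PutItem calls, BatchWriteItem, or a queue service in between) is an optimization.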
My organisation (a small non-profit) currently has an internal production .NET system with SQL Server database. The customers (all local to our area) submit requests manually that our office staff then input into the system.
We are now gearing up towards online public access, so that customers will be able to see the status of their existing requests online and, in future, also create new requests online. A new ASP.NET application will be developed for this.
We are trying to decide whether to host this application on-site on our servers (with direct access to the existing database) or use an external hosting service provider.
Hosting externally would mean keeping a copy of the Requests database on the hosting provider's server. What would be the recommended way to then keep the requests data synced in real time between the hosted database and our existing production database?
Trying to sync back and forth between two in-use databases will be a constant headache. The question I would ask you is: if you have the means to host the application on-site, why wouldn't you go that route?
If you have a good reason not to host on site but you do have some web infrastructure available to you, you may want to consider creating a web service which provides access to your database via a set of well-defined methods. Or, on the flip side, you could make the database hosted remotely with your website your production database and use a webservice to access it from your office system.
In either case, providing access to a single database will be much easier than trying to keep two different ones constantly and flawlessly in sync.
If a web service is not practical (or you have concerns about availability), you may want to consider a queuing system for synchronization, as sketched below. Any change to the database (local or hosted) is also added to a messaging queue. Each side monitors the queue for changes that need to be made and then applies them. This accounts for either of the databases being unavailable at any given time.
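To make the queue idea concrete: each side publishes a small change message for every local write and applies messages it did not originate, skipping anything it has already seen. A very rough sketch; IChangeQueue stands in for whatever transport you choose (MSMQ, Azure Service Bus, a shared table, ...):

    using System;
    using System.Collections.Generic;

    public record ChangeMessage(Guid ChangeId, string Origin, string Table, string Operation, string PayloadJson);

    public interface IChangeQueue
    {
        void Publish(ChangeMessage change);
        IEnumerable<ChangeMessage> Pending(string consumer);
        void Acknowledge(string consumer, Guid changeId);
    }

    public class ChangeApplier
    {
        private readonly string _site;   // e.g. "office" or "hosted"
        private readonly IChangeQueue _queue;
        private readonly HashSet<Guid> _applied = new(); // in reality a table of applied ChangeIds

        public ChangeApplier(string site, IChangeQueue queue)
        {
            _site = site;
            _queue = queue;
        }

        public void ApplyPending()
        {
            foreach (var change in _queue.Pending(_site))
            {
                // Skip our own changes and anything already applied (idempotency).
                if (change.Origin == _site || !_applied.Add(change.ChangeId))
                {
                    _queue.Acknowledge(_site, change.ChangeId);
                    continue;
                }

                // Placeholder: translate the payload into an INSERT/UPDATE/DELETE
                // against the local Requests database.
                ApplyToLocalDatabase(change);
                _queue.Acknowledge(_site, change.ChangeId);
            }
        }

        private void ApplyToLocalDatabase(ChangeMessage change) { /* site-specific SQL */ }
    }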
That being said, I agree with @LeviBotelho, syncing two databases is a nightmare and should probably be avoided if you can. If you must, you can also look into SQL Server replication.
Ultimately the data is the same: customer-submitted data. Currently it is entered by them through you; ultimately it will be entered directly by them. I see no need for two different databases holding the same data. The replication errors alone, when they pop up (and they will), will be a headache for your team for nothing.
I've always personally used dedicated servers and VPSes, so I have full control over my SQL Server (using 2008 R2). Now I'm working on an ASP.NET project that could be deployed in a shared hosting environment, which I have little experience with. My question is: are there limitations on the features of SQL Server I can use in a shared environment?
For example, if I design my database to use views, stored procedures, user defined functions and triggers, will my end user be able to use them in shared hosting? Do hosts typically provide access to these and are they difficult to use?
If so, I assume the host will give a user his own login, and he can use tools like Management Studio to operate within his own database as if it were his own server? If I provide scripts to install these objects, will they run under the user's credentials within his database?
All database objects are available: tables, views, stored procedures, functions, keys, certificates, and so on.
Usually CLR integration and full-text search are disabled.
Finally, you will not be able to access most server-level objects (logins, server triggers, backup devices, linked servers, etc.).
SQL Mail and Reporting Services are often turned off too.
It depends on how the other users are authenticated to the database, if it is one shared database for all users.
If every user on the host receives their own database:
If your scripts are written in a generic way (not bound to fixed usernames, for example), other users will be able to execute them on their own database and get the same functionality. (Right-click the database and choose Tasks -> Back Up, for example.)
You could also provide plain backup dumps of a freshly set-up database, so that for other users the setup is only one click away. Also, from the beginning you should think about how to roll out changes that need to affect every user.
One possible approach is to always supply delta scripts, regardless of whether you are fixing errors or adding new features.