We currently have an application that is used by several clients; it downloads and stores data from our application running in their environment.
We need to hand this application over to a developer, but at the same time we need to protect our code. The way I see it working is that we would somehow treat our current app as a framework, allowing another app to be created on top of it; that app may have its own screens but also re-use some of the built-in ones.
Is it possible to protect our app in such a way without rewriting everything into protected DLLs? Or should we just suck it up and share our code with consulting firms that want to build these types of apps for our clients?
If your proprietary code is entirely focused on downloading and storing data, you could create an online REST API that returns the data over the internet. The other developer could then just request the data from your servers using an HTTP call.
However, if your code needs to be client-side, the only real thing you can do is compile a DLL, and even then that can be decompiled.
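To make that first option concrete, here is a minimal sketch of such a data API, written in Python/Flask purely for illustration (any server stack would do). The endpoint, API-key scheme, and payload are all invented; the point is that the proprietary download/storage logic stays on your server and only HTTP responses ever leave it:

    # Minimal sketch: the proprietary logic runs on your server and only
    # HTTP responses ever leave it. Endpoint, key scheme and payload are
    # invented for illustration.
    from flask import Flask, jsonify, request, abort

    app = Flask(__name__)
    API_KEYS = {"client-123"}  # hypothetical per-client keys you would issue

    @app.route("/api/v1/data")
    def get_data():
        # Authenticate the caller before handing out any data.
        if request.headers.get("X-Api-Key") not in API_KEYS:
            abort(401)
        # A real implementation would call your download/storage code here;
        # this just returns a canned payload.
        return jsonify([{"id": 1, "value": "example"}])

    if __name__ == "__main__":
        app.run(port=8080)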
I'm using Python and Qt (PySide) in a local application (which connects to a database in the Azure cloud).
Now my objective is to move this app to the web, in particular to Azure (I have an Azure subscription). Is it possible to simply transfer it to Azure in some manner? I have not found examples on the web.
The important question is: is a Python Qt app compatible with Azure as a web app?
Thanks
UPDATED ANSWER!
Yes, now you can. Well, sort of. The mad lads at Digia have created something called "Qt for WebAssembly" that can compile your whole app into something that runs embedded in a web page.
https://doc.qt.io/qtcreator/creator-setup-webassembly.html
You might have to rethink connecting directly to the database, however, as that's simply not going to fly over WebSockets (and honestly, letting an app connect directly to a remote RDBMS has never been a smart move; there are a LOT of things that can go wrong when you let the internet connect to your database). But you could at least keep the UI and rewrite the database layer to interrogate something like a GraphQL (or whatever) front end to the data.
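For what it's worth, here is a rough Python sketch of what that data-layer rewrite could look like: the Qt widgets stop talking to the database and instead ask an HTTP/GraphQL-style front end for plain data. The URL, query, and field names are invented, and in a WebAssembly build the HTTP call would have to go through whatever networking layer the toolchain provides:

    # Rough sketch: the app no longer opens a database connection itself;
    # it asks a small HTTP front end for the data it needs.
    # The URL, query and field names below are invented for illustration.
    import requests

    API_URL = "https://example.com/graphql"  # hypothetical front end to your data

    def fetch_orders(customer_id):
        query = """
        query Orders($customerId: Int!) {
          orders(customerId: $customerId) { id total }
        }
        """
        resp = requests.post(
            API_URL,
            json={"query": query, "variables": {"customerId": customer_id}},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()["data"]["orders"]

    # The PySide widgets then consume plain Python dicts and never touch the database.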
OLD ANSWER
I'm afraid you're in for a nearly complete rewrite. Qt is a desktop/mobile platform. It doesn't go anywhere near HTML/CSS except perhaps for displaying them in a webview component. Azure or AWS won't magically turn it into a web application for you.
Your code as it stands needs to be rewritten in a web-first, transactional manner: it takes a request, processes it, and produces a result. To some extent WebSockets have changed this dynamic for a limited subset of use cases where interaction needs to be non-transactional, and modern web app design hides much of the transactionality behind a web-services model, but 90% of web work is still very much transactional.
Database <---> Web server/Web app stack <--- Internet! --> Web browser
My suggestion is to pick up Django (or one of the other frameworks; if it's something simple, Flask is another good alternative: Flask for small apps, Django for the big stuff, or use something else entirely, you have choices here!) and start from scratch. Analyse your product's function and start mapping out how to make it work as a database-driven, transactional system.
There are no shortcuts here, I'm afraid.
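If it helps, here is a bare-bones Flask sketch of that request -> process -> result cycle. The route, table, and column names are made up, and the throwaway SQLite file stands in for whatever the real (Azure) database would be:

    # Bare-bones sketch of a transactional web endpoint:
    # take a request, process it against the database, produce a result.
    # Table/column names are invented; app.db stands in for the real database.
    import sqlite3
    from flask import Flask, jsonify

    app = Flask(__name__)
    DB_PATH = "app.db"

    @app.route("/customers/<int:customer_id>")
    def customer_detail(customer_id):
        con = sqlite3.connect(DB_PATH)
        try:
            row = con.execute(
                "SELECT id, name FROM customers WHERE id = ?", (customer_id,)
            ).fetchone()
        finally:
            con.close()
        if row is None:
            return jsonify({"error": "not found"}), 404
        return jsonify({"id": row[0], "name": row[1]})

    if __name__ == "__main__":
        app.run(debug=True)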
Is there a way to partition Meteor client-side code so that only some of the code is packaged up and sent to certain clients? For example, could all of the client-side code go to users who are "teachers," but only a subset of it go to users who are "students"?
Of course, I could create two separate applications, but I'd rather keep the code base for multiple types of users together to ease maintenance.
As far as I know, there is currently no built-in way to load (i.e. send to the client) part of an app depending on the route, the user role, etc.
You will have to either load everything for all users or build two applications sharing some private packages (this is pretty efficient actually).
There is also the possibility of storing the JavaScript/template files in the public folder (whose content is not automatically bundled and sent to the client, although that only holds on desktop; see below) and loading them with $.getScript().
See for example this tutorial or this package. The latter might be what you are looking for.
But this might not work for a mobile app where the public folder content is actually bundled at build time and re-sent to the client on each code push.
We've got a .Net MVC/EF web application that is already in place with a client. The app was developed using .Net Membership and Roles for security/login. The app runs on tablet devices placed in the client's locations.
Now the client wants a different company to build a new UI for a portion of the site (not the whole thing), that has to integrate with our DB. This other company has been doing this type of work for a long time, so they've established how they operate, which is using a disconnected, distributed methodology to avoid Internet problems messing up an always-connected setup. Basically, they want each device to pull down only the segment of the DB that is relevant to the device's location and then sync it every two minutes through an API. The device will need to allow user logins.
So basically, we are being asked to adapt our web app/DB structure to accommodate this. And it boils down to 2 questions:
1) Since we are using .Net Membership and Roles, can we use the MS Sync Framework for syncing the DB of users (or at least, the ones relevant to the device's location)? I'm guessing the answer is yes because as long as the other party's UI is coded to utilize .Net Membership as well, the DB should be in the right format to read the data. I just want to confirm this is the case.
2) Can all the synchronization requests (up and down) be run through a web API that we write and expose to the devices? Since the UI is not going to be part of the code base, it needs to connect that way.
Thanks in advance!
1. Yes. From Sync Fx's perspective, they're just tables.
2. Sync Fx out of the box doesn't work over HTTP. You will find in the documentation how to do an n-tier setup using WCF, though. If you want a Web API, you will have to code that yourself.
I realize that this question can start a discussion but that's really not my intention. We've created a Flex Application to take tests from candidates. The advantage of the Flex Application is that all state can be stored in the application running in the browser of the client. Things like time limits, navigation, scoring, ... can all be handled within the application without us having to worry about a back button for instance. Even running the app offline with Adobe Air isn't that hard.
My question now is whether such an application could easily be made with HTML, JavaScript, Ajax, etc. The reason I'm asking is that an HTML application would be much easier to distribute on mobile devices, for instance. Also, our domain model is mostly implemented in AS3 (Flex), so using it on the server side means porting it to C#.NET (with two codebases as a result).
Look at any good MVC toolkit; you will easily be able to handle this. The Castle project is good, as is Microsoft MVC, and both allow you to choose from a variety of view engines to handle the actual page rendering, thereby letting you pick the most 'mobile-efficient' engine...
As for the technicalities, you would store all persistent state (time limits, navigation, scoring, and so on) in a server-side session object.
I'm creating a web application using ASP.NET & WCF in a 3-tier architecture; it mostly looks like a social website. Users can register with the system and upload their profile images, documents, video clips, etc. So what I want to know is: what is the best way to store those files? On the WCF side or the web application side?
Also, if I choose the web application side and store those files in a set of folders, how do I make those folders shared and allow access from a different project (such as a desktop client that needs to upload files into that shared folder)?
Thank you all in advance.
I think the question can better be put like this:
1) save the files in a folder in (or close to) the web application and store the metadata about them in a database
2) grab the saved images from the database via WCF
The second approach would likely be rather slow: grabbing the information over a service, converting it, using an HttpHandler with the correct MIME type to spit out the binary stream to the browser...
Most architectures come down in the middle: save the images close to, or in, the UI layer and store the metadata about them in the database. That information is mostly just a bunch of strings, so it is easily retrieved.
Update for the new question:
Since WinForms applications/other projects were not in your original question, this deviates into something new. In that case you would go for one of the following scenarios:
Use the WCF tier as common ground and store the images behind that service. As I said, it's going to be extra work to pull the byte arrays over.
Store the images in the web UI tier and have a service (an ASMX or WCF one) expose the images to your WinForms client.
Make a share for the WinForms client on the server where the web UI runs and where the images are. Of course, be mindful of security and possible attacks.
It depends on what the most-used scenario is. My assumption is that the web UI layer will see most of the use and the WinForms client will be used for image manipulation? If so, there are third-party ASP.NET controls available for such manipulation as well, so the need for a WinForms client would decrease.
This depends on how big you expect this thing to get.
If this is for the wider internet and you expect it to get big, having the files on the web server will make it difficult to scale out your application by adding new web servers to your web farm.
One approach would be to have the physical files uploaded to the web server, to make the uploads quick for users, and then have a coordinating background service that is triggered by an upload, perhaps using a FileSystemWatcher. This service would propagate the file to all nodes in the web farm so that subsequent requests to other nodes will find the file.
If it is a small application intended only for use within a company, storing the files on the web server is okay, with the following conditions:
You have full control over the hosting server so that you can set up the appropriate folder permissions.
You write your file saving and retrieving code in such a way that it can be moved onto the lower tiers without too much pain. Do it through an interface and inject the implementation.