How to access a file from another application's directory on Bluemix?

This is my problem scenario:
1. Create two apps.
2. App1 continuously pulls tweets and stores the JSON file in its /data folder.
3. App2 picks up the latest file from the /data folder of App1 and uses it.
I have used R and its corresponding buildpack to deploy the app on Bluemix.
How do I access /data/file1 in App1 from App2? That is, can I do something like this in the App2 source file:
read.csv("App1/data/Filename.csv");
Will Bluemix understand what the App1 folder points to?

Bluemix is a Platform as a Service. This essentially means that there is no filesystem in the traditional sense. Yes, your application "lives" in a file structure on a type of VM, but if you were to restage or redeploy your application at any time, changes to the filesystem will be lost.
The "right" way to handle this data that you have is to store it in a NoSQL database and point each app to this DB. Bluemix offers several options, depending on your needs.
MongoDB is probably one of the easier and more straightforward DBs to use and understand. Cloudant is also very good and robust, but has a slightly higher learning curve.
Once you have this DB set up, you could poll it for new records periodically, or better yet, look into using WebSockets for pushing notifications from one app to the other in real-time.
Either way, click the Catalog link in the Bluemix main navigation and search for either of these services to provision and bind them to your app. You'll then need to reference them via the VCAP_SERVICES environment variable, which you can read more about in the Bluemix documentation.
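For the concrete mechanics, here is a rough sketch of reading a bound Cloudant service out of VCAP_SERVICES and fetching a document. It is written in Python only for brevity (in R the same idea works with Sys.getenv plus a JSON parser and an HTTP client, e.g. jsonlite/httr); "cloudantNoSQLDB" is the usual key for a bound Cloudant service, and the "tweets" database and "latest" document id are made-up names for illustration:

    import json
    import os

    import requests  # any HTTP client will do

    # Bluemix injects VCAP_SERVICES into the app's environment when a service is bound.
    vcap = json.loads(os.environ["VCAP_SERVICES"])

    # "cloudantNoSQLDB" is the usual key for a bound Cloudant service;
    # inspect your own VCAP_SERVICES to confirm the exact name.
    creds = vcap["cloudantNoSQLDB"][0]["credentials"]

    # App1 would PUT each batch of tweets as a document; App2 just reads it back.
    db_url = creds["url"] + "/tweets"   # "tweets" is a hypothetical database name
    doc_id = "latest"                   # hypothetical document id that App1 keeps updating

    resp = requests.get(db_url + "/" + doc_id)
    resp.raise_for_status()
    print(resp.json())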

You can't access files from another app on Bluemix. You should use a database service like Cloudant to store your JSON. Bind the same service to both apps.

Using something like Cloudant or the Object Storage service would be a great way to share data between two apps. You can even bind the same service to 2 apps.
Another solution would be to create a microservice that is your persistence layer and stores your data for you. You could then create an API on top of this that both of your apps could call (see the sketch below).
As stated above, storing information on disk is not a good idea for a cloud app. Check out http://12factor.net; it describes the no-nos of writing a true cloud-based app.
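To make the microservice idea a bit more concrete, this is roughly the shape such a persistence layer could take. It is only a sketch: Flask, the /records endpoints and the in-memory dict are illustrative stand-ins for a real framework and a real database service:

    from flask import Flask, jsonify, request

    app = Flask(__name__)
    records = {}  # in-memory stand-in for a real database service

    @app.route("/records/<record_id>", methods=["PUT"])
    def save_record(record_id):
        # App1 would call this after pulling a batch of tweets.
        records[record_id] = request.get_json()
        return jsonify(status="stored"), 201

    @app.route("/records/<record_id>", methods=["GET"])
    def load_record(record_id):
        # App2 would call this instead of reading App1's /data folder.
        return jsonify(records.get(record_id, {}))

    if __name__ == "__main__":
        app.run(port=8080)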

Related

Synchronise users from PeopleSoft HRMS to an external application

I have an enterprise application which needs to synchronise user information from a centralised source. So far we have been integrating with AD over LDAP using a daemon process.
However, in our next deployment we need to integrate with PeopleSoft HRMS (9.1). The application needs to periodically synchronise users with the PeopleSoft HRMS.
I wanted to check how to proceed with implementing this.
Is there a standard module which would expose these details, or does it allow LDAP communication?
Any direction on how to consume user records will be helpful.
Web services can be implemented with Integration Broker: https://docs.oracle.com/cd/E41633_01/pt853pbh1/eng/pt/tibr/concept_IntroductiontoPeopleSoftIntegrationBroker-076593.html
A more low-level approach could be done with an Application Engine.
Your enterprise application could generate an XML/CSV file.
You would create a record in PeopleTools that corresponds to the fields in the XML/CSV file. Then you create a File Layout. If you drag this File Layout into the Application Engine PeopleCode window, you get a template of what your code should be, and you'd have to complete it with some paths to files and minimal logic to process and import the data into your user tables.
Remember that you can schedule Application Engines with recurrence, so after setting this up all you need to worry about is that the file gets updated.
If you require validation you should also look into feeding the data to a Component Interface after reading it in via Application Engine.
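On the enterprise-application side, producing that flat file is the easy part. A minimal sketch in Python; the column names and file name are hypothetical and would have to match the record and File Layout you define in PeopleTools:

    import csv

    # Hypothetical user records pulled from the enterprise application's own store.
    users = [
        {"user_id": "jdoe", "last_name": "Doe", "first_name": "Jane", "email": "jdoe@example.org"},
        {"user_id": "asmith", "last_name": "Smith", "first_name": "Alan", "email": "asmith@example.org"},
    ]

    # The column order must match the File Layout you build in PeopleTools.
    fieldnames = ["user_id", "last_name", "first_name", "email"]

    with open("users_feed.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(users)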

Do I need a Storage Controller for all my model classes in the app to use azure file sync?

Background:
Xamarin.Forms client app
Azure backend with .NET
Using Azure offline data sync
Trying to use Azure offline file sync
Related SO questions
There are two more questions I asked here which are somewhat related:
Getting a 404 while using Azure File Sync
Getting a 500 while using Azure File Sync
Solution
As stated in the first link above, I had to create a storage controller for the User entity to be able to log in successfully, even though I do not intend to use files for users.
As I work further in the app, I am still getting more 404 errors, as I can see in Fiddler. These are similar calls which are trying to access an API like the one below:
GET /tables/{EntityName}/{Id}/MobileServiceFiles HTTP/1.1
My Question Now
Do I need a storage controller for every entity I have in my solution? Maybe every entity that inherits from EntityData?
Is there a way I can selectively tell the system which entities are going to work with files and have storage controllers only for them? Like, maybe, marking them with some attribute?
Reference
I am using this blog post to implement Azure File Sync in my app.
To answer my own query (and it is not the answer I wanted to hear): yes. We need a storage controller for all entities, even if they don't have any files to be stored in the storage account. This is a limitation.
I found this info in the comments of the original blog post I was following (I wish I had done that earlier). To quote the author:
Donna Malayeri [donnam@MSFT]:
It's a limitation of the current storage SDK that you can't specify which tables have files. See this GitHub issue: https://github.com/Azure/azure...
As a workaround, you have to make your own file sync trigger factory.
Here's a sample: https://github.com/azure-appse...
The reason the SDK calls Get/Delete for files in the storage controller is because the server manages the mapping from record to container or blob name. You wouldn't necessarily want to give the client access to the blob account to access arbitrary files or containers, for instance. In the case of delete, the server doesn't even need to give out a SAS token with delete permissions, since it can just authenticate the user and do the delete itself.

How to sync data between a company's internal database and an externally hosted application's database

My organisation (a small non-profit) currently has an internal production .NET system with SQL Server database. The customers (all local to our area) submit requests manually that our office staff then input into the system.
We are now gearing up towards online public access, so that the customers will be able to see the status of their existing requests online, and in future also be able to create new requests online. A new asp.net application will be developed for the same.
We are trying to decide whether to host this application on-site on our servers (with direct access to the existing database) or use an external hosting service provider.
Hosting externally would mean keeping a copy of Requests database on the hosting provider's server. What would be the recommended way to then keep the requests data synced real-time between the hosted database and our existing production database?
Trying to sync back and forth between two in-use databases will be a constant headache. The question I would have to ask you is: if you have the means to host the application on-site, why wouldn't you go that route?
If you have a good reason not to host on-site but you do have some web infrastructure available to you, you may want to consider creating a web service which provides access to your database via a set of well-defined methods. Or, on the flip side, you could make the remotely hosted database that sits with your website your production database and use a web service to access it from your office system.
In either case, providing access to a single database will be much easier than trying to keep two different ones constantly and flawlessly in sync.
If a web service is not practical (or you have concerns about availability) you may want to consider a queuing system for synchronization. Any change to the DB (local or hosted) is also added to a messaging queue. Each side monitors the queue for changes that need to be made and then applies them. This would account for one of the databases not being available at any given time.
That being said, I agree with @LeviBotelho: syncing two DBs is a nightmare and should probably be avoided if you can. If you must, you can also look into SQL Server replication.
Ultimately the data is the same: customer-submitted data. Currently it is entered by them through you; ultimately it will be entered directly by them. I see no need for two different databases with the same data. The replication errors alone, when they pop up (and they will), will be a headache for your team for nothing.

Designing a SQL Server database to be used in a shared hosting environment

I've always personally used dedicated servers and VPSes, so I have full control over my SQL Server (using 2008 R2). Now I'm working on an ASP.NET project that could be deployed in a shared hosting environment, which I have little experience with. My question is: are there limitations on the features of SQL Server I can use in a shared environment?
For example, if I design my database to use views, stored procedures, user defined functions and triggers, will my end user be able to use them in shared hosting? Do hosts typically provide access to these and are they difficult to use?
If so, I assume the host will give a user his login, and he can use tools like Management Studio to operate within his own DB as if it were his own server? If I provide scripts to install these, will they run under the user's credentials within his database?
All database objects are available. This includes tables, views, stored procedures, functions, keys, certificates...
Usually CLR and full-text search (FTS) are disabled.
Lastly, you will not be able to access most of the server-level objects (logins, server triggers, backup devices, linked servers, etc.).
SQL Mail, Reporting Services are often turned off too.
It depends on how the other users are authenticated to the database, if it is one shared database for all users.
If every user on the host will receive their own DB:
If your scripts are written in a generic way (not bound to fixed usernames, for example), other users will be able to execute them on their database and will have the same functionality (right-click the DB and choose Tasks -> Back Up, for example).
You could also provide plain backup dumps of a freshly set-up database so that, for other users, the setup is only one click away. Also, from the beginning you should think about how to roll out changes that need to affect every user.
One possible approach is to always supply delta scripts, no matter whether you are patching errors or adding new things (a small runner for such scripts is sketched below).
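One way to keep the delta-script approach manageable is a tiny runner that records which scripts have already been applied in each customer database and applies the rest in order. This is only a sketch: sqlite3 stands in for a SQL Server connection (pyodbc or similar in practice), and the folder and table names are made up:

    import os
    import sqlite3

    DELTA_DIR = "deltas"  # e.g. deltas/001_add_status_column.sql, deltas/002_...

    def apply_deltas(conn):
        # Track which delta scripts this database has already seen.
        conn.execute("CREATE TABLE IF NOT EXISTS schema_version (script TEXT PRIMARY KEY)")
        applied = {row[0] for row in conn.execute("SELECT script FROM schema_version")}

        for name in sorted(os.listdir(DELTA_DIR)):  # numeric prefixes define the order
            if name in applied or not name.endswith(".sql"):
                continue
            with open(os.path.join(DELTA_DIR, name)) as f:
                conn.executescript(f.read())
            conn.execute("INSERT INTO schema_version (script) VALUES (?)", (name,))
            conn.commit()
            print("applied", name)

    apply_deltas(sqlite3.connect("customer.db"))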

Moving a ASP.NET application to the cloud

I am new to cloud computing, so please bear with me here. I have an existing ASP.NET application with SQL Server 2008 hosted on a Virtual Private Server. Here's what it briefly does:
The front end accepts user's requests and adds them to a DB table
A Windows Service running in the background picks up the request, processes it and sets a flag.
The Windows service also creates a file for the user to download.
User downloads file
I'd like to move this web application, along with the service, to the cloud. The architecture I envision is that I'll have one web server on which I will install the front end and the Windows service. I'll also have a cloud file server for file storage. The Windows service should somehow create a file and transfer it to the cloud file server (I assume this is possible?).
My questions:
Does the architecture look like I am going in the right direction?
I know Amazon has been providing cloud services for a long time. If I want to make minimal changes to my application, should I go with Amazon, Rackspace, Azure or some other provider?
I understand that I would not only pay for file storage and the web server but also for the bandwidth of users downloading the file and the Windows service uploading the file to the cloud server. Can I assume these costs are negligible? Should I go with a VPS + Cloud Files combination to begin with?
Any other thoughts/suggestions?
@user102533,
The scenario you describe is very close to the one we cover in this guide. You can also download the documents here.
The web site should be fairly straightforward to move. The key things to consider:
- Authentication
- Session management
- Bandwidth use and latency considerations (e.g. big ViewState, etc)
The Windows Service will have to be refactored into a "Worker". This is covered in the guide above with more detail for very similar purposes.
The guide comes with full samples showing how to do it.
Hope it helps
Eugenio
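On the specific point of the worker creating a file and pushing it to cloud file storage: every major provider covers this with a small SDK call. Purely as an illustration, this is roughly what the upload could look like against Amazon S3 with boto3 (Azure Blob Storage and Rackspace Cloud Files have equivalent client libraries); the bucket and key names are hypothetical:

    import boto3  # credentials come from environment variables or ~/.aws/credentials

    s3 = boto3.client("s3")

    def publish_result(local_path, request_id):
        """Called by the worker after it finishes processing a request."""
        key = "results/{}.zip".format(request_id)  # hypothetical object key
        s3.upload_file(local_path, "my-results-bucket", key)
        # Hand the user a time-limited download link instead of serving the file yourself.
        return s3.generate_presigned_url(
            "get_object",
            Params={"Bucket": "my-results-bucket", "Key": key},
            ExpiresIn=3600,
        )

    # Example: url = publish_result("/tmp/report-42.zip", 42)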

Resources