Does anyone know how I can extract DocumentDB data to an external backup tool? I mean using dumps, ad-hoc tools, or an API.
One possible way to do this is to query your container to list all of its documents and save them locally as JSON files. You can use any available Cosmos DB SDK to do so.
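If it helps, here is a minimal sketch of that approach using the Python SDK (azure-cosmos); the endpoint, key, database, container, and output folder names are placeholders you would replace with your own:

```python
# Minimal sketch: dump every document in a Cosmos DB container to local JSON files.
# The endpoint, key, and database/container names below are placeholders.
import json
import os

from azure.cosmos import CosmosClient

ENDPOINT = "https://<your-account>.documents.azure.com:443/"  # placeholder
KEY = "<your-account-key>"                                    # placeholder

client = CosmosClient(ENDPOINT, credential=KEY)
container = client.get_database_client("mydb").get_container_client("mycontainer")

os.makedirs("backup", exist_ok=True)
for doc in container.read_all_items():
    # Use the document id as the file name; adjust if your ids are not file-safe.
    with open(os.path.join("backup", f"{doc['id']}.json"), "w") as f:
        json.dump(doc, f, indent=2)
```

The same loop works with any of the other SDKs; the key point is simply enumerating the container and persisting each document.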
If you're looking for a tool to do that, you may want to take a look at Cerebrata Cerulean (disclosure: my company is behind this tool). It has a feature to download documents from a container to your local computer; you can download all documents in a container, or only the documents matching a query.
Are there scripts for exporting and importing all Apigee Edge objects, such as developers, users, apps, caches, key value maps, etc?
To clarify, it would be nice to have the non-runtime objects as the priority, rather than the runtime data contained within them. E.g., the current contents of the caches are not as critical as just having the cache objects available.
I have released a tool that can be used to retrieve Apigee organization settings. This tool has been in use internally at Apigee for some time, but this is the first time it has been released to the public. It uses the Apigee management API to pull configuration data, and the set of data to be pulled is configurable. The data is stored in a hierarchical directory structure, which can be archived, explored, or used to compare organizations. It works with both the Apigee Edge cloud and on-prem offerings.
A few caveats:
This tool does not retrieve all data from an organization. For example, it does not retrieve API proxies. Use the Apigee management UI or management API to retrieve API proxies.
The tool is composed of a few bash scripts. It has been successfully run on Linux and Mac OS X.
The tool does not write data back into the organization, although the files it retrieves can often be POSTed back to the organization using the management API.
This tool is released as-is. It is not officially supported by Apigee.
Find the tool at the api-platform samples site (https://github.com/apigee/api-platform-samples) in the tools/org-snapshot directory.
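If you only need a handful of object types and don't want to run the full tool, the idea behind it can be sketched with direct calls to the Edge management API. The following is a rough, unofficial sketch; the organization name, credentials, and the resource list are placeholders:

```python
# Rough sketch: list a few org-level object collections via the Apigee Edge
# management API and save them into a directory tree, similar in spirit to the
# org-snapshot tool. The org name, credentials, and resource list are placeholders.
import json
import os

import requests

BASE = "https://api.enterprise.apigee.com/v1/organizations/myorg"  # placeholder org
AUTH = ("user@example.com", "password")                            # placeholder credentials

for resource in ("developers", "apiproducts", "apps", "keyvaluemaps", "caches"):
    resp = requests.get(f"{BASE}/{resource}", auth=AUTH)
    resp.raise_for_status()
    out_dir = os.path.join("org-snapshot", resource)
    os.makedirs(out_dir, exist_ok=True)
    with open(os.path.join(out_dir, "list.json"), "w") as f:
        json.dump(resp.json(), f, indent=2)
```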
There is work planned to provide a tool that will export/import provisioning data (such as apps, developers, and products). Other aspects of an org's configuration require access to the production Cassandra database, which cannot be given out publicly. We have a provisional tool for in-house use that we are currently hardening. If the consumer tool (when it is available) doesn't provide all of the backup support you need, you will need to log a support ticket and have Apigee run the in-house tool.
There are scripts for importing a set of objects (developers, apps, API products) that work with the sample proxies that you can find on GitHub:
https://github.com/apigee/api-platform-samples/tree/master/setup
For Perl programmers: see also Apigee::Edge on CPAN
I am working on a LiveCode application in which I need to use a cloud-based SQLite database. I don't have much knowledge about cloud-based SQLite or how to implement it in a LiveCode application. Could anyone explain what it is and how I can use a cloud-based SQLite database with LiveCode?
Thanks
An SQLite database is just a file which resides in the file system of the device. So each device will have its own database with its own data. If you want to store data in the cloud you have to do something on the server side.
If you want a solution on the server, you might go for a PHP script; PHP has SQLite access built in. However, you can use other scripting languages as well.
Or, along another line, there are services such as https://cloudant.com/. There the data is not stored in relational tables but as JSON objects, and access is likewise over the HTTP protocol (RESTful).
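To make the server-side idea concrete, here is a very small sketch (in Python rather than PHP, but the shape is the same): a script on the server opens the SQLite file and answers HTTP requests with JSON, which a LiveCode app can then fetch over the network. The table and column names are made up for illustration.

```python
# Minimal sketch: an HTTP endpoint on the server that reads a SQLite file
# and returns rows as JSON. The table and column names are invented for illustration.
import sqlite3

from flask import Flask, jsonify

app = Flask(__name__)
DB_PATH = "data.db"  # the SQLite file lives on the server, not on the device


@app.route("/notes")
def list_notes():
    conn = sqlite3.connect(DB_PATH)
    conn.row_factory = sqlite3.Row
    rows = conn.execute("SELECT id, title, body FROM notes").fetchall()
    conn.close()
    return jsonify([dict(r) for r in rows])


if __name__ == "__main__":
    app.run()
```

On the LiveCode side the client call is then just something like put URL "http://yourserver/notes" into tJSON, after which you parse the JSON as usual.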
Related question
See also: How to retrieve data from a server
Suggestion
Please do not forget to use the search box of this site. For example, searching for "sqlite cloud" gives you https://stackoverflow.com/search?q=sqlite+cloud, whose first answer is "Can I use the SQLite as a db storage for cloud-based websites?"
So your question needs to be more specific.
I am new to Windows Azure. I've created a simple HelloWorld ASP.NET Azure application and published it. I know I can republish the whole application in Visual Studio by right-clicking the project and choosing Publish, but is it possible to update only one file (an aspx page, a picture, etc.)?
Thanks!
Regards, Alexander.
I think if you're just learning Windows Azure, the most helpful answer is "You can't." The way Windows Azure works is that to update an application, you create the full package and deploy it again.
This isn't to say that David's answer isn't also correct. I just wanted to directly answer the question of "How do I change just one file after I deploy?"
If you want to update individual files such as images, one thing you can do is store all images (and CSS, JavaScript, and any other static content) in blob storage. This has several advantages:
Easy to upload new files individually, with both free and paid tools. For instance, CloudBerry Explorer is a free app and Cerebrata Cloud Storage Studio is a paid one; both let you manage containers and blobs individually (a programmatic upload sketch follows this list).
Smaller deployment package, because you've removed images and other large files
Less load on IIS, since image requests go directly to blob storage, not to your role instances
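For illustration, uploading a single static file programmatically is only a few lines with a storage SDK. The sketch below uses the current Python package (azure-storage-blob); the connection string, container name, and file names are placeholders:

```python
# Sketch: push a single static file (e.g. an image) to blob storage without redeploying.
# The connection string, container name, and file paths are placeholders.
from azure.storage.blob import BlobServiceClient, ContentSettings

service = BlobServiceClient.from_connection_string("<storage-connection-string>")
container = service.get_container_client("static")  # assumed to be a public-read container

with open("logo.png", "rb") as data:
    container.upload_blob(
        name="images/logo.png",
        data=data,
        overwrite=True,
        content_settings=ContentSettings(content_type="image/png"),
    )
# The file is then served from https://<account>.blob.core.windows.net/static/images/logo.png
```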
You can't store your aspx files in blobs, though you can store static content like HTML there. To update aspx files, you're basically updating the deployment. You can now do this as an "upgrade," which doesn't disrupt your IP address and, if you have multiple instances, doesn't take down your service during the upgrade.
You can either use Web Deploy (which should do a selective update of all files) or connect via Remote Desktop and update certain files yourself.
As the comment and MSDN say, neither of these two approaches is recommended or usable for production deployments; they are only meant as shortcuts for certain development scenarios.
We have a client requirement to upload documents (Word documents and possibly PDFs) to our Azure-hosted application and have full text search on the documents.
My understanding is that SQL Azure doesn't support full text indexing so I can't just store them in the DB.
Has anyone done anything similar? If so, how? Are there any NuGet packages or things I can install into the Azure role when I create it? Is blob storage searchable/indexable?
Any ideas?
I would suggest using Lucene.NET along with your data. Take a look at:
http://code.msdn.microsoft.com/windowsazure/Azure-Library-for-83562538
If you are doing this now and you are using Azure, it is best to combine this with the Azure Search service. It has a rich feature set, and you can add text and metadata to allow fast searching. It also has options for indexing blob storage directly (blob indexing).
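As a rough sketch of the Azure Search route, querying an existing index that holds the extracted document text looks like this with the current Python client (azure-search-documents); the endpoint, key, index name, and field names are placeholders:

```python
# Sketch: query an existing Azure Search index holding the extracted document text.
# The endpoint, API key, index name, and field names are placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

client = SearchClient(
    endpoint="https://<your-service>.search.windows.net",
    index_name="documents",
    credential=AzureKeyCredential("<query-api-key>"),
)

# Full text search across the indexed content; 'title' is an assumed field.
for result in client.search("contract renewal", select=["title"], top=10):
    print(result["title"], result["@search.score"])
```

Populating the index (either by pushing extracted text yourself or by pointing a blob indexer at your storage container) is a separate step, but once it is in place, queries stay this simple.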
I maintain a web application (ASP.NET/IIS7/SQL2K8/Win2K8) that needs to access documents, actually hundreds of thousands of documents, and growing. Currently, they are all on a Windows 2K8 Server fileshare, being accessed by UNC path (SMB). The files are in a single flat directory and I'm trying to plan how to best improve this solution. I don't want to use the SQL Filestream attribute as it would be significant effort to migrate it all into that, and would really lock in to SQL Server. I also need to find a way to replicate the data for disaster recovery, so perhaps a solution can help with that too.
Options could be:
- Segment files into multiple directories? The application would add metadata for which directory a file is in (or segment by other means).
- Segment files onto separate servers (virtualize)? Backup becomes more complicated, and the application would add metadata for which server a file is on.
- NAS storage
- SAN storage
- Put a service (WCF) in front of the files and have the app talk to the service, with the bonus of being reusable across many applications.
Assuming I'm going to store on the filesystem and not in the database (I've read those discussions here), which would be a more scalable solution?
You've got a couple issues:
- managing a large volume of (static?) files
- preparing for backups and disaster recovery of said files
I'll throw this out there even though I'm not a fan of the answer: you might poke around with the free SharePoint 2010 Foundation that's included with Server 2K8. If you're having issues finding the documents you need (whether by search, taxonomy via tagging, or other metadata) or with document expiration, and you don't want to buy a full-blown document management system, this might be a solution. Of course it introduces new problems...
If your only desire is to have these files available to serve up on the web, then the file store like you're using now really is the simplest solution. For DR/redundancy purposes, I'd look at a) running them on a RAID/SAN of some sort and b) auto-syncing them with the cloud (either Azure or Amazon). For b), you can get apps that make the cloud appear as a mapped drive and then use rsync-style software to keep the cloud copy up to date.
If you want to build something new and cool, you might think about moving the entire file archive into the cloud and just writing a table in a db to track the file name, old location, and new cloud location, plus a redirector that can provide access tokens to requestors.
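In Azure terms, that redirector boils down to looking up the blob for a given file name and handing back a short-lived SAS URL. A rough sketch follows; the account name, key, and container are placeholders, and the name-to-blob lookup is assumed to come from your db table:

```python
# Sketch: given a blob name, return a short-lived, read-only download URL (SAS).
# The account name/key and container are placeholders; the db lookup that maps
# file names to blob names is assumed to happen before this is called.
from datetime import datetime, timedelta

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

ACCOUNT = "mystorageaccount"    # placeholder
ACCOUNT_KEY = "<account-key>"   # placeholder
CONTAINER = "documents"         # placeholder


def get_download_url(blob_name: str, minutes: int = 15) -> str:
    """Return a read-only URL that expires after `minutes`."""
    sas = generate_blob_sas(
        account_name=ACCOUNT,
        container_name=CONTAINER,
        blob_name=blob_name,
        account_key=ACCOUNT_KEY,
        permission=BlobSasPermissions(read=True),
        expiry=datetime.utcnow() + timedelta(minutes=minutes),
    )
    return f"https://{ACCOUNT}.blob.core.windows.net/{CONTAINER}/{blob_name}?{sas}"
```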
3 different approaches... your choice.