We have a Drupal site and we wish to export data from it in the form of several CSV files. I'm aware of the Views module add-ons that make this a simple process on demand, but what we're looking for is a way to automate this process through cron.
Most likely, we'll end up writing either a standalone PHP file we can then hit with cron to complete this action, or a custom module.
I first wanted to check that there isn't already a module or set of modules out there that will do what we're looking for. How would you accomplish this?
The end result is that these CSV files will reside on the server for other services to pick up and import into their own systems, or be distributed with rsync or something similar.
Best practices suggestions would also be appreciated!
If you want to do this with cron:
Set up views that output CSV data.
Then add wget <path to your CSV view>, or the path of a script that does everything you need, to your crontab.
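For example, a crontab entry along these lines would fetch the CSV view every night and leave the file where other services can pick it up (the URL and output path are placeholders for your own site):

# Runs at 02:00 every day; wget saves the view's CSV output to a fixed path.
0 2 * * * wget -q -O /var/exports/orders.csv "https://example.com/export/orders.csv"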
I want to uninstall Alfresco and re-install it, but I don't want to lose the accounts created in Alfresco and everything else. Is there any way to save this?
Thanks!!
Let's assume that by re-installing you mean you want to start with a clean alfresco WAR and share WAR. If so, you can just shut down, remove the alfresco and share directories, then place the clean WARs in the $TOMCAT_HOME/webapps directory. If you had any AMPs deployed, use MMT to re-install those. Then, restart Tomcat. The content is in the content store directory and the metadata is in the database, so you can start over with fresh WARs without losing any data.
If you mean you want to delete everything in $ALFRESCO_HOME but you want to save your data, the easiest thing to do is to dump your database and tar up your content store. Then you can completely blow away your installation, and after reinstalling, you can load your database and un-tar your content store.
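In shell terms that backup is roughly the following, assuming a MySQL-backed install and the default alf_data content store location (adjust names and paths for your own setup):

# Dump the metadata database and archive the content store before removing the install.
mysqldump -u alfresco -p alfresco > alfresco-db.sql
tar czf alf_data-backup.tar.gz "$ALFRESCO_HOME/alf_data"

# After reinstalling, load the dump and unpack the content store again.
mysql -u alfresco -p alfresco < alfresco-db.sql
tar xzf alf_data-backup.tar.gz -C /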
If you are trying to blow away some, but not all of your data, you'll have to export what you want to keep. You might go look at the docs on ACP files as one possible approach. Or you could use something like CMIS to write an exporter.
For users, specifically, it is often helpful to use an LDAP directory instead of managing them in Alfresco directly. That way you can completely start over and Alfresco will simply synchronize your users from LDAP.
If you don't want to use LDAP, an alternative is to keep a simple CSV file with user account data. After starting your repository for the first time you can re-import your users from the CSV file.
We have several databases that we access, and different scripts require one or more of these ODBC connections. I wanted to know whether, instead of putting an odbcConnect line in every script, there is a way to store all the connections centrally and import them as needed. That way, if the database info changes, I can update one file instead of every script.
You could use options in your .Rprofile file to store the connection details. In your scripts, you would then read them back with getOption.
There is also an Rprofile.site file, which might be a better choice if you are working in a team with several R installations.
See here or the R Installation and Administration manual for more information.
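A minimal sketch of that approach (the option name and connection details below are made up for illustration):

# In ~/.Rprofile or Rprofile.site: store the connection details once.
options(dsn.sales = list(dsn = "SalesDB", uid = "report_user", pwd = "secret"))

# In any script that needs the database:
library(RODBC)
info <- getOption("dsn.sales")
ch <- odbcConnect(info$dsn, uid = info$uid, pwd = info$pwd)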
What I ended up doing was creating a package with each connection as a function.
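Something along these lines, where each exported function wraps one connection (the names and credentials are illustrative):

# In the package: one function per database.
connect_sales <- function() {
  RODBC::odbcConnect("SalesDB", uid = "report_user", pwd = "secret")
}

Scripts then just call connect_sales(), and the connection details live in a single place.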
App Services is a great place to store data, but now that I have a lot of critical info in there I realized there isn't a way to create a backup or roll back to an earlier state (in case I did something stupid like -X DELETE /users).
Any way to back up this data either online or offline?
Apart from using the API to fetch records in batches and store them locally, there is no solution at the moment. The team is planning an S3 integration (export data to S3), but no completion date is defined for that yet.
It looks like the only way is to query the data with, for example, curl and save the results to a local file. I don't believe there is a way to export natively.
http://apigee.com/docs/app-services/content/working-queries
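A rough sketch of that approach (the org, app, collection, and token are placeholders for your own values):

# Fetch a collection over the API and save the JSON response locally.
curl -o users.json \
  "https://api.usergrid.com/my-org/my-app/users?limit=1000&access_token=YOUR_TOKEN"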
From the 2014/2015 Usergrid versions onwards it is possible to make exports and imports using the "Usergrid tools".
This page explains how to install them:
https://github.com/apache/incubator-usergrid/tree/master/stack/tools
Basically, once you run
$ java -jar usergrid-tools.jar export
your data will be exported as JSON files in an export directory.
There are several export and import tools available; the best way to see them is to visit this page:
https://github.com/apache/incubator-usergrid/tree/6d962b7fe1cd5b47896ca16c0d0b9a297df45a54/stack/tools/src/main/java/org/apache/usergrid/tools
I've created, or rather copied from a number of sources, the necessary pieces to build my own bash script that will install WP via SSH. I'm not a coder at all, so many things will go over my head, but my patience and desire to learn will compensate.
I've created a bash script to download WordPress, create a new database (with the correct permissions), rename wp-config-sample.php to wp-config.php, and then open it with nano. This is where I'm stuck. I'd like to take the details from the newly created database and have them inserted automatically into the correct places within the wp-config.php file. Then I'd like to visit this URL (https://api.wordpress.org/secret-key/1.1/salt/), take those salt keys, save them in the same wp-config.php file, then save and quit.
Please help!
I don't think nano is going to be scriptable in the manner you need.
You need to look into templates. The general idea is to put placeholders like %DB_NAME% and %DB_USER% inside a template file like wp-config-template.php. Then you can use something like sed to search and replace these placeholders with real values and generate the final wp-config.php.
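A minimal sketch of that idea (the placeholder names and values are only examples; in your script they would come from the database you just created):

# Fill the placeholders in the template and write out the real config file.
sed -e "s/%DB_NAME%/my_wp_db/" \
    -e "s/%DB_USER%/my_wp_user/" \
    -e "s/%DB_PASSWORD%/s3cr3t/" \
    wp-config-template.php > wp-config.php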
However, doing this in sed will get messy pretty quickly. You are better off using a scripting language like Perl or Python once you start doing templating; they will do the heavy lifting for you.
Finally, not to discourage your efforts, but building a server the way you are is tricky, because there are a number of things that could go wrong. You also need to be idempotent, i.e. able to rerun your entire set of scripts with only the necessary changes being applied, so that if you decide to add or remove packages the rest of the system remains untouched.
If you are looking for a good guideline containing some of the best tools to do this, you should check out the Genesis-WordPress project. It uses Ansible for provisioning and Capistrano for deployment.
Good Luck!
If you want a disgusting but functional example of how to change wp-config.php from environment variables in a bash script, check out the "official" WordPress scripts for creating a Docker image.
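A stripped-down sketch of that kind of substitution (WORDPRESS_DB_NAME and WORDPRESS_DB_USER are assumed environment variable names; database_name_here and username_here are the stock placeholders in wp-config-sample.php):

# Replace the sample placeholders with values taken from the environment.
sed -i -e "s/database_name_here/${WORDPRESS_DB_NAME}/" \
       -e "s/username_here/${WORDPRESS_DB_USER}/" \
       wp-config.php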
I have a website on which I have published several of my applications.
Right now I have to update it each time one of the applications is updated.
The applications themselves check for updates so the user only visits the website if they don't have a previous version installed.
I would like to make it easier for myself by creating a single executable that, when downloaded and executed, checks with the database which version is the most recent and then downloads and runs that setup.
Now I could make a downloader for each application, but I'd rather make something more universal, with a parameter or argument as the only difference.
For the downloader to 'know' which database to check for the most recent version, I need to pass that data to the downloader.
My first thought was putting that in an XML file, so I would only have to generate a different XML file for each application, but then it wouldn't be a single executable anymore.
My second thought was using command-line arguments, like: downloader.exe databasename
But how would I do that when the file is downloaded?
Would a link like: "https://my.website.com/downloader.exe databasename" work?
How could I best do this?
Regards,
Eric