Save profiles / files in Alfresco

I want to uninstall Alfresco and re-install it again, but I don't want to lose the accounts created in Alfresco and the other things. Is there any way to save this?
Thanks!!

Let's assume that by re-installing you mean you want to start with a clean alfresco WAR and share WAR. If so, you can just shut down, remove the alfresco and share directories, then place the clean WARs in the $TOMCAT_HOME/webapps directory. If you had any AMPs deployed, use MMT to re-install those. Then, restart Tomcat. The content is in the content store directory and the metadata is in the database, so you can start over with fresh WARs without losing any data.
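For illustration, here is a minimal sketch of that sequence as a script, assuming a standard Tomcat layout; the paths and the MMT jar location are assumptions you would adapt to your own installation:

```python
# Sketch of the "fresh WARs, keep your data" redeploy described above.
# All paths and the MMT jar location are assumptions -- adjust to your setup.
import shutil
import subprocess
from pathlib import Path

TOMCAT = Path("/opt/alfresco/tomcat")           # $TOMCAT_HOME (assumption)
CLEAN_WARS = Path("/opt/alfresco/clean-wars")   # clean alfresco.war / share.war
AMPS = Path("/opt/alfresco/amps")               # repo AMPs to re-apply, if any
MMT = Path("/opt/alfresco/bin/alfresco-mmt.jar")

# 1. Remove the exploded webapps and old WARs (Tomcat must be stopped first).
for app in ("alfresco", "share"):
    shutil.rmtree(TOMCAT / "webapps" / app, ignore_errors=True)
    (TOMCAT / "webapps" / f"{app}.war").unlink(missing_ok=True)

# 2. Drop in the clean WARs.
for war in ("alfresco.war", "share.war"):
    shutil.copy2(CLEAN_WARS / war, TOMCAT / "webapps" / war)

# 3. Re-apply any repository AMPs with the Module Management Tool
#    (Share AMPs would target share.war instead).
for amp in sorted(AMPS.glob("*.amp")):
    subprocess.run(
        ["java", "-jar", str(MMT), "install", str(amp),
         str(TOMCAT / "webapps" / "alfresco.war"), "-force"],
        check=True,
    )

# 4. Restart Tomcat; the content store and database are untouched.
```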
If you mean you want to delete everything in $ALFRESCO_HOME but you want to save your data, the easiest thing to do is to dump your database and tar up your content store. Then you can completely blow away your installation, and after reinstalling, you can load your database and un-tar your content store.
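A rough sketch of that backup step, assuming a PostgreSQL database named alfresco and the default alf_data directory names (swap in mysqldump and your own paths as needed):

```python
# Sketch: dump the database and archive the content store before wiping
# $ALFRESCO_HOME. Database name, user and paths are assumptions.
import subprocess
import tarfile
from pathlib import Path

ALF_DATA = Path("/opt/alfresco/alf_data")   # assumption
BACKUP = Path("/backups")
BACKUP.mkdir(parents=True, exist_ok=True)

# Database dump (use mysqldump instead if you run MySQL/MariaDB).
with open(BACKUP / "alfresco.sql", "wb") as out:
    subprocess.run(["pg_dump", "-U", "alfresco", "alfresco"],
                   stdout=out, check=True)

# Content store archive (contentstore.deleted is included if present).
with tarfile.open(BACKUP / "contentstore.tar.gz", "w:gz") as tar:
    for name in ("contentstore", "contentstore.deleted"):
        if (ALF_DATA / name).exists():
            tar.add(ALF_DATA / name, arcname=name)
```

Restoring is the reverse: load the SQL dump into a fresh database and un-tar the content store back into alf_data before starting the repository.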
If you are trying to blow away some, but not all of your data, you'll have to export what you want to keep. You might go look at the docs on ACP files as one possible approach. Or you could use something like CMIS to write an exporter.
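If you go the CMIS route, a very rough sketch of an exporter using the cmislib Python client could look like the following; the CMIS URL, credentials and folder path are assumptions, and a real exporter would recurse into sub-folders and preserve metadata:

```python
# Rough sketch of a CMIS exporter with cmislib (Apache Chemistry's Python
# CMIS client): walk one folder and save each document's content to disk.
# The CMIS URL, credentials and folder path are assumptions.
from cmislib import CmisClient

client = CmisClient(
    "http://localhost:8080/alfresco/api/-default-/public/cmis/versions/1.1/atom",
    "admin", "admin",
)
repo = client.defaultRepository
folder = repo.getObjectByPath("/Sites/my-site/documentLibrary")  # assumption

for child in folder.getChildren():
    if child.properties.get("cmis:baseTypeId") == "cmis:document":
        with open(child.getName(), "wb") as out:
            out.write(child.getContentStream().read())
        print("exported", child.getName())
```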
For users, specifically, it is often helpful to use an LDAP directory instead of managing them in Alfresco directly. That way you can completely start over and Alfresco will simply synchronize your users from LDAP.
If you don't want to use LDAP, an alternative is to keep a simple CSV file with the user account data. After starting your repository for the first time, you can re-import your users from the CSV file.
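A minimal sketch of such a re-import, using the v1 REST API's people endpoint available in recent Alfresco versions; the CSV column names, admin credentials and base URL are assumptions, and the third-party requests package is required:

```python
# Sketch: re-create users from a CSV file through the Alfresco REST API
# (POST to the v1 /people endpoint). CSV columns and credentials are
# assumptions. Requires: pip install requests
import csv
import requests

BASE = "http://localhost:8080/alfresco/api/-default-/public/alfresco/versions/1"
ADMIN = ("admin", "admin")  # admin credentials (assumption)

with open("users.csv", newline="") as f:
    # expected columns: username,firstName,lastName,email,password (assumption)
    for row in csv.DictReader(f):
        payload = {
            "id": row["username"],
            "firstName": row["firstName"],
            "lastName": row["lastName"],
            "email": row["email"],
            "password": row["password"],
        }
        r = requests.post(f"{BASE}/people", json=payload, auth=ADMIN)
        r.raise_for_status()
        print("created", row["username"])
```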

Related

Alfresco content store deletion

I have the content store configured in the location below:
D:\alfresco-content-services\alf_data\contentstore\2019
I want to delete the 2019 folder shown above under the content store; I don't need 2019 anymore. Basically, purging.
If I delete the files in that folder, will it clean up the metadata and indexes as well, or will it corrupt my repository? What's the best way to achieve a mass deletion that also removes the references in the database without corrupting the repo?
Thanks & Regards
Brijesh
If you delete any folder from the content store, it will not affect the database (and hence the indexes) in any way. You will end up with nodes referencing .bin files that do not exist anymore, though.
Note that if that folder is the first year in your content store, it also contains some files Alfresco uses to determine whether the content store matches the database. In that case, deleting the folder will break Alfresco (the repository will not start if it cannot find those files).
Mass deletion in general is tricky; I'd suggest using the Bulk Import Tool's delete web script, which does this as fast as possible (it avoids audit logs, the recycle bin, etc.).
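To gauge the impact before touching the filesystem, one option is to count how many content URLs in the database point into that folder. A minimal sketch, assuming a PostgreSQL database and the default alf_content_url table of a standard Alfresco schema (psycopg2 is a third-party package, and credentials are placeholders):

```python
# Sketch: count content URLs that would become dangling if the 2019 folder
# were deleted. Table/column names are from a default Alfresco schema.
# Requires: pip install psycopg2-binary
import psycopg2

conn = psycopg2.connect(dbname="alfresco", user="alfresco", password="alfresco")
with conn, conn.cursor() as cur:
    cur.execute(
        "SELECT count(*) FROM alf_content_url WHERE content_url LIKE %s",
        ("store://2019/%",),   # content URLs are stored as store://YYYY/...
    )
    print("content URLs that would become dangling:", cur.fetchone()[0])
```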

How to export and sync an Artifactory repository to the filesystem?

I am looking for a solution that would allow me to have a network share where people can access (read-only) the artifacts from an Artifactory repository.
Why? We use Artifactory to also keep track of big binaries like installation kits, ISO images and so on, and it takes a lot of time to download all of them (sometimes as zips), unpack them and run them. If these were exported to an NFS/SMB share, people would be able to simply mount it in order to use them.
How can we achieve this? Please keep in mind that we also want to automate this, so the files would be updated by Artifactory when needed.
Artifactory supports WebDAV out of the box.
It seems that's not possible at this moment and there is a feature request for enabling it:
https://www.jfrog.com/jira/browse/RTFACT-8302
Feel free to vote and comment on it, to let JFrog realise how important this use case is.
I guess they should be able to provide a script that mirrors/syncs a repository to an NFS share, but that would almost double the storage space needed.
If, instead, they used hardlinks or symlinks to create a browsable tree of the repository inside the storage directory, this would be solved and no sync would be needed.
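In the meantime, a cron-driven mirror is the usual workaround (at the cost of the duplicated storage mentioned above). A minimal sketch using the JFrog CLI, where the repository name, mount point and exact CLI flags are assumptions to check against the CLI version you have installed:

```python
# Sketch of a cron-style mirror: pull an Artifactory repository down to an
# NFS-mounted directory with the JFrog CLI. Repository name, mount point and
# flags are assumptions -- verify against your installed CLI version.
import subprocess

REPO = "installers-local"               # Artifactory repository (assumption)
TARGET = "/mnt/artifactory-mirror/"     # NFS/SMB share mount point (assumption)

subprocess.run(
    ["jfrog", "rt", "download", f"{REPO}/", TARGET,
     f"--sync-deletes={TARGET}",   # remove local files deleted in Artifactory
     "--flat=false"],              # keep the repository folder structure
    check=True,
)
```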

Install WordPress with its plugins using chef-solo

I have a WordPress website up and running with many plugins installed and a huge database. I need to use chef-solo to create an environment that can install the same website with all its plugins and also import its database.
In other words, I need to use Chef to install exactly the same website on a different server.
Now here are my questions:
1. I know we can use Chef to install WordPress, but can we set it up in such a way that we don't need to configure WordPress and everything is already set once it's running?
2. What do we do about the plugins? Can we install them using Chef, or does that have to be done manually?
3. How about importing the database? Can that be done with chef-solo as well?
4. The whole website is on Git; can I somehow import the whole thing?
5. Are there any other issues I might face if I want to do that?
There is a WordPress cookbook openly available for Chef.
1. When you say "configure", I take it you mean setting up data in the database. Assuming that you've separated the database instance from the server instance and you're attempting to scale up the number of servers, you should be able to skip the data setup. You should be configuring the new server instance (node) to point to the same database via Chef.
2. I stumbled onto this question looking for the answer to that one myself. From what I can tell, the start may be here. Kind of hand-wavy, but this should enable you to do some WordPress tasks via the command line with Chef, rather than the point-and-click it prefers.
3. As per #1, you should not need to import the database. If the database goes down, you'll want to treat that as a separate but connected recipe, since then you'll want to be taking snapshots and uploading them somewhere like S3 via a cron job. I believe there are plugins that can enable this.
4. You'll have to be a little clearer about what you mean by "import". If it's in a code base, you may be able to short-cut your cookbook path by pulling the Git repo down onto the host. You may want to look at git-archive.
5. Another issue I'm looking at is images. We're migrating from a hosted solution to AWS, and it appears that instead of storing the images in the database, WordPress pulls them into a local directory. This means that if we scale to more than one host, we'll have issues with images. Something to think about; there's a wealth of plugins that can probably solve this.
Hope this is helpful,
Ben
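As a follow-up to the cookbook suggestion above, here is a minimal sketch of driving it with chef-solo, written as a small Python wrapper that generates the node attributes and then runs chef-solo; the cookbook name and attribute keys are assumptions to verify against the cookbook's README:

```python
# Sketch: generate a chef-solo node file that pulls in the community
# "wordpress" cookbook, then run chef-solo. Attribute keys are assumptions;
# check the cookbook's README for the exact names it expects.
import json
import subprocess

node = {
    "run_list": ["recipe[wordpress]"],
    "wordpress": {
        "db": {                       # point every server at the same DB
            "name": "wordpress",
            "user": "wp",
            "pass": "change-me",
            "host": "db.example.com",
        },
    },
}
with open("node.json", "w") as f:
    json.dump(node, f, indent=2)

# solo.rb is expected to set cookbook_path to a directory containing the
# wordpress cookbook and its dependencies.
subprocess.run(["chef-solo", "-c", "solo.rb", "-j", "node.json"], check=True)
```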

How to manage multiple alfresco repositories?

Problem description:
I have multiple alfresco installations (development, testing, production) of one project.
I need to copy files under Data Dictionary folder (Scripts, Templates, Web Scripts) from one to another in one direction (development -> testing -> production).
Current solution:
I copy the files manually via WebDAV, which is annoying and unreliable (I can forget to copy some).
Desired solution:
I'd like to have a tool that copies the changed files on my command, once they are ready for the next step. I had an idea that it could internally use a Git repository with a branch for each installation, being able to fetch the files from development and push them to testing and production. This way (with Git) it could also support reverting changes.
It looks like quite a common problem, but I wasn't able to google anything about it, so I'm asking here. Does such a tool exist, or is there a better way of managing multiple repositories?
If you have a brand new installation of your development/testing/production Alfresco instances, you could simply migrate the alf_data dir content, which by default contains the db, indexes, content store and backup files. If you need to, you could migrate the "shared" folder too, or at least some files from it, as they could be Alfresco customizations (custom scripts or similar). Here is the link that helps with the migration steps:
http://wiki.alfresco.com/wiki/System_Migration
Otherwise, if you only need to move a folder from the Data Dictionary, or a set of documents, you could use an ACP to achieve that. Here is the wiki page for doing this: http://wiki.alfresco.com/wiki/Export_and_Import
You could do this via FTP. When you want to deploy new changes, you can go with a manual client like FileZilla to download the changes from dev and then upload them to test.
But you can also automate FTP, so that a scheduled job checks whether there is anything new on, say, dev and pushes it to test; a sketch of such a job follows below.
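A minimal sketch of that automation using Python's ftplib, assuming the Alfresco FTP subsystem is enabled on both instances; the host names, credentials and folder path are placeholders, and it only copies a flat folder of files:

```python
# Sketch: mirror the Data Dictionary scripts from the dev instance to the
# test instance over FTP. Hosts, credentials and the FTP path are assumptions
# (Alfresco exposes the repository over FTP when the FTP subsystem is enabled).
import io
from ftplib import FTP

SRC = dict(host="dev.example.com", user="admin", passwd="admin")
DST = dict(host="test.example.com", user="admin", passwd="admin")
PATH = "Alfresco/Data Dictionary/Scripts"   # folder to mirror (assumption)

with FTP(SRC["host"]) as src, FTP(DST["host"]) as dst:
    src.login(SRC["user"], SRC["passwd"])
    dst.login(DST["user"], DST["passwd"])
    src.cwd(PATH)
    dst.cwd(PATH)
    for name in src.nlst():              # assumes the folder holds only files
        buf = io.BytesIO()
        src.retrbinary(f"RETR {name}", buf.write)   # download from dev
        buf.seek(0)
        dst.storbinary(f"STOR {name}", buf)         # upload to test
        print("copied", name)
```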
If you use Git for source control, you could also do this via git-ftp. Keep a copy of the Data Dictionary in your source folder, then add some sort of pre-commit check which will see whether you changed any of those files. If you did, on commit it will push the change to dev and test.
I think the Replication Service of Alfresco is suitable for you.
http://wiki.alfresco.com/wiki/Alfresco_Community_3.4.a#Replication

Best way for downloading many files in a website

I'm designing a website that will let the user synchronize a local folder to an online folder (kind of like Dropbox).
I'm trying to find a way to avoid developing a local tool to do the download and synchronization, and to do it somehow online instead.
Is it possible to download multiple files in a certain directory tree?
Can a website have free write access to local directories?
A zip file is not an option, since the file batch could get pretty big.
EDIT: Synchronization shouldn't occur periodically, just when the user logs in to the website.
My recommendation is to use something like WebDAV; a sketch of what a client-side pull over WebDAV could look like follows below.
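This is a minimal sketch, assuming the server exposes the folder as a WebDAV collection: list it with PROPFIND and download each file with GET. The server URL and credentials are placeholders, the third-party requests package is required, and a real sync would also recurse into sub-collections and compare timestamps:

```python
# Sketch of a client-side pull over WebDAV: PROPFIND to list a collection,
# then GET each file. URL and credentials are assumptions.
# Requires: pip install requests
import os
import xml.etree.ElementTree as ET
import requests

BASE = "https://example.com/webdav/shared/"   # WebDAV collection (assumption)
AUTH = ("user", "password")
DEST = "local-copy"
os.makedirs(DEST, exist_ok=True)

resp = requests.request("PROPFIND", BASE, headers={"Depth": "1"}, auth=AUTH)
resp.raise_for_status()

ns = {"d": "DAV:"}
for response in ET.fromstring(resp.content).findall("d:response", ns):
    href = response.find("d:href", ns).text
    if href.endswith("/"):          # skip the collection itself / sub-folders
        continue
    name = href.rstrip("/").split("/")[-1]
    with requests.get(BASE + name, auth=AUTH, stream=True) as r:
        r.raise_for_status()
        with open(os.path.join(DEST, name), "wb") as f:
            for chunk in r.iter_content(65536):
                f.write(chunk)
    print("downloaded", name)
```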
