Errors while importing users from old site - drupal

I have an old Drupal site with 100 users. However, when I tried importing them all into a new Drupal instance, I got errors like the one below for most of the user entries.
Notice: Object of class stdClass could not be converted to int in drupal_write_record() (line 7159 of sitepath\includes\common.inc)
I'm using the Drupal data export import module.
Is this serious? I can see the users were imported, but I'm not sure whether this error will cause me problems later.
Or is there a better way of importing users from an old Drupal site?
Note: both sites are running on Drupal 7.

It looks like the Data Export Import module is not actively maintained; the last commit was 9 months ago.
I see two options:
Create a View with a CSV export on the old site, then use the User Import module to import the CSV.
If you have coding capabilities available (and experience with Drupal coding) and you need more customization, go for Migrate.

Drupal 8 to publish files

I'm trying to create a simple CMS to ingest content and publish files. I'm using Drupal 8 with the Feeds module to read XML files from a directory, and that works fine.
I can't figure out how to take the information saved in my custom content and publish it to a file in another directory.
Can anyone help?
Thx
Luca
Could you create a view to render the desired output for the content you're after and then use the Views data export module to export that view into a file?
I note the following:
This module also exposes a drush command that can execute the view and
save its results to a file.
drush views-data-export [view-name] [display-id] [output-file]
If you can do it with Drush you can do it with PHP.
So potentially you could write hooks to manage the export of files based on the activity of the feed. For example, hook_ENTITY_TYPE_insert lets you run logic when a node of a particular content type is created.
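Since that drush command can be scripted, the export could also be driven from a scheduled job. A minimal sketch in Python (the view name, display ID and output path below are placeholders, not names from your site):

```python
import subprocess

def build_export_command(view, display, outfile):
    """Build the drush views-data-export invocation as an argument list."""
    return ["drush", "views-data-export", view, display, outfile]

def run_export(view, display, outfile):
    """Run the export; raises CalledProcessError if drush exits non-zero."""
    subprocess.run(build_export_command(view, display, outfile), check=True)

if __name__ == "__main__":
    # Placeholder names for illustration only.
    print(" ".join(build_export_command("my_export_view", "data_export_1", "/tmp/export.csv")))
```

A cron entry (or hook_cron on the Drupal side) could then call run_export on whatever schedule the feed runs.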

WordPress WXR file import partial fail: how to debug?

I am converting an old CMS to WordPress. Since I have control over the source database, I have written a conversion script that generates a WXR-formatted XML file. The import is partially successful, including comments, media, categories, etc., but several posts fail to import with no reason given. The PHP error log shows no errors. How can I trace what goes wrong?
The issue was bad date/time formatting in the generated file.
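To track down which posts carry bad dates before importing, a small sketch that scans a WXR file for wp:post_date values that don't parse as the YYYY-MM-DD HH:MM:SS format WordPress expects (the namespace matching is deliberately loose; adapt the format string if your source differs):

```python
import xml.etree.ElementTree as ET
from datetime import datetime

# The date format WordPress expects inside <wp:post_date> elements.
WXR_DATE_FORMAT = "%Y-%m-%d %H:%M:%S"

def find_bad_dates(xml_text):
    """Return the text of every post_date/post_date_gmt element that fails to parse."""
    root = ET.fromstring(xml_text)
    bad = []
    for elem in root.iter():
        # Namespaced tags look like "{http://wordpress.org/export/1.2/}post_date",
        # so match on the local-name suffix rather than the full namespace URI.
        if elem.tag.endswith("post_date") or elem.tag.endswith("post_date_gmt"):
            try:
                datetime.strptime((elem.text or "").strip(), WXR_DATE_FORMAT)
            except ValueError:
                bad.append(elem.text)
    return bad
```

Running this over the generated file lists exactly the entries the importer would silently drop, which is easier than bisecting the WXR by hand.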

Importing a Firebase database with changes to certain nodes

Is it possible to export the Firebase database, make changes in an editor like Notepad, and then import the database so that only the changes made offline get applied?
For example, say I export the database today, add 6000 new child nodes in my editor, and two days later import the database back to add the new nodes without affecting the other updates that have been made by my users in the meantime.
What happens by default is that the import rewrites every value that does not match. I hope I'm making sense.
When you import, the Firebase console clearly warns that "All data at this location will be overwritten", so you have to stop writing new data while you update the data by export and import.
To avoid losing data arriving in the meantime, you can write incoming data to a JSON file or another database and stop writing it into Firebase until the import is done.
Hope this helps.
If you use the Export JSON option in your Firebase Console, make some offline changes to that file, and then use the Import JSON option, then no, it's not possible: the newly uploaded file overrides the old one. In other words, any changes made in the meantime are lost when you upload the modified file.
There are two options to solve this:
Stop the database from being written to -> export the file -> make the changes -> import the file.
Or make all changes programmatically, even while the database is being changed by users.
Hope it helps.
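The second option can be sketched as follows: instead of importing the whole edited file, diff it against the original export and send only the new or changed children. The Firebase REST API's PATCH verb updates only the children you send, leaving other users' writes untouched. Below is the diff step in plain Python; actually sending the patch is left as a comment because it needs your database URL and credentials, which are placeholders here:

```python
import json

def diff_children(original, edited):
    """Return only the top-level children that are new or changed in `edited`."""
    return {k: v for k, v in edited.items() if k not in original or original[k] != v}

if __name__ == "__main__":
    # Example: nodes added offline show up as new keys in the edited export.
    original = {"user1": {"score": 10}}
    edited = {"user1": {"score": 10}, "node6001": {"value": "added offline"}}
    patch = diff_children(original, edited)
    # The patch could then be sent with HTTP PATCH to
    # https://<your-db>.firebaseio.com/<path>.json  (URL and auth are placeholders);
    # PATCH only touches the children present in `patch`, unlike a full import.
    print(json.dumps(patch))
```

This keeps the two-day-old user writes intact because keys absent from the patch are never touched.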

How to combine multiple Plone sites into one site?

I have 3 Plone 4.1.4 sites under the same Zope instance with the same Data.fs. They handle different folders, files and Plomino databases with different users (possibly with the same usernames).
How could I combine them into one Plone site?
Best Regards.
zxl
ZMI -> export works very well in this case. You have to export from two of the sites and import into the remaining one. Export JUST the data, I mean the folders, files and documents, not the whole Plone site. Users are a bit more complicated: you need to export data from portal_memberdata, portal_membership, and the user logins and passwords from acl_users -> source_users.
For the portal_memberdata data, you can go to site/portal_setup/manage_exportSteps, export it, and import it using the import tab. For portal_membership there is no such step, but I think it holds no data, just configuration that should be the same in every portal. So the real problem is migrating the users in acl_users.
Export and import your folders and files using Products.CSVReplicata.
Regarding the Plomino databases: first, export their design as XML (in the Design tab), and import it back into an empty database on the target site, then export their documents as XML (in the Replication tab), and import them back in the corresponding database on the target site.

User imports with cron job

I have a Drupal website where I want to import user settings from a CSV (comma-separated values) file. I can do that with the User Import module, but only manually: you choose a CSV file, adjust some settings, and run the import. That works fine.
Now I want to do this automatically (cron?). I have read that it can be done with the same User Import module, but I can't get it working. Can somebody help me get started with the following?
The job should check an FTP location for a new CSV file.
Then it should run the import.
It must update users that already exist, add users that don't exist yet, and set users that were deleted from the file to inactive.
Afterwards there should be a report of all the changes that were made.
I would suggest using the Feeds module. You can easily set it up to import on a schedule and can point it at a local folder to look for new files. You can also set it up to fetch the file from a remote URL, but only over HTTP, not FTP. We do a similar import: a bash script transfers the file to a local folder, and Feeds then imports the resulting file.
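Whichever module ends up running the import, the sync logic the question asks for (update existing users, add new ones, deactivate users missing from the file, and report every change) can be sketched independently of Drupal. The field names below are illustrative, and in a real Drupal 7 module this would live in hook_cron and call user_save() instead of mutating a dict:

```python
import csv
import io

def sync_users(existing, csv_text):
    """Merge a CSV of users into `existing` ({name: {"mail": ..., "active": ...}}).

    Returns the updated mapping plus a report of additions, updates and
    deactivations, mirroring the requirements above."""
    report = {"added": [], "updated": [], "deactivated": []}
    seen = set()
    for row in csv.DictReader(io.StringIO(csv_text)):
        name = row["name"]
        seen.add(name)
        if name not in existing:
            existing[name] = {"mail": row["mail"], "active": True}
            report["added"].append(name)
        elif existing[name]["mail"] != row["mail"] or not existing[name]["active"]:
            existing[name] = {"mail": row["mail"], "active": True}
            report["updated"].append(name)
    # Users absent from the file are set to inactive rather than deleted.
    for name, account in existing.items():
        if name not in seen and account["active"]:
            account["active"] = False
            report["deactivated"].append(name)
    return existing, report
```

The returned report dict is what you would email or log after each cron run.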
Thanks for the advice, but I am rewriting the module into a new one with everything I need. :)
