I have an issue I have been struggling with for weeks, almost a month. I am working on a website for a real estate agent in Toronto, Ontario, and the last thing I have to do is get her listings onto the site. We are using the WordPress WP Residence theme, which is advertised as compatible with iHomefinder; however, the data I get from iHomefinder does not work with the theme's features (maps, searches, and even styles). I read somewhere that the only way to use these theme features is to add the listings manually. I found this plugin that will import all the listings:
https://en-ca.wordpress.org/plugins/wp-residence-add-on-for-wp-all-import/
But now I need to get my listings in the form of XML, CSV, or XLS. I have all my login info and the URL to get my listings; however, the instructions say I need to connect via a RETS client, which I do not have... is there anyone out there who can point me in the right direction?
You'll need to use an IDX plugin.
I think you should also look over https://mlsimport.com. This plugin should help you import data from the MLS, but it is a paid product.
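To demystify the RETS part a little: RETS is just HTTP underneath. You log in, then issue a Search transaction with a DMQL2 query and ask for COMPACT format, which is tab-delimited data inside a thin XML wrapper, and that converts to CSV very easily. Below is a minimal sketch, not a drop-in solution: the login/search URLs, the Property/RES resource and class names, and the status query are placeholders that your MLS board will supply, and some servers want Basic rather than Digest auth.

```python
# Minimal RETS-to-CSV sketch using only the `requests` library.
# The URLs, Resource/Class names, and query are placeholders -- your
# MLS will give you the real values. Some servers use Basic rather
# than Digest auth, and most expect a RETS-Version header.
import csv
import xml.etree.ElementTree as ET

import requests
from requests.auth import HTTPDigestAuth

LOGIN_URL = "https://rets.example-mls.com/rets/login"    # placeholder
USERNAME, PASSWORD = "myuser", "mypass"                  # placeholders

session = requests.Session()
session.auth = HTTPDigestAuth(USERNAME, PASSWORD)
session.headers["RETS-Version"] = "RETS/1.7.2"

# 1. Login transaction -- the response lists the server's transaction
#    URLs (Search, GetObject, ...); many servers simply expose
#    /rets/search next to /rets/login.
session.get(LOGIN_URL)

# 2. Search transaction with a DMQL2 query, asking for COMPACT format
#    (tab-delimited rows inside a thin XML wrapper).
resp = session.get(
    "https://rets.example-mls.com/rets/search",          # placeholder
    params={
        "SearchType": "Property",
        "Class": "RES",             # placeholder class name
        "QueryType": "DMQL2",
        "Query": "(Status=|A)",     # e.g. active listings only
        "Format": "COMPACT-DECODED",
        "Limit": "NONE",
    },
)

# 3. COMPACT responses contain one <COLUMNS> row and many <DATA>
#    rows, all tab-delimited; split them and write a CSV.
root = ET.fromstring(resp.text)
columns = root.find("COLUMNS").text.strip("\t").split("\t")
with open("listings.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(columns)
    for row in root.iter("DATA"):
        writer.writerow(row.text.strip("\t").split("\t"))
```

The resulting listings.csv is the kind of file the WP All Import add-on above can consume.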
I would like to import a client's posts from Blogger to WordPress
I have searched for blog posts, plugins, and scripts that do what I need, and have not been able to find what I am looking for. A question from 2016 asks the same thing, but the solution offered is to upload all posts and delete the ones you don't want. I would like to try to avoid having to delete 600+ posts manually. Has anyone found a solution for this issue?
Edit:
Requirements from client:
She has compiled a Google Docs spreadsheet of the posts she would like to keep, with columns for the original Blogger title, the original Blogger post date, the new category, and the new tags. This is the only way I know which posts she wants to bring over. So far I have been using process of elimination, deleting posts one by one.
Things I've tried:
The default Blogger importer froze. So I tried Blogger Importer Extended, but I'm not sure what's "extended" about it, because it does the same thing as the default importer: indiscriminately importing everything.
Update:
Reporting back. I cannot get a Blogger XML export to open in Excel to be converted to .csv. I then tried exporting the WordPress posts as XML to convert to CSV as well, but both times, with different methods, I got the same error from Excel.
So I now need to know how to convert whichever XML format Blogger/WordPress use to .csv so it can be compared against the list I received from the client. Will report back later.
Update2:
I was able to convert to CSV using the online service proposed below, but the output is seemingly useless, unless I'm missing something.
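In case the online converters remain a dead end: the Blogger export is just Atom XML, so a short script can pull out exactly the columns needed to compare against the client's spreadsheet. A minimal sketch, assuming the standard Blogger export file (the filename and the chosen columns are mine); posts are distinguished from comments and settings by their #post "kind" category:

```python
# Minimal sketch: turn a Blogger export (Atom XML) into a CSV of
# post titles and dates for comparison against the client's list.
# Tag names come from the standard Atom namespace.
import csv
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"

tree = ET.parse("blog-export.xml")   # the file Blogger gives you
rows = []
for entry in tree.getroot().iter(f"{ATOM}entry"):
    # Blogger exports settings, comments, and posts all as entries;
    # real posts carry a "kind" category ending in #post.
    kinds = [c.get("term", "") for c in entry.findall(f"{ATOM}category")]
    if not any(k.endswith("#post") for k in kinds):
        continue
    rows.append([
        entry.find(f"{ATOM}title").text or "",
        entry.find(f"{ATOM}published").text or "",
    ])

with open("posts.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["title", "published"])
    writer.writerows(rows)
```

The same approach works for a WordPress WXR export, just with the wp: namespace instead of Atom.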
Note that the default WP importer has issues when you import XML filtered to a particular author. For this type of task, a CSV import/export plugin will be useful. You must also install the WP export add-on for the export option, and use filter options like "Export data by specific Author" or "Specific Period", checked as in this image: https://ps.w.org/wp-ultimate-exporter/trunk/screenshot-2.png?rev=2134132. Now you can export a CSV of blog posts filtered to a particular author or date range, and use this CSV to import again.
I currently use Kimonolabs for scraping data from websites that share the same goal. To keep it simple, let's say these websites are online shops selling stuff (they are actually job websites with online application options, but technically they look a lot like a webshop).
This works great. For each website a scraper API is created that goes through the available advanced-search pages to crawl all product URLs; let's call this API the "URL list". Then a "product API" is created for the product detail page that scrapes all the necessary elements, e.g. the title, product text, and specs like the brand, category, etc. The product API is set to crawl daily using all the URLs gathered by the "URL list".
The gathered information for all products is then fetched from the Kimonolabs JSON endpoint by our own service.
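For comparison, the same two-stage pattern (URL list first, then a detail scrape per URL) is small enough to self-host if no service fits. A hedged sketch, where the site URL and all CSS selectors are hypothetical placeholders, using the `requests` and `beautifulsoup4` packages:

```python
# Two-stage crawl sketch: stage 1 walks a paginated search page and
# collects product URLs (the "URL list"); stage 2 visits each URL
# and scrapes the detail fields (the "product API"). All URLs and
# CSS selectors below are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup

BASE = "https://shop.example.com"                  # placeholder

def collect_product_urls():
    urls, page = [], 1
    while True:
        resp = requests.get(f"{BASE}/search", params={"page": page})
        soup = BeautifulSoup(resp.text, "html.parser")
        links = soup.select("a.product-link")      # placeholder selector
        if not links:                              # ran past the last page
            return urls
        urls += [BASE + a["href"] for a in links]
        page += 1

def scrape_product(url):
    soup = BeautifulSoup(requests.get(url).text, "html.parser")
    return {
        "url": url,
        "title": soup.select_one("h1.title").get_text(strip=True),
        "brand": soup.select_one(".specs .brand").get_text(strip=True),
    }

if __name__ == "__main__":
    products = [scrape_product(u) for u in collect_product_urls()]
    print(f"scraped {len(products)} products")
```

Hourly/daily updates then come for free from cron or any job scheduler.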
However, Kimonolabs will shut down its service at the end of February 2016 :-(. So I'm looking for an easy alternative. I've been looking at import.io, but I'm wondering:
Does it support automatic updates (letting the API scrape hourly/daily/etc.)?
Does it support fetching all product URLs from a paginated advanced-search page?
I'm tinkering around with the service. Basically, it seems to extract data via the same easy process as Kimonolabs. It's just unclear to me whether paginating the URLs needed for the product API, and keeping it up to date automatically, are supported.
Are there any import.io users here who can advise whether import.io is a useful alternative for this? Maybe even give some pointers in the right direction?
Look into Portia. It's an open source visual scraping tool that works like Kimono.
Portia is also available as a service, and it fulfills the requirements you listed for import.io:
automatic updates, by scheduling periodic jobs to crawl the pages you want, keeping your data up-to-date.
navigation through pagination links, based on URL patterns that you can define.
Full disclosure: I work at Scrapinghub, the lead maintainer of Portia.
Maybe you want to give Extracty a try. It's a free web-scraping tool that allows you to create endpoints that extract any information and return it as JSON. It can easily handle paginated searches.
If you know a bit of JS, you can write CasperJS endpoints and integrate any logic you need to extract your data. It has a similar goal to Kimonolabs and can solve the same problems (if not more, since it's programmable).
If Extracty does not solve your needs, you can check out these other market players that aim for similar goals:
Import.io (as you already mentioned)
Mozenda
Cloudscrape
TrooclickAPI
FiveFilters
Disclaimer: I am a co-founder of the company behind Extracty.
I'm not that fond of Import.io, but it seems to allow pagination through bulk input URLs. Read here.
So far there has not been much progress in getting a whole website through the API:
Chain more than one API/Dataset: it is currently not possible to fully automate the extraction of a whole website with Chain API.
For example, if I want data found within category pages or paginated lists, I first have to create a list of URLs, run Bulk Extract, save the result as an import dataset, and then chain it to another Extractor. Once it is set up, I would like to be able to do this more automatically, in one click.
P.S. If you are somewhat familiar with JS, you might find this useful.
Regarding automatic updates:
This is a beta feature right now. I'm testing it for myself after migrating from Kimonolabs... You can enable it for your own APIs by appending &bulkSchedule=1 to your API URL. You will then see a "Schedule" tab. In the "Configure" tab, select "Bulk Extract" and add your URLs; after this the scheduler will run daily or weekly.
I am trying to create a webpage for adding hotels (a hotel database) using WordPress (name, address, pictures, reviews, ...). Any ideas?
Also, any ideas for categorizing hotels by city?
Thanks.
You will probably need to explain your question further so that we can be of more help to you. If your intention is just to add a list of hotel names to your website, then I don't know of any WordPress plugin for that, but you might want to check out Expedia's EAN: http://developer.ean.com/
You need to sign up for their affiliate program, which is very easy. You get immediate access to their hotel databases, plus you can make availability/booking requests with several response options, including JSON, which is more convenient and lightweight than the (unfortunately) more widespread XML. This is something you will need to hire a developer to do.
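To give a feel for what that developer work looks like, here is a hedged sketch of a hotel-list request. The endpoint path and parameter names are assumptions modelled on EAN's old v3 REST service, so verify everything against the current docs at http://developer.ean.com/ before relying on it; the key and CID are placeholders.

```python
# Hedged sketch of an EAN hotel-list request. The endpoint and
# parameter names are assumptions based on the old v3 REST service;
# check the current developer docs. Credentials are placeholders.
import requests

resp = requests.get(
    "http://api.ean.com/ean-services/rs/hotel/v3/list",  # assumed endpoint
    params={
        "apiKey": "YOUR_API_KEY",   # placeholder
        "cid": "YOUR_CID",          # placeholder affiliate id
        "city": "London",
        "countryCode": "GB",
        "_type": "json",            # request JSON instead of XML
    },
)
data = resp.json()
# Inspect `data` and map the fields you need (name, address, ...)
# onto a WordPress custom post type, e.g. via the REST API or WP-CLI.
print(list(data))
```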
If you want to allow clients/visitors to book and search for hotels by just installing a plugin, try https://wordpress.org/plugins/wp-auto-hotel-finder/.
I have a scenario in which I have to export all the information from one Alfresco instance and import it into another Alfresco instance...
There should be no difference in users, sites, dashlets, rules, aspects, or other information: after the import, everything should work exactly as it did before.
I used the share-import-export-0.1-JAR-alfresco-3.4 plugin for import and export within Alfresco Share itself, but the export option in Share is not working consistently; sometimes it exports successfully and sometimes it shows an error.
Even when the export produces the .acp file successfully, I get an error while importing the content in Share, and the error message is not very descriptive: "Unexpected error occured during content extraction".
You can't use an ACP for your needs. An ACP includes the content and permissions, but not the definitions of the users/groups of those permissions. An ACP also won't include the site definition - there's more to a site than just the content that makes it up.
If you need to export a site, including its contents and its users, then your best bet is to use the new functionality that was introduced into Alfresco 3.5 (Team) to support the sample site. There's a webscript that handles the export, and a bootstrap importer that'll load it into a new system for you. I find it works pretty well for this sort of thing, but then I'm possibly biased as I wrote most of it ;-)
The webscript for the export is org.alfresco.repository.site.site-export.get, which lives at /alfresco/services/api/sites/{shortname}/export - drop the site name into the URL and fetch (as an admin) to receive the zip with all the parts of the site in it.
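For example, a minimal sketch of pulling that export down with Python; the host, site shortname, and credentials are placeholders for your environment:

```python
# Minimal sketch: call the site-export webscript (path as given
# above) as an admin and save the resulting zip. Host, shortname,
# and credentials are placeholders.
import requests

resp = requests.get(
    "http://localhost:8080/alfresco/services/api/sites/mysite/export",
    auth=("admin", "admin"),   # must be an admin user
)
resp.raise_for_status()
with open("mysite-export.zip", "wb") as fh:
    fh.write(resp.content)
```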
To see the loading side, take a look at patch-services-context.xml and search for patch.siteLoadPatch.swsdp. The class that does the loading is org.alfresco.repo.admin.patch.impl.SiteLoadPatch, and it takes the users, people, groups, content, and site name as parameters. (You'll get all of these in the zip file from the export.)
If you need to do multiple sites + users + groups + contents, then you should look at something like import/export, replication/transfer, or rolling something custom yourself (probably based on CMIS).
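If you do end up rolling something custom on CMIS, a hedged starting point with Apache Chemistry's cmislib might look like the sketch below; the AtomPub URL and credentials are placeholders, and remember that CMIS only covers content, so users/groups/sites still need one of the mechanisms above.

```python
# Hedged sketch of the "custom, CMIS-based" route using Apache
# Chemistry's cmislib. The repository URL and credentials are
# placeholders; CMIS covers content only, not users/groups/sites.
from cmislib import CmisClient

client = CmisClient(
    "http://localhost:8080/alfresco/cmisatom",  # placeholder AtomPub URL
    "admin", "admin",
)
repo = client.defaultRepository

# Walk every document in the repository and print its name; a real
# migration would stream each document's content to the target repo.
for doc in repo.query("SELECT * FROM cmis:document"):
    print(doc.getName())
```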
I would like to create a data-entry form in Drupal 7 that is similar to FileMaker's List View, a view that shows many records on one page. When the Submit button is clicked, the data entered in the fields should be assigned to the individual records.
For example, I have a list of students' names and a column for a grade field. The students' names are produced by a Drupal View's filtering, but the grade field is empty, waiting for me to key in values.
Which Drupal module can enable this functionality? Or what can I do to create it?
This approach is closer to your original request. I'm testing it out now, and I think it should more than do the trick, much more interestingly too.
http://drupal.org/project/slickgrid
Edit:
I highly recommend trying this, it's awesome!!
A couple of quick tips:
Be sure to also install http://drupal.org/project/title so you can reset the title (make it a field). Basically, with this editor you can only edit actual fields, so the same goes for the Location module: you'll need to use the field option rather than the node option.
One possible downer, at least for my site: it appears the Drupal module does not support jQuery 1.7, which my site uses, so a few buttons etc. don't work as expected. The Drupal module also does not support the latest SlickGrid release. I'd like to look into fixing that, but I don't have the time just yet; possibly someone will before long. If jQuery 1.7 is not required for your site, then none of that will be a problem for you.
I just figured out how to do something similar, although I went about it in a different way than I think would be most desirable. At any rate, what I did works perfectly for me at the moment.
What I did...
I used a Google Docs spreadsheet for data entry, exported a CSV file, then used the Feeds module to import it and map it to my desired content type. I was even able to get location, taxonomy, and image fields to map.
Modules & Stuff Used...
Feeds (http://drupal.org/project/feeds), which also includes Feeds Importer. You'll want to read through all the instructions to be sure you understand the import methodology; I could never explain it all here!
Feeds Tamper (http://drupal.org/project/feeds_tamper), which I used to explode the cell that held a | separated | list of taxonomy terms (don't use commas). The nice side effect is that if a term doesn't exist, it creates it for you!
Spreadsheet - personally I created a Google Docs spreadsheet, but any type will do. This spreadsheet has a column for every field you want to map to fields in your content type; a sample layout is sketched below.
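To make that concrete, here is a hypothetical slice of such a CSV (the column names are mine, and the tags column uses the pipe separator that Feeds Tamper then explodes into taxonomy terms):

```
title,body,city,tags,image
"First node","Body text for the first node.","Toronto","termA|termB","photo1.jpg"
"Second node","Body text for the second node.","Ottawa","termB|termC","photo2.jpg"
```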
"More Better"
Currently I am on the hunt for a simple back-end UI way to do this, but this is what I've settled on for now. I just couldn't imagine hand-entering hundreds of nodes, page submit after page submit! I will post back if I figure out another way to do it. Good luck!
I have just started with
https://www.drupal.org/project/editableviews
which enables the creation of views where all the fields are editable, including when no data exists in the related entity. Documentation is also available at that URL.
There's a screencast which shows more:
https://www.youtube.com/watch?v=g_D4z4Bw6iw