WordPress form to add hotels

I am trying to create a page for adding hotels (a hotel database) using WordPress, with fields like name, address, pictures, reviews, and so on. Any ideas?
I would also like a way to categorize hotels by city.
Thanks.

You will probably need to explain your question further so that we can be of more help to you. If your intention is just to add a list of hotel names to your website, then I don't know of any WordPress plugins for that, but you might want to check out Expedia's EAN: http://developer.ean.com/
You need to sign up for their affiliate program, which is very easy. You get immediate access to their hotel databases, plus you can make availability/booking requests with several response options, including JSON, which is more convenient and lightweight than the (unfortunately) more widespread XML. This is something you will need to hire a developer for.
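To give a feel for the JSON route, here is a rough sketch of how a small plugin or theme function could pull hotel data into WordPress. The endpoint, parameters, and the my_fetch_hotel_list() name are placeholders I made up, not EAN's actual API; check the developer site above for the real request format and credentials.

    // Rough sketch of fetching hotel data as JSON from inside WordPress.
    // The endpoint URL, query parameters and API key below are placeholders,
    // NOT Expedia/EAN's real API; see http://developer.ean.com/ for the actual format.
    function my_fetch_hotel_list( $city ) {
        $url = add_query_arg(
            array(
                'city'   => $city,
                'apiKey' => 'YOUR_AFFILIATE_KEY', // placeholder credential
            ),
            'https://api.example.com/hotels'      // placeholder endpoint
        );

        $response = wp_remote_get( $url, array( 'timeout' => 15 ) );
        if ( is_wp_error( $response ) ) {
            return array(); // network error: return an empty list
        }

        $data = json_decode( wp_remote_retrieve_body( $response ), true );
        return is_array( $data ) ? $data : array();
    }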
If you want to allow clients/visitors to search for and book hotels by just installing a plugin, try https://wordpress.org/plugins/wp-auto-hotel-finder/.
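If you would rather store the hotels inside WordPress itself, one common pattern (a sketch of my own, not something the answers above prescribe) is a custom post type for hotels plus a hierarchical "city" taxonomy to categorize them:

    // Sketch: register a "hotel" post type and a "city" taxonomy so hotels can be
    // filtered by city in the admin and in front-end queries. Names and labels
    // are illustrative.
    add_action( 'init', function () {
        register_post_type( 'hotel', array(
            'label'    => 'Hotels',
            'public'   => true,
            'supports' => array( 'title', 'editor', 'thumbnail', 'custom-fields' ),
        ) );

        register_taxonomy( 'city', 'hotel', array(
            'label'        => 'Cities',
            'hierarchical' => true, // behaves like categories, so you can nest Country > City
        ) );
    } );

Name, address, and reviews would then live in custom fields or the post content, and pictures in the featured image or a gallery.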

Related

How to create a job post by email parsing in WordPress?

I want to develop a job board website. One specific feature I want is that job posts are created automatically by parsing incoming emails.
I tried Zapier, but it only creates blog posts.
I also tried the Postie plugin, but Gmail didn't allow it.
I'm willing to use Jobmonster, WorkScout, or Superio, any one of these themes. If you have any suggestions, please let me know.
Is there any way to parse the email data and create a new job post? Please help me resolve this issue.
This is not a paid task; I just need help learning how to do it.
There is a lot to unpack here.
The main problem you are going to encounter is that the emails you are parsing may not all be formatted the same. To pull the info out of an email you will need to be able to generate some rules to extract it.
If, however, the emails are all formatted the same, then you can use Zapier's "split" function to pull the various bits of data out of the email. Once you have these, you can create a new post with your Zap.
I would recommend looking for a WordPress plugin that allows you to create lists with custom post types; WPBakery does this, from memory. You can then set up a custom feed based on that post type.
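Once the fields have been extracted (whether by Zapier, Postie, or anything else), creating the job post itself is a single wp_insert_post() call. A minimal sketch, assuming the parsed values arrive as an array and that a 'job_listing' custom post type exists (both are assumptions on my part):

    // Sketch: turn already-parsed email fields into a job post.
    // $fields is assumed to come from your email parser; 'job_listing' is a
    // hypothetical custom post type registered elsewhere by your theme or plugin.
    function my_create_job_from_email( array $fields ) {
        return wp_insert_post( array(
            'post_type'    => 'job_listing',
            'post_status'  => 'publish',
            'post_title'   => sanitize_text_field( $fields['title'] ),
            'post_content' => wp_kses_post( $fields['description'] ),
            'meta_input'   => array(
                'company'  => sanitize_text_field( $fields['company'] ),
                'location' => sanitize_text_field( $fields['location'] ),
            ),
        ) ); // returns the new post ID, or 0 on failure
    }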
Hopefully this helps narrow down the process for you. Good Luck.

Import.io - Can it replace Kimonolabs

I currently use Kimonolabs for scraping data from websites that share the same goal. To keep it simple, let's say these websites are online shops selling stuff (actually they are job websites with online application options, but technically they look a lot like a webshop).
This works great. For each website a scraper API is created that goes through the available advanced-search pages to crawl all product URLs. Let's call this API the 'URL list'. Then a 'product API' is created for the product detail page that scrapes all the necessary elements, e.g. the title, product text, and specs like the brand, category, etc. The product API is set to crawl daily using all the URLs gathered in the 'URL list'.
The gathered information for all products is then fetched from the Kimonolabs JSON endpoint by our own service.
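For illustration, that last step could be as small as the following plain-PHP sketch; the endpoint, pagination scheme, and field names are hypothetical stand-ins for whatever JSON the scraping service actually exposes:

    // Sketch: pull a paginated JSON feed of scraped products into an array.
    // Endpoint, "results"/"next" keys and field names are all hypothetical.
    $page     = 1;
    $products = array();

    do {
        $json  = file_get_contents( "https://api.example.com/products?page={$page}" );
        $batch = json_decode( $json, true );

        foreach ( $batch['results'] as $item ) {
            $products[] = array(
                'title' => $item['title'],
                'brand' => $item['brand'],
            );
        }
        $page++;
    } while ( ! empty( $batch['next'] ) ); // loop while the API reports another page

    // $products now holds the day's crawl, ready to store or diff against yesterday's run.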
However, Kimonolabs will shut down its service at the end of February 2016 :-(. So I'm looking for an easy alternative. I've been looking at import.io, but I'm wondering:
Does it support automatic updates (letting the API scrape hourly/daily/etc.)?
Does it support fetching all product URLs from a paginated advanced-search page?
I'm tinkering around with the service. Basically, it seems to extract data via the same easy process as Kimonolabs. It's just unclear to me whether the URL pagination needed for the product API, and automatically keeping it up to date, are supported.
Are there any import.io users here who can advise whether import.io is a useful alternative for this? Maybe even give some pointers in the right direction?
Look into Portia. It's an open source visual scraping tool that works like Kimono.
Portia is also available as a service and it fulfills the requirements you have for import.io:
automatic updates, by scheduling periodic jobs to crawl the pages you want, keeping your data up-to-date.
navigation through pagination links, based on URL patterns that you can define.
Full disclosure: I work at Scrapinghub, the lead maintainer of Portia.
Maybe you want to give Extracty a try. It's a free web scraping tool that allows you to create endpoints that extract any information and return it in JSON. It can easily handle paginated searches.
If you know a bit of JS, you can write CasperJS endpoints and integrate any logic that you need to extract your data. It has a similar goal to Kimonolabs and can solve the same problems (if not more, since it's programmable).
If Extracty does not meet your needs, you can check out these other market players that aim at similar goals:
Import.io (as you already mentioned)
Mozenda
Cloudscrape
TrooclickAPI
FiveFilters
Disclaimer: I am a co-founder of the company behind Extracty.
I'm not that fond of Import.io, but it seems to me it allows pagination through bulk input URLs. Read here.
So far there's not much progress on getting a whole website through the API:
Chain more than one API/Dataset: It is currently not possible to fully automate the extraction of a whole website with Chain API.
For example, if I want data that is found within category pages or paginated lists, I first have to create a list of URLs, run Bulk Extract, save the result as an import data set, and then chain it to another Extractor. Once set up, I would like to be able to do this in one click, more automatically.
P.S. If you are somewhat familiar with JS you might find this useful.
Regarding automatic updates:
This is a beta feature right now. I'm testing this for myself after migrating from Kimonolabs... You can enable this for your own APIs by appending &bulkSchedule=1 to your API URL. You will then see a "Schedule" tab. In the "Configure" tab, select "Bulk Extract" and add your URLs; after this the scheduler will run daily or weekly.

WordPress plugin for dependent drop-down lists

Well, I know I'm going to be downvoted, but I think it's worth a shot.
I have never worked with WordPress, and I find it very displeasing to work with. A friend of mine asked me to implement a feature, and I just don't have the time to understand its inner workings.
What I'm looking for is a plugin that lets me have some sort of tree representing locations, kind of like:
Country
  State
    City
      Person 1
      Person 2
      Person 3
and lets me represent it with dependent select boxes that list the people belonging to the selected city, in the selected state, in the selected country?
Thanks for your help, and I'm sorry if this falls outside the scope of SO.
P.S.: yes, I have looked and looked in the WordPress plugin directory and haven't found anything.
You will probably never find a plugin with exactly that feature.
This is essentially a simple rule-based interface over a structured database.
I suggest you create a database structure parallel to WordPress with the tables needed for the hierarchy shown in your example.
Once this database is modelled, you can use the wpdb class in WordPress to access those tables and run the queries. The link below has instructions for this:
http://codex.wordpress.org/Class_Reference/wpdb
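For example, assuming you create hypothetical tables such as wp_people (with a city_id column) alongside a small location-hierarchy table, the dependent lookups become simple wpdb queries:

    // Sketch: fetch the people belonging to a selected city from a custom table.
    // Table and column names (people, city_id, full_name) are hypothetical and
    // should match whatever parallel structure you create.
    function my_get_people_in_city( $city_id ) {
        global $wpdb;

        return $wpdb->get_results( $wpdb->prepare(
            "SELECT id, full_name FROM {$wpdb->prefix}people WHERE city_id = %d ORDER BY full_name",
            $city_id
        ) );
    }

Each select box's onchange handler would then call a small AJAX endpoint that runs a query like this for the next level down (states for a country, cities for a state, people for a city).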
If you want to use WordPress's actual user structure, there are some plugins for listing users:
http://wordpress.org/plugins/simple-user-listing/
http://wordpress.org/plugins/user-list/
But the user registration provided by WordPress cannot store the detailed information about each user that you need.
Hope this helps.

WordPress Custom Post Type - 700+ items

I have to deal with 700+ employees who are listed in the company address book module. I have to integrate them into an intranet that I'm developing with WordPress, and I'm thinking of creating a CPT called "Employee" and importing all these employees with all their information (phone, email, picture, etc.), for which I'd use custom fields.
This data must be searchable and paginated.
My questions are:
Would it be OK to use a CPT for this? I mean, maybe 700+ is too many and queries will be slow?
Would it be better instead to have a separate "users" table, deal with it directly, and keep it separate from the WordPress architecture?
Any other suggestions?
I appreciate your help!
I've developed something similar. I'm willing to test it with a sample data set if you want to see the results:
http://www.webdistortion.com/2011/12/10/ontact-a-simple-wordpress-contact-solution/
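For the custom post type route from the question: 700 items is a small data set by WordPress standards, and a searchable, paginated listing is a standard WP_Query. A sketch, with the post type name and meta key as assumptions:

    // Sketch: a paginated, searchable archive of an "employee" custom post type.
    // The post type name and the 'phone' meta key are assumptions for illustration.
    $paged = max( 1, (int) get_query_var( 'paged' ) );

    $employees = new WP_Query( array(
        'post_type'      => 'employee',
        'posts_per_page' => 20,
        'paged'          => $paged,
        's'              => get_query_var( 's' ), // reuse WordPress's built-in search parameter
        'orderby'        => 'title',
        'order'          => 'ASC',
    ) );

    while ( $employees->have_posts() ) {
        $employees->the_post();
        echo esc_html( get_the_title() ) . ' - ' . esc_html( get_post_meta( get_the_ID(), 'phone', true ) );
    }
    wp_reset_postdata();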

RSS/Atom for professional use

I was wondering if anyone can give an example of a professional use of RSS/Atom feeds in a company product. Does anyone use feeds for anything other than news updates?
For example, did you create a product that returns its results as RSS/Atom feeds, like price listings, current inventory, or dates of training lessons?
Or am I thinking about use cases for RSS/Atom feeds in the wrong way?
Edit: #abyx has a really good example of a somewhat unexpected use of RSS as a way to get debug information from program transactions. I like that idea. This is the type of use I was thinking of, besides publishing search results or recent changes (like MediaWiki).
Some of my team's new systems generate RSS feeds that the developers syndicate.
These feeds push out events that interest the developers at certain times, and the information is controlled using different loggers. Thus, when debugging you can subscribe to the debugging feed, and when you want to see completed transactions you go to the transactions feed, etc.
This allows all the developers to get the information they want in a comfortable way and without having to mess around with configuration. If you no longer want the information, there's no need to remove yourself from a mailing list or edit a configuration file; simply drop the feed and be done with it.
Very cool, and the idea was stolen from Pragmatic Project Automation.
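As an illustration of what generating such a feed can look like, here is a minimal PHP sketch that renders log events as RSS 2.0; the $events array and its field names stand in for whatever logger backs the feed:

    // Sketch: render a list of logger events as an RSS 2.0 feed.
    // $events and its fields (summary, details, id, timestamp) are illustrative.
    header( 'Content-Type: application/rss+xml; charset=UTF-8' );

    echo '<?xml version="1.0" encoding="UTF-8"?>';
    echo '<rss version="2.0"><channel>';
    echo '<title>Transaction debug feed</title>';
    echo '<link>https://builds.example.com/feeds/debug</link>';
    echo '<description>Events emitted by the debug logger</description>';

    foreach ( $events as $event ) {
        echo '<item>';
        echo '<title>'       . htmlspecialchars( $event['summary'] ) . '</title>';
        echo '<guid isPermaLink="false">' . htmlspecialchars( $event['id'] ) . '</guid>';
        echo '<pubDate>'     . date( DATE_RSS, $event['timestamp'] ) . '</pubDate>';
        echo '<description>' . htmlspecialchars( $event['details'] ) . '</description>';
        echo '</item>';
    }

    echo '</channel></rss>';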
Most digital libraries use RSS/Atom to expose their search results and data updates, following the OAI-PMH protocol.
With our internal Trac server, I'm subscribed to the timeline view for each project that I work on. It's great for keeping track of check-ins and bug tickets. This is pretty specific to a developer role, though.
I'm also subscribed to the recent changes of the MediaWiki installation that we use for our intranet. That way it's easy to see whether documents that I need have been changed, or whether there are new policies, etc.
Our website has a news page that I wrote an RSS feed for as well. While you mentioned that you weren't really interested in recent news, it is nice to keep up with our press releases.
I have seen RSS used to syndicate gas prices from a service for a specific zip code.
There are many examples. Here are a couple:
SharePoint provides RSS feeds from its lists.
Many faceted navigation products allow you to get an RSS feed based on a selected filter. For example, you can navigate to view 24" LCD Monitors on newegg.com and then get an RSS feed of that view.
The Mantis bug tracker includes RSS feeds, although I wish they were more configurable. We also use MediaWiki for documentation, which has all sorts of RSS feeds, including a per-page watch and recent changes.
I just added RSS feeds to the ticketing system I use at work (TicketDesk) and that feature should be in the next release of the product.
It's nice because it basically provides me with a custom search view of outstanding trouble tickets or work requests that comes to me, rather than my having to go to the application. It also allows users to get feeds of issues they may be interested in without requiring them to get emails on each update.
I'm looking at implementing an RSS feed for calls for service that our agency takes, to provide the administrators a quick and easy way to see what has been going on.
Atom feed documents and Atom entry documents are used as the representation format for RESTful web services that follow the Atom Publishing Protocol (AtomPub).
I personally have used syndication feeds to expose a sub-set of the Windows Event Log information so that I could subscribe and be notified of critical events on a server.
ImmobilienScout24: they use RSS feeds for updates on your search.
