jquery-datatables, ajax-datatables-rails or filterrific? - ruby-on-rails-4.1

I'm really confused about what to use for server-side pagination, filtering, sorting, etc. Which is better to use: the jquery-datatables, ajax-datatables-rails, or filterrific gem?

Filterrific is a Rails engine that handles filtering, pagination, and sorting. Filterrific itself depends on Kaminari for pagination and on other gems for other functionality. If your app is not very big and sees little traffic, filterrific is the better choice, since it is easy to use and bundles filtering, pagination, and sorting in one place. If traffic is high, consider combining individual gems or building a minimal solution that covers only what you actually need.
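For a sense of how little wiring filterrific needs, here is a minimal sketch following the gem's documented model/controller pattern; the Student model and its fields are made up for illustration:

    # app/models/student.rb -- declare which scopes filterrific may use.
    class Student < ActiveRecord::Base
      filterrific(
        default_filter_params: { sorted_by: 'created_at_desc' },
        available_filters: [:sorted_by, :search_query]
      )

      scope :search_query, ->(query) {
        where('students.name LIKE ?', "%#{query}%")
      }
      scope :sorted_by, ->(sort_option) {
        direction = sort_option =~ /desc$/ ? 'desc' : 'asc'
        order("students.created_at #{direction}")
      }
    end

    # app/controllers/students_controller.rb
    class StudentsController < ApplicationController
      def index
        @filterrific = initialize_filterrific(
          Student,
          params[:filterrific]
        ) or return
        # filterrific builds the filtered relation; Kaminari paginates it.
        @students = @filterrific.find.page(params[:page])
      end
    end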

Related

Dependencies needed to use only data binding in Aurelia?

As part of the architecture team, we are currently discussing how the frontend should be developed. Given the huge number and volatility of JS frameworks, we have decided to stick to standards as much as possible. So we have decided to use ECMAScript 6 with Web Components, the Fetch API for AJAX, and polyfills. However, we don't have any data binding at the moment. I was looking for a non-intrusive framework that would do the job and found Aurelia very interesting. However, some of us are concerned by the relative weight of the library (270k + 60k for SystemJS) when we only need data binding. Is there a way to reduce the size of this library? I have noticed there is a data-binding library. Is it possible to use only that library? If so, how?

AspNetCore.Mvc vs AspNetCore.Mvc.Core

What's the difference between the AspNetCore.Mvc and AspNetCore.Mvc.Core NuGet packages? Is Mvc.Core just the bare-bones stuff, while Mvc is an all-inclusive package? That's what I would guess from the descriptions here and here, but it's not totally clear.
See https://github.com/aspnet/Mvc/issues/4785.
AspNetCore.Mvc has all the basic stuff already set up for you; if you use AspNetCore.Mvc.Core, you will have to configure everything yourself. It seems wise to use AspNetCore.Mvc unless you KNOW you need AspNetCore.Mvc.Core.
If you use .AddMvc(), you get a lot of "opinionated" defaults: assumptions about what kind of app you are building, which formatters are registered and in what order, and which application conventions are in place by default.
If you use .AddMvcCore() (and you know what you're doing), then the behavior of your application is decided by your own opinions rather than the built-in default opinions.
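As a concrete illustration, here is a sketch of the two setups in Startup.ConfigureServices, using the ASP.NET Core 1.x/2.x-era APIs that the linked issue discusses (in a real app you would pick one option, not both):

    using Microsoft.Extensions.DependencyInjection;

    public class Startup
    {
        public void ConfigureServices(IServiceCollection services)
        {
            // Option A: opinionated, all-inclusive MVC -- formatters,
            // conventions, Razor, etc. are registered for you.
            services.AddMvc();

            // Option B: bare-bones core -- every feature is an explicit opt-in.
            services.AddMvcCore()
                .AddAuthorization()    // enable [Authorize]
                .AddJsonFormatters()   // JSON input/output formatters
                .AddCors();            // CORS support
        }
    }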

Search multiple Plone site indexes

I need to implement a central search across multiple Plone sites on different servers/machines. A way to select which sites to search would be a plus, but it is not the primary concern. A few approaches I have come across:
- Export the ZCatalog indexes to an XML file and have a crawler periodically fetch all the XML files so a search can be run over them; but this does not allow live searching.
- Use a common catalog; but this is not optimal and cannot be implemented on the sites I am working on because of some requirements.
- I read somewhere that people use Solr for this, but I need help on how to use it.
Ideally I want to reuse the existing ZCatalog indexes rather than create and maintain an extra index, which I believe is what Solr requires, because of the added overhead. But I will use it if no other solution is possible. I am a beginner at search, so please give as much detail as possible.
You should really look into collective.solr:
https://pypi.python.org/pypi/collective.solr/4.1.0
Searching multiple sites is a complex use case, and you most likely need a solution that scales. In the end it will require far less effort to go with Solr than to come up with your own solution; Solr is built for this kind of requirement.
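One point worth knowing: collective.solr hooks into the existing catalog machinery, so once it is installed and activated, ordinary catalog queries are dispatched to Solr and existing search code keeps working. A rough sketch using standard Plone APIs (the helper function itself is hypothetical):

    # With collective.solr installed and activated, SearchableText
    # queries like this are routed to Solr transparently.
    from Products.CMFCore.utils import getToolByName

    def central_search(context, text):
        catalog = getToolByName(context, 'portal_catalog')
        # Solr can hold documents from several Plone sites in one core,
        # which is what makes a cross-site search like this possible.
        return catalog(SearchableText=text)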
As an alternative, you can also use collective.elasticindex, an extension that indexes Plone content into ElasticSearch. According to its documentation:
This doesn't replace the Plone catalog with ElasticSearch, nor interact with the Plone catalog at all; it merely indexes content inside ElasticSearch when it is modified or published.
In addition to this, it provides a simple search page called search.html that queries ElasticSearch using JavaScript (so Plone is not involved in searching) and offers the same features as the default Plone search page. A search portlet lets you redirect people to this new search page as well.
That can be an advantage over collective.solr.

Creating a multilingual website

I am looking to build a multilingual website using MS Expression Web. The website will consist of a blog and possibly an art display section. I would like to do all translations manually, but I don't want to have more than one CSS stack. What would be the best way to populate the website text? Because this is just a learning experience, I will host the website myself.
What would be the best method to implement such a website: XML or SQL?
In my professional opinion I would use SQL, simply because databases are (in my opinion) easier to edit and manage than XML. I like KatieK's idea of simply querying a different database based on which language is requested. However, if this is a learning experience, I would use whichever technology you know least, because it will teach you more.
Also, as a side note: if this is run locally, you have to consider the ramifications of making MySQL database calls. It means you have to deal with server-side scripting (I'm assuming you would otherwise fetch the XML using JS, although I could be wrong), its CPU usage (although I wouldn't worry too much about that), and preventing SQL injection.
Finally (this is my last side note, I swear): I know you said that you wanted to do all the translations yourself, but I thought it couldn't hurt to mention that you don't have to.
Anyway, that's my two cents.
If it were me, I'd do it using SQL. I'd have two database tables, each holding the content in a different language, and change the SQL call server-side based on a query string.
But the best implementation method for you depends entirely on your skills and abilities. Do you have experience designing databases and writing SQL queries? Do you have a database set up right now?
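A minimal sketch of that two-table idea (the schema and names are hypothetical); the language value from the query string must be whitelisted server-side before it is used to choose a table:

    -- One content table per language (hypothetical schema).
    CREATE TABLE posts_en (id INT PRIMARY KEY, title VARCHAR(255), body TEXT);
    CREATE TABLE posts_fr (id INT PRIMARY KEY, title VARCHAR(255), body TEXT);

    -- Server-side code maps ?lang=fr to the matching table name
    -- (never interpolate the raw query string -- that invites SQL injection).
    SELECT title, body FROM posts_fr WHERE id = 42;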

Drupal feed aggregator

I found some modules for feed parsing (Aggregator, Feeds, FeedAPI). I am confused about which one to choose. I need to filter and classify the feeds. Can anyone guide me?
Feeds is an attempt to replace FeedAPI, done by the same developers. It should be better, but since FeedAPI has gathered extensions from other modules over time, Feeds might not yet offer some features that were previously available via extension modules (note that this is just speculation).
Both offer more functionality than Drupal's built-in Aggregator module, which is geared towards a 'lightweight' aggregation approach.
So I would start by checking the built-in Aggregator module. It offers 'categorization' of feeds and items, which might be enough for your need to 'filter' and 'classify'. If it is not enough, I would check the newer Feeds module next, and only 'fall back' to FeedAPI if you need some extension/functionality not yet available for Feeds.
Feeds is the way to go. FeedAPI is not going to be further developed.
Also, the Managing News install profile might be a good starting point depending on your needs. Both are built by Development Seed, who are forging ahead in doing interesting stuff with feeds.
Feeds and/or FeedAPI work well, but note that FeedAPI has been discontinued in favor of Feeds.
