How to get Drupal 8 Views working with GraphQL / Gatsby.js - drupal

I'm struggling to find an answer, because I'm not really sure if this is a Drupal, GraphQL, or Gatsby question.
I'm building a portfolio site with Gatsby.js and Drupal 8 as the data source (via gatsby-source-drupal).
GraphQL queries for nodes, taxonomy terms, users, etc. are working without problems.
But I cannot access the Views I created at my API endpoint.
I have created a working Views page with a path.
I also exposed the data as a block.
I tried a REST export as serialized JSON, but I cannot get it working with JSON:API and the JSON Views module.
I expect to access the data from my View at my /jsonapi endpoint, but my Views are not showing up.
I can't get my head around this. What am I missing? Is it even possible? Thanks!

TL;DR: You can't. (From Drupal's JSON:API docs: "Unlike the REST module that comes with Drupal Core, JSON:API does not export Views results.")
How to get filtered results
gatsby-source-drupal only fetches from the JSON:API endpoint (/jsonapi by default).
The way I see it, you can kind of emulate what Views does by using the filters option that gatsby-source-drupal provides.
The JSON:API Extras module for Drupal allows you to set some default filters as well.
Drupal documentation about filters
Example: ?filter[field_name]=value&filter[field_other]=value
Best!

With the JSON:API module you indeed cannot get Views, but there is also the GraphQL module and, for your purpose, the GraphQL Views module, which can make a view available in a custom GraphQL schema. Good luck!
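As a rough sketch of what consuming such a view could look like from the front end: once the GraphQL module is enabled and the view has a GraphQL display attached, the view becomes a field in the schema. The endpoint path /graphql and the field name articlesView below are assumptions — the actual field name is generated from your view's machine name and display ID, so check your site's schema first.

```javascript
// Sketch: query a Drupal view exposed via the GraphQL Views module.
// "articlesView" and "entityLabel" are hypothetical names for illustration.
const query = `
  {
    articlesView {
      results {
        entityLabel
      }
    }
  }
`;

// The POST payload a GraphQL endpoint expects.
const payload = { query };

// In a browser or Node 18+, send it with fetch:
async function fetchView(baseUrl) {
  const res = await fetch(`${baseUrl}/graphql`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  return res.json();
}
```

Note that gatsby-source-drupal will not pick this up — it only speaks JSON:API — so you would query the GraphQL endpoint directly (or via a separate source plugin).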

Related

Separate plugins or one giant plugin on WP

I'm building a WP plugin to enhance a website, and I've run into a question about the workflow.
Basically, I have to create a custom post type, along with several custom taxonomies, which will be used/displayed on the frontend and backend, and create a backend section in order to interact with our CRM and Supabase via their respective APIs (service centralisation).
All of the second part is only intended to be used/displayed in the admin section, by logged-in users.
However, when creating/saving a custom post type, or when viewing it from the frontend, I have to make a GET request to the CRM to fetch some data and store it as JSON somewhere (24 h cache).
That I can do.
At the moment, I have worked on the CPT part and made a class to interact with the CRM, with credentials stored in wp_options. I now have to work on the backend part.
My question is: what are the best practices here? Keep it in a single plugin, or divide it into several plugins?
And if I divide, how should I split it? Two plugins, one for the CPT and one for the backend? Or go even deeper, and give the CRM and Supabase each their own small plugin, and call their methods to make my requests?
I am short of ideas here, so if you have encountered this situation, could you enlighten me?

Should I use drupal's graphql module?

I'm building a website with Drupal as a CMS and am going to integrate GraphQL as a middleware. I've seen a lot of examples using Drupal's integrated GraphQL module, but I feel like it goes against the purpose of having a single decoupled middleware.
I'd like to make only one request from the frontend to GraphQL, which then retrieves data from Drupal or any other source. Doesn't the GraphQL module go against this philosophy, or have I misunderstood something?

How to get content from Solr to Drupal?

I have crawled some websites with Nutch and indexed them with Solr. Now I would like to pass this data to Drupal and make use of its search interface.
Most tutorials only show indexing from Drupal to Solr, but not from Solr to Drupal.
So how can I go about passing already-indexed data from Solr to Drupal?
You would need to build a custom module that queries your Solr index using the Solr HTTP API, then presents the results using Drupal's theming and render APIs.
I guess the reason all tutorials say that is because that's the only intended direction. Solr is a search engine; it's not supposed to be a "source" system feeding another storage (like an RDBMS): things are supposed to flow exactly the opposite way (e.g. Nutch --> Solr, RDBMS --> Solr).
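Whatever shape the custom module takes, the heart of it is an HTTP call to Solr's /select handler. A minimal sketch of building such a request URL — the core name "nutch" and the query field are assumptions for illustration:

```javascript
// Sketch: build a query URL for Solr's /select search handler.
// Core name and field names are assumptions; adjust to your index.
function buildSolrSelectUrl(baseUrl, core, params) {
  // wt=json asks Solr for a JSON response, which is easy to render from
  const query = new URLSearchParams({ wt: "json", ...params });
  return `${baseUrl}/solr/${core}/select?${query.toString()}`;
}

const url = buildSolrSelectUrl("http://localhost:8983", "nutch", {
  q: "content:drupal",
  rows: "10",
});
// A custom Drupal module would fetch this URL and hand the JSON
// response's docs array to the theming/render layer for display.
```

The same URL pattern works from any HTTP client, so the Drupal side only needs a thin wrapper around it plus a results template.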

Drupal Feed Aggregator and Twitter integration

I am currently using Drupal's built-in feed aggregator module to aggregate a bunch of RSS feeds. I also have the Twitter module installed. I want to set things up so that all new posts from the feed aggregator get sent out to Twitter, but unfortunately the Twitter module doesn't allow for that right now. Does anyone know of another way to do this, or a workaround? I know I can create a custom module to do this, but I didn't want to go down that road unless I had to.
My understanding of the Twitter module (it's been a while since I've used it) is that it will send out notifications to Twitter on new node creation or update. Would you be interested in the Feeds module as an alternative to the core aggregator in Drupal? That would create nodes from each feed item, instead of the Aggregator, which only creates database entries that expire. The Feeds module approach would create nodes, which would then theoretically automatically ping Twitter. See also http://drupal.org/node/403274 which I think is a similar feature request.

Is it possible with Views module to create a search and results page?

I know Drupal has built in search module, but I want more flexibility and control. Is it possible using Views to create the search form and results pages?
Sure. There are two ways. One is to use Views filters: just create the view for the results page, add a filter, and expose it. You can create a search block by checking the option to create a block for the exposed form in the Views settings. Install the Advanced Help module for more information about Views filters.
The other way is to use Apache Solr and the Apache Solr Views module. Same idea as just using Views filters, but it'll use the Solr search backend instead of just doing SQL queries to the database.
