I am currently trying to choose between numerous workflow frameworks, and one feature is essential to me: workflow composition.
I found nothing in the documentation, neither in the API reference nor in the advanced tutorial.
So my question is: is it possible to compose pipelines, i.e. to build DAGs out of already-written ones? There may be some workarounds, but I am interested in native support.
Thanks
I think you may be looking for the composite solid abstraction, which lets you compose sub-DAGs. https://docs.dagster.io/tutorial/advanced_solids#composite-solids https://docs.dagster.io/_apidocs/solids#composing-solids
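For illustration, here is a minimal sketch of what that composition looks like, based on the linked docs. This uses the pre-1.0 @solid/@composite_solid/@pipeline API; later Dagster releases renamed these to @op/@graph/@job, so check the docs for your version:

```python
from dagster import composite_solid, pipeline, solid

@solid
def add_one(context, num: int) -> int:
    return num + 1

@solid
def multiply_by_two(context, num: int) -> int:
    return num * 2

# A composite solid wraps a sub-DAG so it can be reused like a single solid.
@composite_solid
def add_then_double(num: int) -> int:
    return multiply_by_two(add_one(num))

@pipeline
def my_pipeline():
    # 'num' would be supplied through run config when the pipeline executes
    add_then_double()
```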
Related
I'm starting to develop Azure Blueprints, and I can see that the structure of an ARM template differs from the one used in an ARM deployment. I like to modularize code and am trying to figure out how to develop individual ARM templates properly and then incorporate them into the final blueprint. Right now, instead of directly putting an ARM artifact into the blueprint (along with 100 others), I manually debug the ARM template and then cut and paste it into the artifact. I'm wondering whether there is a more effective way of doing this, or am I missing something? The documentation seems to suggest incorporating templates directly into artifacts and then deploying/publishing/assigning the blueprint, which takes far too long when you just need to work on a single ARM template.
An effective, dynamic, and automated way to accomplish this is to leverage the Blueprints as Code repository, which helps you manage and configure the lifecycle of your blueprints and reduces effort compared to managing blueprints through the Portal.
Other related references:
Functions for use with Blueprints
Blueprints REST API Reference
Blueprints Az PowerShell Reference
I'm exploring Airflow as a workflow execution engine.
I like the fact that it's very flexible and allows multiple operator types, such as Python functions. However, I'm afraid I may be missing something fundamental: task reuse. I want to use existing operators in multiple DAGs without having to redefine them.
As far as I can tell, this is not supported. Am I wrong? If so, I'd be happy if someone could point me to a solution.
The only (awkward) solution that comes to mind is to have a dummy DAG for every operator and then build my DAG on top of these dummy DAGs with a TriggerDagRunOperator.
Many thanks!
The recommended way to achieve this would be to create your own Airflow plugin.
(From the Airflow Documentation on Plugins)
Airflow has a simple plugin manager built-in that can integrate external features to its core by simply dropping files in your $AIRFLOW_HOME/plugins folder.
The python modules in the plugins folder get imported, and hooks, operators, sensors, macros, executors and web views get integrated to Airflow’s main collections and become available for use.
So if you were to create a custom operator in your plugin, you would be able to re-use that same operator across multiple DAGs.
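For example, here is a minimal sketch of such a plugin. It follows the Airflow 1.x plugin mechanism, and the HelloOperator and my_plugin names are hypothetical:

```python
# $AIRFLOW_HOME/plugins/my_plugin.py
from airflow.models import BaseOperator
from airflow.plugins_manager import AirflowPlugin
from airflow.utils.decorators import apply_defaults

class HelloOperator(BaseOperator):
    """A reusable operator that just logs a greeting."""

    @apply_defaults
    def __init__(self, name, *args, **kwargs):
        super(HelloOperator, self).__init__(*args, **kwargs)
        self.name = name

    def execute(self, context):
        self.log.info("Hello, %s", self.name)

class MyPlugin(AirflowPlugin):
    name = "my_plugin"
    operators = [HelloOperator]
```

Any DAG can then do `from airflow.operators.my_plugin import HelloOperator` and instantiate the operator as usual.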
This repo may also be helpful as it has quite a few examples of Airflow plugins: https://github.com/airflow-plugins
I need to do a POC on .NET Core microservices with the CQRS pattern and MongoDB as the NoSQL database. I don't know where to start; please help.
Please ask for a specific problem and do some work of your own before asking.
That said, there is a rather nice NuGet package for .NET for working with MongoDB here: https://www.nuget.org/packages/MongoDB.Driver/
Personally, I have no expertise with CQRS; however, I found this, which may help: CQRS Read models in a NoSql (Mongo DB)
There is also a NuGet package for using MongoDB with CQRS.NET here: https://www.nuget.org/packages/Cqrs.MongoDB/
This is a very subjective question, and there might not be any right answer. First, you need to understand whether you really need CQRS. This pattern generally goes together with Event Sourcing. CQRS is only required in some cases where a system's reads and writes have to be separated.
Before you go into CQRS and event sourcing, I would strongly recommend understanding your requirements, since these patterns will make your application logic more complex.
This example covers all of the microservices concepts in a single project:
https://github.com/EdwinVW/pitstop/
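The pattern itself is language-agnostic, so to make the read/write separation concrete, here is a minimal sketch (in Python rather than .NET, and with purely hypothetical names; a real system would put MongoDB behind the read model and often project events asynchronously):

```python
class WriteStore:
    """Write side: an append-only log of events (event-sourcing flavor)."""
    def __init__(self):
        self._events = []

    def append(self, event):
        self._events.append(event)

class ReadModel:
    """Read side: a denormalized view kept separate from the write store."""
    def __init__(self):
        self._products = {}

    def apply(self, event):
        if event["type"] == "ProductCreated":
            self._products[event["id"]] = {"id": event["id"], "name": event["name"]}

    def get_product(self, product_id):
        return self._products.get(product_id)

def create_product(write_store, read_model, product_id, name):
    """Command handler: record the change, then project it to the read model."""
    event = {"type": "ProductCreated", "id": product_id, "name": name}
    write_store.append(event)
    read_model.apply(event)  # often done asynchronously in a real system

store, view = WriteStore(), ReadModel()
create_product(store, view, "p1", "Widget")
print(view.get_product("p1"))  # queries never touch the write store
```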
From the question here, there are two external libraries to use for HTTP operations. It seems that Dispatch has more visibility, while scalaj-http is easier to use, as stated there; thus I am more inclined toward scalaj-http. I want to use the HTTP library on Google App Engine, which has its own restrictions. For standard Java, there is a workaround from here. I would like advice on the best approach for using Scala on Google App Engine (this is not for the Lift framework).
I personally am very happy with Dispatch. There are several executors, including one for App Engine, dispatch-gae.
The official documentation only shows code with explanations, not a separate example project. My question is: do I need to create a standalone project that will contain the interface and the plugin class? And which project template should I use? A C++ library?
Knowing Qt's plugin architecture is probably not going to help you much when extending a third-party application. The application will undoubtedly have wrapped that mechanism for its own usage patterns, assuming the application is even extensible.
So to answer your question directly: the application you are developing for should have its own API and documentation for extending it; reading that will give you the answers you need.