Cannot import name 'operator' - airflow

I'm trying to create my first custom operator.
I have an airflow folder, a dags folder, and a plugins folder.
Within the plugins folder I have an operators folder, and my operator lives there.
In the operators folder I have the following __init__.py file:
from operators.my_operator import MyOperator

__all__ = [
    'MyOperator',
]
In my plugins folder I have the following __init__.py file:
from airflow.plugins_manager import AirflowPlugin
import operators

# Defining the plugin class
class NewOperator(AirflowPlugin):
    name = "test_operator"
    operators = [
        operators.MyOperator
    ]
Then, from my DAG, I import the operator like this: from airflow.operators import MyOperator
From the UI I see the error: cannot import name 'MyOperator'
What's wrong?
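(For context: in Airflow 1.x the plugins manager exposes a plugin's operators under a module named after the plugin, not directly under airflow.operators. Under that assumption, the DAG-side import for the setup above would look like the sketch below; this is an illustration based on the plugin definition shown, not a verified fix.)

# With the plugin above registered as name = "test_operator",
# Airflow 1.x would typically expose the operator as:
from airflow.operators.test_operator import MyOperator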


Missing Jar Files for HelloGlobe (JogAmp)

I am trying to run the following example.
I get errors for the following import lines:
import framework.Semantic;
import glm.mat.Mat4x4;
import glm.vec._2.Vec2;
import glm.vec._3.Vec3;
import uno.debug.GlDebugOutput;
import uno.glsl.Program;
...
import static glm.GlmKt.glm;
import static uno.buffer.UtilKt.destroyBuffer;
import static uno.buffer.UtilKt.destroyBuffers;
It seems that I am missing some more jar files, even though I have successfully imported jogl-all.jar and gluegen-rt.jar.
I tried searching mvnrepository but did not find the jars I was looking for. I am not using Maven.

ModuleNotFoundError after building exe with PyInstaller

My project folder is as below:
C:\HHS\Actions\<mysomeclass>.py
C:\HHS\Configuration\<mysomesetting>.txt
C:\HHS\Data\<mysomedata>.txt
C:\HHS\Form\<mysomeclass>.py
C:\HHS\Testcases\<mysometestcaseclass>.py
C:\HHS\Labels\OCR\Tmp\<somefile>.png
C:\HHS\main.py
C:\HHS\<somebatfile>.bat
After building the exe file with this command:
C:\HHS>pyinstaller --paths=C:\HHS\Actions;C:\HHS\Form;C:\HHS\Testcase main.py
I copied the folders Actions, Configuration, Data, Form, Testcases and all .bat files to C:\HHS\dist\main.
After that I ran main.exe in dist\main. This got past my previous issue (How to release an application written in Python (which has a folder structure) to users), but I hit another one:
C:\HHS\dist\main>main.exe
Which testcase do you want to test
1. Image Capture
2. Reset Base
3. OCRTestcase
4. USBOEMCommand (remember turn off Vietnamese)
Please enter that number : 1
C:\HHS\dist\main> py C:\HHS\dist\main\Testcases\ImageCapture.py
Traceback (most recent call last):
File "C:\HHS\dist\main\Testcases\ImageCapture.py", line 9, in <module>
from Actions.Config import Config
ModuleNotFoundError: No module named 'Actions'
This is my ImageCapture.py
import os
import serial
import time
import sys
import logging
from Actions.Config import Config
import Actions.Common as Common
from datetime import datetime
from Actions import DBv2
from Actions.parselog2 import ParseLog
I know the way I import the Actions module may not be correct (it runs successfully in PyCharm but not after building the exe), but I don't know how to fix it. Please help me. Thanks a lot.
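One likely cause, given the layout above: when main.exe launches the test case with py C:\HHS\dist\main\Testcases\ImageCapture.py, the script runs under the system Python with Testcases as the script directory, so the sibling Actions package is not on sys.path. A minimal sketch of one possible workaround, assuming the folder layout shown above (the _THIS_DIR/_PROJECT_ROOT helper names are illustrative):

# Top of ImageCapture.py, before any "from Actions..." imports.
import os
import sys

# Folder containing this file: ...\Testcases
_THIS_DIR = os.path.dirname(os.path.abspath(__file__))
# One level up: the folder that contains the Actions package
_PROJECT_ROOT = os.path.dirname(_THIS_DIR)

# Make sibling packages (Actions, Form, ...) importable when this
# script is launched directly rather than from the project root.
if _PROJECT_ROOT not in sys.path:
    sys.path.insert(0, _PROJECT_ROOT)

from Actions.Config import Config  # should now resolve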

TS file is not resolving the node module import path: it says it couldn't find the declaration for "../node_modules/pdfmake/build/pdfmake"

Here I am using a .NET Core project that includes a TS file, and inside the TS file it says it is unable to find the pdfmake module.
The import is unable to find the path of the module even though I have already installed that module at exactly the same path:
import * as pdfMake from "../node_modules/pdfmake/build/pdfmake";
//<reference path="../node_modules/pdfmake/build/pdfmake" />
import * as pdfFonts from "../node_modules/pdfmake/build/vfs_fonts";
pdfMake.vfs = pdfFonts.pdfMake.vfs;

airflow plugins not getting picked up correctly

We are using Apache Airflow 1.9.0. I have written a Snowflake hook plugin and placed the hook in the $AIRFLOW_HOME/plugins directory.
$AIRFLOW_HOME
+--plugins
   +--snowflake_hook2.py
snowflake_hook2.py
# This is the base class for a plugin
from airflow.plugins_manager import AirflowPlugin
# This is necessary to expose the plugin in the Web interface
from flask import Blueprint
from flask_admin import BaseView, expose
from flask_admin.base import MenuLink
# This is the base hook for connecting to a database
from airflow.hooks.dbapi_hook import DbApiHook
# This is the Snowflake-provided connector
import snowflake.connector
# This is the default python logging package
import logging

class SnowflakeHook2(DbApiHook):
    """
    Airflow Hook to communicate with Snowflake
    This is implemented as a Plugin
    """
    def __init__(self, connname_in='snowflake_default', db_in='default', wh_in='default', schema_in='default'):
        logging.info('# Connecting to {0}'.format(connname_in))
        self.conn_name_attr = 'snowflake_conn_id'
        self.connname = connname_in
        self.superconn = super().get_connection(self.connname)  # gets the values from Airflow
        {SNIP - Connection stuff that works}
        self.cur = self.conn.cursor()

    def query(self, q, params=None):
        """From jmoney's db_wrapper allows return of a full list of rows (tuples)"""
        if params is None:  # no params, so no insertion
            self.cur.execute(q)
        else:  # make the parameter substitution
            self.cur.execute(q, params)
        self.results = self.cur.fetchall()
        self.rowcount = self.cur.rowcount
        self.columnnames = [colspec[0] for colspec in self.cur.description]
        return self.results

    {SNIP - Other class functions}

class SnowflakePluginClass(AirflowPlugin):
    name = "SnowflakePluginModule"
    hooks = [SnowflakeHook2]
    operators = []
So I went ahead and put some print statements in Airflow's plugin_manager to try and get a better handle on what is happening. After restarting the webserver and running airflow list_dags, these lines were showing the "new module name" (and no errors):
SnowflakePluginModule [<class '__home__ubuntu__airflow__plugins_snowflake_hook2.SnowflakeHook2'>]
hook_module - airflow.hooks.snowflakepluginmodule
INTEGRATING airflow.hooks.snowflakepluginmodule
snowflakepluginmodule <module 'airflow.hooks.snowflakepluginmodule'>
As this is consistent with what the documentation says, I should be fine using this in my DAG:
from airflow import DAG
from airflow.hooks.snowflakepluginmodule import SnowflakeHook2
from airflow.operators.python_operator import PythonOperator
But the web UI throws this error:
Broken DAG: [/home/ubuntu/airflow/dags/test_sf2.py] No module named 'airflow.hooks.snowflakepluginmodule'
So the question is: what am I doing wrong? Or have I uncovered a bug?
You need to import as below:
from airflow import DAG
from airflow.hooks import SnowflakeHook2
from airflow.operators.python_operator import PythonOperator
OR
from airflow import DAG
from airflow.hooks.SnowflakePluginModule import SnowflakeHook2
from airflow.operators.python_operator import PythonOperator
I don't think that Airflow automatically goes through the folders in your plugins directory and runs everything underneath it. The way that I've set it up successfully is to have an __init__.py under the plugins directory which contains each plugin class. Have a look at the Astronomer plugins on GitHub; they provide some really good examples of how to set up your plugins.
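For illustration, a minimal sketch of that setup, reusing the class names from the question and assuming snowflake_hook2.py keeps only the hook class and sits next to the __init__.py (the exact import form is an assumption; the mysql_plugin repo linked below shows the precise layout Astronomer uses):

# $AIRFLOW_HOME/plugins/__init__.py
from airflow.plugins_manager import AirflowPlugin

# snowflake_hook2.py (same directory) now only defines SnowflakeHook2
from snowflake_hook2 import SnowflakeHook2

# The plugin class itself lives here, in the package __init__.py
class SnowflakePluginClass(AirflowPlugin):
    name = "SnowflakePluginModule"
    hooks = [SnowflakeHook2]
    operators = []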
In particular have a look at how they've set up the mysql plugin
https://github.com/airflow-plugins/mysql_plugin
Also, someone has incorporated a Snowflake hook into one of the later versions of Airflow, which you might want to leverage:
https://github.com/apache/incubator-airflow/blob/master/airflow/contrib/hooks/snowflake_hook.py

webpack alias error when importing a css file

I have a webpack alias config like this:
'c-assets': path.resolve(__dirname, './assets/')
and there is a file in the assets directory:
assets/css/normalize.css
When I use "import" to load the css file via the alias:
import "c-assets/css/normalize.css";
an error occurs. But when I use "require" to import the same file, it works fine:
require("c-assets/css/normalize.css")
Only css files lead to this problem. Why?
