I installed Django and was able to double-check that the module was in fact available to Python, but when attempting to run basic commands such as runserver or use manage.py, I get a DJANGO_SETTEINGS_MODULE error. I already used "set DJANGO_SETTINGS_MODULE = mysite.settings" as advised and added mysite.settings to the PATH for Python, as some documentation online directed me to.
Now, instead of undefined, it says no such module exists. I can't find anything else in the documentation, and I used my test site's name instead of "mysite" without any change. Does anyone know what I am missing? All I can find in the module library for Django in my Python is this code:
from __future__ import unicode_literals

from django.utils.version import get_version

VERSION = (1, 11, 5, 'final', 0)

__version__ = get_version(VERSION)


def setup(set_prefix=True):
    """
    Configure the settings (this happens as a side effect of accessing the
    first setting), configure logging and populate the app registry.
    Set the thread-local urlresolvers script prefix if `set_prefix` is True.
    """
    from django.apps import apps
    from django.conf import settings
    from django.urls import set_script_prefix
    from django.utils.encoding import force_text
    from django.utils.log import configure_logging

    configure_logging(settings.LOGGING_CONFIG, settings.LOGGING)
    if set_prefix:
        set_script_prefix(
            '/' if settings.FORCE_SCRIPT_NAME is None else force_text(settings.FORCE_SCRIPT_NAME)
        )
    apps.populate(settings.INSTALLED_APPS)
Are you sure you wrote the environment variable properly? I'm asking because I see you get the error DJANGO_SETTEINGS_MODULE ("settings" has a misspelling)...
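If the variable is spelled correctly and it still fails, you can also point Django at the settings module from inside Python instead of relying on the shell variable. A minimal sketch, assuming your project really is called mysite:

import os
import django

# Equivalent to setting DJANGO_SETTINGS_MODULE in the shell (note: no spaces
# around "=" when using "set" on Windows), but done in-process before any
# Django machinery is touched.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.settings")
django.setup()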
I have seen boost-build / bjam: execute a script post install (make 'install' a dependency of executing a script), where there is a recommendation to use notfile. Then I found the https://www.boost.org/build/doc/html/bbv2/builtins/raw.html page with a basic example, to which I've added the import of notfile:
import notfile;
notfile echo_something : #echo ;
actions echo
{
echo "something"
}
And I've tried this snippet in the Jamroot file of a project. If I do not have the import notfile line, it fails with:
...
Jamroot:57: in modules.load
ERROR: rule "notfile" unknown in module "Jamfile</home/USER/src/myproject>".
/usr/share/boost-build/src/build/project.jam:372: in load-jamfile
/usr/share/boost-build/src/build/project.jam:64: in load
/usr/share/boost-build/src/build/project.jam:142: in project.find
/usr/share/boost-build/src/build-system.jam:618: in load
/usr/share/boost-build/src/kernel/modules.jam:295: in import
/usr/share/boost-build/src/kernel/bootstrap.jam:139: in boost-build
/usr/share/boost-build/boost-build.jam:8: in module scope
If I do have the import notfile; line, it fails with:
Jamroot:56: Unescaped special character in argument notfile;
/usr/share/boost-build/src/kernel/modules.jam:258: in modules.import from module modules
error: When loading multiple modules, no specific rules or renaming is allowed
/usr/share/boost-build/src/build/project.jam:1121: in import from module Jamfile</home/USER/src/myproject>
Jamroot:62: in modules.load from module Jamfile</home/USER/src/myproject>
/usr/share/boost-build/src/build/project.jam:372: in load-jamfile from module project
/usr/share/boost-build/src/build/project.jam:64: in load from module project
/usr/share/boost-build/src/build/project.jam:142: in project.find from module project
/usr/share/boost-build/src/build-system.jam:618: in load from module build-system
/usr/share/boost-build/src/kernel/modules.jam:295: in import from module modules
/usr/share/boost-build/src/kernel/bootstrap.jam:139: in boost-build from module
/usr/share/boost-build/boost-build.jam:8: in module scope from module
How can I get this to work?
I just noticed the "Jamroot:56: Unescaped special character in argument notfile" message while writing the question, and it finally made sense (errors like "error: When loading multiple modules, no specific rules or renaming is allowed" are completely misleading and useless). I realized I had written:
import notfile;
... that is, with the semicolon directly after the word. It seems a space is required here, so with this change:
import notfile ;
... things start working again.
I'm trying to implement a custom XCom backend.
These are the steps I took:
Created an "include" directory in the main Airflow dir (AIRFLOW_HOME).
Created this "custom_xcom_backend.py" file inside it:
from typing import Any

from airflow.models.xcom import BaseXCom
import pandas as pd


class CustomXComBackend(BaseXCom):

    @staticmethod
    def serialize_value(value: Any):
        # Turn DataFrames into JSON before handing off to the default serializer
        if isinstance(value, pd.DataFrame):
            value = value.to_json(orient='records')
        return BaseXCom.serialize_value(value)

    @staticmethod
    def deserialize_value(result) -> Any:
        result = BaseXCom.deserialize_value(result)
        result = pd.read_json(result)
        return result
Set in the config file:
xcom_backend = include.custom_xcom_backend.CustomXComBackend
When I restarted the webserver, I got:
airflow.exceptions.AirflowConfigException: The object could not be loaded. Please check "xcom_backend" key in "core" section. Current value: "include.cust...
My guess is that it is not recognizing the "include" folder.
But how can I fix it?
*Note: There is no Docker; Airflow is installed directly on an Ubuntu machine.
Thanks!
So I solved it:
Put custom_xcom_backend.py into the plugins directory
Set in the config file:
xcom_backend = custom_xcom_backend.CustomXComBackend
Restart all Airflow-related services.
*Note: Do not store DataFrames that way (bad practice).
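The reason moving the file into the plugins directory works is that Airflow adds the plugins folder to sys.path, so the dotted path in xcom_backend becomes importable; the custom include folder was not on the path. Roughly, the config loader resolves the value like the sketch below (resolve_xcom_backend is just an illustrative name, not Airflow's actual helper):

from importlib import import_module

def resolve_xcom_backend(dotted_path: str):
    # Split "custom_xcom_backend.CustomXComBackend" into module and class name,
    # import the module, and return the class. This only succeeds if the module
    # is importable, i.e. its folder is on sys.path.
    module_path, _, class_name = dotted_path.rpartition(".")
    return getattr(import_module(module_path), class_name)

backend_cls = resolve_xcom_backend("custom_xcom_backend.CustomXComBackend")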
Sources I used:
https://www.youtube.com/watch?v=iI0ymwOij88
Attempting to import a module in the Deno REPL results in the following error:
Uncaught SyntaxError: Cannot use import statement outside a module
at evaluate (rt/40_repl.js:60:36)
at replLoop (rt/40_repl.js:160:15)
I use the Node REPL to quickly test out code, almost on a daily basis. The ability to import external code without writing a script or dealing with temporary files is a huge convenience.
Why can't Deno use import statements outside of a module? Is it even possible to use external code in the Deno REPL?
Starting with v1.4.3, you can use top-level await in the REPL to dynamically import modules:
> const path = await import("https://deno.land/std@0.73.0/path/mod.ts")
> path.basename("/my/path/name")
"name"
If you try to use import a from "a" in the Node REPL, it also throws the same error. Only require() can be used directly to import modules in the Node REPL.
Deno has no built-in CommonJS loader, so it does not even provide require for loading modules synchronously.
The technical reason why static import cannot be used in the REPL is that the REPL is actually a script evaluation tool: instead of compiling what you write into an ES module, it is treated as a plain script and fed directly into the engine, similar to a <script> tag in the browser without type="module". (ES modules with static imports have the semantics of asynchronously loading dependencies and determining the "shape" of a module without actually running it.)
To import modules in the Deno REPL, you can use dynamic import(). Personally, I sometimes do the following (loading is usually fast enough that the mod value is set before you continue using it in the REPL):
$ deno
> let mod; import("./mod.ts").then(m => mod = m)
Promise { <pending> }
Check file:///[blah]/mod.ts
> mod
Module { a: 1, Symbol(Symbol.toStringTag): "Module" }
We are using Apache Airflow 1.9.0. I have written a Snowflake hook plugin. I have placed the hook in the $AIRFLOW_HOME/plugins directory.
$AIRFLOW_HOME
+--plugins
+--snowflake_hook2.py
snowflake_hook2.py
# This is the base class for a plugin
from airflow.plugins_manager import AirflowPlugin

# This is necessary to expose the plugin in the Web interface
from flask import Blueprint
from flask_admin import BaseView, expose
from flask_admin.base import MenuLink

# This is the base hook for connecting to a database
from airflow.hooks.dbapi_hook import DbApiHook

# This is the snowflake provided Connector
import snowflake.connector

# This is the default python logging package
import logging


class SnowflakeHook2(DbApiHook):
    """
    Airflow Hook to communicate with Snowflake
    This is implemented as a Plugin
    """
    def __init__(self, connname_in='snowflake_default', db_in='default', wh_in='default', schema_in='default'):
        logging.info('# Connecting to {0}'.format(connname_in))
        self.conn_name_attr = 'snowflake_conn_id'
        self.connname = connname_in
        self.superconn = super().get_connection(self.connname)  # gets the values from Airflow

        {SNIP - Connection stuff that works}

        self.cur = self.conn.cursor()

    def query(self, q, params=None):
        """From jmoney's db_wrapper allows return of a full list of rows(tuples)"""
        if params is None:  # no params, so no substitution
            self.cur.execute(q)
        else:  # make the parameter substitution
            self.cur.execute(q, params)
        self.results = self.cur.fetchall()
        self.rowcount = self.cur.rowcount
        self.columnnames = [colspec[0] for colspec in self.cur.description]
        return self.results

    {SNIP - Other class functions}


class SnowflakePluginClass(AirflowPlugin):
    name = "SnowflakePluginModule"
    hooks = [SnowflakeHook2]
    operators = []
So I went ahead and put some print statements in Airflow's plugin_manager to try to get a better handle on what is happening. After restarting the webserver and running airflow list_dags, these lines were showing the "new module name" (and no errors):
SnowflakePluginModule [<class '__home__ubuntu__airflow__plugins_snowflake_hook2.SnowflakeHook2'>]
hook_module - airflow.hooks.snowflakepluginmodule
INTEGRATING airflow.hooks.snowflakepluginmodule
snowflakepluginmodule <module 'airflow.hooks.snowflakepluginmodule'>
As this is consistent with what the documentation says, I should be fine using this in my DAG:
from airflow import DAG
from airflow.hooks.snowflakepluginmodule import SnowflakeHook2
from airflow.operators.python_operator import PythonOperator
But the web UI throws this error:
Broken DAG: [/home/ubuntu/airflow/dags/test_sf2.py] No module named 'airflow.hooks.snowflakepluginmodule'
So the question is: what am I doing wrong? Or have I uncovered a bug?
You need to import as below:
from airflow import DAG
from airflow.hooks import SnowflakeHook2
from airflow.operators.python_operator import PythonOperator
OR
from airflow import DAG
from airflow.hooks.SnowflakePluginModule import SnowflakeHook2
from airflow.operators.python_operator import PythonOperator
I don't think that Airflow automatically goes through the folders in your plugins directory and runs everything underneath it. The way that I've set it up successfully is to have an __init__.py under the plugins directory which contains each plugin class. Have a look at the Astronomer plugins on GitHub; they provide some really good examples of how to set up your plugins.
In particular, have a look at how they've set up the mysql plugin:
https://github.com/airflow-plugins/mysql_plugin
Also, someone has incorporated a Snowflake hook into one of the later versions of Airflow, which you might want to leverage:
https://github.com/apache/incubator-airflow/blob/master/airflow/contrib/hooks/snowflake_hook.py
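For illustration, a plugins/__init__.py following that layout might look like the sketch below. The names are taken from the question, the sibling import assumes the plugins folder is on sys.path, and this is a rough sketch rather than code copied from the Astronomer repo:

# plugins/__init__.py
from airflow.plugins_manager import AirflowPlugin

# Assumes plugins/snowflake_hook2.py defines SnowflakeHook2 and no longer
# registers its own AirflowPlugin subclass there.
from snowflake_hook2 import SnowflakeHook2


class SnowflakePluginClass(AirflowPlugin):
    name = "SnowflakePluginModule"  # imported as airflow.hooks.SnowflakePluginModule
    hooks = [SnowflakeHook2]
    operators = []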
I am using QtAV but getting this error, so please help me:
QQmlApplicationEngine failed to load component
qrc:/main.qml:4 plugin cannot be loaded for module "QtAV": Cannot load library /home/intel/Qt5.7.0/5.7/gcc_64/qml/QtAV/libQmlAV.so: (libswresample.so.2: cannot open shared object file: No such file or directory)
at qrc:/main.qml:4 import QtAV 1.6
I am using Linux.
The application cannot find your requested plugin.
You need to check whether your import path is set correctly. If it is, check whether the plugin is installed correctly. You can use
export QML_IMPORT_TRACE=1
to check which import paths are set.
And make sure that all dependencies are installed as well, for instance libswresample (see https://ffmpeg.org/libswresample.html).