celerybeat uses UTC even with timezone settings

I'm finding that celerybeat is using UTC time in its scheduling (and outputting logs in UK time?!) even though I believe I have the required settings in my django settings.py:
TIME_ZONE = 'UTC'
USE_TZ = True
CELERY_ENABLE_UTC = True
CELERY_TIMEZONE = 'Australia/Sydney'
from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    "testRunBeat": {
        "task": "experiments.tasks.testHeartBeat",
        "schedule": crontab(minute="*/1", hour="13-14"),
    },
}
I have tried switching the TIME_ZONE variable with no luck
I am using:
django==1.4
celery==2.5.5
django-celery==2.5.5
Thanks

Turns out that it was a bug in celery which is now fixed. See https://github.com/celery/django-celery/issues/150

I think that you want
CELERY_ENABLE_UTC = False
The celery configuration docs state pretty clearly that if this value is true, dates and times are converted to UTC. Also note this value is enabled by default since version 3.0.
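For illustration, a minimal sketch of the relevant settings.py lines under that suggestion (untested; with UTC conversion disabled, celerybeat interprets the crontab in CELERY_TIMEZONE):
# settings.py -- sketch: schedule the crontab in the configured
# local time zone instead of converting it to UTC
CELERY_ENABLE_UTC = False
CELERY_TIMEZONE = 'Australia/Sydney'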


Using another connection instead of the default connection throughout the DAG

I'm using Airflow in Google Composer. By default, all the tasks in a DAG use the default connection to communicate with Storage, BigQuery, etc. Obviously we can specify another connection configured in Airflow, e.g.:
task_custom = bigquery.BigQueryInsertJobOperator(
    task_id='task_custom_connection',
    gcp_conn_id='my_gcp_connection',
    configuration={
        "query": {
            "query": 'SELECT 1',
            "useLegacySql": False
        }
    }
)
Is it possible to use a specific connection as the default for all tasks in the entire DAG?
Thanks in advance.
UPDATE:
Specifying gcp_conn_id via default_args in the DAG (as Javier Lopez Tomas recommended) doesn't completely work. Operators that expect gcp_conn_id as a parameter work fine, but in my case, unfortunately, most interactions with GCP components go via clients or hooks inside PythonOperators.
For example: if I call DataflowHook (inside a function called by a PythonOperator) without specifying the connection, it internally uses "google_cloud_default" and not "gcp_conn_id" :(
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.hooks.dataflow import DataflowHook

def _dummy_func(**context):
    # Uses "google_cloud_default", ignoring default_args
    df_hook = DataflowHook()

default_args = {
    'gcp_conn_id': 'my_gcp_connection'
}

with DAG(dag_id='my_dag', default_args=default_args) as dag:
    dummy = PythonOperator(task_id='dummy', python_callable=_dummy_func)
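One possible workaround (a sketch, not from the original post; it assumes the callable can receive arguments): pass the connection id into the callable explicitly, e.g. via op_kwargs, and hand it to the hook yourself:
def _dummy_func(gcp_conn_id, **context):
    # Forward the connection id to the hook explicitly
    df_hook = DataflowHook(gcp_conn_id=gcp_conn_id)

dummy = PythonOperator(
    task_id='dummy',
    python_callable=_dummy_func,
    op_kwargs={'gcp_conn_id': 'my_gcp_connection'},
)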
You can use default args:
https://airflow.apache.org/docs/apache-airflow/stable/tutorial.html#default-arguments
In your case it would be:
default_args = {
    "gcp_conn_id": "my_gcp_connection"
}
with DAG(blabla)...
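Spelled out as a minimal sketch (untested; the DAG id and start date are placeholders), any operator that accepts gcp_conn_id then inherits it from default_args:
import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

default_args = {
    "gcp_conn_id": "my_gcp_connection"
}

with DAG(
    dag_id="my_dag",
    start_date=datetime.datetime(2021, 1, 1),
    default_args=default_args,
) as dag:
    # No gcp_conn_id here; it is picked up from default_args
    task_custom = BigQueryInsertJobOperator(
        task_id="task_custom_connection",
        configuration={"query": {"query": "SELECT 1", "useLegacySql": False}},
    )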

apache airflow variables on startup

I'm learning Airflow and am planning to set some variables to use across different tasks. These are in my dags folder, saved as configs.json, like so:
{
    "vars": {
        "task1_args": {
            "something": "This is task 1"
        },
        "task2_args": {
            "something": "this is task 2"
        }
    }
}
I get that we can go to Admin --> Variables and upload the file. But I have two questions:
1. What if I want to adjust some of the variables while Airflow is running? I can adjust my code easily and it updates in real time, but this doesn't seem to work for variables.
2. Is there a way to just auto-import this specific file on startup? I don't want to have to add it every time I'm testing my project.
I don't see this mentioned in the docs, but it seems like a pretty trivial thing to want.
What you are looking for is "With code, how do you update an airflow variable?"
Here's an untested snippet that should help:
from airflow.models import Variable
Variable.set(key="my_key", value="my_value")
So basically you can write a bootstrap Python script to do this setup for you.
In our team, we use such scripts to set up all Connections and Pools too.
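For the configs.json from the question, such a bootstrap script might look like this (a sketch; the file path and the choice to store each top-level entry as its own Variable are assumptions):
import json

from airflow.models import Variable

# Path to the file from the question (an assumption; adjust as needed)
with open("dags/configs.json") as f:
    config = json.load(f)

# Store each entry as its own Variable, serialized as JSON so it can be
# read back with Variable.get(key, deserialize_json=True)
for key, value in config["vars"].items():
    Variable.set(key=key, value=value, serialize_json=True)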
In case you are wondering, here's the set(..) method from the source:
@classmethod
@provide_session
def set(
    cls,
    key: str,
    value: Any,
    serialize_json: bool = False,
    session: Session = None
):
    """
    Sets a value for an Airflow Variable with a given Key
    :param key: Variable Key
    :param value: Value to set for the Variable
    :param serialize_json: Serialize the value to a JSON string
    :param session: SQL Alchemy Sessions
    """
    if serialize_json:
        stored_value = json.dumps(value, indent=2)
    else:
        stored_value = str(value)
    Variable.delete(key, session=session)
    session.add(Variable(key=key, val=stored_value))
    session.flush()

How to pass a parameter from the Jupyter backend to a frontend extension

I currently have a value that is stored as an environment variable in the environment where a Jupyter server is running. I would like to somehow pass that value to a frontend extension. It does not have to read the environment variable in real time; I am fine with just using the value of the variable at startup. Is there a canonical way to pass parameters to a frontend extension on startup? I would appreciate examples of both setting the parameter from the backend and accessing it from the frontend.
[update]
I have posted a solution that works for nbextensions, but I can't seem to find the equivalent pattern for labextensions (TypeScript); any help there would be much appreciated.
I was able to do this by adding the following code to my jupyter_notebook_config.py:
from notebook.services.config import ConfigManager
cm = ConfigManager()
cm.update('notebook', {'variable_being_set': value})
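Since the value starts out as an environment variable, it can be read directly in the same config file. A sketch (the variable name MY_SETTING and its default are placeholders):
import os

from notebook.services.config import ConfigManager

# Read the environment variable at server startup
# (MY_SETTING is a placeholder name)
value = os.environ.get('MY_SETTING', 'default')

cm = ConfigManager()
cm.update('notebook', {'variable_being_set': value})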
Then I had the parameters defined in my extension's main.js:
// define default values for config parameters
var params = {
    variable_being_set: 'default'
};

// to be called once config is loaded, this updates default config vals
// with the ones specified by the server's config file
var update_params = function() {
    var config = Jupyter.notebook.config;
    for (var key in params) {
        if (config.data.hasOwnProperty(key)) {
            params[key] = config.data[key];
        }
    }
};
I also have the parameters declared in my main.yaml
Parameters:
- name: variable_being_set
  description: ...
  input_type: text
  default: default_value
This took some trial and error to figure out, because there is very little documentation on the ConfigManager class and none of it has an end-to-end example.

Symfony Doctrine - set time zone

I have multiple Symfony applications running on the same server, and every application may need a different time zone.
For PHP, it's doable by using different FPM pools and setting the time zone in the pool configuration:
php_admin_value[date.timezone] = America/New_York
but for MySQL, I need to issue the statement:
SET time_zone = 'America/New_York';
as the first query after connecting, or add it to the $options array in the PDO constructor:
PDO::MYSQL_ATTR_INIT_COMMAND => "SET time_zone = 'America/New_York';"
How can this be done?
You can use the postConnect Doctrine event: https://www.doctrine-project.org/projects/doctrine-dbal/en/2.9/reference/events.html#postconnect-event
This is useful for configuring the connection before any SQL statement is executed.
An example may be:
<?php

namespace App\EventListener;

use Doctrine\DBAL\Event\ConnectionEventArgs;

/**
 * My initializer
 */
class MyPdoInitializerListener
{
    public function postConnect(ConnectionEventArgs $args)
    {
        $args->getConnection()
             ->exec("SET time_zone = 'America/New_York'");
    }
}
Don't forget to add the listener to services.yaml
# services.yaml
# ...
App\EventListener\MyPdoInitializerListener:
    tags:
        - { name: doctrine.event_listener, event: postConnect }
To extend on SilvioQ's answer, I wrote the following snippet to inherit the time zone from the PHP environment, assuming that date_default_timezone_set() was previously called to set the correct time zone.
This approach has an advantage over named time zones (like "America/New_York") when those named time zones are not available within MySQL/MariaDB and you have no control over the server:
public function postConnect(ConnectionEventArgs $args)
{
    // Convert the current PHP offset into a numeric '+HH:MM' string
    $tz = new DateTimeZone(date_default_timezone_get());
    $offset = $tz->getOffset(new DateTime('now'));
    $abs = abs($offset);
    $str = sprintf('%s%02d:%02d', $offset < 0 ? '-' : '+', intdiv($abs, 3600), intdiv($abs % 3600, 60));
    $args->getConnection()->exec("SET time_zone = '$str'");
}
As of version 3.5 of Doctrine DBAL, the solution proposed by SilvioQ is deprecated.
Doctrine documentation recommends implementing a middleware class for the database driver instead. An example of such implementation can be found within the DBAL code itself: https://github.com/doctrine/dbal/blob/3.5.x/src/Logging/Middleware.php

Meteor Package: Add Custom Options

I've created a Meteor smart package and would like to add user-generated custom options to the API.
However, I'm having issues due to Meteor's automatic load ordering.
SocialButtons.config({
    facebook: false
});
This runs a config block that adds defaults.
SocialButtons.config = function (options) {
    // ... add to options if valid ...
};
Which in turn grabs a set of defaults:
var defaults = {
    facebook: true,
    twitter: true
};
Which are mixed into the settings.
var settings = _.extend(defaults, options);
...(program starts, uses settings)...
The problem is that everything must run in the proper order:
1. Create the SocialButtons object
2. Run the optional SocialButtons.config()
3. Create settings & run the program
How can I control the load order in Meteor without knowing where a user might place the optional configuration?
Step 2 will be in a different folder/file, but must run sandwiched between steps 1 & 3.
You can't really control load order right now, so it's not guaranteed, but files placed in a lib directory are loaded first. In your case it doesn't really matter, though; it might be something else. Here is a very simple package where you can view the source to see how I set up default options and allow them to be replaced easily: https://github.com/voidale/meteor-bootstrap-alerts
Figured this out:
1. Put your package code into a /lib directory.
2. Include a setup function that sets the settings when called and loads the data.
3. Return the data from the startup function.
In this case:
SocialButtons.get = function () {
    return initButtons();
};

function initButtons() {
    // ... settings, startup, return final value ...
}
