I'm using Drush locally without any problems, but on my hosting there is an error with locale import and database updates. I've checked many things and found nothing. What's strange is that commands like config import/export, sql:dump, and drush status work fine, and there is a working SQL connection.
Full output:
php74 vendor/bin/drush locale:import pl ../translations/custom-translations.pl.po --type=customized --override=all --debug
[preflight] Config paths: /home/pathtomywebsite/vendor/drush/drush/drush.yml
[preflight] Alias paths: /home/pathtomywebsite/web/drush/sites,/home/pathtomywebsite/drush/sites
[preflight] Commandfile search paths: /home/pathtomywebsite/vendor/drush/drush/src
[debug] Bootstrap further to find locale:import [0.07 sec, 8.75 MB]
[debug] Trying to bootstrap as far as we can [0.07 sec, 8.75 MB]
[debug] Drush bootstrap phase: bootstrapDrupalRoot() [0.07 sec, 8.75 MB]
[debug] Change working directory to /home/pathtomywebsite/web [0.07 sec, 8.75 MB]
[debug] Initialized Drupal 9.2.0 root directory at /home/pathtomywebsite/web [0.07 sec, 8.75 MB]
[debug] Drush bootstrap phase: bootstrapDrupalSite() [0.07 sec, 9.08 MB]
[debug] Initialized Drupal site default at sites/default [0.08 sec, 9.31 MB]
[debug] Drush bootstrap phase: bootstrapDrupalConfiguration() [0.08 sec, 9.31 MB]
[debug] Add service modifier [0.08 sec, 9.49 MB]
[debug] Drush bootstrap phase: bootstrapDrupalDatabase() [0.08 sec, 9.96 MB]
[debug] Successfully connected to the Drupal database. [0.08 sec, 9.96 MB]
[debug] Drush bootstrap phase: bootstrapDrupalFull() [0.08 sec, 9.96 MB]
[debug] Start bootstrap of the Drupal Kernel. [0.08 sec, 9.96 MB]
[debug] Finished bootstrap of the Drupal Kernel. [0.15 sec, 16.23 MB]
[debug] Add a command: twig-tweak:validate [0.2 sec, 21.52 MB]
[debug] Add a command: twig-tweak:debug [0.2 sec, 21.52 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\config\ConfigCommands [0.22 sec, 23.4 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\config\ConfigExportCommands [0.22 sec, 23.43 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\config\ConfigImportCommands [0.22 sec, 23.44 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\BatchCommands [0.22 sec, 23.45 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\CliCommands [0.22 sec, 23.45 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\DrupalCommands [0.22 sec, 23.46 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\DeployHookCommands [0.22 sec, 23.47 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\EntityCommands [0.22 sec, 23.48 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\ImageCommands [0.22 sec, 23.49 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\JsonapiCommands [0.22 sec, 23.5 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\LanguageCommands [0.22 sec, 23.5 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\LocaleCommands [0.22 sec, 23.51 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\MessengerCommands [0.22 sec, 23.53 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\MigrateRunnerCommands [0.22 sec, 23.54 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\QueueCommands [0.22 sec, 23.59 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\RoleCommands [0.22 sec, 23.6 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\StateCommands [0.23 sec, 23.62 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\TwigCommands [0.23 sec, 23.64 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\UserCommands [0.23 sec, 23.64 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\ViewsCommands [0.23 sec, 23.69 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\WatchdogCommands [0.23 sec, 23.71 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\pm\PmCommands [0.23 sec, 23.74 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\pm\ThemeCommands [0.23 sec, 23.76 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\sql\SanitizeCommands [0.23 sec, 23.76 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\sql\SanitizeCommentsCommands [0.23 sec, 23.77 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\sql\SanitizeSessionsCommands [0.23 sec, 23.77 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\sql\SanitizeUserFieldsCommands [0.23 sec, 23.77 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\sql\SanitizeUserTableCommands [0.23 sec, 23.78 MB]
[debug] Add a commandfile class: Drupal\entity_reference_revisions\Commands\EntityReferenceRevisionsCommands [0.23 sec, 23.78 MB]
[debug] Add a commandfile class: Drupal\token\Commands\TokenCommands [0.23 sec, 23.79 MB]
[debug] Add a commandfile class: Drupal\pathauto\Commands\PathautoCommands [0.23 sec, 23.79 MB]
[debug] Done with bootstrap max in Application::bootstrapAndFind(): trying to find locale:import again. [0.23 sec, 23.8 MB]
[debug] Starting bootstrap to none [0.23 sec, 23.91 MB]
[debug] Drush bootstrap phase 0 [0.23 sec, 23.91 MB]
[debug] Try to validate bootstrap phase 0 [0.24 sec, 23.91 MB]
[info] Executing: /home/pathtomywebsite/vendor/drush/drush/drush batch-process 15 --uri=default --root=/home/pathtomywebsite/web [0.27 sec, 25.74 MB]
>
>
> Command batch-process was not found. Drush was unable to query the database. As a result, many commands are unavailable. Re-run your command with --debug to see relevant log messages.
>
>
In ProcessBase.php line 155:
[InvalidArgumentException]
Output is empty.
Exception trace:
at /home/pathtomywebsite/vendor/consolidation/site-process/src/ProcessBase.php:155
Consolidation\SiteProcess\ProcessBase->getOutputAsJson() at /home/pathtomywebsite/vendor/drush/drush/includes/batch.inc:157
_drush_backend_batch_process() at /home/pathtomywebsite/vendor/drush/drush/includes/batch.inc:80
drush_backend_batch_process() at /home/pathtomywebsite/vendor/drush/drush/src/Drupal/Commands/core/LocaleCommands.php:268
Drush\Drupal\Commands\core\LocaleCommands->import() at n/a:n/a
call_user_func_array() at /home/pathtomywebsite/vendor/consolidation/annotated-command/src/CommandProcessor.php:257
Consolidation\AnnotatedCommand\CommandProcessor->runCommandCallback() at /home/pathtomywebsite/vendor/consolidation/annotated-command/src/CommandProcessor.php:212
Consolidation\AnnotatedCommand\CommandProcessor->validateRunAndAlter() at /home/pathtomywebsite/vendor/consolidation/annotated-command/src/CommandProcessor.php:176
Consolidation\AnnotatedCommand\CommandProcessor->process() at /home/pathtomywebsite/vendor/consolidation/annotated-command/src/AnnotatedCommand.php:311
Consolidation\AnnotatedCommand\AnnotatedCommand->execute() at /home/pathtomywebsite/vendor/symfony/console/Command/Command.php:255
Symfony\Component\Console\Command\Command->run() at /home/pathtomywebsite/vendor/symfony/console/Application.php:1027
Symfony\Component\Console\Application->doRunCommand() at /home/pathtomywebsite/vendor/symfony/console/Application.php:273
Symfony\Component\Console\Application->doRun() at /home/pathtomywebsite/vendor/symfony/console/Application.php:149
Symfony\Component\Console\Application->run() at /home/pathtomywebsite/vendor/drush/drush/src/Runtime/Runtime.php:118
Drush\Runtime\Runtime->doRun() at /home/pathtomywebsite/vendor/drush/drush/src/Runtime/Runtime.php:48
Drush\Runtime\Runtime->run() at /home/pathtomywebsite/vendor/drush/drush/drush.php:72
require() at /home/pathtomywebsite/vendor/drush/drush/drush:4
Drush status:
php74 vendor/bin/drush status
Drupal version : 9.2.0
Site URI : http://default
DB driver : mysql
DB hostname : localhost
DB port : 3306
DB username : ****
DB name : ****
Database : Connected
Drupal bootstrap : Successful
Default theme : ttp
Admin theme : seven
PHP binary : /usr/local/php7.4/bin/php
PHP config : /usr/local/php7.4/php.ini
PHP OS : Linux
Drush script : /home/pathtomywebsite/vendor/drush/drush/drush
Drush version : 10.5.0
Drush temp : /tmp
Drush configs : /home/pathtomywebsite/vendor/drush/drush/drush.yml
Install profile : standard
Drupal root : /home/pathtomywebsite/web
Site path : sites/default
Files, Public : sites/default/files
Files, Temp : /tmp
sql:connection returns a working connection string for MySQL. I'm stuck; has anyone run into a similar problem?
Debug output from updatedb command:
php74 vendor/bin/drush updatedb
In Process.php line 266:
The command "/home/pathtomywebsite/vendor/drush/drush/drush updatedb:status --no-entity-updates --uri=default --root=/home/pathtomywebsite/web" failed.
Exit Code: 1(General error)
Working directory:
Output:
================
Error Output:
================
In BootstrapHook.php line 32:
Bootstrap failed. Run your command with -vvv for more information.
user#server:~/somepath$ php74 vendor/bin/drush updatedb -vvv
[preflight] Config paths: /home/pathtomywebsite/vendor/drush/drush/drush.yml
[preflight] Alias paths: /home/pathtomywebsite/web/drush/sites,/home/pathtomywebsite/drush/sites
[preflight] Commandfile search paths: /home/pathtomywebsite/vendor/drush/drush/src
[debug] Starting bootstrap to full [0.06 sec, 8.78 MB]
[debug] Drush bootstrap phase 5 [0.06 sec, 8.78 MB]
[debug] Try to validate bootstrap phase 5 [0.06 sec, 8.78 MB]
[debug] Try to validate bootstrap phase 5 [0.06 sec, 8.78 MB]
[debug] Try to bootstrap at phase 5 [0.06 sec, 8.78 MB]
[debug] Drush bootstrap phase: bootstrapDrupalRoot() [0.06 sec, 8.78 MB]
[debug] Change working directory to /home/pathtomywebsite/web [0.06 sec, 8.78 MB]
[debug] Initialized Drupal 9.2.0 root directory at /home/pathtomywebsite/web [0.06 sec, 8.78 MB]
[debug] Try to validate bootstrap phase 5 [0.06 sec, 8.78 MB]
[debug] Try to bootstrap at phase 5 [0.06 sec, 9.17 MB]
[debug] Drush bootstrap phase: bootstrapDrupalSite() [0.06 sec, 9.17 MB]
[debug] Initialized Drupal site default at sites/default [0.06 sec, 9.34 MB]
[debug] Try to validate bootstrap phase 5 [0.06 sec, 9.34 MB]
[debug] Try to bootstrap at phase 5 [0.06 sec, 9.34 MB]
[debug] Drush bootstrap phase: bootstrapDrupalConfiguration() [0.06 sec, 9.34 MB]
[debug] Add service modifier [0.07 sec, 9.55 MB]
[debug] Try to validate bootstrap phase 5 [0.07 sec, 9.55 MB]
[debug] Try to bootstrap at phase 5 [0.07 sec, 10.06 MB]
[debug] Drush bootstrap phase: bootstrapDrupalDatabase() [0.07 sec, 10.06 MB]
[debug] Successfully connected to the Drupal database. [0.07 sec, 10.06 MB]
[debug] Try to validate bootstrap phase 5 [0.07 sec, 10.06 MB]
[debug] Try to bootstrap at phase 5 [0.07 sec, 10.06 MB]
[debug] Drush bootstrap phase: bootstrapDrupalFull() [0.07 sec, 10.06 MB]
[debug] Start bootstrap of the Drupal Kernel. [0.07 sec, 10.06 MB]
[info] entity_reference_revisions should have an extra.drush.services section in its composer.json. See http://docs.drush.org/en/10.x/commands/#specifying-the-services-file. [0.1 sec, 12.42 MB]
[debug] Found drush.services.yml for token Drush commands [0.1 sec, 12.57 MB]
[info] twig_tweak should have an extra.drush.services section in its composer.json. See http://docs.drush.org/en/10.x/commands/#specifying-the-services-file. [0.1 sec, 12.57 MB]
[debug] Found drush.services.yml for pathauto Drush commands [0.1 sec, 12.57 MB]
[debug] Get container builder [0.1 sec, 12.59 MB]
[debug] Service modifier alter. [0.11 sec, 12.69 MB]
[debug] process drush.console.services console.command [0.17 sec, 17.37 MB]
[debug] Found tagged service twig_tweak.validate [0.17 sec, 17.37 MB]
[debug] Found tagged service twig_tweak.debug [0.17 sec, 17.37 MB]
[debug] process drush.command.services drush.command [0.17 sec, 17.37 MB]
[debug] Found tagged service config.commands [0.17 sec, 17.37 MB]
[debug] Found tagged service config.export.commands [0.17 sec, 17.37 MB]
[debug] Found tagged service config.import.commands [0.17 sec, 17.37 MB]
[debug] Found tagged service batch.commands [0.17 sec, 17.37 MB]
[debug] Found tagged service cli.commands [0.17 sec, 17.37 MB]
[debug] Found tagged service drupal.commands [0.17 sec, 17.37 MB]
[debug] Found tagged service deploy_hook.commands [0.17 sec, 17.37 MB]
[debug] Found tagged service entity.commands [0.17 sec, 17.37 MB]
[debug] Found tagged service image.commands [0.17 sec, 17.37 MB]
[debug] Found tagged service jsonapi.commands [0.17 sec, 17.38 MB]
[debug] Found tagged service language.commands [0.17 sec, 17.38 MB]
[debug] Found tagged service locale.commands [0.17 sec, 17.38 MB]
[debug] Found tagged service messenger.commands [0.17 sec, 17.38 MB]
[debug] Found tagged service migrate_runner.commands [0.17 sec, 17.38 MB]
[debug] Found tagged service queue.commands [0.17 sec, 17.38 MB]
[debug] Found tagged service role.commands [0.17 sec, 17.38 MB]
[debug] Found tagged service state.commands [0.17 sec, 17.38 MB]
[debug] Found tagged service twig.commands [0.17 sec, 17.38 MB]
[debug] Found tagged service user.commands [0.17 sec, 17.38 MB]
[debug] Found tagged service views.commands [0.17 sec, 17.38 MB]
[debug] Found tagged service watchdog.commands [0.17 sec, 17.39 MB]
[debug] Found tagged service pm.commands [0.17 sec, 17.39 MB]
[debug] Found tagged service theme.commands [0.17 sec, 17.39 MB]
[debug] Found tagged service sanitize.commands [0.17 sec, 17.39 MB]
[debug] Found tagged service sanitize.comments.commands [0.17 sec, 17.39 MB]
[debug] Found tagged service sanitize.sessions.commands [0.17 sec, 17.39 MB]
[debug] Found tagged service sanitize.userfields.commands [0.17 sec, 17.39 MB]
[debug] Found tagged service sanitize.usertable.commands [0.17 sec, 17.39 MB]
[debug] Found tagged service entity_reference_revisions.commands [0.17 sec, 17.39 MB]
[debug] Found tagged service token.commands [0.17 sec, 17.39 MB]
[debug] Found tagged service pathauto.commands [0.17 sec, 17.39 MB]
[debug] process drush.command_info_alterer.services drush.command_info_alterer [0.17 sec, 17.39 MB]
[debug] process drush.generator.services drush.generator [0.17 sec, 17.39 MB]
[debug] Finished bootstrap of the Drupal Kernel. [0.3 sec, 26.24 MB]
[debug] Add a command: twig-tweak:validate [0.4 sec, 36.8 MB]
[debug] Add a command: twig-tweak:debug [0.4 sec, 36.8 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\config\ConfigCommands [0.42 sec, 38.48 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\config\ConfigExportCommands [0.42 sec, 38.52 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\config\ConfigImportCommands [0.42 sec, 38.52 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\BatchCommands [0.42 sec, 38.53 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\CliCommands [0.42 sec, 38.54 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\DrupalCommands [0.42 sec, 38.54 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\DeployHookCommands [0.42 sec, 38.56 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\EntityCommands [0.42 sec, 38.56 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\ImageCommands [0.42 sec, 38.57 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\JsonapiCommands [0.42 sec, 38.58 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\LanguageCommands [0.42 sec, 38.59 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\LocaleCommands [0.42 sec, 38.6 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\MessengerCommands [0.42 sec, 38.62 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\MigrateRunnerCommands [0.42 sec, 38.62 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\QueueCommands [0.43 sec, 38.67 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\RoleCommands [0.43 sec, 38.68 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\StateCommands [0.43 sec, 38.71 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\TwigCommands [0.43 sec, 38.72 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\UserCommands [0.43 sec, 38.73 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\ViewsCommands [0.43 sec, 38.77 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\core\WatchdogCommands [0.43 sec, 38.8 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\pm\PmCommands [0.43 sec, 38.83 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\pm\ThemeCommands [0.43 sec, 38.84 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\sql\SanitizeCommands [0.43 sec, 38.85 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\sql\SanitizeCommentsCommands [0.43 sec, 38.85 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\sql\SanitizeSessionsCommands [0.43 sec, 38.85 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\sql\SanitizeUserFieldsCommands [0.43 sec, 38.86 MB]
[debug] Add a commandfile class: Drush\Drupal\Commands\sql\SanitizeUserTableCommands [0.43 sec, 38.86 MB]
[debug] Add a commandfile class: Drupal\entity_reference_revisions\Commands\EntityReferenceRevisionsCommands [0.43 sec, 38.87 MB]
[debug] Add a commandfile class: Drupal\token\Commands\TokenCommands [0.43 sec, 38.87 MB]
[debug] Add a commandfile class: Drupal\pathauto\Commands\PathautoCommands [0.43 sec, 38.87 MB]
[info] Executing: /home/pathtomywebsite/vendor/drush/drush/drush updatedb:status --no-entity-updates --uri=default --root=/home/pathtomywebsite/web [0.56 sec, 41.87 MB]
In Process.php line 266:
[Symfony\Component\Process\Exception\ProcessFailedException]
The command "/home/pathtomywebsite/vendor/drush/drush/drush updatedb:status --no-entity-updates --uri=default --root=/home/pathtomywebsite/web" failed.
Exit Code: 1(General error)
Working directory:
Output:
================
Error Output:
================
In BootstrapHook.php line 32:
Bootstrap failed. Run your command with -vvv for more information.
Exception trace:
at /home/pathtomywebsite/vendor/symfony/process/Process.php:266
Symfony\Component\Process\Process->mustRun() at /home/pathtomywebsite/vendor/drush/drush/src/Commands/core/UpdateDBCommands.php:67
Drush\Commands\core\UpdateDBCommands->updatedb() at n/a:n/a
call_user_func_array() at /home/pathtomywebsite/vendor/consolidation/annotated-command/src/CommandProcessor.php:257
Consolidation\AnnotatedCommand\CommandProcessor->runCommandCallback() at /home/pathtomywebsite/vendor/consolidation/annotated-command/src/CommandProcessor.php:212
Consolidation\AnnotatedCommand\CommandProcessor->validateRunAndAlter() at /home/pathtomywebsite/vendor/consolidation/annotated-command/src/CommandProcessor.php:176
Consolidation\AnnotatedCommand\CommandProcessor->process() at /home/pathtomywebsite/vendor/consolidation/annotated-command/src/AnnotatedCommand.php:311
Consolidation\AnnotatedCommand\AnnotatedCommand->execute() at /home/pathtomywebsite/vendor/symfony/console/Command/Command.php:255
Symfony\Component\Console\Command\Command->run() at /home/pathtomywebsite/vendor/symfony/console/Application.php:1027
Symfony\Component\Console\Application->doRunCommand() at /home/pathtomywebsite/vendor/symfony/console/Application.php:273
Symfony\Component\Console\Application->doRun() at /home/pathtomywebsite/vendor/symfony/console/Application.php:149
Symfony\Component\Console\Application->run() at /home/pathtomywebsite/vendor/drush/drush/src/Runtime/Runtime.php:118
Drush\Runtime\Runtime->doRun() at /home/pathtomywebsite/vendor/drush/drush/src/Runtime/Runtime.php:48
Drush\Runtime\Runtime->run() at /home/pathtomywebsite/vendor/drush/drush/drush.php:72
require() at /home/pathtomywebsite/vendor/drush/drush/drush:4
I had the same problem after updating the Docker container to a newer PHP version.
I was able to track down the issue by comparing php.ini.
The difference was in variables_order setting.
Run the following command in the terminal to check the values:
php -i | grep variables_order
In my case the output was:
variables_order => GPCS => GPCS
After changing it to EGPCS, the script ran successfully.
To update the value, set variables_order to "EGPCS" in your php.ini file.
You can get the path of the config file by running in the terminal:
php --ini
It will list all included config files. Use the one at the top labeled Loaded Configuration File; the directive must be set there. In my case it was:
Loaded Configuration File: /usr/local/etc/php/php.ini
Find variables_order and make sure your configuration goes as:
variables_order = "EGPCS"
In my case the problem was with the PHP version on my hosting. To run Drush with PHP 7.4 I have to call it through the php74 CLI binary:
**php74** vendor/bin/drush updatedb
Drush does not work properly with a custom PHP binary name: the sub-commands it spawns (like the batch-process call in the log above) are not run with the php74 binary. Even after I modified this path in the Drush classes, I still got the same error. It's a known bug, reported several times in the Drush GitHub issues.
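A workaround sketch I have not verified on this kind of hosting: the failing sub-command in the log is launched as /home/pathtomywebsite/vendor/drush/drush/drush, whose shebang typically resolves plain php from the PATH, so you can shadow php with the 7.4 binary that drush status reports (the ~/bin directory below is just an example location):
# untested sketch: make "php" resolve to the PHP 7.4 binary for this invocation
mkdir -p ~/bin
ln -sf /usr/local/php7.4/bin/php ~/bin/php
PATH="$HOME/bin:$PATH" vendor/bin/drush updatedb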
I am trying to connect to a Spark cluster using sparklyr in yarn-client mode.
In local mode (master = "local") my Spark setup works, but when I try to connect to the cluster, I get the following error:
Error in force(code) :
Failed during initialize_connection: java.lang.NoClassDefFoundError: com/sun/jersey/api/client/config/ClientConfig
(see full error log below)
The setup is as follows. The Spark cluster (hosted on AWS) was set up with Ambari and runs YARN 3.1.1, Spark 2.3.2, HDFS 3.1.1, and some other services; it works with other platforms (i.e., non-R/Python applications set up with Ambari). Note that setting up the R machine via Ambari is not possible, as the R machine runs on Ubuntu while the Spark cluster runs on CentOS 7.
On my R machine I use the following code. Note that I have installed OpenJDK 8 and the matching Spark version.
Inside my YARN_CONF_DIR I have placed the yarn-site.xml file exported from Ambari (Services -> Download All Client Configs). I have also tried copying hdfs-site.xml and hive-site.xml there, with the same result.
library(sparklyr)
library(DBI)
# spark_install("2.3.2")
spark_installed_versions()
#> spark hadoop dir
#> 1 2.3.2 2.7 /home/david/spark/spark-2.3.2-bin-hadoop2.7
# use Java 8 instead of Java 11 (Java 11 is only supported from Spark 3.0.0+, not by Spark 2.3.2)
Sys.setenv(JAVA_HOME = "/usr/lib/jvm/java-8-openjdk-amd64/")
Sys.setenv(SPARK_HOME = "/home/david/spark/spark-2.3.2-bin-hadoop2.7/")
Sys.setenv(YARN_CONF_DIR = "/home/david/Spark-test/yarn-conf")
conf <- spark_config()
conf$spark.executor.memory <- "500M"
conf$spark.executor.cores <- 2
conf$spark.executor.instances <- 1
conf$spark.dynamicAllocation.enabled <- "false"
sc <- spark_connect(master = "yarn-client", config = conf)
#> Error in force(code) :
#> Failed during initialize_connection: java.lang.NoClassDefFoundError: com/sun/jersey/api/client/config/ClientConfig
#> ...
I am not really sure how to debug this, on which machine the error originates, or how to fix it, so any help or hint is greatly appreciated!
Edit / Progress
So far I have found out that the Spark distribution installed by sparklyr (from here) depends on the Glassfish Jersey packages, whereas my cluster expects the old Sun/Oracle Jersey classes (hence the com/sun/... path).
This applies to the following java packages:
library(tidyverse)
library(glue)
# list the jersey jars shipped with the sparklyr-installed Spark
ll <- list.files("~/spark/spark-2.3.2-bin-hadoop2.7/jars/", pattern = "^jersey", full.names = TRUE)
# for each jar, list its contents and keep the class path (the last column of `jar tvf`)
df <- map_dfr(ll, function(f) {
  x <- system(glue("jar tvf {f}"), intern = TRUE)
  tibble(file = f, class = str_extract(x, "[^ ]+$"))
})
# count the com/sun classes per jar
df %>%
  filter(str_detect(class, "com/sun")) %>%
  count(file)
#> # A tibble: 4 x 2
#> file n
#> <chr> <int>
#> 1 /home/david/spark/spark-2.3.2-bin-hadoop2.7/jars//activation-1.1.1.jar 15
#> 2 /home/david/spark/spark-2.3.2-bin-hadoop2.7/jars//derby.log 1194
#> 3 /home/david/spark/spark-2.3.2-bin-hadoop2.7/jars//jersey-client-1.19.jar 108
#> 4 /home/david/spark/spark-2.3.2-bin-hadoop2.7/jars//jersey-server-2.22.2.jar 22
I have tried downloading the latest jar files from Maven (e.g., from this) for jersey-client.jar and jersey-core.jar, and now the connection takes ages and never finishes (at least it is no longer the same error, yay I guess...). Any idea what the cause of this issue is?
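For what it's worth, since the full error log below fails inside the YARN TimelineClient, one experiment I have not yet verified against this cluster would be to switch the timeline service off on the client side through Spark's standard spark.hadoop.* passthrough, so that YarnClientImpl never needs the Jersey 1.x classes:
library(sparklyr)
conf <- spark_config()
# untested assumption: pass the stock Hadoop setting through to the YARN client
conf[["spark.hadoop.yarn.timeline-service.enabled"]] <- "false"
sc <- spark_connect(master = "yarn-client", config = conf)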
Full Error log
Error in force(code) :
Failed during initialize_connection: java.lang.NoClassDefFoundError: com/sun/jersey/api/client/config/ClientConfig
at org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:55)
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createTimelineClient(YarnClientImpl.java:181)
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:168)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:151)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2493)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:934)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:925)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:925)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sparklyr.Invoke.invoke(invoke.scala:147)
at sparklyr.StreamHandler.handleMethodCall(stream.scala:136)
at sparklyr.StreamHandler.read(stream.scala:61)
at sparklyr.BackendHandler$$anonfun$channelRead0$1.apply$mcV$sp(handler.scala:58)
at scala.util.control.Breaks.breakable(Breaks.scala:38)
at sparklyr.BackendHandler.channelRead0(handler.scala:38)
at sparklyr.BackendHandler.channelRead0(handler.scala:14)
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:310)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:284)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:138)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: com.sun.jersey.api.client.config.ClientConfig
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 49 more
Log: /tmp/RtmpIKnflg/filee462cec58ee_spark.log
---- Output Log ----
20/07/16 10:20:42 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/07/16 10:20:42 INFO sparklyr: Session (3779) is starting under 127.0.0.1 port 8880
20/07/16 10:20:42 INFO sparklyr: Session (3779) found port 8880 is not available
20/07/16 10:20:42 INFO sparklyr: Backend (3779) found port 8884 is available
20/07/16 10:20:42 INFO sparklyr: Backend (3779) is registering session in gateway
20/07/16 10:20:42 INFO sparklyr: Backend (3779) is waiting for registration in gateway
20/07/16 10:20:42 INFO sparklyr: Backend (3779) finished registration in gateway with status 0
20/07/16 10:20:42 INFO sparklyr: Backend (3779) is waiting for sparklyr client to connect to port 8884
20/07/16 10:20:43 INFO sparklyr: Backend (3779) accepted connection
20/07/16 10:20:43 INFO sparklyr: Backend (3779) is waiting for sparklyr client to connect to port 8884
20/07/16 10:20:43 INFO sparklyr: Backend (3779) received command 0
20/07/16 10:20:43 INFO sparklyr: Backend (3779) found requested session matches current session
20/07/16 10:20:43 INFO sparklyr: Backend (3779) is creating backend and allocating system resources
20/07/16 10:20:43 INFO sparklyr: Backend (3779) is using port 8885 for backend channel
20/07/16 10:20:43 INFO sparklyr: Backend (3779) created the backend
20/07/16 10:20:43 INFO sparklyr: Backend (3779) is waiting for r process to end
20/07/16 10:20:43 INFO SparkContext: Running Spark version 2.3.2
20/07/16 10:20:43 WARN SparkConf: spark.master yarn-client is deprecated in Spark 2.0+, please instead use "yarn" with specified deploy mode.
20/07/16 10:20:43 INFO SparkContext: Submitted application: sparklyr
20/07/16 10:20:43 INFO SecurityManager: Changing view acls to: ubuntu
20/07/16 10:20:43 INFO SecurityManager: Changing modify acls to: ubuntu
20/07/16 10:20:43 INFO SecurityManager: Changing view acls groups to:
20/07/16 10:20:43 INFO SecurityManager: Changing modify acls groups to:
20/07/16 10:20:43 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(ubuntu); groups with view permissions: Set(); users with modify permissions: Set(ubuntu); groups with modify permissions: Set()
20/07/16 10:20:43 INFO Utils: Successfully started service 'sparkDriver' on port 42419.
20/07/16 10:20:43 INFO SparkEnv: Registering MapOutputTracker
20/07/16 10:20:43 INFO SparkEnv: Registering BlockManagerMaster
20/07/16 10:20:43 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/07/16 10:20:43 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/07/16 10:20:43 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-583db378-821a-4990-bfd2-5fcaf95d071b
20/07/16 10:20:44 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
20/07/16 10:20:44 INFO SparkEnv: Registering OutputCommitCoordinator
20/07/16 10:20:44 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
20/07/16 10:20:44 INFO Utils: Successfully started service 'SparkUI' on port 4041.
20/07/16 10:20:44 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://{SPARK IP}
Then in the /tmp/RtmpIKnflg/filee462cec58ee_spark.log file
20/07/16 10:09:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/07/16 10:09:07 INFO sparklyr: Session (11296) is starting under 127.0.0.1 port 8880
20/07/16 10:09:07 INFO sparklyr: Session (11296) found port 8880 is not available
20/07/16 10:09:07 INFO sparklyr: Backend (11296) found port 8882 is available
20/07/16 10:09:07 INFO sparklyr: Backend (11296) is registering session in gateway
20/07/16 10:09:07 INFO sparklyr: Backend (11296) is waiting for registration in gateway
20/07/16 10:09:07 INFO sparklyr: Backend (11296) finished registration in gateway with status 0
20/07/16 10:09:07 INFO sparklyr: Backend (11296) is waiting for sparklyr client to connect to port 8882
20/07/16 10:09:07 INFO sparklyr: Backend (11296) accepted connection
20/07/16 10:09:07 INFO sparklyr: Backend (11296) is waiting for sparklyr client to connect to port 8882
20/07/16 10:09:07 INFO sparklyr: Backend (11296) received command 0
20/07/16 10:09:07 INFO sparklyr: Backend (11296) found requested session matches current session
20/07/16 10:09:07 INFO sparklyr: Backend (11296) is creating backend and allocating system resources
20/07/16 10:09:07 INFO sparklyr: Backend (11296) is using port 8883 for backend channel
20/07/16 10:09:07 INFO sparklyr: Backend (11296) created the backend
20/07/16 10:09:07 INFO sparklyr: Backend (11296) is waiting for r process to end
20/07/16 10:09:08 INFO SparkContext: Running Spark version 2.3.2
20/07/16 10:09:08 WARN SparkConf: spark.master yarn-client is deprecated in Spark 2.0+, please instead use "yarn" with specified deploy mode.
20/07/16 10:09:08 INFO SparkContext: Submitted application: sparklyr
20/07/16 10:09:08 INFO SecurityManager: Changing view acls to: david
20/07/16 10:09:08 INFO SecurityManager: Changing modify acls to: david
20/07/16 10:09:08 INFO SecurityManager: Changing view acls groups to:
20/07/16 10:09:08 INFO SecurityManager: Changing modify acls groups to:
20/07/16 10:09:08 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(david); groups with view permissions: Set(); users with modify permissions: Set(david); groups with modify permissions: Set()
20/07/16 10:09:08 INFO Utils: Successfully started service 'sparkDriver' on port 44541.
20/07/16 10:09:08 INFO SparkEnv: Registering MapOutputTracker
20/07/16 10:09:08 INFO SparkEnv: Registering BlockManagerMaster
20/07/16 10:09:08 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/07/16 10:09:08 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/07/16 10:09:08 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-d7b67ab2-508c-4488-ac1b-7ee0e787aa79
20/07/16 10:09:08 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
20/07/16 10:09:08 INFO SparkEnv: Registering OutputCommitCoordinator
20/07/16 10:09:08 INFO Utils: Successfully started service 'SparkUI' on port 4040.
20/07/16 10:09:08 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://{THE INTERNAL SPARK IP}:4040
20/07/16 10:09:08 INFO SparkContext: Added JAR file:/home/david/R/x86_64-pc-linux-gnu-library/4.0/sparklyr/java/sparklyr-2.3-2.11.jar at spark://{THE INTERNAL SPARK IP}:44541/jars/sparklyr-2.3-2.11.jar with timestamp 1594894148685
20/07/16 10:09:09 ERROR sparklyr: Backend (11296) failed calling getOrCreate on 11: java.lang.NoClassDefFoundError: com/sun/jersey/api/client/config/ClientConfig
at org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:55)
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createTimelineClient(YarnClientImpl.java:181)
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:168)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:151)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2493)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:934)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:925)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:925)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sparklyr.Invoke.invoke(invoke.scala:147)
at sparklyr.StreamHandler.handleMethodCall(stream.scala:136)
at sparklyr.StreamHandler.read(stream.scala:61)
at sparklyr.BackendHandler$$anonfun$channelRead0$1.apply$mcV$sp(handler.scala:58)
at scala.util.control.Breaks.breakable(Breaks.scala:38)
at sparklyr.BackendHandler.channelRead0(handler.scala:38)
at sparklyr.BackendHandler.channelRead0(handler.scala:14)
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:310)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:284)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:138)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: com.sun.jersey.api.client.config.ClientConfig
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 49 more
I have installed Airflow from the GitHub source. I have configured Airflow with a MySQL metadata database and the LocalExecutor. When I try to start my webserver, it does not seem to start properly.
install.sh
mkdir -p ~/airflow
export AIRFLOW_HOME=~/airflow
cd $AIRFLOW_HOME
virtualenv env
source env/bin/activate
mkdir -p /usr/local/src/
cd /usr/local/src/
git clone https://github.com/apache/incubator-airflow.git
cd incubator-airflow
git checkout tags/1.8.2
pip install -e .
pip install -e .[hive]
pip install -e .[gcp_api]
pip install -e .[mysql]
pip install -e .[password]
pip install -e .[celery]
airflow.cfg:
[core]
# The home folder for airflow, default is ~/airflow
airflow_home = /root/airflow
dags_folder = /root/airflow/dags
base_log_folder = /root/airflow/logs
encrypt_s3_logs = False
executor = LocalExecutor
sql_alchemy_conn = mysql://root:*****@localhost/airflow
When I try to start my webserver, the log keeps showing ttou signal handling and exiting workers.
airflow webserver -p 8080
[2017-11-20 04:05:30,642] {__init__.py:57} INFO - Using executor LocalExecutor
[2017-11-20 04:05:30,723] {driver.py:120} INFO - Generating grammar tables from /usr/lib/python2.7/lib2to3/Grammar.txt
[2017-11-20 04:05:30,756] {driver.py:120} INFO - Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
____________ _____________
____ |__( )_________ __/__ /________ __
____ /| |_ /__ ___/_ /_ __ /_ __ \_ | /| / /
___ ___ | / _ / _ __/ _ / / /_/ /_ |/ |/ /
_/_/ |_/_/ /_/ /_/ /_/ \____/____/|__/
/root/env/local/lib/python2.7/site-packages/flask/exthook.py:71: ExtDeprecationWarning: Importing flask.ext.cache is deprecated, use flask_cache instead.
.format(x=modname), ExtDeprecationWarning
[2017-11-20 04:05:31,437] [3079] {models.py:167} INFO - Filling up the DagBag from /root/airflow/dags
Running the Gunicorn Server with:
Workers: 8 sync
Host: 0.0.0.0:8080
Timeout: 120
Logfiles: - -
=================================================================
[2017-11-20 04:05:32,074] {__init__.py:57} INFO - Using executor LocalExecutor
[2017-11-20 04:05:32,153] {driver.py:120} INFO - Generating grammar tables from /usr/lib/python2.7/lib2to3/Grammar.txt
[2017-11-20 04:05:32,184] {driver.py:120} INFO - Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
[2017-11-20 04:05:32 +0000] [3087] [INFO] Starting gunicorn 19.3.0
[2017-11-20 04:05:32 +0000] [3087] [INFO] Listening at: http://0.0.0.0:8080 (3087)
[2017-11-20 04:05:32 +0000] [3087] [INFO] Using worker: sync
[2017-11-20 04:05:32 +0000] [3098] [INFO] Booting worker with pid: 3098
[2017-11-20 04:05:32 +0000] [3099] [INFO] Booting worker with pid: 3099
/root/env/local/lib/python2.7/site-packages/flask/exthook.py:71: ExtDeprecationWarning: Importing flask.ext.cache is deprecated, use flask_cache instead.
.format(x=modname), ExtDeprecationWarning
/root/env/local/lib/python2.7/site-packages/flask/exthook.py:71: ExtDeprecationWarning: Importing flask.ext.cache is deprecated, use flask_cache instead.
.format(x=modname), ExtDeprecationWarning
[2017-11-20 04:05:32 +0000] [3100] [INFO] Booting worker with pid: 3100
[2017-11-20 04:05:32 +0000] [3101] [INFO] Booting worker with pid: 3101
/root/env/local/lib/python2.7/site-packages/flask/exthook.py:71: ExtDeprecationWarning: Importing flask.ext.cache is deprecated, use flask_cache instead.
.format(x=modname), ExtDeprecationWarning
[2017-11-20 04:05:32 +0000] [3102] [INFO] Booting worker with pid: 3102
[2017-11-20 04:05:32 +0000] [3103] [INFO] Booting worker with pid: 3103
[2017-11-20 04:05:32 +0000] [3104] [INFO] Booting worker with pid: 3104
/root/env/local/lib/python2.7/site-packages/flask/exthook.py:71: ExtDeprecationWarning: Importing flask.ext.cache is deprecated, use flask_cache instead.
.format(x=modname), ExtDeprecationWarning
[2017-11-20 04:05:32 +0000] [3105] [INFO] Booting worker with pid: 3105
/root/env/local/lib/python2.7/site-packages/flask/exthook.py:71: ExtDeprecationWarning: Importing flask.ext.cache is deprecated, use flask_cache instead.
.format(x=modname), ExtDeprecationWarning
/root/env/local/lib/python2.7/site-packages/flask/exthook.py:71: ExtDeprecationWarning: Importing flask.ext.cache is deprecated, use flask_cache instead.
.format(x=modname), ExtDeprecationWarning
/root/env/local/lib/python2.7/site-packages/flask/exthook.py:71: ExtDeprecationWarning: Importing flask.ext.cache is deprecated, use flask_cache instead.
.format(x=modname), ExtDeprecationWarning
/root/env/local/lib/python2.7/site-packages/flask/exthook.py:71: ExtDeprecationWarning: Importing flask.ext.cache is deprecated, use flask_cache instead.
.format(x=modname), ExtDeprecationWarning
[2017-11-20 04:05:33,198] [3099] {models.py:167} INFO - Filling up the DagBag from /root/airflow/dags
[2017-11-20 04:05:33,312] [3098] {models.py:167} INFO - Filling up the DagBag from /root/airflow/dags
[2017-11-20 04:05:33,538] [3100] {models.py:167} INFO - Filling up the DagBag from /root/airflow/dags
[2017-11-20 04:05:33,863] [3101] {models.py:167} INFO - Filling up the DagBag from /root/airflow/dags
[2017-11-20 04:05:33,963] [3102] {models.py:167} INFO - Filling up the DagBag from /root/airflow/dags
[2017-11-20 04:05:33,987] [3104] {models.py:167} INFO - Filling up the DagBag from /root/airflow/dags
[2017-11-20 04:05:34,062] [3105] {models.py:167} INFO - Filling up the DagBag from /root/airflow/dags
[2017-11-20 04:05:34,162] [3103] {models.py:167} INFO - Filling up the DagBag from /root/airflow/dags
[2017-11-20 04:06:05 +0000] [3087] [INFO] Handling signal: ttin
[2017-11-20 04:06:05 +0000] [3121] [INFO] Booting worker with pid: 3121
/root/env/local/lib/python2.7/site-packages/flask/exthook.py:71: ExtDeprecationWarning: Importing flask.ext.cache is deprecated, use flask_cache instead.
.format(x=modname), ExtDeprecationWarning
[2017-11-20 04:06:05,426] [3121] {models.py:167} INFO - Filling up the DagBag from /root/airflow/dags
[2017-11-20 04:06:06 +0000] [3087] [INFO] Handling signal: ttou
[2017-11-20 04:06:06 +0000] [3098] [INFO] Worker exiting (pid: 3098)
[2017-11-20 04:06:36 +0000] [3087] [INFO] Handling signal: ttin
[2017-11-20 04:06:36 +0000] [3136] [INFO] Booting worker with pid: 3136
/root/env/local/lib/python2.7/site-packages/flask/exthook.py:71: ExtDeprecationWarning: Importing flask.ext.cache is deprecated, use flask_cache instead.
.format(x=modname), ExtDeprecationWarning
[2017-11-20 04:06:36,818] [3136] {models.py:167} INFO - Filling up the DagBag from /root/airflow/dags
[2017-11-20 04:06:37 +0000] [3087] [INFO] Handling signal: ttou
[2017-11-20 04:06:37 +0000] [3099] [INFO] Worker exiting (pid: 3099)
[2017-11-20 04:07:07 +0000] [3087] [INFO] Handling signal: ttin
[2017-11-20 04:07:07 +0000] [3144] [INFO] Booting worker with pid: 3144
Your webserver is running fine. Workers are regularly "refreshed" at an interval set by worker_refresh_interval so that they pick up new or updated DAGs. When this happens, you'll see the signal ttin (increase processes by one) always followed by ttou (decrease processes by one): a new worker is added before the oldest worker is removed.
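For reference, the refresh cadence lives in airflow.cfg under [webserver]; a minimal sketch (option names as in Airflow 1.8.x, values shown are only illustrative defaults, so check your own config) that matches the roughly 30-second cadence visible in your log:
[webserver]
# seconds between refresh cycles of the gunicorn workers
worker_refresh_interval = 30
# number of workers replaced per refresh cycle
worker_refresh_batch_size = 1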
I am using Nginx for HTTP and TCP load balancing. HTTP is working fine, but I am getting the error below for TCP load balancing. I am using the Nginx ngx_stream_core_module module.
nginx version: nginx/1.9.13
Nginx error logs:
2016/07/12 07:46:16 [error] 16737#16737: *32006 recv() failed (104: Connection reset by peer) while proxying connection, client: 27.50.X.X, server: 0.0.0.0:3030, upstream: "10.7.0.12:3030", bytes from/to client:354/318, bytes from/to upstream:318/354
2016/07/12 07:48:53 [error] 16737#16737: *32048 recv() failed (104: Connection reset by peer) while proxying connection, client: 27.50.X.X, server: 0.0.0.0:3030, upstream: "10.7.0.12:3030", bytes from/to client:324/292, bytes from/to upstream:292/324
2016/07/12 07:51:40 [error] 16737#16737: *32109 recv() failed (104: Connection reset by peer) while proxying connection, client: 27.50.X.X, server: 0.0.0.0:3030, upstream: "10.7.0.12:3030", bytes from/to client:324/292, bytes from/to upstream:292/324
Can anyone help me understand why I am getting these errors?
Is there any way to enable access logs for TCP requests coming to Nginx?
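As far as I know, plain TCP access logging is not available in nginx 1.9.13; the stream log module (ngx_stream_log_module) only appeared around nginx 1.11.4. On a version that has it, a rough sketch of a stream access log (the format string and file path are just examples; check the nginx docs for your version) would look like:
stream {
    log_format tcp_basic '$remote_addr [$time_local] $status $bytes_sent $bytes_received $session_time';
    server {
        listen 3030;
        proxy_pass 10.7.0.12:3030;
        access_log /var/log/nginx/tcp-access.log tcp_basic;
    }
}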