In jupyterhub_config.py, we have set c.YarnSpawner.localize_files to a single source path:
c.YarnSpawner.args = ['--NotebookApp.contents_manager_class="hdfscm.HDFSContentsManager"']
c.YarnSpawner.localize_files = {
    'environment': {
        'source': 'hdfs:///apps/jupyterhub/jupyter.tar.gz',
        'visibility': 'public'
    }
}
c.YarnSpawner.prologue = 'source environment/bin/activate'
We have configured Jupyter with pyspark2. Is there a way to configure multiple environments, one for each kernel?
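For reference, localize_files accepts a dict keyed by destination name, so a second archive can be localized alongside the first; below is a rough sketch of what we imagine (the 'environment-pyspark2' entry and its HDFS path are hypothetical), though how to tie each archive to its own kernel is exactly what we are unsure about:
# jupyterhub_config.py -- sketch only; the second entry and its HDFS path are
# hypothetical. Each archive is localized under its key name in the container's
# working directory, which is why the prologue can source environment/bin/activate.
c.YarnSpawner.localize_files = {
    'environment': {
        'source': 'hdfs:///apps/jupyterhub/jupyter.tar.gz',
        'visibility': 'public'
    },
    'environment-pyspark2': {
        'source': 'hdfs:///apps/jupyterhub/jupyter-pyspark2.tar.gz',
        'visibility': 'public'
    }
}
# The prologue still activates only one environment for the notebook server itself:
c.YarnSpawner.prologue = 'source environment/bin/activate'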
Let's say that I have a local R package that lives at /home/placey/messyverse.tar.gz
I'd like to start up a nix shell that contains my package as well as ggplot. How do I do that?
First we need to create a Nix package definition that contains the necessary information for your local package.
Let's call it
messyverse.nix
with import <nixpkgs> {};
{
  messyverse = rPackages.buildRPackage rec {
    name = "messyverse";
    version = "0.1";
    src = /home/placey/messyverse.tar.gz;
    buildInputs = with rPackages; [
      R
      stringr
    ];
  };
}
Then, in the same folder, we will create the default.nix that defines what is needed for the nix shell.
default.nix
with import <nixpkgs> {};
with import ./messyverse.nix;
{
  myProject = stdenv.mkDerivation {
    name = "myProject";
    version = "1";
    src = if pkgs.lib.inNixShell then null else nix;
    buildInputs = with rPackages; with messyverse; [
      R
      ggplot2
      messyverse
    ];
  };
}
Now we can execute
nix-shell .
and we have a shell which contains R and our locally specified R package!
SBT is silently failing when it can't download a plugin via SSH from a Git repository.
This is the output of SBT when it's trying to download the repository:
[info] Updating ProjectRef(uri("ssh://git@repository.com/plugin.git"), "plugin")...
# (nothing after that line)
And it just terminates after that with no explanation. This is very likely a bug with SBT's downloading of plugins via SSH from a Git repository.
When downloading the plugin succeeds, this line is printed:
[info] Done updating.
So for some reason, SBT isn't stating what's wrong, even when executed like this:
sbt -Xdebug test
Here are the relevant configuration files:
# project/build.properties
sbt.version=1.1.5
# project/plugins.sbt
lazy val buildPlugin = RootProject(uri("ssh://git@repository.com/plugin.git"))
lazy val root = (project in file(".")).dependsOn(buildPlugin)
Questions:
1. How can I get SBT to print more debugging information?
2. Where in the SBT code could I fix this bug?
3. How can I build and use my own version of SBT?
How can I get SBT to print more debugging information?
Using the latest launcher script available from https://www.scala-sbt.org/download.html (1.2.1 as of August 2018), you can run:
$ sbt -debug
Where in the SBT code could I fix this bug?
See my answer at https://github.com/sbt/sbt/issues/1120#issuecomment-415553592:
Here is some of the relevant code:
Load.builtinLoader - https://github.com/sbt/sbt/blob/v1.2.1/main/src/main/scala/sbt/internal/Load.scala#L480-L488
RetrieveUnit - https://github.com/sbt/sbt/blob/v1.2.1/main/src/main/scala/sbt/internal/RetrieveUnit.scala
Resolvers.git - https://github.com/sbt/sbt/blob/v1.2.1/main/src/main/scala/sbt/Resolvers.scala#L82-L101
Resolvers.creates - https://github.com/sbt/sbt/blob/v1.2.1/main/src/main/scala/sbt/Resolvers.scala#L145-L155
val git: Resolver = (info: ResolveInfo) => {
  val uri = info.uri.withoutMarkerScheme
  val localCopy = uniqueSubdirectoryFor(uri.copy(scheme = "git"), in = info.staging)
  val from = uri.withoutFragment.toASCIIString

  if (uri.hasFragment) {
    val branch = uri.getFragment
    Some { () =>
      creates(localCopy) {
        run("git", "clone", from, localCopy.getAbsolutePath)
        run(Some(localCopy), "git", "checkout", "-q", branch)
      }
    }
  } else
    Some { () =>
      creates(localCopy) {
        run("git", "clone", "--depth", "1", from, localCopy.getAbsolutePath)
      }
    }
}

// ...

def creates(file: File)(f: => Unit) = {
  if (!file.exists)
    try {
      f
    } catch {
      case NonFatal(e) =>
        IO.delete(file)
        throw e
    }
  file
}
How can I build and use my own version of SBT?
https://github.com/sbt/sbt/blob/1.x/CONTRIBUTING.md#build-from-source
For this, you just need to clone sbt/sbt and run publishLocal.
I have built a dynamic library (to add ICU support in this case) which I need to add as a dependency to a pod. For that I created a pod with the following podspec (I removed things like authors, license, ... to keep it short):
Pod::Spec.new do |s|
  s.name = 'unicode'
  s.version = '57.0'
  s.source = { :git => "git@bitbucket.org:mycompany/unicode.git", :tag => "#{s.version}" }
  s.requires_arc = false
  s.platform = :ios, '8.0'
  s.default_subspecs = 'all'

  s.subspec 'all' do |ss|
    ss.header_mappings_dir = 'icu4c/include'
    ss.source_files = 'icu4c/include/**/*.h'
    ss.public_header_files = 'icu4c/include/**/*.h'
    ss.vendored_libraries = 'Frameworks/lib*.dylib'
  end
end
Here I have a second pod where I need to link against these libraries too:
Pod::Spec.new do |s|
  s.name = 'sqlite3'
  s.version = '3.14.2'
  s.summary = 'SQLite is an embedded SQL database engine'
  s.documentation_url = 'https://sqlite.org/docs.html'
  s.homepage = 'https://github.com/clemensg/sqlite3pod'
  s.authors = { 'Clemens Gruber' => 'clemensgru@gmail.com' }

  v = s.version.to_s.split('.')
  archive_name = "sqlite-amalgamation-"+v[0]+v[1].rjust(2, '0')+v[2].rjust(2, '0')+"00"
  #s.source = { :http => "https://www.sqlite.org/#{Time.now.year}/#{archive_name}.zip" }
  s.source = { :git => "git@bitbucket.org:wrthphoenixspeedy/sqlite3.git", :tag => "#{s.version}" }
  s.requires_arc = false
  s.platform = :ios, '8.0'
  s.default_subspecs = 'common'

  s.subspec 'common' do |ss|
    ss.source_files = "#{archive_name}/sqlite*.{h,c}"
    ss.osx.pod_target_xcconfig = { 'OTHER_CFLAGS' => '$(inherited) -DHAVE_USLEEP=1' }
    # Disable OS X / AFP locking code on mobile platforms (iOS, tvOS, watchOS)
    sqlite_xcconfig_ios = { 'OTHER_CFLAGS' => '$(inherited) -DHAVE_USLEEP=1 -DSQLITE_ENABLE_LOCKING_STYLE=0' }
    ss.ios.pod_target_xcconfig = sqlite_xcconfig_ios
    ss.tvos.pod_target_xcconfig = sqlite_xcconfig_ios
    ss.watchos.pod_target_xcconfig = sqlite_xcconfig_ios
  end

  # enable support for icu - International Components for Unicode
  s.subspec 'icu' do |ss|
    ss.dependency 'sqlite3/common'
    ss.pod_target_xcconfig = { 'OTHER_CFLAGS' => '$(inherited) -DSQLITE_ENABLE_ICU=1' }
    ss.dependency 'unicode', '57.0'
    ss.libraries = 'icucore', 'icudata.57.1', 'icui18n.57.1', 'icuio.57.1', 'icule.57.1', 'iculx.57.1', 'icutu.57.1', 'icuuc.57.1'
  end
end
And with these I am able to compile it. CocoaPods copies the libraries into the ../Frameworks/ folder at build time, but at run time the app fails because it looks for the library in ../lib:
dyld: Library not loaded: ../lib/libicudata.57.1.dylib
Referenced from: /var/containers/Bundle/Application/9663CB3A-6ACD-487E-A92D-48F8AFE5260C/MyApp.app/MyApp
Reason: image not found
I have to use use_frameworks! because I am using some Swift frameworks too.
So I am doing something wrong... the question is: can I link a dylib from one pod to another pod? And if so, how?
Based on the disparity between "../lib" and "Frameworks", this looks like an issue with either the runpath search paths (the running app is not looking for the library in Frameworks), or with the install name of the library not matching the location where it is placed relative to where it is dynamically loaded from.
Make sure that in the app that bundles the dynamic library you have the following paths included in your "Runpath Search Paths": @executable_path/../Frameworks, @loader_path/../Frameworks
Make sure that the "Dynamic Library Install Name" of the library being loaded is set to the equivalent of @rpath/$(EXECUTABLE_PATH) (i.e. in your case it should be "@rpath/libicudata.57.1.dylib"). You can set it at build time using the -install_name linker flag, or afterwards with install_name_tool, like so: install_name_tool -id "@rpath/libicudata.57.1.dylib" libicudata.57.1.dylib. Hopefully it doesn't come to that, though.
I am trying to use the autogrow plugin for CKEditor. As I am new to Django and CKEditor, I am having trouble with the configuration: my settings are not being recognized. Below is a list of the steps I took. I have seen references to having to rebuild CKEditor, but I do not know whether this is needed.
Platform: Ubuntu, django-cms 3 beta, djangocms-text-ckeditor (installed using pip in a virtualenv), Python 2.7.
I do not know exactly what I need to do, but I changed the following anyway.
S1. In the project's settings.py, I added:
CKEDITOR_SETTINGS = getattr(settings, 'CKEDITOR_SETTINGS', {
    'config.autoGrow_onStartup': True,
    'config.autoGrow_minHeight': 200,
    'config.autoGrow_maxHeight': 400,
})
S2. In ../site-packages/django_text_ckeditor/static/ckeditor/config.js, I edited:
CKEDITOR.editorConfig = function( config ) {
    // Define changes to default configuration here. For example:
    // config.language = 'fr';
    // config.uiColor = '#AADC6E';
    config.autoGrow_onStartup = true;
    config.autoGrow_minHeight = 2000;
    config.autoGrow_maxHeight = 4000;
};
S3. Added the autogrow plugin folder to
"../site-packages/django_text_ckeditor/static/ckeditor/plugins/autogrow"
S4. Modified line 45 of
"../site-packages/django_text_ckeditor/static/js/cms.ckeditor.js"
'extraPlugins': 'cmsplugins, autogrow'
S5. Added an extra statement after line 58 of
"../site-packages/django_text_ckeditor/static/js/cms.ckeditor.js"
// this is line 58
CKEDITOR.plugins.addExternal('cmsplugins', settings.static_url + 'ckeditor_plugins/cmsplugins/');
// this is the added line
CKEDITOR.plugins.addExternal('autogrow', settings.static_url + 'ckeditor/plugins/autogrow');
I do not know what else to do. Thoughts? Advice?
I'm using the standard django-ckeditor from here: https://github.com/django-ckeditor/django-ckeditor
Your steps S2, S4, and S5 are not needed. It makes no sense to modify the CKEditor source. Just download the autogrow plugin from http://ckeditor.com/addon/autogrow and configure it in settings.py:
CKEDITOR_CONFIGS = {
    'default': {
        'autoGrow_onStartup': True,
        'autoGrow_minHeight': 100,
        'autoGrow_maxHeight': 650,
        'extraPlugins': 'autogrow',
        'toolbar': 'Custom',
        'toolbar_Custom': [
            ['Bold', 'Italic', 'Underline'],
            ['Format'],
            #['NumberedList', 'BulletedList', '-', 'Outdent', 'Indent', '-', 'JustifyLeft', 'JustifyCenter', 'JustifyRight', 'JustifyBlock'],
            ['Link', 'Unlink'],
            ['RemoveFormat', 'Source']
        ],
    }
}
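If you are on djangocms-text-ckeditor (as in the question) rather than django-ckeditor, the setting to override appears to be CKEDITOR_SETTINGS instead; assuming its keys map directly onto CKEditor options (so no 'config.' prefix), a minimal sketch would be:
# settings.py -- sketch for djangocms-text-ckeditor; whether 'extraPlugins' is
# merged with or replaces the built-in 'cmsplugins' entry depends on the
# installed version, so treat this as an assumption to verify
CKEDITOR_SETTINGS = {
    'autoGrow_onStartup': True,
    'autoGrow_minHeight': 200,
    'autoGrow_maxHeight': 400,
    'extraPlugins': 'autogrow',
}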
I would like to run a web service and wait for a few seconds afterwards to get the result.
What is the best way to achieve a wait in Puppet?
You could use the Linux sleep command with an exec resource and stage it to run after the web service. Something like:
exec { 'wait_for_my_web_service':
  require => Service["my_web_service"],
  command => "sleep 10 && /run/my/command/to/get/results/from/the/web/service",
  path    => "/usr/bin:/bin",
}
My take on a local-only wait with a configurable retry count:
define wait_for_port ( $protocol = 'tcp', $retry = 10 ) {
  $port = $title
  exec { "wait-for-port${port}":
    command  => "until fuser ${port}/${protocol}; do i=\$[i+1]; [ \$i -gt ${retry} ] && break || sleep 1; done",
    provider => 'shell',
  }
}

wait_for_port { '3000': }
I have a class that executes a DSC resource but needs to wait for 20 seconds before executing it.
Hence, I used an exec resource, relying on PowerShell, just before the DSC resource:
$command = 'Start-Sleep -Seconds 20'

exec { 'wait_time':
  command   => $command,
  provider  => powershell,
  timeout   => 0,
  tries     => 3,
  try_sleep => 10,
}
Of course you can either omit or change the tries and the try_sleep parameters.