Import private macro from GitHub to Warp 10 - warpscript

Do you know how to import a macro from GitHub to Warp 10 using WarpScript?
[ 'https://github.com/miton18/test-ws/blob/master' ] WF.SETREPOS
WF.GETREPOS
'miton18/test-ws' 'me' IMPORT
[ NEWGTS ] #me/renameWithLabel
Content of renameWithLabel.mc2 in the Git repository:
<% DROP DUP LABELS $label GET RENAME %> LMAP

As explained here: https://blog.senx.io/share-your-warpscript-macros/
'https://raw.githubusercontent.com/miton18/test-ws/master' WF.ADDREPO
WF.GETREPOS // check...
[] #utils/renameWithLabel
Your macro must be in a subfolder ;)
But for private repos, try credentials embedded in the URL (https://user:pass@xxxx) or modify the Warp 10 configuration.
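For example (a sketch; whether credentials embedded in the URL are accepted depends on your Git host, and owner/private-repo is a placeholder):
'https://user:token@raw.githubusercontent.com/owner/private-repo/master' WF.ADDREPO
WF.GETREPOS // the private repo should now be listed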

Indeed, you always need a subdirectory (as with every macro repository, including the macros directory of your Warp 10 egress).
IMPORT is useful when the full path of your macro contains lots of subdirectories.
'https://raw.githubusercontent.com/pi-r-p/warpscript/master' WF.ADDREPO
'viz' 'a' IMPORT
[] #a/juxtaposeGTS //now I can call a instead of viz.

You can use https://github.com/senx/WarpFleetSynchronizer ;)
It lets you deploy a private WarpFleet resolver with Git repository synchronization.

Related

How can I set up default startup commands in IPython notebooks?

I want a couple of cells with the commands I need in almost every notebook to be added to every new notebook I create.
For example, when I create a new notebook it should put
%matplotlib inline
import matplotlib.pyplot as plt
in a cell by default but not execute it.
How could I set something like that up?
This will work for both terminal based IPython shell and Browser based Notebook:
Navigate to ~/.ipython/profile_default
Create a folder called startup if it’s not already there
Add a new Python file called start.py
Put your favorite imports (and maybe custom functions) in this file
Launch IPython or a Jupyter Notebook and your favorite libraries will be automatically loaded every time!
Here is a minimal sample for start.py (adjust the imports to what you actually use):
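# ~/.ipython/profile_default/startup/start.py
# These imports are only examples; preload whatever you use most often.
import os
import sys

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt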
Another Source
To define a set of commands to run on default startup, you need to add the commands to the template ipy_user_conf.py file in your ~/.ipython directory.
This module is imported during IPython startup, so you can easily import modules, configure extensions, change options, define magic commands, put variables and functions in the IPython namespace, etc.
Here is the sample ipy_user_conf.py (note that this mechanism belongs to older, legacy IPython versions):
# Most of your config files and extensions will probably start
# with this import
import IPython.ipapi
ip = IPython.ipapi.get()

# You probably want to uncomment this if you did %upgrade -nolegacy
# import ipy_defaults

import os

def main():
    # ip.dbg.debugmode = True
    ip.dbg.debug_stack()

    # uncomment if you want to get ipython -p sh behaviour
    # without having to use command line switches
    import ipy_profile_sh
    import jobctrl

    # Configure your favourite editor?
    # Good idea e.g. for %edit os.path.isfile
    # import ipy_editors

    # Choose one of these:
    # ipy_editors.scite()
    # ipy_editors.scite('c:/opt/scite/scite.exe')
    # ipy_editors.komodo()
    # ipy_editors.idle()
    # ... or many others, try 'ipy_editors??' after import to see them

    # Or roll your own:
    # ipy_editors.install_editor("c:/opt/jed +$line $file")

    o = ip.options
    # An example on how to set options
    # o.autocall = 1
    o.system_verbose = 0

    # import_all("os sys")
    # execf('~/_ipython/ns.py')

    # -- prompt
    # A different, more compact set of prompts from the default ones, that
    # always show your current location in the filesystem:
    # o.prompt_in1 = r'\C_LightBlue[\C_LightCyan\Y2\C_LightBlue]\C_Normal\n\C_Green|\#>'
    # o.prompt_in2 = r'.\D: '
    # o.prompt_out = r'[\#] '

    # Try one of these color settings if you can't read the text easily
    # autoexec is a list of IPython commands to execute on startup
    # o.autoexec.append('%colors LightBG')
    # o.autoexec.append('%colors NoColor')
    o.autoexec.append('%colors Linux')

# some config helper functions you can use
def import_all(modules):
    """ Usage: import_all("os sys") """
    for m in modules.split():
        ip.ex("from %s import *" % m)

def execf(fname):
    """ Execute a file in user namespace """
    ip.ex('execfile("%s")' % os.path.expanduser(fname))

main()
For more details, please refer to the link: Customization of IPython.
I hope this is what you wanted to know.
JupyterLab
In a comment to one of the other answers, the OP pointed out the need to insert the actual code instead of having it load in the background. One way is to create a text keyboard shortcut by going to Settings -> Advanced settings editor -> JSON settings Editor and adding the following under User Preferences:
{
  "shortcuts": [
    {
      "command": "apputils:run-first-enabled",
      "selector": "body",
      "keys": ["Alt I"],
      "args": {
        "commands": [
          "console:replace-selection",
          "fileeditor:replace-selection",
          "notebook:replace-selection"
        ],
        "args": {"text": "import pandas as pd\nimport altair as alt\n\n"}
      }
    }
  ]
}
This will insert the following snippet each time you press Alt + i in the notebook:
import pandas as pd
import altair as alt
# <-- Cursor placed here
More on text shortcuts is covered in the JupyterLab documentation.
IPython console
If you are interested in automatically importing commonly used libraries in the IPython console only, so that they are there for interactive use but not in the notebook (to avoid issues with sharing notebooks lacking some imports), you can launch IPython like so (and set up an alias to avoid typing this each time):
ipython -c "import pandas as pd; import numpy as np" -i
(This was what I was looking for when I originally found this question)
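For example, a shell alias along these lines (the alias name is arbitrary):
alias ipy='ipython -c "import pandas as pd; import numpy as np" -i'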

How to share version values between project/plugins.sbt and project/Build.scala?

I would like to share a common version variable between an sbtPlugin and the rest of the build.
Here is what I am trying:
in project/Build.scala:
object Versions {
  val scalaJs = "0.5.0-M3"
}

object MyBuild extends Build {
  // use the version number
}
in plugins.sbt:
addSbtPlugin("org.scala-lang.modules.scalajs" % "scalajs-sbt-plugin" % Versions.scalaJs)
results in
plugins.sbt:15: error: not found: value Versions
addSbtPlugin("org.scala-lang.modules.scalajs" % "scalajs-sbt-plugin" % Versions.scalaJs)
Is there a way to share the version number specification between plugins.sbt and the rest of the build, e.g. project/Build.scala?
sbt-buildinfo
If you need to share a version number between build.sbt and hello.scala, what would you normally do? I don't know about you, but I would use sbt-buildinfo, which I wrote.
This can be configured using the buildInfoKeys setting to expose arbitrary key values, like version or some custom String value. I understand this is not exactly what you're asking, but bear with me.
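As a minimal sketch (using the same sbt 0.13-era syntax as the fuller example below; the custom key name and package are arbitrary):
buildInfoSettings

sourceGenerators in Compile <+= buildInfo

buildInfoKeys := Seq[BuildInfoKey](name, version, "someCustomValue" -> "hello")

buildInfoPackage := "mybuild"
This generates a BuildInfo object in the mybuild package that exposes those keys to your main sources.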
meta-build (turtles all the way down)
As Jacek noted, and as stated in the Getting Started Guide, the build in sbt is itself a project, defined in the build located in the project directory one level down. To distinguish the builds, let's call the normal build the proper build, and the build that defines the proper build the meta-build. For example, we can say that an sbt plugin is a library of the root project in the meta-build.
Now let's get back to your question. How can we share info between project/Build.scala and project/plugins.sbt?
using sbt-buildinfo for meta-build
We can just define another level of build by creating project/project and adding sbt-buildinfo to the (meta-)meta-build.
Here are the files.
In project/project/buildinfo.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-buildinfo" % "0.3.2")
In project/project/Dependencies.scala:
package metabuild
object Dependencies {
  def scalaJsVersion = "0.5.0-M2"
}
In project/build.properties:
sbt.version=0.13.5
In project/buildinfo.sbt:
import metabuild.Dependencies._
buildInfoSettings
sourceGenerators in Compile <+= buildInfo
buildInfoKeys := Seq[BuildInfoKey]("scalaJsVersion" -> scalaJsVersion)
buildInfoPackage := "metabuild"
In project/scalajs.sbt:
import metabuild.Dependencies._
addSbtPlugin("org.scala-lang.modules.scalajs" % "scalajs-sbt-plugin" % scalaJsVersion)
In project/Build.scala:
import sbt._
import Keys._
import metabuild.BuildInfo._
object Builds extends Build {
  println(s"test: $scalaJsVersion")
}
So there's a bit of boilerplate in project/buildinfo.sbt, but the version info is shared across the build definition and the plugin declaration.
If you're curious where BuildInfo is defined, peek into project/target/scala-2.10/sbt-0.13/src_managed/.
For the project/plugins.sbt file you'd have to have another project under project with the Versions.scala file. That would make the definition of Versions.scalaJs visible.
The reason for doing it is that *.sbt files belong to a project build definition at the current level, with *.scala files under project expanding on it. And it's... turtles all the way down; sbt is recursive.
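Concretely, the layout from the sbt-buildinfo answer above stacks up like this:
build.sbt                          (proper build)
project/build.properties
project/buildinfo.sbt              (meta-build)
project/scalajs.sbt
project/Build.scala
project/project/buildinfo.sbt      (meta-meta-build)
project/project/Dependencies.scala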
I'm not sure how much the following can help, but it might be worth trying out: to share versions between the projects (the plugins and the main one) you'd have to use ProjectRef, as described in the answer to RootProject and ProjectRef:
When you want to include other, separate builds directly instead of
using their published binaries, you use "source dependencies". This is
what RootProject and ProjectRef declare. ProjectRef is the most
general: you specify the location of the build (a URI) and the ID of
the project in the build (a String) that you want to depend on.
RootProject is a convenience that selects the root project for the
build at the URI you specify.
My proposal is a hack. For example, in build.sbt you can add a task:
val readPluginSbt = taskKey[String]("Read plugins.sbt file.")

readPluginSbt := {
  val lineIterator = scala.io.Source.fromFile(new java.io.File("project", "plugins.sbt")).getLines
  val linesWithValIterator = lineIterator.filter(line => line.contains("scalaxbVersion"))
  val versionString = linesWithValIterator.mkString("\n").split("=")(1).trim
  val version = versionString.split("\n")(0) // only the val declaration
  println(version)
  version
}
When you call readPluginSbt, the version extracted from plugins.sbt is printed. This way you can parse the file and pull out the variable.
For example:
resolvers += Resolver.sonatypeRepo("public")
val scalaxbVersion = "1.1.2"
addSbtPlugin("org.scalaxb" % "sbt-scalaxb" % scalaxbVersion)
addSbtPlugin("org.xerial.sbt" % "sbt-pack" % "0.5.1")
You can extract scalaxbVersion with regular expressions/split:
scala> val line = """val scalaxbVersion = "1.1.2""""
line: String = val scalaxbVersion = "1.1.2"
scala> line.split("=")(1).trim
res1: String = "1.1.2"
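A regular expression is a bit more robust than split here (a sketch; it assumes the val declaration fits on one line):
scala> val Pattern = """val\s+scalaxbVersion\s*=\s*"([^"]+)"""".r.unanchored
Pattern: scala.util.matching.UnanchoredRegex = val\s+scalaxbVersion\s*=\s*"([^"]+)"
scala> val Pattern(version) = """val scalaxbVersion = "1.1.2""""
version: String = 1.1.2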

Can I use sbt's `apiMappings` setting for managed dependencies?

I'd like the ScalaDoc I generate with sbt to link to external libraries, and in sbt 0.13 we have autoAPIMappings which is supposed to add these links for libraries that declare their apiURL. In practice though, none of the libraries I use provide this in their pom/ivy metadata, and I suspect some of these libraries will never do so.
The apiMappings setting is supposed to help with just that, but it is typed as Map[File, URL] and hence geared towards setting doc urls for unmanaged dependencies. Managed dependencies are declared as instances of sbt.ModuleID and cannot be inserted directly in that map.
Can I somehow populate the apiMappings setting with something that will associate a URL with a managed dependency?
A related question is: does sbt provide an idiomatic way of getting a File from a ModuleID? I guess I could try to evaluate some classpaths and get back Files to try and map them to ModuleIDs but I hope there is something simpler.
Note: this is related to https://stackoverflow.com/questions/18747265/sbt-scaladoc-configuration-for-the-standard-library/18747266, but that question differs by linking to the scaladoc for the standard library, for which there is a well-known File, scalaInstance.value.libraryJar; that is not the case here.
I managed to get this working for referencing scalaz and play by doing the following:
apiMappings ++= {
  val cp: Seq[Attributed[File]] = (fullClasspath in Compile).value
  def findManagedDependency(organization: String, name: String): File = {
    ( for {
        entry <- cp
        module <- entry.get(moduleID.key)
        if module.organization == organization
        if module.name.startsWith(name)
        jarFile = entry.data
      } yield jarFile
    ).head
  }
  Map(
      findManagedDependency("org.scalaz", "scalaz-core") -> url("https://scalazproject.ci.cloudbees.com/job/nightly_2.10/ws/target/scala-2.10/unidoc/")
    , findManagedDependency("com.typesafe.play", "play-json") -> url("http://www.playframework.com/documentation/2.2.1/api/scala/")
  )
}
YMMV of course.
The accepted answer is good, but it'll fail when assumptions about exact project dependencies don't hold. Here's a variation that might prove useful:
apiMappings ++= {
  def mappingsFor(organization: String, names: List[String], location: String, revision: (String) => String = identity): Seq[(File, URL)] =
    for {
      entry: Attributed[File] <- (fullClasspath in Compile).value
      module: ModuleID <- entry.get(moduleID.key)
      if module.organization == organization
      if names.exists(module.name.startsWith)
    } yield entry.data -> url(location.format(revision(module.revision)))

  val mappings: Seq[(File, URL)] =
    mappingsFor("org.scala-lang", List("scala-library"), "http://scala-lang.org/api/%s/") ++
    mappingsFor("com.typesafe.akka", List("akka-actor"), "http://doc.akka.io/api/akka/%s/") ++
    mappingsFor("com.typesafe.play", List("play-iteratees", "play-json"), "http://playframework.com/documentation/%s/api/scala/index.html", _.replaceAll("[\\d]$", "x"))

  mappings.toMap
}
(Including scala-library here is redundant, but useful for illustration purposes.)
If you perform mappings foreach println, you'll get output like (note that I don't have Akka in my dependencies):
(/Users/michaelahlers/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.11.7.jar,http://scala-lang.org/api/2.11.7/)
(/Users/michaelahlers/.ivy2/cache/com.typesafe.play/play-iteratees_2.11/jars/play-iteratees_2.11-2.4.6.jar,http://playframework.com/documentation/2.4.x/api/scala/)
(/Users/michaelahlers/.ivy2/cache/com.typesafe.play/play-json_2.11/jars/play-json_2.11-2.4.6.jar,http://playframework.com/documentation/2.4.x/api/scala/)
This approach:
Allows none or many matches for the module identifier.
Concisely supports multiple modules linking to the same documentation (or, with Nil provided for names, all modules for an organization).
Defers to the module as the version authority, but lets you map over versions as needed (as with Play's libraries, where x is used for the patch number).
Those improvements allow you to create a separate SBT file (call it scaladocMappings.sbt) that can be maintained in a single location and easily copied and pasted into any project.
As an alternative to my last suggestion, the sbt-api-mappings plugin by ThoughtWorks shows a lot of promise. Long term, that's a far more sustainable route than each project maintaining its own set of mappings.

Does sbt have something like gradle's processResources task with ReplaceTokens support?

We are moving to Scala/SBT from a Java/Gradle stack. Our Gradle builds were leveraging a task called processResources and an Ant filter named ReplaceTokens to dynamically replace tokens in a checked-in .properties file without actually changing the .properties file (just changing the output). The Gradle task looks like:
processResources {
    def whoami = System.getProperty('user.name')
    def hostname = InetAddress.getLocalHost().getHostName()
    def buildTimestamp = new Date().format('yyyy-MM-dd HH:mm:ss z')
    filter ReplaceTokens, tokens: [
        "buildsig.version"    : project.version,
        "buildsig.classifier" : project.classifier,
        "buildsig.timestamp"  : buildTimestamp,
        "buildsig.user"       : whoami,
        "buildsig.system"     : hostname,
        "buildsig.tag"        : buildTag
    ]
}
This task locates all the template files in the src/main/resources directory, performs the requisite substitutions and outputs the results at build/resources/main. In other words it transforms src/main/resources/buildsig.properties from...
buildsig.version=#buildsig.version#
buildsig.classifier=#buildsig.classifier#
buildsig.timestamp=#buildsig.timestamp#
buildsig.user=#buildsig.user#
buildsig.system=#buildsig.system#
buildsig.tag=#buildsig.tag#
...to build/resources/main/buildsig.properties...
buildsig.version=1.6.5
buildsig.classifier=RELEASE
buildsig.timestamp=2013-05-06 09:46:52 PDT
buildsig.user=jenkins
buildsig.system=bobk-mbp.local
buildsig.tag=dev
Which, ultimately, finds its way into the WAR file at WEB-INF/classes/buildsig.properties. This works like a champ to record build specific information in a Properties file which gets loaded from the classpath at runtime.
What do I do in SBT to get something like this done? I'm new to Scala / SBT so please forgive me if this seems a stupid question. At the end of the day what I need is a means of pulling some information from the environment on which I build and placing that information into a properties file that is classpath loadable at runtime. Any insights you can give to help me get this done are greatly appreciated.
The sbt-buildinfo plugin is a good option. The README shows an example of how to define custom mappings and mappings that should run on each compile. In addition to the straightforward addition of normal settings like version shown there, you want a section like this:
buildInfoKeys ++= Seq[BuildInfoKey](
  "hostname" -> java.net.InetAddress.getLocalHost().getHostName(),
  "whoami" -> System.getProperty("user.name"),
  BuildInfoKey.action("buildTimestamp") {
    java.text.DateFormat.getDateTimeInstance.format(new java.util.Date())
  }
)
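The generated object can then be read at runtime like any other class (a sketch; it assumes buildInfoPackage := "buildsig" and the keys defined above):
import buildsig.BuildInfo

object ShowBuildSig extends App {
  println(BuildInfo.hostname)       // build machine's hostname
  println(BuildInfo.whoami)         // user who ran the build
  println(BuildInfo.buildTimestamp) // when the build ran
}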
Would the following be what you're looking for?
sbt-editsource: An SBT plugin for editing files
sbt-editsource is a text substitution plugin for SBT 0.11.x and
greater. In a way, it’s a poor man’s sed(1), for SBT. It provides the
ability to apply line-by-line substitutions to a source text file,
producing an edited output file. It supports two kinds of edits:
Variable substitution, where ${var} is replaced by a value.
sed-like regular expression substitution.
This is from Community Plugins.

Easiest way to specify alternate transmogrifier _path?

I'm doing a content migration with collective.transmogrifier and I'm reading files off the file system with transmogrify.filesystem. Instead of importing the files "as is", I'd like to import them to a subdirectory in Plone. What is the easiest way to modify the _path?
For example, if the following exists:
/var/www/html/bar/index.html
I'd like to import to:
/Plone/foo/bar/index.html
In other words, import the contents under /var/www/html to a subdirectory "foo". I see two options:
Use some blueprint in collective.transmogrifier to mangle _path.
Write some blueprint to mangle _path.
Am I missing anything easier?
Use the standard inserter blueprint to generate the paths; it accepts Python expressions and can replace keys in place:
[manglepath]
blueprint = collective.transmogrifier.sections.inserter
key = string:_path
value = python:item['_path'].replace('/var/www/html', '/Plone/foo')
This takes the output of the value Python expression (which uses the item's existing _path) and stores it back under the same key. With the section above, /var/www/html/bar/index.html would be imported at /Plone/foo/bar/index.html.
