In a Turborepo monorepo, how do I import a local build of an external package from an internal package?

I'm working on the structure for a monorepo using Turborepo.
If I have this:
packages/
  development/
    dev-packageA/ // This is a mock for dev that consumes packageA
  packageA/      // packageA is external. It gets deployed
    build/
My question is:
Can I import packageA inside dev-packageA and have it resolve to packageA/build, instead of having to publish a new version of packageA every time I want to test my dev mocks?
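One common pattern (a sketch, not from the question: it assumes the repo uses npm/yarn/pnpm workspaces, which Turborepo sits on top of) is to point packageA's package.json entry at the build output and declare packageA as an ordinary dependency of dev-packageA; the workspace then links the local folder, so no publish is needed. In packages/packageA/package.json:

```json
{
  "name": "packageA",
  "version": "0.0.0",
  "main": "build/index.js"
}
```

and in packages/development/dev-packageA/package.json:

```json
{
  "name": "dev-packageA",
  "dependencies": {
    "packageA": "*"
  }
}
```

With that, importing from 'packageA' resolves to the sibling workspace's build/ output once its build task has run; build/index.js is an assumed entry file name, not something stated in the question.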

Related

Nette - importing naja library

I have a question. I need to make some modification to our nette application (first time working with the framework). I need to import NAJA lib via FTP and by downloading code from github. I do not have access to server or console, so download via composer, npm... is not possible.
I did this:
Downloaded all files from https://github.com/jiripudil/Naja/releases (not the ZIPs) and put them in the vendor directory
Loaded it in app/presenters/templates/@layout.latte with:
<script src="{$basePath}/js/vendor/Naja.js" type="module"></script>
Tried to initialize it in one of my JavaScript files with:
document.addEventListener('DOMContentLoaded', naja.initialize() );
But then an error popped up, saying that naja is not defined. What am I missing?
Naja is written in modern JavaScript, so it can only be initialized via an ES2015+ module import, for instance
import naja from 'naja';
The event listener that initializes naja can then be added after that import, and the code can be compiled with webpack as in https://github.com/MinecordNetwork/Website/blob/master/webpack.config.js
You can do it all on your local machine with npm and yarn installed, and then upload the built bundle.js to the server. The same setup is used in the mentioned repository, so you can check there how it works and which libraries are needed; to compile the code from /public/js/main.js, run yarn encore production, or yarn build for development.
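Two things are going on in the asker's snippet: naja only exists inside a module that imports it, and addEventListener was handed the result of calling naja.initialize() rather than a function. The second point can be shown with a runnable sketch that uses a stand-in naja object (no real library needed):

```javascript
// Stand-in for the real naja object, just to show the listener mechanics.
const naja = {
  initialized: false,
  initialize() { this.initialized = true; }
};

// EventTarget is available in Node 15+ as well as in browsers.
const target = new EventTarget();

// Wrong: naja.initialize() runs immediately and its return value
// (undefined) is what gets registered as the listener.
// target.addEventListener('DOMContentLoaded', naja.initialize());

// Right: pass a function; it runs only when the event actually fires.
target.addEventListener('DOMContentLoaded', () => naja.initialize());

target.dispatchEvent(new Event('DOMContentLoaded'));
console.log(naja.initialized); // true
```

In the real app the same shape applies: import naja from 'naja'; and then document.addEventListener('DOMContentLoaded', () => naja.initialize());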

Imported deno thirdparties in production

Deno does not use a package manager like npm; it imports third-party dependencies with a URL. Let's look at an example:
import { Application } from "https://deno.land/x/abc@v1.0.0-rc8/mod.ts";
Does the deployed code in production contain the content of https://deno.land/x/abc@v1.0.0-rc8/mod.ts, or does the server in production have to send a request to the URL to get the third-party code?
For production, Deno recommends checking your dependencies into git. If you follow that recommendation, your server won't need to download anything, since the modules will already be cached.
To do that, set the DENO_DIR environment variable to specify where you want dependencies to be downloaded:
DENO_DIR=$PWD/vendor deno cache server.ts
# DENO_DIR=$PWD/vendor deno run server.ts
With the above command, all dependencies for server.ts will be downloaded into your project, inside the vendor/ directory, which you can commit to git.
Then, on the production server, you'll have to set DENO_DIR to read from vendor/ instead of the default path, which you can find by running:
deno info
If you don't store the dependencies in your version control system, then Deno will download them once and store them in the DENO_DIR directory.
Taken from the Deno manual:
But what if the host of the URL goes down? The source won't be available.
This, like the above, is a problem faced by any remote dependency system. Relying on external servers is convenient for development but brittle in production. Production software should always vendor its dependencies. In Node this is done by checking node_modules into source control. In Deno this is done by pointing $DENO_DIR to some project-local directory at runtime, and similarly checking that into source control:
# Download the dependencies.
DENO_DIR=./deno_dir deno cache src/deps.ts
# Make sure the variable is set for any command which invokes the cache.
DENO_DIR=./deno_dir deno test src
# Check the directory into source control.
git add -u deno_dir
git commit
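The src/deps.ts mentioned above is the conventional Deno pattern for centralizing remote imports in one module, so a single deno cache pass can download and vendor everything. A minimal sketch, reusing the abc URL from the question:

```typescript
// src/deps.ts: re-export all remote dependencies from one place.
export { Application } from "https://deno.land/x/abc@v1.0.0-rc8/mod.ts";

// Elsewhere in the project, import from deps.ts instead of the raw URL:
// import { Application } from "./deps.ts";
```

That way, upgrading a pinned version means editing one file and re-running the cache command.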

bundle only selected libraries in packrat

I am trying to move a project from my local machine to a server with no internet access and no privileges to install libraries.
The server already has many of the libraries installed, but for my current project there are some libraries and dependencies that are not available on the server.
So I am trying to use packrat to bundle and move the project to the server.
Now the bundle size is becoming huge. I want to bundle only the packages that are not available on the server. How can I do this?
Create a project with all your libraries and work, load the packrat library, and call the bundle() function:
library(packrat)
bundle()
This creates a projname.tar.gz file.
Copy this file into your project folder on the server and call the unbundle() function as follows, where bundle is the name of your bundle file and "." means unbundle into the current folder:
library(packrat)
unbundle(bundle="packlib.tar.gz",where=".")

Symbol export and Meteor local package

I'd like to build a custom authentication procedure for a Meteor app.
To that end, I have created a local package into the myApp/packages folder with the following command:
meteor create --package accounts-custom
As a simple test, I have cloned the accounts-password package code there and added the local package to my app:
meteor add accounts-custom
I would expect this setup to be equivalent to directly adding accounts-password to the app (meteor add accounts-password).
But running the app gives me an error:
Accounts.findUserByEmail is not a function
findUserByEmail is defined by accounts-password... which makes me think that my custom package is not correctly taken into account...
How can I redefine Accounts from a local package? Any insight?
In case this helps anyone: it finally worked after adding api.mainModule('server-main.js', 'server') to the package.js file.
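For context, the surrounding package.js would look roughly like this (a sketch: Package.describe, api.use, and api.mainModule are the standard Meteor package API, but the names, version, and dependency list here are illustrative, not taken from the question):

```js
Package.describe({
  name: 'accounts-custom',
  version: '0.0.1'
});

Package.onUse(function (api) {
  api.use(['ecmascript', 'accounts-base'], ['client', 'server']);
  // Without an explicit server mainModule, the cloned server-side code
  // (where findUserByEmail is attached to Accounts) is never loaded.
  api.mainModule('server-main.js', 'server');
});
```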

profile-refresh in Fuse 6.2 does not reload snapshot bundle

I am running JBoss Fuse 6.2.0.
I built a small camel application that just writes to the log every 5 seconds.
I built it and installed the SNAPSHOT bundle jar in my local Maven repository.
In the Karaf console I did the following:
fabric:profile-create --parent feature-camel logdemo
fabric:profile-edit --bundle mvn:com.company.project/logdemo logdemo
fabric:container-create-child --profile logdemo root child1
The camel application now worked as intended.
I then made a small change to the application, rebuilt it and installed the new SNAPSHOT bundle jar in my local Maven repo.
In the Karaf console I then did the following to get Karaf to load the new jar:
fabric:profile-refresh logdemo
But the loaded application is still the old version.
How do I get Karaf to look for the updated jar in my local maven repo? It seems like it has some internal cache it looks in instead.
Note: We're not using Maven to build the application, so all answers about using Maven plugins like the fabric8 plugin will be rejected.
You should use the fabric:watch * command for that. This will update all containers that run a snapshot version of an artifact that is updated in the local Maven repo. If you want only a specific container to watch for updates, use dev:watch * on the shell of that container.
See http://fabric8.io/gitbook/developer.html
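Putting the answer together with the question's setup, the round trip looks roughly like this (a sketch; logdemo and the mvn:com.company.project/logdemo coordinates come from the question):

```
# In the Karaf console of the root container: watch all snapshot bundles.
fabric:watch *

# Or, to watch from one specific container only, on that container's shell:
dev:watch *

# After rebuilding and reinstalling the SNAPSHOT jar into the local Maven
# repository, the watched containers redeploy mvn:com.company.project/logdemo
# automatically, with no profile-refresh needed.
```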
