How do I get the path of a downloaded deno module? - deno

Given a downloaded deno module like:
import x from "https://deno.land/x/reggi@0.0.1/calendar/denodb/fresh/island.ts"
How can I find where this specific module is located on the host machine?
➜ deno.land cd x
cd: no such file or directory: x
➜ deno.land pwd
/Users/thomasreggi/Library/Caches/deno/deps/https/deno.land
There seems to be no "x" dir here 👆
Where is the file stored locally?
How can I derive the local file path from the URL via code / an API?

Notes:
The Linking to Third Party Code section in the manual will be a great entrypoint reference on this topic.
The below information was written when Deno's latest release was version 1.25.1.
Deno has a dependency inspector tool:
deno info [URL] will inspect an ES module and all of its dependencies.
This tool prints all of the dependencies of a module to stdout, and the output includes the module's remote origin as well as its cached path on your local storage device.
It also has a (currently unstable) argument --json which will print the dependencies in JSON format (which is easier to parse programmatically if that's your goal).
You can create a subprocess in your program that runs the above command with that argument, then parse its output to programmatically determine a module's cached location. (Also note that the subprocess API is changing: the currently unstable APIs Deno.spawn and Deno.spawnChild will replace the current one.)
Here's some example output from a module with fewer dependencies than the one in your question:
% deno info https://deno.land/std@0.154.0/testing/asserts.ts
local: /Users/deno/.deno/deps/https/deno.land/ce1734220fb8c205d2de1b52a59b20401c59d0707abcc9765dbeb60c25483df9
type: TypeScript
dependencies: 3 unique (total 46.65KB)

https://deno.land/std@0.154.0/testing/asserts.ts (23.22KB)
├── https://deno.land/std@0.154.0/fmt/colors.ts (11.62KB)
├─┬ https://deno.land/std@0.154.0/testing/_diff.ts (11.11KB)
│ └── https://deno.land/std@0.154.0/fmt/colors.ts *
└── https://deno.land/std@0.154.0/testing/_format.ts (705B)
Output when using the --json argument:
% deno info --json https://deno.land/std@0.154.0/testing/asserts.ts
{
  "roots": [
    "https://deno.land/std@0.154.0/testing/asserts.ts"
  ],
  "modules": [
    {
      "kind": "esm",
      "local": "/Users/deno/.deno/deps/https/deno.land/02d8068ecd90393c6bf5c8f69b02882b789681b5c638c210545b2d71e604b585",
      "emit": null,
      "map": null,
      "size": 11904,
      "mediaType": "TypeScript",
      "specifier": "https://deno.land/std@0.154.0/fmt/colors.ts"
    },
    {
      "dependencies": [
        {
          "specifier": "../fmt/colors.ts",
          "code": {
            "specifier": "https://deno.land/std@0.154.0/fmt/colors.ts",
            "span": {
              "start": {
                "line": 11,
                "character": 7
              },
              "end": {
                "line": 11,
                "character": 25
              }
            }
          }
        }
      ],
      "kind": "esm",
      "local": "/Users/deno/.deno/deps/https/deno.land/62cb97c1d18d022406d28b201c22805c58600e9a6d837b0fc4b71621ed21e30d",
      "emit": null,
      "map": null,
      "size": 11380,
      "mediaType": "TypeScript",
      "specifier": "https://deno.land/std@0.154.0/testing/_diff.ts"
    },
    {
      "kind": "esm",
      "local": "/Users/deno/.deno/deps/https/deno.land/3f50b09108fe404c8274e994b417a0802863842e740c1d7ca43c119c0ee0f14b",
      "emit": null,
      "map": null,
      "size": 705,
      "mediaType": "TypeScript",
      "specifier": "https://deno.land/std@0.154.0/testing/_format.ts"
    },
    {
      "dependencies": [
        {
          "specifier": "../fmt/colors.ts",
          "code": {
            "specifier": "https://deno.land/std@0.154.0/fmt/colors.ts",
            "span": {
              "start": {
                "line": 10,
                "character": 32
              },
              "end": {
                "line": 10,
                "character": 50
              }
            }
          }
        },
        {
          "specifier": "./_diff.ts",
          "code": {
            "specifier": "https://deno.land/std@0.154.0/testing/_diff.ts",
            "span": {
              "start": {
                "line": 11,
                "character": 44
              },
              "end": {
                "line": 11,
                "character": 56
              }
            }
          }
        },
        {
          "specifier": "./_format.ts",
          "code": {
            "specifier": "https://deno.land/std@0.154.0/testing/_format.ts",
            "span": {
              "start": {
                "line": 12,
                "character": 23
              },
              "end": {
                "line": 12,
                "character": 37
              }
            }
          }
        }
      ],
      "kind": "esm",
      "local": "/Users/deno/.deno/deps/https/deno.land/ce1734220fb8c205d2de1b52a59b20401c59d0707abcc9765dbeb60c25483df9",
      "emit": null,
      "map": null,
      "size": 23776,
      "mediaType": "TypeScript",
      "specifier": "https://deno.land/std@0.154.0/testing/asserts.ts"
    }
  ],
  "redirects": {}
}
However, if your ultimate goal is to cache modules (that come from a remote origin) to your local storage device and import them from that location rather than use Deno's built-in cache, I recommend using the built-in tool for that: deno vendor. From its manual page:
deno vendor <specifiers>... will download all remote dependencies of the specified modules into a local vendor folder.
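For example (a sketch only — main.ts is a hypothetical entry point, and the exact layout may vary by Deno version), vendoring the module from the question and then running against the generated import map could look like:

% deno vendor https://deno.land/x/reggi@0.0.1/calendar/denodb/fresh/island.ts
% deno run --import-map=vendor/import_map.json main.ts

The vendor folder then contains plain copies of every remote dependency, and vendor/import_map.json remaps the original URLs to those local files.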
Update: Here's an example script demonstrating the method I described above:
so-73596066.ts:
/// <reference lib="deno.unstable" />

import { assertExists } from "https://deno.land/std@0.154.0/testing/asserts.ts";

/*
`deno info --json` is unstable, and I didn't find any mention of a schema for its
output in the docs, but here's a (conservative) partial type for the bits
that are relevant to this example, derived from looking at just a couple
of outputs from Deno v1.25.1:
*/

type ModuleInfo =
  & Record<"kind" | "local" | "mediaType" | "specifier", string>
  & Record<"emit" | "map", string | null>
  & {
    dependencies?: unknown[];
    size: number;
  };

type DependencyInspectorResult = {
  modules: ModuleInfo[];
  roots: string[];
};

/**
 * Creates a formatted error message and allows for improved error handling by
 * discriminating error instances
 */
class ProcessError extends Error {
  override name = "ProcessError";
  constructor(status: Deno.ChildStatus, stdErr?: string) {
    let msg = `The process exited with status code ${status.code}`;
    if (stdErr) msg += `. stderr:\n${stdErr}`;
    super(msg);
  }
}

/**
 * Parses output from `deno info --json`. The resulting command will look like:
 * `deno info --json [...denoInfoArgs] specifier`
 * @param specifier local/remote path/URL
 * @param denoInfoArgs optional, additional arguments to be used with `deno info --json`
 */
async function getCachedModuleInfo(
  specifier: string | URL,
  denoInfoArgs?: string[],
): Promise<ModuleInfo> {
  const decoder = new TextDecoder();
  const specifierStr = String(specifier);

  const args = ["info", "--json"];
  if (denoInfoArgs?.length) args.push(...denoInfoArgs);
  args.push(specifierStr);

  const { stderr, stdout, ...status } = await Deno.spawn("deno", { args });

  if (!status.success) {
    const stdErr = decoder.decode(stderr).trim();
    throw new ProcessError(status, stdErr);
  }

  const result = JSON.parse(
    decoder.decode(stdout),
  ) as DependencyInspectorResult;

  const moduleInfo = result.modules.find((info) =>
    info.specifier === specifierStr
  );

  assertExists(moduleInfo, "Module not found in output");
  return moduleInfo;
}

/**
 * `console.log` truncates long strings and deep object properties by default.
 * This overrides that behavior.
 */
function print(value: unknown): void {
  const inspectOpts: Deno.InspectOptions = {
    colors: true,
    depth: Infinity,
    strAbbreviateSize: Infinity,
  };
  const formattedOutput = Deno.inspect(value, inspectOpts);
  console.log(formattedOutput);
}

async function main() {
  const moduleInfo = await getCachedModuleInfo(
    "https://deno.land/std@0.154.0/testing/asserts.ts",
  );

  const { local, specifier } = moduleInfo;
  print({ specifier, local });
}

if (import.meta.main) main();
% deno --version
deno 1.25.1 (release, x86_64-apple-darwin)
v8 10.6.194.5
typescript 4.7.4

% deno run --allow-run=deno --unstable so-73596066.ts
{
  specifier: "https://deno.land/std@0.154.0/testing/asserts.ts",
  local: "/Users/deno/.deno/deps/https/deno.land/ce1734220fb8c205d2de1b52a59b20401c59d0707abcc9765dbeb60c25483df9"
}

Related

How to configure cypress-sql-server with no cypress.json? (updated)

I'm trying to set up cypress-sql-server, but I'm using version 10.8.0, which does not use cypress.json to configure the environment. All of the setup instructions I've found refer to using cypress.json to configure the plug-in. With the help of u/Fody, I'm closer, but I'm still running into an error:
task  sqlServer:execute, SELECT 'Bob'
CypressError
cy.task('sqlServer:execute') failed with the following error:
The 'task' event has not been registered in the setupNodeEvents method. You must register it before using cy.task()
Fix this in your setupNodeEvents method here:
D:\git\mcare.automation\client\cypress\cypress.config.js
node_modules/cypress-sql-server/src/commands/db.js:7:1
  5 | }
  6 |
> 7 | cy.task('sqlServer:execute', query).then(response => {
    | ^
  8 |   let result = [];
  9 |
cypress.config.js
const { defineConfig } = require("cypress");
const sqlServer = require("cypress-sql-server");
module.exports = defineConfig({
e2e: {
setupNodeEvents(on, config) {
// allows db data to be accessed in tests
config.db = {
"userName": "user",
"password": "pass",
"server": "myserver",
"options": {
"database": "mydb",
"encrypt": true,
"rowCollectionOnRequestCompletion": true
}
}
// code from /plugins/index.js
const tasks = sqlServer.loadDBPlugin(config.db);
on('task', tasks);
return config
// implement node event listeners here
},
},
});
testSQL.spec.js
describe('Testing SQL queries', () => {
  it("It should return Bob", () => {
    cy.sqlServer("SELECT 'Bob'").should('eq', 'Bob');
  });
})
My versions:
\cypress> npx cypress --version
Cypress package version: 10.8.0
Cypress binary version: 10.8.0
Electron version: 19.0.8
Bundled Node version: 16.14.2
Suggestions? Is there any more info I can provide to help?
These are the install instructions currently given by cypress-sql-server for Cypress v9:
Plugin file
The plug-in can be initialised in your cypress/plugins/index.js file as below.
const sqlServer = require('cypress-sql-server');

module.exports = (on, config) => {
  tasks = sqlServer.loadDBPlugin(config.db);
  on('task', tasks);
}
Translating that into Cypress v10+
const { defineConfig } = require('cypress')
const sqlServer = require('cypress-sql-server');

module.exports = defineConfig({
  e2e: {
    setupNodeEvents(on, config) {
      // allows db data to be accessed in tests
      config.db = {
        "userName": "user",
        "password": "pass",
        "server": "myserver",
        "options": {
          "database": "mydb",
          "encrypt": true,
          "rowCollectionOnRequestCompletion": true
        }
      }

      // code from /plugins/index.js
      const tasks = sqlServer.loadDBPlugin(config.db);
      on('task', tasks);
      return config
    },
  },
})
Other variations work, such as putting the "db": {...} section below the "e2e": {...} section, but not in the "env": {...} section.
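For reference, here is a sketch of that variation (same placeholder values as above; it simply renders the description in the previous sentence and is not taken from the plugin's docs):

const { defineConfig } = require('cypress')
const sqlServer = require('cypress-sql-server');

module.exports = defineConfig({
  e2e: {
    setupNodeEvents(on, config) {
      const tasks = sqlServer.loadDBPlugin(config.db);
      on('task', tasks);
      return config
    },
  },
  // "db" sits beside "e2e" instead of being assigned inside setupNodeEvents
  db: {
    "userName": "user",
    "password": "pass",
    "server": "myserver",
    "options": {
      "database": "mydb",
      "encrypt": true,
      "rowCollectionOnRequestCompletion": true
    }
  }
})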
Custom commands
Instructions for Cypress v9
Commands file
The extension provides multiple sets of commands. You can import the ones you need.
Example support/index.js file.
import sqlServer from 'cypress-sql-server';
sqlServer.loadDBCommands();
For Cypress v10+
Just move this code to support/e2e.js
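In other words, the same two lines from the v9 instructions just live in the new support file (a minimal sketch):

// cypress/support/e2e.js
import sqlServer from 'cypress-sql-server';
sqlServer.loadDBCommands();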
cypress.json is a way to specify Cypress environment variables. Instead of using a cypress.json file, you can use any of the strategies in that link.
If you just wanted to include them in your cypress.config.js, it would look something like this:
const { defineConfig } = require('cypress')

module.exports = defineConfig({
  e2e: {
    baseUrl: 'http://localhost:1234',
    env: {
      db: {
        // your db values here
      }
    }
  }
})
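If you go that route, the plugin initialization reads the values from config.env rather than config, since anything placed under env in the config file is exposed on config.env inside setupNodeEvents. A sketch combining the two snippets above (not taken from the plugin's docs):

const { defineConfig } = require('cypress')
const sqlServer = require('cypress-sql-server');

module.exports = defineConfig({
  e2e: {
    setupNodeEvents(on, config) {
      // env values from the config file are available on config.env here
      const tasks = sqlServer.loadDBPlugin(config.env.db);
      on('task', tasks);
      return config
    },
    env: {
      db: {
        // your db values here
      }
    }
  }
})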

Configure eslint to read common types in src/types/types.d.ts vue3

My eslint .eslintrc.js, now properly in the src folder, is the following:
module.exports = {
  env: {
    browser: true,
    commonjs: true,
    es2021: true
  },
  extends: [
    'plugin:vue/vue3-recommended',
    'standard',
    'prettier'
  ],
  parserOptions: {
    ecmaVersion: 2020,
    parser: '@typescript-eslint/parser',
    'ecmaFeatures': {
      'jsx': true
    }
  },
  plugins: [
    'vue',
    '@typescript-eslint'
  ],
  rules: {
    'import/no-unresolved': 'error'
  },
  settings: {
    'import/parsers': {
      '@typescript-eslint/parser': ['.ts', '.tsx']
    },
    'import/resolver': {
      'typescript': {
        'alwaysTryTypes': true,
      }
    }
  }
}
I'm attempting to use eslint-import-resolver-typescript, but the documentation is a bit opaque.
I currently get errors on lines where an externally defined type is used (StepData in this example):
setup() {
  const data = inject("StepData") as StepData;
  return {
    data,
  };
},
The answer was the following. In types.d.ts (or another file, if you want to keep separate collections of your custom types):
export interface MyType {
  positioner: DOMRect;
  content: DOMRect;
  arrow: DOMRect;
  window: DOMRect;
}

export interface SomeOtherType {
  // ... and so on
Then in the .vue files, import the types I need for the component:
import type { MyType, SomeOtherType } from "../types/types";
Before, I was not using the export keyword and the types just worked without being imported. Once you use export, they have to be imported like this. It's kind of amazing that you are just expected to know this; the documentation for TypeScript and Vue is sorely lacking in examples.
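The underlying rule is general TypeScript behavior: a .d.ts file with no top-level import or export is treated as a script, so everything it declares is ambient (globally visible); as soon as the file contains an export (or import), it becomes a module and its types must be imported explicitly. A minimal sketch (file names and the StepData fields are hypothetical):

// src/types/ambient.d.ts — hypothetical file with no import/export, so StepData is ambient/global
interface StepData {
  title: string;
  completed: boolean;
}

// src/types/types.d.ts — contains `export`, so it is a module
export interface MyType {
  positioner: DOMRect;
  content: DOMRect;
}

// In a component's <script lang="ts">: module types need an explicit import,
// while ambient ones can be referenced directly
import type { MyType } from "../types/types";

const step: StepData = { title: "Intro", completed: false };
let rects: MyType | undefined;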

Cannot Find Service - GHZ to load test GRPC Service

I'm trying to test a GRPC Service using GHZ. However, I get the error -
Cannot find service "com.server.grpc.Executor"
Config.json file:
"proto": "/Users/dev/Desktop/ghz/execute.proto",
"call": "com.server.grpc.Executor.execute",
"total": 2000,
"concurrency": 50,
"data": {
"param1": "test-data1",
"param2": "test-data2",
},
"max-duration": "10s",
"host": "<ip-address>:9090",
"c": 10,
"n": 200
}
proto file:
option java_package = "com.server.grpc";
option java_multiple_files = true;

service Executor {
  rpc execute(ExecuteRequest) returns (ExecuteResponse);
}

message ExecuteRequest {
  string param1 = 1;
  string param2 = 2;
}

message ExecuteResponse {
  bool res = 1;
  string msg = 2;
}
Running using command: ghz --config=<path/to/config>/config.json
Is there anything I'm missing?
Your protobuf file should contain e.g.:
syntax = "proto3";
package example;
...
Then your service would be fully qualified as example.Executor.execute, not com.server.grpc.Executor.execute, which is a language-specific (Java, I assume, from your option) fully-qualified name.
I assume you unintentionally omitted the opening brace ({) of the JSON file, but that, of course, is required.
JSON is challenging: your "param2": "test-data2" must not be terminated with a comma because it's the last item in the object, so drop that comma.
{
  "proto": "/Users/dev/Desktop/ghz/execute.proto",
  "call": "example.Executor.execute",
  "total": 2000,
  "concurrency": 50,
  "data": {
    "param1": "test-data1",
    "param2": "test-data2"
  },
  "max-duration": "10s",
  "host": "<ip-address>:9090",
  "c": 10,
  "n": 200
}
Assuming your service is running on <ip-address>:9090, that should then work!

Google Cloud HTTPS function (Error: There is an account problem for the requested project)

UPDATE
It seems that I can't get the bucket's reference correctly or use the bucket functions.
When I do
// Get the photos' bucket
const bucket = gcs.bucket("photos");
console.log("deleting the bucket");
// Delete bucket
bucket.delete();
I get the error too (but the bucket exists in my project)
Error: There is an account problem for the requested project.
    at new ApiError (/workspace/node_modules/@google-cloud/common/build/src/util.js:58:15)
    at Util.parseHttpRespBody (/workspace/node_modules/@google-cloud/common/build/src/util.js:193:38)
    at Util.handleResp (/workspace/node_modules/@google-cloud/common/build/src/util.js:134:117)
    at retryRequest (/workspace/node_modules/@google-cloud/common/build/src/util.js:432:22)
    at onResponse (/workspace/node_modules/retry-request/index.js:206:7)
    at /workspace/node_modules/teeny-request/build/src/index.js:233:13
    at process._tickCallback (internal/process/next_tick.js:68:7)
PROBLEM
I am having this error:
{
  "error": {
    "code": 401,
    "message": "There is an account problem for the requested project.",
    "errors": [
      {
        "message": "There is an account problem for the requested project.",
        "domain": "global",
        "reason": "required",
        "locationType": "header",
        "location": "Authorization"
      }
    ]
  }
}
PassThrough {
Unhandled error { Error: There is an account problem for the requested project
    at new ApiError (/workspace/node_modules/@google-cloud/common/build/src/util.js:58:15)
  _readableState:
   ReadableState {
    at Util.parseHttpRespBody (/workspace/node_modules/@google-cloud/common/build/src/util.js:193:38)
...
What I am doing is just:
1. Get a bucket reference
2. Get a file from the bucket with the file's path
3. Get the metadata and the signed URL of the file
My problem is the third step. When I call those functions, I don't receive any answer... no metadata, no url.
Here is the code:
const { Storage } = require("#google-cloud/storage");
const gsc = new Storage()
const serviceAccount = require("./serviceAccount.json");
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
databaseURL: "https://my_app_id.firebaseio.com",
storageBucket: "my_app_id.appspot.com",
});
exports.validateImageDimensions = functions
.region("us-central1")
// Increased memory, decreased timeout (compared to defaults)
.runWith({ memory: "2GB", timeoutSeconds: 120 })
.https.onCall(async (data, context) => {
// Get the image's owner
const owner = context.auth.token.uid;
// Get the image's info
const { id, description, location, tags, time } = data;
// Get the photos' bucket
const bucket = gcs.bucket("photos");
// Get the file's path
const filePath = bucket.name + "/" + id;
// Get the file
const file = bucket.file(filePath);
console.log(JSON.stringify(file));
// Check if the file is a jpeg image
const metadata = await file.getMetadata(); // <------- ERROR
console.log(JSON.stringify(metadata));
const isJpgImage = metadata[0].contentType === "image/jpeg";
console.log(`Is jpeg? ${isJpgImage}`);
// Get the file's signed urls
const signedUrls = await file.getSignedUrl({
action: "read",
expires: "01-01-2100",
});
// signedUrls[0] contains the file's public URL
const publicUrl = signedUrls[0];
console.log(`publicUrl ${publicUrl}`);
console.log(`Url: ${publicUrl}`);
The file exists in my storage... and when I console.log it I can see:
{
  "_eventsCount": 0,
  "_events": {},
  "id": "photos%2Fecbb4a5a-aa11-451f-b8af-efd3e8464a59",
  "baseUrl": "/o",
  "metadata": {},
  "acl": { "pathPrefix": "/acl", "owners": {}, "readers": {}, "writers": {} },
  "methods": {
    "exists": { "reqOpts": { "qs": {} } },
    "get": { "reqOpts": { "qs": {} } },
    "delete": { "reqOpts": { "qs": {} } },
    "getMetadata": { "reqOpts": { "qs": {} } },
    "setMetadata": { "reqOpts": { "qs": {} } }
  },
  "bucket": {
    "_eventsCount": 0,
    "baseUrl": "/b",
    "storage": {
      "baseUrl": "https://storage.googleapis.com/storage/v1",
      "projectId": "{{projectId}}",
      "authClient": {
        "cachedCredential": null,
        "scopes": [
          "https://www.googleapis.com/auth/iam",
          "https://www.googleapis.com/auth/cloud-platform",
          "https://www.googleapis.com/auth/devstorage.full_control"
        ],
        "jsonContent": null,
        "_cachedProjectId": null
      },
      "globalInterceptors": [],
      "acl": {
        "OWNER_ROLE": "OWNER",
        "READER_ROLE": "READER",
        "WRITER_ROLE": "WRITER"
      },
      ...
Any ideas? I have been googling for hours but no answer.
Also, my storage rules:
rules_version = '2';

service firebase.storage {
  match /b/{bucket}/o {

    function isSignedIn() {
      return request.auth.uid != null;
    }

    match /photos/{photo} {

      function hasValidSize() {
        // Max. photo size = 30MB (For all dimensions)
        return request.resource.size < 30 * 1024 * 1024;
      }

      function isImage() {
        return request.resource.contentType.matches("image/.*");
      }

      allow read: if true;
      allow write: if isSignedIn() && isImage() && hasValidSize();
    }
  }
}

weird `Method cannot be called on possibly null / undefined value`

The following narrowed-down code:
// @flow
'use strict';

import assert from 'assert';

class Node<V, E> {
  value: V;
  children: ?Map<E, Node<V, E>>;

  constructor(value: V) {
    this.value = value;
    this.children = null;
  }
}

function accessChildren(tree: Node<number, string>): void {
  if (tree.children != null) {
    assert(true); // if you comment this line Flow is ok
    tree.children.forEach((v, k) => {});
  } else {
  }
}
… fails Flow type checking with the following message:
$ npm run flow

> simple-babel-serverside-node-only-archetype@1.0.0 flow /home/blah/blah/blah
> flow; test $? -eq 0 -o $? -eq 2

es6/foo.js:21
 21:     tree.children.forEach( (v,k)=>{});
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ call of method `forEach`. Method cannot be called on possibly null value
 21:     tree.children.forEach( (v,k)=>{});
         ^^^^^^^^^^^^^ null

es6/foo.js:21
 21:     tree.children.forEach( (v,k)=>{});
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ call of method `forEach`. Method cannot be called on possibly undefined value
 21:     tree.children.forEach( (v,k)=>{});
         ^^^^^^^^^^^^^ undefined

Found 2 errors
If the line reading assert(true) is commented out, Flow is satisfied!
What gives?
PS: In case anyone wonders, my .flowconfig, .babelrc and package.json files are nondescript:
.flowconfig
$ cat .flowconfig
[options]
esproposal.class_static_fields=enable
.babelrc
$ cat .babelrc
{
  "presets": ["es2015"],
  "plugins": ["transform-object-rest-spread", "transform-flow-strip-types", "transform-class-properties"]
}
package.json
$ cat package.json
{
  "name": "simple-babel-serverside-node-only-archetype",
  "version": "1.0.0",
  "description": "",
  "main": [
    "index.js"
  ],
  "scripts": {
    "build": "babel es6 --out-dir es5 --source-maps",
    "build-watch": "babel es6 --out-dir es5 --source-maps --watch",
    "start": "node es5/index.js",
    "flow": "flow; test $? -eq 0 -o $? -eq 2"
  },
  "author": "",
  "license": "ISC",
  "devDependencies": {
    "babel-cli": "^6.6.5",
    "babel-core": "^6.7.4",
    "babel-plugin-transform-class-properties": "^6.10.2",
    "babel-plugin-transform-flow-strip-types": "^6.8.0",
    "babel-polyfill": "^6.7.4",
    "babel-preset-es2015": "^6.9.0",
    "babel-runtime": "^6.6.1",
    "flow-bin": "^0.27.0"
  },
  "dependencies": {
    "babel-plugin-transform-object-rest-spread": "^6.8.0",
    "babel-polyfill": "^6.7.4",
    "source-map-support": "^0.4.0"
  }
}
Your case is described here.
Flow cannot know that assert doesn't change the tree.
Add the following lines to your code and run it – you will get a runtime error, because the assert function will set tree.children to null when called.
const root = new Node(1);
const child = new Node(2);
root.children = new Map([['child', child]]);
assert = () => root.children = null;
accessChildren(root);
Yes, it is pretty weird code, but Flow doesn't know that you will not write it.
Others have pointed to the right explanation. Fortunately this works:
// @flow
'use strict';

import assert from 'assert';

class Node<V, E> {
  value: V;
  children: ?Map<E, Node<V, E>>;

  constructor(value: V) {
    this.value = value;
    this.children = null;
  }
}

function accessChildren(tree: Node<number, string>): void {
  const children = tree.children; // save possibly mutable reference to local
  if (children != null) {
    assert(true); // if you comment this line Flow is ok
    children.forEach((v, k) => {});
  } else {
  }
}
Also, in the future Flow will have read-only properties; by declaring children as a read-only property in the class, Flow should be able to preserve the refinement and type-check the original code.
