.jsbeautifyrc does not accept "\crlf" as eol - js-beautify

I am using this config with js-beautify 1.10.2 on Windows 10.
{
  "indent_size": "2",
  "indent_char": " ",
  "max_preserve_newlines": "1",
  "preserve_newlines": true,
  "keep_array_indentation": false,
  "break_chained_methods": false,
  "indent_scripts": "normal",
  "brace_style": "collapse-preserve-inline",
  "space_before_conditional": true,
  "unescape_strings": false,
  "jslint_happy": false,
  "end_with_newline": false,
  "wrap_line_length": "160",
  "indent_inner_html": false,
  "comma_first": false,
  "e4x": false,
  "indent_empty_lines": false,
  "wrap-attributes": "force-aligned",
  "end-with-newline": "true",
  "eol": "\crlf"
}
And I am calling js-beautify in this way:
js-beautify --config ./.jsbeautifyrc --replace ./apps/**/*.html
It throws this error:
SyntaxError: Unexpected token c in JSON at position 599
at JSON.parse (<anonymous>)
at exports.parse (c:\center\node_modules\config-chain\index.js:54:19)
at exports.json (c:\center\node_modules\config-chain\index.js:70:10)
at module.exports (c:\center\node_modules\config-chain\index.js:17:15)
at Object.exports.interpret (c:\center\node_modules\js-beautify\js\lib\cli.js:279:15)
at Object.<anonymous> (c:\center\node_modules\js-beautify\js\bin\js-beautify.js:4:5)
at Module._compile (internal/modules/cjs/loader.js:776:30)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:787:10)
at Module.load (internal/modules/cjs/loader.js:653:32)
at tryModuleLoad (internal/modules/cjs/loader.js:593:12)
Error while loading beautifier configuration.
Configuration file chain included:
c:\center\.jsbeautifyrc
c:\center\.jsbeautifyrc
c:\center\node_modules\js-beautify\js\lib/../config/defaults.json
If I remove the eol key it works, but of course it then sets the end of line in a way I do not want.
I have tried "\\crlf", "/\crlf" and "crlf", but that just puts that exact text at each new line.
Is this a known issue? Am I declaring it wrongly?

The escape for carriage return is \r (ASCII 13) and for line feed it is \n (ASCII 10); "CRLF" is merely the abbreviation for that pair. \c is not a valid JSON escape sequence, which is why JSON.parse fails with "Unexpected token c".
Try:
"eol": "\r\n"
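For reference, only the eol value needs to change; an excerpt of the corrected config would look like this (the remaining keys from the file above stay as they are):
{
  "indent_size": "2",
  "wrap_line_length": "160",
  "eol": "\r\n"
}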

Related

Turning on the 'esModuleInterop' flag

I'm using Firebase Functions for my app. I installed them on my PC, but I can't run the following command:
firebase deploy --only functions
I get the following error:
node_modules/google-gax/build/protos/iam_service.d.ts:17:23 - error TS2497: This module
can only be referenced with ECMAScript imports/exports by turning on the
'esModuleInterop' flag and referencing its default export.
17 import * as Long from 'long';
~~~~~~
node_modules/google-gax/build/protos/operations.d.ts:17:23 - error TS2497: This module
can only be referenced with ECMAScript imports/exports by turning on the
'esModuleInterop' flag and referencing its default export.
17 import * as Long from 'long';
~~~~~~
Found 2 errors in 2 files.
Errors Files
1 node_modules/google-gax/build/protos/iam_service.d.ts:17
1 node_modules/google-gax/build/protos/operations.d.ts:17
Error: functions predeploy error: Command terminated with non-zero exit code2
Does anyone know how to turn on this flag?
Just add "skipLibCheck": true to your tsconfig.json, as shown below:
"compilerOptions": {
"module": "commonjs",
"noImplicitReturns": true,
"noUnusedLocals": true,
"outDir": "lib",
"sourceMap": true,
"strict": true,
"target": "es2017",
"skipLibCheck": true
},
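For what it's worth, the flag the error message refers to is itself just another compilerOptions entry, so you can also turn it on directly. Whether that alone silences the errors coming from google-gax's bundled typings can vary, which is why skipLibCheck is often the simpler route; the two can be combined:
"compilerOptions": {
  "module": "commonjs",
  "esModuleInterop": true,
  "skipLibCheck": true
},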

Ansible-Vault conf file not being decrypted when running playbook

I'm working on an Ansible playbook to sign certificates. Inside the playbook I use a conf file that contains an API key; to hide the key I have encrypted the file with Ansible Vault. The problem is that when I run the playbook, it errors out with output saying the file contains no section headers.
fatal: [cxlabs-alln01-sslapi]: FAILED! => {
    "changed": true,
    "cmd": [
        "/usr/local/bin/sslapi_cli",
        "sign",
        "-csr",
        "/etc/sslapi_cli/xxxxxxxx.cisco.com.csr",
        "-out",
        "/etc/sslapi_cli/xxxxxxxx.cisco.com.cer",
        "-confFile",
        "/etc/sslapi_cli/sslapi_cli.conf",
        "-validityPeriod",
        "one_year"
    ],
    "delta": "0:00:00.209337",
    "end": "2022-04-04 15:47:37.772535",
    "invocation": {
        "module_args": {
            "_raw_params": "/usr/local/bin/sslapi_cli sign -csr /etc/sslapi_cli/xxxxxxxx.cisco.com.csr -out /etc/sslapi_cli/xxxxxxxx.cisco.com.cer -confFile /etc/sslapi_cli/sslapi_cli.conf -validityPeriod one_year",
            "_uses_shell": false,
            "argv": null,
            "chdir": null,
            "creates": null,
            "executable": null,
            "removes": null,
            "stdin": null,
            "stdin_add_newline": true,
            "strip_empty_ends": true,
            "warn": true
        }
    },
    "msg": "non-zero return code",
    "rc": 2,
    "start": "2022-04-04 15:47:37.563198",
    "stderr": "File contains no section headers.\nfile: '/etc/sslapi_cli/sslapi_cli.conf', line: 1\n'$ANSIBLE_VAULT;1.1;AES256\\n'",
    "stderr_lines": [
        "File contains no section headers.",
        "file: '/etc/sslapi_cli/sslapi_cli.conf', line: 1",
        "'$ANSIBLE_VAULT;1.1;AES256\\n'"
    ],
    "stdout": "File contains no section headers.\nfile: '/etc/sslapi_cli/sslapi_cli.conf', line: 1\n'$ANSIBLE_VAULT;1.1;AES256\\n'",
    "stdout_lines": [
        "File contains no section headers.",
        "file: '/etc/sslapi_cli/sslapi_cli.conf', line: 1",
        "'$ANSIBLE_VAULT;1.1;AES256\\n'"
    ]
}
I'm not sure what this means, but I think it's because sslapi_cli.conf is not being decrypted when the playbook reads it.
Ansible Vault's purpose is not encrypting arbitrary files, it is encrypting variables. When you encrypt a file with ansible-vault, it is assumed that the file is YAML formatted and can therefore be processed as Ansible variables.
You need to define the API key in an encrypted file, or encrypt the variable inline (https://docs.ansible.com/ansible/latest/user_guide/vault.html#creating-encrypted-variables).
# encrypted_file.yml
my_api_key: foo

# variable encrypted inline:
my_api_key: !vault |
  $ANSIBLE_VAULT;1.1;AES256
  62313365396662343061393464336163383764373764613633653634306231386433626436623361
  6134333665353966363534333632666535333761666131620a663537646436643839616531643561
Then you need to create a template of your sslapi_cli.conf file, something like this:
# sslapi_cli.conf.j2
ssl_api_key: {{ my_api_key }}
And before you execute your task, you need to run a template task (https://docs.ansible.com/ansible/latest/collections/ansible/builtin/template_module.html) that generates the sslapi_cli.conf file with the correct API key, for example as sketched below.
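A minimal sketch of such a task, assuming the paths from the error output above (adjust owner and mode to your environment):
# Render the conf file from the Jinja2 template; the vaulted my_api_key
# variable is decrypted while the template is rendered, so only the
# plain-text key ends up in the generated file on the target host.
- name: Generate sslapi_cli.conf from template
  ansible.builtin.template:
    src: sslapi_cli.conf.j2
    dest: /etc/sslapi_cli/sslapi_cli.conf
    mode: "0600"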

Failed to interpret Gremlin query: Query parsing failed. "single and double quotes are not parsing"

I am running this Gremlin query:
g.V('a48543e9-d527-4928-b045-71da15a76bfe').property(single, 'title', '{'fr': '', 'en': 'Title Edit 02'}')
Getting error:
{
  "detailedMessage": "Failed to interpret Gremlin query: Query parsing failed at line 1, character position at 77, error message : no viable alternative at input 'g.V('a48543e9-d527-4928-b045-71da15a76bfe').property(single,'title','{\\'fr':'",
  "requestId": "4307b026-c0b5-45b0-9ec1-293822ee35ef",
  "code": "MalformedQueryException"
}
I think you just need to escape your quotes, or perhaps simply choose to wrap the value in double quotes:
g.V('a48543e9-d527-4928-b045-71da15a76bfe').
property(single, 'title', "{'fr': '', 'en': 'Title Edit 02'}")
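For completeness, the escaped single-quote variant (the first option mentioned) would look something like this; an untested sketch, but Groovy-style single-quoted strings accept \' escapes:
g.V('a48543e9-d527-4928-b045-71da15a76bfe').
  property(single, 'title', '{\'fr\': \'\', \'en\': \'Title Edit 02\'}')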

Ecto Changeset add functionality for warnings

I created a fork of the Ecto repository to extend the Ecto.Changeset module with the ability to add warnings to a changeset. I wanted an add_warning/4 function that adds a warning to the changeset, stored as a simple list of warnings with the structure warnings: [{atom, {String.t, Keyword.t}}], similar to errors. The difference between warnings and errors is that when an error occurs the data is not persisted, while when a warning occurs the data is still persisted.
The Ecto.Changeset struct is extended with the keys warnings and warningless?:
defstruct valid?: false, warningless?: false, data: nil, params: nil, changes: %{}, repo: nil,
          errors: [], warnings: [], validations: [], required: [], prepare: [],
          constraints: [], filters: %{}, action: nil, types: nil,
          empty_values: @empty_values
The Ecto functions for casting, changing, processing params, etc. were adjusted, and an add_warning/4 function was added:
@spec add_warning(t, atom, String.t, Keyword.t) :: t
def add_warning(%{warnings: warnings} = changeset, key, message, keys \\ []) when is_binary(message) do
  %{changeset | warnings: [{key, {message, keys}} | warnings], warningless?: false}
end
The result is that I receive a changeset with the expected keys:
#Ecto.Changeset<action: nil, changes: %{}, data: #Company.Booking<>, errors: [],
valid?: true, warnings: [], warningless?: true>
When I make a change with an error and a warning I receive:
#Ecto.Changeset<action: nil,
changes: %{pickup_address: #Ecto.Changeset<action: :update,
changes: %{street_name: nil}, data: #Company.Address<>,
errors: [street_name: {"can't be blank", [validation: :required]}],
valid?: false,
warnings: [phone_number: {"This phone number is not common in Netherlands",
[]}], warningless?: false>}, data: #Company.Booking<>, errors: [],
valid?: false, warnings: [], warningless?: true>
So, everything is as expected, as far as warnings are concerned. Then, when I make a change with a warning but without an error, I receive:
#Ecto.Changeset<action: nil,
changes: %{pickup_address: #Ecto.Changeset<action: :update,
changes: %{street_name: "sss"}, data: #Company.Address<>, errors: [],
valid?: true,
warnings: [phone_number: {"This phone number is not common in Netherlands",
[]}], warningless?: false>}, data: #Company.Booking<>, errors: [],
valid?: true, warnings: [], warningless?: true>
Everything is as expected. When I don't make any changes to the form, I should still receive the warning for the phone number, but I receive:
#Ecto.Changeset<action: nil, changes: %{}, data: #Company.Booking<>, errors: [],
valid?: true, warnings: [], warningless?: true>
I get a changeset without any warnings, since the changes map in the changeset is empty because the data didn't change.
The question is: how do I implement the warnings functionality so that warnings are always present in the changeset, even if no change was made?
You should consider prefilling the warnings at the very beginning of each changeset function you create. Since you can't use a plug there, you can write a macro that handles this logic for you; a __using__ macro is advisable, so it is easy to distinguish your logic from Ecto's default logic.
Your validation shouldn't add warnings to the warnings list; implement it the other way around: if the field is fine, remove the already existing warning for that field from the list. That way you can be sure your changeset is fine when it is warningless, because all warnings have been removed from the list, and it also works for empty changes in the changeset. A rough sketch of both points follows below.
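A minimal sketch of that idea, assuming the forked Ecto.Changeset struct with the warnings and warningless? keys described above; the module name, the changeset_with_warnings/2 wrapper (which assumes the schema module already defines a changeset/2) and the prefilled phone_number warning are made up for illustration:
defmodule MyApp.WarningChangeset do
  # Injected into each schema module via `use MyApp.WarningChangeset`.
  defmacro __using__(_opts) do
    quote do
      # Wrap the regular changeset/2 so every changeset starts out
      # carrying the default warnings, even when params are empty.
      def changeset_with_warnings(struct, params) do
        struct
        |> changeset(params)
        |> prefill_warnings()
      end

      defp prefill_warnings(changeset) do
        %{changeset
          | warnings: [phone_number: {"This phone number is not common in Netherlands", []}],
            warningless?: false}
      end

      # Validations clear a warning once the field turns out to be fine,
      # instead of adding one when it is not.
      defp clear_warning(%{warnings: warnings} = changeset, key) do
        remaining = Keyword.delete(warnings, key)
        %{changeset | warnings: remaining, warningless?: remaining == []}
      end
    end
  end
end
Building forms from changeset_with_warnings/2 instead of changeset/2 then keeps the warning visible even when the changes map is empty.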

Chef::Exceptions::ChecksumMismatch when installing nginx-1.7.8 from source

I get the following error when running vagrant up --provision to set up my development environment with Vagrant:
==> default: [2014-12-08T20:33:51+00:00] ERROR: remote_file[http://nginx.org/download/nginx-1.7.8.tar.gz] (nginx::source line 58) had an error: Chef::Exceptions::ChecksumMismatch: Checksum on resource (0510af) does not match checksum on content (12f75e)
My chef JSON has the following for nginx:
"nginx": {
"version": "1.7.8",
"user": "deploy",
"init_style": "init",
"modules": [
"http_stub_status_module",
"http_ssl_module",
"http_gzip_static_module"
],
"passenger": {
"version": "4.0.53",
"gem_binary": "/home/vagrant/.rbenv/shims/gem"
},
"configure_flags": [
"--add-module=/home/vagrant/.rbenv/versions/2.1.5/lib/ruby/gems/2.1.0/gems/passenger-3.0.18/ext/nginx"
],
"gzip_types": [
"text/plain",
"text/html",
"text/css",
"text/xml",
"text/javascript",
"application/json",
"application/x-javascript",
"application/xml",
"application/xml+rss"
]}
and Cheffile has the following cookbook:
cookbook 'nginx'
How do I resolve the Checksum mismatch?
The nginx cookbook requires you to set the checksum attribute when using another version of nginx. The remote_file resource that is causing your error is:
remote_file nginx_url do
  source nginx_url
  checksum node['nginx']['source']['checksum']
  path src_filepath
  backup false
end
You need to update the checksum value. Specifically node['nginx']['source']['checksum'].
So in your JSON, you would add this line:
"source": {"checksum": "insert checksum here" }
Edit: As pointed out in the comments, the checksum is SHA256. You can generate the checksum of the file like so:
shasum -a 256 nginx-1.7.8.tar.gz
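Putting it together, the "nginx" section of your Chef JSON gains a "source" block along these lines; only the keys relevant to the checksum are shown here, your existing version, modules, passenger, etc. attributes stay as they are, and the placeholder must be replaced with the SHA-256 you computed above:
"nginx": {
  "version": "1.7.8",
  "source": {
    "checksum": "insert checksum here"
  }
}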
