Jupyter Password Not Hashed - jupyter-notebook

When I try to set up the jupyter notebook password, I don't get a password hash when I open up the jupyter_notebook_config.json file.
This is the content of the JSON file:
{
  "NotebookApp": {
    "password": "argon2:$argon2id$v=19$m=10240,t=10,p=8$pcTg1mB/X5a3XujQqYq/wQ$/UBQBRlFdzmEmxs6c2IzmQ"
  }
}
I've tried running passwd() from Python as well, as in the "Preparing a hashed password" instructions found online, but it produces the same result as above. No hash.
Can someone please let me know what I'm doing wrong?
I'm trying to set up a Jetson Nano in a similar fashion to the Deep Learning Institute Nano build. With that build you can run Jupyter Lab remotely so the Nano can run headless. I'm trying to do the same thing with no luck.
Thanks!

That argon2:$argon2id$... string is the password hash; argon2 is simply the default hashing algorithm:
https://github.com/jupyter/notebook/blob/v6.5.2/notebook/auth/security.py#L23
You can provide a different algorithm, such as sha1, if you like:
>>> from notebook.auth import passwd
>>> from notebook.auth.security import passwd_check
>>>
>>> password = 'myPass123'
>>>
>>> hashed_argon2 = passwd(password)
>>> hashed_sha1 = passwd(password, 'sha1')
>>>
>>> print(hashed_argon2)
argon2:$argon2id$v=19$m=10240,t=10,p=8$JRz5GPqjOYJu/cnfXc5MZw$LZ5u6kPKytIv/8B/PLyV/w
>>>
>>> print(hashed_sha1)
sha1:c29c6aeeecef:0b9517160ce938888eb4a6ec9ca44e3a31da9519
>>>
>>> passwd_check(hashed_argon2, password)
True
>>>
>>> passwd_check(hashed_sha1, password)
True
Check that you don't have a different Jupyter server running on your machine. It happened to me that I kept retrying a password on port 8888 while my intended server was on port 8889.
Another time, Anaconda started a server on localhost:8888 while I was trying to reach a port mapped from a Docker container, also on 8888, and the only way to access it was actually via 0.0.0.0:8888.
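A quick way to check for stray servers is Jupyter's own listing command, which prints the URL (port and token included) of every server it has started:
jupyter notebook list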

Related

Problem with Python script when setting up LDAP for MacOS

I am trying to set up Google secure LDAP on my MacBook Pro running Monterey 12.3, following these instructions from Google. The script fails with:
request.appendData_(NSData.dataWithBytes_length_(CONFIG, len(CONFIG)))
TypeError: Expecting byte-buffer, got str
This is the script from the guide:
#!/usr/bin/python
from OpenDirectory import ODNode, ODSession, kODNodeTypeConfigure
from Foundation import NSMutableData, NSData
import os
import sys
# Reading plist
GOOGLELDAPCONFIGFILE = open(sys.argv[1], "r")
CONFIG = GOOGLELDAPCONFIGFILE.read()
GOOGLELDAPCONFIGFILE.close()
# Write the plist
od_session = ODSession.defaultSession()
od_conf_node, err = ODNode.nodeWithSession_type_error_(od_session, kODNodeTypeConfigure, None)
request = NSMutableData.dataWithBytes_length_(b'\x00'*32, 32)
request.appendData_(NSData.dataWithBytes_length_(CONFIG, len(CONFIG)))
response, err = od_conf_node.customCall_sendData_error_(99991, request, None)
# Edit the default search path and append the new node to allow for login
os.system("dscl -q localhost -append /Search CSPSearchPath /LDAPv3/ldap.google.com")
os.system("bash -c 'echo -e \"TLS_IDENTITY\tLDAP Client\" >> /etc/openldap/ldap.conf' ")
I have tried to find some solutions on Google (e.g. .encode, b'...), but I do not really understand them.
Thanks for the help.
Okay, I found the solution; it was actually posted earlier here:
Error running python script to create google ldap configuration on Macos
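In short: NSData.dataWithBytes_length_ expects bytes, but the guide's script reads the plist as str. A minimal sketch of the fix, assuming the rest of the script stays unchanged, is to read the file in binary mode:
# Read the plist as bytes rather than str
GOOGLELDAPCONFIGFILE = open(sys.argv[1], "rb")
CONFIG = GOOGLELDAPCONFIGFILE.read()
GOOGLELDAPCONFIGFILE.close()
Alternatively, keep text mode and pass CONFIG.encode("utf-8") to NSData.dataWithBytes_length_.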

SFTP with Google Cloud Composer

I need to upload a file via SFTP into an external server through Cloud Composer. The code for the task is as follows:
from airflow import DAG
from airflow.operators.python_operator import PythonVirtualenvOperator
from airflow.operators.dummy_operator import DummyOperator
from datetime import datetime, timedelta

def make_sftp():
    import paramiko
    import pysftp
    import os
    from airflow.contrib.hooks.ssh_hook import SSHHook
    import subprocess
    ssh_hook = SSHHook(ssh_conn_id="conn_id")
    sftp_client = ssh_hook.get_conn().open_sftp()
    return 0

etl_dag = DAG("dag_test",
              start_date=datetime.now(tz=local_tz),
              schedule_interval=None,
              default_args={
                  "owner": "airflow",
                  "depends_on_past": False,
                  "email_on_failure": False,
                  "email_on_retry": False,
                  "retries": 5,
                  "retry_delay": timedelta(minutes=5)})

sftp = PythonVirtualenvOperator(task_id="sftp",
                                python_callable=make_sftp,
                                requirements=["sshtunnel", "paramiko"],
                                dag=etl_dag)

start_pipeline = DummyOperator(task_id="start_pipeline", dag=etl_dag)

start_pipeline >> sftp
In "conn_id" I have used the following options: {"no_host_key_check": "true"}, the DAG runs for a couple of seconds and the fail with the following message:
WARNING - Remote Identification Change is not verified. This wont protect against Man-In-The-Middle attacks\n[2022-02-10 10:01:59,358] {ssh_hook.py:171} WARNING - No Host Key Verification. This wont protect against Man-In-The-Middle attacks\nTraceback (most recent call last):\n File "/tmp/venvur4zvddz/script.py", line 23, in <module>\n res = make_sftp(*args, **kwargs)\n File "/tmp/venvur4zvddz/script.py", line 19, in make_sftp\n sftp_client = ssh_hook.get_conn().open_sftp()\n File "/usr/local/lib/airflow/airflow/contrib/hooks/ssh_hook.py", line 194, in get_conn\n client.connect(**connect_kwargs)\n File "/opt/python3.6/lib/python3.6/site-packages/paramiko/client.py", line 412, in connect\n server_key = t.get_remote_server_key()\n File "/opt/python3.6/lib/python3.6/site-packages/paramiko/transport.py", line 834, in get_remote_server_key\n raise SSHException("No existing session")\nparamiko.ssh_exception.SSHException: No existing session\n'
do I have to set other options? Thank you!
Configuring the SSH connection with key pair authentication
To SSH into the host as a user with username "user_a", an SSH key pair should be generated for that user and the public key added to the host machine. The following steps create an SSH connection for a user that has the required write permissions ("user_a" in this example).
Run the following commands on the local machine to generate the required SSH key:
ssh-keygen -t rsa -f ~/.ssh/sftp-ssh-key -C user_a
“sftp-ssh-key” → Name of the pair of public and private keys (Public key: sftp-ssh-key.pub, Private key: sftp-ssh-key)
“user_a” → User in the VM that we are trying to connect to
chmod 400 ~/.ssh/sftp-ssh-key
Now, copy the contents of the public key sftp-ssh-key.pub into ~/.ssh/authorized_keys of your host system. Check for necessary permissions for authorized_keys and grant them accordingly using chmod.
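For example, if you can already log in to the host as user_a, something along these lines appends the key (the host name is a placeholder):
cat ~/.ssh/sftp-ssh-key.pub | ssh user_a@your-host 'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys && chmod 600 ~/.ssh/authorized_keys'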
I tested the setup with a Compute Engine VM. In the Compute Engine console, edit the VM settings to add the contents of the generated SSH public key into the instance metadata. Detailed instructions can be found here. If you are connecting to a Compute Engine VM, make sure that the instance has the appropriate firewall rule to allow the SSH connection.
Upload the private key to the client machine. In this scenario, the client is the Airflow DAG so the key file should be accessible from the Composer/Airflow environment. To make the key file accessible, it has to be uploaded to the GCS bucket associated with the Composer environment. For example, if the private key is uploaded to the data folder in the bucket, the key file path would be /home/airflow/gcs/data/sftp-ssh-key.
Configuring the SSH connection with password authentication
If password authentication is not configured on the host machine, follow the below steps to enable password authentication.
Set the user password using the below command and enter the new password twice.
sudo passwd user_a
To enable SSH password authentication, you must SSH into the host machine as root and edit /etc/ssh/sshd_config.
Then, change the line PasswordAuthentication no to PasswordAuthentication yes. After making that change, restart the SSH service by running the following command as root.
sudo service ssh restart
Password authentication has been configured now.
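As a sketch, assuming GNU sed and the stock config path, the same edit can also be made non-interactively:
sudo sed -i 's/^PasswordAuthentication no/PasswordAuthentication yes/' /etc/ssh/sshd_config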
Creating connections and uploading the DAG
1.1 Airflow connection with key authentication
Create a connection in Airflow with the below configuration or use the existing connection.
Extra field
The Extra JSON dictionary would look like this. Here, we have uploaded the private key file to the data folder in the Composer environment's GCS bucket.
{
  "key_file": "/home/airflow/gcs/data/sftp-ssh-key",
  "conn_timeout": "30",
  "look_for_keys": "false"
}
1.2 Airflow connection with password authentication
If the host machine is configured to allow password authentication, these are the changes to be made in the Airflow connection.
The Extra parameter can be empty.
The Password parameter is the user_a's user password on the host machine.
The task logs show that the password authentication was successful.
INFO - Authentication (password) successful!
Upload the DAG to the Composer environment and trigger the DAG. I was facing a key validation issue with the latest version of the paramiko library (2.9.2). I tried downgrading paramiko, but the older versions do not seem to support OPENSSH keys. I found an alternative, paramiko-ng, in which the validation issue has been fixed, and changed the Python dependency from paramiko to paramiko-ng in the PythonVirtualenvOperator.
from airflow import DAG
from airflow.operators.python_operator import PythonVirtualenvOperator
from airflow.operators.dummy_operator import DummyOperator
from datetime import datetime, timedelta

def make_sftp():
    import paramiko
    from airflow.contrib.hooks.ssh_hook import SSHHook
    ssh_hook = SSHHook(ssh_conn_id="sftp_connection")
    sftp_client = ssh_hook.get_conn().open_sftp()
    print("=================SFTP Connection Successful=================")
    remote_host = "/home/sftp-folder/sample_sftp_file"      # file path in the host system
    local_host = "/home/airflow/gcs/data/sample_sftp_file"  # file path in the client system
    sftp_client.get(remote_host, local_host)  # GET operation to copy the file from host to client
    sftp_client.close()
    return 0

etl_dag = DAG("sftp_dag",
              start_date=datetime.now(),
              schedule_interval=None,
              default_args={
                  "owner": "airflow",
                  "depends_on_past": False,
                  "email_on_failure": False,
                  "email_on_retry": False,
                  "retries": 5,
                  "retry_delay": timedelta(minutes=5)})

sftp = PythonVirtualenvOperator(task_id="sftp",
                                python_callable=make_sftp,
                                requirements=["sshtunnel", "paramiko-ng", "pysftp"],
                                dag=etl_dag)

start_pipeline = DummyOperator(task_id="start_pipeline", dag=etl_dag)

start_pipeline >> sftp
Results
The sample_sftp_file has been copied from the host system to the specified Composer bucket.
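To confirm the transfer, you can list the file in the bucket from Cloud Shell (the bucket name is a placeholder):
gsutil ls gs://your-composer-bucket/data/sample_sftp_file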

Mosquitto: Starting in local only mode

I have a virtual machine that is supposed to be the host, which can receive and send data. The first picture shows the error I'm getting on my main machine (from which I'm trying to send data); the second picture shows the Mosquitto log on my virtual machine. I'm using the default config, which as far as I know can't cause these problems, at least judging from other examples I have seen. I have very little understanding of how all of this works, so any help is appreciated.
What I have tried on the host machine:
Disabling Windows defender
Adding firewall rules for "mosquitto.exe"
Installing mosquitto on a linux machine
Starting with the release of Mosquitto version 2.0.0 (you are running v2.0.2), the default config will only bind to localhost as a move to a more secure default posture.
If you want to be able to access the broker from other machines, you will need to explicitly edit the config files to either add a new listener that binds to the external IP address (or 0.0.0.0) or add a bind entry for the default listener.
By default it will also only allow anonymous connections (without username/password) from localhost; to allow anonymous connections from remote clients, add:
allow_anonymous true
More details can be found in the 2.0 release notes here
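Put together, a minimal config for remote access might look like this (a listener with no bind address accepts connections on all interfaces):
listener 1883
allow_anonymous true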
You have to run it with:
mosquitto -c mosquitto.conf
mosquitto.conf, which lives in the same folder as the executable (C:\Program Files\mosquitto etc.), has to include the following line:
listener 1883 ip_address_of_the_machine (192.168.1.1 etc.)
By default, the Mosquitto broker will only accept connections from clients on the local machine (the server hosting the broker).
Therefore, a custom configuration needs to be used with your instance of Mosquitto in order to accept connections from remote clients.
On your Windows machine, run a text editor as administrator and paste the following text:
listener 1883
allow_anonymous true
This creates a listener on port 1883 and allows anonymous connections. By default the number of connections is infinite. Save the file to "C:\Program Files\Mosquitto" using a file name with the ".conf" extension such as "your_conf_file.conf".
Open a terminal window and navigate to the mosquitto directory. Run the following command:
mosquitto -v -c your_conf_file.conf
where
-c : specify the broker config file.
-v : verbose mode - enable all logging types. This overrides any logging options given in the config file.
I found I had to add not only bind_address ip_address but also allow_anonymous true before devices could connect successfully to MQTT. Of course, I understand that a better option would be to set a user and password on each device, but that's a next step after everything actually works in the minimum configuration.
For those who use Mosquitto with Homebrew on a Mac: adding these two lines to /opt/homebrew/Cellar/mosquitto/2.0.15/etc/mosquitto/mosquitto.conf fixed my issue.
allow_anonymous true
listener 1883
You can run it with the included 'no-auth' config file like so:
mosquitto -c /mosquitto-no-auth.conf
I had the same problem while running it inside a Docker container (generated with docker-compose).
In the docker-compose.yml file this is done with:
command: mosquitto -c /mosquitto-no-auth.conf
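Whichever way you open the listener, a quick way to verify it from another machine is a small Python client; this is a sketch assuming the paho-mqtt package is installed and the broker runs at 192.168.1.1 (a placeholder):
import paho.mqtt.client as mqtt  # pip install paho-mqtt

client = mqtt.Client()  # on paho-mqtt 2.x: mqtt.Client(mqtt.CallbackAPIVersion.VERSION1)
client.connect("192.168.1.1", 1883, 60)  # broker IP, port, keepalive in seconds
client.publish("test/topic", "hello")    # publish a test message
client.disconnect()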

Proxy authentication using wget on cygwin

My institute recently installed a new proxy server for our network. I am trying to configure my Cygwin environment to be able to run wget and download data from a remote repository.
Browsing the internet I have found two different solutions to my problem, but neither of them seems to work in my case.
The first one I tried was to follow these instructions, so in Cygwin:
cd /cygdrive/c/cygwin64/etc/
nano wgetrc
at the end of the file, I added:
use_proxy = on
http_proxy=http://username:password@my.proxy.ip:my.port/
https_proxy=https://username:password@my.proxy.ip:my.port/
ftp_proxy=http://username:password@my.proxy.ip:my.port/
(of course, using my user and password)
The second approach was what was suggested by this SO post, so in my Cygwin environment:
export http_proxy=http://username:password@my.proxy.ip:my.port/
export https_proxy=https://username:password@my.proxy.ip:my.port/
export ftp_proxy=http://username:password@my.proxy.ip:my.port/
In both cases, when I test wget, I get the following:
$ wget http://www.google.com
--2020-01-30 12:12:22-- http://www.google.com/
Resolving my.proxy.ip (my.proxy.ip)... 10.1XX.XXX.XX
Connecting to my.proxy.ip (my.proxy.ip)|10.1XX.XXX.XX|:8XXX... connected.
Proxy request sent, awaiting response... 407 Proxy Authentication Required
2020-01-30 12:12:22 ERROR 407: Proxy Authentication Required.
It looks as if my username and password were wrong, but I checked them in my browser and my credentials work just fine.
Any idea what this could be due to?
This problem was solved thanks to a suggestion from a user of the AskUbuntu community.
Basically, instead of editing the global configuration file wgetrc, I should have created a new .wgetrc file with my proxy configuration in my Cygwin home directory.
In summary:
Step 1 - Create a .wgetrc file:
nano ~/.wgetrc
Step 2 - Record the proxy info in this file:
use_proxy=on
http_proxy=http://my.proxy.ip:my.port
https_proxy=https://my.proxy.ip:my.port
ftp_proxy=http://my.proxy.ip:my.port
proxy_user=username
proxy_password=password
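With that file in place, the same test should authenticate through the proxy without embedding credentials in the URL:
wget http://www.google.com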

How to run Jupyter notebooks locally with password and no token?

Since an update, the jupyter notebook command runs Jupyter with a token by default, so you have to open a URL like http://localhost:8889/?token=46b110632ds2f...
This is not very convenient, since you need to copy-paste the token from the terminal. How can I run a Jupyter server with a predefined password, so that I can save it in my browser and don't need to copy-paste the token from the command line?
From the command line, you can run:
jupyter notebook password
The command prompt will ask you for the password and then store the hash in a JSON document in your configuration directory.
You can find that directory with:
jupyter --config-dir
If you delete the file, the password will no longer work.
You may wish to set up SSL as well.
You can set every option in a configuration file generated by the command jupyter notebook --generate-config. This produces ~/.jupyter/jupyter_notebook_config.py, with every option explained and commented out.
In this file you can uncomment:
## Allow password to be changed at login for the notebook server.
#
# While logging in with a token, the notebook server UI will give the opportunity
# to the user to enter a new password at the same time that will replace the
# token login mechanism.
#
# This can be set to false to prevent changing password from the UI/API.
c.NotebookApp.allow_password_change = True
and set a starting token, or none at all:
## Token used for authenticating first-time connections to the server.
#
# When no password is enabled, the default is to generate a new, random token.
#
# Setting to an empty string disables authentication altogether, which is NOT
# RECOMMENDED.
c.NotebookApp.token = ''
$ jupyter notebook --port 5000 --no-browser --ip='*' --NotebookApp.token='' --NotebookApp.password=''
This will give the following warnings; understand the risk.
[W 09:04:50.273 NotebookApp] WARNING: The notebook server is listening on all IP addresses and not using encryption. This is not recommended.
[W 09:04:50.274 NotebookApp] WARNING: The notebook server is listening on all IP addresses and not using authentication. This is highly insecure and not recommended.
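Putting the pieces together, a minimal sketch of jupyter_notebook_config.py that uses a pre-generated hash and disables the token (replace the hash placeholder with your own, e.g. the output of passwd() shown earlier):
# ~/.jupyter/jupyter_notebook_config.py
c.NotebookApp.password = 'argon2:$argon2id$...'  # paste your own hash here
c.NotebookApp.token = ''                         # disable token authentication
c.NotebookApp.allow_password_change = True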
