According to RFC 1928, the client must first send a message to the server.
Client sends to server:
+----+----------+----------+
|VER | NMETHODS | METHODS |
+----+----------+----------+
| 1 | 1 | 1 to 255 |
+----+----------+----------+
and then the server should return the following message:
+----+--------+
|VER | METHOD |
+----+--------+
| 1 | 1 |
+----+--------+
So I am using the command line to test:
$ echo -e '0x01''0x01''0x01' | nc test.com 30
This command should return
0x01 0x01
but it returns nothing, just a blank.
The decimal numbers appearing in packet-format diagrams represent the length of the corresponding field, in octets, not the value you are supposed to use.
The version field for a SOCKS5 proxy should be 0x05. The values of NMETHODS and METHODS depend on which methods you want to check.
Lastly, the raw response consists of non-printable bytes, so you normally wouldn't be able to see it even if it arrived, and apparently you need to end netcat's input with a CR-LF, or wait forever, before it actually sends the message.
So in summary, to check for method 0x00 (no authentication), you could do something like
printf "\x05\x01\x00\r\n" | nc test.com 30 | hd
and get
00000000 05 00 |..|
00000002
on success.
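If you would rather script the same check than drive nc by hand, here is a minimal sketch in Python of the method-negotiation step described above (assuming, as in the question, a SOCKS5 server listening at test.com:30; adjust host and port to your setup):
import socket

HOST, PORT = "test.com", 30  # placeholders taken from the question

with socket.create_connection((HOST, PORT), timeout=5) as s:
    # VER=0x05, NMETHODS=0x01, METHODS=0x00 (no authentication)
    s.sendall(b"\x05\x01\x00")
    reply = s.recv(2)       # expect two bytes: VER and the selected METHOD
    print(reply.hex())      # "0500" if the server accepts "no authentication"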
I've got this Unix script and I'm trying to translate it to Python. The problem is I don't understand Unix and I have no idea what any of it means.
This is the code:
1 17-23 * * * curl -X PUT -d EM_OperatingMode=10 --header 'Auth-Token: 512532eb-0d57-4a59-8da0-e1e136945ee8' http://192.168.1.70:80/api/v2/configurations
The code you posted is a crontab entry on a Unix system. Let's split it into two parts.
The first part, 1 17-23 * * *, is a schedule. It means the program runs at minute 1 of every hour from 17 through 23, i.e. at 17:01, 18:01, ..., 23:01. Crontab fields are minute, hour, day of month, month, and day of week (see crontab.guru).
The second part, curl -X PUT -d EM_OperatingMode=10 --header 'Auth-Token: 512532eb-0d57-4a59-8da0-e1e136945ee8' http://192.168.1.70:80/api/v2/configurations, is a PUT request to a URL.
curl is a program used to communicate with a server; you can think of it as a command-line counterpart to the browser you normally use. -X PUT issues a PUT request, which updates data on the server, and -d passes the data EM_OperatingMode=10 in the request body. --header adds an HTTP header to the request; here an authentication token is sent so the server can validate access. Finally, the URL being contacted is http://192.168.1.70:80/api/v2/configurations.
So, the value is sent to the server at 192.168.1.70 on the schedule described above. You can use the Python requests library to replicate the HTTP call, while keeping cron as the scheduler that runs the Python program on the same schedule.
It is a cron job that runs the given command automatically on the given schedule: "At minute 1 past every hour from 17 through 23."
Equivalent Python code would be:
import requests

requests.put(
    "http://192.168.1.70:80/api/v2/configurations",
    data={"EM_OperatingMode": 10},
    headers={"Auth-Token": "512532eb-0d57-4a59-8da0-e1e136945ee8"},
)
It is a crontab entry. For example:
* * * * * Command_to_execute
| | | | |
| | | | Day of the Week ( 0 - 6 ) ( Sunday = 0 )
| | | |
| | | Month ( 1 - 12 )
| | |
| | Day of Month ( 1 - 31 )
| |
| Hour ( 0 - 23 )
|
Min ( 0 - 59 )
Here is the source: https://www.tutorialspoint.com/unix_commands/crontab.htm
And the second part is curl with some options (flags): https://curl.se/docs/manpage.html
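If you keep cron as the scheduler, the crontab entry only needs to invoke the Python script instead of curl. A sketch, where the interpreter and script paths are hypothetical placeholders:
1 17-23 * * * /usr/bin/python3 /home/user/update_mode.py
The schedule part stays exactly the same; only the command after it changes.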
I want to use Robot Framework to write and execute Test Cases in Gherkin format.
What I want is that, when I execute a test case, the console output shows, besides the name
of the scenario, each step (When, Then, ...) and whether the step passed or not.
You can achieve this with a listener that uses the framework's listener interface.
The end_keyword listener method is invoked during execution whenever a keyword finishes. It receives the keyword name and its attributes as parameters, so you can log both the name and the status.
You have to filter so that only keywords starting with Given, When, or Then are logged to the console.
Example:
ROBOT_LISTENER_API_VERSION = 2

def start_test(name, attributes):
    # Add an extra new line at the beginning of each test case to have everything aligned.
    print('\n')

def end_keyword(name, attributes):
    if name.startswith('Given') or name.startswith('When') or name.startswith('Then'):
        print(f'{name} | {attributes["status"]} |')
Console output for the behavior-driven development example from the user guide:
robot --pythonpath . --listener listener.py test.robot
==============================================================================
Test
==============================================================================
Add two numbers
Given I have Calculator open | PASS |
.When I add 2 and 40 | PASS |
.Then result should be 42 | PASS |
Add two numbers | PASS |
------------------------------------------------------------------------------
Add negative numbers
Given I have Calculator open | PASS |
.When I add 1 and -2 | PASS |
.Then result should be -1 | PASS |
Add negative numbers | PASS |
------------------------------------------------------------------------------
Test | PASS |
2 critical tests, 2 passed, 0 failed
2 tests total, 2 passed, 0 failed
==============================================================================
I am trying to count the number of connections while grouping them by their 'state'.
This command achieves that goal:
netstat -ant | awk '{ print $6}' | sort | uniq -c
which produces output that looks like this:
4 CLOSE_WAIT
1 established)
127 ESTABLISHED
1 Foreign
2 LAST_ACK
39 LISTEN
9 TIME_WAIT
I am trying to combine my command with the watch command like this:
watch -n 1 "netstat -ant | awk '{ print $6}' | sort | uniq -c"
But the output is just that of the netstat -ant command (and not the output of the whole pipeline).
How can I use that complex command with watch?
This works:
watch -n1 "netstat -ant | awk '{ print \$6}' | sort | uniq -c"
You're passing a double-quoted string that happens to contain single quotes. Inside a double-quoted string, a $ that is meant literally must be escaped ($6 => \$6).
When you don't escape it, watch will likely receive
"netstat -ant | awk '{ print }' | sort | uniq -c"
(as $6 is likely to be unset), which would explain the output you're getting (awk '{ print }' in a pipeline is essentially a no-op, like cat).
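If you would rather avoid the escaping altogether, another option is to put the pipeline into a small script and watch that instead. A sketch, where the script name is a hypothetical placeholder:
#!/bin/sh
# conn_states.sh - count connections grouped by their state
netstat -ant | awk '{ print $6 }' | sort | uniq -c
Then, after chmod +x conn_states.sh, run:
watch -n 1 ./conn_states.sh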
I have a basic question about the EFI mm command.
I need to control a controller (MAC) inside an SoC and have it generate MDIO traffic to an external PHY chip, in order to read its ID.
The instructions told me to do the following under the EFI shell:
Shell> mm xxxxxxx yyyyyyy -w 4 -MEM -n
Shell> mm xxxxxxx -w 4 -MEM -n
I'm wondering what the mm command does.
It looks like it writes the data yyyyyyy to the register xxxxxxx, and then runs mm on this register again?
Not sure why.
Can anyone help me with this?
The mm command is explained in the UEFI Shell Specification:
mm address [value] [-w 1|2|4|8] [-MEM | -PMEM | -MMIO | -IO | -PCI | -PCIE] [-n]
The description states "If value is specified, which should be typed in hex format, this command will write this value to specified address. Otherwise when this command is executed, the current contents of address are displayed.".
So your first command writes the 32-bit value yyyyyyy to address xxxxxxx, and the second command reads a 32-bit value from address xxxxxxx - presumably to verify that the write took effect.
Sample startup.nsh
#Sample startup.nsh
#echo -off
# Clear screen
cls
# Print date and time
date
time
# Set special register of the CPU (Intel Denverton C3758R)
# EFI Shell style is "mm fe000018 29C0202C -w 4 -MEM -n"
# (for more detailed usage, use "help mm")
# but that form is not accepted in startup.nsh (error: "Invalid data width")
# Option "-n" is not needed for a non-interactive write
mm fe000018 4 :29C0002C
# Non-interactive read
mm fe000018 4 -n
mm e00fa0a4 4 -n
# System shutdown
reset -s
I am getting data from the server in a file (in Format1) every day; however, each file contains the data for the last one week.
I have to archive the data for exactly 1.5 months, because this data is picked up to make some graphical representation.
I have tried to merge the files of two days and sort them uniquely (Code1); however, it didn't work, because the name of the raw file changes every day. The timestamp is unique in this file, but I am not sure how to sort the data uniquely based on a specific column. Also, is there any way to delete data older than 1.5 months?
For deletion, the logic I thought of is to subtract the earliest date in the file from today's date, but again I am unable to fetch that earliest date.
Format1
r01/WAS2/oss_change0_5.log:2016-03-21T11:13:36.354+0000 | (307,868,305) | OSS_CHANGE |
com.nokia.oss.configurator.rac.provisioningservices.util.Log.logAuditSuccessWithResources | RACPRS RNC 6.0 or
newer_Direct_Activation: LOCKING SUCCEEDED audit[ | Source='Server' | User identity='vpaineni' | Operation
identifier='CMNetworkMOWriterLocking' | Success code='T' | Cause code='N/A' | Identifier='SUCCESS' | Target element='PLMN-
PLMN/RNC-199/WBTS-720' | Client address='10.7.80.21' | Source session identifier='' | Target session identifier='' |
Category code='' | Effect code='' | Network Transaction identifier='' | Source user identity='' | Target user identity='' |
Timestamp='1458558816354']
Code1
cat file1 file2 |sort -u > file3
Data on Day 2; the input file name differs:
r01/WAS2/oss_change0_11.log:2016-03-21T11:13:36.354+0000 | (307,868,305) | OSS_CHANGE |
com.nokia.oss.configurator.rac.provisioningservices.util.Log.logAuditSuccessWithResources | RACPRS RNC 6.0 or
newer_Direct_Activation: LOCKING SUCCEEDED audit[ | Source='Server' | User identity='vpaineni' | Operation
identifier='CMNetworkMOWriterLocking' | Success code='T' | Cause code='N/A' | Identifier='SUCCESS' | Target element='PLMN-
PLMN/RNC-199/WBTS-720' | Client address='10.7.80.21' | Source session identifier='' | Target session identifier='' |
Category code='' | Effect code='' | Network Transaction identifier='' | Source user identity='' | Target user identity='' |
Timestamp='1458558816354']
I wrote an almost similar kind of code a week back.
Awk is a good tool if you want to do any operation column-wise.
Also, sort -u alone will not work, because the file name at the start of each line changes every day, so identical records from different days are not identical lines.
Both the unique rows and the earliest date can be found using awk.
1. To get the unique file content
cat file1 file2 |awk -F "\|" '!repeat[$21]++' > file3;
Here -F specifies the field separator.
The repeat array is keyed on the 21st field, which is the timestamp,
so only the first occurrence of each timestamp is printed; the rest are ignored.
Finally, the unique content of file1 and file2 will be available in file3.
2. To get the earliest date and find the difference between two dates
Least_Date=`awk -F: '{print substr($2,1,10)}' RMCR10.log | sort | head -1`;
Today_Date=`date +%F`;
Diff=`echo "( \`date -d $Today_Date +%s\` - \`date -d $Least_Date +%s\` ) / (24*3600)" | bc -l`;
Diff1=${Diff/.*};
if [ "$Diff1" -ge "90" ]
then
    # data older than the threshold can be deleted or archived here
    :
fi
Here we have used ':' as the field separator, then a substring to extract the exact date field, and finally sorting and taking the first line to find the earliest value.
Today's date and the earliest date are converted to epoch seconds and subtracted using bc, and the decimals are then removed.
Hope it helps .....
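For the 1.5-month retention itself, instead of deleting whole files you can also filter old records out of the merged file using the same 21st field. A sketch, assuming GNU date is available, the epoch timestamp in field 21 is in milliseconds, 45 days approximates 1.5 months, and file3/file3.trimmed are placeholder names:
cutoff=`date -d "45 days ago" +%s`;
awk -F "\|" -v cutoff="$cutoff" '{
    ts = $21                    # field 21 looks like Timestamp=1458558816354 (milliseconds)
    gsub(/[^0-9]/, "", ts)      # keep only the digits
    if (ts / 1000 >= cutoff) print
}' file3 > file3.trimmed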