How to set template in GT-521F32 fingerprint scanner? [closed] - serial-port

I'm working on a project using the GT-521F32 fingerprint scanner. I want to store the templates in a database on my computer rather than leave them in the fingerprint scanner's memory. While I am able to get templates from the scanner and store them on my computer, I am unable to write templates back to the scanner. I am currently using RealTerm to send commands to the fingerprint scanner.
First, I would get the template from ID 1
Send command (Get template from ID 1):
0x55 0xAA 0x01 0x00 0x01 0x00 0x00 0x00 0x70 0x00 0x71 0x01
Output:
55 AA 01 00 00 00 00 00 30 00 30 01 (Acknowledge)
5A A5 01 00 ...(498 bytes of template data)... FD DF (Data Packet)
I would then delete the template in ID 1 and set the template I got from ID 1 into ID 2.
Send Command (Set template to ID 2):
0x55 0xAA 0x01 0x00 0x02 0x00 0x00 0x00 0x71 0x00 0x73 0x01
Output:
55 AA 01 00 00 00 00 00 30 00 30 01 (Acknowledge)
Send Command (Data packet):
0x5A 0xA5 0x01 0x00 ...(former ID 1 template data)... 0xFD 0xDF
Sending this data packet does not produce an acknowledge packet.
Even when using the demo program provided, the SetTemplate option just fails with "Communication error". How do I write templates from my computer back to the fingerprint scanner's memory?
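For reference, a minimal Python sketch of the exchange I am attempting, assuming pyserial (the port name is a placeholder, and the checksum rule, the low 16 bits of the byte sum sent little-endian, is inferred from the dumps above):

```python
import struct
import serial  # pyserial

def command_packet(cmd: int, param: int, device_id: int = 1) -> bytes:
    # 0x55 0xAA header, 2-byte device ID, 4-byte parameter, 2-byte command,
    # all little-endian, then a 2-byte checksum = low 16 bits of the byte sum.
    body = struct.pack('<BBHIH', 0x55, 0xAA, device_id, param, cmd)
    return body + struct.pack('<H', sum(body) & 0xFFFF)

def data_packet(data: bytes, device_id: int = 1) -> bytes:
    # Data packets use the 0x5A 0xA5 header and the same checksum rule.
    body = struct.pack('<BBH', 0x5A, 0xA5, device_id) + data
    return body + struct.pack('<H', sum(body) & 0xFFFF)

with serial.Serial('COM3', 9600, timeout=2) as port:   # placeholder port name
    port.write(command_packet(0x70, 1))                # GetTemplate, ID 1
    ack = port.read(12)                                # acknowledge packet
    packet = port.read(498 + 6)                        # data packet w/ template
    template = packet[4:-2]                            # strip header + checksum

    port.write(command_packet(0x71, 2))                # SetTemplate, ID 2
    ack2 = port.read(12)                               # acknowledge packet
    port.write(data_packet(template))                  # send the template back
    result = port.read(12)                             # expect ack or nack
```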

I have experienced the same issue with the UART interface. Switching to the onboard USB interface (the J1 pads on the bottom of the PCB) eliminates the error, but you would then have to use the USB Mass Storage interface instead of UART, which is not ideal.

Related

Bluetooth LE, once read fails, it returns "Read Characteristic Fail" for subsequent requests, but able to receive notifications

Scenario 1 -
Reading characteristics from a BLE device: one request fails ("Read Timeout"), then all subsequent requests fail ("Read Characteristic Fail"). Notifications sent by the BLE device are still received, but reads and writes no longer work. The device is still connected.
12-28 13:25:00.620 25648 25666 D Device : resolve: read|<F9...SERVICE_UUID>|<06...CHARACTERISTIC_UUID> 03 E8 00 64 01 7D 50 00 01 F5 01 2D 00 5B 00 47 00 15 00 09
// FULL 20 BYTES RETURNED at 13:25:00, it was working normally then
12-28 13:25:26.230 25648 25666 D Device : reject: read|<F9...SERVICE_UUID>|<06...CHARACTERISTIC_UUID>
// REJECTED at 13:25:26, errors started from here
Scenario 2 -
Reading characteristics from a BLE device: one request returns about half the data (20 bytes expected, only 9 received), then all subsequent requests fail ("Read Characteristic Fail"). Notifications sent by the BLE device are still received, but reads and writes no longer work. The device is still connected.
Logs
12-28 13:25:00.620 25648 25666 D Device : resolve: read|<F9...SERVICE_UUID>|<06...CHARACTERISTIC_UUID> 03 E8 00 64 01 7D 50 00 01 F5 01 2D 00 5B 00 47 00 15 00 09
// FULL 20 BYTES RETURNED at 13:25:00, it was working normally then
12-28 13:25:26.230 25648 25666 D Device : resolve: read|<F9...SERVICE_UUID>|<06...CHARACTERISTIC_UUID> 0F 00 C5 00 00 00 10 00 00
// ONLY 9 BYTES RETURNED at 13:25:26, errors started from here
Can't BLE ignore one failed read and continue reading while it is still connected? Reading the same characteristic again also returns "Read Characteristic Fail".
Most probably the notifications and read requests are conflicting (unavoidable, because a notification can arrive at any time), and starting/stopping notifications multiple times causes the same issue.
Environment
Ionic 6.20.6
Angular 15.0.4
Capacitor 4.6.1
capacitor-community/bluetooth-le 2.0.1
Device
Xiaomi Redmi Note 10 Pro (android 12)
Also tested with other Android/iOS phones, with the same result.
What I have tried:
Reading the same characteristic again, but I get the same error.
Reading 1 second after a notification arrived (a gap after the notification is possible; a gap before it is not, because notifications can arrive at any time and in random numbers).
Implementing a queue so that only one request is active at a time (see the sketch below).
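For illustration, the queue idea sketched generically in Python with asyncio (the project itself is TypeScript/Capacitor, and read_characteristic here is a hypothetical stand-in for the plugin's read call):

```python
import asyncio

class GattQueue:
    """Serialize GATT operations so only one request is ever in flight."""

    def __init__(self) -> None:
        self._lock = asyncio.Lock()

    async def run(self, op):
        # Each operation waits for the previous one to resolve or reject.
        async with self._lock:
            return await op()

# Hypothetical usage, with read_characteristic standing in for the real call:
# queue = GattQueue()
# data = await queue.run(lambda: read_characteristic(SERVICE_UUID, CHAR_UUID))
```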

How to program XBee S2 (standalone)

I currently have an XBee S2 and I want to program the GPIO pins to switch an LED ON/OFF. I have seen a lot of tutorials that use an external MCU (like an Arduino), but in my case I want to use the XBee without any external MCU attached. It seems the XCTU software only allows the GPIO pins to be set HIGH, LOW, input, etc., without any logic that can change their state.
So is there any IDE or software that allows the XBee GPIO pins to be programmed?
Digi sells a "Programmable XBee" that includes a separate 8-bit processor you can compile C code to, but that probably isn't what you had in mind.
If you create a network with two XBee modules (A and B), you can have a computer wired to XBee A send a "Remote AT Request" (in API mode) to XBee B to change the I/O pin wirelessly. If you read through the documentation, you'll find everything you need to know about API mode and the Remote AT Request frame type.
Is that the sort of control you were looking for? If not, can you describe your use case in more detail?
Have you found your solution?
XCTU has settings for the DIO pins that include HIGH and LOW. What that means is that when the XBee comes out of reset, that will be the default state: if set HIGH, the DIO pin will default HIGH coming out of reset. That is my understanding, anyway, and it seems to be correct.
I am going through the exercise to have a non-programmable XBee sans micro controller to switch some circuits on and off remotely. The setup is as follows:
1) a control XBee which will be attached to a micro
2) a router XBee without a micro (standalone)
Fairly mundane. What I noticed in my setup was that the standalone XBee's associate pin would toggle for a bit then stop for a bit and repeat. This didn't seem right. When I sent a command from the controller XBee it was hit or miss as to whether the command was carried out by the standalone unit. A bit puzzling.
To check out what was going on with the standalone XBee, I hooked up a computer and fired up XCTU. The problem went away. To cut to the chase, I discovered that tying the computer's serial TX to the XBee's RX pin solved the issue!
My XBees are set up with a level-shifting IC, so the outside world is at 5 V and the XBee side is at 3.3 V. My best guess is that a floating RX pin connected to a level shifter was the culprit. Holding RX high at 3.3 V seems to overcome the problem.
Then it is just a matter of using the API to set the DIO pin HIGH or LOW.
In my case I am toggling DIO_1. Here is the API call from the controller XBee:
7E 00 10 17 01 00 7D 33 A2 00 41 50 EA D5 FF FE 02 44 31 05 69
The hex 44 31 is "D1" in ASCII, and the 05 parameter sets the pin HIGH.
And for low:
7E 00 10 17 01 00 7D 33 A2 00 41 50 EA D5 FF FE 02 44 31 04 6A
Again 44 31, but with the parameter set to 04 for LOW.
And the standalone XBee responds properly to the command. No micro attached!
The floating RX pin may or may not be your issue, but the API string should work using your config params. And don't forget the PAN ID must be the same.
Hope this helps.
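For anyone building these frames in code rather than by hand, here is a minimal Python sketch; the 0xFF-minus-sum checksum and the API mode 2 escaping (which is why 0x13 appears above as 7D 33) follow Digi's documented frame format:

```python
def xbee_frame(frame_data: bytes, escape: bool = True) -> bytes:
    # Digi API frame: 0x7E start delimiter, 2-byte big-endian length,
    # frame data, then checksum = 0xFF - (sum of frame data bytes & 0xFF).
    checksum = 0xFF - (sum(frame_data) & 0xFF)
    payload = len(frame_data).to_bytes(2, 'big') + frame_data + bytes([checksum])
    if escape:  # API mode 2 (AP=2): escape 7E/7D/11/13 as 7D, byte XOR 0x20
        escaped = bytearray()
        for b in payload:
            if b in (0x7E, 0x7D, 0x11, 0x13):
                escaped += bytes([0x7D, b ^ 0x20])
            else:
                escaped.append(b)
        payload = bytes(escaped)
    return b'\x7E' + payload

# Remote AT Request (0x17) setting D1 to 05 (digital output HIGH):
frame_data = bytes.fromhex('17 01 00 13 A2 00 41 50 EA D5 FF FE 02 44 31 05')
print(xbee_frame(frame_data).hex(' '))
# -> 7e 00 10 17 01 00 7d 33 a2 00 41 50 ea d5 ff fe 02 44 31 05 69
```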

Implementing OSDP encryption problems

I'm having trouble implementing the encryption part of the OSDP protocol on an Arduino.
http://www.siaonline.org/SiteAssets/SIAStore/Standards/OSDP_V2%201_5_2014.pdf
I've successfully done the negotiation part and have verified the RMAC-I response by decrypting the data and comparing it with the plaintext. The part I'm stuck on is the encryption of the data packets. According to the spec, I use the RMAC-I response as my ICV for AES-128 CBC, and I encrypt the packet using the S-MAC2 key.
My POLL packet (in hex) is as follows:
53 01 0e 00 0c 02 15 60
This gets padded
53 01 0e 00 0c 02 15 60 80 00 00 00 00 00 00 00
This gets XORed with the ICV, then encrypted with S-MAC2 as the key.
The first 4 bytes of the result are stored in the packet and sent:
53 01 0e 00 0c 02 15 60 91 86 b9 3d 4a 29
Unfortunately, the reader rejects the poll command with a NAK 06.
I'm presuming my MAC values have not been computed correctly as I've compared my packet with the HID DTK tool (obviously the MAC and CRC values are the only difference). Can someone validate my process?
It turned out my process was correct but was let down by the implementation (an off-by-one error).
2.1.7 is the current SIA spec; IEC 60839-11-5 (the IEC version of the standard) should be out soon.
The processing you describe is for the MAC suffix, not the payload encryption. S-MAC2 alone is used because your message is only one block long (otherwise you would run S-MAC1 over the leading blocks and S-MAC2 over the last). OSDP uses AES to encrypt a throw-away copy of the entire message and then uses some bytes of the last cipher block as the MAC that is transmitted. OSDP also encrypts the payload, if there is one. In modern AES implementations you pass in an IV, a key, and a buffer, so you would not think of it as XORing the IV with the plaintext yourself.
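As a sketch of the single-block MAC step described in the question and this answer, assuming pycryptodome (the key and ICV values are placeholders; a message longer than one block would first be run through CBC with S-MAC1 for the leading blocks):

```python
from Crypto.Cipher import AES  # pycryptodome

def osdp_mac_single_block(msg: bytes, s_mac2: bytes, icv: bytes) -> bytes:
    # Pad with 0x80 then zeros to one 16-byte block, AES-128-CBC encrypt
    # with the ICV as IV and S-MAC2 as key, and keep the first 4 bytes.
    assert len(msg) < 16, "single-block case only"
    padded = msg + b'\x80' + b'\x00' * (15 - len(msg))
    block = AES.new(s_mac2, AES.MODE_CBC, iv=icv).encrypt(padded)
    return block[:4]

# The POLL packet from above (s_mac2 and rmac_i are placeholder session values):
# mac = osdp_mac_single_block(bytes.fromhex('53010e000c021560'), s_mac2, rmac_i)
```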

Controlling projector with serial port

I've been trying to control a Panasonic PT-AE3000U through my computer's serial port for the last few days. The computer I am using has a serial port on the back, and I've confirmed that the port works at some level by connecting RXD and TXD with a jumper wire (a loopback test).
Technical information about projector (10Mb): http://www.projectorcentral.com/pdf/projector_manual_4506.pdf
Section that tells about serial communications starts from page 51.
According to the manual and what I've read online, the power-on command in hex should be 02 50 4F 4E 03, but I haven't been able to get it to work.
Variations of that command I've tried to send with RealTerm and Termite:
02 50 4f 4e 03
02504f4e03
0x02 0x50 0x4f 0x4e 0x03
0x020x500x4f0x4e0x03
And other similar combinations and formats.
I assume the command on the third row is the right one, because if I send it as numbers with RealTerm and connect RXD and TXD together as before, I get this on screen (display setting is "display as ascii"):
http://i.imgur.com/PjpRms7.png
But when I connect the cable to the projector and repeat the process, nothing happens. The baud rate is set to 9600, which should be correct.
So, I guess it somehow solved itself. I decided to restart my computer and the projector, just in case something was left over from all my failed attempts.
And it started to work: I sent 0x02 0x50 0x4F 0x4E 0x03 with RealTerm as numbers only.
My settings were:
Baud rate: 9600
Parity: None
Data bits: 8
Stop bits: 1
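For anyone scripting this rather than using a terminal program, the same command as a minimal pyserial sketch (the port name is a placeholder):

```python
import serial  # pyserial

# 9600 baud; pyserial defaults already give 8 data bits, no parity, 1 stop bit
with serial.Serial('COM1', 9600, timeout=2) as port:   # placeholder port name
    port.write(bytes([0x02, 0x50, 0x4F, 0x4E, 0x03]))  # STX "PON" ETX: power on
    print(port.read(16))  # read back whatever response the projector sends
```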

Xbee 64-bit address in API mode

I'm currently working on a project in which I use modules such as the XBee 2mW Wire Antenna - Series 2 (ZigBee Mesh).
How can I get the module's 64-bit address so my software can set it up automatically?
Can I send a ZigBee message to the module so that it returns a message containing its own address, which I can then decode to learn the address?
Thanks.
If you want an easy way of doing this, you can send one message from the Router/End-Device to the Coordinator in your ZigBee network. You can use the special 16-bit Network Address 0x0000 to address the Coordinator.
This message should contain the 16-bit Network Address (or the 64-bit address), so the Coordinator can later use it to communicate back with this node. That is how you do it in AT mode. If you work in API mode, the "Receive Packet" frame already contains the address of the sender, so you do not need to add it to your message explicitly.
When you press the commissioning button once, the module sends a node identification broadcast transmission.
Since I assume you are using API mode, from your Coordinator (software side) you can send a Remote AT Command Request, as a broadcast, that sets CB (commission button) to 1. This is the same as virtually pressing the commissioning button once. Here is the packet:
7E 00 10 17 00 00 00 00 00 00 00 FF FF FF FE 00 43 42 01 67
Then, when your devices receive this packet, they should each answer the coordinator with a Node Identification Indicator frame, which contains their 16-bit and 64-bit addresses. This way, you can set up your network automatically in software.
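A sketch of that flow in Python with pyserial (the port name is a placeholder; this assumes API mode 1, i.e. no escaping, and the standard layout of the 0x95 Node Identification Indicator frame, whose first nine frame-data bytes are the frame type and the sender's 64-bit address):

```python
import serial  # pyserial

# Broadcast the Remote AT Request above (CB=1, a virtual commissioning-button
# press), then collect the Node Identification Indicator (0x95) answers.
with serial.Serial('/dev/ttyUSB0', 9600, timeout=5) as port:  # placeholder
    port.write(bytes.fromhex('7E 00 10 17 00 00 00 00 00 00 00 FF FF FF FE 00 43 42 01 67'))
    while port.read(1) == b'\x7E':                  # next start delimiter
        length = int.from_bytes(port.read(2), 'big')
        frame = port.read(length + 1)               # frame data + checksum
        if frame and frame[0] == 0x95:              # Node Identification Indicator
            print('found node:', frame[1:9].hex())  # sender's 64-bit address
```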
