Is there any memory limit for NFC card emulation? - arduino

I want to send information from an Arduino to a phone via NFC.
To do this I have a PN532 module. The way I want to send information is to use the module to emulate an NFC tag and read the message from the phone. The reason I don’t want to use a real NFC card is due to the memory limitations. Most of them have near 800 bytes of memory and the ones with more memory are expensive. In case I emulate a card with PN532 module, will I still have some memory limitation?
I found this in the documentation:
PN532-HCE
What I saw that seemed important was the APDU byte limitation. I'm not really an expert in NFC, and I don't know whether this affects the memory of an emulated card.
The information I want to transfer is JSON in plain text. I think that is supported in NDEF messages, so iPhones would be able to read it. The JSON could be up to 2500 characters (bytes) and would change many times each day, so having to rewrite a physical card would be a problem as well.

My understanding is that ISO 14443-4 is a transmission protocol (https://webstore.iec.ch/preview/info_isoiec14443-4%7Bed4.0%7Den.pdf) and therefore only limits how much you can send/receive in one command. It does not stop you from using multiple commands to send and receive, so you can emulate more memory.
So what should really happen is that a device issues ISO 7816-4 commands to the emulated card over ISO 14443-4.
When reading, a device should obey the maximum transceive length it has said it supports (in your case 256 bytes for a short APDU command), and thus it should read multiple 256-byte chunks to read the whole file (memory).
See the ISO 7816-4 READ BINARY command (https://cardwerk.com/smart-card-standard-iso7816-4-section-6-basic-interindustry-commands/#chap6_1); it has offset and length parameters.
So for larger data, your HCE response code on the Arduino would be passed a "read binary of bytes 0 to 255" command from the PN532, to which you would respond with the first 256 bytes of the JSON data.
Then a second "read binary of bytes 256 to 511" command would be issued by the device, and so on until all the data you want to return has been returned.
It is therefore reading the emulated file (memory) in chunks of the maximum size that can be transmitted in a short APDU (256 bytes) supported by the device.
Note I've not done any coding with this; I just have knowledge of the standards.
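To make the chunked reading concrete, here is a rough, untested sketch of the APDU handling the Arduino-side emulation code would do once the PN532 hands it a command from the reader. The buffer name, the helper handleApdu and the simplified file layout are illustration only; a full Type 4 Tag emulation also has to answer the SELECT commands for the NDEF application and the capability container before any READ BINARY arrives.

```cpp
#include <stdint.h>
#include <stddef.h>
#include <string.h>

// The "file" we emulate: the JSON payload (could be ~2500 bytes).
static const char jsonData[] = "{\"temperature\":21.5,\"humidity\":48}";

// Answers an ISO 7816-4 READ BINARY APDU: CLA INS=0xB0 P1P2=offset Le.
// Copies the requested chunk plus a status word into resp, returns its length.
size_t handleApdu(const uint8_t *apdu, size_t apduLen, uint8_t *resp) {
  if (apduLen >= 5 && apdu[1] == 0xB0) {                  // INS = READ BINARY
    uint16_t offset = ((uint16_t)apdu[2] << 8) | apdu[3]; // P1 P2 = file offset
    uint16_t le = apdu[4] ? apdu[4] : 256;                // Le = 0x00 means 256 bytes
    uint16_t fileLen = sizeof(jsonData) - 1;
    if (offset >= fileLen) {
      resp[0] = 0x6B; resp[1] = 0x00;                     // SW: wrong offset (P1-P2)
      return 2;
    }
    uint16_t n = fileLen - offset;
    if (n > le) n = le;                                   // clamp to requested length
    memcpy(resp, jsonData + offset, n);
    resp[n] = 0x90; resp[n + 1] = 0x00;                   // SW: success
    return n + 2;
  }
  resp[0] = 0x6D; resp[1] = 0x00;                         // SW: instruction not supported
  return 2;
}
```

The reader then walks through the file by repeating READ BINARY with an increasing offset, exactly as described above.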
Note you can get cards with up to 32K of storage; yes, they cost more, but a 4-Kbyte DESFire card is only about 150% of the price of an NTAG216 with 888 bytes of memory.

Related

Can I send sensor data (a couple of bytes) over a beacon?

I want to send some sensor data over BLE to multiple nodes.
I thought of changing advertising data at 4Hz. Can it be done?
Yes! It is a common approach to use a BLE beacon packet to advertise sensor data. A few points:
Embedded BLE platforms typically allow advertising at 10 Hz or faster, and let you change the advertisement content between transmissions. I have done this on the Nordic nRF52x chips; hopefully STM32 supports it as well.
BLE 4.0 advertising packets are limited to 23 usable data bytes, but you typically need to reserve a few to indicate it is "your" transmission. There are significant further restrictions if you intend to use iOS devices to scan the transmissions. If using Android, Linux, or other embedded system scanners you can use nearly the full 23 bytes.
Keep in mind that anybody in radio range can scan for these advertisements and read the data. Make sure the sensor data are not sensitive enough to warrant a security layer.
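To make the byte budget concrete, below is a small sketch that packs two sensor readings into a manufacturer-specific AD structure of a legacy advertising payload. The company ID, the two-byte marker and the field layout are placeholders of my own; the call that actually hands the buffer to the radio is platform-specific (SoftDevice on Nordic, Cube/HAL on STM32) and is not shown.

```cpp
#include <stdint.h>
#include <stddef.h>
#include <string.h>

// Builds a legacy (BLE 4.x) advertising payload: flags + manufacturer-specific data.
// Returns the total payload length (must not exceed 31 bytes).
size_t buildAdvPayload(uint8_t *adv, int16_t temperature, uint16_t batteryMv) {
  size_t i = 0;
  // AD structure 1: Flags (general discoverable, BR/EDR not supported)
  adv[i++] = 0x02; adv[i++] = 0x01; adv[i++] = 0x06;
  // AD structure 2: Manufacturer Specific Data
  const uint8_t mfgLen = 2 /*company ID*/ + 2 /*marker*/ + 4 /*sensor fields*/;
  adv[i++] = mfgLen + 1;            // length of this AD structure (type byte + payload)
  adv[i++] = 0xFF;                  // AD type: manufacturer specific data
  adv[i++] = 0xFF; adv[i++] = 0xFF; // company ID 0xFFFF (placeholder/test value)
  adv[i++] = 'S'; adv[i++] = 'D';   // marker so the scanner knows it is "our" packet
  memcpy(&adv[i], &temperature, 2); i += 2;   // little-endian sensor fields
  memcpy(&adv[i], &batteryMv, 2);   i += 2;
  return i;                         // 13 bytes here, well under the 31-byte limit
}
```

To update at 4 Hz you would simply rebuild this buffer and re-submit it to the stack every 250 ms.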

How does a kernel driver talk with another device?

I have an FPGA board with Unix-based firmware. I need to write a program, running on this firmware, that will send commands to some devices via the I2C bus and receive the responses. For this I use a special character file in Unix that I map into my program's memory; I write special commands to it and read the responses from it. Each area in this mapped memory corresponds to a specific register of the FPGA, as defined in the Unix-based firmware (as I understand it).
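For reference, the access pattern being described (mapping a character device and poking registers at fixed offsets) usually looks something like the sketch below; the device path and register offsets are made up for illustration, and whether a write actually ends up on the I2C bus, and whether responses are buffered, is exactly the driver-dependent part being asked about.

```cpp
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>
#include <cstdint>
#include <cstdio>

int main() {
  // Hypothetical character device exported by the FPGA firmware.
  int fd = open("/dev/fpga_regs", O_RDWR | O_SYNC);
  if (fd < 0) { perror("open"); return 1; }

  // Map one page of the device's register space into this process.
  volatile uint32_t *regs = static_cast<volatile uint32_t *>(
      mmap(nullptr, 4096, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0));
  if (regs == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

  regs[0x10 / 4] = 0xA5;             // write a "command" register (offset 0x10, made up)
  uint32_t status = regs[0x14 / 4];  // read back a "status" register (offset 0x14, made up)
  printf("status = 0x%08x\n", status);

  munmap((void *)regs, 4096);
  close(fd);
  return 0;
}
```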
So, my question is the following. As I understand it, when I write a command to that mapped memory region of the special character file, the kernel calls a certain driver to handle the bytes I've written and sends them out over the I2C bus (for example). Am I right? If so, is there any guarantee that the response from the device will be buffered and that I will be able to read it from the mapped region at any time? Or does that depend on the specific driver implementation?
I'm sorry if the question is unclear in some way; I am a newbie at this stuff.

Multi-drop bus to rs232 Convert

I have a project using MDB (multi-drop bus) for a vending machine (VDM).
The VDM has an MDB-RS232 interface.
I'm not sure whether it converts 9 bit to 8 bit (MDB-UART).
How do I read data from the VDM on my computer?
Thanks all.
MDB (multi-drop bus) is 9 bit, because after the standard 8 data bits (like in standard RS232 UART communication) there is a 9th bit called "mode".
(Wikipedia on MDB: "the mode bit differentiates between ADDRESS and DATA bytes.")
But you can read such data even with regular 8-bit RS232 interfaces, e.g. a plain standard USB-to-RS232 device for PC.
Here is how:
Use 9600 baud, 8 data bits, 1 stop bit, but RS232 parity setting "Space". Make sure you receive the original character value even in case of a Parity Error indication. Any MDB address byte from your VDM will be received with a Parity Error (but still be displayed correctly). Any data byte will be displayed without error.
For sending MDB ADDRESS and DATA bytes using a standard 8-bit RS232 port, you could apply temporary parity changes: Change the parity setting to "Mark" before sending an address byte, then change back to "Space" before sending data bytes.
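On Linux this parity trick can be set up with termios "stick" parity (the CMSPAR flag); here is a rough sketch, assuming a Linux host and a USB-to-RS232 adapter whose driver actually supports CMSPAR.

```cpp
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

// Configure 9600 baud, 8 data bits, 1 stop bit with "stick" parity (Linux CMSPAR):
//   markParity = true  -> parity bit always 1 ("Mark", for MDB address bytes)
//   markParity = false -> parity bit always 0 ("Space", for MDB data bytes)
static bool setStickParity(int fd, bool markParity) {
  termios tio{};
  if (tcgetattr(fd, &tio) != 0) return false;
  cfmakeraw(&tio);                          // raw mode: no echo, no line editing
  cfsetispeed(&tio, B9600);
  cfsetospeed(&tio, B9600);
  tio.c_cflag |= CS8 | CLOCAL | CREAD;
  tio.c_cflag |= PARENB | CMSPAR;           // "stick" parity instead of even/odd
  if (markParity) tio.c_cflag |= PARODD;    // PARODD set   -> Mark
  else            tio.c_cflag &= ~PARODD;   // PARODD clear -> Space
  tio.c_iflag |= INPCK | PARMRK;            // flag parity errors in the input stream:
  tio.c_iflag &= ~IGNPAR;                   // an address byte arrives as 0xFF 0x00 <byte>
  return tcsetattr(fd, TCSADRAIN, &tio) == 0;
}

// Usage sketch:
//   int fd = open("/dev/ttyUSB0", O_RDWR | O_NOCTTY);
//   setStickParity(fd, true);   // Mark parity: send the MDB address byte
//   setStickParity(fd, false);  // Space parity: send the data bytes
```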
On Windows, you can do such tricks with our Docklight software (see Docklight and MDB). It's free for basic testing and there is also a related 9-bit example project.
On Linux / Raspberry Pi other users have successfully implemented the parity trick, too, see this stackexchange post about a MDB + Pi.
But also with RealTerm, Teraterm, Termite, Bray, YAT or any other RS232 application you should be able to read the data, as long as it handles "Space" or "Mark" parity settings correctly.
You'll need an adapter that does all the conversion on the fly and in real time. If you want to emulate the VMC (master), you'll need an MDB-UART master adapter. If you want to emulate an MDB peripheral device (such as a coin changer, bill validator, etc.), you'll need this. For two-way "sniffing" of the MDB bus you'll need a combination of these devices.
A direct connection from the PC's RS-232 to MDB will not work due to the strict MDB timing (the delay between a VMC command and the peripheral's response must not exceed 5 ms, and delays between POLL requests are generally 50-300 ms). By that I mean functioning reliably enough for commercial purposes.

Can't get OK response from XBee upon "+++"

I have been trying to set up two XBees to communicate for the last three days. X-CTU seems to be the perfect option for doing so; however, it is a real menace when it comes to discovering XBees on serial ports.
I was able to detect one XBee by luck just once, and the other one never showed up. I have even replaced both my XBees. I am trying to figure out an alternative, i.e. using a serial console to perform the operation. I haven't been able to receive an OK response from the device upon issuing +++.
Since I haven't had a good experience using a PC to communicate with ESP8266 devices earlier, I tried to figure out a workaround by using the second Serial port of an Arduino to send such configuration messages and read the response by printing it out on the default serial console.
It also appears that configuration messages can differ depending on the mode of the device. If it's in API mode, the frame has to be generated in a specific format (I use the X-CTU frame generator for this purpose).
Why am I not able to receive a response from the XBee upon issuing a +++?
The devices are Series 1 XBees and the exact part number is XB24-AWI-001. Any help is highly appreciated.
Have you considered that the XBee may be in API mode? Maybe you should consider reflashing the device in AT mode to start playing with it.
To test if it's in API mode, you can refer to the guide, chapter 9 for the API mode structure:
http://eewiki.net/download/attachments/24313921/XBee_ZB_User_Guide.pdf?version=1&modificationDate=1380318639117&api=v2
Basically, a datagram in API mode starts with ~, and it's built as follows:
[0x7E|length(2B)|Command(1B)|Payload(length-1B)|Checksum(1B)]
As 0x7E is ~ on the ASCII table, you should try typing a bogus datagram in a serial terminal session like:
~ <C-d> AAAA
N.B.: <C-d> means Control-D under Unix, which is the EOF character.
Obviously such a message isn't going to be valid, and you will receive a reply asking you to send that datagram again. Because the EOF character is ASCII code 4, the length of the datagram will be read as 4 bytes. You then send four bogus bytes; the checksum will be 'A', which is very unlikely to be correct, so the receiver will assume the transmission has been corrupted. The datagram will therefore be requested again, meaning you will receive a datagram making that query.
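For comparison, a well-formed API frame uses the same layout with a valid checksum, which is 0xFF minus the low byte of the sum of the frame-data bytes. Here is a small sketch (assuming API mode 1, i.e. no byte escaping) that builds an AT command request, in this case ATVR (firmware version):

```cpp
#include <stdint.h>
#include <stddef.h>

// Builds an XBee API frame around the given frame data and returns the total size.
// Frame layout: 0x7E | length MSB | length LSB | frame data | checksum
size_t buildApiFrame(const uint8_t *frameData, uint16_t len, uint8_t *out) {
  out[0] = 0x7E;
  out[1] = len >> 8;
  out[2] = len & 0xFF;
  uint8_t sum = 0;
  for (uint16_t i = 0; i < len; i++) {
    out[3 + i] = frameData[i];
    sum += frameData[i];
  }
  out[3 + len] = 0xFF - sum;        // checksum: 0xFF minus low byte of the sum
  return len + 4;
}

// Example: AT command request (frame type 0x08), frame ID 0x01, command "VR".
// uint8_t data[] = {0x08, 0x01, 'V', 'R'};
// uint8_t frame[8];
// size_t n = buildApiFrame(data, sizeof(data), frame);
```

If the module is in API mode it should answer with an AT command response frame (type 0x88); in transparent AT mode the bytes would just be treated as data to transmit over the air.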
Though I can only advise you to consider running it only in API mode (it is more reliable and a better API, but you cannot play around with it and understand what's going on by tapping the line with a logic analyzer… though given enough time, you'll start to read API datagrams like it's English ☺).
I wrote a page with a few resources to check on how to reflash the XBees:
https://github.com/hackable-devices/polluxnzcity/wiki/Flash-zigbee
and here is some other advice from another, totally unrelated project:
https://github.com/andrewrapp/xbee-api#documentation
And I also wrote a lib (aimed at beaglebones but you can tweak it for your use) that handles API mode 2 with XBees:
https://github.com/hackable-devices/polluxnzcity/blob/master/PolluxGateway/include/xbee/xbee_communicator.h
https://github.com/guyzmo/polluxnzcity/blob/master/PolluxGateway/src/xbee/xbee_communicator.C
but I bet that with a little Google search you can find more widely used libraries than those, and even some aimed at running on Arduinos (N.B.: that lib was originally written for Arduino and then adapted to run on the BeagleBone, so reversing the operation shouldn't be hard).

Can an iBeacon have a data payload

I know that the definition of an iBeacon is a fixed specification of the advertising packet that it is transmitting:
9 bytes iBeacon prefix
16 byte UUID
2 bytes Major
2 bytes Minor
1 byte TX power
That being said, is there anything that would prevent a beacon from both sending out advertising iBeacon packets to wake up a phone's app and also transmit actual data content as part of a BLE packet? Would there be a lot of handshaking required in order to send / transmit additional data?
Is there some other way for a beacon to transmit data? One of my large concerns is spoofing of my beacons to falsify the data I am attempting to collect. I was hoping that being able to transmit some data along with an iBeacon packet would allow me to limit the spoofing.
Is something like that even feasible?
A few possibilities:
You can tack on one extra data byte to the end of the iBeacon transmission before it reaches its max advertisement length. This byte cannot be read by iOS devices, though, because Apple blocks reading raw data of iBeacon adverts. It would work on Android/Mac/Linux.
You can interleave a second advertisement with mostly data fields and line the two up with a common identifier like the minor. The more bytes you allocate to lining up the advertisements, the fewer you have left for data. You can't use the MAC address to line them up, because it is unreadable on iOS for the iBeacon transmission.
You can make the beacon connectable via GATT, and read data fields with GATT attributes. The beacon will stop advertising, though, when connected. This limits throughput and reliability.
All of these options require you to build a custom BLE beacon that does multiple advertisements. It is not a trivial undertaking.
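As a concrete picture of the first option: a standard iBeacon advertisement uses 30 of the 31 bytes available in a legacy advertising payload, which is what leaves exactly one spare byte. Below is a sketch of that layout; the UUID, the values and the way the spare byte is framed are placeholders of my own, not part of the iBeacon specification.

```cpp
#include <stdint.h>
#include <stddef.h>
#include <string.h>

// Builds a 31-byte legacy advertising payload: the 30-byte iBeacon structure
// plus one trailing byte of custom data (readable on Android/Mac/Linux, not iOS).
size_t buildIBeaconAdv(uint8_t *adv, const uint8_t uuid[16],
                       uint16_t major, uint16_t minor,
                       int8_t txPower, uint8_t extraByte) {
  static const uint8_t prefix[9] = {
      0x02, 0x01, 0x06,             // flags AD structure
      0x1A, 0xFF,                   // manufacturer-specific AD, 26 bytes follow
      0x4C, 0x00,                   // Apple company ID (little-endian)
      0x02, 0x15                    // iBeacon type and remaining length (21 bytes)
  };
  memcpy(adv, prefix, 9);
  memcpy(adv + 9, uuid, 16);
  adv[25] = major >> 8; adv[26] = major & 0xFF;   // Major, big-endian
  adv[27] = minor >> 8; adv[28] = minor & 0xFF;   // Minor, big-endian
  adv[29] = (uint8_t)txPower;                     // calibrated TX power at 1 m
  adv[30] = extraByte;                            // the single leftover byte
  return 31;                                      // full 31-byte payload
}
```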
