I'm new to Media Foundation and C++.
But I want to create a virtual video capture device which can be used by Microsoft Expression Encoder.
Can you tell me in which direction to look?
I think it should work asynchronously, and the source will be a byte stream from a mobile device.
Thanks in advance.
I don't think you want to look into Media Foundation for this. Expression Encoder captures video through a richer API, DirectShow. What you want is a virtual DirectShow camera, which has been discussed multiple times and has a simple sample project to start from (see the registration sketch after the quote below):
Virtual webcam input as byte stream
Simulate a DirectShow Webcam
How to use directshow filter as a live input for Expression Encoder 4?
Supported USB capture devices in Expression Encoder 4:
Any device that provides a DShow filter is supported by EE4.
There is currently no list available of supported devices,
though most USB devices have little to no issues with Encoder.
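For orientation, the heart of making a filter show up as a "camera" is registering it in the DirectShow video input device category. Below is a minimal sketch of just that registration step, assuming a baseclasses (CSource) filter project; the GUID and name are placeholders for your own, and the linked sample project shows how this wires into a complete filter:

```cpp
// Minimal registration sketch -- placeholders: CLSID_VirtualCam, g_wszVirtualCam.
#include <initguid.h>
#include <dshow.h>

// Generate your own GUID for the filter.
DEFINE_GUID(CLSID_VirtualCam,
    0x8e14549a, 0xdb61, 0x4309, 0xaf, 0xa1, 0x35, 0x78, 0xe9, 0x27, 0xe9, 0x35);

const WCHAR g_wszVirtualCam[] = L"My Virtual Cam";

// Called from DllRegisterServer: put the filter into the
// "Video Capture Sources" category so apps like EE4 enumerate it.
HRESULT RegisterAsCaptureDevice()
{
    IFilterMapper2 *pFM2 = NULL;
    HRESULT hr = CoCreateInstance(CLSID_FilterMapper2, NULL,
                                  CLSCTX_INPROC_SERVER,
                                  IID_IFilterMapper2, (void**)&pFM2);
    if (FAILED(hr))
        return hr;

    REGFILTER2 rf2;
    rf2.dwVersion = 1;
    rf2.dwMerit   = MERIT_DO_NOT_USE;  // only found via the category enumerator
    rf2.cPins     = 0;
    rf2.rgPins    = NULL;

    hr = pFM2->RegisterFilter(CLSID_VirtualCam, g_wszVirtualCam,
                              NULL,                            // device moniker
                              &CLSID_VideoInputDeviceCategory, // capture category
                              g_wszVirtualCam, &rf2);
    pFM2->Release();
    return hr;
}
```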
I bought one of these:
https://www.aliexpress.com/item/Smart-finder-Key-finder-Wireless-Bluetooth-Tracker-Anti-lost-alarm-Smart-Tag-Child-Bag-Pet-GPS/32806261079.html
As far as I can tell it is a BLE (Bluetooth Low Energy) location tag.
I downloaded the app for it onto my iPhone, and the app instantly recognised it and connected to it. The iPhone app seems to know how far away the tag is - it has a little map of the local area and says how many feet away the tag is. I was able to set the device name via the app, but I'm not sure if that set it locally or on the tag itself. The iPhone app also has a "find" button - when you press it, the tag beeps.
So I want to know how I can program this thing myself. I want to be able to identify it when it is nearby, connect to it and make it beep. I've searched for quite a while but not come up with much.
I'm assuming (rightly or wrongly?) that there is some general standard or approach for talking to these BLE location devices and carrying out the basic functions with them - but what is that standard, and where is the documentation?
Does anyone have any idea how to program these BLE location tag devices?
BLE devices typically communicate using GATT, with either standard GATT services or custom ones. The command to make the tag beep is probably implemented as a custom GATT service.
For estimating the distance to the beacon, the RSSI is typically used. This is a measure of the received signal power, and it needs to be compared to the output power at the emitter. Beacons will usually put their output power in the advertisement data, so it can be used without connecting to them. Here, since the app is also able to send commands to the beacon, chances are it keeps a connection open and uses a custom GATT protocol to retrieve the output power.
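For reference, the usual way to turn an RSSI reading plus the advertised output power into a distance estimate is the log-distance path-loss model. A minimal sketch; the attenuation exponent n is an assumption you would tune per environment, and the result is rough at best since RSSI fluctuates heavily:

```cpp
#include <cmath>

// Rough distance estimate from the log-distance path-loss model.
// txPower: expected RSSI at 1 m (often carried in the advertisement).
// n: environmental attenuation exponent, ~2 (free space) to ~4 (indoors).
double estimateDistanceMeters(int rssi, int txPower, double n = 2.0)
{
    return std::pow(10.0, (txPower - rssi) / (10.0 * n));
}
```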
Here is what I would suggest:
Read up on BLE, especially advertising and GATT. For instance, read this for advertising and this for GATT. The full BLE spec is available here, but it should be used as a reference, not an introduction.
Sniff the communication between your device and your phone. You can see this other answer of mine to get started.
Replicate the communication protocol in your own app. For that you'll need your target platform's BLE libraries; for instance, on iOS that is CoreBluetooth.
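To make the last step concrete, here is a rough C++/WinRT sketch (Windows rather than iOS, but the GATT flow is the same) that connects to a tag by address and writes one byte to a custom characteristic. The service/characteristic UUIDs, the 0x01 payload, and the device address are pure placeholders - the real values are what you recover by sniffing:

```cpp
#include <winrt/Windows.Foundation.h>
#include <winrt/Windows.Foundation.Collections.h>
#include <winrt/Windows.Devices.Bluetooth.h>
#include <winrt/Windows.Devices.Bluetooth.GenericAttributeProfile.h>
#include <winrt/Windows.Storage.Streams.h>
#pragma comment(lib, "windowsapp")

using namespace winrt;
using namespace Windows::Devices::Bluetooth;
using namespace Windows::Devices::Bluetooth::GenericAttributeProfile;
using namespace Windows::Storage::Streams;

void beep(uint64_t bluetoothAddress)
{
    // Hypothetical UUIDs for the tag's custom "beep" service/characteristic.
    guid serviceUuid{ 0x0000ffe0, 0x0000, 0x1000,
                      { 0x80, 0x00, 0x00, 0x80, 0x5f, 0x9b, 0x34, 0xfb } };
    guid charUuid{ 0x0000ffe1, 0x0000, 0x1000,
                   { 0x80, 0x00, 0x00, 0x80, 0x5f, 0x9b, 0x34, 0xfb } };

    // Connect and walk down to the characteristic (blocking for brevity).
    auto device = BluetoothLEDevice::FromBluetoothAddressAsync(bluetoothAddress).get();
    auto services = device.GetGattServicesForUuidAsync(serviceUuid).get().Services();
    if (services.Size() == 0)
        return;
    auto chars = services.GetAt(0)
                     .GetCharacteristicsForUuidAsync(charUuid).get()
                     .Characteristics();
    if (chars.Size() == 0)
        return;

    DataWriter writer;
    writer.WriteByte(0x01);  // whatever "beep" payload the sniff revealed
    chars.GetAt(0).WriteValueAsync(writer.DetachBuffer()).get();
}

int main()
{
    init_apartment();
    beep(0x112233445566);  // placeholder 48-bit device address
}
```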
I wonder why my SparkFun 13.56 MHz eval board (with an SM130) loses its green 'search' LED when reading my cell phone's NFC, and it won't come back until reset.
The 'found' LED lights for a brief moment when one of my MIFARE cards is read. I thought NFC could be used with it.
I read something about NFC using NDEF formatting, but I can't get a grip on it :-)
I have tried a Nokia Lumia 1520, Sony Xperia Z2, iPhone 5, and Samsung Galaxy S2, S3, S4 & S5, all with the same result.
Can I use it with NFC, and if so: How?
That depends on what you want to achieve. The SM130 is a MIFARE reader that supports only cards/tags with MIFARE Classic and MIFARE Ultralight protocols. Most Android NFC devices cannot emulate such tags.
Some Android NFC devices have an embedded secure element (or can be used with a UICC-based secure element) that is capable of emulating MIFARE Classic. However, it is impossible to modify the data on those secure elements unless you're the logical owner of them (for an embedded SE, that's typically the device manufacturer; for a UICC, that's typically the mobile network operator).
Many new Android NFC devices support host-based card emulation (HCE). However, with this card emulation functionality you can only emulate smartcard applications on top of ISO/IEC 7816-4 + ISO/IEC 14443-4, so you cannot emulate a card/tag that can be read with the SM130 (as both MIFARE Classic and MIFARE Ultralight protocols operate on lower protocol layers only).
With the SM130 "SELECT TAG" and "SEEK FOR TAG" commands you should be able to detect the presence of an Android NFC device (as well as a Nokia Lumia or an iPhone 6). However, for Android devices you will typically receive a random serial number on every selection (that's a requirement for peer-to-peer mode) -- thus no useful information to identify a device. With an iPhone 6 (with Apple Pay activated) you should receive a serial number that can be used to identify the device.
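To illustrate, a hedged Arduino-side sketch that polls the SM130 with SEEK FOR TAG and dumps the raw response; hold a phone to the reader and you should see the serial number change on every poll. The framing bytes and the 0x82 command code are taken from the SM130 datasheet as I remember it, so verify against your firmware revision:

```cpp
#include <SoftwareSerial.h>

SoftwareSerial rfid(7, 8);  // RX, TX pins -- adjust to your wiring

// SM130 UART frame: 0xFF 0x00 LEN CMD [DATA...] CSUM,
// where CSUM is the low byte of the sum of LEN through DATA.
void sendCommand(uint8_t cmd)
{
    uint8_t len = 1;                   // command byte only, no payload
    rfid.write((uint8_t)0xFF);         // header
    rfid.write((uint8_t)0x00);         // reserved
    rfid.write(len);
    rfid.write(cmd);
    rfid.write((uint8_t)(len + cmd));  // checksum
}

void setup()
{
    Serial.begin(9600);
    rfid.begin(19200);                 // SM130 default UART baud
}

void loop()
{
    sendCommand(0x82);                 // SEEK_FOR_TAG
    delay(250);
    while (rfid.available())
    {
        Serial.print(rfid.read(), HEX);  // raw response incl. tag type + UID
        Serial.print(' ');
    }
    Serial.println();
    delay(750);  // a phone in range shows a different (random) UID each poll
}
```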
Yes. That is to say, the hardware is technically capable: it operates at the same 13.56 MHz frequency and can read compatible tag formats.
There is a lot to it but I'm sure it will be a very rewarding journey.
GitHub link to an NDEF library:
https://github.com/bjepson/Arduino_NDEF_Reader
On that page you will also find a link to the layout and code on which he based his project. It uses the SM130, so you should be able to use this to get up and running.
Happy coding!
I want to use a webcam for image capture and interface that webcam with an AVR ATmega16. Since the images are big, where should I store the data? And secondly, since the webcam delivers images in a file format, how can I decode that format and store the data on a particular storage medium?
Thank you in advance.
I got the answer: I can use an SD card for my solution. I found a blog on the internet that showed how to interface an Arduino with a webcam.
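For anyone following the same route, here is a hedged sketch of the storage side on an Arduino: bytes arriving from a serial JPEG camera on Serial1 are appended to a file on the SD card. The camera-side protocol (requesting a frame, reading its length) depends on your module and is omitted; the pin and baud values are assumptions for your wiring:

```cpp
#include <SPI.h>
#include <SD.h>

const int SD_CS_PIN = 10;  // chip-select of the SD shield -- adjust to wiring

void setup()
{
    Serial1.begin(38400);            // camera UART (needs a board with Serial1)
    if (!SD.begin(SD_CS_PIN))
        return;                      // no card -- nothing to do

    File img = SD.open("IMG0001.JPG", FILE_WRITE);
    if (!img)
        return;

    // Copy bytes until the camera goes quiet for a second.
    unsigned long lastByteMs = millis();
    while (millis() - lastByteMs < 1000)
    {
        while (Serial1.available())
        {
            img.write((uint8_t)Serial1.read());
            lastByteMs = millis();
        }
    }
    img.close();
}

void loop() {}
```

Note that if the camera delivers JPEG, you can store the bytes verbatim and never decode them on the microcontroller; decoding is only needed if you want pixel access, which the ATmega16's 1 KB of RAM makes impractical anyway.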
Please excuse my ignorance, I'm very new to Windows.
Windows 7 64bit
Point Grey Grasshopper 2 GigE Camera
I have a GigE camera that I want to use in Processing. In Processing I can use any camera that shows up as a QuickTime device, but the GigE camera does not show up in the camera list.
The camera is registered as a DirectShow device.
Is there a way to either get the camera to be available as a QuickTime device, or to use the DirectShow device in Processing?
The [effectively] primary video capture API in Windows is DirectShow. If the camera is available as a DirectShow device, then you can certainly get video from it and process it. As you did not ask specific questions about the processing itself, perhaps these are a good starting point for you:
Windows SDK Samples in $(WindowsSdk)\Samples\multimedia\directshow
DirectShow Samples on MSDN
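As a quick sanity check that the Grasshopper really is visible through DirectShow, a standard enumeration of the video input device category will print its friendly name:

```cpp
// List all DirectShow video capture devices by friendly name.
#include <dshow.h>
#include <cstdio>
#pragma comment(lib, "strmiids")
#pragma comment(lib, "ole32")
#pragma comment(lib, "oleaut32")

int main()
{
    CoInitialize(NULL);

    ICreateDevEnum *pDevEnum = NULL;
    CoCreateInstance(CLSID_SystemDeviceEnum, NULL, CLSCTX_INPROC_SERVER,
                     IID_ICreateDevEnum, (void**)&pDevEnum);

    IEnumMoniker *pEnum = NULL;
    if (pDevEnum->CreateClassEnumerator(CLSID_VideoInputDeviceCategory,
                                        &pEnum, 0) == S_OK)
    {
        IMoniker *pMoniker = NULL;
        while (pEnum->Next(1, &pMoniker, NULL) == S_OK)
        {
            IPropertyBag *pBag = NULL;
            if (SUCCEEDED(pMoniker->BindToStorage(NULL, NULL,
                                                  IID_IPropertyBag,
                                                  (void**)&pBag)))
            {
                VARIANT name;
                VariantInit(&name);
                if (SUCCEEDED(pBag->Read(L"FriendlyName", &name, NULL)))
                    wprintf(L"%s\n", name.bstrVal);  // device name as apps see it
                VariantClear(&name);
                pBag->Release();
            }
            pMoniker->Release();
        }
        pEnum->Release();
    }
    pDevEnum->Release();
    CoUninitialize();
    return 0;
}
```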
We want to route the call to an IP network instead of the GSM network. For that purpose, we first connect our mobile phone to a PC through Bluetooth. We then want to transfer the real-time audio data to that PC over Bluetooth. Could you please help?
If you can trick the mobile phone into functioning as a Bluetooth headset [profile] -- which I doubt any phone would support, because phones are, well, the consumers of such headset devices -- then you could use it like any other "headset" device. (It would be much more feasible to just purchase a headset or plug in a microphone.)
As for "where is the real-time audio data stored?" -- it just isn't, anywhere accessible. The data (perhaps already in an encoded/delta form) is briefly (milliseconds!) held in a few small buffers / integrated circuits in the radio circuitry. This circuitry varies with the phone/radio module used and is not accessible from a PC.
Happy doing productive things.