Raspberry Pi : use VLC to stream webcam : Logitech C920 [H264 Video without transcoding + Audio + LED control] - SpyCam / BabyCam

I have a Raspberry Pi and a Logitech C920 webcam.
I want to use these devices as a surveillance / baby cam, i.e. stream audio + video over HTTP (or any other protocol) without CPU-intensive video transcoding.
The C920 webcam can stream H264 natively, so in theory I won't need to ask the RaspberryPi + VLC to transcode the video stream.
The built-in C920 microphone does not seem to be included in the webcam stream: camera and microphone are 2 separate devices.
The C920 also has a built-in LED indicator. I want to control it to prevent the LED from lighting up while recording.
How can I achieve that?

This solution is tested and working with the versions indicated below.
Using this method, CPU usage on the RaspberryPi3 stays around 5%.
edit 2018-11-18:
One can also see the all-in-one solution prototype on RaspiVWS project homepage (for curious people, see GitHub project)
0. Preliminary checks
1. Webcam video configuration
2. Microphone identification
3. Stream using VLC
4. Make RaspberryPi3+ a Wifi access point
(If you have no existing network to connect your Pi to)
5. Script at startup or as a service
6. [EDIT] Additional commands : infinite loop recording & split video
7. [EDIT] Program execution at a given instant
8. [EDIT] TROUBLESHOOTING
0. Preliminary checks
This answer works with Raspbian 9.4 Stretch.
Check your version with the following command:
lsb_release -a
You should see:
No LSB modules are available.
Distributor ID: Raspbian
Description: Raspbian GNU/Linux 9.4 (stretch)
Release: 9.4
Codename: stretch
We can rely on the following tools :
v4l lets us control the webcam. It offers the v4l2-ctl command, which we will use to configure and control the webcam.
VLC, which is not only a video player but also has powerful streaming capabilities
You can install them with the following commands :
sudo apt-get install vlc
sudo apt-get install v4l-utils
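To check that both tools were installed correctly, you can print their versions (a quick optional sanity check):
v4l2-ctl --version
cvlc --version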
Once everything is installed, you can configure your C920 webcam.
1. Webcam video configuration
v4l2-ctl --all lists all available devices and their config
pi@raspberrypi:~ $ v4l2-ctl --all
Driver Info (not using libv4l2):
Driver name : uvcvideo
Card type : HD Pro Webcam C920
Bus info : usb-3f980000.usb-1.5
Driver version: 4.14.30
Capabilities : 0x84200001
Video Capture
Streaming
Extended Pix Format
Device Capabilities
Device Caps : 0x04200001
Video Capture
Streaming
Extended Pix Format
Priority: 2
Video input : 0 (Camera 1: ok)
Format Video Capture:
Width/Height : 1920/1080
Pixel Format : 'H264'
Field : None
Bytes per Line : 3840
Size Image : 4147200
Colorspace : sRGB
Transfer Function : Default
YCbCr/HSV Encoding: Default
Quantization : Default
Flags :
Crop Capability Video Capture:
Bounds : Left 0, Top 0, Width 1920, Height 1080
Default : Left 0, Top 0, Width 1920, Height 1080
Pixel Aspect: 1/1
Selection: crop_default, Left 0, Top 0, Width 1920, Height 1080
Selection: crop_bounds, Left 0, Top 0, Width 1920, Height 1080
Streaming Parameters Video Capture:
Capabilities : timeperframe
Frames per second: 30.000 (30/1)
Read buffers : 0
brightness (int) : min=0 max=255 step=1 default=-8193 value=128
contrast (int) : min=0 max=255 step=1 default=57343 value=128
saturation (int) : min=0 max=255 step=1 default=57343 value=128
white_balance_temperature_auto (bool) : default=1 value=1
gain (int) : min=0 max=255 step=1 default=57343 value=255
power_line_frequency (menu) : min=0 max=2 default=2 value=2
white_balance_temperature (int) : min=2000 max=6500 step=1 default=57343 value=4822 flags=inactive
sharpness (int) : min=0 max=255 step=1 default=57343 value=128
backlight_compensation (int) : min=0 max=1 step=1 default=57343 value=0
exposure_auto (menu) : min=0 max=3 default=0 value=3
exposure_absolute (int) : min=3 max=2047 step=1 default=250 value=333 flags=inactive
exposure_auto_priority (bool) : default=0 value=1
pan_absolute (int) : min=-36000 max=36000 step=3600 default=0 value=0
tilt_absolute (int) : min=-36000 max=36000 step=3600 default=0 value=0
focus_absolute (int) : min=0 max=250 step=5 default=8189 value=0 flags=inactive
focus_auto (bool) : default=1 value=1
zoom_absolute (int) : min=100 max=500 step=1 default=57343 value=100
led1_mode (menu) : min=0 max=3 default=3 value=3
led1_frequency (int) : min=0 max=255 step=1 default=0 value=0
The last 2 lines give us what we need to control the built-in LED indicator, for instance to deactivate it.
The -d0 parameter indicates which device the modification applies to (useful if you have several cams or if the device name changed):
v4l2-ctl -d0 --set-ctrl=led1_mode=0
v4l2-ctl -d0 --set-ctrl=led1_frequency=30
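If you want to double-check which LED controls your camera exposes before changing anything, you can list them first (the exact control names may vary with the driver version):
v4l2-ctl -d0 --list-ctrls | grep led1
To restore the default behaviour later, set led1_mode back to its default value (3, according to the listing above):
v4l2-ctl -d0 --set-ctrl=led1_mode=3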
2. Microphone identification
The command arecord -l gives us the list of ALSA capture devices (ALSA is the audio framework on the RaspberryPi).
pi@raspberrypi:~ $ arecord -l
**** List of CAPTURE Hardware Devices ****
card 1: C920 [HD Pro Webcam C920], device 0: USB Audio [USB Audio]
Subdevices: 1/1
Subdevice #0: subdevice #0
This means that the built-in microphone is card 1, device 0 (hw:1,0). You can check it on the command line with alsamixer -c 1 -V capture
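Before streaming, you can also make a short test recording to confirm the microphone works (a quick check; plughw:1,0 must match the card/device numbers reported by arecord -l, and plughw lets ALSA convert the sample format if needed):
arecord -D plughw:1,0 -f S16_LE -r 44100 -c 2 -d 5 /tmp/mic_test.wav
aplay /tmp/mic_test.wav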
3. Stream using VLC
VLC can be launched from the command line.
Since the video and audio are not already mixed together in a single stream, we need to ask VLC to do that.
That is the role of VLC's transcoding feature.
Stream over HTTP
We also want to stream over HTTP, which VLC can also do.
cvlc v4l2:///dev/video0:chroma=h264 :input-slave=alsa://hw:1,0 --sout '#transcode{acodec=mpga,ab=128,channels=2,samplerate=44100,threads=4,audio-sync=1}:standard{access=http,mux=ts,mime=video/ts,dst=:8099}'
Explanation
v4l2:///dev/video0:chroma=h264 gives VLC its input data: it grabs the video stream from /dev/video0 and declares it as H264-encoded (if your webcam is not the 0th video device, the number may differ; refer to the v4l2-ctl --all command)
:input-slave=alsa://hw:1,0 tells VLC to take another input stream along with the video. It is the audio stream identified with arecord above
--sout tells VLC how to handle the output stream
#transcode{acodec=mpga,ab=128,channels=2,samplerate=44100,threads=4,audio-sync=1} tells VLC to convert the audio to the mpga codec, 128 kbit/s, 2 channels, 44100 Hz sampling, using all 4 RaspberryPi3+ cores. audio-sync is optional. It took me some time to realize that the webcam's H264 video stream is kept as provided (no video transcoding).
:standard{access=http,mux=ts,mime=video/ts,dst=:8099} tells VLC to serve the stream over HTTP on port 8099 with the TS muxing format.
On any other device, you can use VLC to access your RaspberryPi3+ VLC stream :
vlc http://<raspberrypi-ip>:8099
It works with any VLC client :
windows
unix
mac
confirmed with iPhone 7 (v11.2.1 (15C153)) with VLC app (3.0.3 (305))
NB: Having the video already output by the webcam as H264 1920x1080 30 fps saves a lot of RaspberryPi3+ CPU.
Different containers
You can also record to various containers, or even record and stream at the same time. Here are some examples:
record to MKV
cvlc v4l2:///dev/video0:chroma=h264 :input-slave=alsa://hw:1,0 --sout '#transcode{acodec=mpga,ab=128,channels=2,samplerate=44100,threads=4,audio-sync=1}:standard{access=file,mux=mkv,dst='/home/pi/Webcam_Record/MyVid.mkv'}'
record to MP4
cvlc v4l2:///dev/video0:chroma=h264 :input-slave=alsa://hw:1,0 --sout '#transcode{acodec=mpga,ab=128,channels=2,samplerate=44100,threads=4,audio-sync=1}:standard{access=file,mux=mp4,dst='/home/pi/Webcam_Record/MyVid.mp4'}'
record + stream
cvlc v4l2:///dev/video0:chroma=h264 :input-slave=alsa://hw:1,0 --sout '#transcode{acodec=mpga,ab=128,channels=2,samplerate=44100,threads=4,audio-sync=1}:duplicate{dst=standard{access=file,mux=mp4,dst='/home/pi/Webcam_Record/MyVid.mp4'},dst=standard{access=http,mux=ts,mime=video/ts,dst=:8099}}'
Format filenames, timestamps
You can also use formatted strings (strftime placeholders) for filenames. Add the --sout-file-format option to the command, like this:
cvlc --sout-file-format v4l2:///dev/video0:<...> dst='/home/pi/Webcam_Record/%F_%T_MyVid.mp4'}
It will produce a file named YYYY-MM-DD_HH:MM:SS_MyVid.mp4 (colons are allowed in Unix filenames, but not in Windows filenames).
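If the recordings need to be readable from Windows, a colon-free pattern can be used instead (same strftime formatting, rest of the command unchanged):
cvlc --sout-file-format v4l2:///dev/video0:<...> dst='/home/pi/Webcam_Record/%Y-%m-%d_%Hh%Mm%Ss_MyVid.mp4'}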
4. Make RaspberryPi3+ a Wifi access point
If you have no existing network to connect your Pi to:
You can follow instructions from official RaspberryPi3+ website : https://www.raspberrypi.org/documentation/configuration/wireless/access-point.md
Otherwise, if you already have a network you can connect to your pi using its IP.
See part 3
On any other device, you can use VLC to access your RaspberryPi3+ VLC stream: vlc http://<raspberrypi-ip>:8099
5. Script at startup or as a service
You can put many commands into a bash file my_bash_file.sh.
For instance :
#!/bin/bash
# auto stream launch + led off
#cvlc -vvv for verbose debug
# change this value to adapt to your webcam device number
deviceNb=0
# force video format + led off
v4l2-ctl -d${deviceNb} --set-fmt-video=width=1920,height=1080,pixelformat=1 --set-ctrl=led1_mode=0
# if delay needed
# cvlc v4l2:///dev/video${deviceNb}:chroma=h264 :input-slave=alsa://hw:1,0 :live-caching=2500 --sout '#transcode{acodec=mpga,ab=128,channels=2,samplerate=44100,threads=4,audio-sync=1}:standard{access=http,mux=ts,mime=video/ts,dst=:8099}'
# no delay
cvlc v4l2:///dev/video${deviceNb}:chroma=h264 :input-slave=alsa://hw:1,0 --sout '#transcode{acodec=mpga,ab=128,channels=2,samplerate=44100,threads=4,audio-sync=1}:standard{access=http,mux=ts,mime=video/ts,dst=:8099}'
Basic method
You can then call your custom script from the rc.local script so it is executed at startup.
You can follow the instructions from the official RaspberryPi website: https://www.raspberrypi.org/documentation/linux/usage/rc-local.md
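For instance, a line like the following can be added to /etc/rc.local just before its final exit 0 (a minimal sketch; the path assumes you saved my_bash_file.sh in /home/pi/Webcam_Record):
/home/pi/Webcam_Record/my_bash_file.sh &
The trailing & runs the script in the background so rc.local does not block while VLC is streaming.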
Another method : Create a daemon service
We will create a "webcam-stream" service, assuming all the necessary bash commands are located in /home/pi/Webcam_Record/vlc_webcam_stream_service.sh
cd /lib/systemd/system/
sudo nano webcam-stream.service
And write in it:
[Unit]
Description=Custom Webcam Streaming Service
After=multi-user.target
[Service]
Type=simple
ExecStart=/home/pi/Webcam_Record/vlc_webcam_stream_service.sh
Restart=on-abort
[Install]
WantedBy=multi-user.target
Make the service file and the script executable:
sudo chmod 644 /lib/systemd/system/webcam-stream.service
chmod +x /home/pi/Webcam_Record/vlc_webcam_stream_service.sh
Allow VLC to be executed as root:
sudo sed -i 's/geteuid/getppid/' /usr/bin/vlc
Reload the systemd daemons and enable our service:
sudo systemctl daemon-reload
sudo systemctl enable webcam-stream.service
Check it is recognized and working:
sudo service webcam-stream status
sudo service webcam-stream start
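If the service does not behave as expected, its logs can be inspected with journalctl (standard systemd tooling):
sudo journalctl -u webcam-stream.service -f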
You can check with another computer that the video is correctly streamed.
Note that the webcam won't be available while the service is running.
Once you're done, you can connect to the RaspberryPi3+ wifi access point and access your video stream.
6. [EDIT] Additional commands : infinite loop recording & split video
The following bash script allows infinite recording of 15-second-long videos with timestamped filenames, while streaming at the same time:
#!/bin/bash
# auto stream launch + led off
#cvlc -vvv for verbose debug
# adapt to video device name
deviceNb=1
# loop duration
duration=15
#infinite recording
#loopOption=
loopOption=--loop
# force video format + led off
v4l2-ctl -d ${deviceNb} --set-fmt-video=width=1920,height=1080,pixelformat=1 --set-ctrl=led1_mode=0
# if delay needed :live-caching=2500
cvlc --sout-file-format --run-time=${duration} ${loopOption} v4l2:///dev/video${deviceNb}:chroma=h264 :input-slave=alsa://hw:1,0 --sout '#transcode{acodec=mpga,ab=128,channels=2,samplerate=44100,threads=4,audio-sync=1}:duplicate{dst=standard{access=file,mux=mp4,dst='/home/pi/Webcam_Record/%F_%T_Spy.mp4'},dst=standard{access=http,mux=ts,mime=video/ts,dst=:8099}}'
7. [EDIT] Program execution at a given instant
EDIT 04 aug 2018
To launch the execution today at 14:00, you can use the following command:
echo ./my_vlc_webcam_script.sh | at 1400
See the at command manual for further details.
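You can list pending jobs with atq and remove one with atrm (standard companions of the at command):
atq
atrm <job-number>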
8. TROUBLESHOOTING
EDIT 07 jul 2018
I recently ran into a VLC error after a dist-upgrade:
VLC media player 2.2.6 Umbrella (revision 2.2.6-0-g1aae78981c)
[00acb230] pulse audio output error: PulseAudio server connection failure: Connection refused
The solution I found is to launch VLC in GUI mode and change the default audio device to ALSA (instead of Automatic). It can also be done from the command line.
See the solution found here: VLC issues with PulseAudio
cvlc -A alsa,none --alsa-audio-device default
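Applied to the streaming command from part 3, this gives for example (same options as before, with only the audio output module forced to ALSA; shown as a sketch):
cvlc -A alsa,none --alsa-audio-device default v4l2:///dev/video0:chroma=h264 :input-slave=alsa://hw:1,0 --sout '#transcode{acodec=mpga,ab=128,channels=2,samplerate=44100,threads=4,audio-sync=1}:standard{access=http,mux=ts,mime=video/ts,dst=:8099}'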

You need the vcodec= for video to work and deinterlace if you want that.
cvlc v4l2:///dev/video0:chroma=h264
:input-slave=alsa://hw:1,0
:live-caching=2500
--sout '#transcode{
deinterlace,
vcodec=mpgv,
acodec=mpga,
ab=128,
channels=2,
samplerate=44100,
threads=4,
audio-sync=1}
:standard{
access=http,
mux=ts,
mime=video/ts,
dst=0.0.0.0:8099}'

Related

How to send RTP stream to Janus from NGINX RTMP module?

This is my first post here, even though this platform has already helped me a lot.
I'm trying to create a stream and display it in a browser. I have already configured NGINX with the RTMP module and my stream works very well with HLS (between 5 and 10 seconds of latency).
Now I would like to set up a low-latency stream, which is why I have installed the janus-gateway WebRTC server, which can take an RTP stream as input and provide a WebRTC stream as output.
Here's the schema I'd like to follow:
OBS -> RTMP -> Nginx-rtmp-module -> ffmpeg -> RTP -> Janus -> webRTC -> Browser
But I have a problem with this part: "nginx-rtmp-module -> ffmpeg -> janus"
In fact, my Janus server is running and the streaming demos work very well on localhost, but when I try to provide an RTP stream, Janus doesn't detect it in the demos (it shows "No remote video available").
Can anyone help me, please?
Resources:
My janus.plugin.streaming.jcfg configuration :
rtp-sample: {
type = "rtp"
id = 1
description = "Opus/VP8 live stream coming from external source"
metadata = "You can use this metadata section to put any info you want!"
audio = true
video = true
audioport = 5002
audiopt = 111
audiortpmap = "opus/48000/2"
videoport = 5004
videopt = 100
videortpmap = "VP8/90000"
secret = "adminpwd"
}
My nginx.conf application :
application test {
deny play all;
live on;
on_publish http://localhost/test/backend/sec/live_auth.php;
exec ffmpeg -i rtmp://localhost/test/$name -an -c:v copy -flags global_header -bsf dump_extra -f rtp rtp://localhost:5004;
}
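A quick way to check whether the RTP packets actually reach port 5004 (assuming ffmpeg and Janus run on the same host) is to capture on the loopback interface:
sudo tcpdump -i lo -n udp port 5004
If no packets show up while a stream is being published, the problem is on the ffmpeg/nginx side rather than in Janus.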
If you need anything more to help me, don't hesitate! Thank you in advance, and sorry for my bad English :)
I finally solved this problem with the following command :
sudo ffmpeg -y -i "rtmp://127.0.0.1/app/stream" -c:v libx264 -profile:v main -s 1920x1080 -an -preset ultrafast -tune zerolatency -g 50 -f rtp rtp://127.0.0.1:5004
Unfortunately, when I use -c:v copy, it doesn't work. It only works when encoding with libx264, which adds latency: I got between 3 and 4 seconds.
However, when I installed Janus, my goal was to do better than with HLS, a protocol with which I reach 2.5 seconds of latency.
So Janus did not meet my need. Moreover, I was warned that it is not a streaming server. After some research I came across the Oven Media Engine project on GitHub, a streaming server that offers a latency of less than 1 s. The documentation on the dedicated site is complete, and a player (Oven Media Player) adapted to this server is available under the MIT license. The server is under the GPLv2 license.
Here is the current schema of my architecture:
OBS -> Nginx (used to authorize streaming with on_publish, since OME doesn't support it yet; the stream is then pushed to the OME server) -> OME -> transcoding to different bitrates and resolutions (optional) -> OME -> edge OME (optional) -> player.
If you have any questions, don't hesitate, the support is very friendly !
Hope it helps

openocd - problem flashing nrf52 using stlink

I've got some issues flashing firmware using an STLINKv2 (from a Nucleo board) with an nRF52 target device using SWD. In short, I'm able to "connect" to the nRF52 (I can open a telnet session), but as soon as I try a Program command, I get a non-helpful error (below). Hex file is compiled using Arduino "export bin" file option. (Using a SEGGER isn't an option right now)
More details below:
OSX version 10.14.6, OpenOCD version 0.10.0.
Running in terminal openocd -f interface/stlink-v2-1.cfg -f target/nrf52.cfg returns:
Open On-Chip Debugger 0.10.0
Licensed under GNU GPL v2
For bug reports, read
http://openocd.org/doc/doxygen/bugs.html
Info : auto-selecting first available session transport "hla_swd". To override use 'transport select <transport>'.
Info : The selected transport took over low-level target control. The results might differ compared to plain JTAG/SWD
adapter speed: 10000 kHz
Info : Unable to match requested speed 10000 kHz, using 4000 kHz
Info : Unable to match requested speed 10000 kHz, using 4000 kHz
Info : clock speed 4000 kHz
Info : STLINK v2 JTAG v28 API v2 SWIM v17 VID 0x0483 PID 0x374B
Info : using stlink api v2
Info : Target voltage: 0.014192
Error: target voltage may be too low for reliable debugging
Info : nrf52.cpu: hardware has 6 breakpoints, 4 watchpoints
I can then telnet in using telnet localhost 4444:
telnet: connect to address ::1: Connection refused
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.
Open On-Chip Debugger
In telnet, I can run basic commands on the nRF52, and when I execute: program /full/path/to/hex/file.hex I get the following message:
target halted due to debug-request, current mode: Thread
xPSR: 0x01000000 pc: 0xfffffffe msp: 0xfffffffc
** Programming Started **
embedded:startup.tcl:476: Error: ** Programming Failed **
in procedure 'program'
in procedure 'program_error' called at file "embedded:startup.tcl", line 532
at file "embedded:startup.tcl", line 476
Would love to know what I'm doing wrong!
ETA: The voltage error is not related; that was due to a quirk of the Nucleo SWD programmer hardware: one has to solder a jumper wire from VDD-TARGET to the outward pin of R24. (When you break the Nucleo off the ST-LINK board, you break this link.) I have done that, but the problem persists.
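For reference, the same flash attempt can be made in a single OpenOCD invocation instead of going through telnet (a sketch only, using the same cfg files as above; the hex path is a placeholder):
openocd -f interface/stlink-v2-1.cfg -f target/nrf52.cfg -c "program /full/path/to/hex/file.hex verify reset exit"
This runs the program command right after initialization and exits, which makes it easier to capture the full error output in one go.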

Problem Flashing nrf52 chip using Openocd

I have a custom nrf52 chip on a PCB with the SWD pins exposed. I have cloned and installed the latest OpenOCD from https://github.com/ntfreak/openocd. The latest version includes all the latest patches for the nrf52 chip, so there is no need for any additional changes as suggested in many older guides online. I am able to connect to the chip using an ST-Link V2. When connected, I can read and write memory locations using mdw and mdb. I can also run some basic OpenOCD commands like dump_image etc., which confirms that the setup is good. But halt and program commands always lead to errors like:
JTAG failure -4
JTAG failure -4
JTAG failure -4
JTAG failure -4
JTAG failure -4
JTAG failure -4
target halted due to debug-request, current mode: Thread
xPSR: 00000000 pc: 00000000 msp: 00000000
jtag status contains invalid mode value - communication failure
Polling target nrf52.cpu failed, trying to reexamine
Examination failed, GDB will be halted. Polling again in 100ms
Previous state query failed, trying to reconnect
jtag status contains invalid mode value - communication failure
Polling target nrf52.cpu failed, trying to reexamine
if I try to use flash image_write I get the error,
JTAG failure
Error setting register
error starting target flash write algorithm
Failed to enable read-only operation
Failed to write to nrf52 flash
error writing to flash at address 0x00000000 at offset 0x00000000
in procedure 'dap'
jtag status contains invalid mode value - communication failure
Polling target nrf52.cpu failed, trying to reexamine
I have read different guides online, and one of the possible solutions involves the APPROTECT register, which has to be disabled to enable any writes to flash. But the dap command, which is supposed to give access to this bit,
dap apreg 1 0x04 0x01
returns an error:
invalid subcommand apreg 1 0x04 0x01
I would like to know if anyone has had success programming a new, empty nrf52 chip with the ST-Link V2 and the steps that are necessary, or if anyone has encountered similar problems. Thanks.
Here is my config file:
#nRF52832 Target
source [find interface/stlink.cfg]
transport select hla_swd
source [find target/nrf52.cfg]
#reset_config srst_nogate connect_assert_srst
I solved the "protected nRF52" chip problem this way, on Windows, using a Particle.io Debugger https://store.particle.io/products/particle-debugger setup to program nRF52 chips from Arduino as described in https://www.forward.com.au/pfod/BLE/LowPower/index.html
Note: The recovery process described here does NOT need Arduino to be installed
Download OpenOCD-20181130.7z pre-compiled openocd for windows from
http://gnutoolchains.com/arm-eabi/openocd/
The latest version of openocd src on https://github.com/ntfreak/openocd should also work as it includes the apreg cmd in target\arm_adi_v5.c
Unzip it, open a command prompt in the unzipped directory, and enter:
bin\openocd.exe -d2 -f interface/cmsis-dap.cfg -f target/nrf52.cfg
response
Info : auto-selecting first available session transport "swd". To override use '
transport select <transport>'.
adapter speed: 1000 kHz
cortex_m reset_config sysresetreq
Info : Listening on port 6666 for tcl connections
Info : Listening on port 4444 for telnet connections
Info : CMSIS-DAP: SWD Supported
Info : CMSIS-DAP: FW Version = 1.10
Info : CMSIS-DAP: Interface Initialised (SWD)
Info : SWCLK/TCK = 1 SWDIO/TMS = 1 TDI = 0 TDO = 0 nTRST = 0 nRESET = 1
Info : CMSIS-DAP: Interface ready
Info : clock speed 1000 kHz
Info : SWD DPIDR 0x2ba01477
Error: Could not find MEM-AP to control the core
Info : Listening on port 3333 for gdb connections
Open a telnet program, e.g. TeraTerm, and connect to localhost on port 4444 (i.e. 127.0.0.1, telnet, port 4444).
cmd window shows
Info : accepting 'telnet' connection on tcp/4444
in telnet (i.e. teraTerm) type
nrf52.dap apreg 1 0x04
returns 0 <<< chip protected
then
nrf52.dap apreg 1 0x04 0x01
then
nrf52.dap apreg 1 0x04
returns 1 << chip un-protected
then power-cycle the board
You can now use the Arduino IDE to flash the softdevice and low-power BLE code
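The same recovery can also be scripted in a single OpenOCD call instead of going through telnet (a sketch only, assuming the same CMSIS-DAP probe and cfg files as above):
bin\openocd.exe -f interface/cmsis-dap.cfg -f target/nrf52.cfg -c "init; nrf52.dap apreg 1 0x04 0x01; exit"
As before, power-cycle the board afterwards.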
Even though the dap command is listed by the OpenOCD help, it isn't implemented for the hla_swd transport that you have to use with ST-Link.
If the ST-Link is a generic type from China, it can be upgraded to CMSIS-DAP which uses the swd transport and supports the nrf52.dap apreg 1 0x04 0x01 command to disable the readback protection and erase the flash. You'll need another ST-Link to do that, or you can instead install CMSIS-DAP on a generic STM32F103C8T6 board.
After that you can either use ST-Link to program the nRF52 or continue using CMSIS-DAP, which can also be used to program STM32 MCU.
Nucleo boards' embedded ST-Links can also be upgraded to J-Link, which allows the use of the "recover" option in nRFgo Studio to erase the flash; it should also work with "nrfjtool --recover" or OpenOCD.
If anyone encounters this problem, I solved the problem by getting an original Jlink-Edu. I also had to pull the reset pin of the microcontroller high to get the jlink working.
There are lots of JTAG messages.
I think you might be missing the
transport select hla_swd
line in your (board) cfg file. The NRF5x chips only work properly with SWD, and ST-Link uses the hla_swd variant.

Arduino OpenOCD command works in IDE but not from CMD prompt. What am I missing? (NRF)

Arduino does the following successfully. But when I try it from the command line it fails. Why is that?
C:\Users\???\AppData\Local\Arduino15\packages\sandeepmistry\tools\openocd\0.10.0-dev.nrf5/bin/openocd.exe -d2
-f interface/jlink.cfg
-c transport select swd;
-f target/nrf52.cfg
-c program {{C:\???\EddystoneURL.ino.hex}} verify reset; shutdown;
Result:
Open On-Chip Debugger 0.10.0-dev-00254-g696fc0a (2016-04-10-10:13)
Licensed under GNU GPL v2
For bug reports, read
http://openocd.org/doc/doxygen/bugs.html
debug_level: 2
swd
adapter speed: 10000 kHz
cortex_m reset_config sysresetreq
jaylink: Failed to open device: LIBUSB_ERROR_NOT_SUPPORTED.
Info : No device selected, using first device.
Info : J-Link OB-SAM3U128-V2-NordicSemi compiled Jan 21 2016 17:58:20
Info : Hardware version: 1.00
Info : VTarget = 3.300 V
Info : Reduced speed from 10000 kHz to 1000 kHz (maximum).
Info : Reduced speed from 10000 kHz to 1000 kHz (maximum).
Info : clock speed 10000 kHz
Info : SWD IDCODE 0x2ba01477
Info : nrf52.cpu: hardware has 6 breakpoints, 4 watchpoints
nrf52.cpu: target state: halted
target halted due to debug-request, current mode: Thread
xPSR: 0x01000000 pc: 0x000008e4 msp: 0x20000400
** Programming Started **
auto erase enabled
Info : nRF51822-QFN48(build code: B00) 512kB Flash
Warn : using fast async flash loader. This is currently supported
Warn : only with ST-Link and CMSIS-DAP. If you have issues, add
Warn : "set WORKAREASIZE 0" before sourcing nrf51.cfg to disable it
wrote 28672 bytes from file C:\???\EddystoneURL.ino.hex in 0.835260s (33.522 KiB/s)
** Programming Finished **
** Verify Started **
verified 26768 bytes in 0.144835s (180.486 KiB/s)
** Verified OK **
** Resetting Target **
shutdown command invoked
When I try the above from the command line I get the following:
C:\WINDOWS\system32>C:\Users\???\AppData\Local\Arduino15\packages\sandeepmistry\tools\openocd\0.10.0-dev.nrf5/bin/openocd.exe -d2 -f interface/jlink.cfg -c transport select swd; -f target/nrf52.cfg -c program {{C:\???\EddystoneURL.ino.hex}} verify reset; shutdown;
Open On-Chip Debugger 0.10.0-dev-00254-g696fc0a (2016-04-10-10:13)
Licensed under GNU GPL v2
For bug reports, read
http://openocd.org/doc/doxygen/bugs.html
debug_level: 2
interface_transports transport ...
transport
transport init
transport list
transport select [transport_name]
transport : command requires more arguments
in procedure 'transport'
I have replaced the full paths to the hex files to make it easier to read.
I am trying to use Arduino as my toolchain to upload pre-built binaries with it. From the IDE I can do it, but only with the Arduino-built code.
What am I missing?
I figured it out!!!
The command parameters need to be in quotes, or Windows will treat whatever follows a space as the next parameter.
I get the feeling folder/file names with spaces will have the same issue.
C:\Users\???\AppData\Local\Arduino15\packages\sandeepmistry\tools\openocd\0.10.0-dev.nrf5/bin/openocd.exe -d2
-f interface/jlink.cfg
-c "transport select swd;"
-f target/nrf52.cfg
-c "program {{C:\???\EddystoneURL.ino.hex}} verify reset; shutdown;"

Embed video from USB webcam into web page using ffserver and ffmpeg

I need to stream images from a USB webcam to a webpage on my embedded system. The operating system used is Linux.
I successfully installed ffserver and ffmpeg, and also mplayer.
This is my /etc/ffserver.conf (it's not definitive, I am just testing it):
# Port on which the server is listening. You must select a different
# port from your standard HTTP web server if it is running on the same
# computer.
Port 8090
# Address on which the server is bound. Only useful if you have
# several network interfaces.
BindAddress 0.0.0.0
# Number of simultaneous HTTP connections that can be handled. It has
# to be defined *before* the MaxClients parameter, since it defines the
# MaxClients maximum limit.
MaxHTTPConnections 2
# Number of simultaneous requests that can be handled. Since FFServer
# is very fast, it is more likely that you will want to leave this high
# and use MaxBandwidth, below.
MaxClients 2
# This the maximum amount of kbit/sec that you are prepared to
# consume when streaming to clients.
MaxBandwidth 1000
# Access log file (uses standard Apache log file format)
# '-' is the standard output.
CustomLog -
# Suppress that if you want to launch ffserver as a daemon.
NoDaemon
<Feed feed1.ffm>
File /tmp/feed1.ffm # when commented out, no file is created and the stream keeps working!!
FileMaxSize 200K
# Only allow connections from localhost to the feed.
ACL allow 127.0.0.1
# the output stream format - SWF = flash
Format swf
# this must match the ffmpeg -r argument
VideoFrameRate 5
# another quality tweak
VideoBitRate 320
# quality ranges - 1-31 (1 = best, 31 = worst)
VideoQMin 1
VideoQMax 3
VideoSize 640x480
# webcams don't have audio
NoAudio
</Stream>
# FLV output - good for streaming
<Stream test.flv>
# the source feed
Feed feed1.ffm
# the output stream format - FLV = FLash Video
Format flv
VideoCodec flv
# this must match the ffmpeg -r argument
VideoFrameRate 5
# another quality tweak
VideoBitRate 320
# quality ranges - 1-31 (1 = best, 31 = worst)
VideoQMin 1
VideoQMax 3
VideoSize 640x480
# webcams don't have audio
NoAudio
</Stream>
<Stream stat.html>
Format status
</Stream>
<Redirect index.html>
# credits!
URL http://ffmpeg.sourceforge.net/
</Redirect>
From the shell I can execute:
# ffserver -f /etc/ffserver.conf
and
# ffmpeg -f video4linux2 -s 320x240 -r 5 -i /dev/video0 http://127.0.0.1:8090/test.flv
No errors are reported during the execution. Sounds good but maybe it's not OK at all.
Then, in the webpage, I wrote this simple code:
<video controls>
<source src="http://127.0.0.1:8090/test.flv">
</video>
I read in another thread here on Stack Overflow (I lost the link) that this code should be enough, but it's not working for me.
However, I can see that the file /tmp/feed1.ffm has been created, so I think I can use this stream to show the camera images on my webpage. Am I right?
What is the simplest solution?
Thank you.
EDIT
I allowed the connections into the ffserver's configuration file:
<Feed feed1.ffm>
File /tmp/feed1.ffm # when commented out, no file is created and the stream keeps working!!
FileMaxSize 200K
ACL allow 127.0.0.1
ACL allow localhost
ACL allow 192.168.2.2 192.168.2.10
</Feed>
But it still does not work.
ffmpeg -f video4linux2 -s 320x240 -r 5 -i /dev/video0
http://127.0.0.1:8090/test.flv
As described in the documentation, you should stream to the feed1.ffm file, not to the test.flv file. The ffmpeg -> ffserver communication uses the .ffm file, and the ffserver -> web browser communication uses the .flv file.
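In other words, the ffmpeg command should target the feed, and the browser then reads test.flv from ffserver, e.g. (same options as the original command, only the destination changed):
ffmpeg -f video4linux2 -s 320x240 -r 5 -i /dev/video0 http://127.0.0.1:8090/feed1.ffm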
I think HTML doesn't like pseudo-files like pipes or .ffm :)
Maybe you could use the <embed> tag from HTML5.
<embed type="video/flv" src="http://127.0.0.1:8090/test.flv" width="320" height="240">
Or however you want to set the size.
