
video_objects.py

How do I use this script with a USB webcam? Right now it only takes a recorded .mp4 video file.

Comments

  • 5 Comments
  • I created this post on the Rpi forum to discuss UDP streaming from an Rpi (with pi camera) to an Rpi (with a portable HDMI screen):
    https://www.raspberrypi.org/forums/viewtopic.php?f=63&t=225955&p=1386513#p1386513

    It seems that the rpi doesn't perform video decoding very well using only the CPU; the camera-enabled pi uses its GPU hardware to stream.

    On the HDMI rpi I have so far managed to get 24 fps streaming using hardware decoding with omxplayer and ffmpeg/avconv. However, the latency is around 6 seconds between the sending rpi and the receiving rpi.

    Maybe another h264 plugin such as omxh264dec might work better, but I haven't been able to figure it out yet.

    My goal is a few things:
    - Low-latency digital video
    - Between one SoC board and another (I just happen to have 2 x rpi & a Jetson TK1)
    - The receiving SoC board will connect to an NCS
    - 12 V or 5 V SoC boards
    - Using SSD MobileNet with the video stream (a mix between stream_infer.py & video_objects.py)

    I'm not sure about the performance of the rpi + NCS with stream_infer.py & video_objects.py yet; I have only been using them with my laptop, and they run just fine there.

    Robotics project :smiley:

  • @VictoryKnocks Edit the video_objects.py Python file and change line 338 from cap = cv2.VideoCapture(input_video_file) to cap = cv2.VideoCapture(0) to use video device index 0. Also try 1 if you have a USB camera in addition to your laptop webcam.
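
    If it helps, here is a minimal sketch of that change (device index 0 is an assumption; it is simply whichever camera the OS enumerates first):

    import cv2

    # video_objects.py, around line 338: read from a live camera instead of the input file.
    # Original line:
    #     cap = cv2.VideoCapture(input_video_file)
    cap = cv2.VideoCapture(0)  # try 1 if a second/USB camera is attached
    if not cap.isOpened():
        raise RuntimeError('Could not open video device 0; check the USB camera connection')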

  • oooh, you're a clever man!

  • Right then, Mr Clever Pants, let's see if you can help with this one.

    What are the fps differences between SSD MobileNet and Tiny Yolo on the NCS, and which do you recommend for performance/best fps?

    And

    I used stream_infer.py to stream gstreamer UDP video from my rpi to my laptop the other day and it runs just great, but I am eventually hoping to stream the same setup from an rpi to another rpi with a screen. Have you tried this setup, and do you have any good advice for setting up the gstreamer pipeline? At the moment I'm using this pipeline on my laptop and it works nicely:

    gst-launch-1.0 udpsrc port=9000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink

    But I am wondering: stream_infer.py (line 82) suggests using:
    #SINK_NAME="glimagesink" # use for Raspbian Jessie platforms

    How come glimagesink and not autovideosink or xvimagesink?
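
    For reference, a rough sketch of how that same receive pipeline could be handed straight to OpenCV (and from there to video_objects.py-style code) instead of an on-screen sink. This assumes the OpenCV build on the receiving board was compiled with GStreamer support, which has not been verified here:

    import cv2

    # Same UDP/H.264 receive chain as the gst-launch command above, but terminated
    # in an appsink so OpenCV can pull the decoded frames instead of displaying them.
    pipeline = (
        'udpsrc port=9000 ! '
        'application/x-rtp,encoding-name=H264,payload=96 ! '
        'rtph264depay ! avdec_h264 ! videoconvert ! '
        'video/x-raw,format=BGR ! appsink'
    )

    cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
    if not cap.isOpened():
        raise RuntimeError('Could not open the pipeline; is OpenCV built with GStreamer?')

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # frame is a regular BGR numpy array, ready for resizing and NCS inference
        cv2.imshow('udp stream', frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    cap.release()
    cv2.destroyAllWindows()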

  • @VictoryKnocks I haven't tried streaming from rpi to another rpi. If you can share the details on how to do this, I'm sure other members of the NCS dev community would be interested in your work.

    As far as the stream_infer.py application goes, glimagesink was what was working for us, so that's what we went with at the time; xvideo was not working on Jessie. I haven't checked it lately, so I can't say whether it is or isn't working on Raspbian Stretch at the time of writing.

    As for FPS, you can use the benchmark_ncs app to run some benchmarks and compare them yourself.
    Run this command in the benchmarkncs folder for Tiny Yolo v1:
    python3 benchmarkncs.py ../../caffe/TinyYolo/ ../../data/images 448 448
    and this one for SSD MobileNet:
    python3 benchmarkncs.py ../../caffe/SSD_MobileNet ../../data/images 300 300
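
    If you want a rough idea of what those numbers measure, here is a minimal timing-loop sketch against the NCSDK v1 Python API. The graph file path, test image, and input size below are placeholders, not the exact values benchmarkncs.py uses:

    import time
    import numpy
    import cv2
    from mvnc import mvncapi as mvnc

    GRAPH_PATH = 'graph'        # placeholder: compiled graph file from mvNCCompile
    IMAGE_PATH = 'image.jpg'    # placeholder: any test image
    NETWORK_DIM = (300, 300)    # 300x300 for SSD MobileNet, 448x448 for Tiny Yolo v1

    # Find and open the first NCS device
    devices = mvnc.EnumerateDevices()
    if len(devices) == 0:
        raise RuntimeError('No NCS device found')
    device = mvnc.Device(devices[0])
    device.OpenDevice()

    # Load the compiled graph onto the stick
    with open(GRAPH_PATH, mode='rb') as f:
        graph = device.AllocateGraph(f.read())

    # One fp16 input tensor (real use also needs the network's mean/scale preprocessing)
    img = cv2.imread(IMAGE_PATH)
    img = cv2.resize(img, NETWORK_DIM).astype(numpy.float16)

    # Time a batch of inferences and report FPS
    iterations = 50
    start = time.time()
    for _ in range(iterations):
        graph.LoadTensor(img, 'user object')
        output, userobj = graph.GetResult()
    elapsed = time.time() - start
    print('%.1f inferences per second' % (iterations / elapsed))

    graph.DeallocateGraph()
    device.CloseDevice()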
