I have two sticks: an Intel Neural Compute Stick and an Intel Neural Compute Stick 2. I want to use both sticks to run model inference. The first situation is to use the two sticks to infer one CNN model, with the system dispatching the computation load to each stick automatically.
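Based on the MULTI device plugin I found in the OpenVINO docs, I imagine something like the sketch below for this first situation. This is untested on my part: the `MYRIAD.1`/`MYRIAD.2` device names and the model paths are placeholders I made up, and the real names would presumably come from `ie.available_devices`.

```python
def multi_target(devices):
    # Join per-stick device names into a MULTI plugin target string,
    # e.g. ["MYRIAD.1", "MYRIAD.2"] -> "MULTI:MYRIAD.1,MYRIAD.2".
    return "MULTI:" + ",".join(devices)

def load_on_both_sticks(xml_path, bin_path, devices):
    # Requires OpenVINO installed, hence the lazy import; I have not run this.
    from openvino.inference_engine import IECore
    ie = IECore()
    net = ie.read_network(model=xml_path, weights=bin_path)
    # The MULTI plugin is supposed to balance inference requests
    # across the listed devices automatically.
    return ie.load_network(network=net, device_name=multi_target(devices))
```

Is this roughly how the automatic dispatch is meant to work?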
The second situation is to run a complete CNN model on each stick separately: for example, use one stick to run inference with model1 and the other stick to run inference with model2. Can I manually dispatch each model to a specific stick?
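For this second situation, I imagine manually pairing each model with one stick, roughly like the sketch below. Again untested: the device names and file names are assumptions of mine, not something I have verified on the sticks.

```python
def pair_models_to_sticks(models, devices):
    # Explicit model -> stick assignment, e.g.
    # [(("m1.xml", "m1.bin"), "MYRIAD.1"), (("m2.xml", "m2.bin"), "MYRIAD.2")].
    return list(zip(models, devices))

def load_one_model_per_stick(models, devices):
    # models: list of (xml_path, bin_path) tuples; devices: one name per stick.
    # Requires OpenVINO installed, hence the lazy import; I have not run this.
    from openvino.inference_engine import IECore
    ie = IECore()
    exec_nets = []
    for (xml_path, bin_path), device in pair_models_to_sticks(models, devices):
        net = ie.read_network(model=xml_path, weights=bin_path)
        # Loading each network with its own device name should pin
        # that model to that specific stick.
        exec_nets.append(ie.load_network(network=net, device_name=device))
    return exec_nets
```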
Can these two scenarios be realized, and how do I do that in Python? Could you offer some sample code?
In the OpenVINO samples, I have only found Python code that runs one model on one stick.