How to start guide #59

Open
s3ni0r opened this issue Aug 28, 2023 · 1 comment

s3ni0r commented Aug 28, 2023

It would be great if we could create a detailed wiki page on how to set up the project and run it, and also describe the hardware used (gimbal, camera ...) and the tracking modes available.
As for me, the goal is to work on an alternative to the Soloshot 3 for surfing. Object detection and tracking alone would simply not cut it, as the environment at the beach is too harsh and noisy, so I suggest working on an additional tracking mode based on GPS and Bluetooth.

@maiermic (Owner)

Documentation would definitely be useful/required to get started. In the meantime, I can guide you on how to get started. Just let me know which hardware you'd like to use.

My robot cameraman is a modular system that can be built from different parts: a camera, a gimbal (or separate devices to pan and tilt) and a computer to control the camera and gimbal. At the moment, I'm using an Ikan DS2-A Beholder gimbal with a Panasonic DMC-LF1 camera and a Raspberry Pi to control them. During development, I usually use my Linux laptop. Further, I use a Google Coral USB accelerator (at the moment required for tracking using object detection).
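For reference, object detection on the Coral looks roughly like the following minimal pycoral sketch. This is not the project's actual detection engine, just a plain example; the model file name is an example model from Coral's test data and the input image is a placeholder:

```python
from PIL import Image
from pycoral.adapters import common, detect
from pycoral.utils.edgetpu import make_interpreter

# Example Edge TPU model from Coral's test data; any SSD detection model
# compiled for the Edge TPU works the same way.
interpreter = make_interpreter('ssd_mobilenet_v2_coco_quant_postprocess_edgetpu.tflite')
interpreter.allocate_tensors()

image = Image.open('frame.jpg')  # in the real project, frames come from the live view
_, scale = common.set_resized_input(
    interpreter, image.size, lambda size: image.resize(size, Image.LANCZOS))
interpreter.invoke()

for obj in detect.get_objects(interpreter, score_threshold=0.5, image_scale=scale):
    print(obj.id, obj.score, obj.bbox)  # bbox gives xmin, ymin, xmax, ymax in pixels
```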

You should actually already be able to use any gimbal with a SimpleBGC controller (by Basecam Electronics). You could even build your own gimbal using such a controller. Otherwise, another interface would have to be implemented for your gimbal. Theoretically, you could even use a gimbal from Ronin, Feiyu, etc. However, AFAIK they do not have an open-source/documented API. Some of those gimbals have a Bluetooth API that might be reverse engineered from the mobile apps used to control them. However, I guess it is not worth the effort, as you have better control options using a SimpleBGC gimbal and you can get a used one for $100 (e.g. on eBay) if you are lucky (patient enough).
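If you want to get a feel for the SimpleBGC Serial API outside the project, a raw CMD_CONTROL frame can be built by hand. This is just a rough sketch based on the documented v1 frame format (start byte '>', mod-256 checksums), not how robot_cameraman itself talks to the gimbal; the port name and baud rate are assumptions for a typical USB-serial connection:

```python
import struct
import serial  # pyserial

ANGLE_UNIT = 0.02197265625  # degrees per unit (360 / 2**14), per the SimpleBGC spec
SPEED_UNIT = 0.1220740379   # degrees/sec per unit

def simplebgc_frame(command_id: int, payload: bytes) -> bytes:
    """Build a SimpleBGC Serial API v1 frame: '>' + id + size + header checksum
    + payload + payload checksum (both checksums are simple mod-256 sums)."""
    header = bytes([command_id, len(payload), (command_id + len(payload)) % 256])
    return b'>' + header + payload + bytes([sum(payload) % 256])

def cmd_control_angle(yaw_deg: float, pitch_deg: float, speed_deg_s: float = 30.0) -> bytes:
    """CMD_CONTROL (id 67) in angle mode: (speed, angle) pairs for roll, pitch, yaw."""
    MODE_ANGLE = 2
    speed = int(speed_deg_s / SPEED_UNIT)
    payload = struct.pack(
        '<B6h',
        MODE_ANGLE,
        0, 0,                                # roll: speed, angle
        speed, int(pitch_deg / ANGLE_UNIT),  # pitch: speed, angle
        speed, int(yaw_deg / ANGLE_UNIT),    # yaw: speed, angle
    )
    return simplebgc_frame(67, payload)

# Hypothetical serial port; adjust for your setup.
with serial.Serial('/dev/ttyUSB0', 115200, timeout=1) as port:
    port.write(cmd_control_angle(yaw_deg=45.0, pitch_deg=10.0))
```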

The camera part is similar. You should already be able to use any Panasonic camera that supports Wi-Fi, i.e. can be controlled using the Panasonic Image App. I already had a Panasonic camera that was easy to reverse engineer and has a great API. There is an official/documented Wi-Fi API for Sony cameras, too. It is not as good (e.g. as fast) as the Panasonic API, but it should be usable, and implementing the camera interface of my robot cameraman for it should be quite easy. If there is no API to control the camera, you should still be able to use it with an HDMI capture device, if the camera has an HDMI output. That way, you could even use an SS3 (in manual mode) 😂
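To give you an idea of the Panasonic side: the Image App talks to the camera via plain HTTP requests to cam.cgi. The endpoints below come from community reverse engineering and differ between models, so treat this as a sketch rather than a guaranteed API; the IP address is whatever your camera gets on its Wi-Fi network:

```python
import requests

# Example IP of the camera on its Wi-Fi network; check your camera's network info.
CAMERA = 'http://192.168.54.1/cam.cgi'

def cam_cmd(params: dict) -> str:
    """Send a single cam.cgi request and return the raw reply."""
    response = requests.get(CAMERA, params=params, timeout=2)
    response.raise_for_status()
    return response.text

# Put the camera into remote-control (record) mode first ...
print(cam_cmd({'mode': 'camcmd', 'value': 'recmode'}))
# ... then, for example, trigger a photo,
print(cam_cmd({'mode': 'camcmd', 'value': 'capture'}))
# or ask the camera to stream its live view via UDP to port 49199.
print(cam_cmd({'mode': 'startstream', 'value': '49199'}))
```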

I've implemented tracking based on object and color detection (both may be used outdoors and indoors). Neither, on its own, is probably suited for surfing, since they require that the target (the surfer) is in the camera's field of view (almost) the whole time. Otherwise, the target may be lost and is hard to find and re-identify (another surfer might be tracked instead).

I plan to implement GPS tracking. For that, a device (similar to the SS3 tag) is required that sends its GPS location to the robot cameraman (e.g. the Raspberry Pi). The easiest way is to use a smartphone for this. However, the transmitting options of a regular smartphone are limited: Wi-Fi has a small range, mobile internet might not be available (limited cell reception), and Bluetooth 5 Long Range is only supported by a handful of smartphones, with disappointing range improvements. That being said, I guess you're unlikely to carry your smartphone while surfing anyway.

Presumably, a custom device will have to be developed for this. Actually, my principle for the project is to use, where possible, only existing devices that you can buy and don't have to assemble yourself. But I'm afraid there is no such device to buy. On the other hand, it should be fine if you can assemble the individual parts (e.g. an STM32 or Arduino with an nRF24L01, batteries and a case) quite easily in a few minutes, following instructions. An additional challenge is making the whole thing waterproof.
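GPS tracking itself isn't implemented yet, but the core geometry is simple: from the camera's and the target's GPS fixes you compute the bearing the gimbal has to pan to. A minimal sketch of that calculation (the coordinates are arbitrary example values):

```python
import math

def bearing_deg(cam_lat: float, cam_lon: float,
                target_lat: float, target_lon: float) -> float:
    """Initial great-circle bearing from the camera position to the target,
    in degrees clockwise from north (standard forward-azimuth formula)."""
    phi1, phi2 = math.radians(cam_lat), math.radians(target_lat)
    d_lon = math.radians(target_lon - cam_lon)
    x = math.sin(d_lon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(d_lon)
    return math.degrees(math.atan2(x, y)) % 360.0

# Example: camera on the beach, surfer in the water a bit north-west of it.
print(bearing_deg(43.4832, -1.5586, 43.4845, -1.5605))  # ≈ 313°, i.e. roughly north-west
```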

Besides, to run the project without any special hardware (e.g. on a Linux laptop), you can run

python -m robot_cameraman --gimbal Dummy --detectionEngine Dummy --liveView Webcam --select-target-strategy manually

in the project root directory.
