Long-Range Depth Model - OAK-D-LR #247
Welcome OAK-D-LR! I would suggest including the following features:

- Polarizer filter lens option: since this long-range version is primarily for outdoor applications, there may be glare caused by windshield glass, rain-slicked road surfaces, etc. Adding a polarizer might remove or significantly attenuate the glare.
- KeemBay: I wish this new product would use the latest 3rd-gen VPU!
- Ship with 2 sets of lenses: I would suggest shipping this product with two sets of M12 rectilinear lenses: one narrow-HFOV set (e.g. around 40 to 50 deg) and one wide-HFOV set (e.g. around 80 to 120 deg). The unit should also store factory-calibrated matrices for both lens sets. The shipping configuration could be, for example, the narrow-FOV lenses preinstalled with LensMode.Narrow_FOV as the default, such that rectifiedLeft/Right apply the narrow-FOV stereo calibration matrices. If a user changes the lenses to the wide-FOV set, they would then set LensMode.Wide_FOV in their program to produce correct rectifiedLeft/Right output. The reason for requesting two lens sets is convenience for an end user like me: I wouldn't need to source rectilinear lenses myself, and it can be hard to get quality rectilinear lenses in low quantities like 2 ; )
- Synced left/right requirement: I am not sure what the current OAK-D left/right sync requirement is in terms of time difference. I would suggest that the new product's left/right frame sync time difference be < 100us; less than 50us or smaller would be even better. When the left/right exposure start/end times are not in sync, the disparity map will not be accurate for moving objects. The problem is amplified particularly when the ego view is turning (with angular speed): for example, when a car is turning at a crossroad, say 90 deg / 5.4 sec => 16.6 deg/sec, then at around 25 m away there will be a disparity error of 1 px for every 1 ms of out-of-sync left/right exposure. The further the distance, the bigger the resulting error (at around 250 m it will be amplified by 10X).
- Future product roadmap: PoE: I hope an IP67-enclosure PoE version will be available soon.
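Editorial aside: the 1 px-per-millisecond figure above can be reproduced with a short calculation. The focal length (3400 px, i.e. a narrow-FOV lens) and the 15cm baseline are assumed illustrative values, not OAK-D-LR specs, and `disparity_error_px` / `depth_error_m` are hypothetical helper names, not DepthAI API:

```python
import math

# Sketch of how L/R exposure skew during an ego-rotation turns into
# disparity error, and how that error grows with depth. Focal length
# and baseline are assumed illustrative values, not OAK-D-LR specs.
def disparity_error_px(omega_deg_s, skew_s, focal_px):
    """Pixel shift between the L and R exposures caused by rotating
    at omega_deg_s deg/s while the exposures are skew_s seconds apart."""
    return focal_px * math.radians(omega_deg_s * skew_s)

def depth_error_m(depth_m, disp_err_px, focal_px, baseline_m):
    """First-order stereo depth error: dZ ~ Z^2 * dd / (f * B)."""
    return depth_m ** 2 * disp_err_px / (focal_px * baseline_m)

omega = 90 / 5.4        # the 16.6 deg/s turn from the example above
focal = 3400.0          # assumed narrow-FOV lens focal length, px
d_err = disparity_error_px(omega, 1e-3, focal)   # 1 ms of desync
print(f"disparity error: {d_err:.2f} px")        # ~1 px, as claimed

for z in (25, 250):     # depth error grows quadratically with Z
    print(f"at {z} m: roughly {depth_error_m(z, d_err, focal, 0.15):.1f} m of depth error")
```

The quadratic growth of depth error with distance is why sub-100us (ideally sub-50us) sync matters far more for a long-range model than for the existing medium-range ones.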
Thanks @ynjiun !
This is the beauty of M12. This makes it possible to simply buy M12 solutions that have this capability built-in.
Great point. We can likely design this using the OAK-SoM-Pro or OAK-SoM-MAX to have KeemBay support.
Great suggestion on lens types/etc. and agreed on the FOVs here. One catch is that once the lenses are removed, the calibration is invalidated. So probably what is best here is to have just 2 different purchase options as the default for this model, such that they can come pre-calibrated. And then there's a decent chance that for a given purpose these might be "Good enough". And then a no-lens option for folks who need their own lenses (so as to not waste lenses/etc.)
We'll then also sell lenses on our site, just like our friends at OpenMV do, here. Note they even have a polarization filter as well, which would very likely work here (not yet tested of course, as we haven't even started work on the OAK-D-LR yet).
Agreed. Good points. Doing them as separately-orderable options will save cost. And it's required for factory-calibration to be applied and usable.
Yes. We'll do hardware sync here. My understanding of the OAK-D sync is it's likely in the sub-1-microsecond range. It gets very hard to actually test that though in terms of pixels. But as far as our testing has revealed (leveraging MIPI readouts/etc.) we're at least under a couple microseconds. And the same would be true here.
Agreed. We'll do a PoE as well. I actually think PoE will be more important here. USB is a tad quicker to get out to test optics/etc., so we'll simply do that first. Thanks again!
I would be very interested in this product as well, especially the PoE version with an IP67 enclosure. On the subject of filters and mechanics, I would love the option to order the camera with a set of two "front plates", with the glass on one of these plates incorporating a polarization layer directly, if that's possible. Alternatively (but more complicated), a gasket-sealed "hatch" on the side of the assembly would allow sliding a filter in front of the 3 lenses, held in place by a groove for example. This could then also allow the use of other filters for specific wavelengths of light, or ND filters to reduce exposure, without mounting anything to the outside of the sensor... but I realise this second option is more complicated, as the use of filters would depend on the size of the lenses and sensors on the board.
Question: is it possible to sync all 3 cameras, including the center 4K camera, with the two stereo cameras? Center 4K camera position: for this 15cm-baseline version, I would recommend placing the center 4K camera toward the right side, 5cm from the right camera and 10cm from the left camera. If all three cameras can be synced, then we would have a stereo system with 3 baselines in operation simultaneously: 15cm, 10cm, and 5cm. If the software and KeemBay hardware throughput can handle 3 pairs of stereo disparity computation, then we could have a hybrid system that fuses the 3 disparity maps together to generate a much higher-accuracy disparity map. What do you think? I would be happy to join the prototyping work on this front if you can send me an early prototype ; ))
I love this idea! We will do this for sure. And yes, I do like the idea of separate front covers for this, so that they can still be sealed. Thanks,
For this one we were thinking only 2x AR0234 global shutter color 2.3MP. So no 4K rolling shutter at all. Thoughts on that?
For my applications rolling shutter of any kind is a no-go, and 2MP global shutter sensors would actually be ideal for the balance between light sensitivity and quantity of data :)
*Stephan Sturges*
Brandon,
Hi @gtech888AU , So actually for stereo disparity depth, having the cameras angled-in prevents the disparity depth from working properly. With disparity depth, the key is for the cameras to see the information from the same vantage point, just displaced. With angled cameras, one camera will see the left side of an object and the other will see the right, so feature matching won't work. On top of that, any angle between the sensors reduces the FOV of the depth. So it's a disadvantage all around.

Hi @stephansturges, Thanks, makes sense. And PS: I edited your post to remove private information.

Thanks all,
I think no 4K rolling shutter is fine. Then I would suggest enhancing this unit with an extra AR0234 global shutter (thus 3x AR0234 total) at the following positions: left <- 1.5 cm -> middle <- 13.5 cm -> right. So the left/right baseline is 15cm, and all three cameras are in h/w sync.
YES! That's a great idea @ynjiun . As that gives both long-range and not-long-range depth out of the same model. And with that idea I'm now pondering what the "right" distribution of baselines is. My gut is something like the following might be super useful: left <- 5cm -> middle <- 10cm -> right. As this gives the 3 following baselines: 5cm (left-middle), 10cm (middle-right), and 15cm (left-right). And then the 5cm pair can allow fairly close-in depth compared to both 10cm and 15cm. Thoughts? Thanks,
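Editorial aside: the trade-off between the proposed baselines can be sketched with the standard pinhole-stereo relation Z = f * B / d. The focal length and disparity limits below are assumed illustrative numbers, not OAK-D-LR calibration values:

```python
# Sketch of the usable depth window per baseline via Z = f * B / d.
# FOCAL_PX and the disparity limits are assumed illustrative values,
# not OAK-D-LR specs.
FOCAL_PX = 800.0        # assumed focal length, pixels
MIN_DISPARITY = 1.0     # px; smaller disparities are unresolvable
MAX_DISPARITY = 95.0    # px; a typical stereo-matcher search limit

for baseline_m in (0.05, 0.10, 0.15):
    z_min = FOCAL_PX * baseline_m / MAX_DISPARITY  # closest usable depth
    z_max = FOCAL_PX * baseline_m / MIN_DISPARITY  # farthest usable depth
    print(f"{baseline_m * 100:.0f} cm baseline: ~{z_min:.2f} m to ~{z_max:.0f} m")
```

Under these assumptions the 5cm pair extends the window toward the camera (minimum depth well under half a meter) while the 15cm pair pushes it outward, which is the "both long-range and not-long-range depth out of the same model" benefit in numbers.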
Agree. This would definitely address those who need different baselines with meaningful separation.
Fascinating. Thanks. I just updated the specs at the very top with the culmination of the discussions here. Added [NEW] for things that came from the discussion.
I would love to know if there is already a tentative timeline attached to this project?
Unknown as of yet. We're triaging internally on when we can implement this. We'll circle back when we have a better view of the schedule. We're also triaging interest in this vs. other models, so it helps that there's interest. And if you want to pre-order, we can get you links for that. :-). Helps with customer voting. ;-)
Yes, send me the pre-order link.
Thanks @microaisystems . Will get it up now. :-)
It's live here: (The LR will have to be quite a bit more expensive because it has the (expensive) 2.3MP global shutter color and (expensive) large/nice optics, which then also make the whole thing bigger (and more expensive).)
This is like computer vision catnip to me... I've ordered 2 units 😄
Great! Before I order the D-LR, a couple of questions: would this pre-ordered unit ship with Myriad X or KeemBay? Would this unit have a Raspberry Pi Module 4 integrated as well? Thank you.
Hi @ynjiun,
The option to use either module would be very interesting to me, but especially if they are user-replaceable.
Hi @stephansturges,
That's ideal, thanks.
Hey guys! I see on the product page in the store that you are targeting Jan 2023, is this still correct? If there's any update to the timeline I'd love to know more :)
Looking good! I can't wait to use this.
@Luxonis-Brandon the renderings look pretty awesome. What did you guys decide on for the baselines?
@chad-green we used 5cm for the shorter and 15cm for the longer baseline
@ynjiun At the moment I don't really have anything that would ease the calibration besides that you can use the parameter
@themarpe Any updates on hardware-syncing all three AR0234 cameras on the LR units? Is it already supported, or still being worked on? Thanks!
Hi @chengguizi , could you please create an Issue/feature request on depthai-core? Just for tracking purposes for our development team.
@chengguizi currently available as a temporary change in
Perhaps it has already been merged to develop (commit here)?
Correct, also in latest
@Erol444 Ok let me see if it is indeed not done yet!
@chengguizi at the moment this is OAK-D-LR specific. A common FSync between all sensors must be available, which then drives the sensors to capture at the same time. We are yet to expose this mechanism in a general way, however.
@themarpe Ok, got it! Would it be possible to expose it at all? I guess it is implemented through triggering from the VPU's GPIO pin at a fixed interval?
@chengguizi
Hey guys, is there any update on the IP-rated version of this sensor? I'm getting close to the point where I will be able to test this in real-world conditions, and an IP rating would make that a lot easier!
@stephansturges yes, we did improve the IP rating of the existing design by adding rubber seals and improving the rain cover...
Is there a specific branch that I should use to try the latest improvements in depth quality for the LR sensor?
CC: @njezersek @stephansturges we'll be merging. That should help with calibrating
Hey guys, is there any update on the IP5X-rated version, or on future availability of the current version? I will probably need a better IP rating and will have to make my own case anyway, but I need to start planning :)
Hi @stephansturges , We actually have that in the plan already - our next production batch of OAK-D-LR will (should) be IP5X rated :)
@Erol444 yep I saw that, but I think I will need IP63 at minimum, so I will probably need to make my own enclosures, which is why I was curious about when the first samples from the final design will be available for me to work with 😄
Any information on when the camera will be in stock? We would like to try this camera in our robotics lab for outdoor depth sensing. @Erol444
@jingnanshi we will start shipping the cameras shortly.
@Luxonis-David Thanks for the info! We haven't placed an order yet, but I'll pass the news to my colleagues. We will probably place an order soon.
@Luxonis-David Do the OAK-D LR cameras support hardware time synchronization / triggering? I see that many of the other OAK cameras do, but I don't see any reference to it for the LR.
@GoldenZephyr unfortunately the OAK-D-LR was not targeted at applications where time synchronization is needed, so you are right, it does not support the hardware FrameSync signal out of the box.
Hello, are there any updates on the shipment? We ordered a couple of these and were just wondering when they would be sent.
Hi @robsan7777 , I believe we plan to ship these in January.
Is that the first shipment that will be done, or were old orders shipped already? We ordered them about 3 to 4 months ago, so I am just trying to get an idea of when we will get them. Thank you!!
Hi @robsan7777 , is your order number 15425? If so, then we should start shipping these in a week or two; we are now working on the calibration of these devices. We apologize for the delay.
Yes, that is my order number! Thank you for the update.
@robsan7777 Also, the installer for Windows is not working for me.
@jimas95 I tried multiple methods and ended up setting up the ros_depthai_driver on ROS Iron, since it is supported in just a few distributions. I was only able to see live camera images, but not depth or anything else. It is a relatively new camera; I'm not sure if they are still working on the drivers and supporting software for it.
@jimas95
Pre-order on shop here
Start with the why:

While the OAK-D Series 1 and 2 (here) cover medium depth ranges (from ~20cm to 16 meters), and the coming OAK-D-SR (short-range) series (USB #241, PoE #244) specializes in close-in depth (from 0 to 1 or 2 meters), OAK/DepthAI technology is actually capable of much longer depth sensing - we've tested up to 350 meters (as below) with a 30cm stereo baseline:
And the only current way to accomplish this is to use OAK-FFC (e.g. OAK-FFC-3P or OAK-FFC-4P) with modular cameras such as the OV9282 (1MP global shutter grayscale), OV9782 (1MP global shutter color), or AR0234 (2.3MP global shutter color).
This works, but is not "production ready", and dealing with FFC cables is just annoying. So although it is possible to make a wide-stereo-baseline + narrow-FOV stereo pair using the options above (as shown below), such a setup is larger and harder to integrate than is desirable in many cases.
For longer ranges, using the setups above, we've found that a stereo baseline of 15cm (2x the 7.5cm baseline of the OAK-D Series 2), coupled with variable optics, can cover quite a wide range of depth sensing needs. From this testing, we've also found the AR0234 to be quite beneficial for long-range sensing, given its large pixel size (matching the OV9282) at a higher resolution of 2.3MP, which effectively doubles the maximum depth sensing range compared to the OV9282.
The AR0234 also provides the benefit of supporting both global shutter grayscale (for maximizing low-light performance) and global shutter color (for native pixel-aligned RGBD).
The desired maximum depth range varies quite a bit per application - with some situations requiring 100 meters, others 200 meters, and some 300+ meters. (The furthest the Luxonis team has tested with the AR0234 is 350 meters.) Supporting M12-mount lenses in the design enables choosing optics with FOVs (fields of view) corresponding to the required sensing distance.
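To make the lens-choice point concrete, here is a rough sketch of how M12 lens FOV maps to maximum range: for a rectilinear lens, focal length in pixels follows from sensor width and HFOV, and maximum usable range scales with that focal length. The constants (1920 px-wide readout, 15cm baseline, 1 px disparity floor) are assumed illustrative values, not measured specs:

```python
import math

# Sketch of why M12 lens FOV drives maximum depth range: a narrower
# HFOV means a longer focal length in pixels, hence more disparity
# at a given distance. All constants are assumed illustrative values.
WIDTH_PX = 1920.0        # assumed horizontal resolution (AR0234-class)
BASELINE_M = 0.15        # the proposed left/right baseline
MIN_DISPARITY_PX = 1.0   # assumed smallest resolvable disparity

def focal_px(hfov_deg):
    """Rectilinear-lens focal length in pixels from horizontal FOV."""
    return (WIDTH_PX / 2) / math.tan(math.radians(hfov_deg) / 2)

def max_range_m(hfov_deg):
    """Depth at which disparity falls to the resolvable floor."""
    return focal_px(hfov_deg) * BASELINE_M / MIN_DISPARITY_PX

for hfov in (40, 80, 120):
    print(f"HFOV {hfov} deg: f ~{focal_px(hfov):.0f} px, range ~{max_range_m(hfov):.0f} m")
```

The same relation shows the mechanism behind the resolution-driven range gain mentioned above: at equal FOV and pixel size, focal length in pixels scales with horizontal resolution, so a higher-resolution sensor sees more disparity at the same distance.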
Move to the how:

Leverage existing DepthAI ecosystem support for the AR0234 to implement a dual-AR0234 M12-mount OAK-D-LR.
Move to the what:

[NEW; based on feedback below]