Intel Takes Lidar Indoors

(zdnet.com)

86 points | by stambros 1597 days ago

14 comments

  • noen 1595 days ago
    I'm done being excited by anything out of Intel that isn't a desktop/laptop CPU.

    I've been burned personally and professionally with every single Intel IoT device I've touched.

    Remember the Edison platform? Huge promises and possibilities that turned into fatally flawed silicon, which took Intel two years to admit.

    The Compute Stick? The whole Atom ecosystem?

    I was so excited by the RealSense cameras that we bought a bunch of them, and the quality was so poor we assumed we must have gotten a bad batch. The hardware was so bad compared to similarly priced machine vision cameras that it was astounding.

    The SDK was great at first glance: a really easy OOBE for multi-camera setup and PCL processing. Then over a few weeks you discover how flaky everything is and how brittle the SDK and drivers are (like every other Intel dev platform, it seems), and after spending thousands of dollars on hardware and hundreds of hours of dev time, you finally chuck it all in a bin and say "I will never buy Intel crap again" for the third time.

    Hopefully this Lidar device will buck that trend, but I doubt it. They keep making random IoT hardware platforms with seemingly no long-term strategy and no path to commercial implementation.

    • snovv_crash 1595 days ago
      Similar experience with RealSense, except that I need to run them in USB 2 mode due to GPS interference from USB 3. Their USB 2 interface isn't just USB 3 at lower bandwidth; no, it has a bunch of completely unrelated bugs that only appear on USB 2. Moreover, even if you turn down the framerate to rule out a bandwidth issue, the point cloud quality on USB 2 is worse than on USB 3. No idea why, but we've binned our RealSense cameras too.
    • Reventlov 1595 days ago
      My main problem with non-"classic" Intel products is the shit user experience that comes with them: I don't want to use Ubuntu 16.04, just package software in a maintainable way, ffs.
      • rubicks 1595 days ago
        When Intel touts a device "On Linux!(tm)", I have to lower my already meager expectations. So long as you expect the drivers to be very thin open-source wrappers around very brittle proprietary blobs, you won't be unpleasantly surprised.
      • ptsneves 1595 days ago
        Which is ironic, given they used to push hard for Yocto, where everything is a recipe that builds from source.
    • ptsneves 1595 days ago
      I feel you. If it makes you feel any better, they don't dump only on small customers: their Axxia business screws its customers just as badly. Worst vendor in the industry, and we're talking about multi-million-dollar businesses.
  • stefan_ 1595 days ago
    I love the Intel RealSense stuff, you get very cutting edge sensor silicon at consumer gear prices with open-source software to go along with it.

    The only real problem is that I have no clue why Intel is in this business, and I suspect they won't be for much longer.

    • rubicks 1595 days ago
      The Realsense hardware might be great. The software that goes with it is not:

      * DKMS kernel module for what should be a plain vanilla USB 3.0 device

      * firmware updates require closed-source libraries

      * breaking API and ABI changes that do not respect semver or SOVERSION

      The worst part of my dayjob is wrangling the Realsense software suite.

      • TaylorAlexander 1595 days ago
        I've met someone who is constantly asking me "why haven't you tried RealSense?" and you just confirmed my suspicions. When the first RealSense products came out, they only supported Windows. This is madness for a robotics-focused product. Finally my friend tells me they now support Linux, but for me the damage has already been done. They have proven that they don't understand me as a robotics engineer, and you've just confirmed that for me. So I stick with trying to use high-resolution cameras and structure-from-motion algorithms to understand the world (a rough two-view sketch of what I mean is at the end of this comment). No need for a specific proprietary piece of hardware. Since I'm mostly doing research into what is possible, I prefer this non-proprietary approach.

        This little lidar looks nice but the last thing I need is another weird kernel module and some closed source library to support my hardware. No thanks.
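
        For concreteness, here's a hypothetical two-view structure-from-motion sketch with OpenCV. The image paths and the intrinsics matrix K below are placeholders, and a real pipeline would add matching across many views plus bundle adjustment on top:

        ```python
        # Hypothetical two-view SfM sketch: match features, recover relative pose, triangulate.
        # Image files and the intrinsics K are placeholders, not real calibration values.
        import cv2
        import numpy as np

        img1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)
        img2 = cv2.imread("frame2.png", cv2.IMREAD_GRAYSCALE)
        K = np.array([[700.0, 0, 640], [0, 700.0, 360], [0, 0, 1]])

        # Detect and match ORB features between the two frames.
        orb = cv2.ORB_create(2000)
        kp1, des1 = orb.detectAndCompute(img1, None)
        kp2, des2 = orb.detectAndCompute(img2, None)
        matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
        pts1 = np.float64([kp1[m.queryIdx].pt for m in matches])
        pts2 = np.float64([kp2[m.trainIdx].pt for m in matches])

        # Estimate the essential matrix and the relative camera pose.
        E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
        _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

        # Triangulate matched points into an up-to-scale 3D point cloud.
        P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
        P2 = K @ np.hstack([R, t])
        pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
        pts3d = (pts4d[:3] / pts4d[3]).T
        ```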

        • rubicks 1595 days ago
          I mean... your friend's not wrong, if by "Linux" she/he means "Ubuntu 16.04 LTS" with the caveats:

          * disable Secure Boot xor create your own EFI signing key pair, get friendly with `mokutil`, and pray your firmware's UEFI implementation supports that complicated custom KEK

          * Ubuntu 18.04 support exists, but you have to forcibly install at least one 16.04 package they couldn't be bothered to build for the latest stable release of their chosen distro, or `patchelf` the shared object and, again, pray

          * accept that the debugging symbols they provide still bear the source paths from the Jenkins instance that packaged them

          * Oh, yeah; sometimes the device is detected as USB 2.1. That's fun when it happens two hours into a calibration run.

          They're good if all you need is a flaky proof of concept. It sounds to me like you require something better.

          • TaylorAlexander 1594 days ago
            Damn that sounds awful. Yeah, I don’t need another headache. I can imagine times where the hardware is the right tool for the job, but with all those hoops you have to jump through to make it work I’d avoid that at all costs.
      • throwaway6734 1595 days ago
        If you feel comfortable sharing, what kind of work are you doing?
    • mpoteat 1595 days ago
      Intel is a big company, but there's also the background context of their fierce competition (which they are arguably losing) with other chip makers in the enthusiast space. Maybe that would explain an attempt at more green-field R&D projects.
    • _trampeltier 1595 days ago
      This market hasn't really started yet. It's just a bet on the future.
    • cyorir 1595 days ago
      Why wouldn't they stick with this business? It's a complementary good to their vision-processing products. Some developers may be attracted by the option of buying this together with something like the Neural Compute Stick.
      • sbierwagen 1595 days ago
        They sold that, more or less: the Intel Euclid, a Linux SBC and RealSense camera in one package https://www.intel.com/content/www/us/en/support/articles/000...

        I bought one. Bad thermals and weirdly flaky wifi, thanks to the overheating. To get it to run the fan fast enough you had to manually edit the startup scripts. Intel quietly killed it a year later, and it appears to be totally unsupported.

  • BubRoss 1595 days ago
    https://newsroom.intel.com/news/intel-realsense-lidar-camera...

    Here is the actual press release instead of the ZDNet rehash.

    https://www.intelrealsense.com/lidar-camera-l515/

    This is the actual page for this camera.

  • opwieurposiu 1595 days ago
    Previously the RealSense stereo depth cameras suffered from a lot of depth noise compared to ToF cameras like the Kinect. I had to use a lot of filtering, which limited the usable frame rate. Hopefully this new lidar cam has less noise.

    The RealSense API is pretty good; I found it much easier to use than the Kinect API.
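
    For a sense of what that filtering involves, here's a minimal hypothetical sketch with the pyrealsense2 bindings; the stream settings and default-constructed filters are illustrative, not the tuned values I actually used:

    ```python
    # Hypothetical minimal depth pipeline with the librealsense post-processing filters.
    # Settings below are illustrative defaults, not tuned values.
    import pyrealsense2 as rs

    pipeline = rs.pipeline()
    config = rs.config()
    config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
    pipeline.start(config)

    # SDK post-processing filters; chaining them reduces noise at the cost of frame rate.
    decimation = rs.decimation_filter()   # downsample the depth image
    spatial = rs.spatial_filter()         # edge-preserving spatial smoothing
    temporal = rs.temporal_filter()       # smooth across consecutive frames

    try:
        while True:
            frames = pipeline.wait_for_frames()
            depth = frames.get_depth_frame()
            if not depth:
                continue
            for f in (decimation, spatial, temporal):
                depth = f.process(depth)
            # ...hand the filtered depth frame to point-cloud / downstream processing
    finally:
        pipeline.stop()
    ```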

    • echelon 1595 days ago
      Can you use multiple sensors with a spherically overlapping FOV?

      Kinect for Azure purports to support overlapping FOVs (whereas the Kinect 1 for Xbox did not).

      • opwieurposiu 1592 days ago
        The stereo RealSense cameras like the D415 support overlapping FOV. They also have a way to use a sync cable to sync the shutters.
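
        A hypothetical sketch of enabling that through the pyrealsense2 bindings (1 = master, 2 = slave is the convention the SDK documents for the D400 series; double-check against your firmware):

        ```python
        # Hypothetical sketch: put the first D400-series camera in master sync mode
        # and the rest in slave mode so the sync cable drives their shutters.
        import pyrealsense2 as rs

        ctx = rs.context()
        for i, dev in enumerate(ctx.query_devices()):
            sensor = dev.first_depth_sensor()
            if sensor.supports(rs.option.inter_cam_sync_mode):
                sensor.set_option(rs.option.inter_cam_sync_mode, 1 if i == 0 else 2)
        ```
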
  • Animats 1595 days ago
    There have been lots of little indoor LIDAR units. The SwissRanger, around 2005, was one of the early ones. The Kinect, version 2, is one. The Kinect, version 1, used a random dot pattern projector and an IR camera for triangulation. Intel made something similar, the RealSense.

    So far, the most popular use for these things is video background removal, allowing "green screen" type effects without needing an actual green screen.

    • BubRoss 1595 days ago
      Is that actually common? Basing a matte off of the depth map would be extremely noisy and low resolution without some big time filtering.
      • Animats 1595 days ago
        • hmottestad 1595 days ago
          BubRoss asked if it would be low res. This video shows that it is extremely low res.

          Using lidar instead of a green screen won't get you the professional-grade background removal you see on TV.

  • swiley 1595 days ago
    I really wonder how safe lidar actually is for humans. Our retinas are sensitive enough to detect single photons (when healthy), and lidar is known to damage digital camera sensors.
    • manmal 1595 days ago
      It really depends on the amount and duration of exposure. E.g. lasers that are pulsed won't warm or irritate tissue as much as continuous radiation does. Some energy is also absorbed by the eyeball before it reaches the retina. The wavelength also plays a part, with some wavelengths penetrating water (i.e. tissue) better than others. Laser light is more dangerous than e.g. ordinary LED light because the emitted photons are coherent (in both the spatial and temporal dimensions), so it is more efficient at heating things up or at reacting with chromophores in cells. But if the energy arriving at the retina is low enough, this is no issue.
    • deepnotderp 1595 days ago
      Lidar damages CMOS image sensor pixels due to thermal effects. Your eyes have ample cooling capability to deal with that.
    • bdamm 1595 days ago
      In addition to safety, I wonder about interference. Wouldn't lidar become ineffective if there's so much lidar around that all the sources start interfering with each other, effectively blinding every receiver with noise? I really wonder how lidar-based autonomous agents plan to deal with this problem. It seems fundamental.
      • namibj 1595 days ago
        Usually not. They already need resilience against ambient light, so they are either very dim and rely on coding gain, or they use short pulses that only leave a short time window for valid returns. You basically don't get non-malicious interference issues, except with e.g. the dot-projector systems.

        Real ToF sensors can easily filter out accidental noise. You can often spoof them, however, and there's not much one can do about that, considering a blinding DoS is often technically easier (track the LIDAR with a camera to keep the laser pointer on target).

      • swiley 1595 days ago
        They probably use PRN codes with low self correlation.
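
        As a toy illustration of why that would help (just numpy, not Intel's actual modulation): two independent pseudorandom codes correlate strongly with themselves at zero lag but only weakly with each other, so a coded receiver can pick its own returns out of another unit's light.

        ```python
        # Toy illustration: autocorrelation vs. cross-correlation of two random +/-1 codes.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 1023
        code_a = rng.choice([-1.0, 1.0], size=n)
        code_b = rng.choice([-1.0, 1.0], size=n)

        auto = np.correlate(code_a, code_a, mode="full") / n    # peak of 1.0 at zero lag
        cross = np.correlate(code_a, code_b, mode="full") / n   # stays near 1/sqrt(n)

        print("self-correlation peak:", auto.max())              # ~1.0
        print("worst cross-correlation:", np.abs(cross).max())   # ~0.1 for n=1023
        ```
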
  • echelon 1595 days ago
    I wonder if multiple sensors can be used with overlapping FOV. The website claims,

    >> Can multiple L515 cameras be used simultaneously?

    > Multiple cameras can share the same field of view utilizing our hardware sync feature.

    I really want to get accurate 3D spherical volumes in real time (30 fps is sufficient; 60 fps would be ideal).

    I've thought about using Kinect "for Azure", because I think it satisfies this use case and does hardware clock syncing between devices:

    https://azure.microsoft.com/en-us/services/kinect-dk/

    Edit: It looks like their RealSense cameras can be set up in an inward-facing configuration:

    https://dev.intelrealsense.com/docs/multiple-depth-cameras-c...

    • kypro 1595 days ago
      I'm working on a RealSense project at the moment. You won't be able to do it out of the box, but their SDK does come with a lot of sample code, including one example that uses the RGB sensor on the D400 series to calibrate the cameras in world space. With just depth data it's a bit trickier.
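
      For the multi-camera plumbing at least, a hypothetical pyrealsense2 sketch looks something like this; the calibration into a shared world frame is the part you still need the SDK samples (or your own RGB-based routine) for:

      ```python
      # Hypothetical sketch: open one depth pipeline per connected RealSense camera.
      # Fusing the clouds still requires per-camera extrinsics from a calibration step.
      import pyrealsense2 as rs

      ctx = rs.context()
      serials = [dev.get_info(rs.camera_info.serial_number) for dev in ctx.query_devices()]

      pipelines = []
      for serial in serials:
          config = rs.config()
          config.enable_device(serial)  # bind this pipeline to a specific camera
          config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
          pipe = rs.pipeline(ctx)
          pipe.start(config)
          pipelines.append(pipe)

      # Grab one frame set per camera.
      for pipe in pipelines:
          frames = pipe.wait_for_frames()
          depth = frames.get_depth_frame()
      ```
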
  • fnord77 1595 days ago
    $349 for pre-order on their store. Shipping next April
  • bluegreyred 1595 days ago
    Am I cynical to expect one of these in every Echo/Home/Portal "assistant" within a decade? You know, strictly for 3D-avatar VR communication purposes only.
  • melling 1595 days ago
    Direct link to Intel’s announcement with a video.

    https://newsroom.intel.com/news/intel-realsense-lidar-camera...

    The camera is the size of a tennis ball.

    There are probably lots of industrial uses.

    • ganzuul 1595 days ago
      ~~Ooh, it's the solid-state LIDAR tech I heard about a couple of years ago! They must have bought the company that invented it.~~

      ~~The price is also just around where they expected it to be. They talked about going down to 100 eurodollars per unit when they hit mass manufacturing.~~

      Edit: No, this is a MEMS device. The device I'm talking about is actually solid-state, scanning the laser by way of, IIRC, acousto-optic modulation. Car companies were interested in it.

  • justinclift 1595 days ago
    I wish I had the spare time to try hooking some of these into some kind of machine vision system for automatically verifying that an object being made (3D printer / CNC) matches what was intended.

    It'd help with automating production, but I'm not sure it'd be worth the effort.

  • georgeburdell 1595 days ago
    I don't know why they are in this business, but a cheap lidar camera is very interesting to me from a computer vision / home robotics standpoint. Here's hoping for a long life for this product line.
    • kypro 1595 days ago
      It's because of the vision processing chip. My understanding is that many of the Windows facial unlock cameras are powered by Intel vision chips.
  • azinman2 1595 days ago
    Is this doing 360 plus Y as well?
  • huffmsa 1595 days ago
    I don't think it has enough resolution to help them find their 10nm design.