A reasonably speedy Python ray tracer

(excamera.com)

162 points | by AlexeyBrin 2377 days ago

14 comments

  • IvanK_net 2377 days ago
    I made one in Javascript a long time ago :) http://renderer.ivank.net/

    After some time you can get this: http://renderer.ivank.net/balls.jpg :)

    Edit: I am glad you like it! I also made this fully-GPU renderer (actually, it is a game): http://powerstones.ivank.net/

    • FeepingCreature 2377 days ago
      I also made one in Javascript. http://feep.life/~feep/jsfarm/info.html

      It uses a Lisp-based scene description language (with macros!) and WebRTC to form a P2P network of compute nodes, entirely in the browser, with near-native performance thanks to dynamic compilation to AsmJS.

      It got 0 votes on Hacker News.

      I'm not salty.

      edit: Source on Github! https://github.com/FeepingCreature/jsfarm/

      edit: I reproduced your scene, give it a bit to render.

      edit: Wow, you have a lot of neat scenes!

      edit: And here you go. rendered at ~2.5 million samples a second, thanks JumboCargoCable whoever you are! (You can set your nick in the settings menu accessible via the gear icon in the top left.) https://i.imgur.com/UvdBhq1.jpg and scene http://bit.ly/2yYciCS though I think I made it too bright.

      edit: Some people appear to have buggy systems that always return black pixels. :-(

      edit: Could whoever is SilkyDoorGame please post their cpu, os and browser?

      • dang 2377 days ago
        If you email hn@ycombinator.com we'll send you a repost invite. I don't want to do it now because once a particular theme (in this case ray tracers) has made the front page it's usually not a good idea to post another one too soon.
      • wdfx 2377 days ago
        That's a great concept. I'd vote for that :)
        • FeepingCreature 2377 days ago
          Yeah, in retrospect my mistake was to bet on a single announcement post instead of a series of smaller posts over the months I developed it. That's the sort of thing you only realize is a mistake after you've made it, though.
          • nathancahill 2377 days ago
            Repost it now with "Show HN:" in the title. Helps distinguish OC work.
      • CyberDildonics 2377 days ago
        Why not just use JSON for scene description?
        • FeepingCreature 2377 days ago
          It's not strictly speaking a "scene description language", it just looks like that at first glance.

          It's a fully capable compiled programming language, which I happen to have written a raytracer in. Check out the "pathtrace" tab.

          The advantage is if there's some issue with the raytracer, you can fix it yourself. And you can use arbitrary scripts for making scenes. (Though the scene in memory must not exceed 32MB, which may limit you somewhat.)

      • IvanK_net 2376 days ago
        BTW, do you perform gamma correction in your pictures? They look like you skipped it.
        • FeepingCreature 2376 days ago
          Nope, they're purely summed up. What do I have to do?
          • IvanK_net 2376 days ago
            The results (R,G,B) that you calculate are physical quantities (amount of photons, or Watts). But when you double the amount of photons, the human eye sees it like 40% brighter, not 2x brighter.

            The RGB values that you give to the monitor (through the canvas) were constructed according to the human eye, so rgb(40,40,40) looks 2x brighter than rgb(20,20,20).

            Long story short, when you have your physical Red value between 0 and 1, do Red = Math.pow(Red,1/2.2);
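            In NumPy terms it's one line at the very end of the pipeline. A minimal sketch (assuming your accumulated image is a float array in linear [0, 1] space; names are illustrative):

            ```python
            import numpy as np

            def to_display(linear, gamma=2.2):
                # Clip the physical (linear) values to [0, 1], apply the
                # inverse gamma curve, then scale to 8-bit display values.
                return (np.clip(linear, 0.0, 1.0) ** (1.0 / gamma) * 255).astype(np.uint8)
            ```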

      • gt_ 2377 days ago
        Great scene description language
  • rossant 2376 days ago
    As the author of the original version, I'm very glad to see such an improvement!

    My IPython Cookbook contains increasingly optimized versions with Cython as an illustration of how to use this library to accelerate Python code. The fastest Cython version is 300x faster than pure Python; a lot of Python/NumPy overhead is bypassed by reimplementing the logic in, basically, C. The OpenMP multicore GIL-releasing version is roughly 4x faster than the fastest Cython version on a quadcore computer. (https://github.com/ipython-books/cookbook-code/tree/master/n...)

    There is also a GPU reimplementation (in OpenGL/GLSL) in the VisPy examples (https://github.com/vispy/vispy/blob/master/examples/demo/glo...), it is animated and runs in real time.

  • Mauricio_ 2377 days ago
    For anyone without any idea how to do this, Ray Tracing in a Weekend is a good introduction. It teaches you to produce the image on the cover in a pretty short time. https://www.amazon.com/gp/product/B01B5AODD8
    • nikofeyn 2377 days ago
      yea, that is definitely a cool book. a few months back i started to go through it doing the implementation in racket but got distracted with other things.

      https://github.com/nikofeyn/ray-tracing-with-racket

      the code should run directly in DrRacket without modification, but i didn't finish the book obviously to get to the cover picture.

      the c++ code in the book is pretty straightforward (he leans on practicality), so it is kind of fun to directly port to a language but then slowly change it to be idiomatic in the target language. i was learning a lot about racket (my first project in it) in the short time i was going through the book. i need to get back to it...

  • berkut 2377 days ago
    While cool, it should be pointed out that this way of ordering things doesn't really scale with scene complexity (more objects, more complex triangular meshes requiring acceleration structures) or image size, as the number of masks required to determine visibility would become very prohibitive.

    One of the great things about raytracing (at least the basics before you get to more complicated light transport), is how simple the normal recursive algorithm is for rendering a scene. This method in the article complicates that greatly with the mask passes, and I guess could be termed a wavefront renderer.
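    To make the mask idea concrete, here's a toy sketch (not the article's actual code) of how visibility gets resolved for a batch of rays with one boolean mask per object:

    ```python
    import numpy as np

    # Hit distances for N rays against two hypothetical objects
    # (np.inf marks a miss).
    t_sphere = np.array([2.0, np.inf, 5.0])
    t_plane = np.array([3.0, 4.0, np.inf])

    nearest = np.minimum(t_sphere, t_plane)
    # One boolean mask per object: True where that object is the
    # closest hit. With many objects you need many such masks.
    hit_sphere = np.isfinite(nearest) & (t_sphere == nearest)
    hit_plane = np.isfinite(nearest) & (t_plane == nearest)
    ```

    Each object is then shaded only where its mask is True, which is why the number of mask passes grows with scene complexity.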

  • Marazan 2377 days ago
    I wrote a pure python (No Numpy) ray tracer as a learning exercise. Spoiler: it was slowwwwwwwwwww.

    I converted to NumPy and it was just slow. I then went to array broadcasting (which required surprisingly few code changes due to NumPy being pretty awesome) and it became fast.

    • tgb 2377 days ago
      I'm familiar with the concept of broadcasting in Numpy, but I don't understand what it means in this context. Can someone explain?
      • Marazan 2376 days ago
        __s basically covers it but my initial switch from pure Python to NumPy just involved changing my vector 'class' (just a tuple in reality) into NumPy arrays. Whilst the basic vector multiplications and additions became way faster, the overhead of creating hundreds of tiny NumPy arrays was a killer.

        So, just like in the original article instead of creating a single 1x3 array I created a Mx3 array where M represented as many rays as I could fit into memory at once (I have quite a weedy machine).

        Due to how NumPy broadcasting works, exactly the same code for, say, subtracting the origin from a ray vector also works to subtract that single origin vector from a multidimensional array of ray vectors.
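        A toy sketch of what I mean (hypothetical names, not my actual code):

        ```python
        import numpy as np

        origin = np.array([0.0, 1.0, 5.0])  # a single 3-vector
        rays = np.zeros((1000, 3))          # M rays, one per row
        rays[:, 2] = 10.0

        # Broadcasting stretches the (3,) origin across every (M, 3) row,
        # so this line is identical whether you have one ray or a million.
        diff = rays - origin
        ```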

        • tgb 2376 days ago
          I see now, thanks.
      • __s 2377 days ago
        Basically map https://docs.scipy.org/doc/numpy/user/basics.broadcasting.ht...

        So OP would've converted from indexing NumPy arrays in Python space to operating on whole arrays at once.

  • m00s3 2377 days ago
    Anyone besides me disturbed that one of the code samples had a function that took 3 parameters, 2 of which were 'O' and 'D'? I had to look at it a few times before I realized those were different variables.
    • willvarfar 2377 days ago
      In 3D programs it's normal for O to be origin and D direction. It's a convention you'll see in most codebases and it's completely undisturbing.
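      For example, a textbook ray/sphere intersection (a generic sketch, not the article's code) reads naturally with that convention:

      ```python
      import numpy as np

      def intersect_sphere(O, D, C, r):
          # Ray: P(t) = O + t * D, with D normalized.
          # Solve |P(t) - C|^2 = r^2 and return the near root if it is
          # positive, else np.inf for a miss.
          OC = O - C
          b = 2.0 * np.dot(D, OC)
          c = np.dot(OC, OC) - r * r
          disc = b * b - 4.0 * c
          if disc < 0:
              return np.inf  # ray misses the sphere entirely
          t = (-b - np.sqrt(disc)) / 2.0
          return t if t > 0 else np.inf
      ```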
      • ci5er 2377 days ago
        Really? Since when?

        It's been a long-o time since I did any 3d physics or rendering code (in C), maybe even since before the WWW, but I don't remember this convention... (I mean - sure - it makes sense, but I don't recall the two 3-space triples being necessarily called that even in things like GL)

        • berkut 2377 days ago
          OpenGL doesn't do raytracing where you have a ray origin and direction though, so you wouldn't have seen it there.

          Using the full terms or shortening them to Dir and Orig are more conventional in my experience.

          It gets even more fun when you get to evaluating BSDFs for materials and you have variables like wi, wo, and different people use them in different ways :)

    • dahart 2377 days ago
      Anytime I start to feel disturbed about anyone else's code, I remember to re-read this: https://news.ycombinator.com/item?id=13571159
    • tomjakubowski 2377 days ago
      You'll get used to it after reading a couple dozen Shadertoy ray marchers.
  • Twirrim 2376 days ago
    Out of curiosity, I did a little digging around. I really need to take a step back and actually understand what is going on fully with the code, but a quick trot through cProfile showed a lot of time spent in the dot method of vec3, primarily via abs(self) under the norm method.

    There's a useful library for python called numexpr, http://numexpr.readthedocs.io/, which can speed up numpy operations, leveraging multiple cores etc. (and Intel's VML library if you have it installed), one I've been aware of but never got around to trying out.

    At a quick stab, it seems like numexpr can't access properties of classes. Either way, modifying dot a bit:

        import numexpr as ne

        def dot(self, other):
            # numexpr can't evaluate attribute lookups inside its
            # expression string, so unpack the components first.
            self_x = self.x
            other_x = other.x
            self_y = self.y
            other_y = other.y
            self_z = self.z
            other_z = other.z
            return ne.evaluate("(self_x * other_x) + (self_y * other_y) + (self_z * other_z)")
    
    
    At 400x400 this slows things down a little. Once you get above about 800x800 it starts to draw equal. By the time you get to 2000x2000 it's shaving some 10% off the execution time.

    edit: here's a quick stab at using numexpr at just the most obvious places, without trying to consider major code refactoring. Note this bumps up the resolution to 2000x2000

    https://gist.github.com/twirrim/64f523fd5e8be86eb392b90e9222...

    compared to rt3 (at same resolution), I knock off over 10% on this 2015 retina mac:

      $ python rt3.py && python rt4.py
      Took 5.78933000565
      Took 4.9023668766
  • webkike 2377 days ago
    In college I built a raytracer in Rust, and I have to say it was one of the most valuable learning experiences I have ever had.
    • tyingq 2377 days ago
      >In college I built a raytracer in Rust

      I don't usually feel old around here. Every once in a while, however...

      • webkike 2377 days ago
        Yeah, I say in college, but that was only a few months ago. College isn't that long; I might as well have said "in high school"
        • khedoros1 2377 days ago
          "In high school" was over 14 years ago, for me ;-) The start of "in college" was about 14 years and one month ago.

          And Rust is "only" 7 years old. Time slips by.

  • ricardobeat 2377 days ago
    Are the reflections in examples like this physically correct? My brain kind of expects the floor to be strongly curved when reflected in the sphere. Maybe it's just the unfamiliar, unrealistic environment?
    • dahart 2377 days ago
      Yes, for perfect mirror surfaces. The scene isn't very physically realistic, but the reflections are doing the right thing given the scene. The floor is strongly curved in the reflections. The reflection of the horizon line isn't, only because the camera is near the floor and looking almost level. If the camera were up higher looking down, the horizon reflection would be more strongly curved.
    • CyberDildonics 2377 days ago
      The simple answer is no, but unless there is an error the rays will at least be traced from the reflection angle. All light you see bouncing off of hard surfaces is some sort of reflection and even for sharp reflections there are a lot of details.

      One is fresnel falloff towards edges, which is simple. Another is depth of field carrying into reflections, which is more difficult and not common (yet).
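      The fresnel part really is simple. A common sketch is Schlick's approximation (n2 = 1.5 is a typical glass-ish IOR, my pick, not from the article):

      ```python
      def schlick_fresnel(cos_theta, n1=1.0, n2=1.5):
          # Schlick's approximation of Fresnel reflectance:
          #   R(theta) = R0 + (1 - R0) * (1 - cos(theta))^5
          # cos_theta is the cosine of the angle between the view
          # direction and the surface normal.
          r0 = ((n1 - n2) / (n1 + n2)) ** 2
          return r0 + (1.0 - r0) * (1.0 - cos_theta) ** 5
      ```

      Head-on (cos_theta = 1) this gives R0, about 4% for glass; at grazing angles (cos_theta near 0) it climbs toward 1, which is the falloff towards edges.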

    • RBerenguel 2377 days ago
      I'm pretty sure the strong curve is because (normal) floors are finite planes, and in real life we usually don't look at a sphere from so close to its equator.

      Once I wrote a raytracer in Lisp based on pg's code in ANSI Common Lisp, so I've seen the results of changing camera positions.

    • berkut 2377 days ago
      It depends on where the camera is, how far the floor extends and the reflection angle (which might be different for different materials / IOR).

      https://imgur.com/a/PCG4O

  • CyberDildonics 2377 days ago
    The demos that come with embree do simple stuff like this in real time, which would be about 450 times faster, so I wouldn't call this 'reasonably speedy'.
    • Marazan 2377 days ago
      'in python'

      No one is ever going to even think of implementing a ray tracer in Python for anything more than fun or education.

    • dahart 2377 days ago
      This one is close to real-time at 115ms. Maybe you missed paragraph 2?
      • gmueckl 2376 days ago
        You could pack the raytracer and the scene into a GLSL fragment shader without any trouble whatsoever and achieve something greater than 30fps out of the gate.

        Check out https://www.shadertoy.com/ - all of this crazy stuff is generated using two triangles and a fancy shader. Almost every 3D scene there is generated using some form of ray casting or ray tracing on the GPU.

      • CyberDildonics 2376 days ago
        It does that by calling into NumPy, which is native.
  • melling 2377 days ago
    I’ve got a list of random resources here:

    https://github.com/melling/ComputerGraphics/blob/master/ray_...

  • gravypod 2377 days ago
    I'd be interested in how this compares to one optimized by Numba.
    • mathgenius 2377 days ago
      Yes, or perhaps go directly to llvmlite. Good wholesome fun.
  • make3 2377 days ago
    now do a Tensorflow version!