
  • etaioinshrdlu 4 days ago

    It looks like you're using an iCE40 FPGA?

    If you can make your project work with 38 I/O pins you could probably get it fabricated as an ASIC by Efabless for free. You just need to meet their repository requirements and make your Verilog module conform to this interface: https://github.com/efabless/caravel/blob/master/verilog/rtl/...

    The OpenLane tool converts your Verilog all the way down to the final ASIC design...

    You'd need to license everything Apache 2.0, though. And it would have to be done soon; the deadline is Nov 30th.

    • nickmqb 4 days ago

      In terms of I/O pins that should actually be fine (current pinout: VGA (14x), SPI (4x), CLK (1x), N64_controller (1x)), though I'm still working on the project and I don't think it'll be done by the deadline. It looks like they might do future batches though -- it's a cool idea!

      • primis 4 days ago

        If you're using an N64 controller, you might want to consider using a GameCube one instead (it uses the same control logic, it just adds a 5V line for rumble features).

        Unless you're particularly fond of the N64's controller design, that is.

        • nickmqb 4 days ago

          That's a good idea. The two analog sticks on the GC controller would be an improvement over the single stick on the N64 controller for movement in 3D. I think that the main benefit of the N64 controller (besides a nostalgia factor, though that may just be me ;)) is how easy it is to connect. I actually just got some wire from the local hardware store, plugged pieces of it into the controller connector, and then attached some IC clips. For the GC controller, things are a bit trickier due to its connector layout, though I just found [1] which might be a nice solution; alternatively, buying a GC controller extension cord and wire stripping could be an option. I'll consider it!

          [1] https://www.raphnet-tech.com/products/gc_controller_connecto...

          • retro_guy 3 days ago

            Speaking of the N64 and FPGAs: did you know that someone is working on an FPGA implementation? Link: https://twitter.com/ufp64

            Hmm… Hardware Minecraft… I also wonder what it would take to create a PICO-8 (a popular fantasy console) implementation in hardware, with a Lua CPU…

            • MaxBarraclough 2 days ago

              Impressive. Any chance that project will get a Cease and Desist from Nintendo? I don't know much about the history of hardware projects like that.

            • asddubs 4 days ago

              A less clean solution is also to buy controller extension cords (which are usually cheap, even for older systems) and chop off the end.

              • Craighead 4 days ago

                The GC controller is arguably the most ergonomic and capable controller ever made

            • etaioinshrdlu 4 days ago

              It's just as well :) None of the code for the submission process they have you use actually works. It's rather insane what they are asking developers to do.

              • mysterydip 3 days ago

                Don't you only need 5 pins for VGA: R, G, B, HSync, and VSync?

                • nickmqb 3 days ago

                  VGA is an analog protocol, but the FPGA can only output a 0 (GND) or 1 (3.3V) on its I/O pins. I'm using a Digilent VGA Pmod [1], which uses a set of resistor ladders to map each color component from a 4-bit value to an analog voltage that goes to the monitor. This means that we have 14 pins: R (4x), G (4x), B (4x), HS (1x) and VS (1x).

                  [1] https://store.digilentinc.com/pmod-vga-video-graphics-array/
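
                  For the curious, the FPGA side of that is conceptually just 12 color bits plus 2 sync bits; a minimal Verilog sketch (illustrative only, not my actual module or signal names):

                    module vga_out (
                        input  wire       pix_clk,              // pixel clock
                        input  wire [3:0] r, g, b,              // 4-bit color from the renderer
                        input  wire       hs, vs,               // sync pulses from the timing generator
                        output reg  [3:0] vga_r, vga_g, vga_b,  // 12 pins -> the Pmod's resistor ladders
                        output reg        vga_hs, vga_vs        // 2 sync pins (purely digital, no DAC)
                    );
                        // The Pmod's resistor ladders turn each 4-bit value into an analog level
                        // between 0 V and ~0.7 V; the FPGA itself only ever drives 0 V or 3.3 V.
                        always @(posedge pix_clk) begin
                            {vga_r, vga_g, vga_b} <= {r, g, b};
                            {vga_hs, vga_vs}      <= {hs, vs};
                        end
                    endmodule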

                  • mysterydip 3 days ago

                    Oh duh, I should've remembered that. So 12-bit VGA then. From the demo that seems sufficient. Have you tried getting rid of the LSB to see if there's a noticeable difference? I'm curious how few colors you actually need.

                    • nickmqb 3 days ago

                      Most textures look fine despite the 4-bit quantization, but color artifacts/hue changes do become more apparent when multiplying by a light factor (to darken the textures on the bottom and sides of blocks), so I'd say 4-bit is definitely pushing the limit. 6 bits or even 8 bits per color component would be ideal, though unfortunately on a small FPGA like this we cannot afford such luxuries ;). It is probably the first thing I'd change if I were to port this to a larger FPGA, since it would be pretty straightforward to do and the increase in visual quality would likely be worth it.

            • throwaway122kk 4 days ago

              This looks very good, well done! Microsoft needs to hire you for the team ASAP!

              It always cracks me up playing Minecraft on Xbox One X: once your village hits a few hundred villagers, the framerate (shown on screen when running the beta from the Insider program) drops to like 3-4 fps.

              • makapuf 4 days ago

                Wow, any details? Some code? A repo? What graphical output, VGA or HDMI? What input? This is very intriguing...

                • nickmqb 4 days ago

                  The screen is connected to the FPGA over VGA. The output resolution is 1024x768 @ 60Hz, but the 3D portion of the screen is rendered at 256x128 @ 30Hz. The design consists of a custom 16-bit CPU (running at 32.6Mhz) and a custom raytracing "GPU" that can handle up to 4 rays in parallel.
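
                  For reference (these are just the standard VESA numbers for this mode, not anything project-specific), 1024x768 @ 60Hz implies a pixel clock of roughly 65 MHz:

                    (1024 + \text{blanking}) \times (768 + \text{blanking}) \times 60 = 1344 \times 806 \times 60 \approx 65\,\mathrm{MHz}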

                  Input happens via an N64 controller! Those are actually fairly easy to work with at a low level.

                  The code is not public, though I'm considering open sourcing the project when it's done. Moreover, there are a lot of additional details that I could potentially go into, so I'm also considering writing a few blog posts with more info if people are interested!

                  • guiambros 4 days ago

                    Pretty impressive; definitely interested in hearing more.

                    Did you use your Wyre [1] language to develop it? I saw the examples on GitHub, and it seems pretty interesting. It cuts quite a bit of the verbosity of Verilog. I'll give it a try.

                    [1] https://github.com/nickmqb/wyre

                    • nickmqb 4 days ago

                      Yes, I'm using Wyre for this. Feedback is always welcome, so once you've had a chance to try it, don't hesitate to let me know what you think!

                      • gravypod 4 days ago

                        Wow! This is really cool, thank you for open sourcing this! Do you have any posts talking about how you got started with FPGAs? I'm a SWE and want to get into this area. Your language seems to map directly to what I was sort of expecting to see in the HDL/VHDL world but wasn't finding. You're making me want to buy a VGA monitor and an iCE.

                        • nickmqb 4 days ago

                          Just do it ;). I don't have a blog post on how I got started, but you gave me the idea that perhaps I should write one.

                          In terms of hardware, I'm using an iCEBreaker dev board [1], which has worked really well (a nice bonus is that the board has a 16MB flash chip that can be used for storage -- the Minecraft clone loads the map and textures from it on startup).

                          On the software side, I'm using the open source yosys/nextpnr/icestorm toolchain which is a lot faster than the vendor supplied tools. I mostly figured things out by just trying stuff, so a high iteration speed definitely helped here!

                          [1] https://1bitsquared.com/products/icebreaker

                          • ngcc_hk 4 days ago

                            Definitely write one. I'll be waiting for it. It would be great.

                    • anfractuosity 4 days ago

                      It looks very cool! Out of interest, what are the GPU & CPU written in? I think I saw on your Twitter that you've written something that translates to Verilog -- is that used for this?

                      • nickmqb 4 days ago

                        Yup, that's correct, the design is implemented in Wyre, which is a hardware definition language that I created. The language compiles to Verilog so it can work with existing hardware development toolchains. The language is open source and can be found here: https://github.com/nickmqb/wyre

                        • OJFord 3 days ago

                          I like the sound of that, but my Verilog's rusty. A suggestion for the readme: show what the equivalent Verilog would be for the example, or better, I suppose, the actual Verilog that it would transpile to.

                          I suppose your target audience is mainly people who are more familiar with Verilog (though not necessarily -- they could have only ever used VHDL), but I'm interested in playing with it; I just haven't used Verilog, or FPGAs at all, since university.

                    • kingosticks 4 days ago

                      Very interested, please do!

                  • young_unixer 4 days ago

                    Are you using Minecraft's textures? Because those look very similar to Minecraft's actual textures.

                    • nickmqb 4 days ago

                      Yes, I am. This also means that any open source distribution won't include those textures. However, if I do end up open sourcing the work I'll make sure to include instructions for people that already own Minecraft; the textures are just .png files that can be extracted from the game's .jar file and can then be transformed to be used on the FPGA.

                      • guavaNinja 3 days ago

                        I suggest you look at Minetest [1] textures. It's an open source Minecraft clone. Most textures have CC or MIT licenses. Read the license.txt for each texture mod before you use them. They may help in open sourcing your project.

                        [1]: https://github.com/minetest/minetest_game

                        • franga2000 3 days ago

                          I can't remember any off the top of my head, but I'm pretty sure there are a few open source texture packs for Minecraft that would be a drop-in replacement (same filenames and structure).

                      • takenpilot 4 days ago

                        This is insane and I love it.

                        • layoutIfNeeded 4 days ago

                          Wow, this is some Fabrice Bellard tier stuff! Very impressive!

                          • mentos 4 days ago

                            How is the rendering performance with large worlds?

                            • nickmqb 4 days ago

                              The FPGA that I'm using for this (the Lattice iCE40 UP5K) is really limited when it comes to RAM, which is the main constraint when it comes to world size. As per the title, there's only 143kb, which is insanely low for doing any kind of 3D stuff :). 48kb is used by the frame buffer, 19kb for textures, which leaves 76kb. 48kb of that is used for the map, which currently limits it to 32x32x32 blocks. However, I do have some plans to improve on that in the future!
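
                              For the curious, the frame buffer figure falls straight out of the resolution, assuming it stores the same 12 bits per pixel (4 per channel) that go out over VGA:

                                256 \times 128 \times 12\ \mathrm{bits} = 393{,}216\ \mathrm{bits} = 48\,\mathrm{kB}
                                143 - 48 - 19 - 48 = 28\,\mathrm{kB}\ \text{left over after the map}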

                              The FPS (30Hz) is rock steady though! One of my pet peeves when doing DirectX/OpenGL development is that it's really hard to completely avoid frame drops, e.g. if the OS decides to schedule some other thread, or because the GPU driver decides to do something other than render your app. With hardware development, you can sidestep all of those problems. As a result, the Minecraft clone is guaranteed to not drop frames :).

                              • gnramires 3 days ago

                                Have you thought of going Shadertoy style and doing everything procedurally? Or every block procedurally? That way you can cut RAM as much as you wish. For example, if you have a procedural formula to determine whether a block is populated, you don't need to store it in RAM; just use that formula in the renderer directly (in Shadertoy this would usually be evaluated per pixel).
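
                                Roughly what I mean, as a toy Verilog sketch (the formula is completely made up, just to show that a combinational function can stand in for a stored map; a formula you'd actually want, like smooth noise, takes a lot more arithmetic):

                                  module proc_world (
                                      input  wire [4:0] x, y, z,   // block coordinates in a 32x32x32 world
                                      output wire       solid      // 1 if the block at (x, y, z) exists
                                  );
                                      // A cheap per-column "terrain height" hash; no RAM is used, the answer
                                      // is recomputed every time the renderer asks about a block.
                                      wire [4:0] h = (x * 5'd7) ^ (y * 5'd13) ^ 5'd9;
                                      assign solid = (z < (h & 5'h0F));
                                  endmodule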

                                • nickmqb 2 days ago

                                  It did cross my mind. However, a problem with that approach is that evaluating such a formula is too costly/inaccurate on a small FPGA like this, which just has 8 DSPs (that can only do 16x16 bit multiplication), and some of these are already in use in other parts of the design.

                                • jfries 4 days ago

                                  If you have a guarantee on the worst case for generating a pixel (which you indicate by saying that you never drop frames), couldn't you get rid of the framebuffer? Schedule pixel generation so that each pixel completes just in time for when it's needed for output.

                                  This would free up RAM for other things (and be a fun exercise to get right).

                                  • nickmqb 4 days ago

                                    That's a good observation! This technique is also known as "racing the beam". The problem is a mismatch of refresh rates; the VGA display operates at 60 Hz but the ray tracer is not capable of producing that many pixels, it can only do 30 fps. So we need a frame buffer to store the result.
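
                                    Concretely, the buffer just decouples the two rates (the sketch below is illustrative, with made-up names and widths): the ray tracer fills it at its own 30 Hz pace while scanout reads it at the 60 Hz pixel rate, so each rendered image is simply shown for two display frames.

                                      module framebuffer #(
                                          parameter AW = 15                   // 2^15 = 32768 = 256*128 pixels
                                      )(
                                          input  wire          wr_clk,        // renderer side: finishes a frame every 1/30 s
                                          input  wire          wr_en,
                                          input  wire [AW-1:0] wr_addr,
                                          input  wire [11:0]   wr_data,       // 12-bit color
                                          input  wire          rd_clk,        // scanout side: read out every 1/60 s
                                          input  wire [AW-1:0] rd_addr,
                                          output reg  [11:0]   rd_data
                                      );
                                          reg [11:0] mem [0:(1 << AW) - 1];
                                          always @(posedge wr_clk) if (wr_en) mem[wr_addr] <= wr_data;
                                          always @(posedge rd_clk) rd_data <= mem[rd_addr];
                                      endmodule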

                                  • Impossible 4 days ago

                                    On consoles and embedded systems the OS is either non-existent or gives your game guarantees about when it will schedule something and how often. Hardware obviously gives you way more control, but a bare-metal Raspberry Pi project, Arduboy, console homebrew, etc. can give you some of that control back in software. Awesome project btw.

                                    • roblabla 3 days ago

                                      As an example of this: on the Nintendo Switch, games run on three cores dedicated to the game, while the rest of the OS tasks run on the fourth core. Furthermore, their scheduler gives precise guarantees about the scheduling order of threads spawned on the same core. That makes sustained 60fps achievable through careful design.

                                    • mentos 4 days ago

                                      Thanks for the reply, incredible work!

                                      Are there any FPGAs out there with an order of magnitude more memory that you've considered?

                                      • nickmqb 4 days ago

                                        There are definitely FPGAs that are a lot more capable than the FPGA that I'm using. For example, see my reply to flatiron here: https://news.ycombinator.com/item?id=25172781

                                        While it would have been easier to use a larger/faster FPGA, part of the fun of such a low level project is to work within harsh constraints and see what can be done regardless :).

                                  • b20000 4 days ago

                                    This is super awesome! Did you build the raytracing stuff from the ground up?

                                    • nickmqb 4 days ago

                                      Thanks! That's correct, I built the ray tracing "GPU" from scratch. It's highly optimized because it needs to trace 256 * 128 * 30 = 0.98 million rays per second on very underpowered hardware. It's specifically tailored to fast traversal of voxel grids. There are too many details to go into here, but as I wrote in another comment, I'm considering writing a few blog posts to explain how everything works in more detail!
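
                                      As a rough ballpark -- assuming, purely for illustration, that the ray units are clocked at the same 32.6 MHz as the CPU (not something stated here) -- that throughput leaves on the order of a hundred cycles per ray across the 4 units:

                                        \frac{4 \times 32.6 \times 10^{6}\ \mathrm{cycles/s}}{0.98 \times 10^{6}\ \mathrm{rays/s}} \approx 133\ \mathrm{cycles\ per\ ray}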

                                      • Shared404 4 days ago

                                        > I'm considering writing a few blog posts to explain how everything works in more detail!

                                        If you do, I'd love to read them.

                                    • MaxBarraclough 2 days ago

                                      Minecraft on 40mW. Very cool.

                                      • flatiron 4 days ago

                                        Did you consider porting it to MiSTer?

                                      • vaccinator 4 days ago

                                        A Minecraft clone in hardware means all the code is hardware?

                                        • nickmqb 4 days ago

                                          Yes and no. The design includes a custom-built 16-bit CPU, which uses a custom instruction set that I wrote an assembler for. There is a small 4kb bank of RAM that contains a program written in this instruction set. From a hardware perspective it's just data, but from a software perspective it's that program that is ultimately responsible for running the game (by reading input from the gamepad module, setting up GPU registers, etc.).
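
                                          To give a flavor of what that looks like (purely illustrative -- the addresses, names and widths below are invented, not the actual design), memory-mapped I/O is one common way to wire a small CPU to the gamepad module and the GPU registers:

                                            module mmio (
                                                input  wire        clk,
                                                input  wire        cpu_wr,       // CPU write strobe
                                                input  wire [15:0] cpu_addr,
                                                input  wire [15:0] cpu_wdata,
                                                input  wire [15:0] pad_state,    // latched buttons/stick from the gamepad module
                                                output reg  [15:0] cpu_rdata,
                                                output reg  [15:0] gpu_ctrl      // example GPU control register
                                            );
                                                localparam PAD_ADDR = 16'hFF00, GPU_ADDR = 16'hFF10;
                                                always @(posedge clk) begin
                                                    if (cpu_wr && cpu_addr == GPU_ADDR)
                                                        gpu_ctrl <= cpu_wdata;                        // program sets up the GPU
                                                    cpu_rdata <= (cpu_addr == PAD_ADDR) ? pad_state   // program reads the gamepad
                                                                                        : 16'h0000;
                                                end
                                            endmodule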

                                          • function_seven 4 days ago

                                            Yup.

                                            The assembly language for this chip will be redstone :)