> While this may look like code in a normal programming language, it operates entirely differently.
So happy to see this called out up-front. One of the hardest things with FPGAs as a developer is understanding that you're reconfiguring hardware rather than writing a serial stream of opcodes. The two look superficially the same but have very different requirements and constraints under the hood.
I think there are many parallels between functional programs and electronic circuits. Many software design concepts and disciplines can be applied to digital design. The idea of actors and message passing maps very naturally onto digital design.
Combinational logic is functional as long as you keep feedback loops out of it. For a given input, you will always get a particular output. Eventually. Time is a factor.
Unlike software, this bit of functional logic is "on" or "active" all the time though. At any given moment, there is something at the input and something at the output. It's like a pair of sunglasses. They are always sunglassing, light is going through them and being filtered, whether you are using them or not.
Adding state is necessary for anything but the most trivial designs. And like software, state makes things more complicated. We call the logic that implements state "sequential" logic. By that token, instead of Combinational, we could call it Concurrent logic.
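To make the combinational/sequential split concrete, here's a minimal Verilog sketch (module and signal names are mine, purely illustrative):

```verilog
// Combinational: a pure function of its inputs, "always on".
// Change a, b or cin and sum/cout settle after some propagation delay.
module adder1 (
    input  wire a, b, cin,
    output wire sum, cout
);
    assign {cout, sum} = a + b + cin;
endmodule

// Sequential: state is sampled on the clock edge, so the output
// depends on history, not just on the current inputs.
module counter #(parameter WIDTH = 8) (
    input  wire             clk, rst,
    output reg [WIDTH-1:0]  count
);
    always @(posedge clk) begin
        if (rst) count <= {WIDTH{1'b0}};
        else     count <= count + 1'b1;
    end
endmodule
```

The first module is the "sunglasses": it filters its inputs continuously whether you look at it or not. The second only changes on clock edges, which is where all the concurrency bookkeeping comes from.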
Actors could be a logical grouping of combinational and sequential logic. Pure functions and state. With ports for messaging.
That's still an incomplete model, though, because each layer of abstraction can introduce path delay and resource usage.
Functional programming doesn't encapsulate interfacing across different clock domains, or one-hot vs. binary state encoding. If you try to bring those abstraction models to an HDL, you'll find a significant impedance mismatch.
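For one concrete flavor of that mismatch, here's a hedged sketch of the same three FSM states under the two encodings (state names are mine). Nothing in a typical FP abstraction expresses this choice, yet it changes flop count, decode logic and timing:

```verilog
module encodings;
    // Binary: ceil(log2(3)) = 2 flip-flops, but multi-bit compares
    // in the next-state and output decode logic.
    localparam [1:0] BIN_IDLE = 2'd0,   BIN_RUN = 2'd1,   BIN_DONE = 2'd2;

    // One-hot: one flip-flop per state, but each decode is a
    // single-bit test -- often the faster choice on LUT-based fabrics.
    localparam [2:0] OH_IDLE = 3'b001,  OH_RUN = 3'b010,  OH_DONE = 3'b100;
endmodule
```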
And you missed mine, as I mentioned that we had electronics lectures.
I am not talking out of thin air or making wild guesses about how things should be, but rather from personal experience of how we learned digital circuit design and how later, already with that skill, we applied the ideas to FP.
That was the goal of my initial comment: hardware design ideas -> help understanding FP function composition as if functions were ICs, not how to design hardware from FP concepts.
Didn't miss it, just don't think they're equivalent. I appreciate functional programming and block diagrams/black box design are great tools for that space. However they don't capture the nuance of circuit design that goes into FPGA development.
You can't capture pipeline depth, SRAM usage, or thermal envelope/clock gating, all of which have significant real-world impacts on your design.
Part of it too is that you actually have a different set of problems. Let's say you build an abstract "Foo" block. In programming land you only pay for that block when the branch that selects it runs. Each line of code that references it costs only a single opcode/asm call.
In FPGA/ASIC land, each time you reference that block you're instantiating a physical copy of it. So if your block takes 8 bits of SRAM and you have 300 references (or parent blocks that fan out to 300 references), you're now paying 2.4 kbit of SRAM from a fixed pool that's usually only a few megabits.
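A hedged Verilog sketch of that fan-out (module names and widths are mine): every elaborated instance below is its own physical copy of `foo`, each consuming its own slice of the fixed resource pool, unlike software where one definition serves every call site.

```verilog
module foo (
    input  wire       clk,
    input  wire [7:0] d,
    output reg  [7:0] q
);
    always @(posedge clk) q <= d;   // 8 bits of storage per instance
endmodule

module fanout #(parameter N = 300) (
    input  wire           clk,
    input  wire [7:0]     d,
    output wire [N*8-1:0] q
);
    genvar i;
    generate
        for (i = 0; i < N; i = i + 1) begin : g
            // N = 300 instances -> 300 physical copies -> 2400 bits
            // of state, all paid up front at synthesis time.
            foo u (.clk(clk), .d(d), .q(q[i*8 +: 8]));
        end
    endgenerate
endmodule
```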
Ditto switching logic: one of the fun parts of FPGA bring-up is that they can have large in-rush currents of many amps as all the LUTs get flipped to their programmed state. You also have power (and usually thermal, since the two are tied together) budgets driven by how much switching logic you run per cycle.
It really has a lot more in common with more traditional engineering domains where planning, simulation and strong math models are the standard tools.
Surely, we don't have to work with raw quantum electrodynamics when designing an FPGA, even though that's what's going on "under the hood".
Thinking that way gets me wondering about the different layers of abstraction, along with the benefits and drawbacks of each. With FPGA programming, what levels of abstraction are available, and what simplifying assumptions does each model make?
I've got in mind things like Kirchhoff's laws, where we assume perfectly conducting wires, discrete elements, a conservative ambient B field, and essentially DC current over the size of circuit elements. When those assumptions start breaking down, then we can drop down a layer of abstraction.
I think most HDLs are approaching the diminishing returns of abstraction already. Most of the electrical characteristics are already captured in routing and setup/hold times, so while you may not interact with them directly, anything more than the most trivial design will be constrained by them.
By "under the hood" I was referring to the constraints you have on what you design. In the software space we are mostly constrained by memory (huge), network speed (blazing fast), and CPU time (tasks running long normally don't break things).
In FPGA/ASIC land those constraints are much, much tighter (kB vs GB) and there are many more of them (thermals, power, hard cycle requirements, etc.).
> Surely, we don't have to work with raw quantum electrodynamics when designing an FPGA, even though that's what's going on "under the hood".
Probably. But that doesn't mean you might not see the effects of it. I can't find the story/paper at the moment due to too much noise in my searches, but a team of researchers gave some kind of machine learning process (I think genetic programming) an FPGA to treat as a black box to solve some problem. It ended up producing a configuration that didn't work on any other FPGA, and that contained what should have been useless elements not connected to anything, but without which the whole system wouldn't work properly. So at the least they had some kind of weird effect going on that wouldn't be explained by normal analysis.
Maybe more like dataflow or flow-based programming. Let them play with those a while. Then, make the black boxes FSMs with serious constraints on size or execution time. They'll start getting the idea. Messing with a synchronous or time-oriented language might help, too, as a metaphor for clocks.
For those looking for cheap FPGA boards, I highly recommend looking for something based on the Lattice iCE40. The cheapest board you can find is probably the $9 Upduino, though it's trickier to get going (complete lack of documentation).
But there are tons of hobby boards in existence.
The best part is Project IceStorm, a fully open source tool flow, from synthesis to bitstream. While not the best in terms of optimization, its killer feature is that it's blazing fast. You can get a small design synthesized and converted to a place-and-routed bitstream in under a minute.
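The whole flow is a handful of commands, roughly like this (the `-d 1k` device flag and the `top.*` file names are illustrative; adjust for your part and pin-constraint file):

```
yosys -p "synth_ice40 -blif top.blif" top.v        # synthesis
arachne-pnr -d 1k -p top.pcf top.blif -o top.asc   # place and route
icepack top.asc top.bin                            # pack the bitstream
iceprog top.bin                                    # flash the board
```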
I entirely see where you're coming from, but this has completely changed in the last 2-3 years. SiliconBlue/Lattice used to be a niche-market, low-power also-ran in the FPGA world. Documentation was sparse, and most examples from academia targeted Altera or Xilinx parts.
The open-source toolchain flipped this and allowed Lattice FPGAs to become the tool of choice for beginners and small-application hobbyists. There's now a tremendous wealth of resources available around getting started with Lattice FPGAs. And as the cherry on top, you don't have to wrangle the gigantic Vivado/Quartus/ISE suites to do so.
Amazing project for learning how FPGAs work. Not overkill with high-speed interfaces, but it touches all the problems: no easy division by 3 and 5, counters everywhere, no copy/paste solutions, no libraries of helpful functions. Note though that Xilinx ISE is obsolete; Vivado is the current tool.
As kens said, and to expand: Xilinx's Vivado doesn't support the Spartan 6; however, Xilinx _does_ still fully support the chip and continues to produce it. To support this, they've released a VirtualBox-compatible image which runs Oracle Linux with ISE configured specifically to program Spartan 6 FPGAs.
And the QFP Spartan 6 is the most powerful Xilinx FPGA that is hand solderable. Maybe not a concern if you are using an existing board or have a reflow oven, but for me that is a good reason not to go up to Spartan 7 or newer.
The most interesting part of all this is just how CHEAP some of the learning boards are. I don't know enough to understand how limited the ElbertV2 is, but at $30, even a pretty limited board is a fun toy.
> I don't know enough to understand how limited the ElbertV2 is...
The Spartan-3A (XC3S50A) is an older chip -- it's from 2007, and is itself a minor update to the 2003 Spartan-3 family -- and using it limits you to the Xilinx ISE toolchain, which hasn't been updated since 2014.
It's still a decent entry point to FPGA development, though.
> But it's hard to find and get started with a good open source toolchain
That's because there aren't any. The one you have happens to be the exception - it's the only FPGA line with an open source toolchain AFAIK. FPGAs are the realm of large, heavily proprietary build environments.
I was complaining about that the other day. I'm a pretty big fan and proponent of the Libre software movement and have recently decided I want to help with an open hardware movement because it is almost impossible to do any serious work with FPGAs using only FOSS. Does anyone know of any good places to start and help?
The Lattice boards mentioned above are a good start. Really they're your only choice, since theirs is the only open source toolchain, but the iCEStick is a pretty good place to start. It's a decent FPGA in a USB stick form factor, and it's self-contained so you don't need an external programmer. At $25 it won't break the bank.
If you meant more how to get involved with the community effort, I'd start with Clifford Wolf's page (http://www.clifford.at/icestorm/ ). He's the main guy behind the efforts with the iCE40 toolchain and his page links to some other efforts to reverse engineer Xilinx and Altera chips. There's a lot of interest in more open FPGAs, but I think there's a lack of skilled people so you'd likely be welcome.
I have been trained as an EE to consider FPGA design part of the hardware skills domain, and indeed I would say that this writeup ignores the less friendly parts of FPGA development (timing closure, clock domain crossing and more...). Nevertheless, I think it is a good introduction for software developers to start thinking in terms of hardware description. Good work!
Great article! I've also been learning Verilog -- albeit in an academic setting -- and it has been an interesting experience. I discovered that you can do surprisingly complex things in Verilog (at least, from a digital logic point of view) using less code than I initially expected!
I would suggest that you be VERY careful with that expressiveness if you want to do anything in an actual hardware context. Verilog makes it very easy (IME much easier than VHDL) to create structures that, while technically possible to synthesize, take enormous amounts of logic resources.
Sometimes if-else chaining is what you need. But if you are doing a one-of-n select, a case statement is what you should use.
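A hedged sketch of the difference (module and signal names are mine): an if-else chain implies priority logic, while a full case synthesizes to a plain mux.

```verilog
module mux4 (
    input  wire [1:0] sel,
    input  wire [7:0] a, b, c, d,
    output reg  [7:0] y
);
    // Full case with a default arm: a simple 4:1 mux, no priority
    // chain, and no inferred latch since y is assigned on every path.
    always @* begin
        case (sel)
            2'd0:    y = a;
            2'd1:    y = b;
            2'd2:    y = c;
            default: y = d;
        endcase
    end
endmodule
```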
If you are using Verilog-2001 or later, you can use always @* to implicitly declare all the signals a combinational process depends on. Hand-written sensitivity lists used to be a huge source of simulation mismatches and inferred latches.
I would also highly advise against mixing blocking (i.e. a = x) and non-blocking (i.e. a <= x) assignments in one process. I usually write a single clocked process with all the registers, and then only a few extra combinational processes for the logic.
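That structure looks roughly like this (a sketch; the accumulator and its names are my own invention): one clocked process holds all the registers with non-blocking assigns, and a separate combinational process computes the next values with blocking assigns.

```verilog
module accum (
    input  wire        clk, rst, en,
    input  wire  [7:0] d,
    output reg  [15:0] sum
);
    reg [15:0] next_sum;

    // Combinational: blocking assigns only, default value first.
    always @* begin
        next_sum = sum;
        if (en) next_sum = sum + d;
    end

    // Sequential: non-blocking assigns only, all registers in one place.
    always @(posedge clk) begin
        if (rst) sum <= 16'd0;
        else     sum <= next_sum;
    end
endmodule
```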
I also recommend separating the data path logic (the processing you do) from the control path (the FSM that keeps track of what is happening inside your machine).
Never assuming values or states is really good advice. I usually assign default values at the top of processes; then for each state you only have to define what differs. It is really helpful in FSMs.
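A hedged example of that default-first style (the FSM and its names are mine): every signal gets a value on every path through the process, so nothing is accidentally "remembered" and no latches are inferred.

```verilog
module handshake_fsm (
    input  wire clk, rst, req,
    output reg  ack
);
    localparam [1:0] IDLE = 2'd0, BUSY = 2'd1, DONE = 2'd2;
    reg [1:0] state, next_state;

    always @* begin
        // Defaults first; the case arms only state what differs.
        next_state = state;
        ack        = 1'b0;
        case (state)
            IDLE: if (req)  next_state = BUSY;
            BUSY: begin
                ack        = 1'b1;
                next_state = DONE;
            end
            DONE: if (!req) next_state = IDLE;
            default:        next_state = IDLE;
        endcase
    end

    always @(posedge clk) begin
        if (rst) state <= IDLE;
        else     state <= next_state;
    end
endmodule
```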
Not claiming to have a great coding style, but if you want some examples of RTL code and testbenches, here are my cores for AES, SHA256 and many others:
Your professor seems great so far, but it's pretty silly of him to teach you Verilog instead of VHDL. As an EE myself, I haven't seen a job description that says Verilog over VHDL in a long time. Might want to familiarize yourself with both if you plan to pursue hardware design further.
VHDL is popular in Europe and the US East Coast. Verilog is king on the West Coast.
As an EE who was schooled in VHDL, gave VHDL classes to new hires, and was conditioned to look down on Verilog: after moving from Europe to the East Coast and then the West Coast, I eventually saw the error of my ways, and I'm very happy to not have to deal with VHDL anymore.
And for the open source crowd: there is a much larger body of decent-quality open source Verilog tools than there is for VHDL. So as a hobbyist, it's really a no-brainer.
I always got more of a VHDL for FPGAs and Verilog for chips kind of vibe. Not that there's anything intrinsic in those languages, just the kinds of industries that ship FPGAs seem to gravitate to a DoD specified language.
And can you elaborate on what you like about Verilog?
That is very strange. Verilog and SystemVerilog totally dominate VHDL across the globe in terms of market share, including Europe. Especially if you are designing ASICs, but also for FPGAs.
If you are designing ASICs and use VHDL, you will quite likely end up with a mixed-language RTL design. And after synthesis the netlist will be in Verilog, so any co-simulation between RTL and netlist must support both languages, which costs extra in licenses.
Another reason for choosing Verilog is that the tool developers (being mainly in the US) know Verilog best. If you look at the adoption rate of new language features, Verilog (SystemVerilog) gets much more attention.
VHDL is for some reason considered a good school language. I don't see the point of it. It's a bit like teaching the ISO stack: the industry has chosen another stack, and the pedagogical value of ISO vs TCP/IP is slim to none. The same can (imho) be said for VHDL.
The majority of Intel's newly developed HDL is SystemVerilog on both the design and verification sides, but honestly there is still a mix of everything going on there.
Some chip designs make use of purchased IP which is at times VHDL. Heck some unit owners and architects even experimented with HLS (High Level Synthesis) using SystemC for part of the graphics hardware and a few units still have state machines which automatically translate from a DSL or word macros.
I do think both of the trends people are identifying hold true, though. The US is more Verilog than not; in Europe I see more VHDL than Verilog. The US exceptions are typically FPGA developers and defense contractors (but definitely not universally).
In my opinion, SystemVerilog is much easier to design in, with less verbose syntax, better tooling, and some nice constructs that make realizing design intent easier. SystemVerilog blows VHDL away in testbench design and general verification environment support; I would never wish designing an elaborate testbench in VHDL on an engineering team. VHDL does have some advantages if you have requirements to do a lot of formal property verification, though.