This year marks the 40th anniversary of the first commercially available field-programmable gate array, otherwise known as the FPGA.
The FPGA is a type of semiconductor that can be reconfigured by a customer or designer after manufacturing – something processors typically cannot do. It was invented by Ross Freeman, co-founder of Xilinx, along with Bernard Vonderschmitt and James Barnett.
The founders of Xilinx wanted to create a chip that could be reprogrammed to implement any digital circuit design while offering the performance and density of a typical application-specific integrated circuit (ASIC). The ability to perform custom functions was meant to eliminate the need for fabricating new chips each time a design changed.
The FPGA did that, but not very well, said Jack Gold, president of J.Gold Associates. “The problem has been over the years, especially in the early days of the FPGA, it was a very power hungry, not very high-speed device. There’s a lot of stuff that you have to do inside that device to make it fully programmable. There’s a lot of overhang, basically,” he said.
Through the 1980s and 1990s, Xilinx and its chief competitor Altera advanced the FPGA, improving performance and power consumption. The designs became more complex, adding functions such as embedded memory, digital signal processing (DSP) blocks, and clock management.
And the FPGA found a home in areas like telecom, automotive, consumer electronics, and defense. More recently, FPGAs have moved into AI, edge, and cloud applications. Xilinx was acquired by AMD in 2022. Altera was acquired by Intel in 2015, only to be spun off in 2025 as a separate company.
Freeman passed away in 1989 and did not live to see his creation become a success.
Kirk Saban, corporate vice president of marketing in the adaptive and embedded computing group at AMD, said of Freeman: “I can’t imagine that in his wildest dreams he ever thought that we’d be where we’re at today. Now that’s my interpretation, but I mean, if you look at how big it got to be – the complexity, the kinds of devices that we have – I can’t imagine that he ever imagined it doing what it’s done.”
Today we have “monstrous devices,” as Saban put it, with embedded ARM processors, high-speed serial transceivers, and integrated hard IP such as built-in AI blocks. “These things have evolved massively. They’ve really become the heart of our customer systems in many, many cases,” he said.
Saban acknowledges that even with newer tools, developing on the FPGA “does require some specialized skills, which doesn’t make it as pervasive and ubiquitous as a CPU.”
So, what does the AI era hold for FPGA?
Saban believes the FPGA plays a role when AI is infused into some other kind of application. One example is vision, where multiple cameras stream data and quick, real-time decisions are needed. The same applies at the edge, where resources are not necessarily connected to the cloud and decisions must be made locally and quickly.
Gold thinks the FPGA will continue down the path it is already on. “It will still serve that niche, as AMD is going to probably be doing in the near future. They’ll continue down the process curve. They’ll get faster, just like everything else. But you’re not going to see hundreds of millions of FPGAs going out. They’re still going to be, relatively speaking, niche devices,” he said.