AMD’s been on a roll with its laptop, desktop, and server chips over the past few years. Now the company is apparently taking aim at the FPGA space.

According to a report from The Wall Street Journal, AMD is in advanced talks to buy chip maker Xilinx in a deal worth more than $30 billion. Xilinx invented the field programmable gate array (FPGA) more than three decades ago, and the company remains a leader in the space.

Avnet PicoZed with Xilinx Zynq 7020 SoC

FPGAs are flexible chips that can be programmed to serve a variety of functions. Unlike most other processors, an FPGA can be reprogrammed to change the way it works even after it’s been manufactured.
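
To get a feel for what “reprogrammable” means here, consider the toy Python model of an FPGA logic cell below. This is a purely conceptual sketch, not real FPGA tooling: actual chips are configured with hardware description languages and vendor toolchains, but the core idea is the same, with the chip’s behavior determined by configuration bits loaded after manufacturing.

```python
# Conceptual model of an FPGA logic cell: a 2-input lookup table
# (LUT) whose behavior is defined entirely by configuration bits.
# Purely illustrative; real FPGAs aren't programmed in Python.

def make_lut(config_bits):
    """Build a 2-input logic function from 4 configuration bits."""
    def cell(a, b):
        # The inputs select which configuration bit to output.
        return config_bits[(a << 1) | b]
    return cell

# "Program" a cell as an AND gate...
and_gate = make_lut([0, 0, 0, 1])
# ...then "reprogram" the same kind of cell as an XOR gate.
xor_gate = make_lut([0, 1, 1, 0])

assert and_gate(1, 1) == 1 and and_gate(1, 0) == 0
assert xor_gate(1, 0) == 1 and xor_gate(1, 1) == 0
```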

The WSJ article notes that FPGAs are currently widely used by cellular network providers as they build out their 5G infrastructure. They’re also used by the military for communications and radar systems. Xilinx also has “space-grade” chips for use in satellites and other applications in space.

But they’ve also shown up in some pretty interesting niche devices in recent years. The Analogue Pocket handheld game system is designed to use FPGAs to let you play classic console games using the original cartridges, by emulating the chip architecture of old game systems. The Precursor open mobile dev kit is powered by an FPGA that allows you to emulate different processor types so you can create your own handheld system that meets your specific needs.

Precursor (w/Xilinx XC7S50 SoC)

I doubt these hobbyist/enthusiast applications play a very large role in AMD’s decision to pursue an acquisition of Xilinx. But I do wonder what the deal would mean for the future of this sort of gadget.

Rival chip maker Intel is also a player in the FPGA space, having acquired Xilinx competitor Altera for $16.7 billion in 2015.


16 replies on “WSJ: AMD wants to buy FPGA maker Xilinx for over $30 billion”

  1. Just wondering...
    So, Xilinx has contracts with the Chinese army?
    And AMD is thinking about buying them?
    Wouldn’t this create a conflict with Trump’s trade policy?

    1. The recent changes to trade policy were with regard to specific Chinese companies. I have no idea if there are any rules about the People’s Liberation Army. Xilinx is (supposedly) American to begin with, though, so I figure anything preventing AMD from buying it would have already prevented Xilinx from selling stuff to the glorious and superior PLA.
      They WERE selling stuff to Huawei and then had to stop. I couldn’t find anything about them selling stuff to the PLA directly in the five minutes I cared to spend looking this up.

  2. CERTAINLY can’t be for the data center

    because that’s a DEAD END for FPGAs.

    Let me emphasize: those words are backed by no evidence. They’re based on crowdthink and a very skin-deep view of the technology. Just because no one has yet made FPGAs financially compelling in the data center doesn’t mean no one ever will. The same financial experts said AMD was going to fail, and look where we are.

    Fact: an ASIC is not anywhere near as reprogrammable and flexible as an FPGA. Full stop. You can’t shoehorn a cryptography-oriented ASIC into doing video codec work, or vice versa, no matter how much driver wizardry you throw at it.

    An ASIC is manufactured, down at the silicon level, to be stuck solving only a specific subset of fixed-function tasks, transistor-wise and logically speaking. An FPGA has chameleon-like versatility that doesn’t require switching out a chip or a card in a server.

    Why is an ASIC bad? I can tell you for a fact that data centers do not want to open up deployed machines every time a client wants to swap ASICs around, because doing so greatly increases the possibility of hardware failure and downtime.

    Why is an FPGA good? An FPGA lets a client change its fixed functions on the fly, reprogramming the same silicon to solve a different problem without anyone touching the hardware (toy sketch at the end of this comment). That’s huge. And just because no one has gotten it working at scale yet doesn’t mean it isn’t a wise, financially viable idea.

    I say don’t dismiss it quite yet. Will it pan out? I don’t know. None of us has a crystal ball, but speaking from an engineering, IT, and web dev perspective (I’ve worn all three hats in one form or another), it could really work. The poor financial performance of past attempts doesn’t mean it will never work when there is huge potential that is, as of yet, untapped.
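
    To make that concrete, here’s a toy Python sketch of the idea: one deployed “card” switches jobs when you load a new configuration, no screwdriver required. The names are made up for illustration; real reconfiguration goes through vendor toolchains (Vivado and friends), not Python.

    ```python
    # Toy model of an in-place reconfigurable accelerator.
    # Hypothetical class and method names, purely illustrative;
    # a real FPGA is reflashed with a vendor-generated bitstream.
    import hashlib
    import zlib

    class ToyReconfigurableCard:
        """Stands in for an FPGA card that can be reprogrammed in place."""

        def __init__(self):
            self.function = None  # unconfigured fabric

        def load_bitstream(self, function):
            # Reprogram the "fabric" without touching the hardware.
            self.function = function

        def run(self, data: bytes):
            return self.function(data)

    card = ToyReconfigurableCard()

    # Morning workload: the client needs a hashing offload.
    card.load_bitstream(lambda d: hashlib.sha256(d).hexdigest())
    print(card.run(b"packet"))

    # Afternoon workload: same card, now a compression engine.
    card.load_bitstream(lambda d: zlib.compress(d))
    print(card.run(b"packet" * 100))
    ```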

  3. I’m really interested to know what AMD wants to do with FPGAs to justify buying Xilinx for $30 billion. That’s definitely way too much money just to add low-performance programmable logic to their CPUs, and there are much cheaper companies for that kind of thing.

  4. From what I’m reading here, the comments are all from people who honestly should never have been considering FPGAs for their target use cases; ASICs were always the better option for those. FPGAs nevertheless still have their rightful place: very fast prototyping and very fast time-to-market. If you’re working with an extremely tight power budget or selling at high volume, an FPGA honestly shouldn’t even enter the conversation in the brainstorming phase. You’re better served by an ASIC, which can be purpose-built for the algorithms at the core of your problem. An FPGA will always be much more power hungry than an ASIC and is only a good choice if: (1) you have a tight development deadline, (2) you’re selling in very low volume, and/or (3) power consumption is not mission critical.

    1. The question is why is AMD specifically getting into FPGAs.

      It certainly can’t be for the data center as people have already mentioned because that’s a dead end for FPGAs.

  5. Yeah, this definitely can’t be a datacenter/server move; if it is, it’s a bad one. Maybe that’s why AMD’s stock went down on this news.

    Despite Xilinx’s announcements and partnerships, they haven’t really made much progress getting customers to use their FPGAs in production servers. Intel hasn’t succeeded either, especially since the acquisition significantly delayed their FPGA timeline.

    From what I’ve seen, the big tech companies have been hiring a lot of ASIC engineers to make their own accelerator ASICs and have skipped over FPGAs altogether.

    1. FPGAs aren’t a bad move. They can be quite useful for adding fixed-function hardware that can be adapted on the fly and morphed to solve a different problem altogether. Contrast that with an ASIC, which requires longer development and fab time and offers no wiggle room if the problem changes. FPGAs are useful if you know your application. Most people, yourself included, are not the target use case for an FPGA, but many in the scientific research community are.

      1. Even for the ability to change some low-performance functionality after production, AMD doesn’t need to pay $30 billion for Xilinx. They could license from, or even buy outright, the existing companies that specifically target integrating programmable logic into ASICs. Those companies would be much cheaper.

        1. because that’s a dead end for FPGAs.

          How so? No one has given me any definitive evidence beyond saying that Intel and Microsoft failed at it. Their failure does not make it undesirable in any way. Granted, yes, ASICs can be programmed to some extent, but they are still extremely limited compared to the flexibility of an FPGA. The programmability of an ASIC is confined to the problem field it targets (e.g. cryptography, video codecs, machine learning). An FPGA can be adjusted to suit. That is what makes an FPGA so much better than an ASIC.

  6. Yeah, I’m with the others who think that if AMD’s intent is to get FPGAs into the server space, they’re making a bad decision.

    My company tried out FPGAs for HPC applications at scale. Performance per watt wasn’t good enough, so much so that we’d rather spend the money on custom chips to achieve our performance and power goals. CPU performance is no longer good enough, and FPGAs (both Xilinx and Intel) don’t make the cut either.

  7. Intel bought Altera mainly to add volume to its fabs and to share the cost of marketing FPGAs in the server space. I’m thinking they broke even on the deal; if their process had advanced as planned, they would have made a profit.
    I see the AMD-Xilinx deal as similar. United, they become a bigger customer for TSMC, and they can share the cost of marketing their products in the server space.

  8. I wonder what their reasons for buying Xilinx are. From what I can see so far, Intel’s acquisition of Altera has been a failure.

    The FPGAs with Intel’s sauce have been constantly delayed.
    Their push to get FPGAs into the data center failed. They dropped the whole Xeon CPU + FPGA single-package project, and their PCIe “programmable accelerator” didn’t get much adoption and was probably dropped too (I haven’t seen any new cards since the first one).

    As for data centers, it seems the big cloud folks have decided (or are deciding) to make their own custom chips instead, since CPUs can no longer keep up with the processing demands and FPGAs still don’t have enough logic resources.

    1. Based on some hearsay a while back, I heard MS’s FPGA-accelerated AI and Bing search projects were going nowhere. They used Altera/Intel FPGAs for those.

      It seems other big tech companies have either given up on FPGAs or gone straight to custom silicon for their data center needs. FPGAs don’t have that much logic and use a lot more power compared to custom silicon.

      I do wonder what AMD’s plans are for FPGAs. I can’t see them being for data center products.

    2. I recall Intel trying to push their single-package Xeon + FPGA on us. They eventually dropped the product altogether. They tried to push their PAC PCIe accelerator card too, but their Arria 10 FPGA was way too small for our performance demands and consumed way too much power. You’d need to distribute the processing across multiple cards and hosts, which consumed even more power.

      FPGAs in the datacenter have been a dead end for Intel. I doubt AMD will do any better in that space.

  9. Sounds like a reaction to Intel buying Altera. But I think both AMD and Intel will be edged out by more flexible solutions and new entrants in FPGAs over the longer term, though AMD might not have the luxury of playing the long game.
