Broadcom Inc. unveiled a $73 billion backlog of AI orders that’s supposed to ship out over the next six quarters—only for investors to yawn and send shares tumbling 5% in premarket trading Friday. CEO Hock Tan called it a “minimum,” but Wall Street apparently prefers its forecasts served with a side of specifics, not vagueness.
The fallout? A collective investor sigh that could power a small data center, as Broadcom’s stock—up a sprightly 75% this year—suddenly feels like the kid who peaked too early at the party. With lower-margin AI sales squeezing profitability, those lofty AI dreams might just translate to tighter belts for shareholders, though the company’s dividend hike to 65 cents a share offers at least a consolation prize.
Picture the ripple: Nvidia loyalists chuckle from their thrones, while Broadcom’s custom chip wizards wonder if their next big score will come with a clearer calendar. It’s the kind of letdown that makes you question whether AI’s gold rush is a fool’s errand or just eternally buffering.
Let’s rewind to Thursday’s earnings call, where Broadcom dropped a fiscal first-quarter sales projection of $19.1 billion—smashing analyst guesses of $18.5 billion like a rogue server rack. Tan, with the calm of a man who’s seen more chip cycles than a Vegas dealer, touted an $11 billion order from AI upstart Anthropic, hot on the heels of a $10 billion third-quarter windfall from the same crew.
That Anthropic haul? It’s the kind of deal that sounds like a typo—eleven billion bucks for silicon that thinks faster than your barista on caffeine. Yet Tan couldn’t resist the fine print: those AI sales are nibbling at profit margins, turning what should be champagne territory into a fizzy compromise.
And the mystery client? A cool $1 billion order, unnamed, like a secret admirer slipping cash under the door. Broadcom’s coyness here feels like flirting with fire—investors crave names, not nods, especially when Nvidia’s hoarding the spotlight like a dragon with a graphics card fetish.
Fast-forward to the backlog bombshell: $73 billion queued up, with lead times stretching six months to a year, depending on the product’s temperament. Tan framed it as a floor, promising “much more” in the pipeline, but skipped the 2026 crystal ball entirely. “It’s a moving target,” he quipped, as if AI revenue were a caffeinated squirrel dodging forecasts.
AI semiconductor sales? Doubling to $8.2 billion in the first quarter, a year-over-year flex that should’ve had confetti flying. Instead, the room cooled faster than an overclocked CPU hitting thermal throttle. Broadcom isn’t just slinging chips; it’s the unsung hero of data center plumbing, upgrading networking gear to shuttle bits at speeds that make light jealous.
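For the napkin-math crowd (our arithmetic, not Broadcom’s guidance): spread that $73 billion backlog evenly over its six-quarter window and you get a bit over $12 billion a quarter, well above the $8.2 billion of AI chip sales guided for the first quarter. A minimal sketch, with the even-shipment assumption flagged in the comments:

```python
# Back-of-the-envelope only. Assumes the disclosed $73B AI backlog ships
# evenly across six quarters, which Broadcom has not said it will.
BACKLOG_B = 73.0       # disclosed AI order backlog, in billions of dollars
QUARTERS = 6           # stated shipment window
Q1_AI_GUIDE_B = 8.2    # guided first-quarter AI semiconductor sales, in billions

implied_avg = BACKLOG_B / QUARTERS
print(f"Implied average AI shipments: ${implied_avg:.1f}B per quarter")
print(f"Gap vs. Q1 AI guide: ${implied_avg - Q1_AI_GUIDE_B:.1f}B per quarter")
```

The backlog obviously won’t ship in neat sixths, and it may not map one-to-one onto the quarterly AI semiconductor line, so read this as the shape of the ramp investors are trying to price, not a forecast.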
Add in the star-studded ties: OpenAI’s October pact for custom chips and networking to juice ChatGPT, plus Anthropic’s mega-commitment to Google Cloud TPUs laced with Broadcom DNA. It’s a web of alliances that screams “team player,” yet Broadcom lingers in Nvidia’s long shadow, like the reliable sidekick plotting a quiet coup.
Fiscal fourth quarter wrapped with $18 billion in sales and $1.95 earnings per share, eclipsing estimates of $17.5 billion and $1.87. Solid, steak-and-potatoes stuff. But the real juice? Tan’s personal carrot: 610,521 shares if AI revenue hits $90 billion by 2030, tripling if it touches $120 billion. Suddenly, the CEO’s got skin in the game thicker than a server farm’s cabling.
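For anyone wondering how that carrot is structured, here is the disclosed tier logic boiled down to a toy function. The function name is ours, and the hard, all-or-nothing thresholds are a simplification; the actual grant presumably carries vesting schedules, timing conditions, and gradations the headline numbers don’t capture.

```python
def tan_award_shares(ai_revenue_billions: float) -> int:
    """Toy model of the disclosed CEO incentive: 610,521 shares if AI revenue
    reaches $90B by 2030, tripling at $120B. Assumes hard thresholds with no
    interpolation between tiers, which is our simplification."""
    BASE_AWARD = 610_521
    if ai_revenue_billions >= 120:
        return BASE_AWARD * 3   # triple award at the $120B milestone
    if ai_revenue_billions >= 90:
        return BASE_AWARD       # base award at the $90B milestone
    return 0                    # below threshold: nothing from this grant

print(tan_award_shares(95))   # 610521
print(tan_award_shares(125))  # 1831563
```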
Tensions simmer as AI models balloon in complexity, demanding Broadcom’s wizardry to link chips, racks, and entire buildings without a digital traffic jam. Will the vague vibes clear, or is this the setup for a sequel where guidance finally lands? Investors lean in, breaths held, wondering if Broadcom’s the tortoise outpacing the hare—or just another chip off the old block.
In Palo Alto’s chip kingdom, Broadcom’s lineup spans comms, networking, and software, a Swiss Army knife in an era of specialized scalpels. The buzz? It’s electric, but the bill’s coming due, and everyone’s wallet is watching.

