Chronicles of 2025

I thought stablecoins were supposed to be backed by US Treasuries. Didn’t realise they could buy gold too.

1 Like

I lol’d.

I was at GOOG – still in my “investing? I have no clue” phase – when we started with TPUs. At the time I thought we were light-years ahead of everyone else in compute and LLMs, but I was clueless about the business model (as always).
NVDA was just a GPU company back then, and later became a (provider of GPUs for the) bitcoin mining business.

Now I (again) believe Google has a serious advantage in the space from a pure technology perspective. From a cash flow angle, they can easily finance their AI adventures without going into debt or funny circular investment deals.

Still not an active shareholder, though.

I thought they were still lacking proper audits (as opposed to mere attestations) of their holdings, i.e. the Treasuries backing their virtual dollars. But maybe things have changed in the meantime.

1 Like

Tether buys everything (incl crypto)

Do they also buy Tether?

I’ll see myself out …

1 Like

That’s crazy, man, why aren’t you? The stock has already doubled this year, even though it’s not paying divvies.

It is paying divvies.

As for the rest of your comment, I’m not sure if you’re just teasing me.

1 Like

You ought not to watch a stock you once sold.

So basically, Tether is just a vehicle for unregulated ETPs (exchange-traded products).

2 Likes

Conceptually they probably mimic something closer to bank deposits (with self-regulation, though :grinning_face_with_smiling_eyes:).

They hold some risky assets, but they also have some equity to absorb losses. (Don’t ask me whether it would pass more stringent regulation like Basel III.)
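
To make that concrete, here is a minimal sketch of the idea in Python, with all figures invented for illustration (Tether does not publish a balance sheet in this form): the equity cushion takes losses before token holders do.

```python
# Hypothetical stablecoin balance sheet (all numbers invented for illustration):
# equity acts as a first-loss buffer before token holders are impaired.

assets = {"treasuries": 90.0, "gold": 5.0, "crypto": 5.0}  # $bn, assumption
tokens_outstanding = 95.0                                  # $bn of 1:1 claims
equity = sum(assets.values()) - tokens_outstanding         # 5.0bn buffer

loss = 3.0  # say the crypto sleeve drops by $3bn mark-to-market (assumption)
equity_after = equity - loss

# Tokens stay fully backed as long as the buffer is not exhausted:
print(f"equity after a {loss:.0f}bn loss: {equity_after:+.1f}bn")
print("peg intact" if equity_after >= 0 else "token holders take a haircut")
```

Whether the real buffer is large enough relative to the riskier sleeves is exactly the Basel-III-style question.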

1 Like

Enough crypto already! Back to AI … :wink:

I for one am fascinated by the current narrative battles around Nvidia and AI (and Google).

Michael Burry – who admittedly has cried wolf plenty of times – apparently posted his views on Nvidia on Substack a couple of days ago (Cassandra Unchained, paywalled). OK, fine. Summary according to Gemini in points 1 and 2 in the footnote.[1]

Nvidia apparently felt they had to respond by sending a private memo to Wall Street sell-side analysts … wait, what? Why respond at all? Anyway, their rebuttal is summarized in point 3 in the footnote.[1]

Michael Burry responds again (tweet).

:popcorn:

Another nice account of the narrative battles, summarized by Gemini:

The author argues that despite being the world’s most valuable company, NVIDIA had to issue a rebuttal to short-sellers like Michael Burry because the sustainability of its 75% gross margins is fragile.

The reasons include:

  1. Customer Dependency: NVIDIA’s largest customers are unprofitable and rely on fickle investor capital to fund their massive hardware purchases.
  2. Uncertain Value Capture: It is unclear who will capture the massive value spread (e.g., 99.9% cost reduction in an MRI scan) created by AI—the chip maker, the model maker, or the end application.
  3. The Real Rival: NVIDIA’s true competitor is Google, which controls the entire AI stack (TPUs, Gemini, data, and distribution) and is strategically focused on driving down costs, directly conflicting with NVIDIA’s strategy of maintaining high margins.

The text concludes that the current unit economics suggest AI may evolve into a commodity business where low cost ultimately wins, intensifying the conflict between the two giants’ strategies.

(Full Tweet)


[1] Gemini’s summary:

Michael Burry’s recent post on his Substack newsletter, “Cassandra Unchained,” is a powerful critique of the current AI boom, arguing that NVIDIA is the “Cisco” of this cycle and that the rally is being inflated by aggressive accounting practices, particularly among NVIDIA’s major cloud customers.

Here is a summary of the key points from his post:

1. NVIDIA is the New Cisco

Burry draws a direct parallel between NVIDIA’s position today and Cisco’s position at the peak of the dot-com bubble (1999–2000). Cisco sold the hardware (“picks and shovels”) for the internet infrastructure buildout, which turned out to be catastrophic supply-side gluttony that far outpaced real demand. He suggests the massive investment promises in AI infrastructure today ($3 trillion over the next few years) mirror that same overbuilding.

2. The Core Accounting Trick: Depreciation

Burry’s central argument is not about NVIDIA’s own financial fraud, but rather the accounting methods of its customers (the “hyperscalers” like Microsoft, Meta, and Oracle).

  • The Allegation: These cloud giants are systematically extending the useful life of AI chips and servers for depreciation purposes from the traditional 3 years to 5 or 6 years.
  • The Impact: Since depreciation is a cost spread over time, extending the timeline artificially boosts current reported earnings for these customers, masking the true, massive cost of the AI hardware. Burry warned that companies could be overstating earnings by 20–27% if these depreciation timelines are out of sync with the rapid pace of chip cycles.
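
For intuition, here is a back-of-the-envelope sketch of that mechanism in Python. All dollar amounts are invented placeholders, not figures from Burry’s post; only the 3-year vs. 6-year straight-line schedules come from the argument above.

```python
# Hypothetical illustration of the depreciation argument (all numbers invented):
# stretching the useful life of AI hardware lowers the annual depreciation
# charge, which flatters current reported earnings.

capex = 100e9      # hardware fleet cost, $100B (assumption)
pretax_3y = 60e9   # pre-tax earnings under a 3-year schedule (assumption)

dep_3y = capex / 3  # ~$33.3B/yr under the "traditional" 3-year life
dep_6y = capex / 6  # ~$16.7B/yr if the life is stretched to 6 years

deferred = dep_3y - dep_6y    # expense pushed into later years
boost = deferred / pretax_3y  # earnings look this much higher today

print(f"3-year depreciation: ${dep_3y / 1e9:.1f}B/yr")
print(f"6-year depreciation: ${dep_6y / 1e9:.1f}B/yr")
print(f"Reported pre-tax earnings inflated by ~{boost:.0%}")
```

With these made-up inputs the boost lands around 28%, the same order as the 20–27% range Burry cites. The catch is that the deferred expense does not disappear; it just arrives later, possibly after the chips are already obsolete.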

3. Rebuttal and Counter-Attack

NVIDIA reportedly sent a memo to Wall Street analysts to push back on Burry’s public criticisms. In his Substack, Burry dismissed NVIDIA’s memo as “disingenuous” and attacking “straw men,” arguing that NVIDIA was deflecting criticism away from his actual concerns about customer depreciation.

4. Continued Bearish Stance

Burry reaffirmed that he stands by his analysis and disclosed that he still holds put options (a bearish bet) on both NVIDIA and Palantir. He is now using his paid newsletter to argue that this AI cycle is propped up by financial and accounting scaffolding that will not age well.

4 Likes

IMO it is obvious that Nvidia’s margins are unsustainable.

However, they will still hold a near-monopoly position for a few more years, during which they can continue to enjoy monopoly margins. Maybe longer, if intelligent robotics really arrives soon.

1 Like

OK. I just crashed the market: I bought NVDA. Brace! Brace!

3 Likes

Years? As in “years, counted in @PhilMongoose investment years”?

:wink:

2 Likes

I am no expert, but I’ve heard that NVDA’s moat is actually the quality of its compiler tools even more than its silicon design. Ironically, AI itself may erode that fast.

2 Likes

Why? Building a competing ecosystem is a massive effort, and part of the moat is a network effect. How would AI help? (Genuinely curious)

In a world where AI-written software is 10x cheaper than human-crafted software, the value of existing software drops by 10x – both in terms of the pure monetary/opportunity cost of writing a competitor, and in terms of network effects.

What do I care if there’s a hyper-optimized kernel for some computation on CUDA, if AI can produce a competitive one for my platform anyway?

Obviously there’s a pretty big qualifier there: what if we don’t get to a world of 10x productivity in software development? Well, then it doesn’t really matter which hardware ecosystem wins, because AI clearly won’t be capturing most of the world’s economic activity, let alone the value from that activity.

2 Likes

Huh? What’s that number? For the record, AI has been doing wonders and picking up speed in diagnostics, particularly imaging triage and interpretation, and that’s been true for years now, since before LLMs.

Was very busy yesterday, looked at my tickers in the evening, all very green. Thanksgiving+Santa?

1 Like

It’s buried in the full tweet.

  1. More broadly, Nvidia is unsure where value will land in the AI business. Anyone using the latest models knows how they can lower the cost of legal advice or reading an MRI. Our own work finds that reading an MRI costs $150 for a human doctor compared to $0.15 for an AI model. This 99.9% spread is the biggest we have ever seen in business. But who gets it? The chip maker? The model maker? An application maker? The doctor? The patient?

And yes, I agree, the interpretation of medical images has been a thing for years already.
I think the author wanted to make the point that whatever cost reduction future LLM applications may bring, it’s unclear who reaps the profits.

1 Like

I’m pretty confident that it will not be the patient :stuck_out_tongue:

4 Likes

NVDA: You need a filter, a FOMOFI…