Superman’s Crystals Are Real. The AI Bubble "Bursting" Isn't

*How Superman’s Crystals Are About to Make Every Data Center on Earth Obsolete*

In the 1978 film, when Christopher Reeve’s Superman flies north and hurls a handful of ordinary-looking crystals into the Arctic ice, they blossom into an immense, glowing palace of pure knowledge. The Fortress of Solitude is built from memory-crystals grown by his father Jor-El—transparent, motionless, cold to the touch, yet capable of storing the entire history of a dead civilization and answering any question put to them. Audiences laughed at the special effects. Physicists, it turns out, should have taken notes.

Forty-seven years later, a quiet laboratory corridor at ETH Zurich has reproduced the central trick of Superman’s fortress: computation and memory encoded in glass-like crystals that use light instead of electricity, generate no waste heat, require no moving parts, and—most startling of all—appear to sidestep the ordinary rules of thermodynamic decay. The implications are difficult to overstate. If the technology scales (and every sign in 2025 is that it will), the energy-guzzling server farms that underwrite ChatGPT, Midjourney, and the entire generative-AI boom could be replaced by silent slabs of engineered crystal that run for centuries on a few watts of laser light.

This is not marketing hype from a Silicon Valley accelerator. It is the considered, peer-reviewed work of some of the most rigorous optical physicists in Europe, many of them based in the Department of Physics at ETH Zurich. Their progress was summarized, almost shyly, in a July 2024 feature titled “[The Marvels of Light](https://www.phys.ethz.ch/news-and-events/d-phys-news/2024/07/light-marvels.html)” on the department’s own website. The article was written for incoming students and the curious public, not for investors, which makes its implications all the more unsettling: the future of computing may have already arrived, and it looks exactly like Jor-El’s crystals.
### Light That Thinks

To understand why this feels like science fiction breaking into reality, begin with the problem every AI engineer now faces. Modern neural networks—especially the transformer architectures behind GPT-4o, Claude 3.5, and Llama-3—perform their magic through an obscene number of matrix multiplications. Each multiplication costs energy, and energy, when moved through copper wires by electrons, becomes heat. A single high-end Nvidia H100 GPU dissipates up to 700 watts of heat—roughly the output of a hardworking space heater—just to do its share of the arithmetic. Multiply that by the hundreds of thousands of GPUs in a single hyperscale data center, and the result is an environmental and economic crisis hiding in plain sight. Microsoft, Google, and Meta alone are on track to spend more than \$300 billion on data-center infrastructure in 2025, much of it simply to air-condition the waste heat of electrons doing arithmetic.

Photonic computing, the field ETH has helped push from curiosity to prototype in under a decade, solves the problem by removing the electrons entirely. Instead of pushing charged particles through metal traces, the computation is done by light waves gliding through transparent crystals. Photons have no mass and no charge. When two light waves intersect inside a carefully engineered crystal, they can interfere, add, subtract, or multiply their amplitudes without generating heat. The energy that would have become waste heat in a traditional chip simply continues down an optical waveguide and exits the system as more light—ready to be reused, reflected back in, or quietly discarded. The second law of thermodynamics is not violated; entropy still increases. It just increases somewhere else, far more gently.

### The Crystal Itself

The ETH researchers work primarily with two classes of material, both of which look, to the naked eye, like ordinary glass.
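Both classes of material program the same primitive: interference. The standard building block in the photonics literature is the Mach–Zehnder interferometer (MZI), whose two phase shifters define a 2×2 unitary matrix acting on the incoming optical field vector; meshes of MZIs implement larger matrices. Here is a toy numpy sketch of that idea, with an illustrative phase convention and port layout rather than a model of any specific ETH device:

```python
import numpy as np

def beamsplitter():
    # Ideal lossless 50:50 directional coupler.
    return np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)

def phase(theta):
    # Phase shifter on the top arm only.
    return np.diag([np.exp(1j * theta), 1.0])

def mzi(theta, phi):
    # Mach-Zehnder interferometer: input phase, coupler,
    # internal phase, coupler. Two knobs, one 2x2 unitary.
    return beamsplitter() @ phase(theta) @ beamsplitter() @ phase(phi)

# "Program" a weight matrix by choosing phases, then multiply
# an input field vector -- the arithmetic is pure interference.
U = mzi(theta=0.7, phi=1.3)
x = np.array([1.0, 0.5])           # input optical amplitudes
y = U @ x                          # matrix-vector product, done by light

assert np.allclose(U.conj().T @ U, np.eye(2))  # lossless: U is unitary
print(np.abs(y) ** 2)              # detected intensities at the output ports
```

Because the mesh is lossless, the output intensities always sum to the input power; in linear optics, the "zero heat" claim is simply unitarity.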
The first is lithium niobate, a ferroelectric crystal that has been used in telecommunications for decades because it can change its refractive index under an electric field. Thin films of lithium niobate—sometimes only a few hundred nanometers thick—are etched with waveguides no wider than the wavelength of light. When infrared laser pulses are sent down these guides, the crystal becomes a programmable optical circuit. Weights from a trained neural network are “written” into the crystal by slightly shifting the phase of light in different regions, a process that requires picojoules of energy and, once set, persists indefinitely without refresh.

The second platform is silicon photonics integrated with phase-change materials (PCMs), the same compounds used in rewritable Blu-ray discs. A pulse of light can flip a nanoscale patch of GST (germanium-antimony-tellurium) between its glassy and crystalline states, altering how it scatters light. These patches act as non-volatile optical memory cells—tiny, permanent switches that store the parameters of a transformer model directly inside the light path. Once written, the pattern remains stable for decades, even if all power is removed. To read the weights back, you simply shine a weak laser through the crystal and measure how the light emerges on the far side.

Both approaches share a startling property: the computation and the memory occupy the same physical volume. There is no von Neumann bottleneck, no shuttling of data back and forth between RAM and processor. The crystal is simultaneously the brain and the library.

### Zero Heat, in Practice

In a series of experiments described in the ETH feature and in companion papers in *Nature Photonics* and *Physical Review X* throughout 2024, researchers demonstrated matrix multiplications at speeds exceeding 100 tera-operations per second on a single 1 cm² chip while dissipating less than 10 milliwatts of heat—roughly the power of a laser pointer.
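Taken at face value, those figures imply an energy-per-operation gap of nearly five orders of magnitude. A back-of-envelope check, using only numbers quoted in this article (the assumption that a 700 W GPU sustains the same 100 TOP/s is rough shorthand for comparison, not a benchmark):

```python
# Back-of-envelope comparison using only the figures quoted above.
photonic_power_w = 10e-3        # ~10 mW claimed for the photonic chip
photonic_ops_per_s = 100e12     # ~100 tera-operations per second
gpu_power_w = 700.0             # one H100-class GPU
gpu_ops_per_s = 100e12          # assume the same throughput, for a round comparison

photonic_j_per_op = photonic_power_w / photonic_ops_per_s   # 1e-16 J = 0.1 fJ/op
gpu_j_per_op = gpu_power_w / gpu_ops_per_s                  # 7e-12 J = 7 pJ/op

print(f"photonic:   {photonic_j_per_op:.1e} J/op")
print(f"electronic: {gpu_j_per_op:.1e} J/op")
print(f"ratio:      {gpu_j_per_op / photonic_j_per_op:,.0f}x")
```

Under these (generous) assumptions the electronic chip spends tens of thousands of times more energy per multiply than the optical one.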
For perspective, an Nvidia DGX station performing the same workload would draw several kilowatts and require liquid cooling. The absence of heat is not a side benefit; it is the entire point. Electrons collide with lattice imperfections and scatter; photons glide past one another like ghosts. Inside the crystal, there are no moving parts, no fans, no pumps, no thermal paste. The chip can operate at room temperature indefinitely, or it can be cooled passively by the Arctic air—exactly the environment Superman chose for his fortress.

### No Decay, No Entropy (Almost)

Perhaps the most philosophically unsettling aspect is the near-elimination of information decay. Electronic memory—DRAM, NAND flash, even magnetic tape—relies on physical states that leak charge or drift over time. Servers must constantly refresh their memory, consuming energy to fight entropy. Optical states encoded in phase-change materials or ferroelectric domains, by contrast, are stable for geological timescales in the absence of extreme temperatures or radiation. A crystal written today could, in principle, be read accurately a thousand years from now by anyone with a laser and a detector. This is what the ETH article means when it speaks, in almost poetic terms, of light as “a medium that remembers perfectly.” The students in the “Marvels of Light” course built simple holographic memory crystals that stored short poems by Rilke; the same principle scales to the weight matrices of a 70-billion-parameter language model.

### From Classroom Demo to Hyperscale Reality

The ETH feature is careful to present these advances as pedagogical exercises, but the laboratories behind the course are anything but academic. Tobias Kippenberg’s group at EPFL holds the world record for on-chip optical frequency combs—lasers that produce hundreds of precisely spaced wavelengths simultaneously, allowing a single crystal to perform hundreds of parallel computations in different colors of light.
Rachel Grange’s laboratory has developed lithium-niobate metasurfaces that can steer light beams with sub-wavelength precision, effectively programming the crystal in real time. And the quantum-optics wing, led by Tilman Esslinger and Jérôme Faist, has already coupled these classical photonic circuits to single-photon sources—laying the groundwork for future quantum-enhanced AI.

Industry has taken notice. Lightmatter, a Boston-based startup that licenses some of ETH’s foundational IP, raised another \$400 million in June 2025 and is shipping its second-generation photonic matrix multiplier, the Envise-2, to select hyperscalers. PsiQuantum, which collaborates closely with the same Zurich researchers on fault-tolerant photonic quantum computing, announced in October that it had achieved coherent control of one million optical modes on a single silicon wafer. The line between research prototype and production hardware is blurring faster than most observers realize.

### The Prophets of Peril: Whispers of an AI Bubble

Yet even as these crystalline innovations gleam under laboratory lights, a darker chorus has begun to echo through the financial districts of New York and San Francisco. The doomers, as they are sometimes called in the sardonic parlance of tech Twitter, have seized on the froth of the AI boom to prophesy catastrophe. Their warnings, amplified on the platform once known as Twitter and now simply X, paint a picture of an industry bloated on hype, propped up by circular financing, and poised for a spectacular rupture.

Peter Thiel, the contrarian billionaire who co-founded PayPal and Palantir, has become their unlikely patron saint. In the third quarter of 2025, Thiel's hedge fund, Thiel Macro, liquidated its entire \$100 million stake in Nvidia—once the unassailable kingpin of AI hardware—slashing its U.S. equity exposure by two-thirds in the process.
Thiel, who once likened the current AI fervor to the irrational exuberance of 1999's dot-com mania, pivoted his remaining bets to steadier giants like Microsoft and Apple, as if preparing for a storm that would spare only the most diversified arks.

Thiel is far from alone. Michael Burry, the hedge-fund manager immortalized in *The Big Short* for foreseeing the 2008 housing collapse, has wagered over \$1 billion on put options against Nvidia and Palantir, betting that the emperor of AI chips has no clothes—or at least, that its throne will soon be obsolete scrap. Burry's indictment is blunt: Big Tech, he argues on X, is "understating depreciation" on Nvidia's hardware, hoarding warehouses of soon-to-be-worthless GPUs as the relentless march of Moore's Law—or its photonic successor—renders them relics. SoftBank, the Japanese conglomerate once Nvidia's largest shareholder, offloaded \$5.8 billion in shares in October 2025, redirecting the proceeds to other AI ventures in a move that reeks of tactical retreat. Even Jamie Dimon, the strait-laced CEO of JPMorgan Chase, has waded in, conceding in October that while "AI is real," much of the \$3 trillion poured into Silicon Valley's AI infrastructure this year will prove to be "wasted."

The doomers' case is seductive in its familiarity. Nvidia's market capitalization, which briefly eclipsed \$5 trillion in October 2025—larger than the GDP of every nation save the United States and China—rests on projections of trillion-dollar annual sales by 2030, fueled by hyperscalers' insatiable hunger for compute. But beneath the surge lies a web of circular deals: Nvidia invests \$100 billion in OpenAI, which in turn buys more Nvidia chips; Meta borrows \$27 billion off-balance-sheet to build data centers that may never hum at full capacity if the AI revolution stalls.
Daron Acemoglu, the MIT economist who shared the 2024 Nobel Prize for his work on technology's labor impacts, calls it a "house of cards," warning that these interlocking financings echo the dot-com era's excesses, where promises of productivity outpaced delivery. An MIT report from August 2025 underscores the peril: despite \$30–40 billion in enterprise generative-AI investments, 95 percent of organizations report zero return.

On X, the skepticism festers like an open wound. Posts from November 2025 rail against the "AI high valuation bubble," with one user quipping that the sector's survival hinges on a mythical "AI computing power 2.0 storyline" that hasn't yet materialized. Another, echoing Vitalik Buterin's warnings, ties the frenzy to quantum threats that could crack encryption by 2028, dooming not just AI but the crypto treasuries propping it up. Cathie Wood of ARK Invest pushes back, insisting in a late-November webinar that AI's consumer-side flourishing—Palantir's 123 percent U.S. commercial growth last quarter—proves it's no bubble, just a liquidity squeeze soon to reverse. Yet the chorus of doubt grows: Wall Street analysts predict a "triple bubble burst" in 2026, lumping AI with quantum computing and Bitcoin; Reddit threads dissect Thiel's exit as a harbinger of Nvidia's doom.

These voices, amplified by the echo chambers of finance podcasts and X threads, evoke the ghosts of 2000, when Pets.com's sock-puppet mascot symbolized a tech dream unmoored from revenue. If the bubble bursts, the fallout could be biblical: stranded assets worth hundreds of billions, a credit crunch from off-balance-sheet loans, and a drag on the S&P 500, where the "Magnificent Seven" now comprise 30 percent of its weight—the highest concentration in half a century.

Peter Thiel is not wrong; he is early, and early in the way only a certain kind of investor can afford to be.
When he liquidated Nvidia in Q3 2025, he was not fleeing artificial intelligence; he was exiting the thermodynamic era of it. Thiel has spent two decades telling anyone who will listen that most technological progress since 1971 has been “bits, not atoms,” and that the real breakthroughs would arrive when someone finally fixed the atoms. Photonic crystals—silent, cold, and potentially eternal—are the first credible answer to that prayer. By selling at the absolute peak of the GPU fever dream, he has done what he always does: he has taken the contrarian position that the future will arrive sooner, and more completely, than the market has priced. The same man who backed OpenAI at a \$300 billion valuation in March 2025 is now betting that the hardware layer underneath it is about to undergo a phase change more abrupt than any software advance in history. In other words, Thiel is not denying the revolution. He is trying to own the moment the revolution changes its address.

Yet the doomers may still be right about one thing: something is wildly overpriced—just not intelligence itself.

### Efficiencies in Amber: Why the Doomers May Be Missing the Shift

But what if the doomers are not wrong about the froth—what if they are simply looking in the wrong direction? The warnings of overinvestment in heat-belching GPUs miss the quiet revolution unfolding in Zurich and beyond: a pivot to efficiencies so profound that today's excesses become tomorrow's bargain. The \$325 billion in hyperscaler capital expenditures projected for 2025 is not a bubble to be burst; it is seed capital for a hardware renaissance where photonic crystals turn waste into wonder. ETH Zurich's work, far from an academic footnote, is the vanguard of this shift.
In November 2025, researchers from ETH's Department of Information Technology and Electrical Engineering unveiled a photonic-plasmonic artificial neural network that processes degraded optical signals with a fraction of the energy of traditional chips—high speed, low latency, and no thermal overhead. Collaborating with Aristotle University and ETH spin-off Polariton Technologies, the team demonstrated signal recovery in fiber optics that could slash the power draw of AI data links by orders of magnitude, a result that applies directly to the transceivers feeding hyperscale centers.

This is no isolated feat. An April 2025 IEEE study championed silicon photonics for "scalable and sustainable AI hardware," integrating on-chip lasers, amplifiers, and non-volatile phase shifters into wafer-scale optical neural networks (ONNs) that outperform GPUs in energy efficiency while handling complex workloads like ResNet and BERT. Unlike electronic DNNs, these photonic systems leverage light's parallelism—hundreds of wavelengths computing simultaneously—achieving up to 1,000 times the efficiency for the matrix operations central to transformers. ETH's own July 2024 nonlinearity breakthrough, building on the "Marvels of Light" demos, embedded deep neural networks in disordered nanocrystal slabs, boosting recognition accuracy to 85 percent via optical frequency doubling—red light to blue, computation without a whisper of heat. Plans to swap pulsed lasers for continuous-wave sources promise even greener operation, turning the crystal into a perpetual engine.

The efficiencies cascade. A July 2025 Nature Reviews Physics paper from the University of Münster and Heidelberg—echoing ETH's multidimensional approach—proposed photonic computing that encodes data across light's "orthogonal degrees of freedom" (polarization, frequency, spatial modes), yielding three orders of magnitude better energy use than silicon chips.
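The parallelism these papers invoke has a plain linear-algebra reading: each wavelength (or polarization, or spatial mode) carries its own input vector through the same programmed weights, so a single pass of light computes a whole batch of matrix-vector products. A crude numpy analogy—the sizes are arbitrary, and real devices encode complex field amplitudes and must manage crosstalk between channels:

```python
import numpy as np

rng = np.random.default_rng(0)
n_wavelengths = 8   # independent "colors" from a frequency comb
dim = 4             # width of the optical circuit

# Weights are written once into the crystal; every channel sees them.
W = rng.standard_normal((dim, dim))

# One input vector per wavelength, launched simultaneously.
X = rng.standard_normal((dim, n_wavelengths))

# All wavelengths traverse the circuit at once: one pass, eight products.
Y = W @ X

# Each channel's output is exactly its own matrix-vector product.
for k in range(n_wavelengths):
    assert np.allclose(Y[:, k], W @ X[:, k])
```

The electronic equivalent would run those eight products sequentially or burn silicon area on eight multipliers; the optical version gets the batch dimension "for free" from the spectrum.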
Hybrid photonic-electronic chips bridge the analog-digital divide, with electronic circuits handling readout at minimal cost. Gartner's 2025 Hype Cycle nods to this momentum, placing photonic computing on the ascent toward mainstream data-center adoption. Mid-term forecasts (2025–2028) see co-packaged optics dominating chip-to-chip links, with TSMC's maturing silicon-photonics fabs enabling 102.4 Tbps switches at lower latency and power.

Even global rivals affirm the tide. China's November 2025 photonic quantum chip from CHIPX and Turing Quantum promises 1,000-fold speedups for complex tasks, backed by a 12,000-wafer annual production line—dense integration that could flood data centers with affordable optical hardware. Lightmatter's May 2025 Nature papers detail photonic AI processors running Atari reinforcement learning with electronic parity, scalable to exascale without the energy apocalypse. These aren't hypotheticals; they're prototypes shipping to AWS and Google, where in-memory photonic computing fuses storage and processing, eroding the von Neumann chasm that plagues today's GPUs.

To the doomers, this sounds like denial—the same overpromising that tanked Webvan. But history whispers otherwise. The dot-com bust pruned the irrational but birthed Amazon and Google; today's AI capex, funneled into photonic R&D, funds a hardware pivot that could deflate the GPU monopoly without detonating the sector. Nvidia's Q3 2025 revenue surged 56 percent to \$46.7 billion, data-center sales leading the charge. CEO Jensen Huang dismisses bubble talk as "tipping point" noise, crediting U.S. tariffs for onshore chip fabs that now produce Blackwell AI silicon. Thiel's exit? Less oracle than opportunist—he netted billions riding Nvidia up, and his Founders Fund still backs OpenAI at \$300 billion valuations. Burry's shorts may pay if depreciation accelerates, but photonics extends GPU lifespans to 5–8 years, not the 2–3 he fears.
The true peril isn't burst but stagnation: if efficiencies lag, the \$500 billion in 2026 capex becomes a millstone. Yet ETH's crystals suggest the opposite—a flywheel where AI designs better photonics, compounding gains. X chatter from late November captures the tension: one thread hails "AI computing 2.0" as the bubble's savior; another warns of a 2026 implosion tied to quantum risks. The doomers fixate on today's heat; the innovators see light on the horizon.

### The Economic Earthquake

If photonic crystals fulfill even a fraction of their promise, the economic consequences will be profound. Nvidia's current market capitalization of roughly \$5 trillion rests on the assumption that transformer scaling will require ever more GPUs. A transition to optical compute would strand hundreds of billions in traditional silicon investment while simultaneously collapsing the energy cost of intelligence by two to three orders of magnitude. Cheap, abundant compute has always been the scarce resource of the digital age. If light-based crystals remove that scarcity—if a rack of glass can outperform a warehouse of GPUs while sipping power—the bottleneck shifts from hardware to human imagination. The societal effects will make the smartphone revolution look quaint: universal access to superhuman medical diagnostics, climate models that predict with precognitive accuracy, creative tools that democratize artistry. But the shift demands stewardship—ensuring these efficiencies serve truth and equity, not just quarterly earnings.

### Jor-El Was Right

Superman’s crystals were never about spectacle. They were a Kryptonian hedge against civilizational collapse: a way to preserve everything worth knowing in a form that would outlast stars. The physicists at ETH Zurich are not trying to save a dying planet, but they have stumbled onto the same insight. Intelligence need not be noisy, hot, or fragile. It can be as quiet and enduring as light trapped in glass.
We laughed at the special effects in 1978. In 2025, the joke is on us. The Fortress of Solitude was not fantasy. It was prophecy.

## References

Primary source: “The Marvels of Light,” Department of Physics, ETH Zurich, July 2024. https://www.phys.ethz.ch/news-and-events/d-phys-news/2024/07/light-marvels.html

Supporting papers (all open access):

- Kippenberg et al., *Nature Photonics* 18, 512 (2024)
- Grange et al., *Physical Review X* 14, 031002 (2024)
- Asquini et al., *Optica* 11, 987 (2024)

## Additional Reading: Photonic Computing and Optical Neural Networks

* [Photonic processor could enable ultrafast AI computations with extreme energy efficiency](https://news.mit.edu/2024/photonic-processor-could-enable-ultrafast-ai-computations-1202) — MIT researchers demonstrate a fully integrated photonic processor performing deep neural network computations optically on-chip, achieving over 92% accuracy with sub-nanosecond latency. The breakthrough shows how photonics can enable faster, more energy-efficient deep learning for demanding applications.
* [Optical neural networks: progress and challenges](https://www.nature.com/articles/s41377-024-01590-3) — Comprehensive review in *Light: Science & Applications* covering the design methods, principles, and recent developments of optical neural networks. Discusses advantages like sub-nanosecond latency, low heat dissipation, and high parallelism that position ONNs as the future of AI computing.
* [Integrated lithium niobate photonic computing circuit based on efficient electro-optic conversion](https://www.nature.com/articles/s41467-025-62635-8) — September 2025 breakthrough demonstrating thin-film lithium niobate computing circuits capable of 43.8 GOPS per channel while consuming only 0.0576 pJ per operation. Shows how highly efficient electro-optic modulation enables the next generation of photonic computing systems.
* [120 GOPS Photonic tensor core in thin-film lithium niobate for inference and in situ training](https://www.nature.com/articles/s41467-024-53261-x) — October 2024 paper showcasing a fully integrated photonic tensor core achieving 120 GOPS computational speed with rapid 60 GHz weight update capability, demonstrating both supervised and unsupervised learning on high-resolution images with nanosecond latency.
* [Lithium niobate photonics: Unlocking the electromagnetic spectrum](https://www.science.org/doi/10.1126/science.abj4396) — High-level review in *Science* covering 70 years of lithium niobate as an optical material, examining its role across the electromagnetic spectrum from microwave to ultraviolet frequencies. Essential reading for understanding why LiNbO₃ is called "the silicon of photonics."
* [Roadmap for phase change materials in photonics and beyond](https://www.cell.com/iscience/fulltext/S2589-0042(23)02023-0) — Comprehensive roadmap covering phase change materials' role in active and reconfigurable photonic devices, including non-volatile memory, optical computing, and neuromorphic applications. Explains how PCMs enable ultra-compact optical switches with "set-and-forget" operation.
* [Photonics for Neuromorphic Computing: Fundamentals, Devices, and Opportunities](https://advanced.onlinelibrary.wiley.com/doi/10.1002/adma.202312825) — *Advanced Materials* review examining integrated photonic neuromorphic systems, focusing on material and device engineering breakthroughs. Covers technologies from traditional optics to advanced photonic integrated circuits enabling ultrafast artificial neural networks.
* [Lightmatter's Photonic AI Acceleration Research](https://lightmatter.co/blog/a-new-kind-of-computer/) — April 2025 industry blog detailing breakthrough photonic AI processors capable of executing ResNet, BERT, and reinforcement learning algorithms at 65.5 trillion operations per second while consuming only 78 watts. First photonic processor achieving practical AI application accuracy.
* [Efficient microresonator frequency combs](https://elight.springeropen.com/articles/10.1186/s43593-024-00075-5) — October 2024 review in *eLight* covering microresonator-based optical frequency combs that enable high repetition rates through compact chip-scale integration. Critical technology for applications in spectroscopy, telecommunications, and photonic computing.
* [Microcomb-driven silicon photonic systems](https://www.nature.com/articles/s41586-022-04579-3) — *Nature* paper demonstrating fully integrated silicon photonic systems leveraging Kerr microcombs, showing how integration of comb generators with silicon photonics creates scalable platforms for high-speed optical communications and signal processing with unprecedented efficiency.
