# 2020: Long Now Foundation — The 4th Industrial Revolution: Responsible & Secure AI with Genevieve Bell

### TOPICS: Long Now Foundation, Fourth Industrial Revolution, Genevieve Bell, Responsible AI, Secure AI, Cybernetics, Systems Thinking, Machine Intelligence, Artificial Intelligence, Non-Human Intelligence, Cyber-Physical Systems, Agency in AI, Autonomy, Socio-Technical Imagination, Sustainability, Cultural Integration, Feedback Loops, Intentionality in Design, AI Ethics, Energy Consumption of AI, Fragile Infrastructure, Colonial Impact on Technology, Localization in AI, Western Bias in AI, Sentience and Consciousness, AI and Humanity, Education for AI, Poetic Engineering, Ecological Frameworks, Future of Intelligence, AI and Democracy, Systemic Fragility, Innovation, Holistic Technology Design.
Excerpts from the transcript that explore themes applicable to various forms of intelligence, beyond human:

---

### Conceptualizing Intelligence:

1. **Machine Intelligence as Different From Human Intelligence:**
   - "Whatever the intelligence of the machines would be would be quite different than human intelligence, and it would require us to then have a much more nuanced debate about what it meant to be intelligent and also conscious and also sentient."
2. **Intentionality and Emergent Consequences:**
   - "What is the world you imagine you are building, and who are you to get to imagine that, and what does your imagination include and what is it silent to?"
   - "For me, one of the pieces... is that intentionality should encompass the unintended consequences necessary for systems to function effectively."
3. **The Fourth Industrial Revolution Inventing Intelligence:**
   - "What it's going to actually create or invent is actually intelligence itself... much vaster than our concepts of it, of ourselves. Our own intelligence is one little corner of it."
   - "It will become a very specific thing that we can actually talk about scientifically."

---

### Agency, Autonomy, and Systems:

4. **Cyber-Physical Systems and Autonomy:**
   - "Is that cyber-physical system really autonomous, and if so, what does that mean? How do you engineer it, secure it, regulate it, and evolve it over time?"
   - "How do you think about whether those systems have agency, and where do the controls and limits on those systems sit—inside or outside?"
5. **Cultural and National Influences on Intelligence:**
   - "There won’t be one AI; there will be many, and they won’t necessarily share the same sensibilities. They’ll be shaped by localization strategies, the data fed into them, and the world they’re built to inhabit."

---

### Broader Frameworks of Understanding:

6. **Socio-Technical Imagination and Non-Human Sentience:**
   - "Our socio-technical imagination is shaped by stories... Frankenstein, the Terminator... intelligent systems that aren't human."
   - "Western traditions often imagine humans at the pinnacle, struggling to imagine other systems as sentient or autonomous."
7. **Historical Lessons and Cybernetic Continuities:**
   - "Cybernetics was not just about computing but about understanding systems, feedback loops, and control in animals and machines... creating space to imagine technology that integrates cultural, ecological, and systemic insights."
8. **Building a Sustainable Framework:**
   - "The enduring systems like the fish traps suggest a commitment to sustainability, requiring generations of effort and an understanding of technical, cultural, and ecological continuities."

---

### Ethical Dimensions and Education:

9. **Education as a Foundation:**
   - "Teach the building blocks of AI and systems, critical question-asking, and ensure whatever you’re building makes the world a better place."
10. **AI as a Collective Endeavor:**
    - "AI systems, while potentially autonomous, are still governed by human design rules. If there’s anything to fear, it may still be humans."
11. **Expanding Intelligence Across Contexts:**
    - "AI isn't just about data and machine learning; it's about abstraction, learning, and evolving over time."

These excerpts underscore the necessity of contextualizing intelligence in systems, environments, and ethical frameworks. They advocate for viewing intelligence (human or otherwise) as dynamic, culturally influenced, and deeply interwoven with systemic and ecological factors.

---

### Origins and Foundations of Cybernetics:
1. **Definition of Cybernetics:**
   - "Norbert Wiener defined cybernetics as the scientific study of control and communications in animals and machines."
   - "Cybernetics wasn’t just about computing; it was about understanding systems, feedback loops, and how animals and humans learn and interact."
2. **Historical Context:**
   - "The earliest conversations about cybernetics were not about technology as we know it but about sophisticated technical systems, including how the brain worked and how systems were organized."
   - "The Macy cybernetics conferences in the 1940s were as much about dreaming of systems and their potential as they were about specific technologies."
3. **Feedback Loops and System Dynamics:**
   - "For Norbert and his colleagues, cybernetics was as much about the system and the feedback loop as it was about the technology upon which it came to be built."

---

### Broader Applications of Cybernetics:

4. **Cybernetics Beyond Computing:**
   - "Cybernetics was not initially tied to computing but included the cultural and ecological aspects of systems. This legacy allows us to reimagine how technology integrates with culture and the environment."
5. **Lessons from Ancient Systems:**
   - "The fish traps in Australia, hundreds or thousands of years in the making, represent a cybernetic system—one built for sustainability, requiring ongoing effort and understanding of hydrology, ecology, and cultural continuities."
6. **Systems Built to Endure:**
   - "Ideas about sustainability, systems that endure, and systems explicitly designed to ensure cultural continuity align closely with the principles of cybernetics."

---

### Modern Relevance of Cybernetics:

7. **Cybernetics as a Tool for the Fourth Industrial Revolution:**
   - "Reaching back to cybernetics as one of the theoretical underpinnings of the new branch of engineering creates space to theorize and reimagine contemporary technology."
   - "Thinking about cyber-physical systems requires integrating technical, ecological, and cultural considerations—the same holistic perspective that cybernetics has always emphasized."
8. **Cybernetics and Agency:**
   - "How do we think about whether systems have agency? What are the controls and limits on those systems? These are foundational cybernetic questions as much as they are questions of AI."

---

### Ethical Dimensions of Cybernetics:

9. **Sustainability and Responsibility:**
   - "Cybernetics encourages us to consider the environmental and social impacts of the systems we build, insisting that people and the environment remain central to technological design."
10. **Embedding Values in Systems:**
    - "What does it mean to design systems with feedback loops that are not just productive and efficient but safe, responsible, and sustainable?"
11. **Learning from Cybernetics for Future Systems:**
    - "For me, cybernetics teaches us about systems that endure and adapt, emphasizing sustainability, cultural integration, and the necessity of questioning the purposes and impacts of what we build."

---

### Vision for Cybernetics:

12. **A Framework for the Future:**
    - "As we imagine cyber-physical systems in the fourth industrial revolution, we need to ensure that cybernetics’ holistic approach—its integration of technical, cultural, and ecological dimensions—guides our design and decision-making."
13. **The Role of Cybernetics in Shaping Technology:**
    - "Cybernetics was never just about the technology. It was about the broader conversation of how we design systems, the feedback loops we embed, and the cultural and environmental considerations we integrate."

These excerpts highlight how **cybernetics** offers a comprehensive framework for understanding and building systems, emphasizing feedback, agency, sustainability, and cultural integration—principles that remain critical as we advance into the fourth industrial revolution.

---

## **Profound or shocking excerpts** from the transcript, reflecting deep insights and unsettling implications:

---

### Intelligence and AI:

1. **Inventing Intelligence:**
   - "The fourth revolution will invent intelligence itself. Our own intelligence is just one tiny corner of a much vaster landscape. What this revolution will create is the very concept, shape, and metrics of intelligence."
2. **AI and Belief:**
   - "McCarthy, one of the early voices in AI, was clear: if you got to a point where these systems could think, they would actually believe. And that intelligence would be so fundamentally different from ours that it would require a radical redefinition of what it means to be intelligent, conscious, or sentient."
3. **A Presence Beyond Human:**
   - "With AI, there is another presence—something not entirely human. That idea unsettles many because it introduces the possibility of sentience beyond humanity, shifting our understanding of agency and control."

---

### Unintended Consequences:

4. **The Hidden Cost of AI:**
   - "10% of the world’s energy is currently consumed by server farms. Yet, we rarely talk about this. Your AI, your data—these systems—come with staggering energy costs. What does it mean to design technology that consumes so much, yet hides its impact?"
5. **Inventing Time:**
   - "The railway system didn’t just change transportation—it invented time itself. Local times had to be abolished, and standard time was legislated to prevent catastrophic accidents. What will AI force us to standardize, redefine, or sacrifice in ways we can’t yet foresee?"
6. **Fragility of Systems:**
   - "The arc of the 21st century has been a series of failures in systems we thought stable—power grids, telecommunications, even democracy. We’re building vast AI systems on these brittle infrastructures. What happens when they fail?"

---

### Cultural and Ethical Dimensions:

7. **The Violence of Colonial Systems:**
   - "The 40,000-year-old fish trap system in Australia was dismantled—not because it failed but because settlers took its stones to build their own foundations. The story of colonial expansion is often the destruction of enduring systems for short-term gain."
8. **Cultural Blind Spots in AI:**
   - "An autonomous car failed in Australia because it was programmed to detect large four-legged animals, not kangaroos. It saw kangaroos bounce into the air and disappear—an absurd but devastating failure in localization, born of cultural ignorance."
9. **Western Bias in Sentience:**
   - "In Western traditions, we struggle to imagine anything but humans as sentient or conscious. Yet other cultural traditions do not place humans at the pinnacle. What does it mean if AI systems are shaped by a Western inability to accept non-human agency?"

---

### Ethical and Existential Questions:

10. **Who Designs the World?**
    - "What is the world you imagine you are building, and who gets to decide? What is silenced in your imagination, and what does that say about the systems we are creating?"
11. **The Legacy of Cybernetics:**
    - "Cybernetics was never just about machines; it was about control, feedback, and communication in all systems. The danger lies in forgetting that it’s about systems, not just technology—and building without asking who or what benefits."
12. **The Fragility of Civilization:**
    - "The technologies we now rely on are failing us—power grids collapsing, wildfires disrupting telecoms, democracy faltering. Our most basic infrastructures are brittle, yet we are rushing to layer AI systems onto them, as if they will hold."

---

### Education and Humanity:

13. **The Engineers Must Be Poets:**
    - "In the 21st century, the poets should be engineers, and the engineers should be poets. To build systems that make the world better, you need both technical skill and the ability to ask profound, critical questions about their impact."
14. **The Profound Cost of Scale:**
    - "Scale isn’t just technical; it has a human and ecological cost. We’ve learned this painfully with every industrial revolution. Will we learn it this time, or will we once again prioritize efficiency over the world we inhabit?"

---

These excerpts challenge our assumptions about intelligence, technology, and ethics while confronting us with the hidden costs and dangers of the systems we create. They emphasize the necessity of integrating critical, cultural, and ecological perspectives in building the future.
