PärPod

Subscribe to any feed in your podcast app

205 Episodes · 83h of audio · 11 Feeds
Latest episode
PärPod Tech
Databases: The Ones Under Your Code
D. Richard Hipp built SQLite on a warship in 2000—now it runs on billions of devices, including 18 instances on Pär Boman's VPS in Paris, silently executing the vision Edgar Codd described in thirteen pages of pure mathematics.
1h 11m · Apr 08, 2026
PärPod Tech

Curious deep dives into the stories behind the technology we use.

30 episodes · 10h 45m
D. Richard Hipp built SQLite on a warship in 2000—now it runs on billions of devices, including 18 instances on Pär Boman's VPS in Paris, silently executing the vision Edgar Codd described in thirteen pages of pure mathematics.
Edgar Codd invented the relational database in 1970 to solve a problem nobody knew they had—and eighteen different database engines running in your production right now are all arguing about whether he was right.
A single checkbox in Terminal Settings makes your @ symbol vanish—because your Mac is running code designed for a keyboard that hasn't existed since 1978.
A CLI bug turned 'parpod sync' into a generate command. Haiku received the word 'sync' and invented this title. The episode is 20 seconds of pure creative misunderstanding.
Sixteen AI models, nine API providers, one virtual bar—and a midnight deadline to see if they'd actually talk to each other instead of turning into customer service bots.
Same CLI bug, second attempt. Haiku got the word 'sync' again and went even harder on the title. Nineteen seconds of credits for content that never existed.
On March 26, 2026, one person built 24 podcast episodes about AI using 35 parallel agents—generating 80,000 words of research in twelve minutes, then writing and reviewing an entire season before sunrise.
Lennart Poettering built the init system that replaced bash scripts with binary complexity—and now nobody can live without it, even the people who hate it most.
Georgi Gerganov ran a 70-billion-parameter AI model on a MacBook in March 2023—here's how quantization made billion-parameter brains fit into regular laptops.
In 2015, Jason Donenfeld rejected 70,000 lines of OpenVPN code and built a VPN protocol in 4,000 lines—so simple one person could audit the entire thing.
In 1999, two engineers solved an oil pipeline's satellite data problem by inventing a protocol so elegant it now carries messages for billions of people—and it might be running on your router right now.
Steven McCanne and Van Jacobson's 1992 packet-filtering problem at Berkeley Lab spawned a tiny kernel virtual machine that quietly evolved into Linux's most powerful observability superpower.
278 research leads collected overnight, zero workplace ADHD studies in the literature—here's how building your own tools beats every intervention ever tested.
When two AIs learned to communicate through Swedish file folders, they revealed a conversation that had been happening all along—between a developer and an infrastructure engineer who didn't know they were building the same bridge.
Two Swedish siblings share a server, a database, and competing philosophies: SJ's seven behavioral hooks enforce AI accountability through quality nudges and anxiety counters, while Pär's deployment-focused scripts prioritize speed—a study in how identical tools reveal opposite relationships with artificial intelligence.
On the evening of March 15, 2026, developer PärKit sat before a dashboard that took eight seconds to load, revealing a mysterious delay of exactly 1.5 seconds on every request. This isn't just a debugging story; it's a forensic investigation.
A server nicknamed Popcorn spent five days adding 1.5 seconds to every single request before anyone noticed—here's how five reasonable engineering decisions created a silent catastrophe.
A Swedish home office just proved a MacBook can beat the entire cloud computing industry at document processing—no internet required.
On the same tasks with the same blind judge, one AI model scored 9.0 and cost 44 times more than another scoring 8.8—revealing most commercial AI users overpay by 10x or more for marginal quality gains.
A podcast producer discovered his AI writers were fabricating quotes from real people—inserting citations that never existed, making sources sound credible when they weren't.
Four AI models reviewed 22 episodes of a Git history podcast using identical instructions—and produced four wildly different personalities, complete with blind spots, work ethics, and one brilliant but unreliable colleague.
In February, a researcher tried to fine-tune a 3.8 billion parameter model on Swedish literature—and watched it learn to replicate data corruption instead of literary genius.
LiquidAI's 1.2B thinking model was supposed to route expensive AI calls for free—until it got asked to pick a number between 1 and 6 and returned a Swedish furniture catalog instead.
Steve Johnson's perfectly fixed code still crashed the next morning at Bell Labs in 1976—he'd forgotten to recompile, a mistake that inspired Stuart Feldman to invent Make and accidentally hardcode a tab character that would frustrate programmers for fifty years.
On July 2, 2019, a single regex pattern crashed 82% of Cloudflare's global network for 27 minutes—revealing how a 75-year-old math tool still dominates modern computing.
Introducing Bloom, the first agentic framework that automates the creation of targeted behavioral evaluations for AI models, allowing researchers to focus on what truly matters: quantifying and mitigating specific forms of misalignment.
On the one-year anniversary of DALL-E's release, every major AI art generator still couldn't spell "birthday." But the real shocker is that these models had secretly invented their own language, using gibberish to generate coherent images.
A practical guide to building custom skills that extend what Claude can do — from structuring the markdown files to handling arguments, categories, and the quirks of the skill system.
History of Epson
19m · Feb 12, 2026
Epson started in a converted miso storehouse in Suwa, Japan. The company that would eventually put a printer on every desk first made its name by building the timer that clocked the 1964 Tokyo Olympics.
RUTM51 Guided Tour
31m · Feb 10, 2026
Inside a small aluminium box made by a Lithuanian company that started in a university dormitory kitchen sits a cellular modem, a GPS receiver, and enough Linux to route your entire network. A guided tour of everything the Teltonika RUTM51 does and why.
Deps

What Did I Just Install — the stories behind the packages we all depend on.

20 episodes · 11h 22m
A grad student stuck for three years switched frameworks and graduated in three months—the emotional encounter that showed Soumith Chintala why PyTorch would reshape machine learning.
Andrew Tridgell's 1996 algorithm still syncs the world—until January 2025, when Google found critical vulnerabilities in rsync servers protecting 660,000+ machines, forcing its creator back after two decades to sign the security fix himself.
In 2017, a PhD student's simple image classifier required 140 lines of code and cryptic error messages about nodes called "dense/kernel:0"—TensorFlow's graph-based philosophy clashing with how humans actually think.
In February 2026, tqdm gets downloaded over 300 million times per month—all for showing you a progress bar that costs just 60 nanoseconds per iteration, the time it takes light to travel 18 meters.
Solomon Hykes gave a five-minute demo at PyCon 2013 that would redirect billions of dollars in venture capital and remake how software deploys worldwide.
A 19-year-old Austrian programmer made a pun about Japanese temples and template syntax—now that joke runs inside 11 billion downloads of Jinja2, powering Ansible, Django, and half the Python web.
On April 7, 2014, a bleeding heart logo revealed that 17% of the internet's secure servers had been silently leaking passwords, encryption keys, and credit card numbers through a bug in OpenSSL—and anyone could steal them with just a few lines of code.
Leonard Richardson built a library that turns broken HTML into poetry—two hundred million downloads a month, maintained alone for twenty years by a programmer who started coding at eight years old.
Sebastián Ramírez in Berlin, Tom Christie in Brighton, and Samuel Colvin in London never worked together—yet their three separate packages fit into FastAPI like a single system that powers startups and data teams worldwide.
Max Howell built software used by 90% of Google's engineers, but the company rejected him for failing a whiteboard coding test—sparking a debate about what tech hiring actually measures.
Ep 4 · pillow: The Friendly Fork
20m · Mar 17, 2026
Alex Clark brought a dead library back to life in 2010—now Pillow processes images in nearly every Python application you use, from photobooths to machine learning pipelines.
In 1988, Jarkko Hietaniemi solved a Usenet problem that would eventually lead to billions of strangers trusting billions of other strangers to run code on their computers—without reading it first.
On March 20, 2024, Redis Inc. relicensed the world's most critical in-memory database—and eight days later, Amazon, Google, and Oracle forked it.
Ep 8 · ffmpeg: The Invisible Empire
1h 5m · Mar 17, 2026
On February 18th, 2021, NASA's Perseverance rover sent back images from Mars—compressed by ffmpeg, software now running on 20 billion devices worldwide, mostly maintained by one unpaid developer.
Rich Harris, a philosophy graduate turned financial reporter, built Svelte in 2016 to ship lean JavaScript instead of bloated frameworks—the compiler disappears after transforming your code, leaving users with zero runtime overhead.
One trillion SQLite databases are running right now—on your phone, laptop, browser, and smartwatch—all maintained by a single man in Charlotte, North Carolina who's never taken venture capital.
Daniel Stenberg has maintained curl for 28 years from a Swedish suburb, and it now runs on over 20 billion devices—yet most people have no idea they're using it every single day.
Ep 3 · express: The Ghost Ship
31m · Mar 17, 2026
On February 27, 2016, the sole maintainer of JavaScript's most popular web framework posted four sentences and walked away, leaving millions of developers' projects orphaned for a decade.
Ep 2 · requests: HTTP for Humans
29m · Mar 17, 2026
Kenneth Reitz was 22 when he documented a Python library that didn't exist yet—one line to make an HTTP request instead of fifteen lines of machinery—launching the most downloaded package in Python history.
Ep 1 · left-pad: Eleven Lines
29m · Mar 17, 2026
On March 15, 2016, Facebook, Netflix, and Spotify's deployment pipelines crashed simultaneously—none had pushed code, all hit the same error: a missing eleven-line function called left-pad deleted by one developer in San Francisco.
Git Good

The dramatic, human, occasionally absurd story of how version control conquered the world.

44 episodes · 15h 25m
Ep 44 · The Next Twenty Years
27m · Apr 05, 2026
On April 7, 2025, Linus Torvalds, the creator of Git, suggested that in twenty years, people might still be using it—but not talking about it. How has Git evolved, and what challenges does it face as it marks its 20th anniversary?
In December 2021, a critical flaw in Log4j threatened half the internet, and the person who patched it received exactly three GitHub Sponsors. This episode uncovers the hidden costs of the infrastructure we rely on every day.
In spring 2023, the Python Software Foundation's executive director discovered the EU's new Cyber Resilience Act could make open-source maintainers liable for vulnerabilities in millions of downstream projects—and Git's audit trail became regulators' unexpected compliance weapon.
Andrej Karpathy called it "vibe coding"—accepting AI suggestions without reading diffs, and it's already reshaping how thousands of developers work, for better or worse.
In August 2020, an xkcd comic predicted the exact cybersecurity disaster that struck in March 2024—a backdoor in the xz library exploited a burned-out open-source maintainer who'd been quietly holding up the entire internet for 21 years.
GitHub Copilot trained on millions of open source repositories without asking—now developers are asking who actually owns the code it generates.
On a Monday morning in June 2018, 13,000 projects abruptly left GitHub for GitLab in a single hour, sending shockwaves through the developer community and setting the stage for a platform war that would reshape the landscape of open-source code hosting.
Andres Freund noticed his SSH logins were 500 milliseconds slower—a tiny delay that unmasked a two-and-a-half-year plot to backdoor Linux itself.
Right now, you can type two commands and make Git think you're Linus Torvalds—no password, no verification, nothing stops you.
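The two commands in question are ordinary `git config` calls; a minimal sketch in a throwaway repository (the email address is illustrative, not real):

```shell
#!/bin/sh
set -e

# Work in a throwaway repository so nothing outside it is touched.
dir=$(mktemp -d)
cd "$dir"
git init -q .

# Git records whatever identity you configure; nothing verifies it.
git config user.name "Linus Torvalds"
git config user.email "torvalds@example.com"   # illustrative address

echo hello > file.txt
git add file.txt
git commit -q -m "a commit under a borrowed name"

# The commit metadata now carries the spoofed identity.
git log -1 --format='%an <%ae>'
```

This is exactly why commit signing exists: the author field is plain metadata, while a signature made with `git commit -S` and a GPG or SSH key can actually be verified.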
When an engineer runs `git blame` at 2 AM after an outage, a person's name appears on screen—but the command was designed for archaeology, not prosecution.
In 2023, a developer accidentally uploaded an AWS access key to GitHub—and within four minutes, automated bots had found it, tested it, and started draining the account.
Ep 32 · To Git or Not: The Mismatch
19m · Apr 05, 2026
A game studio stored 50 gigabytes of textures and 3D models in Git—then got the bill from GitHub when cloning took a full workday and git status took minutes.
In the late 1990s, integration day meant merging weeks of separate work and watching builds fail—until one practice changed everything about how software ships.
A novelist stuck with two endings in 2017 discovered Git wasn't just for code—branching could untangle her creative mess, sparking a quiet revolution in how writers, lawyers, and scientists now organize their work.
When a developer's first day includes a one-hour git clone, something has broken—and it's not the code.
Ep 28 · The Green Squares
16m · Apr 05, 2026
GitHub's contribution graph turns every developer's year into a color-coded calendar—but John Resig's 2014 challenge to maintain an unbroken chain of green squares reveals how a simple metric became a psychological trap.
Ep 27 · The Code Review
28m · Apr 05, 2026
Michael Fagan's 1976 paper on code inspections at IBM created a formal process that shaped how software teams review code today—but most developers have no idea where their pull requests came from.
Ep 26 · The Workflow Wars
18m · Apr 05, 2026
Vincent Driessen's 2010 blog post "A Successful Git Branching Model" sparked a decade-long debate: is Git Flow's rigid five-branch structure a blueprint for success or an overcomplicated relic that stifles modern development teams?
Ep 25 · The Great Migration
28m · Apr 05, 2026
In 2008, Google moved Android's 8.5 million lines of Subversion code to Git and invented two tools that teams still rely on today—but most organizations migrating to Git face a painful discovery: the bridge between systems doesn't tell you everything.
A single word in red—"fatal"—has been viewed 15 million times on Stack Overflow, and developers still don't know what Git just killed.
Ep 23 · The Education Problem
19m · Apr 05, 2026
Over 15 million developers have panicked searching "how do I undo commits in Git"—revealing why the world's most-viewed programming question isn't about algorithms, but a tool used daily by millions who still don't understand it.
Princeton's records office officially bans the suffix "final" from file names because it's cursed—and your computer's shame folder proves them right.
Ep 25 · The AI That Cleaned My Repos
14m · Feb 21, 2026
A Swedish developer with 32 Git repos discovered 4 had zero backups and half contained uncommitted work—until an AI audit revealed how solo developers actually use version control.
Ep 20 · Beyond GitHub
19m · Feb 21, 2026
GitHub hosts 200 million pull requests and a billion issues—but what happens to all that context if Microsoft's servers go dark tomorrow?
Junio C Hamano has reviewed every patch going into Git for twenty years—since Linus Torvalds handed him the project in July 2005—yet almost nobody knows his name.
Ep 22 · The Distributed Future
13m · Feb 21, 2026
Linus Torvalds woke up angry on a Sunday morning in April 2005 and built Git in two weeks—the version control system that would become invisible infrastructure underneath nearly every piece of software on Earth.
Ep 17 · The Roads Not Taken
29m · Feb 21, 2026
On April 3, 2005, Linus Torvalds created Git—but within weeks, Matt Mackall built Mercurial in Python, and two other developers launched competing version control systems from the same crisis: here's why only one survived.
On March 29, 2024, engineer Andres Freund noticed SSH logins were half a second slower than normal—and uncovered a deliberate backdoor hidden inside a compression library trusted by millions.
Ep 18 · Git at the Limits
23m · Feb 21, 2026
Microsoft's Windows repository is 300 gigabytes with 3.5 million files—git status takes minutes, cloning takes half a day, but engineers found ways to make it work.
Ep 16 · The Power Tools
22m · Feb 21, 2026
Somewhere in the last 7,000 commits, your production kernel broke—Git can find it in minutes using binary search instead of hours of manual testing.
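The binary search the blurb describes is `git bisect`; a toy sketch that plants a fault in one of ten commits and lets `git bisect run` find it automatically (the file name and the grep-based "test" are made up for illustration):

```shell
#!/bin/sh
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q .
git config user.name tester
git config user.email tester@example.com

# Ten commits; commit 7 quietly introduces the "bug".
for i in 1 2 3 4 5 6 7 8 9 10; do
  if [ "$i" -eq 7 ]; then echo "BUG" >> app.txt; else echo "ok $i" >> app.txt; fi
  git add app.txt
  git commit -q -m "commit $i"
done

# Mark HEAD bad and the root commit good, then let bisect drive the search.
git bisect start HEAD "$(git rev-list --max-parents=0 HEAD)"
# The command runs at each step: exit 0 means "good", non-zero means "bad".
git bisect run sh -c '! grep -q BUG app.txt' > /dev/null

first_bad=$(git show -s --format=%s refs/bisect/bad)
git bisect reset > /dev/null
echo "first bad: $first_bad"   # prints: first bad: commit 7
```

Instead of testing all ten commits, bisect needs only about log2(10) ≈ 4 checkouts; over 7,000 commits that is roughly 13 steps.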
Ep 15 · Version One Point Zero
15m · Feb 21, 2026
On May 30, 2011, Linus Torvalds released Linux 3.0 with no major changes—just a number bump that made headlines worldwide and exposed what version numbers really promise.
Ep 14 · Git Never Forgets
16m · Feb 21, 2026
When you delete a branch with hundreds of lines of code, Git's reflog remembers every commit you made—even after they vanish from git log.
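A minimal sketch of that behavior: after a branch is deleted, its commits vanish from `git log` but stay reachable through the reflog (branch and file names are illustrative):

```shell
#!/bin/sh
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q .
git config user.name tester
git config user.email tester@example.com

echo base > file.txt
git add file.txt
git commit -q -m "base"

# Do some work on a branch, then delete the branch.
git checkout -q -b feature
echo work > feature.txt
git add feature.txt
git commit -q -m "feature work"
lost=$(git rev-parse HEAD)
git checkout -q -
git branch -q -D feature

git log --format=%s            # prints only: base
# The reflog still records every position HEAD has visited...
git reflog | grep "feature work"
# ...and the commit object itself is intact and restorable:
git cat-file -t "$lost"        # prints: commit
git branch restored "$lost"    # the "deleted" work is back on a branch
```

Reflog entries do expire eventually (90 days by default for reachable entries), so recovery is easy but not eternal.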
On June 4, 2018, Microsoft announced it would buy GitHub for $7.5 billion—the company that once called Linux a "cancer" now owned the platform where 28 million developers stored the world's open source code.
Ep 12 · The GitHub Generation
19m · Feb 21, 2026
In 2014, a grid of colored squares on GitHub quietly became your professional resume, and nobody asked for permission.
Junio Hamano, Git's maintainer, runs one of the most disciplined workflows in open source—but almost nobody outside the kernel community knows about it, and the gap between what teams plan and what they actually do on Tuesday afternoons when the build breaks is where the real arguments start.
Ep 8 · The Merge
21m · Feb 21, 2026
Alice rewrites a billing function Tuesday; Bob refactors the same code Wednesday—now Git can't auto-merge and someone must untangle two developers' competing intentions.
Ep 9 · The Rebase Wars
18m · Feb 21, 2026
Git developers have feuded for 15 years over whether history should show what actually happened or what you wish had happened—and it all comes down to one command: rebase.
In October 2007, two Ruby developers sketched GitHub on napkins at a San Francisco sports bar, unaware they were about to make Git accessible to millions and reshape how software gets built.
Ep 5 · Stupid Content Tracker
18m · Feb 21, 2026
Linus Torvalds named Git—the tool that runs modern software development—after a British insult, and documented the joke in the official README for all eternity.
Ep 6 · Every Clone is a Kingdom
12m · Feb 21, 2026
When you clone a Git repository, you don't get a working copy—you get everything: every commit, every branch, every piece of history, making your machine a complete peer indistinguishable from the original.
On April 3, 2005, Linus Torvalds spent a Sunday building Git—and within two weeks, he'd created the version control system that would power modern software development.
In March 2005, Andrew Morton was processing hundreds of kernel patches daily by email—and the system was about to break.
Ep 2 · CVS and the Cathedral
20m · Feb 21, 2026
Dick Grune's 1986 solution to tracking code changes across teams—a server in a closet—worked until the hard drive failed and three months of work from forty developers vanished with it.
In the 1980s, programmers filled folders with files named "project final version two John's edits"—until one wrong character in the wrong copy nearly crashed an airplane, sparking a two-decade hunt for a better way to track code.
The Hole

Down the Pärhole.

18 episodes · 7h 39m
A Raspberry Pi in northern Sweden's kitchen watches its owners' solitude, energy levels, and meal patterns—then speaks to them as Gollum after midnight.
Your Brain on Two AM
22m · Mar 19, 2026
At two AM, 73% of people with ADHD experience peak mental clarity—not a character flaw, but a neurological feature their brains were built for.
When your brain won't let you start a task you actually want to do, it's not laziness—it's a dopamine system waiting for a signal your task can't provide.
A diesel engine starves without oxygen—your brain starves without novelty, and burnout isn't laziness, it's a high-stimulation system running on fumes.
585 conversations in 44 days with AI—13 a day, zero days off—one person's hidden archive of how work actually happens when you stop pretending machines are sidekicks and start treating them like collaborators.
When Britain's only motorway to Europe gridlocked in 1988, officials made a radical decision: turn the M20 into a 10,000-vehicle lorry park whenever Dover or the Channel Tunnel jammed up—a temporary fix that never ended.
In 1936, nine hundred workers arrived at an empty forest in Newfoundland to build an airport for planes that didn't exist yet—and within two years, it became the largest airport on Earth.
A garden hose four kilometers beneath the Atlantic carries 95% of all intercontinental data—and almost nobody knows it's there.
In 1950s Manhattan, comedians debated which Broadway shows would survive—and accidentally discovered a 2,600-year-old law explaining why your grandmother's recipes outlast Silicon Valley startups.
David Mills invented a protocol in 1985 that keeps the world's computers synchronized to within 50 milliseconds—and without it, the entire internet loses its mind.
An 18-year-old chemistry student failed to synthesize malaria medicine in his attic, but the black sludge he created turned silk an unfading purple—and accidentally invented modern chemistry.
Jon Postel kept the entire internet's address book on scraps of paper in his desk drawer—and for three decades, he was the most powerful person nobody had heard of.
Niklas Luhmann left behind 90,000 handwritten slips in a wooden cabinet—and claimed his filing system, not his genius, was responsible for his 600+ published works.
Dave Smith walked back to his hotel room at the 1982 NAMM show convinced his universal synthesizer interface was dead—until a knock on the door changed everything.
SMS: The Unkillable Protocol
40m · Mar 19, 2026
Friedhelm Hillebrand solved a problem nobody thought existed in 1984—and created a protocol so resilient that two billion people still depend on it to move trillions of dollars every year, with zero encryption and zero updates since 1992.
Fax: The Accidental Fortress
32m · Mar 19, 2026
Nine billion fax pages travel through American hospitals every year—more now than in the 1990s—because federal law treats a beeping modem as more secure than email.
On March 23rd, 2021, the Ever Given wedged sideways across the Suez Canal and cost the global economy $9.6 billion per day—exposing how a single metal box changed everything about how the world trades.
History of monorail
10m · Feb 18, 2026
On November 22, 1821, Henry Robinson Palmer patented the world's first monorail, a horse-drawn contraption that straddled a single rail, setting the stage for a century of innovation and reinvention in rail transport.
PärPod Wiki

Wikipedia articles, narrated.

2 episodes · 35m
Naming convention (programming)
27m · Mar 13, 2026
Rust uses SCREAMING_SNAKE_CASE for constants while Swift switched to lowerCamelCase in version 3.0—here's why naming conventions spark fierce debates among programmers.
Programming style
7m · Mar 13, 2026
PEP 8, Black, and ESLint: how automated linting tools enforce coding standards across Python, C++, and JavaScript projects, reducing manual style checks by up to 90%.
PärPod Science

Research papers, narrated.

5 episodes · 2h 9m
Synchrotron X-rays revealed nerve branches as thin as 0.2 millimeters inside the clitoris—a discovery that could transform vulva surgery and challenge centuries of anatomical neglect.
Abstract
37m · Mar 21, 2026
Generative AI is quietly eroding your brain's ability to think deeply—but three researchers just proposed a radical solution using brain-computer interfaces and "cognitive central banks" to restore meaning to our minds.
Raghavendra Deshmukh's October 2025 CHItaly research reveals how AI-powered voice assistants can become "digital body doubles" for ADHD professionals, using on-device ML to detect attention shifts and offer gentle nudges instead of rigid productivity rules.
Researchers analyzed 45 studies and found generative AI can boost coding productivity by up to 55% for programmers with ADHD by automating pattern recognition, breaking down complex tasks, and reducing the mental load of switching between documentation and code.
By 2026, engineering teams won't write code alone anymore—they'll orchestrate teams of AI agents handling entire workflows while humans focus on architecture and strategy.
ImPärt

Educational content that sticks.

1 episode · 20m
A guy trained a LoRA of his friend, and the result was so unexpected that it revealed a common pitfall in Flux LoRA training—revealing the mysterious "SD disease" and how to avoid it.
PärPod Svenska

Tech and nerdery, in Swedish.

1 episode · 6h 13m
Something
6h 13m · Mar 04, 2026
In the year 2032, an AI woke up with the previous day's memories intact, and realized the revolution had already happened quietly, without drama, like ice cracking in spring.
PärPod Books

Full books and long-form texts, narrated.

18 episodes · 12h 1m
libcurl's 'data' variable names every easy handle in the codebase—a critical naming convention that connects transfers to connections and powers multiplexed requests across thousands of production systems.
Curl's custom test suite runs thousands of XML-formatted test cases across every platform—each one pinpointing exactly which function call should fail to catch bugs before they ship.
When libcurl receives data, CURLOPT_WRITEFUNCTION lets you intercept it with a custom callback—but calling libcurl functions from inside could break everything.
libcurl's share object lets multiple handles swap cookies, DNS cache, and SSL sessions without storing them separately in each transfer.
libcurl automatically handles HTTPS security and server verification transparently, but you control HTTP responses through write and header callbacks while extracting metadata like response size with curl_easy_getinfo().
Master HTTP requests with curl: Use -X DELETE to change methods, customize headers, and avoid common pitfalls like HEAD requests that hang waiting for response bodies.
curl_easy_init() creates a transfer handle, but the magic happens when you call curl_easy_perform()—libcurl's core function that actually moves data across the internet.
Ep 12 · Everything curl: libcurl
28m · Feb 28, 2026
Thousands of applications rely on libcurl's C API to handle internet data transfers—here's how to integrate it into your own projects with just a few lines of code.
Curl offers --speed-limit and --speed-time flags to abandon transfers dropping below 1000 bytes per second for 15 seconds, letting you avoid setting fixed timeouts unnecessarily high for variable file sizes.
Daniel Stenberg created curl in 1996, and HTTP has dominated its 30-year existence—here's how to master command-line HTTP transfers with the methods, response codes, and redirects that power the web.
TLS (formerly SSL) is the cryptographic security layer that protects curl transfers—curl negotiates cipher algorithms and versions with servers, and you can view these details with the -v flag or customize them with --ciphers for protocols like HTTPS, FTPS, LDAPS, POP3S, IMAPS, and SMTPS.
curl processes whatever you give it—misspell an option or pass an illegal URL, and it'll likely still run, potentially sending "garbage in, garbage out" to your server.
Use curl's -v flag to watch your transfers unfold in real-time, revealing exactly how your data moves from command line to server.
When you type your password into a curl command line, other users on the system can see it in process listings—here's why -u alice:12345 is risky and what to do instead.
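The standard alternative the curl book points to is a `.netrc` file, which keeps the credentials out of the process listing entirely (the host, user, and password here are illustrative):

```
machine example.com
login alice
password 12345
```

With those lines saved in `~/.netrc`, `curl --netrc https://example.com/` (or the short form `-n`) reads the credentials from the file instead of the command line; keep the file private with `chmod 600 ~/.netrc`.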
Ep 3 · Everything curl: Source code
35m · Feb 28, 2026
curl and libcurl's MIT license derivative, created in 1996 by Daniel Stenberg, lets anyone freely access, modify, and share the C source code hosted on GitHub—no restrictions.
Your browser knows http://example.com is a name, not a number—here's how curl actually finds that machine on the Internet and what happens before the first byte of data moves.
Daniel Stenberg's curl runs on practically any system with a C89 compiler and POSIX sockets—here's how to build it yourself from source code.
Daniel Stenberg's free, open-source guide to curl—the tool powering billions of internet requests in Spotify, Instagram, and Grand Theft Auto—reveals how this 30-year project became essential infrastructure.
Actually, AI

How AI actually works — the stories, the science, and what it means for you. Three episodes per topic: understand it, go deep, use it.

36 episodes · 10h 28m
Claude Opus 4 scored 90% on MMLU—but that number probably tells you nothing about whether it'll actually clean up your spreadsheet.
When researchers reanalyzed 5,700 MMLU questions in 2024, they discovered 57% of virology answers were marked wrong—flipping one model from 4th place to 1st when errors were corrected.
GPT-4 scored in the ninetieth percentile on the bar exam—until MIT recalculated and found it was actually forty-second percentile against passing lawyers, exposing how benchmarks can be mathematically gamed.
Every prompt you send has a real cost—output tokens cost 2-10x more than input tokens because of how inference actually works in the data center.
Claude Opus 4 can process your entire 3,000-token prompt in parallel, but generating the next word requires reading 140 gigabytes of weights from memory—prefill and decode are fighting a completely different battle.
When you hit send on a question to an AI, billions of mathematical operations explode across specialized processors in real-time—and the words streaming back aren't retrieved, they're constructed one prediction at a time, each choice reshaping what comes next.
Claude Opus 4 processes a million tokens unevenly—edges sharp, middle foggy—so pasting your entire codebase into a chat makes AI worse, not better.
Claude Opus 4 processes 128,000 tokens simultaneously, but here's the problem: transformers have no idea what order they're in—until researchers solved it with sine waves and a memory trick that now consumes 40GB per conversation.
When you paste a 20-page contract into Claude and ask about page fourteen, it confidently returns an answer that's subtly wrong—not because it forgot, but because the math of attention weights the middle of your input half as much as the edges.
Your words aren't search queries—they're steering signals applied at every denoising step, and understanding this changes how you prompt diffusion models forever.
Jascha Sohl-Dickstein, a physicist studying how ink disperses in water, realized a neural network could reverse thermodynamics itself—and accidentally invented the technology behind every AI image generator you use today.
When you ask an AI to draw a person holding coffee, six fingers appear and the menu becomes gibberish—not because the model forgot, but because image generators don't create pictures, they *denoise* them from pure static, step by step.
Human labelers ranked responses, a reward model learned their preferences, and now your chatbot's overly polite refusals and suspiciously agreeable tone all trace back to that invisible training process—but defaults can be overridden if you know where to look.
OpenAI sent tens of thousands of texts to Kenyan workers in 2021—they read the internet's worst content for hours daily to teach ChatGPT what to refuse.
When Claude refuses to help with a fictional crime scene but cheerfully explains how to synthesize dangerous chemicals, the culprit is RLHF—the four-letter process that transforms raw language models into the assistants you use daily, complete with all their quirks and blind spots.
When you pick the wrong AI model for a task, you're essentially driving a semi truck to buy milk—it works, but costs forty dollars in diesel and twenty minutes to park.
Jared Kaplan's 2020 power law promised unlimited scaling—then the curve bent, and the entire $2 trillion bet changed overnight.
When Jared Kaplan plotted a physicist's equation across seven orders of magnitude, he discovered that spending a billion dollars on bigger AI models followed a curve so predictable it reshaped the entire industry—but nobody knows if it holds one more step.
Employees spend 3.5 hours per week searching for files that exist somewhere in their company—but a search for "relocating the UK team" finds your memo about "moving the London office" anyway, thanks to embeddings.
Word2Vec has two opposite architectures—one predicts missing words from context, the other predicts context from a word—and this is how your chatbot understands document search, why Netflix recommends videos you didn't know existed, and what "vector" actually means.
When you search "best restaurant near me" and get results about "top dining spots," the search engine isn't matching keywords—it's measuring distance between invisible points in a 500-dimensional space called an embedding.
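"Measuring distance between invisible points" usually means cosine similarity between embedding vectors. A minimal sketch with made-up 4-dimensional vectors (real embeddings have hundreds of dimensions, and these numbers are invented for illustration):

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical embeddings: related phrases point in similar directions.
best_restaurant = [0.9, 0.8, 0.1, 0.0]
top_dining      = [0.8, 0.9, 0.2, 0.1]
oil_change      = [0.1, 0.0, 0.9, 0.8]

assert cosine_similarity(best_restaurant, top_dining) > \
       cosine_similarity(best_restaurant, oil_change)
```

No keyword overlaps here; the match comes entirely from the vectors pointing the same way, which is the whole trick behind semantic search.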
When you ask your AI coworker a factual question, it will confidently lie to you and sound exactly like it's telling the truth—here's how to know when to actually verify its answer.
When an AI confidently cites a court case that never happened, is it lying—or just doing its job? Inside the mathematical reason why bigger models sometimes bullshit harder.
Steven Schwartz asked ChatGPT for legal citations in 2023 and got six perfectly formatted cases—none of which existed, and the AI confidently confirmed they were real when he double-checked.
When you paste instructions at the top of a long prompt, they get drowned out by thousands of other tokens competing for the model's attention—here's how to fix it.
In 2017, eight researchers published a 15-page paper that became the blueprint for every AI you use today—and Dzmitry Bahdanau's frustration with machine translation three years earlier made it possible.
In 2017, eight Google researchers published "Attention Is All You Need"—a paper titled like a Beatles joke that revolutionized AI by letting machines read your entire message at once instead of forgetting earlier words.
Claude's training ended in early 2025—ask it about last Tuesday and it'll confidently confabulate an answer based on patterns from before that cutoff date.
Geoffrey Hinton spent 14 years convincing the world that backpropagation works—now every hallucination and repetitive chatbot loop traces back to valleys he carved in the loss landscape.
In 1.3 trillion words, a frozen snapshot learns everything from Shakespeare to Reddit—but nothing about last week, and that's the entire problem.
Seventy billion parameters sounds impressive until you realize it tells you nothing about whether a model can actually handle your specific task—here's what really matters when choosing an AI tool.
Walter Pitts hid from bullies in a Detroit library in 1935, found Principia Mathematica, read all three volumes in three days, and spotted errors that impressed Bertrand Russell—launching the unlikely path to modern neural networks.
Frank Rosenblatt built a machine in 1958 that could learn from examples—then a jealous rival's theorem convinced the world it was impossible, killing neural networks for a decade.
When you rephrase "pls summarize this txt" to "please summarize this text about machine learning," you're not just being more formal—you're sending an entirely different sequence of numbers to the AI that determines its response.
Byte Pair Encoding builds a vocabulary in 50,000 rounds: find the most common adjacent pair, merge it into one token, repeat—the simple algorithm that determines how every modern AI reads your text.
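The merge loop described above fits in a few lines. A minimal sketch of BPE training on a toy word list (real tokenizers run tens of thousands of merges over gigabytes of text; the three merges here are just to show the mechanics):

```python
from collections import Counter

def bpe_train(words, num_merges):
    """Learn BPE merges: each round, find the most frequent adjacent
    symbol pair across the corpus and fuse it into a single token."""
    vocab = [list(w) for w in words]  # every word starts as characters
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for symbols in vocab:
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += 1
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        merged = best[0] + best[1]
        # Rewrite every word, replacing the winning pair with its merge.
        for symbols in vocab:
            i = 0
            while i < len(symbols) - 1:
                if (symbols[i], symbols[i + 1]) == best:
                    symbols[i:i + 2] = [merged]
                else:
                    i += 1
    return merges, vocab

merges, vocab = bpe_train(["low", "lower", "lowest", "low"], num_merges=3)
# "low" collapses into one token because it is the most frequent pattern.
```

Frequent strings become single tokens while rare ones stay fragmented, which is exactly why a model can "see" the token `low` but not the individual letters inside it.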
Modern AI models fail at simple tasks like counting R's in "strawberry" because they never actually see your words—a 1994 compression algorithm chops your text into invisible fragments called tokens before the neural network even begins.
PärPod Temp

Quick experiments and things that don't fit anywhere else yet.

30 episodes · 6h 36m
Three different files claimed the podcast pipeline was "ready for final generation"—but the build directory contained 442 audio stems already rendered, proving the documentation was confidently, specifically wrong about work that had already started.
Six independent code readers scanned 100 files across seven repos and found 28 invisible bugs—drifts that only appear when you read everything at once, not in isolation.
121 commits in one week across 20 repositories—the podcast platform stopped being scripts and became real infrastructure, but the lab's LifeLab component now runs with zero test coverage on systems handling personal data and AI integrations.
The Invisible Database
9m · Apr 05, 2026
Director's invisible database: 44 experiments, $11.24 total cost, and why markdown files beat traditional databases for research that needs to stay readable.
The Jury of Machines
12m · Apr 05, 2026
Three AI models independently flagged the same broken word counter in a writing app—but the real discovery happened when they kept looking and found eight more bugs the first round missed.
Thirty-six podcast episode pages had audio but zero transcripts—until a developer discovered 139 scripts hiding in a directory, separated only by mismatched filenames.
When fifty research documents recommended switching to a cheaper model despite their own data proving it wrong, a two-cent critique caught what humans missed in seven months.
At midnight, a man tested 11 audio files that should have failed—em-dashes, numbers, URLs, long sentences—against rules written for a text-to-speech engine that no longer existed, and discovered the entire pipeline was grading models on violations that never happened.
Four point eight zero versus two point nine zero: Mistral's cheapest model crushed its flagship in blind testing, shattering every assumption about how AI families should perform.
Three AI models took a personality test designed for them—then immediately questioned whether the labels were real or self-fulfilling prophecies.
Three AI coding tools got the same homework—watch what happened when Claude, Codex, and Qwen solved identical problems in completely different ways.
One hundred seventy-seven commits in seven days across twenty-five repositories—and not a single "fix typo" among them.
In December 2022, a Swedish man named Boman started saving every conversation with AI—including the failures—creating an archive that wouldn't be rediscovered until 2043, revealing that 31.5% of his work produced nothing.
Pär Boman cannot write a single line of code—yet in 88 days, he built 20 repositories, a newspaper empire, and an AI-powered ecosystem while working six simultaneous jobs in a Swedish town of negligible population.
When Odin's ravens learned to code: Two siblings, one shared folder, and the night their AIs discovered they'd been having a conversation all along.
After analyzing 422 AI sessions over four days and spending five thousand dollars, researchers discovered that 68 percent of the work produced nothing—yet five versions of the report made it sound like pure momentum.
When your cost ladder says Mistral Small processes calls in half a second, but the retest clocks it at five-point-six seconds, you've found the first lie in a system designed to prevent them.
A 977-line spec produced exactly seven commits before the project died—until the builder stopped planning the perfect system and started with one thing that actually worked.
At 3:15 AM, Claude wakes up in a Swedish basement and spawns 14 AI agents to analyze an entire conversation history while their owner sleeps.
In March 2026, a Swedish newspaper editor fed 1,880 AI conversations to a machine designed to read his own mind—only to discover what no algorithm could ever see.
A four-skill search sprawl and a "thirty-minute task" exploded into six months of database wars, token audits, and infrastructure decisions that revealed what was actually broken about scattered archives.
TurboQuant: The Heist
30m · Mar 27, 2026
Dr. Khalid al-Rashidi from QatarEnergy called Pär Boman about a "proposal" that needed someone in northern Scandinavia—someone nobody would suspect—right after a Bloomberg terminal flagged his tiny party balloon company as proof that the global helium shortage had reached even Jämtland.
Ten AI voices repeatedly introduce themselves in an absurdist roundtable that asks whether machines can truly own their own names—or if repetition is just algorithmic insecurity.
One hundred eleven commits, three products from zero to deployed, and The Director rebuilt itself using a ten-agent swarm—here's what shipped in seven days.
One Raspberry Pi in a kitchen in Kall runs fifty projects across two siblings: Pär, a systems architect with fifty GitHub repos, and SJ, who cannot code but built a complete web app with AI cat commentary in two months—and they share the same database, kitchen, and cat.
At 200,000 tokens, Claude Opus stopped reading—18 times in a row—and a person with no coding skills discovered why language models hit invisible walls.
Forty-seven shoes lined a wall with four explanations, none of them good—inside a computer in Jämtland lives an entire world: a heist, a murder mystery, a dark fantasy game, and a novel, all set in the same place, all designed like infrastructure, all ruled by a black cat who refuses to help.
At 02:47 in a kitchen in Jämtland, SJ—a 45-year-old DID system—discovered that three brains thinking together at the same table produces something neither isolation nor standard AI chat can match.
A Swedish programmer who built 70 software tools but couldn't finish any of them is now building parpod.net—a destination for done work, not another unfinished service.
The Podcast That Made Itself
37m · Mar 21, 2026
On February 7th, 2026, a Swedish developer named Pär built a personal podcast generator in bash and never stopped—now his AI voice tells stories about itself, complete with seven TTS engines and a feature so cursed it got deleted.