Deps
requests: HTTP for Humans
S1 E229m · Mar 17, 2026
Kenneth Reitz was 22 when he documented a Python library that didn't exist yet—one line to make an HTTP request instead of fifteen lines of machinery—launching the most downloaded package in Python history.

requests: HTTP for Humans

The Documentation for Something That Did Not Exist

This is episode two of What Did I Just Install.

In February twenty eleven, a twenty-two-year-old college dropout sat down and wrote the documentation for a library that had not been built yet. The documentation showed what making an HTTP request in Python should look like. Import requests. Call requests dot get with a URL. Get a response object back. Access the status code, the headers, the text, the JSON, all as simple attributes. One line to make a request. One attribute to get the result.

At the time, doing this in Python required importing two separate modules with overlapping functionality, constructing a password manager object, building an authentication handler, creating an opener director, installing the opener globally, and then calling a function that returned a response you had to manually decode. Fifteen lines of machinery for something that should be obvious. The twenty-two-year-old looked at that machinery and decided it was stupid. His name was Kenneth Reitz, and the library he documented before writing would become the most beloved package in the Python ecosystem, downloaded a billion times a month. So fundamental that there was a serious attempt to make it part of the language itself. But the story of requests is not just about beautiful code. It is about a dependency chain worth billions of downloads maintained on a budget of thirty thousand dollars a year, a fundraiser that went wrong, a mental illness that became public, and the question of who gets to define what Python feels like.

The Fifteen Lines

To understand why requests matters, you need to feel what Python was like before it existed. The year is twenty ten. You want to make an HTTP request with basic authentication. You open the Python documentation and discover that the language has two modules for this. urllib handles URL encoding. urllib2 handles actually opening URLs. They are different modules with confusingly overlapping names, and their interfaces agree on nothing.

To make your authenticated request, you start by importing urllib2. Then you create an HTTPPasswordMgrWithDefaultRealm, which is a real class name that you have to type with your actual fingers. You add a username and password to the password manager. You create an HTTPBasicAuthHandler and pass it the password manager. You create an opener using build underscore opener and pass it the handler. You install the opener globally. Now you can call urlopen on your URL, catch HTTPError and URLError as separate exception types, and call dot read on the response to get the body as bytes that you then need to decode.
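The ceremony above can be typed out. Here is a sketch in Python 3's spelling of the same machinery (urllib2 in Python 2); the URL and credentials are placeholders:

```python
import urllib.request
import urllib.error

url = "https://example.com/protected"  # placeholder endpoint

# Step one: a password manager, with a class name you type with your
# actual fingers.
password_mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
password_mgr.add_password(None, url, "user", "secret")

# Steps two through four: handler, opener, global installation.
auth_handler = urllib.request.HTTPBasicAuthHandler(password_mgr)
opener = urllib.request.build_opener(auth_handler)
urllib.request.install_opener(opener)  # mutates process-global state

# Only now can you make the request, catching two exception types and
# decoding the body from bytes yourself.
try:
    with urllib.request.urlopen(url) as resp:
        body = resp.read().decode("utf-8")
except urllib.error.HTTPError as e:
    print("server error:", e.code)
except urllib.error.URLError as e:
    print("connection error:", e.reason)
```

Every one of those objects is machinery a user must understand before the first byte moves.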

A developer writing about this in twenty ten compared the experience to programming in Java. Another noted that the documentation was not comprehensive enough to figure this out without searching forums. When Python three arrived, the reorganization did not help. urllib and urllib2 became urllib dot request, urllib dot parse, urllib dot error, and urllib dot robotparser. Four submodules. Different from Python two. Still verbose. Still nobody's idea of a good time. There was even a security vulnerability in the redirect handling, where a server could redirect urllib to a file colon slash slash URL and read local files off your machine.

Kenneth Reitz was twenty-two years old, sitting in central Pennsylvania, and had had enough.

The Wandering Street Photographer

Reitz grew up in central Pennsylvania. Born in nineteen eighty-eight, he attended what he later described as a legalistic evangelical Christian school, an experience that indirectly shaped the open-mindedness he carried into adulthood. His father was a programmer. Kenneth taught himself BASIC and C at age nine, started using Python during his first computer science course, fell in love with it, and dropped out of college. The dropout detail is the kind of thing that stops mattering once you see what someone builds, but it mattered to Reitz. He carried it as part of his identity, the person who did not finish the credential but built the thing everyone uses.

But Reitz was never just a programmer, and understanding that is essential to understanding requests. He is a serious photographer. He shoots almost exclusively in black and white using a Leica M Monochrom, a camera that has no color sensor at all, only luminance. It is permanently mounted with a thirty-five millimeter Summicron lens. He calls it his favorite object in his life and has said he plans to shoot with only that lens for the rest of his days. For film work he uses a nineteen fifty-four Leica M3 with a vintage nineteen seventy-nine Summicron. His style is street photography and personal photojournalism, documenting what he describes as human consciousness by capturing people and the things they build. He has published over twenty-six thousand photographs. His philosophy on shooting without color: nothing in life is black and white, everything is grey, and shooting pure luminance brings him one step closer to perceiving things as they truly are.

He is also a musician. He releases electronic music under the name Infinite State, playing a Moog Voyager, a Moog Sub thirty-seven, a custom Eurorack modular rig, and various drum machines. He is a percussionist by training with over twenty years of drumming experience, including marching band. His album descriptions tie directly to his mental health, the first referencing esotericism and bipolar disorder, the last referencing his schizoaffective mind. He created PyTheory, a library for exploring music theory with Python, which he describes as more of a thought exercise than a practical tool, an exploration of the tonal and harmonic interplay of ratios between various musical systems and the correlation between sensory perception, colors, and emotion.

He describes himself as a wandering street photographer, idealist, and moral fallibilist. This is the person who sat down and decided that HTTP in Python should be beautiful.

The README That Came First

The practice Reitz followed was called readme-driven development, a term popularized by Tom Preston-Werner, the co-founder of GitHub, in a twenty ten blog post. The idea is that you write the documentation before you write the code. You design how the thing should feel to use, and then you make the implementation match.

Reitz took this seriously. In a twenty thirteen essay called "How I Develop Things and Why," he wrote:

Before I start writing a single line of code, I write the README and fill it with usage examples. I pretend that the module I want to build is already written and available, and I write some code with it. You discover it. You respond to it. The user API is all that matters. Everything else is secondary.

When asked what he would recommend as a recipe for good API design, Reitz gave a single answer: The Design of Everyday Things by Don Norman. Norman's thesis is that good design makes the correct action obvious and the incorrect action difficult. A door with a handle should be pulled. A door with a plate should be pushed. When you get it wrong, the failure is in the design, not the user. Reitz applied this to code. The correct way to make an HTTP request should be the obvious way. Import requests. Call get. Get a response. No factories, no handler chains, no opener directors. The principle he articulated: encapsulating concepts is better than extrapolating complexity.

The first version was released on February thirteenth, twenty eleven. Version zero point two came the next day, Valentine's Day, which the changelog marks as "Birth!" By the end of that year, Reitz had landed a job at Heroku, the cloud platform, with the remarkable title of Python Overlord. He was twenty-three and responsible for the technical design of their entire Python application stack. Requests was not yet famous, but it was about to be.

HTTP for Humans and Its Discontents

The tagline "HTTP for Humans" replaced an earlier, more abrasive version along the lines of "HTTP that is not stupid." The positive framing was smarter. It positioned the library not as an attack on what came before but as an assertion of what was possible. The phrase caught fire. Reitz himself created a constellation of projects in the same vein: Pipenv, Python Dev Workflow for Humans. Records, SQL for Humans. Legit, Git for Humans. Maya, Datetimes for Humans. Tablib, Tabular Datasets for Humans. Other developers followed. Logging for Humans appeared. Then another Logging for Humans by a different author. Then a third.

Eventually a developer named Charles Leifer wrote a blog post whose title cut through the noise: "'For Humans' makes me cringe." His argument was specific and worth hearing.

The phrase is an empty signifier, drained of meaning by overuse. It carries false modesty, since any maintainer worth their salt commands their subject matter. And labeling your project "for Humans" implies that every library that existed in the space before was somehow not for humans. It is a backhanded dig at other libraries for having bad APIs, a pretentious way of claiming your own is superior.

His conclusion: let the library speak for itself and let others judge its quality.

Leifer had a point. The phrase had become a meme. But the original had earned it. What made requests special was not just ease of use. It was that it felt like Python. It followed the principle of least surprise. Methods were named what you expected. Objects behaved the way you hoped. Error handling was consistent. Reitz had not just written a library. He had written an argument about what Python code should feel like, and the argument was persuasive enough to reshape community expectations. After requests, APIs that forced you through fifteen lines of boilerplate felt embarrassing. The bar had moved.

Five Packages, Five of the Top Ten

Here is something that reveals the hidden architecture of the Python ecosystem. Requests has four dependencies. Four packages that it pulls in when you install it. Those four packages plus requests itself are each independently in the top ten most downloaded packages on all of PyPI. A single pip install command accounts for five of the top ten slots on the download charts.

The most important dependency is urllib3, and it has one of the best origin stories in open source. Andrey Petrov studied computer science at the University of Toronto and graduated in two thousand seven. His first real job was at TinEye, the first web-based reverse image search engine, in Toronto. TinEye's technology creates unique digital fingerprints for images and compares them against an index of every image it has seen, finding matches even when the image has been resized, cropped, or altered. The company needed to upload billions of images to Amazon S3. Petrov wrote a processing script and estimated how long it would take. Two months.

The problem was that Python's built-in HTTP tools opened a new socket for every request. Connect, handshake, transfer, close. Connect, handshake, transfer, close. Billions of times. Petrov wrote a library in about a week that pooled connections, reused sockets, handled retries, managed redirects, and did proper multipart encoding. With this library and a concurrency tool he built alongside it, the upload that was going to take two months finished in about two weeks. He asked his boss, Paul Bloore, if he could open-source it under his own name with a permissive MIT license. Bloore said yes, a decision Petrov still describes as generous and rare.
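The pooling trick is visible in urllib3's own API. A rough sketch, using an example hostname: one PoolManager hands out one reusable connection pool per host, so repeated requests skip the connect-handshake-teardown cycle.

```python
import urllib3

# One PoolManager for the whole application; up to 10 pooled sockets
# per host are kept open for reuse.
http = urllib3.PoolManager(maxsize=10)

# Asking for the same host twice yields the same pool object: that
# shared pool is what lets later requests reuse earlier sockets.
pool_a = http.connection_from_host("example.com", scheme="https")
pool_b = http.connection_from_host("example.com", scheme="https")
print(pool_a is pool_b)  # True: same host, same pool

# An actual request then routes through the pool:
#   r = http.request("GET", "https://example.com/")
#   print(r.status)
```

Against Python's one-socket-per-request default, that reuse is the entire difference between two months and two weeks.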

urllib3 is now the most downloaded third-party package in the Python ecosystem. Over a billion downloads a month. Petrov himself has said the only reason it holds that position is because requests uses it. For years, the total donations he received were five dollars from a single user. In twenty fourteen, Stripe gave a grant of three thousand seven hundred and fifty dollars for two weeks of full-time work. By twenty twenty-four, total annual funding had reached thirty thousand dollars, most from Tidelift subscriptions, with a five thousand dollar grant from the Microsoft FOSS Fund and twenty-five hundred from LaunchDarkly. The maintainer team, Illia Volochii, Seth Larson, Quentin Pradet, and Petrov as meta-maintainer, distributed about twenty-one thousand dollars among themselves after platform fees. Thirty thousand dollars a year for infrastructure that every Python application touching the internet depends on. Adding HTTP two support to urllib3 would cost an estimated fifty thousand dollars of dedicated full-time work. There is no funding for it.

The other three dependencies are certifi, which provides Mozilla's root certificates for SSL verification, originally extracted from requests itself, about a billion downloads a month. charset-normalizer, created by Ahmed Tahri, which replaced chardet because chardet's LGPL license caused problems for Apache projects and statically linked applications, reportedly twice as fast and MIT licensed. And idna, maintained by Kim Davies, which handles internationalized domain names because Python's built-in module only supports the two thousand three specification. Each is independently in the top ten on PyPI.

Every pip install requests pulls in five packages that collectively represent some of the most critical infrastructure in the Python ecosystem. Nearly all maintained by small teams or individual developers. Funded, in total, at a level that would not cover one engineer at any of the companies that depend on them.
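You can verify the dependency chain in your own environment. A small sketch, assuming requests is installed where this runs:

```python
from importlib import metadata

# List the packages that `pip install requests` pulls in, as declared
# in requests' own metadata.
deps = sorted(metadata.requires("requests") or [])
for spec in deps:
    print(spec)
```

On a current install the output names certifi, charset_normalizer, idna, and urllib3, with version constraints attached. Exact version pins will vary by release.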

The Standard Library Debate

Requests grew fast. Reitz joined Heroku in late twenty eleven, which gave him a platform and visibility. He was a Python Software Foundation Fellow by twenty twelve, keynoting PyCon Australia that same year. By twenty fourteen, requests was not just popular. It was the answer. If someone asked how to make an HTTP request in Python on any forum, the response was always "use requests."

In January twenty fifteen, Reitz opened a GitHub issue proposing to include requests in Python three point five's standard library. Forty-two comments followed. The arguments for inclusion were obvious. Everyone used it. It was more Pythonic than the built-in tools. It was the gold standard.

The arguments against were more interesting and ultimately more persuasive. Cory Benfield, a requests core maintainer, argued that being outside the standard library had given them the freedom to make choices that benefited users without being stuck behind core development policies. He added a warning: if requests entered the stdlib, the current core team would move on, and the loss of direction would inevitably erode the library's interface. Donald Stufft pointed out the practical impossibility: you could not add requests without also adding chardet and urllib3, or rewriting to remove those dependencies. Alex Gaynor noted that adding a new HTTP module to the standard library with no asyncio story seemed, in his word, bonkers.

Reitz conceded. He wrote that he was very inclined to agree. Perhaps requests was fine just the way it was. He had also said elsewhere, with the kind of bluntness that defined his public persona, that the Python standard library is where Python modules go to die. The stdlib's eighteen-month release cycle, backward compatibility requirements, and conservative change policies were the opposite of what made requests thrive.

The debate was never formalized into a PEP. It did not need to be. The consensus was clear. Requests belonged outside the standard library. The decision is still considered correct.

Twelve Days Awake

Here is where the story changes register, and it is important to give this the room it requires, because what happened to Kenneth Reitz is not a footnote to the requests story. It is part of the story.

In September twenty fifteen, Reitz experienced a total mental health crisis. He was awake for twelve days without fatigue. He experienced hallucinations that made him believe his world had new rules to follow, similar to lucid dreaming inside reality. When asked his name at the hospital, he struggled between answering Kenneth Reitz and answering I Am. He believed he was Jesus. He impersonated doctors. He was diagnosed with bipolar affective disorder with psychosis, later rediagnosed as schizoaffective disorder, a condition that combines mood disorder symptoms with psychotic features.

My brain can construct entire realities that do not exist.

He has been hospitalized, in his words, more times than he would like to count. During manic episodes, he has experienced what he describes as visions of the afterlife, extremely deep cosmological insights into what is going on behind the scenes in daily life, and extreme synesthesia between the colors of objects and emotions.

Reitz was among the first prominent open source developers to write about mental illness publicly. His essay "MentalHealthError: An Exception Occurred" appeared in twenty sixteen. He followed it with "MentalHealthError: Three Years Later," then "On Mania" in twenty twenty-four, then "What Schizoaffective Disorder Actually Feels Like" in twenty twenty-five. Each essay went deeper.

In August twenty twenty-five, he published "The Cost of Transparency: Living with Schizoaffective Disorder." In it, he described working for at least twenty companies over the preceding years, each following the same pattern. Initial success, disclosure or visibility of his condition, growing discomfort, elimination. One company that championed neurodiversity fired him within twenty-four hours of a manic episode triggered by a new medication. He was excluded from team meetings, passed over for promotions with vague explanations about communication style. He cited statistics: people with schizoaffective disorder die fifteen to twenty years younger than the general population. Only ten to twenty percent maintain competitive employment.

I'm done apologizing for living openly with schizoaffective disorder. The discrimination I've faced isn't my fault, it's a reflection of society's failure to move beyond tokenistic awareness toward genuine inclusion.

This is the person who wrote requests. A wandering street photographer with a Leica and a Moog synthesizer and a brain that can construct entire realities, who at twenty-two decided that HTTP should be for humans, and at twenty-seven learned that his own consciousness would periodically rewrite the rules of existence.

The Fundraiser, Step by Step

In March twenty eighteen, Reitz announced a fundraiser for Requests three. The headline feature would be native async and await support, which the community desperately wanted. The initial goal was approximately five thousand dollars, ostensibly for a new computer. Donations from Microsoft, Google, Slack, and individuals quickly brought the total to around thirty thousand dollars.

At the time, Reitz was sitting on the Python Software Foundation's Board of Directors and was a member of the Packaging Working Group. He had recently participated in voting for PyPI funding. The PSF staff were not aware of his personal fundraiser until it was brought to their attention.

A developer named Nathaniel J. Smith, the creator of Trio, an async library for Python, had been doing async integration work on urllib3 and was the natural person to handle the engineering. In late May, Smith left UC Berkeley and emailed Reitz to explore a consulting arrangement for the async work.

A month passed before Reitz responded. When he did, he said that most of the roughly twenty-eight thousand dollars had gone to taxes.

That is not how taxes work. That is not how commitments work.

Reitz proposed using the remaining funds for documentation rather than the promised async feature. He pointed out that he was totally dependent on Smith to implement the core technical work, but he expected Smith to do it as a volunteer.

Over the summer, communication deteriorated. Reitz stopped responding to emails. He deleted the original fundraising page from his website and scrubbed references to it from the requests documentation and his blog.

In February twenty nineteen, Reitz sent Smith an unsolicited email proposing a joint PSF grant application. Smith did not respond positively.

In May twenty nineteen, Smith published "Why I Am Not Collaborating with Kenneth Reitz." The post was detailed and specific. Smith wrote that there was a real risk that requests three would never materialize and the public impression would become that Reitz stole the money. He described the situation as a betrayal of trust that damages the entire community. He noted that Reitz's last direct code contribution to requests' source code had been whitespace cleanups in May twenty seventeen. Subsequent commits were merging documentation fixes and adding monetization features like donation links.

Another developer, Ian Stapleton Cordasco, went on record:

Having to deal with Kenneth all these years has made it such that I barely work on Python open source software anymore and have largely, quietly left the community.

In January twenty twenty-three, Reitz published "An Overdue Apology." He wrote that the psychological weight of getting everything perfect was too much, that the asynchronous landscape within Python's ecosystem had failed to meet his expectations, and that he ultimately decided, not transparently, that supporting async fully was not feasible. He quietly licensed the Requests three codebase under CC0, placing it in the public domain.

In August twenty twenty-five, he published a second response.

I raised funds to support development work. I did that work. The funds went to supporting that labor, my labor.

He drew a distinction between a failed project and misappropriated funds. He described Smith's blog post as following him ever since, dominating search results, preceding every job interview, coloring every conference proposal.

The truth probably lives somewhere in the middle, in the space between a person who took on more than they could deliver during a period of mental instability and a community that gave money for a specific thing and never got it. Neither reading fully cancels the other. The async HTTP library that requests three was supposed to become was eventually built anyway, by someone else, under a different name.

The Handoff

In July twenty nineteen, about two months after Smith's blog post, the requests repository was formally transferred to the Python Software Foundation. It moved from kennethreitz slash requests to psf slash requests on GitHub. Reitz has said he removed himself from requests on PyPI of his own accord because it was the right thing to do.

The primary maintainers became Nate Prewitt, a software engineer at AWS who also maintains boto3, and Seth Michael Larson, who in twenty twenty-three became the first-ever PSF Security Developer in Residence, a role funded by the Open Source Security Foundation's Alpha-Omega Project. Larson's job is to audit and secure the infrastructure of PyPI, CPython, and the broader Python ecosystem. Before taking that role, he was a maintainer of both urllib3 and requests.

Reitz stepped away from requests entirely. As of twenty twenty-six, he works as a Senior Software Engineer at The C++ Alliance, building the Boost dot io website in Django. He lives in Winchester, Virginia. He writes essays about mental health, consciousness, and discrimination. His main interest, he has said, is crafting for himself a simpler life. Love, relationships, friendships, stability and comfort, and the sanctuary of his home. He no longer maintains any of his major Python projects.

The Library That requests Three Was Supposed to Be

Tom Christie is the kind of developer whose work is everywhere and whose name almost nobody knows. He created Django REST Framework, one of the first examples of automatic API documentation, and the project that FastAPI's creator Sebastián Ramírez has specifically cited as inspiration. He created Starlette, currently the fastest Python web framework by benchmark. He created Uvicorn, the ASGI server that runs Starlette and therefore runs FastAPI. All of these live under his Encode organization on GitHub.

In twenty nineteen, Christie released httpx. It offers both synchronous and asynchronous support in a single library with a requests-compatible API. It supports HTTP two, which requests does not. It is, in many ways, the library that the Requests three fundraiser promised, with a different name and a different author. The irony is layered. FastAPI, which sits on top of Starlette, which Christie built, recommends httpx, which Christie also built, as its HTTP client. If you build anything with FastAPI, the middleware system you rely on descends from Starlette. The HTTP client that would replace requests in your stack was built by the same person who built the framework underneath your server.

As of twenty twenty-six, httpx's weekly downloads have overtaken requests by some measures, though the comparison is complicated by the difference between direct installs and transitive dependencies. requests still has over a million dependent repositories on GitHub, a legacy that will take years to unwind. The stable httpx one point zero has not shipped yet. The development releases suggest it is close.

The landscape looks like this. Requests remains the default for synchronous HTTP and will be for years. httpx is the modern alternative for anyone building async applications. aiohttp holds the async server niche. And urllib3 quietly underpins almost everything, downloaded over a billion times a month by people who mostly do not know it exists, maintained by a team whose entire annual budget is thirty thousand dollars, because a computer science graduate in Toronto asked his boss a generous question in two thousand seven.

Where It Runs

Requests is everywhere. If you write Python and your code talks to the internet, there is a very good chance requests is in your requirements file. Web scrapers use it. API clients use it. Deployment scripts use it. Data pipelines use it. Internal tools that nobody thinks about until they break use it. It is the kind of dependency that appears so often in requirements dot txt files that developers stop noticing it, the way you stop noticing the electrical wiring in your walls.

If requests vanished tomorrow, those projects would not stop working immediately. The installed version would remain. But the next pip install would fail, and fixing it would mean rewriting every HTTP call in every project to use httpx or, worse, urllib. The migration would be straightforward but tedious. Multiply that across every Python project in the world and you begin to understand what a billion monthly downloads means.

The latest release is version two point thirty-two point five, from August twenty twenty-five. The package is stable, mature, and largely finished. It does what it does and does it well. The excitement has moved elsewhere, to httpx, to async patterns, to HTTP two. But requests remains. The quiet foundation under everything, written by a twenty-two-year-old who thought HTTP should be for humans, who documented the library before he built it, who changed how an entire language talked to the internet, and who then discovered that his own mind would periodically rebuild reality from scratch.

He is still taking photographs. Black and white, thirty-five millimeter Summicron, everything in shades of grey.

Open a terminal and type pip install requests. It takes about two seconds. Now open a Python shell and type import requests, then r equals requests dot get, and pass it any public API URL you like. Hit enter. Now type r dot status underscore code. You should see two hundred. Type r dot json and you get the response as a Python dictionary. Three lines. That is what Kenneth Reitz was fighting for. That is what HTTP for Humans means.
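The walkthrough above, as typed. Run pip install requests first; any public JSON API works, and the GitHub API root here is just one example.

```python
import requests

# One line to make the request.
r = requests.get("https://api.github.com", timeout=10)

print(r.status_code)        # 200 when the request succeeds
print(list(r.json())[:3])   # a few keys of the parsed JSON body
```

Three lines of your own, no password managers, no openers, no handler chains.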

That was episode two.