Deps
curl: Twenty-Eight Years from a Swedish Suburb
S1 E5 · 33m · Mar 17, 2026
Daniel Stenberg has maintained curl for 28 years from a Swedish suburb, and it now runs on over 20 billion devices—yet most people have no idea they're using it every single day.

curl: Twenty-Eight Years from a Swedish Suburb

The Most Installed Software You Have Never Thought About

This is episode five of What Did I Just Install.

There is a piece of software running on more than twenty billion devices right now. It is in your phone. It is in your car. It is in your television and your game console and your printer and your router. It is in the thermostat on your wall, the set-top box under your television, and the laptop you are using right now. It runs on over a hundred operating systems across twenty-eight processor architectures. NASA uses it, though when pressed for details, the only answer they gave was, and I quote, "We are using curl to support NASA's mission and vision." Which is the kind of non-answer that tells you the real answer is probably classified.

One person has maintained this software for twenty-eight years. He lives in a suburb south of Stockholm with his wife and two children. His name is Daniel Stenberg, and the tool is called curl, and if you have ever typed a command into a terminal that started with the word curl, you used his code. If you have never typed that command, you still used his code. You just did not know it. Every time your phone checks for an update, every time a streaming service buffers a video, every time an app talks to a server, there is a very good chance that somewhere in the stack, curl is doing the actual work of moving data from one place to another. Daniel Stenberg once sat down and counted every curl installation in his own household. He found between fifty-nine and seventy-four installations across his family's computers, phones, tablets, smart television, router, printer, and network storage device. That works out to roughly sixteen and a half curl installations per person in his house. He noted, with some amusement, that his electric car, his fitness tracker, and his heated coffee mug were the only connected devices that did not contain his code.

What curl does is deceptively simple. It transfers data using URLs. You give it an address and it fetches what is there, or it sends something to that address. Before curl existed, if you wanted your program to download a file from the internet, you had to write the socket code yourself. You had to understand the protocol. You had to handle the connection, the headers, the encoding, the errors, the redirects, the authentication, all of it. Curl wraps all of that complexity into a single command or a single library call. It supports twenty-eight protocols, from the obvious ones like HTTP and FTP to the obscure ones like Gopher and LDAP and MQTT and WebSockets. It has somewhere north of two hundred and sixty command-line options. It is, by any measure, the Swiss army knife of internet data transfer, except Swiss army knives are small and limited and curl is neither.
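To make that concrete, here is a hedged sketch of what a single curl invocation looks like. The file path and contents are invented for the demo, and the FILE protocol is used so the example works without a network connection.

```shell
# One command handles the whole transfer: connection, protocol, errors.
# Using curl's FILE protocol support so this runs without a network.
# The path /tmp/curl-demo.txt is an arbitrary example, not anything real.
printf 'hello from a local file\n' > /tmp/curl-demo.txt

# Fetch it and write the result to stdout, exactly as with any URL.
curl -s file:///tmp/curl-demo.txt

# For a remote resource, only the URL changes; -L follows redirects and
# -o names the output file (network required, so shown as a comment):
# curl -sL -o page.html https://example.com/
```

The same flags work identically across curl's protocols, which is much of the tool's appeal: the complexity lives behind one consistent interface.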

The Boy with the Commodore

Daniel Stenberg was born in nineteen seventy in Huddinge, a suburb south of Stockholm. His father worked in construction. His mother was into arts and handcrafts. He has four siblings, two brothers and two sisters. In the early nineteen eighties, he discovered computers through a friend's Commodore sixty-four, and in nineteen eighty-five, at the age of fourteen, he and his younger brother Bjorn pooled their money and bought their own. He started with BASIC, then moved to sixty-five-ten assembly language, and then he did something that would shape everything that came after. He joined the demoscene.

For anyone who has not encountered the Scandinavian demoscene, it was a subculture of young programmers who competed to create the most impressive audio-visual demonstrations possible on limited hardware. Stenberg's group was called Confusing Solution, and later he was part of groups called Horizon and Skyline Techniques. The demoscene rewarded a very specific set of skills. You had to squeeze every cycle out of the processor. You had to write tight, efficient code that did remarkable things with almost nothing. You had to be both creative and obsessive about technical detail. These are, if you think about it, exactly the skills you would need to spend three decades optimizing a data transfer tool that runs on every platform ever built.

Stenberg played football from age seven until about seventeen, when programming consumed him entirely. He was strong in mathematics at school. He did his military service, as young Swedish men did in that era, and then he went to work. His first job was at IBM in the early nineteen nineties, working on RS six thousand systems and AIX Unix. Then he moved to Frontec Railway Systems in nineteen ninety-three, writing embedded C code. Then Frontec Tekniksystem in nineteen ninety-six, consulting on embedded systems. In October nineteen ninety-seven, he co-founded a company called Haxx with a friend named Linus. It was a small consulting firm. Stenberg did contract work during the day and wrote open source software at night and on weekends.

But here is the thing about Stenberg that makes the curl story make sense. He was not just doing curl. Throughout his career he has created or co-created an astonishing number of open source projects. He co-founded Rockbox, an open source firmware replacement for MP3 players that gave you features the manufacturers never intended. He created c-ares, an asynchronous DNS resolver library. He maintained libssh2, an SSH protocol library. He wrote a tool called roffit and another called trurl. He wrote a text editor for the Amiga called FrexxEd that had its own scripting language. He even wrote an IRC bot. The man simply cannot stop building things. But of everything he built, curl was the one that ate the world.

A Currency Exchange Bot and a Brazilian Stranger

The story of curl does not begin with Daniel Stenberg. It begins with Rafael Sagula, a Brazilian developer who, on November eleventh, nineteen ninety-six, released a tiny command-line tool called HttpGet, version zero point one. It did exactly what the name says. It fetched things over HTTP. Stenberg found this tool because he had a peculiar need. He was running an IRC bot, the one he had written, and he wanted it to offer a currency exchange service. The exchange rates were published on a website, and he needed a way to automatically download them. HttpGet was close to what he needed but not quite there.

On December seventeenth, nineteen ninety-six, Stenberg contributed his first modifications and released HttpGet version zero point two. He and Sagula collaborated through nineteen ninety-seven, adding features. First came HTTP proxy support in April. Then Stenberg wanted his currency bot to grab rates from FTP servers and Gopher servers too, because in nineteen ninety-seven those were still real things that people used. At that point, a tool called HttpGet that also spoke FTP and Gopher needed a new name. They renamed it urlget.

Urlget went through versions two and three during nineteen ninety-seven. But then, in the spring of nineteen ninety-eight, Stenberg added support for FTP uploads. A tool called urlget that could also upload was, once again, misleadingly named. He needed another name. Something that captured the idea of a client-side tool that works with URLs. The letter C for "client," and URL. C, URL. Curl. Or "see URL," if you prefer the pun. On March twentieth, nineteen ninety-eight, he released the first version that carried the curl name. He called it curl four point zero, keeping the version numbering from the previous names, and that date, March twentieth nineteen ninety-eight, is curl's official birthday.

Taking curl this far and being able to work full time on my hobby project is a dream come true.

He was twenty-seven years old. He had no idea he would still be working on the same tool when he was fifty-five.

The Long March to Everywhere

What happened over the next two and a half decades is a story of relentless, incremental expansion. Not a viral moment. Not a single breakthrough. Just a developer in Sweden waking up every morning and making his tool support one more protocol, one more edge case, one more platform. In nineteen ninety-nine, curl added DICT support, then LDAP, then FILE, then TELNET. In two thousand, the tool split into two things. There was the command-line tool, curl, which is what you type in the terminal. And there was libcurl, a library with a stable programming interface that any application could embed. This was the move that changed everything.

PHP was the first major piece of software to integrate libcurl, starting with version four point zero point two. Once PHP had it, every web server running PHP had it, which in the early two thousands meant most of the internet. Then other languages picked it up. Python has pycurl. Ruby has curb. Perl, C sharp, Rust, Go, Java, all of them have bindings to libcurl. When you use a library in any of those languages that makes HTTP requests, there is a reasonable chance that underneath all the abstraction layers, libcurl is doing the actual network call.

In two thousand two, curl version seven point ten settled on the MIT license exclusively, after having gone through GPL and then the Mozilla Public License. The MIT license is about as permissive as licenses get: you can do whatever you want with the code as long as you keep the copyright notice. This was a deliberate choice. Stenberg wanted curl to be usable everywhere, including in commercial products that would never open their source code. The decision to go MIT is a big reason curl ended up inside proprietary devices from companies that would never touch GPL code.

Protocols kept coming. IMAP, POP three, and SMTP in December two thousand nine. RTSP in January two thousand ten. RTMP in May two thousand ten. HTTP two support landed with the help of the nghttp2 library. In August two thousand nineteen, curl made its first HTTP three request over QUIC, the new transport protocol that Google had been developing and that was rewriting the rules of how data moves across the internet. By two thousand twenty-two, WebSocket support arrived. The tool that started as a way to fetch currency exchange rates was now speaking practically every protocol that existed on the internet.

In two thousand eighteen, curl was bundled with Windows ten. This was a quiet milestone with enormous implications. Microsoft, the company that once called open source a cancer, was now shipping Daniel Stenberg's MIT-licensed tool as part of its flagship operating system. Every Windows machine in the world would now have curl installed by default. Apple had already been shipping it with macOS for years. Every Linux distribution included it. Every Android phone had it through its use in core libraries. Every iPhone had it through Apple's frameworks. The tool was everywhere, and it got there not through any corporate deal or marketing push, but through being so useful and so reliable that everyone independently decided they could not live without it.

The Man in Huddinge

For twenty-one years, from nineteen ninety-eight to two thousand nineteen, Stenberg worked on curl in his spare time. During the day, he had jobs. IBM, then Frontec, then Haxx, then a two-year stint at Enea working on embedded Linux distributions, then five years at Mozilla on their networking team working on HTTP and FTP and DNS. The Mozilla job was the closest thing to getting paid for curl-adjacent work, because Firefox's networking internals cover the same protocol territory that curl lives in. But it was not curl itself. Curl was always the nights and weekends project.

In February two thousand nineteen, he finally achieved what he called a dream come true. He joined wolfSSL, an American company that specializes in cryptography and embedded TLS libraries, to do commercial curl support and development full time. He was the only Swede in the company, working from his home in the Stockholm suburbs. For the first time in twenty-one years, the person maintaining the most widely deployed software on Earth was actually getting paid to do it.

But let us be clear about what "getting paid" means in this context. Stenberg is employed by one company. Curl runs on twenty billion devices manufactured by thousands of companies. The economics of this are absurd. Every major car manufacturer uses curl in their vehicles. Every game console manufacturer ships it. Every smart television, every connected printer, every set-top box. Apple, Google, Microsoft, Amazon, companies with engineering teams the size of small nations, all depend on curl. And the entire project? Maintained full-time by one man in Sweden, and one other developer, Stefan Eissing, whose work on curl's HTTP two and HTTP three implementation has been funded through the curl project. That is not a team. That is a miracle on a deadline.

curl is a team effort.

That is true, and it is also a deflection. Curl has had thousands of contributors over its lifetime. But Stenberg is the one who has been there every single day for twenty-eight years. He is the one who reviews the pull requests, manages the releases, writes the documentation, handles the security reports, maintains the website, runs the mailing lists, plans the events, finds the sponsors, and blogs about all of it. He once listed everything the job entails: security work, release management, website administration, mailing list administration, pull request reviews, user support, blogging, people management, debugging, pull request merging, continuous integration maintenance, sponsor relations, documentation, event planning, and feature development. That is not a job description. That is a small company's entire organizational chart, performed by one person.

Less Sleep, Less Other Things

How do you maintain one project for twenty-eight years? Stenberg's own explanation, when he wrote about the twenty-five-year mark, was characteristically understated.

Less sleep. Less other things.

That phrase deserves to sit for a moment. Twenty-eight years of less sleep. Twenty-eight years of less other things. He describes his approach as gradual and iterative, improving all aspects of the project continuously. He has never taken a sabbatical from curl. He has never stepped away for a year to work on something else. He has blogged over fourteen hundred posts on daniel dot haxx dot se, many of them detailed technical write-ups about curl features, curl security advisories, curl release notes. He streams curl development on Twitch. He has given talks at conferences around the world. He wrote a book called "Everything curl" that is available for free online, because of course he did. He also wrote "HTTP two explained," which has been downloaded over two hundred thousand times, and "HTTP three explained," and a book called "uncurled" about the experience of running an open source project for decades. The man is a one-person institution.

And yet Sweden barely knew he existed for most of that time. That started changing around two thousand seventeen, when he received the Polhem Prize, one of Sweden's oldest and most prestigious technology awards, and had a gold medal placed around his neck by the king of Sweden. In two thousand twenty-five, things accelerated. He was named Developer of the Year in Sweden in September. A month later, the Royal Swedish Academy of Engineering Sciences awarded him their Gold Medal with a citation recognizing his outstanding contributions to software development and his central role in internet infrastructure and open source software.

There is something poetic about these honors coming from Sweden's most traditional institutions, the same royal academy that gives out the Nobel Prizes in other categories. Daniel Stenberg, a man who spent decades quietly writing C code in his home office while the entire internet used his work without knowing his name, finally getting his flowers from the Swedish establishment. But the king of Sweden using the internet to give an award to the man who helps the internet work is a loop that feels a little too perfect.

I Will Slaughter You

The flip side of building something used by twenty billion devices is that some of those users are, to put it politely, unhinged. In February two thousand twenty-one, Stenberg received an email with a subject line that read "I will slaughter you." The body of the message contained no text, only seven screenshot images. The first showed source code from curl with Stenberg's copyright notice. The rest showed various software components. No explanation. Just a death threat and some screenshots.

Clearly you don't deserve my code.

Stenberg reported the threat to Swedish police and wrote about it publicly on his blog. The sender, who identified himself as Al Nocai, later sent a rambling follow-up claiming that curl had been used as an "attack vector" in a failed project that cost him significant money and personal losses. He blamed Stenberg for security breaches, exploits, and what he described as federal server hijacking. He claimed to have contacted various government agencies. The emails were, by any reasonable reading, the product of a person in crisis projecting that crisis onto the nearest identifiable target, which happened to be the name in the copyright notice of a tool he did not understand.

This is what it means to be a solo maintainer of critical infrastructure. Your name is in the copyright header. Your email is on the website. Your face is on the conference talks. When something breaks, or when someone imagines something broke, you are the one they find. Stenberg has talked about receiving angry messages from people who believe he personally owes them support for software they downloaded for free. The entitlement is not rare. The death threat was extreme, but it exists on a spectrum that every prominent open source maintainer recognizes.

The Eternal Comparison

There is another command-line tool for downloading things from the internet, and its name is wget. If you have spent any time in a terminal, you have probably used both, and you have probably wondered what the difference is. Stenberg himself has written about this, and his framing is elegant. Curl, he says, is like the Unix cat command. It pipes data. It reads from one place and writes to another. Wget is like cp. It copies files. One is a pipe. The other is a file manager.

The practical differences flow from this philosophical split. Wget can recursively download entire websites, following links from page to page. Curl cannot. Curl supports twenty-eight protocols and can upload data. Wget supports HTTP, HTTPS, and FTP, and can only download. Curl is powered by libcurl, a library that any application can embed. Wget is a command-line tool only, with no library counterpart. Wget is part of the GNU project, copyrighted to the Free Software Foundation, licensed under GPL version three. Curl is entirely independent, owned by Stenberg, licensed under MIT.

The governance difference matters more than you might think. Wget, being part of GNU, has to follow GNU policies and assign copyright to the FSF. Contributors to wget have to sign legal papers. Curl contributors just submit a pull request. This difference in friction, the bureaucratic overhead of one versus the casual openness of the other, may explain why curl has a much higher pace of development. More commits, more releases, more mailing list activity, more contributors, and it has been that way for over fifteen years. Both tools are excellent. But Stenberg's advice is pragmatic. Use wget when you need recursive downloading. Use curl for just about everything else.
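The cat-versus-cp framing shows up directly at the shell. A hedged sketch, again using the FILE protocol so it runs offline; the path is invented for the demo:

```shell
# curl acts like cat: it writes the fetched bytes to stdout by default,
# so it composes naturally with pipes.
printf 'piped through curl\n' > /tmp/pipe-demo.txt
curl -s file:///tmp/pipe-demo.txt | tr 'a-z' 'A-Z'

# wget acts like cp: by default it saves to a file named after the URL.
# Its recursive mode has no curl equivalent (shown as a comment, since
# wget may not be installed here):
# wget --recursive --no-parent https://example.com/docs/
```

The pipe in the first command is the whole philosophy in miniature: curl moves bytes, and whatever happens to them next is someone else's job.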

One Hundred and Eighty Thousand Lines of C

Curl is written in C. Not modern C, not C with training wheels, not C with a safety net. Just C. Roughly one hundred and eighty thousand lines of it. In a world where most new projects reach for memory-safe languages like Rust or Go, curl remains stubbornly, deliberately written in the language that gives you maximum control and maximum opportunity to shoot yourself in the foot.

Stenberg has written about this tension openly. He once analyzed curl's entire vulnerability history and found that roughly half of all security bugs were the kind of mistakes that only happen in C. Buffer overflows, use-after-free errors, off-by-one mistakes, the classic rogues gallery of memory safety problems. A memory-safe language would have prevented them entirely. So why not rewrite curl in Rust?

The answer is pragmatic, not ideological. Curl runs on over a hundred operating systems. Many of those platforms have C compilers but no Rust compiler. Some of them are embedded systems with constrained resources. A rewrite would take years and introduce new bugs while eliminating old ones. Instead, Stenberg has been pursuing a strategy of gradually introducing memory-safe components, using Rust-based TLS backends where possible, running extensive static analysis and fuzzing, and maintaining a security process that is among the most rigorous in all of open source.

That security process deserves its own moment. Curl has had a vulnerability disclosure program running since two thousand ten. Between two thousand nineteen and two thousand twenty-six, the program identified eighty-seven confirmed vulnerabilities and paid out over one hundred thousand dollars in bug bounties. Stenberg personally reviews every security report. He assigns CVE numbers. He writes detailed advisories explaining what went wrong, when it was introduced, what versions are affected, and how to fix it. The curl security advisory page is a masterclass in transparency. Every vulnerability gets a page with a severity rating, a credit to the reporter, and a link to the fix. Many open source projects handle security in the dark. Stenberg does it in public, with meticulous documentation, because he believes that is the only way to build trust.

The AI Slop That Killed the Bug Bounty

And then the robots came. Starting in late two thousand twenty-four and accelerating through two thousand twenty-five, curl's bug bounty program on HackerOne was flooded with AI-generated vulnerability reports. People were feeding curl's source code into large language models, asking the models to find security bugs, and submitting whatever the models hallucinated. The reports looked superficially plausible. They used the right terminology. They cited real function names and real code paths. But the vulnerabilities they described did not exist. The models were confabulating, generating convincing-sounding security analyses of problems that were not there.

The numbers tell the story. In previous years, curl's bug bounty had a confirmation rate north of fifteen percent, meaning more than one in seven reports turned out to be real vulnerabilities. By two thousand twenty-five, the rate plummeted below five percent. Not even one in twenty was real. Stenberg and his small team of volunteer security reviewers were spending their limited time reading page after page of AI-generated fiction, carefully evaluating each report to determine that no, this buffer overflow does not actually exist, and no, this use-after-free condition cannot actually be triggered.

It is our attempt to remove the incentives for submitting made-up lies.

On January twenty-sixth, two thousand twenty-six, Stenberg announced that curl was ending its bug bounty program. No more monetary rewards. No more HackerOne. Security reports would move to GitHub's private vulnerability reporting system and direct email. He wrote that the flood of AI slop had begun to interfere with addressing real issues and improving the software, and that the small team needed to protect its survival and its mental health. About twenty percent of all recent submissions were obvious AI slop, and many of the remaining eighty percent were low-quality human submissions that the bounty incentive had attracted.

Let that sit for a moment. The most meticulous security operation in open source, a program that had found eighty-seven real vulnerabilities and kept billions of devices safer, was shut down because people pointed language models at the code and submitted the hallucinations for beer money. The tools built to help find bugs were, in practice, creating busywork that prevented the actual maintainers from finding actual bugs. This is the open source sustainability crisis in miniature. Not a lack of funding, though that exists too, but a lack of respect for the finite time and attention of the people who keep the infrastructure running.

What Curl Depends On, and What Depends on Curl

Pull the thread on curl's own dependencies and you find a surprisingly small tree. Curl needs a TLS library for encrypted connections, and it supports multiple options here. OpenSSL, wolfSSL, GnuTLS, NSS, Secure Transport on macOS, Schannel on Windows. It needs nghttp2 for HTTP two support and ngtcp2 plus nghttp3 for HTTP three. It uses zlib for compression. It can use c-ares, the asynchronous DNS library that Stenberg himself wrote, for non-blocking name resolution. It uses libidn2 for internationalized domain names and libpsl for the public suffix list. The dependency tree is modest, well-understood, and largely maintained by people Stenberg knows personally and has worked with for years.
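You can ask any curl binary which of these libraries it was actually built against. The exact lines vary from build to build, which is the point: the dependency set is a build-time choice.

```shell
# The first line names the version and the linked dependencies, for
# example OpenSSL, zlib, nghttp2. The Protocols and Features lines
# list what this particular build supports.
curl --version
```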

Now look at what depends on curl. The answer is: effectively everything that connects to the internet. PHP's curl extension is used by virtually every PHP application in the world. Python's pycurl wraps libcurl. So does Ruby's curb. The curl command-line tool is the default way to test APIs, download files, and debug network connections on every Unix-like system. Docker uses it. Kubernetes uses it. CI/CD pipelines use it. Configuration management tools use it. Monitoring systems use it. The dependency chain going up from curl is not a tree. It is more like an ocean. Everything floats on it.

Here is a thought experiment. If curl disappeared tomorrow, what would break? The answer is not "some things" or "many things." The answer is "the internet." Not in the dramatic, everything-goes-dark sense. The protocols would still work. The servers would still respond. But the glue that connects applications to those protocols, the library call that ten thousand different programs use to say "fetch this URL," that would be gone. Every automated deployment, every health check, every API integration that relies on curl or libcurl would fail. The blast radius is genuinely incomprehensible.

The Funding Situation

The Sovereign Tech Fund, the same German government initiative that funded Express five in its decade-long journey to release, as we talked about a few episodes back, has also contributed to curl's development. There are corporate sponsors. There are individual donors. The curl project accepts contributions and Stenberg has been transparent about where the money goes and what it supports.

But the fundamental arithmetic has not changed. Curl is maintained primarily by one full-time developer and a small group of regular contributors. The bus factor, the number of people who would need to be hit by a bus before the project dies, is uncomfortably close to one. Stenberg is aware of this. He has been methodical about documentation, about keeping the codebase clean and well-tested, about mentoring contributors, about making the project something that could theoretically survive without him. But "could theoretically survive" and "would thrive" are different things. The institutional knowledge in Stenberg's head, accumulated over twenty-eight years of daily work on the same codebase, is not something you can transfer with a README file.

Compare this to the resources available to the companies that depend on curl. Apple, Google, Microsoft, Amazon, Meta, every car manufacturer, every game studio, every streaming service. These are companies with engineering teams numbering in the tens of thousands and annual budgets in the billions. The software that powers their data transfer layer is maintained by a man in Stockholm whose salary is paid by a cryptography startup in Montana. The asymmetry is breathtaking.

Code I have written runs in virtually every internet-connected device on the planet, and in most cases the users download and use it without even telling me, for free.

Twenty-Eight Years and Counting

Here is where we are in two thousand twenty-six. Curl is twenty-eight years old. Stenberg is fifty-five. He has worked on curl for more than half his life. The tool supports over a hundred operating systems and twenty-eight processor architectures. It speaks twenty-eight protocols. It has over two hundred and sixty command-line options. Stenberg has received a gold medal from the king of Sweden, a gold medal from the Royal Academy of Engineering Sciences, and the title of Sweden's Developer of the Year. He has written four books about internet protocols and open source maintenance. He streams his development work on Twitch. He is, by any reasonable measure, one of the most important and least famous software developers in the world.

The project continues to evolve. HTTP three support is maturing. Rust-based TLS backends are being integrated. The security process continues, now without the bug bounty but with the same rigor. New contributors appear. The mailing lists stay active. Stenberg still blogs, still streams, still answers questions, still reviews pull requests. Twenty-eight years is an almost incomprehensible stretch of dedication to a single piece of software. Most companies do not last twenty-eight years. Most products do not last twenty-eight years. Most careers do not stay focused on one thing for twenty-eight years.

Gradually and iteratively improve all aspects of it.

That is his description of how curl evolves, and it is also, whether he intends it or not, a description of how he has lived his professional life. No pivots. No rewrites. No dramatic reinventions. Just a steady, daily practice of making the thing a little bit better, a little more reliable, a little more capable, for twenty-eight years and counting.

Where It Connects

Curl is everywhere in your stack, and I mean everywhere. Every deployment script, every sync tool, every health check in your infrastructure ultimately relies on network transfer, and curl is the baseline. Every time git clones over HTTPS, libcurl is doing the transport. Every time a build script fetches a tarball, every time your server checks for updates, curl or libcurl is likely involved somewhere in the chain. You have it installed on your Mac, on your VPS in Paris, on your Raspberry Pi. Your phone has it. Your router has it.

But here is the specific connection that matters for this series. In the requests episode, we talked about Kenneth Reitz building HTTP for Humans on top of urllib3. In the Express episode, we traced the middleware pattern from the nineteen sixties to modern web frameworks. Curl predates all of them. It predates the web frameworks, the package managers, the language ecosystems. It is the bottom of the stack. When requests makes an HTTP call, it goes through urllib3, which uses Python's socket library, which eventually talks to the operating system's network stack. But when you test that same API endpoint from the command line, you type curl. When the deployment script checks if the server is alive, it uses curl. When the CI pipeline downloads a dependency, it often uses curl. The tool is so fundamental that it has become invisible, like plumbing.

Daniel Stenberg has been maintaining that plumbing from a suburb of Stockholm for twenty-eight years. He was twenty-seven when he started. He has won gold medals from the king. He has received death threats from strangers. He has weathered a flood of AI-generated nonsense that forced him to shut down his bug bounty program. And tomorrow he will wake up in Huddinge, open his laptop, and make curl a little bit better. Because that is what he does. That is what he has always done.

That was episode five.

Curl is already on your machine. Open a terminal and type curl dash capital I followed by any website address. You will see the HTTP headers, the invisible metadata that every server sends before the actual content. Status codes, content types, caching rules, server software. Now try curl wttr dot in. You will get a weather forecast rendered in plain text, right there in the terminal. That is curl being a toy box. It has over two hundred and sixty options, and most people only ever use two or three. Pick a quiet evening and read through curl dash dash help all. You will not finish. Nobody finishes. But you will find something you did not know the internet could do.
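For readers rather than listeners, the spoken commands written out. The first two need a network connection, so they are left as comments here; the last works offline.

```shell
# curl -I https://example.com/   # headers only: status, content type, caching
# curl wttr.in                   # plain-text weather forecast in the terminal

# This one works offline: count the lines of options curl documents.
# (--help all requires curl 7.73.0 or later.)
curl --help all | wc -l
```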