In the early nineteen eighties, a bearded computer scientist at the University of Southern California kept every address on the internet written on scraps of paper. Not a database. Not a spreadsheet. Literal scraps of paper, stuffed into folders on his desk. His name was Jon Postel, and for nearly three decades, he was essentially the phone book of the entire internet. When a new computer joined the network, Postel wrote it down. When someone needed a number, they called Postel. The Economist magazine would later call him the God of the Internet. He wore sandals everywhere he went. The one time the United States Air Force needed his help with their computer systems, they told him he had to put on shoes before boarding their planes. This man, this sandal-wearing, paper-shuffling researcher, held more power over the global communications infrastructure than most governments. And hardly anyone outside his field knew his name.
Postel is just one thread in a story that most people never think about. Every time you send a text message, load a webpage, connect a pair of wireless earbuds, or type an emoji, you are relying on decisions made by small groups of people working inside organizations that almost nobody has heard of. These are not tech companies. They are not governments. They are consortiums, task forces, working groups, and special interest groups. They are staffed by volunteers, funded by membership dues, and governed by processes so arcane that even their own members sometimes struggle to explain them. And yet, without them, nothing works. Your phone cannot call another phone. Your browser cannot load a page. The letter you just typed might not even exist.
This is the story of the committees that built the world.
In nineteen eighty-four, a Xerox researcher named Joe Becker published a paper in Scientific American called Multilingual Word Processing. It laid out a problem that sounds absurd today but was genuinely unsolved at the time. Computers could handle English just fine. The original ASCII standard used seven bits, giving you a hundred and twenty-eight characters, enough for the Latin alphabet, some punctuation, and a handful of control codes. But what about Japanese? Arabic? Hindi? Greek? Each language had its own encoding system, and none of them talked to each other. If you sent a Japanese document to someone using a different encoding, the screen filled with garbage. Mojibake, the Japanese called it. Garbled characters. Digital gibberish.
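Mojibake is easy to reproduce even today. A minimal Python sketch, assuming the sender uses the Japanese Shift-JIS encoding and the receiver misreads the same bytes as Windows-1252, a common Western fallback:

```python
# The sender encodes Japanese text (meaning "the Japanese language")
# as Shift-JIS bytes.
text = "日本語"
raw = text.encode("shift_jis")

# The receiver guesses the wrong encoding and decodes the identical
# bytes as Windows-1252. The result is mojibake: digital gibberish.
garbled = raw.decode("cp1252")
print(garbled)  # nonsense Latin characters instead of Japanese

# The bytes themselves are intact; reversing the mistake recovers the text.
assert garbled.encode("cp1252").decode("shift_jis") == text
```

The damage is in the interpretation, not the data, which is exactly why a shared encoding standard solves the problem.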
Three years after that paper, Becker met Mark Davis, a software engineer at Apple who had run headfirst into the same wall. Davis had been working on a Japanese-capable Macintosh and realized that bolting one language at a time onto existing systems would never scale. You would always be chasing the next script, the next alphabet, the next set of ideographs. Becker and Davis, along with Becker's Xerox colleague Lee Collins, began sketching out something radical. A single encoding system that could represent every character in every living language on Earth. Becker coined the name for it in a nineteen eighty-eight paper. He called it Unicode, which he described as a kind of wide-body ASCII stretched to sixteen bits.
This document is a draft proposal for the design of an international multilingual text character coding system, tentatively called Unicode. Unicode is intended to address the need for a workable, reliable world text encoding.
Sixteen bits gave them sixty-five thousand five hundred and thirty-six possible code points. Becker thought that was more than enough for every script on the planet. He was wrong, as it turned out, but the ambition was staggering. Three people, working across two rival companies, decided they were going to encode every character that human beings had ever used to communicate.
The Unicode Consortium was incorporated in Mountain View, California on the third of January nineteen ninety-one. The first volume of the standard came out that October with about seven thousand characters. Today, more than thirty years later, Unicode contains over one hundred and fifty thousand characters across more than one hundred and sixty scripts. It includes not just the living languages but dead ones too. Anatolian hieroglyphs from the Hittite world of thousands of years ago. Egyptian hieroglyphs. Cuneiform. The consortium has, in effect, created a digital archive of the entire history of human writing, one code point at a time. And for twenty years, hardly anyone outside the industry paid attention. It was not until the taco emoji was added that Unicode finally made the news.
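How wrong Becker's sixteen-bit estimate turned out to be can be checked directly. The taco emoji's code point sits above the sixty-five thousand five hundred and thirty-six values sixteen bits can hold, which is why modern UTF-16 has to spend two sixteen-bit units, a so-called surrogate pair, on it. A quick Python check:

```python
taco = "🌮"  # TACO, U+1F32E, added to Unicode in 2015

# A single sixteen-bit unit tops out at 0xFFFF (65,535);
# the taco's code point lies well beyond that ceiling.
assert ord(taco) == 0x1F32E
assert ord(taco) > 0xFFFF

# UTF-16 therefore encodes it as a surrogate pair:
# two code units, four bytes in total.
assert len(taco.encode("utf-16-le")) == 4
print(f"U+{ord(taco):04X}")  # prints U+1F32E
```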
The organization's technical decisions are made by the Unicode Technical Committee, or UTC, which meets behind closed doors. Membership is dominated by the companies you would expect. Apple. Google. Microsoft. Meta. Adobe. These companies pay dues, send engineers, and vote on which characters get included. The process sounds mundane until you realize that the UTC decides, in a very literal sense, what can and cannot be expressed in digital text. If they do not encode a script, that language effectively does not exist on computers. The power is enormous. And the people wielding it work in a conference room in California, debating the finer points of the Mongolian vowel separator.
On the sixth of August nineteen ninety-one, a physicist named Tim Berners-Lee posted a message to a newsgroup called alt.hypertext. He described a project he had been building at CERN, the European particle physics laboratory in Geneva. He called it the WorldWideWeb. He invited people to try it.
Berners-Lee had invented the web two years earlier, in nineteen eighty-nine, because he was frustrated. CERN employed thousands of scientists from dozens of countries, all using different computers and different document formats, and none of them could easily share information with each other. His solution was deceptively simple. A system of hypertext documents, linked together by a new protocol called HTTP, or Hypertext Transfer Protocol, and addressed by URLs, or Uniform Resource Locators. He wrote the first web browser and the first web server on a NeXT computer in his office at CERN. He even put a handwritten label on the machine that read, in capital letters, this machine is a server. Do not power it down.
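Underneath all the later complexity, HTTP is still plain text. A minimal sketch of the kind of request a browser sends, written here in the HTTP/1.1 form that was standardized a few years after Berners-Lee's original protocol; the host name is just an example:

```python
def build_get_request(host: str, path: str = "/") -> str:
    """Assemble a bare-bones HTTP/1.1 GET request as a text string."""
    return (
        f"GET {path} HTTP/1.1\r\n"  # request line: method, path, version
        f"Host: {host}\r\n"         # Host header, mandatory since HTTP/1.1
        "Connection: close\r\n"     # ask the server to close after replying
        "\r\n"                      # blank line ends the header block
    )

request = build_get_request("example.com")
print(request)
```

Sent over a TCP connection to port eighty, those few lines are enough to fetch a page; everything a modern browser adds is just more headers on the same textual skeleton.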
By nineteen ninety-three, the web was growing fast, and Berners-Lee faced a choice. He could have patented it. He could have licensed it. He could have built a company around it and become one of the richest people on the planet. Instead, he did something that still strikes people as either saintly or insane. He gave it away. No patents. No royalties. No restrictions.
But giving something away creates its own problem. If nobody owns the web, who decides how it works? Companies were already building browsers and servers, and they were already starting to diverge, creating proprietary features that only worked on their platforms. Berners-Lee saw the same fragmentation that had plagued character encoding before Unicode. He needed a standards body.
He first tried to get the IETF, the Internet Engineering Task Force, interested in managing web standards, but there was not enough consensus. He then talked to Michael Dertouzos, the head of MIT's Laboratory for Computer Science, who helped him find funding and a home. In October nineteen ninety-four, Berners-Lee founded the World Wide Web Consortium, known as W3C, at MIT, in collaboration with CERN and with support from DARPA, the same military research agency that had funded the original internet. He insisted from the start that the organization be global, not just American.
The Web's richness and future potential is linked to the work done at the Web Consortium to ensure it is open, interoperable, and works for everyone.
The W3C's principle was that all its standards should be royalty-free, so anyone could adopt them without paying a cent. This sounds obvious now, but in the mid-nineties, it was a radical position. Companies wanted to lock in users with proprietary technology. Berners-Lee and the W3C fought against that instinct for decades, though not always successfully. In two thousand twelve, the W3C started considering adding encrypted media extensions to HTML five, essentially building digital rights management directly into the web's core standard. The Electronic Frontier Foundation fought it bitterly, arguing that DRM was antithetical to the open web. In two thousand seventeen, the W3C approved the standard anyway, and the EFF resigned from the consortium in protest. The man who gave the web away watched as the organization he created voted to add locks to the front door.
Today, the W3C has about three hundred and fifty member organizations. In twenty twenty-three, it reformed as a nonprofit. Its standards define everything you see in a browser, from the way text is styled to the way videos play to the way your screen reader announces a webpage to a blind user. In twenty twenty-two, the W3C's WebFonts Working Group even won an Emmy Award for standardizing downloadable font technology for web and television. An Emmy. For a fonts committee.
If the W3C governs the web, the Internet Engineering Task Force governs the internet itself. And it does so in the most unlikely way imaginable. The IETF has no members. No membership fees. No formal voting. Anyone can show up. Anyone can participate. The only requirement is that you care about making the internet work.
The first IETF meeting took place in January nineteen eighty-six in San Diego. Twenty-one people attended, all funded by the United States government. The organization grew out of earlier defense-funded research into resilient, decentralized networking, a lineage often romanticized as a network built to survive a nuclear attack. But from the beginning, the IETF had a culture more like a graduate seminar than a government project. Its motto, famously articulated by MIT's David Clark in nineteen ninety-two and still repeated with near-religious fervor, is rough consensus and running code.
The phrase captures something important about how the IETF works. Decisions are not made by majority vote. They are made when the room reaches what the chair judges to be rough consensus, meaning not everyone agrees, but the serious objections have been addressed. And proposals are not evaluated on theoretical elegance alone. They are tested against working implementations. If your protocol works in the real world, it earns credibility. If it only works on a whiteboard, it does not.
The IETF publishes its work as RFCs, or Requests for Comments. The name is a relic from nineteen sixty-nine, when a UCLA graduate student named Steve Crocker needed to circulate a technical proposal about the ARPANET but did not want it to sound too authoritative. He called it a Request for Comments because it felt less aggressive than a declaration. The name stuck. More than half a century later, the internet's most fundamental protocols, TCP/IP, HTTP, TLS, DNS, are all documented in RFCs. There are now more than nine thousand of them, and once an RFC is published, it can never be changed. If you need to update it, you publish a new one.
In nineteen ninety-three, the IETF broke its ties with the United States government and became a global body under the Internet Society. By two thousand, meetings drew nearly three thousand attendees. Today the number hovers around twelve hundred, with another six hundred joining remotely. There is no exhibition hall. There are no keynote speakers trying to sell you anything. It is pure working sessions, where engineers argue about packet headers and routing algorithms over bad conference coffee.
The IETF's power is entirely voluntary. Nobody is legally required to follow an RFC. But the internet runs on trust and interoperability, and if you want your device to talk to every other device, you follow the standards. In twenty twenty-five, criticism intensified within the IETF that intelligence agencies were exerting influence over the process. The concern was not new, but it touched on something fundamental. The internet was designed by people who believed in openness. The question of how to maintain that openness when governments and corporations have competing interests is one the IETF has never fully resolved.
Which brings us back to Jon Postel and his scraps of paper. Postel was one of the original ARPANET researchers at UCLA, and he had been managing the internet's numbering system since the early nineteen seventies. Every computer on the network needed a unique address, and someone had to keep track of which addresses were assigned to whom. Postel volunteered. At first, the job was trivial. There were only a handful of machines. He wrote the numbers down and moved on with his day.
But the network grew. And kept growing. Postel's informal volunteer gig became something called the Internet Assigned Numbers Authority, or IANA. The name was partly a joke. A Harvard Law professor named Jonathan Zittrain later explained that calling it an authority sounded better than saying a guy named Jon does this.
This has come about as a side task to my research work.
By the late nineties, the situation had become untenable. The dot-com boom was turning domain names into real estate, and real estate means money, and money means everyone starts paying attention. The Domain Name System itself had been created back in nineteen eighty-three by Paul Mockapetris, a colleague of Postel's at USC, to solve the same scaling problem, translating human-readable names like example-dot-com into the numeric addresses that computers actually use. Originally, there were just seven top-level domains. Dot-com for companies. Dot-org for organizations. Dot-mil for the military. Dot-gov for government. Dot-edu for education. Dot-net for networks. And dot-arpa for the internet's own infrastructure.
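What DNS actually hands back is a number. An IPv4 address is a thirty-two-bit integer, conventionally written as four dotted bytes. The sketch below uses a fixed lookup table rather than a live DNS query, and the address in it is illustrative, not a real resolution result:

```python
import ipaddress

# A toy stand-in for DNS: a fixed table, not unlike Postel's paper records.
# The address here is illustrative only.
HOSTS = {"example.com": "93.184.216.34"}

def resolve(name: str) -> int:
    """Translate a human-readable name into the 32-bit integer machines use."""
    dotted = HOSTS[name]
    return int(ipaddress.IPv4Address(dotted))

number = resolve("example.com")
print(number)                          # the raw 32-bit form
print(ipaddress.IPv4Address(number))   # back to dotted notation
```

Real DNS replaces the fixed table with a distributed, hierarchical database, but the job is the same one Postel once did by hand: name in, number out.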
In January nineteen ninety-eight, Postel did something audacious. He rerouted a significant chunk of the internet's root server traffic through USC's systems instead of the government servers in Virginia. It was a demonstration, a way of nudging the process of formalizing internet governance along, but it also showed just how much power one person held over the global network. The United States government took notice.
That September, the Internet Corporation for Assigned Names and Numbers, or ICANN, was incorporated in California with Esther Dyson as its founding chairwoman. Postel was supposed to be its first Chief Technology Officer. He had submitted the first draft of what ICANN's structure should look like. He never got to see it built. On October sixteenth, nineteen ninety-eight, Jon Postel died from complications after heart surgery. He was fifty-five years old. The man who had held the internet's address book in his hands for nearly three decades was gone, just weeks after the organization that would take over his life's work was born.
ICANN spent the next eighteen years under formal United States government oversight. It was not until October twenty sixteen that the transition to global multistakeholder governance was completed, cutting the last formal link between the American government and control of the internet's naming system. That handover, routine and bureaucratic as it sounded, was the culmination of a political debate that had senators invoking American sovereignty and countries like Russia and China questioning why one nation should have such power over a global resource.
While the internet was being built by researchers in California, a parallel revolution was unfolding in Europe. In nineteen eighty-two, the European Conference of Postal and Telecommunications Administrations created a working group called Groupe Spécial Mobile. Their mission was to design a pan-European digital mobile phone standard. At the time, every country in Europe had its own analog mobile system, and none of them could talk to each other. A phone that worked in Sweden was useless in France.
The group spent most of the decade hammering out technical specifications. In nineteen eighty-seven, thirteen operators from twelve countries signed a memorandum of understanding committing to deploy the standard. They called it GSM, which originally stood for Groupe Spécial Mobile but was later reinterpreted as Global System for Mobile Communications, because that sounded less like a French bureaucratic committee.
The first GSM call was made in Finland on the first of July nineteen ninety-one, on a network operated by Radiolinja. Finland, population five million, beating the rest of the world to digital mobile telephony. Within two years, Telstra had deployed GSM in Australia, making it the first network outside Europe. By nineteen ninety-five, there were ten million subscribers worldwide, and the GSM MoU Association was formally registered in Switzerland to coordinate the growing ecosystem. By nineteen ninety-eight, the number had passed a hundred million. By two thousand six, it was two billion. GSM reached over ninety percent of the global mobile market at its peak, making it one of the most adopted technology standards in human history.
The GSMA, as the association came to be known, evolved from a technical coordination body into something much larger. It now represents over seven hundred and fifty mobile operators and generates estimated annual revenues of about six hundred and sixty-eight million dollars, mostly through its flagship event, the Mobile World Congress in Barcelona, which contributes over half a billion euros to the local economy every year. The GSMA also manages something most people never think about, the Type Allocation Code system that creates the unique identity number for every mobile device on Earth. Every phone has an IMEI number, and those numbers flow from the GSMA's database.
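The fifteenth digit of an IMEI is a check digit computed with the Luhn algorithm over the first fourteen, which begin with the GSMA-allocated Type Allocation Code. A minimal sketch of the computation; the sample digits are the textbook Luhn example, not a real IMEI:

```python
def luhn_check_digit(digits: str) -> int:
    """Compute the Luhn check digit for a string of decimal digits."""
    total = 0
    # Walk right to left; double every second digit, then sum digit values.
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 0:      # positions 1, 3, 5, ... from the right get doubled
            d *= 2
            if d > 9:
                d -= 9      # same as adding the two digits of the product
        total += d
    return (10 - total % 10) % 10

# Classic worked example: the payload 7992739871 takes check digit 3.
print(luhn_check_digit("7992739871"))  # prints 3
```

The check catches single-digit typos and most adjacent transpositions, which is why the same algorithm also guards credit card numbers.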
The original GSM standard's technical work was eventually transferred to the Third Generation Partnership Project, or 3GPP, which developed the 3G, 4G, and 5G standards that followed. But the GSMA remained the industry's political and commercial engine, lobbying governments for spectrum allocation, coordinating roaming agreements, and running the events where the deals get done. The story of mobile technology is often told as a story of companies, Nokia and Ericsson and Apple, but underneath every call and every data packet is a layer of agreements negotiated by people who work for an organization most phone users have never heard of.
Not all of these organizations are young. The oldest traces its roots to eighteen eighty-four, when electricity itself was the emerging technology. In the spring of that year, a small group of professionals met in New York City to form the American Institute of Electrical Engineers, or AIEE. Among its early leaders were some of the most famous names in the history of technology. The founding president was Norvin Green of Western Union. Thomas Edison was there, representing the electric power industry he was inventing in real time. Alexander Graham Bell showed up from the telephone side.
That October, the AIEE held its first technical meeting at the Franklin Institute in Philadelphia. Six papers were presented and published as the first issue of the society's journal. Within a year, they had formed eleven technical committees. One of the first things they did was try to standardize the names for electrical units, because in eighteen eighty-five, even the terminology was chaos.
Meanwhile, a parallel world was forming around a different kind of electricity. In nineteen twelve, radio technology practitioners formed the Institute of Radio Engineers, or IRE, devoted to the emerging world of wireless communication and electronics. For fifty years, the two organizations existed side by side, their memberships increasingly overlapping as the boundaries between power engineering and electronics blurred. Television. Radar. Transistors. Computers. By the nineteen forties, the IRE was growing faster. By nineteen fifty-seven, it was the larger organization. On the first of January nineteen sixty-three, the two merged to form the Institute of Electrical and Electronics Engineers, or IEEE, pronounced eye-triple-E.
At its formation, IEEE had a hundred and fifty thousand members, ninety-three percent of them in the United States. Today it has over four hundred thousand members across a hundred and sixty countries. It publishes nearly thirty percent of the world's technical literature in electrical and electronics engineering. It maintains over nine hundred active standards. And one of those standards, IEEE 802.11, is the technical specification behind the wireless networking technology that everyone simply calls Wi-Fi. Every time you connect to a wireless network, you are using a standard developed by a committee within an organization whose origins date to a time when Thomas Edison was still alive.
The best naming story in all of technology standards belongs to Bluetooth. In nineteen ninety-six, three industry giants, Intel, Ericsson, and Nokia, met to plan the standardization of a short-range radio technology that could connect different devices wirelessly. They each had their own experimental systems, but they needed a common standard. They also needed a name.
In nineteen ninety-seven, Jim Kardach from Intel and Sven Mattisson from Ericsson traveled to Toronto for a strategy meeting. Their presentation was met with, as Mattisson later put it, a lukewarm reception. The industry was not yet convinced that short-range wireless links between phones and computers were worth the trouble. After the meeting, Kardach and Mattisson went out for drinks to drown their sorrows.
We had a few beers, and since Jim was interested in stories about Vikings, it became the night's big topic of conversation.
Mattisson had recently read The Long Ships by Frans Bengtsson, a Swedish historical novel about Viking adventures. He told Kardach about a tenth-century Danish king named Harald Blåtand Gormsson, or in English, Harald Bluetooth. The king was famous for two things. He united the warring Danish tribes into a single kingdom and conquered Norway. And he had a dead tooth that had turned a dark blue-grey color, earning him his distinctive nickname.
King Harald Bluetooth was famous for uniting Scandinavia just as we intended to unite the PC and cellular industries with a short-range wireless link.
Kardach went home and found a picture of the Jelling runestone, a massive carved stone erected by Harald Bluetooth, in a history book that had just arrived in the mail. The coincidence felt like fate. He proposed Bluetooth as a temporary code name, a placeholder until the marketing department could come up with something better. The alternatives were RadioWire and PAN, for Personal Area Networking. PAN was the front-runner until someone searched the internet and found it already had tens of thousands of hits. By that point, everyone was already calling the technology Bluetooth, and the name had stuck.
The Bluetooth logo is a bind rune, a combination of two characters from the Younger Futhark runic alphabet. Hagall and Bjarkan, corresponding to the letters H and B, Harald Bluetooth's initials. A Viking king from the tenth century, commemorated on a thousand-year-old runestone in Denmark, now lives on the settings screen of every smartphone on Earth.
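Fittingly, the two runes in the logo are themselves Unicode characters, living in the Runic block the consortium encoded in the nineteen nineties. A quick Python check, with character names taken from the Unicode database:

```python
import unicodedata

hagall = "\u16BC"   # the long-branch Hagall rune, H
bjarkan = "\u16D2"  # the Bjarkan rune, B

for rune in (hagall, bjarkan):
    print(f"U+{ord(rune):04X}  {unicodedata.name(rune)}")

# The official character names record both the rune and the Latin
# letter it corresponds to.
assert "HAGALL" in unicodedata.name(hagall)
assert "BJARKAN" in unicodedata.name(bjarkan)
```

Two of the committees in this story meet inside a single logo: the rune the Bluetooth SIG chose is rendered on your screen only because the Unicode Consortium assigned it a code point.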
The Bluetooth Special Interest Group, or SIG, was formally launched in May nineteen ninety-eight with five founding members, Ericsson, Intel, Nokia, Toshiba, and IBM. The first Bluetooth device was revealed in nineteen ninety-nine, a hands-free mobile headset. Today the SIG has over thirty-five thousand member companies. And it all started because two engineers got rejected at a meeting and went to a bar to talk about Vikings.
Step back and look at the full picture. Unicode decides what characters exist. The W3C decides how the web works. The IETF decides how the internet's protocols behave. ICANN decides how domain names and addresses are assigned. The GSMA coordinates the mobile networks. IEEE maintains the standards for everything from power grids to Wi-Fi. The Bluetooth SIG governs short-range wireless. There are others too. The Internet Society. The Wi-Fi Alliance. The Third Generation Partnership Project. The International Organization for Standardization, or ISO, whose name is not actually an acronym but a reference to the Greek word isos, meaning equal, chosen so that the abbreviation would be the same in every language.
Together, these organizations form something like an invisible parliament. They have no armies. They pass no laws. Most of them cannot compel anyone to do anything. And yet, their decisions shape the daily experience of billions of people. The reason your Japanese friend can read your Swedish text message is Unicode. The reason a webpage looks the same in Chrome and Firefox and Safari is the W3C. The reason your email arrives at all is the IETF. The reason you can make a phone call in Barcelona using a phone you bought in Seoul is the GSMA. These organizations are the plumbing of civilization, and like all plumbing, they are invisible until something breaks.
What makes them remarkable is how fragile the whole arrangement actually is. Most of them run on volunteer labor. The IETF has no formal membership and no mandatory compliance. The Unicode Consortium was so underfunded for so long that the taco emoji was the first time most people learned it existed. The W3C struggled for years to fund itself before converting to a nonprofit. Jon Postel ran the internet's address book as a side task to his research. The Bluetooth name was chosen in a bar after a failed presentation.
And yet it works. Not perfectly, not without politics, not without corporate maneuvering and government pressure, but it works. The standards hold. The protocols interoperate. Your phone talks to your earbuds, your browser renders the page, your message arrives in a script that your recipient's device knows how to display.
There is a word that keeps showing up in the histories of these organizations. Consensus. Rough consensus and running code at the IETF. Consensus-based decision-making at the UTC. Multi-stakeholder consensus at ICANN. The word sounds bureaucratic, almost boring. But it captures something profound about how the digital world was built. Not by decree. Not by market dominance, though markets played their part. Not by government mandate, though governments were often involved. But by groups of people, often small groups, often volunteers, sitting in rooms and arguing until they reached something close enough to agreement that everyone could live with it.
That process is slow. It is frustrating. It produces specifications that are thousands of pages long and occasionally contradict themselves. It is vulnerable to capture by well-funded corporations who can send the most engineers to the most meetings. But it has also produced something extraordinary. A global communications infrastructure that works across borders, languages, platforms, and political systems. The most complex machine humanity has ever built was not designed by a single genius or a single company. It was negotiated, standard by standard, character by character, packet by packet, by committees.
The next time you pair your earbuds and a little runic logo flashes on your screen, remember the two engineers in the Toronto bar, talking about Vikings. The next time you type an emoji, think about the three people at Xerox and Apple who decided to encode every alphabet on Earth. The next time a webpage loads, think about the physicist who gave the web away for free and then spent thirty years trying to keep it open. And the next time the internet just works, think about the man in sandals who kept the whole thing running on scraps of paper, and who died before the organization that replaced him was even a month old.
These committees are not glamorous. They do not make headlines. But they are the reason any of this works at all.