I will explore the origins and development of open source software and its profound influence on today's internet and technology industry. From the crude early methods of computer programming to the widespread use of open source software in servers, operating systems, and applications today, the evolution of open source software shows a remarkable interweaving of technological progress and the ideals of collaboration and sharing.

In the early days, because computer hardware was extremely expensive and programs were relatively small, software was usually shared freely among researchers and developers. This culture of sharing was especially prevalent in academia and research institutions. User groups such as SHARE gave software sharing and collaboration a formal structure. This early collaborative atmosphere planted the seeds for the later flourishing of open source software.

The arrival of Unix and its open licensing terms marked a key turning point in the rise of open source software. It was licensed to universities, which were allowed to see the source code. This let academic institutions, particularly the University of California, Berkeley, study and modify the code in depth. The Berkeley Software Distribution (BSD) pushed software openness and sharing further, raising a generation of programmers who expected to be able to view and modify source code. That culture deeply shaped later models of software development.

However, in the late 1970s and early 1980s, software became increasingly commercialized and the culture of sharing began to fade. In response, Richard Stallman launched the GNU project and founded the Free Software Foundation. His goal was to advance the free software movement and, through the GNU General Public License (GPL), to ensure that software remained free for all users. This marked the formal beginning of the free software movement.

The Linux kernel filled in exactly the critical component the GNU operating system was missing. Combined with the GNU project, it formed the complete GNU/Linux operating system. The birth of Linux was a milestone in the history of open source software.

The rise of the internet greatly accelerated code sharing and the demand for free software. Many important pieces of internet software, such as the Apache web server, the PHP scripting language, and the MySQL database, flourished during this period.

To promote the free software movement more effectively, especially in the commercial world, the term "open source" was coined, giving rise to the Open Source Initiative. Although "free software" and "open source software" differ in subtle ways, both emphasize openness and sharing. The existence of many different open source licenses means the software can be used and modified under different terms, and open source projects can also fork, with development heading in different directions.

Today, open source software is enormously popular in almost every field, especially in servers and mobile operating systems. Many popular web browsers, servers, and websites depend on open source software. Open source software has become a cornerstone of how the internet operates, and its impact on today's world is incalculable. From code sharing among early programmers to the vast ecosystem that now underpins the global internet, the success of open source software is a testament to the power of collaboration and sharing.
Computer software seems to be everywhere. No matter what kind of computer you use or where you use it, all computers use software. That's the entire point of a computer. However, not all software is the same. There are actually enormous differences between software applications. Not just what they do, but how they were written, the business models that run them, the legal licenses that cover them, and the philosophy behind them.
Learn more about free and open source software, what it is and how it works on this episode of Everything Everywhere Daily.
This episode is sponsored by Quince. Vacation season is nearly upon us, and you've heard me talk before about my favorite blanket and towels that I got from Quince, but did you know that they also have a collection of great travel products? Like lightweight shirts and shorts from just $30, pants for any occasion, and comfortable lounge sets. They also have premium luggage options and durable duffel bags to carry everything in.
The best part? All Quince items are priced 50 to 80% less than similar brands. By partnering directly with top factories, Quince cuts out the cost of the middleman and passes the savings on to us. And Quince only works with factories that use safe, ethical, and responsible manufacturing practices and premium fabrics and finishes.
For your next trip, treat yourself to the luxury upgrades you deserve from Quince. Go to quince.com slash daily for 365-day returns, plus free shipping on your order. That's q-u-i-n-c-e dot com slash daily to get free shipping and 365-day returns. quince.com slash daily
This episode is sponsored by Mint Mobile. Do you say data or data? Well, I say data, and for the longest time I thought paying a fortune on my monthly data plan was just normal. That wasn't until I found out about Mint Mobile and their premium wireless plans that start at just $15 a month. With Mint Mobile, I use the exact same network on the exact same cell towers I used before with the exact same phone and exact same phone number. The only thing that isn't the same are the monthly fees.
All plans come with high-speed data or high-speed data, your choice, as well as unlimited talk and text delivered on the nation's largest 5G network. No matter how you say it, don't overpay for it. Shop data plans at mintmobile.com slash EED. That's mintmobile.com slash EED.
Upfront payment of $45 for a three-month, five-gigabyte plan required, equivalent to $15 a month. New customer offer for the first three months only, then full-price plan options available. Taxes and fees extra. See Mint Mobile for details.

Software is ubiquitous in the modern world. It isn't just in our smartphones and our desktop computers. It's in our televisions, refrigerators, and washing machines.
Some people have become billionaires from the creation of software. Around the world, there are probably hundreds of thousands, if not millions, of people who make their living from writing computer software. In fact, I am confident that some of you listening to me right now are involved in the development of computer software. As important as software is today, it wasn't always considered so important. The first programmable computer is considered to be ENIAC, the Electronic Numerical Integrator and Computer, which was built in 1945.

ENIAC initially did not have stored programs. You didn't load code into memory like modern computers. Programming was done by physically rewiring cables, setting switches, and configuring plug boards. A single program could take days or weeks to physically set up. If you wanted the computer to do something else, then you had to do it all over again. So software with respect to ENIAC was just a set of instructions for which cables to set up and which switches to flip.

At the time, nobody was even considering that this was something that could be copyrighted or owned. It was the equivalent of a cooking recipe more than anything else. Soon after ENIAC, computers gained the ability to store instructions in memory.
This episode isn't about the history of programming languages, so suffice it to say that compiled programming languages were developed in the 1950s. A compiled programming language is one where the source code is translated into machine code by a compiler before execution, allowing the program to run directly on the hardware. And machine code is the lowest level programming language, and it consists of the ones and zeros that a computer can execute directly.
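To give a concrete, purely illustrative sense of what that means, here is a minimal C program; running it through a compiler with a command like cc hello.c -o hello translates the human-readable source code below into machine code the processor can execute directly.

```c
/* hello.c - a minimal illustration of source code for a compiled language.
 * A compiler (for example, "cc hello.c -o hello") translates this text
 * into machine code: the binary instructions the CPU executes directly. */
#include <stdio.h>

int main(void)
{
    printf("Hello from compiled code\n");
    return 0;   /* exit status handed back to the operating system */
}
```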
These early computers had two relevant attributes for the purposes of this episode. They were extremely large and expensive, and they weren't very powerful, at least compared to the computers that would come later. That means from a business standpoint, what was being sold and what everyone cared about was the hardware. The size of the programs written for these early computers was relatively small.
For example, the first business computer sold by IBM in 1953 was the IBM 650, and it had programs which were about 100 to 1,000 instructions. One instruction was 10 decimal digits, which was about 40 bits. So in terms of a size that you could compare with modern computers, a 100 instruction program would be about 4,000 bits or 500 bytes.
A 1,000 instruction program would be 5,000 bytes or 5 kilobytes. So, these programs weren't very big. In the early days of computing, software was generally shared freely amongst researchers and developers. It was a very small community. As computer hardware was the primary commercial product, software was often distributed with source code as a practical matter to anyone who purchased the computer.
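For anyone who wants to double-check that back-of-the-envelope math, here is a tiny C sketch of the same arithmetic, assuming roughly 4 bits per decimal digit, so one 10-digit instruction comes out to about 40 bits.

```c
/* ibm650_sizes.c - rough conversion of IBM 650 program sizes to modern units.
 * Assumes ~4 bits per decimal digit, so one 10-digit instruction ~ 40 bits. */
#include <stdio.h>

int main(void)
{
    const int bits_per_instruction = 40;        /* 10 decimal digits * ~4 bits each */
    const int program_sizes[] = { 100, 1000 };  /* typical program lengths */

    for (int i = 0; i < 2; i++) {
        int bits  = program_sizes[i] * bits_per_instruction;
        int bytes = bits / 8;
        printf("%4d instructions = %6d bits = %5d bytes\n",
               program_sizes[i], bits, bytes);
    }
    return 0;   /* prints 4000 bits / 500 bytes and 40000 bits / 5000 bytes */
}
```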
The concept of freely shared software began in academic and research institutions where collaboration was the norm. At places like MIT, Berkeley, and Bell Labs, programmers routinely shared code to solve problems and built upon each other's work. Given that computer software was totally useless to anyone who didn't own a very expensive computer, which at the time was limited to large institutions, nobody was concerned about things like ownership or rights.
The SHARE user group was formed in 1955 as one of the first computer user groups in history. It was established by a collection of IBM mainframe customers who were using IBM's 704 scientific computing system. SHARE's name wasn't an acronym, but rather it reflected a core purpose – to share information, software, and resources amongst its members.
At a time when computers were enormously expensive and software was not viewed as a separate commercial product, SHARE provided a formal structure for collaboration. And just as a side note, SHARE still exists as a user group today. This ethos of sharing software continued into the 1960s and 1970s. An important development occurred in 1973 with the release of Unix.
Unix is a multi-user, multi-tasking operating system which began development in 1969 at Bell Labs, which, if you remember from my previous episode, invented everything. Unix was created as a simpler, more flexible alternative to the complex, resource-heavy systems of the time. Designed to be portable, efficient, and modular, it introduced key concepts like the hierarchical file system, pipes, and a shell-based command line interface.
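To make the pipe idea concrete, here is a small, hypothetical C sketch of the classic Unix idiom of connecting one program's output to another's input, using the pipe(), fork(), and exec() calls that Unix popularized. It is the programmatic equivalent of typing ls | wc -l at the shell.

```c
/* pipe_demo.c - a minimal sketch of the Unix pipe idea: run the
 * equivalent of "ls | wc -l" by wiring the output of one process
 * to the input of another. */
#include <stdio.h>
#include <unistd.h>
#include <sys/wait.h>

int main(void)
{
    int fd[2];
    if (pipe(fd) == -1) { perror("pipe"); return 1; }

    if (fork() == 0) {                   /* first child: "ls" writes into the pipe */
        dup2(fd[1], STDOUT_FILENO);
        close(fd[0]); close(fd[1]);
        execlp("ls", "ls", (char *)NULL);
        perror("execlp ls"); return 1;
    }

    if (fork() == 0) {                   /* second child: "wc -l" reads from the pipe */
        dup2(fd[0], STDIN_FILENO);
        close(fd[0]); close(fd[1]);
        execlp("wc", "wc", "-l", (char *)NULL);
        perror("execlp wc"); return 1;
    }

    close(fd[0]); close(fd[1]);          /* parent: close both ends and wait */
    wait(NULL); wait(NULL);
    return 0;
}
```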
However, AT&T was prohibited from entering the computer business by a 1956 consent decree by the United States government because of the whole telephone monopoly. This led them to license Unix to universities for minimal fees, including letting them see the source code. Academic institutions, particularly the University of California, Berkeley, received, studied, and modified the code. Computer science students learned programming by reading actual production code.
This created a generation of programmers who expected to be able to see and modify source code, establishing a culture that valued openness and knowledge sharing. BSD, or Berkeley Software Distribution, originated in the late 1970s at the University of California, Berkeley, as a series of enhancements to AT&T's original Unix. Led by Bill Joy and others, the project began by adding useful tools and features and eventually evolved into a full-fledged operating system.
AT&T began complaining about BSD infringing on its rights, which eventually led to a lawsuit in 1992 that Berkeley won with some minor concessions. The dispute with AT&T was just one of many changes that were happening to the world of software in the late 70s and early 80s. As computers became more ubiquitous and software was finding its way into more devices, more companies began to make their software proprietary, and the culture of the free sharing of software began to wane.
In the wake of these changes to the culture of software, the GNU project was launched in September 1983 by Richard Stallman, then a programmer at MIT's Artificial Intelligence Laboratory. The project name is a recursive acronym for GNU's Not Unix, a humorous acknowledgement that while GNU was designed to be Unix-compatible, it would contain no Unix code.
In 1985, Stallman founded the Free Software Foundation to support and promote the development of free software, software that respected users' freedom to use, study, modify, and share. As proprietary software became more common in the 1980s, the Free Software Foundation provided legal, philosophical, and organizational support for the free software movement, including the creation of the GNU General Public License, or GPL, a license that ensured software would remain free for all users.
More on free software in a bit. The GNU operating system was lacking one major component, however: a kernel. A kernel acts as a bridge between applications and the physical machine, ensuring that programs run efficiently and safely on the computer's hardware. It handles essential tasks like memory management, process scheduling, device control, and system calls. The kernel issue was addressed in 1991 by a Finnish computer scientist by the name of Linus Torvalds.
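As a small, purely illustrative aside, here is what a system call looks like from a program's point of view in C: the write() call asks the kernel to put bytes on the terminal on the program's behalf, since only the kernel is allowed to touch the hardware directly.

```c
/* syscall_demo.c - a minimal illustration of a system call.
 * write() traps into the kernel, which owns the hardware; the kernel
 * performs the I/O and reports back how many bytes it wrote. */
#include <unistd.h>

int main(void)
{
    const char msg[] = "The kernel wrote this line to the terminal\n";
    ssize_t n = write(1, msg, sizeof msg - 1);   /* descriptor 1 is standard output */
    return (n < 0) ? 1 : 0;                      /* non-zero exit if the kernel reported an error */
}
```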
Torvalds released Linux under the GPL, allowing anyone to use, modify, and distribute it freely. Because the GNU project had already developed many essential system utilities but lacked a working kernel, Linux quickly became the missing piece to form a fully functional, free, Unix-like operating system called GNU/Linux, although many people today just shorten it to Linux.
The 90s saw the rise of the internet, which dramatically improved the ability for people to share code and also created more demand for free software. Many of the software components that make up the backbone of the internet were developed during this time. Apache, the world's most popular web server, PHP, a very popular web scripting language, and MySQL, a popular free database, were all developed in the 1990s.
You may have noticed that this far into the episode, I have yet to mention the phrase that is in the title of this episode: open source. In the late 1990s, the term open source was developed as a way to rebrand and reframe the free software movement in more pragmatic, business-friendly terms.

While the Free Software Foundation emphasized software freedom as an ethical and political issue, some developers and advocates believed that this messaging limited the broader adoption of free software, especially within the commercial world.
In 1998, after Netscape released the source code for its browser, which became Mozilla, a group including Eric S. Raymond, Bruce Perens, and Christine Peterson coined the term open source to highlight the practical benefits of collaborative, transparent development, such as higher quality, faster innovation, and lower costs, without the ideological framing.
This led to the creation of the Open Source Initiative to define and promote open source software through a more inclusive and commercially palatable lens. The movement rapidly gained momentum, drawing in major companies and reshaping the entire software industry. And here I should explain the differences between free and open source, because it can be confusing, because we use the term free for two different things.
If software is free, as in you don't have to pay for it, it doesn't mean that it's open source. Someone could create a program and allow people to download it without payment, but still retain full rights to the code. The word free with respect to the Free Software Foundation refers to freedom as in liberty. However, free software under this meaning is, in practice, almost always also free as in you don't have to pay for it.
Free software, as advocated by the Free Software Foundation, is also all open source software. However, not all open source software is free, as in freedom. Depending on the license, there might be restrictions placed on the code.
There are multiple licenses available under which open-source software can be published. Some of the most popular licenses include the previously mentioned GPL, the MIT license, the Apache License 2.0, the BSD license, the Mozilla Public License, and the Eclipse Public License. Each license is a little bit different and provides different rights to the software's users. What they all pretty much have in common is that they allow users to use the software freely and to view and edit the source code.

However, some licenses, most notably the GPL, also require that any changes to the software be released under the same license, meaning that you can't take that open source code and then sell it as proprietary software. Other licenses, like the MIT and BSD licenses, are more permissive about how the code can be reused. Sometimes there might be differences in the direction of an open source software project, and a group might take the code and create what's known as a fork, which is just a way of saying that they're going to take the project in a different direction. So in a world with multi-billion dollar software companies, how popular is open source software?
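Before getting to that, one small practical note. Many open source projects mark every source file with a short notice saying which license applies, and the SPDX identifier convention is a common way to do this. Here is a small, hypothetical C file showing what that looks like; the GPL tag is just an example, and each of the licenses above has its own identifier.

```c
/* SPDX-License-Identifier: GPL-3.0-or-later
 *
 * example.c - a hypothetical file from an open source project.
 * The one-line SPDX tag above declares which license covers this file,
 * so both human readers and automated tools can check its terms. */
#include <stdio.h>

int main(void)
{
    puts("This file's license is declared in its SPDX header.");
    return 0;
}
```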
And the answer is, extremely popular. And you probably use it every day without even knowing it. Let's start with the GNU/Linux operating system. Linux has never really caught on as a desktop operating system. Today it has only about a 4% share of the desktop market. However, Linux is the number one operating system for web servers. So if you visit a website, there's a very good chance that it is running Linux.
Of the top 500 supercomputers in the world, 100% of them run Linux. The Linux kernel is also the core of the Android operating system for smartphones, which has a roughly 72% share of the global market. Your web browser almost certainly contains open-source software. The Google Chrome browser is based on the open-source Chromium project. In addition to Chrome, the Microsoft Edge browser, Opera, and Brave all use Chromium.
Apple's Safari browser uses the WebKit engine, which is open source, and the entire Firefox browser is open source. The Apache web server is open source, and it is one of the most popular web server applications on the Internet. Roughly 40% of all websites on the Internet run on WordPress, which is open source. And one of the most popular websites in the world is Wikipedia, which runs entirely on open source software.
There are open source alternatives that exist for almost every type of proprietary program you can think of, including word processors, photo editing, and media players. So whether or not you realize it, open source software is absolutely pivotal to the working of the internet. Take it away and everything would cease to function. And this pillar of our online world stems directly from the early culture of programmers sharing their work with each other.
The executive producer of Everything Everywhere Daily is Charles Daniel. The associate producers are Austin Okun and Cameron Kiefer. Today's review comes from listener Skunk1010 over on Apple Podcasts in the United States. They write, Great variety. This show does such a great job of bringing the listener into a wide range of topics from science and math to history and sports.
Gary does a good job of making the subject matter interesting and digestible. His travels allow him to speak on topics far and wide, including my own backyard, the Cardiff Giant. Keep up the good work. Well, thanks, Skunk. I'm glad that you enjoy listening to them as much as I enjoy making them. Remember, if you leave a review or send me a boostagram, you too can have it read on the show.