
History of open models

The history of open models is closely tied to the evolution of digital technology. They emerged gradually, as our use of these tools intensified.

This movement began in the software world, where direct contact with computing allowed people to appreciate the value of sharing a resource, in this case software, and to realise that open collaboration can bring unique properties. This understanding developed in several stages over the last few decades.

Little by little, the software world began to exert a more or less direct influence on other fields of activity. These collaborative dynamics are now at work across the entire field of knowledge.

Open models in the software world

Open Source Timeline

In the early days of computing, open source software and open collaboration as we know them today were technically impossible.

In 1945, the first computer, built by academics and programmed with punched cards, weighed 30 tons. The first stored program ran on 21 June 1948. When the first companies producing computer hardware, and later software, appeared, software was in a sense “open source” even though the term did not yet exist: the source code was supplied with the hardware, as business models were based on the sale of physical components. Computers were reserved for small groups of specialists who were proficient in programming and needed to modify the software themselves.

From the 1970s onwards, a new stage in the democratization of computing took place with the arrival of the microcomputer, as computers became more personal. Hobbyists and hackers got to grips with the technology while new companies emerged, such as Microsoft in 1975 and Apple in 1977. As these companies wanted to retain control over their software, the code was gradually closed, and intellectual property began to apply to software with the Copyright Act of 1976. Source code became the exclusive property of companies, and software was distributed without it, in the form of executables.

In response, in 1983 Richard Stallman announced the GNU project, aiming to give everyone access to a free operating system providing the software suite needed to run a computer. He formulated four freedoms for software: the freedom to run it, to study it, to redistribute it, and to distribute modified versions.
The battle was as much philosophical, with the publication of the GNU Manifesto, as it was legal. In 1989, the first version of the GNU General Public License was published. The Free Software movement took shape under the aegis of the Free Software Foundation, formalizing code sharing.

By the end of the 80s, a complete GNU operating system was almost ready; only the kernel was missing. At the same time in Finland, Linus Torvalds, then a computer science student, had a personal project to build a kernel for his own computer. With no financial motivation, he chose in 1991 to share his code on the Internet, which had been maturing since 1969 with the ARPANET program and was accelerating with the arrival of the World Wide Web, invented by Tim Berners-Lee and made public in 1991. As a kernel is an indispensable part of a computer’s operation, Linux spread easily: people used it for their own needs while sharing improvements that Linus integrated. By the end of the 90s, Linux had become a robust piece of software, licensed under the GNU GPL, massively used and co-produced by an army of developers worldwide.

The way in which Linux was produced was atypical and pushed back the limits of certain recognized and taught development models, which intrigued Eric Raymond. Based on his observations, he tried to sketch out a set of rules to apply to the development of his own software, fetchmail. Seeing that these 19 rules worked, he shared them in his essay The Cathedral and the Bazaar, proposing a new development model. His book influenced a whole section of the software world, and the term open source was coined, partly for semantic reasons, to promote this new model, which would later be supported by the Open Source Initiative. This development method saw open collaboration as a more efficient way of producing code.

Twenty years later, Free and Open Source Software (FOSS) structures our digital ecosystem and has become the norm.

The emergence of open models outside software

This way of sharing and collaborating to produce software was built up over several decades. Computer scientists developed the technical infrastructures that made it possible, discovering the collaborative potential of digital technology gradually and somewhat by chance. The democratization of computers, the Internet and the Web across the rest of society went on to influence a multitude of other fields, which drew inspiration from the movements born in the software world, giving rise to a multitude of open models that are highly porous to one another.

The term open hardware was coined around 1997 by a co-founder of the Open Source Initiative; Creative Commons was created in 2001 for the publication of content; wiki technology for the collaborative editing of web pages emerged in 1995, with Wikipedia building on it in 2001; the concept of open innovation was formulated in 2003, and open data appeared around 1995… The case of open science is a little peculiar: the desire for shared science began very early on, since the people behind these tools were academics, but the concept of open science was formulated more explicitly in the same period, around 2000. Many new dynamics sprouted and have gradually consolidated.

For many of these phenomena, as in the software world, we should be able to observe a two-step movement: the sharing of intellectual resources comes first, followed by the structuring of open collaboration, facilitated by the evolution of technical infrastructure and a growing culture of contribution.

Knowledge becomes easily accessible, massively co-produced, continuously improvable and available worldwide, potentially leading to high-quality solutions thanks to these properties of collective intelligence.

This mechanism seems to be under way. These are profound changes unfolding over a long period of time, but they could potentially be sped up by a deeper understanding of the phenomenon.