Does software evolve too fast? A biologist’s point of view
Numerical Ecology of aquatic systems laboratory
COMPLEXYS Research Institute
University of Mons, Belgium
Does software evolve too fast? As a specialist in this field, you probably consider that the answer is definitely NO: software must adapt to its rapidly changing environment… or it dies. Right, but let us consider the question differently. Perhaps the most pertinent question is HOW software should evolve. And what would a biologist who looks at software evolution think about it?
Well, first, I am a software user. As such, I HATE the “revolutions”: a brand new OS version, a totally different user interface, a completely different standard for data formats, … They tell us the new version is much better than the previous one, but I hate it because I must change all my habits, and I fear the procession of bugs that inevitably comes with such major new versions.
More than the software itself, it is the *standards* that cause trouble when they change. Who says changes in standards are inevitable to adapt to new technologies or new needs? Really? What happens when you look at another complex system: life? Well, several “standards” emerged quite early in the history of life on Earth. Take, for instance, the genetic code. It has been the same for all living beings for a very long time: a kind of de facto standard for representing proteins. It was “designed” very early in the history of life and it has never changed. Yet, incredibly complex organisms exist today, in comparison with those that set up the genetic code at the dawn of our world.
Is computer software so different from life on Earth, in the way it should or could evolve, that it must change its standards and foundations radically and so often? I am not sure. Maybe the rapid change of standards is a side effect of causes other than the intrinsic need to change: (commercial) strategies, the egos of people who like to design their own brand new standards, etc.
In science, everyone claims we need more reproducible, if not replicable, research. In biology, it is just a dream today. Computer science is probably one of the scientific disciplines closest to that goal. Yet, reproducible research requires strong foundations: standards that do not change every year, so that one can reproduce research done, say, ten years ago without booting a very old computer.
In this talk, I discuss and compare different “standards” in life and in computer science to point out how different they are. Given the need for more stability expressed by many end users, and by reproducible research, I ask the question: could software evolve more gently? Could we avoid all those revolutions? Would it be possible to design, today, standards that are meant to last much longer?
Philippe Grosjean is a professor in biostatistics and aquatic ecology at the University of Mons (Belgium). He has been the head of the laboratory of Numerical Ecology of Aquatic Systems (http://econum.umons.ac.be) since 2004, and is a member of the interdisciplinary COMPLEXYS Research Institute at UMONS. His research focuses on the impact of global climate change on marine life. He studies how tropical corals adapt to those changes in artificial reef mesocosms run at UMONS. He also develops and maintains software, mainly written in R, such as Zoo/PhytoImage: a system used by planktonologists around the world to automate the analysis of phytoplankton and zooplankton samples (those little organisms that drift in the open ocean). He is also a co-author of two patents. Since 2012, he has been involved in an interdisciplinary research project on ecological studies of open source software ecosystems (ECOS), together with Tom Mens, also from UMONS.