The never-ending variety we observe in life is the result of natural selection, a process that favors the survival and reproductive success of the individuals or groups best adjusted to their environment and perpetuates the genetic qualities best suited to it. This process resembles how programmers often attack hard problems: try many different solutions, keep the best ones, and repeat as many times as necessary.

  Evolution is an algorithm. Over billions of years, natural selection has, in a sense, "learned" us. If that is possible, then in theory an algorithm that replicates natural selection, given a large enough body of data and a powerful enough computer, could learn everything there is to learn. All we have to do is simulate evolution with computer programs, and we might very well have a path to the master algorithm. After all, evolution has been running on Earth for billions of years and has fared remarkably well, which makes it a likely candidate for the ultimate learner.


  In order to replicate nature's evolutionary process we need a fitness function. In nature, every combination of genes gives a creature a score on a fitness scale; depending on where it falls on that scale, the organism either survives and thrives or does not survive at all. A life form that isn't truly fit for survival will have its genes weeded out of the gene pool. We can apply the same process to computer programs: create an artificial fitness function that scores algorithms and eliminates the ones that don't make the cut, recreating natural selection.
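As a minimal sketch of the idea, an artificial fitness function might score each candidate program by how often it produces the desired output, and the "survivors" are the candidates that make the cut. The function names and the 0.5 cutoff here are illustrative assumptions, not a standard API:

```python
def fitness(candidate, test_cases):
    """Score a candidate program by the fraction of test cases
    (pairs of input and expected output) it gets right."""
    correct = sum(1 for inputs, expected in test_cases
                  if candidate(inputs) == expected)
    return correct / len(test_cases)

def select_survivors(population, test_cases, cutoff=0.5):
    """Weed out candidates whose fitness falls below the cutoff,
    mimicking natural selection on a population of programs."""
    return [c for c in population if fitness(c, test_cases) >= cutoff]
```

A candidate here is just any callable; in a real system it could be a parameterized model, and the test cases would be training data.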


  Every potential candidate for the master algorithm simply needs an assigned purpose it is supposed to fulfill; the fitness function then scores it on how well it meets that criterion, and selection proceeds from there. That sounds good on paper, but we still don't know whether this is truly an accurate way to interpret natural selection, and we haven't pinned down the true purpose of evolution either. Nevertheless, what we do know is that machine learning still needs some kind of fitness function to accurately determine whether a given program actually does what it was intended to do.

  If we had a self-driving car that drove and made accurate decisions 99 percent of the time, it would be much better than one that was only 98 percent accurate. (In practice a self-driving car needs to drive safely essentially 100 percent of the time, but that's beside the point.) With a scale to measure usefulness, the first self-driver would be the better candidate to improve upon in order to fulfill the intended purpose. The same could be said for a future AI that has to diagnose patients from a database of medical records: a program that is accurate 70 percent of the time would earn a higher fitness score than one that diagnoses accurately only 60 percent of the time. That, in essence, is how the fitness function weeds out unfit algorithms.


   All the animals and plants we see today are the result of mating generation after generation; the organisms that best served their purpose lived on to reproduce. A genetic algorithm recreates this process with computer programs. Just as DNA encodes a life form in sequences of base pairs, we can encode programs as strings of bits. Flipping a random bit to change an algorithm's "genetics" works much the same way a mutation in DNA does. We can even mimic sexual reproduction: take two programs with high fitness scores and combine the bits they were encoded with to produce a more capable program as offspring.
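A minimal sketch of those two genetic operators on bit strings might look like the following. The per-bit mutation rate and the single-point crossover scheme are illustrative choices; real genetic algorithms use many variants:

```python
import random

def mutate(genome, rate=0.01):
    """Flip each bit with a small probability, like a point mutation in DNA."""
    return [bit ^ 1 if random.random() < rate else bit for bit in genome]

def crossover(parent_a, parent_b):
    """Single-point crossover: the offspring takes a prefix of one parent's
    bits and the suffix of the other's, mimicking sexual reproduction."""
    point = random.randrange(1, len(parent_a))
    return parent_a[:point] + parent_b[point:]
```

Genomes here are plain lists of 0s and 1s; how those bits decode into an actual program is the hard part the sketch leaves out.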

  This process would repeat over and over, each new program being let loose and assigned its own fitness score based on its performance. It would end when the desired fitness is reached and the ideal program has been created.
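Put together, the generational loop described above might be sketched like this. The helper names (`fitness`, `mutate`, `crossover`) are assumed to be supplied by the caller, and letting the fittest half breed is just one of many possible selection schemes:

```python
import random

def evolve(population, fitness, mutate, crossover, target, max_gens=1000):
    """Repeat score -> select -> reproduce until the desired fitness
    is reached (or give up after max_gens generations)."""
    best = max(population, key=fitness)
    for _ in range(max_gens):
        if fitness(best) >= target:
            return best                                  # ideal program found
        parents = sorted(population, key=fitness, reverse=True)
        parents = parents[:max(2, len(parents) // 2)]    # fittest half breed
        population = [mutate(crossover(random.choice(parents),
                                       random.choice(parents)))
                      for _ in range(len(population))]
        best = max(population + [best], key=fitness)     # keep the best seen
    return best                                          # best effort
```

On a toy problem such as maximizing the number of 1s in a bit string, a loop like this converges quickly; whether it scales to learning anything interesting is exactly the open question the text raises.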

  Evolution tends to search for structures that work, and machine learning fills in the gaps. Over time those steps take us closer to the master algorithm. In the nature-versus-nurture debate, "nature" would be the program and "nurture" would be the data, which makes it clear that neither one is necessarily more important. To move forward with the theory that replicating evolution is a path to a universal learner, we have to make sure both of those factors are sound, alongside the fitness function.


  However, given all the problems with this approach, it is unlikely that we will discover the master algorithm from this theory alone, and even combined with the study of the human brain discussed in the previous post, we don't really have enough to go on at this point. The real goal is to find the best possible learning algorithm by any means necessary and let it loose so it can change the world in the ways already discussed.

  Looking at how our brains learn and observing how nature learns may not be sufficient; we will have to dive deeper if we want to solve this problem. Nevertheless, the apparent existence of learning algorithms in the way our brains are wired and in the way evolution takes its course is more than sufficient evidence that a master algorithm does exist in some form or another. We just have to figure out how to convert it into a form we can use.

