Artificial intelligence and the genetics of living organisms learn in remarkably similar ways

The networks of genes in animals are somewhat similar to the networks of neurons in our brains: they too can "learn" as they go. In 1996, a young graduate student named Richard Watson read an article about evolution. It was provocative and touched on an old problem in evolutionary biology: we have little understanding of how organisms come to be so well adapted to their environment.

Over their lifetimes, living things are subject to changes, or mutations, in their genes, yet these changes do not seem to be entirely random. Instead, organisms appear to "improve" their ability to adapt. This ability seems to stem from more than just natural selection, the process by which the best features of the most successful organisms are passed on.

So the article's authors, Günter Wagner of Yale University and Lee Altenberg of the Hawaii Institute of Geophysics and Planetology in Honolulu, decided to look for answers in an unexpected place: computer science.

Watson, a computer scientist, was captivated. In the 20 years since he read the article, he has developed a theory based on the ideas it expressed. It could help explain why animals evolve so well, a capacity known as "evolvability". Moreover, it could help answer some old and intriguing questions in evolutionary biology.

Many people are familiar with the idea that genes are passed from parent to child, and that genes which help their owners survive and reproduce are more likely to be passed on. That is the essence of evolution by natural selection.

But that is not the whole story, because genes often work together. They form "gene networks", and a gene network can also sometimes be passed on intact across generations.

"The fact that organisms have a gene networks and they are inherited from one generation to another, this information is not new," says Watson, currently working at the University of Southampton in the UK. His contribution is mainly related to how natural selection acts on these networks.

He believes that selection does not act merely as a partial barrier, letting some adaptations through and blocking others. Instead, this filtering lets the gene networks in animals actually "learn" over time what works and what does not. In this way they can improve their performance, in much the same way that the artificial neural networks used by computer scientists "learn" to solve problems.

"Gene Networks" evolve neural networks - learning, "he says. "That's what's really new."

At the heart of Watson's claim is the idea that the connections between genes can be strengthened or weakened as evolution and variation play out, and that it is the strength of these connections within gene networks that allows organisms to adapt.

This process resembles the way artificial neural networks work on computers.

Today these systems are used for a wide variety of tasks. For example, they can recognise people's faces in photos or videos, and even analyse footage of football matches to work out which team's tactics are working better, and why. How do computers manage to do this at all?

Artificial neural networks are built in the image of biological ones, for the most part those in the brain. Each network is a collection of simulated "neurons" linked together in a particular way, like stations and lines in a metro system.

Networks like these take input data - for example, the word "hello" written on a page - and match it to an output - say, the word "hello" stored in the computer's memory. It is roughly how children learn to read and write. Like a child, a neural network cannot make the connection straight away, and must be trained over time. The training is involved, but essentially it comes down to changing the strengths of the connections between the virtual neurons. Each adjustment improves the result a little, until the whole network can reliably produce the desired response: in our example, the squiggles on the page ("hello") are matched to the word "hello". Now the computer knows what you have written.
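
To make this concrete, here is a minimal Python sketch of that kind of training. The task is an invented one for illustration (a single simulated neuron learning a tiny input-output mapping by nudging its connection strengths), not an example from the article.

    # A minimal sketch: one simulated neuron learns a toy input-output mapping
    # by repeatedly adjusting its connection strengths until it answers reliably.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy task (an assumption for the example): map 2-bit inputs to logical OR.
    inputs = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    targets = np.array([0, 1, 1, 1], dtype=float)

    weights = rng.normal(scale=0.1, size=2)  # connection strengths, initially weak
    bias = 0.0
    learning_rate = 0.1

    for epoch in range(50):
        for x, t in zip(inputs, targets):
            y = 1.0 if weights @ x + bias > 0 else 0.0  # the neuron "fires" or not
            error = t - y
            weights += learning_rate * error * x        # strengthen or weaken links
            bias += learning_rate * error

    # After training, the network reliably produces the desired response.
    for x in inputs:
        print(x, int(weights @ x + bias > 0))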

Watson believes something similar happens in nature. An evolving species produces "outputs" tailored to its particular environment.

There are different ways of training neural networks. The one Watson focuses on as a good analogue of what happens in biological gene networks is "Hebbian learning".

In Hebbian learning, the connections between neighbouring neurons that produce similar outputs strengthen over time. In short, "neurons that fire together wire together". The network "learns" by building strong connections within itself.
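
As a rough illustration of Hebb's rule, the following Python snippet (a hypothetical toy, with activity patterns invented for the example) strengthens the connection between any pair of units that happen to be active at the same time.

    # A toy Hebbian update: co-active units end up strongly connected.
    import numpy as np

    n_units = 4
    weights = np.zeros((n_units, n_units))  # connection strengths, initially zero
    learning_rate = 0.1

    # Invented activity patterns: units 0 and 1 fire together, as do units 2 and 3.
    patterns = np.array([
        [1, 1, 0, 0],
        [1, 1, 0, 0],
        [0, 0, 1, 1],
        [0, 0, 1, 1],
    ], dtype=float)

    for activity in patterns:
        # Hebb's rule: increase w[i, j] whenever unit i and unit j are co-active.
        weights += learning_rate * np.outer(activity, activity)

    np.fill_diagonal(weights, 0.0)  # ignore self-connections
    print(weights)                  # strong links between 0-1 and 2-3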

If an organism has certain genes that "fire together" in this way, and the organism is successful enough to breed, its offspring do not just inherit the useful genes themselves, says Watson. They also inherit the connections between those genes.

A particular advantage of Hebbian learning is that these networks can develop "modular" functions. For example, one group of genes might determine whether an animal grows hind legs, or eyes, or fingers. Similarly, a handful of related adaptations, such as a fish's ability to cope with high water temperature and high salinity, may become linked within one gene network and be inherited together.

"If there is a separate entity, which has a slightly stronger regulatory relationship between these genes than any other, it would be preferable," Watson says. "They will choose natural selection. And then, after the lapse of evolutionary time, the strength of the links between these genes will be increased. " For Watson, it helps to get around the problem of stickiness to the theory of evolution.

Imagine for a moment that an organism's genome is computer code. A novice programmer might update the code bit by bit, trying to make improvements and checking each time whether a different sequence of commands makes it work a little better.

At first, this process of trial and error can work quite well. But over time, updating code in this way makes it unwieldy. The code starts to look messy, and it becomes hard to tell what consequences any given change will have. The same thing happens in real programming; the result is known as "spaghetti code".

If organisms really evolved this way, says Watson, their "evolvability - the ability to adapt to new environments or stresses - would not be very good". In fact, though, "the ability of organisms to adapt to natural environments or selective challenges is simply amazing".

Watson also suggests that gene networks can carry a "memory" of past adaptations, shaped by the demands of earlier environments.

For example, some groups of organisms may quickly become able to consume food that is harmful to other members of the same species, because their ancestors endured such a diet. In the past, the structure of gene regulation could have shifted, changing the triggers that make certain genes easier to express. That "bias" would eventually help their descendants digest the difficult food.

One of Watson's real-world examples is the stickleback. These fish have evolved a tolerance for fresh water, then for salt water, and then switched back again, depending on what their current environment demanded.

Watson's idea implies that organisms must be packed with a rich repertoire of potential adaptations.

It also means that gene networks, in all animals, have evolved to adapt to the natural world of the Earth. That is why organisms respond so well to their environment: the stresses and strains of Earth's environments have been imprinted on the regulatory connections between genes over millions of years.

"I think it has always been a deep capacity to explore the parallels between computer training and evolution, but no one is doing it with the same rigor as Richard Watson," St. Andrews says Kevin Lalande from the University in the UK, took part in a large-scale project with Watson .

The biggest problem with Watson's hypothesis, however, is whether any empirical evidence for it can be found in nature.

So far, all of Watson's ideas rest on computational experiments in the laboratory. These experiments appear to produce results resembling those seen in real organisms, but the specific processes have not yet been observed directly.

"This is a question on the $ 64 million," admits Watson.

But Watson and Laland believe there are other ways to test this theory of evolvability. Watson proposes analysing how gene networks change in microbes evolving in the laboratory. Because microbes such as bacteria reproduce quickly, many generations of adaptation can be observed within a few days.

"If you want to spend a tough test of the theory, you have to wonder if you can make new predictions, is not yet reflected in literature?" Says Lalande.

For example, one could build a computer system based on Watson's ideas that predicts how organisms will develop in the wild under certain known conditions. If such a system proved accurate, it would certainly help to strengthen the theory. Gene networks already show several features that Watson's approach helps to explain. Mini gene networks defining specific adaptations, like the modules mentioned above, can sometimes be switched on or off by just one other activator gene.

Examples of this can be found in nature, says Watson. Among them are evolutionary reversions: organisms showing adaptations that were thought to have disappeared in their ancestors. These are known as "throwbacks" or atavisms.

A well-known example is teeth in chickens. Chickens are genetically capable of growing teeth, but normally do not, either in the wild or in captivity. Tooth growth can, however, be switched on in the laboratory using molecular biology.

Sometimes atavistic traits appear in natural populations. One recent possible case is a whale found on a beach in Australia in February 2016. It had fang-like teeth, which are not usually seen in whales of its kind. Perhaps they are a remnant of its ancestors, which had fang-like teeth millions of years ago.

Another topical phenomenon is the "convergent evolution" when unrelated species living in very different habitats, in some way come to the same adaptation. Examples include - certain patterns on the wings of butterflies and is very similar fish that live in some lakes in Africa, says Lalande.

"The same form, the same patterns appear again and again," he says. "It may be easier to create a certain kind of fish than others. Some forms may appear more often in the course of generations. "

Evolvability of the kind Watson describes may explain this. Gene networks, he says, gradually learn to respond in similar ways to similar situations. Certain modular functions, such as a butterfly's wing pattern, may simply be more likely solutions for the system to produce than others. In other words, given the right set of conditions, evolution will perform the same tricks over and over again.

This raises some rather philosophical questions. Does evolution, on a grand scale, work like a natural computer? And does "evolvability" imply that life is in some sense programmed to improve, at least at the genetic level? Some biologists are horrified by the idea, but if the ability of organisms to adapt really does improve over time, could evolution itself be learning?

Watson thinks so.

"Only if you present system with corresponding variability, selection and inheritance, you force the evolution of the work. And without evolyutsioniruemosti it is impossible to imagine. "