The current usage of the term artificial life (ALife) began in the late 1980s, when the field was established by American computer scientist Chris Langton. Langton used the term in 1987 in the title of the “Interdisciplinary Workshop on the Synthesis and Simulation of Living Systems,” organized in Los Alamos, New Mexico. A biennial international conference series, Artificial Life (of which Artificial Life XI was the eleventh installment), alternates with the biennial European Conference on Artificial Life (ECAL). Langton was the first editor-in-chief of the journal Artificial Life, founded in 1993 and coordinated through the International Society for Artificial Life (ISAL), which was established in 2001.
ALife was originally defined by Langton as “life made by man rather than by nature,” meaning the study of man-made systems that exhibit behaviors characteristic of natural living systems. Langton revised the definition in 1998 to “the study of natural life, where nature is understood to include, rather than to exclude, human beings and their artifacts.” ALife is interdisciplinary: it encompasses the study of life and life-like processes and attempts to understand living systems by artificially synthesizing simple life-like systems. ALife provides insight into life-as-we-know-it as well as life-as-it-could-be.
There are three intertwining branches of artificial life, based respectively on computers, machines, and molecules. “Soft” artificial life works with simulations or other purely digital constructions that exhibit life-like behavior, such as Lenia. “Hard” artificial life builds hardware implementations of life-like systems, such as swarm robotics. “Wet” artificial life synthesizes living systems from biochemical substances.
The themes of ALife include origins of life, autonomy, self-organization, adaptation (evolution, development, and learning), ecology, artificial societies, behavior, computational biology, artificial chemistries, information, living technology, art, and philosophy.
John von Neumann is considered one of the founding fathers of artificial life due to his work on cellular automata, mathematical models used to simulate complex systems or processes. Von Neumann did pioneering work on sequential computing systems and parallel architectures. He proposed the concept of the self-reproducing machine, or automaton, in the 1940s, which borrowed the concept of genetic information from biology before the structure of DNA was discovered. A cellular automaton is a theoretical machine made of elements called cells, each of which has a value, or state, with cells connected to neighboring cells in a one- or multidimensional lattice. The states of cells change over time, and predefined rules are used to compute the new state of a cell from the previous states of the neighboring cells. Von Neumann demonstrated a similarity between the dynamics of a cellular automaton and the biological processes of self-reproduction and evolution. Other cellular automata were later developed.
Von Neumann’s book Theory of Self-Reproducing Automata, completed by Arthur Burks after von Neumann’s death and published in 1966, led to an increased interest in cellular automata. The Game of Life, developed by John Conway in 1970, is a two-dimensional lattice of infinite size in which cells have two possible states: dead or alive. The state of each cell is calculated from the number of neighboring cells that are alive. A player of the Game of Life chooses an initial configuration of live cells and then observes the patterns that form as cells are born and die over successive generations.
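As an illustration, here is a minimal Python sketch of Conway’s rules on a small wrapping grid (the formal model uses an infinite lattice); the glider pattern and grid size are arbitrary choices:

```python
import numpy as np

def step(grid):
    """Advance a Game of Life grid by one generation.

    Each cell counts its live neighbors; a live cell survives with
    2 or 3 live neighbors, and a dead cell becomes alive with exactly 3.
    """
    neighbors = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    return (neighbors == 3) | (grid & (neighbors == 2))

# A "glider" pattern that travels diagonally across the grid.
grid = np.zeros((10, 10), dtype=bool)
for y, x in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:
    grid[y, x] = True

for _ in range(4):
    grid = step(grid)
```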
In 1984, Christopher Langton proposed Langton’s loop, a self-replicating cellular automaton with eight possible states, built from a construction arm and a looped pathway. Langton’s loop has an outside layer of cells in a fixed state and an inside layer of cells in diverse states circulating around the loop, which serve as the building instructions for self-reproduction of the loop. A new Langton’s loop grows at the end of the construction arm and disconnects when completed.
Cellular automata proved able to exhibit complex behavior based on a few rules. For this reason, cellular automata are a fundamental concept in artificial life. Cellular automata were used in Hugo de Garis’s Embryo Project, which aimed to build an artificial embryo and establish artificial embryogenesis as a branch of artificial life research. Von Neumann never physically built his self-reproducing machine, but Langton suggested that showing that self-reproducing machines are possible opened up the possibility that life itself could be achieved by machines.
Life is considered to have functional integration of three characteristics: self-replication, metabolism, and compartmentalization. Self-replication is the ability of a system to produce copies of itself that are not necessarily exact. Compartmentalization is the ability to separate the living organism from its surroundings. Metabolism is the intake of material and energy into a system, where the material feeds chemical reactions that increase the system’s inner constituents and energy supply.
A mechanism for self-replication with small peptide-containing molecules was discovered by a University of Groningen research team led by Professor Sijbren Otto. The peptide rings form growing stacks in solution. When a stack breaks, both halves regrow. As the growth of the stacks depletes the rings in solution, new rings are stimulated to form from the building blocks. The system of self-replicating molecules was shown to evolve and develop catalytic capabilities that speed up chemical reactions, a basic form of metabolism. It is thought that a 3-D configuration arises that acts as a catalytic center.
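The mechanism can be caricatured in a toy stochastic simulation; all counts, probabilities, and thresholds below are invented for illustration, not measured values from the Otto group’s experiments:

```python
import random

# Toy caricature of the replicator mechanism: stacks of peptide rings
# grow by taking rings from solution, occasionally break into two halves
# that both keep growing, and depletion of the ring pool drives assembly
# of new rings from building blocks.

rng = random.Random(0)
rings_in_solution, building_blocks = 100, 1000
stacks = [4, 4]                               # stack lengths, in rings

for _ in range(1000):
    for i in range(len(stacks)):              # growth consumes dissolved rings
        if rings_in_solution > 0:
            stacks[i] += 1
            rings_in_solution -= 1
    if rng.random() < 0.1:                    # agitation occasionally splits a stack
        s = rng.randrange(len(stacks))
        if stacks[s] >= 4:
            half = stacks[s] // 2
            stacks[s] -= half
            stacks.append(half)
    while rings_in_solution < 50 and building_blocks > 0:
        rings_in_solution += 1                # a depleted pool shifts the equilibrium,
        building_blocks -= 1                  # so new rings form from building blocks

print(f"{len(stacks)} stacks after consuming {1000 - building_blocks} building blocks")
```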
In the theme of autonomy, the term “autopoiesis” was coined by Maturana and Varela (1980) to characterize a network of processes that self-maintains its organization. Autopoiesis is now understood to include both self-organizing and self-producing aspects. A computer model created by these biologists in 1974 may be considered one of the first examples of ALife. The idea of autopoiesis is sometimes formalized as operational closure, which can be considered a network of processes where each process enables and is enabled by at least one other process in the network. Using this concept, Varela has described other biological systems, such as the nervous system and the immune system as autonomous, even though they do not chemically self-reproduce.
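One way to make operational closure concrete is to treat the process network as a directed graph and test the closure condition directly. The following is a minimal sketch under that set-based formalization; it is one possible reading of the definition, not a standard ALife tool:

```python
def operationally_closed(enables):
    """Check operational closure of a process network.

    `enables` maps each process to the set of processes it enables.
    The network is operationally closed if every process enables at
    least one process and is enabled by at least one process.
    """
    processes = set(enables)
    enabled = {p for targets in enables.values() for p in targets}
    return all(enables[p] for p in processes) and processes <= enabled

# A three-process loop: each enables the next, so the network is closed.
loop = {"A": {"B"}, "B": {"C"}, "C": {"A"}}
print(operationally_closed(loop))    # True

# Remove one link and the closure breaks.
broken = {"A": {"B"}, "B": {"C"}, "C": set()}
print(operationally_closed(broken))  # False
```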
In robotics, autonomy does not include self-production but instead refers to the capacity of a system to move and interact without depending on remote control by an operator. Rather than micromanaging many aspects of a system’s behavior, researchers have developed behavior-based robots.
Artificial life uses a bottom-up approach to modeling natural phenomena, in which local interactions between a system’s constituent parts give rise to dynamical behavior. Two fundamental properties of behaviors that emerge from bottom-up computational synthesis are self-organization and emergence. Self-organization is a pattern or behavior that arises from interactions internal to the system, with no influences from outside. Self-organization exhibits an increase in order, autonomy, adaptability, and dynamics. Self-organized patterns and behaviors in natural phenomena include sandstorms, tornadoes, immune reactions, and nest construction by insects such as bees and wasps.
Emergence, or emergent behavior, is a new process or behavior that appears at the macro level of a system and cannot be reduced to the behaviors and properties of the system’s components; it is the collective behavior of the lower, micro levels that explains it. Self-organization and emergence are sometimes used interchangeably. A flock of birds, for example, results from interactions between individual birds. Birds in a flock balance two opposing behaviors: a desire to stay close to the flock and a desire to avoid colliding with other birds.
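These two balanced tendencies can be simulated directly. The sketch below implements only the cohesion and separation rules named above (Reynolds’ classic boids model adds a third rule, velocity alignment); the coefficients and flock size are arbitrary:

```python
import numpy as np

def flock_step(pos, vel, dt=0.1, cohesion=0.01, separation=0.05, radius=1.0):
    """One update of a simple flocking model.

    Each bird steers toward the flock's center of mass (cohesion) and
    away from neighbors closer than `radius` (separation).
    """
    center = pos.mean(axis=0)
    for i in range(len(pos)):
        steer = cohesion * (center - pos[i])                 # stay with the flock
        offsets = pos[i] - pos                               # vectors from others to me
        dists = np.linalg.norm(offsets, axis=1)
        near = (dists > 0) & (dists < radius)
        if near.any():
            steer += separation * offsets[near].sum(axis=0)  # avoid collisions
        vel[i] += steer
    pos += vel * dt
    return pos, vel

rng = np.random.default_rng(0)
pos = rng.uniform(-5, 5, size=(30, 2))
vel = rng.uniform(-1, 1, size=(30, 2))
for _ in range(100):
    pos, vel = flock_step(pos, vel)
```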
Models of collective behavior, such as flocks, schools, herds, crowds, and swarming fall under self-organization. Self-replication can be seen as a special case of self-organization. Homeostasis, related to self-maintenance, can be viewed as a special case of self-organization and has been studied in relation to artificial chemistries. In hard ALife, self-assembling or self-reconfiguring robots have been researched. Insect swarming has been used as inspiration for robots.
Since chemical components are nonliving but form living systems, artificial chemistries are used to study questions related to the origin of life from chemical components. One example is the computer simulation introduced by Varela, Maturana, and Uribe in 1974 of the formation of a protocell with a metabolic network and a boundary, which introduced the concept of autopoiesis. Other examples include M,R systems, the chemoton, the hypercycle, autocatalysis, and algorithmic chemistry. Artificial chemistries can also incorporate evolutionary dynamics.
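Autocatalysis, one of the listed ingredients, is simple enough to illustrate numerically: in the reaction A + X → 2X, the product X catalyzes its own formation, so its concentration grows sigmoidally while the food species A is consumed. The rate constant, time step, and concentrations below are arbitrary illustrative values:

```python
# Minimal numerical illustration of autocatalysis via Euler integration.
k, dt = 0.5, 0.01
a, x = 1.0, 0.001                 # concentrations of food A and replicator X
for _ in range(4000):
    rate = k * a * x              # law of mass action for A + X -> 2X
    a -= rate * dt
    x += rate * dt
print(f"final X concentration: {x:.3f}")   # approaches the conserved total A + X
```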
Astrobiology, the study of the origins, evolution, distribution, and future of life in the universe, uses computational biology and artificial life approaches. Astrobiologists use these approaches to explore the potential for alien life by investigating processes that could exist on other planets with different environmental and geochemical conditions.
The first implementation of a model of quantum artificial life on a quantum computer took place in 2018 on the IBM ibmqx4 cloud quantum computer, carried out by researchers in the Quantum Technologies for Information Science (QUTIS) group at the UPV/EHU-University of the Basque Country. The quantum artificial life algorithm followed Darwin’s laws of evolution. The biomimetic protocol, or quantum biomimetics, is the reproduction of certain properties of living beings in quantum systems. Each unit of quantum life is made up of two qubits that act as genotype and phenotype. The genotype contains information describing the type of living unit, and this information is transmitted across generations. The phenotype is the set of characteristics displayed by individuals, determined by the genetic information as well as by the interaction of the individuals with the environment. Interactions between individuals and mutations were implemented as random rotations of individual qubits.
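The basic ingredients (transferring genotype information to a phenotype qubit, then mutating with a small random rotation) can be reproduced in a plain-numpy statevector simulation. The sketch below is a schematic reconstruction under those assumptions, not the published QUTIS circuit:

```python
import numpy as np

def ry(theta):
    """Single-qubit rotation about the y axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],      # control: genotype (qubit 0)
                 [0, 1, 0, 0],      # target: phenotype (qubit 1)
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

rng = np.random.default_rng(1)

# Genotype prepared in a superposition; phenotype starts in |0>.
state = np.kron(ry(0.8) @ np.array([1.0, 0.0]), np.array([1.0, 0.0]))

# "Express" the genotype into the phenotype by entangling the two qubits.
state = CNOT @ state

# Mutation: a small random rotation applied to the genotype qubit alone.
state = np.kron(ry(rng.normal(0, 0.2)), I2) @ state

probs = np.abs(state) ** 2   # measurement probabilities for |00>,|01>,|10>,|11>
print(probs.round(3))
```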
One of the major questions about the origin of life concerns the organization of molecules into the complex molecular chains that became the basis for self-replicating life. Artificial life software called Avida runs self-replicating programs that mimic biology and evolution. Avida has been used to test the idea that life can be defined as information that self-replicates, with the laws of probability governing the selection of useful molecular systems for life. In the virtual world of Avida, virtual lifeforms are computer programs competing for CPU time and memory access, analogous to organisms competing for resources in the real world. The programs self-replicate with copy error to simulate DNA mutations and natural selection.
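The logic can be conveyed with a toy digital-evolution loop. Here “fitness” is simply the count of 1-bits and genomes are bit lists, whereas Avida’s actual organisms are programs written in a virtual-CPU instruction set; the parameters are arbitrary:

```python
import random

# Toy digital-evolution loop in the spirit of Avida: genomes replicate
# with copy error, and replication opportunities (a stand-in for CPU
# time) go preferentially to genomes that score better.

GENOME_LEN, POP_SIZE, MUTATION_RATE = 32, 50, 0.01
rng = random.Random(0)

def replicate(genome):
    """Copy a genome, flipping each bit with probability MUTATION_RATE."""
    return [b ^ (rng.random() < MUTATION_RATE) for b in genome]

population = [[0] * GENOME_LEN for _ in range(POP_SIZE)]
for _ in range(2000):
    # Fitness-weighted choice of who replicates; the offspring overwrites
    # a random individual, since memory is a finite resource.
    weights = [1 + sum(g) for g in population]
    parent = rng.choices(population, weights=weights)[0]
    population[rng.randrange(POP_SIZE)] = replicate(parent)

best = max(sum(g) for g in population)
print(f"best genome has {best} of {GENOME_LEN} bits set")
```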
In the field of evolutionary robotics, Professor A.E. (Gusz) Eiben at Vrije Universiteit Amsterdam investigates AI in a physical body, using artificial evolution as a way to achieve artificial intelligence. Eiben’s goal is to develop physically evolving robot systems that can reproduce, learn, and develop over consecutive generations. The study of evolving robots can be used to investigate questions about evolution and the emergence of intelligence in a non-carbon-based form. Eiben postulated that if evolution can create intelligence, artificial evolution can create artificial intelligence. His research team presented the first robot baby in 2016 as part of the Robot Baby Project. The robots consisted of configurations of motorized blocks that moved toward a bright light. Those that arrived quickly contacted one another to check whether they were suitable mates. Their code and hardware specifications were mixed, and the resulting design was synthesized using 3D printing to form a new robot.
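A reproduction step of this kind is often implemented as crossover plus mutation over a parametric genome. The flat encoding below is a hypothetical simplification for illustration (the Robot Baby Project’s genotype actually encodes a tree of body modules together with a neural controller):

```python
import random

rng = random.Random(42)

def reproduce(parent_a, parent_b, mutation_rate=0.05):
    """One-point crossover of two parent genomes plus Gaussian mutation."""
    cut = rng.randrange(1, len(parent_a))
    child = parent_a[:cut] + parent_b[cut:]          # mix the two parents
    return [g + rng.gauss(0, 0.1) if rng.random() < mutation_rate else g
            for g in child]                          # occasionally perturb a gene

# Two hypothetical parent genomes: flat lists of module parameters.
parent_a = [rng.uniform(0, 1) for _ in range(10)]
parent_b = [rng.uniform(0, 1) for _ in range(10)]
child = reproduce(parent_a, parent_b)                # spec for the next 3D print
```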
EvoSphere, funded through the Autonomous Robot Evolution (ARE) consortium in the UK, is an evolutionary robot habitat for studying evolution and the emergence of intelligence to answer questions important in the fields of biology and computer science. A potential application of this research is the development of robot breeding farms, controlled by humans, analogous to selective breeding used in farming. This approach and more open-ended evolutionary processes may be used in situations where designing a robot is difficult, such as robots that need to perform complex tasks in a dynamic environment. There are potential dangers associated with the emergence of unwanted or dangerous robot properties. One safeguard promoted by Eiben is to keep robot reproduction centralized, where it can be turned off if needed.
Swarm robotics is inspired by collaborative behaviors of animal groups such as ant colonies, schools of fish, and flocks of birds that behave in a coordinated fashion. In swarm robotics, swarm intelligence (the collective behavior of decentralized systems) is applied to AI. Without central control, a group of agents, or boids, collaborates in its environment while following a set of algorithmic rules. From the group, behavior emerges that solves a complex problem unknown to any individual agent.
An individual in a robot swarm may consist of a robot with a sensor such as a camera, radar, or sonar. One robot collects and shares data with others in a group, and the robot swarm combines the knowledge of many independent agents to make a single unified decision. In swarm robotics, interactions between robots are local, but collectively, the swarm accomplishes a task. Swarming is only one possible collective behavior that may occur in swarm robotics.
Robot platforms used in swarm robotics include Jasmine, Alice, Kilobots, e-pucks, swarm-bots, and swarmanoids. BEECLUST is an algorithm that enables a swarm of Jasmine III robots to find locations of maximum illumination by following a few simple rules. Researchers at the Wyss Institute at Harvard University developed Kilobots, which coordinate motion in a manner similar to schools of fish. A swarmanoid is composed of three types of bots that cooperate to perform tasks: footbots transport objects at ground level, handbots are specialized for climbing, and eyebots have visual sensors.
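A simplified version of the BEECLUST loop is easy to state: drive straight, and on meeting another robot, measure the local light and wait for a time that grows with brightness before turning away. No robot knows where the light is, yet clusters form at the brightest spot. The arena geometry, sensing radius, and waiting-time curve below are illustrative values, not the published Jasmine parameters:

```python
import random, math

ARENA, RADIUS, STEPS = 10.0, 0.5, 2000
rng = random.Random(3)

def light(x, y):
    """Brightness peaks at the arena corner (0, 0)."""
    return 1.0 / (1.0 + x * x + y * y)

robots = [{"x": rng.uniform(0, ARENA), "y": rng.uniform(0, ARENA),
           "h": rng.uniform(0, 2 * math.pi), "wait": 0} for _ in range(15)]

for _ in range(STEPS):
    for i, r in enumerate(robots):
        if r["wait"] > 0:                          # still waiting near a neighbor
            r["wait"] -= 1
            continue
        nx = r["x"] + 0.1 * math.cos(r["h"])
        ny = r["y"] + 0.1 * math.sin(r["h"])
        if not (0 <= nx <= ARENA and 0 <= ny <= ARENA):
            r["h"] = rng.uniform(0, 2 * math.pi)   # wall: just turn
            continue
        near = any(j != i and math.hypot(o["x"] - nx, o["y"] - ny) < RADIUS
                   for j, o in enumerate(robots))
        if near:                                   # neighbor: wait longer where brighter
            r["wait"] = int(60 * light(nx, ny))
            r["h"] = rng.uniform(0, 2 * math.pi)
        else:
            r["x"], r["y"] = nx, ny

mean_light = sum(light(r["x"], r["y"]) for r in robots) / len(robots)
print(f"mean brightness at robot positions: {mean_light:.2f}")
```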
Applications of swarm robotics include tasks such as mapping and foraging in environments that are difficult for humans to reach. The Wyss Institute at Harvard University is developing swarm robotics for search and rescue missions, construction, environmental remediation, and medical applications. Collective behaviors in nature and in swarm robotics systems are also used to study neuroscience, because cognition is a complex higher-order process that emerges from interactions between individual neurons.
The field of synthetic biology has historical connections to artificial life research. Both synthetic biology and artificial life strive to design “life as it could be,” living systems designed for a specific purpose or for scientific inquiry, and both use systems biology thinking and synthetic methodology. Synthetic biology may be considered a form of wet artificial life. The “holy grail” of wet artificial life and synthetic biology has been stated to be the creation of an artificial or synthetic cell from biochemicals or biological parts.
Liquid droplets can be viewed as liquid robots or animated soft matter because they possess lifelike properties and can be designed to move and be controlled by external forces. Liquid droplets can mimic the behavior of nonliving things, such as rocks rolling down a hill, but also the behavior of living cells or small organisms moving in response to stimuli. Applied forces or stimuli perceived by droplets include gravitational force and chemical gradients. A gradient of chemicals that are soluble in the fluid causes the drop to move along the gradient, which mimics chemotaxis, the movement of cells or organisms in response to a chemical stimulus. Electric and magnetic fields can also be applied to move liquid droplets. Haptotaxis is the motion induced by an adhesion gradient, which occurs due to differences in surface properties between the front and rear of the droplet. Liquid droplets can also be induced to divide or fuse. Future applications for droplet-based devices include delivering therapeutics inside the body and environmental sensing and remediation.
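Gradient-driven droplet motion can be sketched as a point particle drifting up a concentration field. The Gaussian field and mobility constant below are arbitrary, and real droplets are propelled by surface-tension (Marangoni) flows set up by the gradient rather than by sensing it directly:

```python
import numpy as np

source = np.array([5.0, 5.0])             # location of the chemical source

def gradient(p):
    """Gradient of a Gaussian concentration peak centered on `source`."""
    d = source - p
    return d * np.exp(-np.dot(d, d) / 50.0)

pos = np.array([0.0, 0.0])
mobility, dt = 0.5, 0.1
for _ in range(500):
    pos += mobility * gradient(pos) * dt   # drift along the local gradient

print(pos.round(2))                        # the droplet settles near the source
```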
Artificial cells, besides providing models for understanding life, could be developed for industrial applications. No artificial cells deemed to be living had been built from constituent parts as of 2022. Artificial cells that resemble living cells in functional and compositional complexity were constructed by Xu et al. (2022) from pieces of broken-down cells organized on a synthetic scaffold. These artificial cells exhibited several attributes of living cells: an outer membrane, a crowded interior, enzymatic reactions, protein synthesis, structural organization with compartments, a primitive cytoskeleton, and the ability to adopt asymmetric cellular shapes.
The artificial cells of Xu and colleagues were constructed by rupturing two types of bacteria and introducing their contents into membrane-free droplets called coacervates. One bacterium contributed components to the membrane surrounding the coacervate core, and the other contributed internal sub-compartments. Bacteria were later introduced as surrogate mitochondria for the production of ATP. The resulting artificial cells have a bacterium-derived membrane around a coacervate core that contains enzymes, plasmid DNA, protein-synthesis machinery, and vacuoles.
The approach of using living systems as the basis of artificial cells leaves many constituents unknown. In 2016, a minimal cell was engineered by J. Craig Venter and colleagues to contain only the genes thought to be essential for life, but even these cells contain genes with unknown functions. To create completely controllable artificial cells, researchers seek to identify the constituent parts that are needed and to understand how they interact with each other.
Artificial life art is the use of biologically inspired artifacts to prompt reconsideration of our views on life and its manufacture. Artists in this field explore our relationship with living systems and technology. Human-animal hybrids, such as mermaids, have long been depicted by artists and are considered life-as-it-could-be explorations.
The creative potential of generative processes is explored in software programs such as the Biomorphs described by Richard Dawkins in The Blind Watchmaker, in Latham’s evolved sculptures, and in Verstappen and Driessens’ Morphotheque works, which include variations on the shapes of carrots. Images can also be generated using software that incorporates the pheromone path-laying behavior of ants.