Lenia

Continuous generalization of cellular automata
A sample autonomous pattern from Lenia.
An animation showing the movement of a glider in Lenia.

Lenia is a family of cellular automata created by Bert Wang-Chak Chan.[1][2][3] It is intended to be a continuous generalization of Conway's Game of Life, with continuous states, space and time. As a consequence of its continuous, high-resolution domain, the complex autonomous patterns ("lifeforms" or "spaceships") generated in Lenia are described as differing from those appearing in other cellular automata, being "geometric, metameric, fuzzy, resilient, adaptive, and rule-generic".[1]

Lenia won the 2018 Virtual Creatures Contest at the Genetic and Evolutionary Computation Conference in Kyoto,[4] an honorable mention for the ALIFE Art Award at ALIFE 2018 in Tokyo,[5] and Outstanding Publication of 2019 by the International Society for Artificial Life (ISAL).[6]

Rules

Iterative updates

Let $\mathcal{L}$ be the lattice or grid containing a set of states $S^{\mathcal{L}}$. Like many cellular automata, Lenia is updated iteratively; each output state is a pure function of the previous state, such that

$$\Phi(A^0) = A^{\Delta t},\quad \Phi(A^{\Delta t}) = A^{2\Delta t},\quad \ldots,\quad \Phi(A^t) = A^{t+\Delta t},\quad \ldots$$

where $A^0$ is the initial state and $\Phi : S^{\mathcal{L}} \rightarrow S^{\mathcal{L}}$ is the global rule, representing the application of the local rule over every site $\mathbf{x} \in \mathcal{L}$. Thus $\Phi^N(A^t) = A^{t+N\Delta t}$.

If the simulation is advanced by $\Delta t$ at each timestep, then the time resolution is $T = \frac{1}{\Delta t}$.

State sets

Let $S = \{0, 1, \ldots, P-1, P\}$ with maximum $P \in \mathbb{Z}$. This is the state set of the automaton and characterizes the possible states that may be found at each site. A larger $P$ corresponds to a higher state resolution in the simulation. Many cellular automata use the lowest possible state resolution, i.e. $P = 1$; Lenia allows for much higher resolutions. Note that the actual value at each site is not an integer in $\{0, \ldots, P\}$ but rather an integer multiple of $\Delta p = \frac{1}{P}$; therefore we have $A^t(\mathbf{x}) \in [0, 1]$ for all $\mathbf{x} \in \mathcal{L}$. For example, given $P = 4$, $\mathbf{A}^t(\mathbf{x}) \in \{0, 0.25, 0.5, 0.75, 1\}$.
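As an illustrative sketch (the helper names are not from Chan's code), the allowed states for a given resolution $P$ can be enumerated, and an arbitrary value in $[0, 1]$ snapped to the nearest allowed state:

```python
# State resolution P: site values are integer multiples of dp = 1/P.
P = 4
dp = 1 / P
states = [k * dp for k in range(P + 1)]  # -> [0.0, 0.25, 0.5, 0.75, 1.0]

def quantize(a: float, P: int) -> float:
    """Snap a value in [0, 1] to the nearest multiple of 1/P."""
    return round(a * P) / P
```

With $P = 1$ this reduces to the binary states of automata like the Game of Life.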

Neighborhoods

A 9-square Moore neighborhood like those used in Game of Life.
The "ball" neighborhoods used by Lenia.

Mathematically, neighborhoods like those in Game of Life may be represented using a set of position vectors in $\mathbb{R}^2$. For the classic Moore neighborhood used by Game of Life, for instance, $\mathcal{N} = \{-1, 0, 1\}^2$, i.e. a square of size 3 centered on every site.

In Lenia's case, the neighborhood is instead a ball of radius $R$ centered on a site, $\mathcal{N} = \{\mathbf{x} \in \mathcal{L} : \lVert \mathbf{x} \rVert_2 \leq R\}$, which may include the original site itself.

Note that the neighborhood vectors are not the absolute position of the elements, but rather a set of relative positions (deltas) with respect to any given site.
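On a discrete lattice, such a ball of relative offsets can be sketched as follows (the function name is illustrative):

```python
def ball_neighborhood(R: int) -> list:
    """Relative offsets n with |n|_2 <= R; includes the origin (0, 0)."""
    return [(dy, dx)
            for dy in range(-R, R + 1)
            for dx in range(-R, R + 1)
            if dy * dy + dx * dx <= R * R]
```

For $R = 1$ this gives the origin plus its four axis-aligned neighbors; larger $R$ approximates a disc.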

Local rule

There are discrete and continuous variants of Lenia. Let $\mathbf{x}$ be a vector in $\mathbb{R}^2$ within $\mathcal{L}$ representing the position of a given site, and $\mathcal{N}$ be the set of sites neighboring $\mathbf{x}$. Both variants comprise two stages:

  1. Using a convolution kernel $\mathbf{K} : \mathcal{N} \rightarrow S$ to compute the potential distribution $\mathbf{U}^t(\mathbf{x}) = (\mathbf{K} * \mathbf{A}^t)(\mathbf{x})$.
  2. Using a growth mapping $G : [0, 1] \rightarrow [-1, 1]$ to compute the final growth distribution $\mathbf{G}^t(\mathbf{x}) = G(\mathbf{U}^t(\mathbf{x}))$.

Once $\mathbf{G}^t$ is computed, it is scaled by the chosen time step $\Delta t$ and added to the original state value:

$$\mathbf{A}^{t+\Delta t}(\mathbf{x}) = \operatorname{clip}(\mathbf{A}^t(\mathbf{x}) + \Delta t\,\mathbf{G}^t(\mathbf{x}),\ 0,\ 1)$$

Here, the clip function is defined by $\operatorname{clip}(u, a, b) := \min(\max(u, a), b)$.

The local rules are defined as follows for discrete and continuous Lenia:

$$\begin{aligned}
\mathbf{U}^t(\mathbf{x}) &= \begin{cases}
\sum_{\mathbf{n} \in \mathcal{N}} \mathbf{K}(\mathbf{n})\,\mathbf{A}^t(\mathbf{x}+\mathbf{n})\,\Delta x^2, & \text{discrete Lenia} \\
\int_{\mathbf{n} \in \mathcal{N}} \mathbf{K}(\mathbf{n})\,\mathbf{A}^t(\mathbf{x}+\mathbf{n})\,dx^2, & \text{continuous Lenia}
\end{cases} \\
\mathbf{G}^t(\mathbf{x}) &= G(\mathbf{U}^t(\mathbf{x})) \\
\mathbf{A}^{t+\Delta t}(\mathbf{x}) &= \operatorname{clip}(\mathbf{A}^t(\mathbf{x}) + \Delta t\,\mathbf{G}^t(\mathbf{x}),\ 0,\ 1)
\end{aligned}$$
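The discrete local rule can be sketched as follows (an illustrative implementation, not Chan's code: the site spacing is taken as $\Delta x = 1$ so the $\Delta x^2$ factor drops out, the kernel is passed as explicit offset/weight pairs, and the grid wraps around toroidally):

```python
import numpy as np

def lenia_step(a, offsets, weights, growth, dt):
    """One discrete-Lenia step: U = K * A, G = growth(U), A <- clip(A + dt*G, 0, 1)."""
    u = np.zeros_like(a)
    for (dy, dx), w in zip(offsets, weights):
        # Sample A(x + n) by shifting the grid by -n with periodic wrap-around.
        u += w * np.roll(np.roll(a, -dy, axis=0), -dx, axis=1)
    return np.clip(a + dt * growth(u), 0.0, 1.0)
```

Any kernel and growth mapping of the forms described below can be plugged into `offsets`/`weights` and `growth`.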

Kernel generation

The kernel shell, kernel skeleton, and growth mappings for Lenia.

There are many ways to generate the convolution kernel $\mathbf{K}$. The final kernel is the composition of a kernel shell $K_C$ and a kernel skeleton $K_S$.

For the kernel shell $K_C$, Chan gives several functions that are defined radially. Kernel shell functions are unimodal and subject to the constraint $K_C(0) = K_C(1) = 0$ (and typically $K_C\left(\frac{1}{2}\right) = 1$ as well). Example kernel functions include:

$$K_C(r) = \begin{cases}
\exp\left(\alpha - \frac{\alpha}{4r(1-r)}\right), & \text{exponential},\ \alpha = 4 \\
(4r(1-r))^\alpha, & \text{polynomial},\ \alpha = 4 \\
\mathbf{1}_{\left[\frac{1}{4}, \frac{3}{4}\right]}(r), & \text{rectangular} \\
\ldots, & \text{etc.}
\end{cases}$$

Here, $\mathbf{1}_A(r)$ is the indicator function.
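The three example shells can be sketched as follows (illustrative helper names; the exponential shell clips $r$ slightly away from 0 and 1, where the true function's limiting value is 0):

```python
import numpy as np

def shell_exponential(r, alpha=4.0):
    """exp(alpha - alpha / (4r(1-r))): 0 at r = 0 and r = 1, peak 1 at r = 1/2."""
    r = np.clip(np.asarray(r, dtype=float), 1e-9, 1 - 1e-9)
    return np.exp(alpha - alpha / (4 * r * (1 - r)))

def shell_polynomial(r, alpha=4.0):
    """(4r(1-r))^alpha: a polynomial bump with the same endpoints and peak."""
    r = np.asarray(r, dtype=float)
    return (4 * r * (1 - r)) ** alpha

def shell_rectangular(r):
    """Indicator of [1/4, 3/4]."""
    r = np.asarray(r, dtype=float)
    return ((r >= 0.25) & (r <= 0.75)).astype(float)
```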

Once the kernel shell has been defined, the kernel skeleton $K_S$ is used to expand it and compute the actual values of the kernel by transforming the shell into a series of concentric rings. The height of each ring is controlled by a kernel peak vector $\beta = (\beta_1, \beta_2, \ldots, \beta_B) \in [0, 1]^B$, where $B$ is the rank of the parameter vector. Then the kernel skeleton $K_S$ is defined as

$$K_S(r; \beta) = \beta_{\lfloor Br \rfloor}\,K_C(Br \bmod 1)$$

The final kernel $\mathbf{K}(\mathbf{n})$ is therefore

$$\mathbf{K}(\mathbf{n}) = \frac{K_S(\lVert \mathbf{n} \rVert_2)}{|K_S|}$$

such that $\mathbf{K}$ is normalized to have an element sum of $1$ and $\mathbf{K} * \mathbf{A} \in [0, 1]$ (for conservation of mass), where $|K_S| = \sum_{\mathcal{N}} K_S\,\Delta x^2$ in the discrete case, and $\int_{\mathcal{N}} K_S\,dx^2$ in the continuous case.
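Kernel generation can be sketched end-to-end in the discrete case (illustrative helper names; the kernel is sampled on a $(2R+1) \times (2R+1)$ grid with radii normalized by $R$, and the $\Delta x^2$ factor is absorbed into the final normalization to unit sum):

```python
import numpy as np

def kernel_skeleton(r, beta, shell):
    """K_S(r; beta) = beta[floor(B*r)] * K_C(B*r mod 1), zero for r >= 1."""
    beta = np.asarray(beta, dtype=float)
    B = len(beta)
    r = np.asarray(r, dtype=float)
    idx = np.minimum(np.floor(B * r).astype(int), B - 1)  # clamp index for r >= 1
    return np.where(r < 1, beta[idx] * shell(B * r % 1), 0.0)

def make_kernel(R, beta, shell):
    """Sample K_S over a (2R+1)x(2R+1) ball and normalize to element sum 1."""
    y, x = np.mgrid[-R:R + 1, -R:R + 1]
    r = np.sqrt(x * x + y * y) / R
    K = kernel_skeleton(r, beta, shell)
    return K / K.sum()
```

With $B = 1$ and $\beta = (1)$ this produces a single ring shaped by the chosen shell; longer peak vectors produce multi-ring kernels.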

Growth mappings

The growth mapping $G : [0, 1] \rightarrow [-1, 1]$, which is analogous to an activation function, may be any function that is unimodal, nonmonotonic, and accepts parameters $\mu, \sigma \in \mathbb{R}$. Examples include

$$G(u; \mu, \sigma) = \begin{cases}
2\exp\left(-\frac{(u-\mu)^2}{2\sigma^2}\right) - 1, & \text{exponential} \\
2 \cdot \mathbf{1}_{[\mu \pm 3\sigma]}(u)\left(1 - \frac{(u-\mu)^2}{9\sigma^2}\right)^\alpha - 1, & \text{polynomial},\ \alpha = 4 \\
2 \cdot \mathbf{1}_{[\mu \pm \sigma]}(u) - 1, & \text{rectangular} \\
\ldots, & \text{etc.}
\end{cases}$$

where $u$ is a potential value drawn from $\mathbf{U}^t$.
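The exponential and rectangular growth mappings can be sketched as follows (illustrative helper names):

```python
import numpy as np

def growth_exponential(u, mu, sigma):
    """2 exp(-(u - mu)^2 / (2 sigma^2)) - 1: peaks at +1 when u = mu, -> -1 far away."""
    u = np.asarray(u, dtype=float)
    return 2.0 * np.exp(-((u - mu) ** 2) / (2 * sigma ** 2)) - 1.0

def growth_rectangular(u, mu, sigma):
    """+1 when |u - mu| <= sigma, else -1."""
    u = np.asarray(u, dtype=float)
    return 2.0 * (np.abs(u - mu) <= sigma).astype(float) - 1.0
```

Both are unimodal with a maximum of $+1$ at $u = \mu$, so sites whose potential lies near $\mu$ grow and all others decay.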

Game of Life

The Game of Life may be regarded as a special case of discrete Lenia with $R = T = P = 1$. In this case, the kernel would be rectangular, with the function

$$K_C(r) = \mathbf{1}_{\left[\frac{1}{4}, \frac{3}{4}\right]}(r) + \frac{1}{2}\mathbf{1}_{\left[0, \frac{1}{4}\right)}(r)$$

and the growth rule also rectangular, with $\mu = 0.35$, $\sigma = 0.07$.
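As an illustrative check (a sketch, not Chan's code): sampled on a $3 \times 3$ grid, this shell weights each of the 8 neighbours by $1$ and the centre by $\frac{1}{2}$ (sum $8.5$ before normalization), and the rectangular growth fires for potentials in $[\mu - \sigma, \mu + \sigma] = [0.28, 0.42]$, which is exactly the band produced by two or three live neighbours:

```python
import numpy as np

# 3x3 kernel from K_C above: neighbours weight 1, centre 1/2, then normalize.
K = np.array([[1.0, 1.0, 1.0],
              [1.0, 0.5, 1.0],
              [1.0, 1.0, 1.0]])
K /= K.sum()  # |K_S| = 8.5

def growth(u, mu=0.35, sigma=0.07):
    """Rectangular growth: +1 inside [mu - sigma, mu + sigma], else -1."""
    return 2.0 * (np.abs(u - mu) <= sigma).astype(float) - 1.0

def step(a):
    """One Lenia step with dt = 1 (T = 1) on a toroidal grid."""
    u = np.zeros_like(a)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            u += K[dy + 1, dx + 1] * np.roll(np.roll(a, -dy, axis=0), -dx, axis=1)
    return np.clip(a + growth(u), 0.0, 1.0)
```

Under this rule a Life blinker oscillates with period 2, exactly as in Conway's original automaton.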

Patterns

Some of the wide variety of "species" in Lenia.

By varying the convolutional kernel, the growth mapping and the initial condition, over 400 "species" of "life" have been discovered in Lenia, displaying "self-organization, self-repair, bilateral and radial symmetries, locomotive dynamics, and sometimes chaotic nature".[7] Chan has created a taxonomy for these patterns.[1]

Related work

Cellular automata as a convolutional neural network.[8]

Other works have noted the strong similarity between cellular automata update rules and convolutions. Indeed, these works have focused on reproducing cellular automata using simplified convolutional neural networks. Mordvintsev et al. investigated the emergence of self-repairing pattern generation.[9] Gilpin found that any cellular automaton could be represented as a convolutional neural network, and trained neural networks to reproduce existing cellular automata.[8]

In this light, cellular automata may be seen as a special case of recurrent convolutional neural networks. Lenia's update rule may likewise be seen as a single-layer convolution (with the kernel $\mathbf{K}$, producing the "potential field") followed by an activation function (the "growth mapping" $G$). However, Lenia uses far larger, fixed kernels and is not trained via gradient descent.

External links

  • The GitHub repository for Lenia
  • Chan's website for Lenia
  • An invited seminar at Stanford given by Chan

References

  1. Chan, Bert Wang-Chak (2019-10-15). "Lenia: Biology of Artificial Life". Complex Systems. 28 (3): 251–286. arXiv:1812.05433. doi:10.25088/ComplexSystems.28.3.251.
  2. "Lenia". chakazul.github.io. Retrieved 2021-10-12.
  3. Roberts, Siobhan (2020-12-28). "The Lasting Lessons of John Conway's Game of Life". The New York Times. ISSN 0362-4331. Retrieved 2021-10-13.
  4. "The virtual creatures competition". virtualcreatures.github.io. Retrieved 2021-10-12.
  5. "ALife Art Award 2018". ALIFE Art Award 2018. Retrieved 2021-10-12.
  6. "2020 ISAL Awards: Winners".
  7. "Lenia". chakazul.github.io. Retrieved 2021-10-13.
  8. Gilpin, William (2019-09-04). "Cellular automata as convolutional neural networks". Physical Review E. 100 (3): 032402. arXiv:1809.02942. doi:10.1103/PhysRevE.100.032402. ISSN 2470-0045.
  9. Mordvintsev, Alexander; Randazzo, Ettore; Niklasson, Eyvind; Levin, Michael (2020-02-11). "Growing Neural Cellular Automata". Distill. 5 (2): e23. doi:10.23915/distill.00023. ISSN 2476-0757.