Delay-induced Turing instability in reaction-diffusion equations.

Phys Rev E Stat Nonlin Soft Matter Phys

Hubei Key Lab of Intelligent Robot, Wuhan Institute of Technology, Wuhan 430073, China.

Published: November 2014

Time delays are commonly used in modeling biological systems and can significantly change their dynamics. A number of works have focused on the effect of small delays on the pattern formation of biological systems. In this paper, we investigate the effect of a delay of arbitrary size on the formation of Turing patterns in reaction-diffusion equations. First, for a delay system in a general form, we propose a technique for calculating the critical value of the time delay above which a Turing instability occurs. We then apply the technique to a predator-prey model and study the pattern formation induced by the delay. For the model in question, we find that when the time delay is small the model exhibits either a uniform steady state or irregular patterns, neither of which is of Turing type; in the presence of a large delay, however, we find spiral patterns of Turing type. For this model, we also find that the critical delay is a decreasing function of the ratio of the carrying capacity to the half saturation of the prey density.
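As an illustration of the kind of critical-delay calculation the abstract describes, consider a hypothetical scalar linearization lambda = a + b*exp(-lambda*tau) - d*k^2 about a uniform steady state. This is a caricature only: the paper's technique applies to general delay systems, and the parameters a, b, d below are illustrative, not taken from the paper. Setting lambda = i*omega and splitting real and imaginary parts gives the critical delay for each wavenumber k:

```python
import math

def critical_delay(a, b, d, k):
    """Smallest tau > 0 at which the Fourier mode with wavenumber k of the
    (hypothetical) scalar linearization
        lambda = a + b*exp(-lambda*tau) - d*k**2
    acquires a purely imaginary eigenvalue lambda = i*omega."""
    c = a - d * k**2                  # delay-free growth rate of mode k
    if b**2 <= c**2:
        return None                   # no purely imaginary root exists
    omega = math.sqrt(b**2 - c**2)    # from cos^2 + sin^2 = 1
    # Substituting lambda = i*omega and separating real/imaginary parts:
    #   cos(omega*tau) = -c/b,  sin(omega*tau) = -omega/b
    theta = math.atan2(-omega / b, -c / b)
    if theta < 0.0:
        theta += 2.0 * math.pi        # take the first positive crossing
    return theta / omega

# The overall critical delay is the minimum over wavenumbers k; the
# minimizing k sets the wavelength of the emerging pattern.
a, b, d = 0.5, -1.0, 1.0              # illustrative parameters
ks = [i * 0.01 for i in range(121)]
taus = [t for t in (critical_delay(a, b, d, k) for k in ks) if t is not None]
tau_c = min(taus)
```

Note that a genuine Turing (finite-wavelength) instability requires the minimum of the critical delay to occur at a nonzero wavenumber, which a scalar caricature like this typically cannot produce (here the minimum sits at k = 0, a uniform Hopf bifurcation); multi-component models such as the paper's predator-prey system are needed for that, but the scan over k proceeds the same way.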

DOI: http://dx.doi.org/10.1103/PhysRevE.90.052908

