# 2.3] What is a dynamical system? (nonlinear science)

## Description

This article is from the Nonlinear Science FAQ, by James D. Meiss jdm@boulder.colorado.edu with numerous contributions by
others.

A dynamical system consists of an abstract phase space or state space, whose coordinates describe the dynamical state at any instant, and a dynamical rule which specifies the immediate future trend of all state variables, given only the present values of those same state variables. Mathematically, a dynamical system is described by an initial value problem.

Dynamical systems are "deterministic" if there is a unique consequent to every state, and "stochastic" or "random" if there is more than one consequent chosen from some probability distribution (the "perfect" coin toss has two consequents with equal probability for each initial state). Most of nonlinear science--and everything in this FAQ--deals with deterministic systems.
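The distinction can be made concrete with a small sketch: a deterministic rule returns the same consequent every time it is applied to a given state, while a stochastic rule draws its consequent from a probability distribution. The doubling map used below is a standard illustrative choice, not something specified in the FAQ itself.

```python
import random

# Deterministic rule: every state has exactly one consequent.
def doubling_map(z):
    """Doubling map on [0, 1): a standard deterministic example."""
    return (2.0 * z) % 1.0

# Stochastic rule: the "perfect" coin toss -- two consequents,
# each chosen with probability 1/2, regardless of the state.
def coin_toss(_state, rng):
    return rng.choice(["heads", "tails"])

rng = random.Random(0)
print(doubling_map(0.3))        # always the same consequent for state 0.3
print(coin_toss(0.3, rng))      # either outcome; not determined by the state
```

Applying `doubling_map` twice to the same state necessarily gives the same result; applying `coin_toss` twice may not.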

A dynamical system can have discrete or continuous time. The discrete case is defined by a map, z_1 = f(z_0), that gives the state z_1 resulting from the initial state z_0 at the next time value. The continuous case is defined by a "flow", z(t) = \phi_t(z_0), which gives the state at time t, given that the state was z_0 at time 0. A smooth flow can be differentiated with respect to time to give a differential equation, dz/dt = F(z). In this case we call F(z) a "vector field"; it gives a vector pointing in the direction of the velocity at every point in phase space.
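Both cases can be sketched in a few lines. In the discrete case we simply iterate the map; in the continuous case the flow \phi_t is generally not available in closed form, so below it is approximated by Euler time-stepping of dz/dt = F(z). The logistic map and the vector field F(z) = -z are illustrative choices, not examples from the FAQ.

```python
# Discrete time: iterate a map z_{n+1} = f(z_n).
# The logistic map (parameter 4) is an assumed illustrative choice of f.
def f(z):
    return 4.0 * z * (1.0 - z)

z = 0.2
orbit = [z]
for _ in range(5):
    z = f(z)
    orbit.append(z)
print(orbit)

# Continuous time: approximate the flow z(t) = phi_t(z0) of dz/dt = F(z)
# by Euler steps. F(z) = -z is a simple vector field whose exact flow
# is phi_t(z0) = z0 * exp(-t), so the result can be checked by hand.
def F(z):
    return -z

def flow(z0, t, steps=10000):
    dt = t / steps
    z = z0
    for _ in range(steps):
        z = z + dt * F(z)
    return z

print(flow(1.0, 1.0))  # should be close to exp(-1) ~ 0.3679
```

The Euler scheme is first-order accurate, so shrinking the step size dt brings the numerical trajectory closer to the true flow; in practice one would use a higher-order integrator such as Runge-Kutta.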
