The purpose of this paper is to provide a complete probabilistic analysis of a large class of stochastic differential games with mean-field interactions. We implement the Mean-Field Game strategy, developed analytically by Lasry and Lions, in a purely probabilistic framework, relying on tailor-made forms of the stochastic maximum principle. While we assume that the state dynamics are affine in the states and the controls and that the costs are convex, our assumptions on how the coefficients depend upon the statistical distribution of the states of the individual players remain rather general. Our probabilistic approach calls for the solution of systems of forward-backward stochastic differential equations of McKean-Vlasov type, for which no existence result was previously known; we prove existence of solutions and regularity of the corresponding value function. Finally, we prove that a solution of the Mean-Field Game problem, as formulated by Lasry and Lions, does indeed provide approximate Nash equilibria for games with a large number of players, and we quantify the nature of the approximation.
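As a schematic illustration (the notation here is generic and not taken from the paper itself), the stochastic maximum principle typically reduces such a control problem to a forward-backward system in which the marginal law of the forward state enters the coefficients, which is what makes the system of McKean-Vlasov type:

```latex
% Generic McKean-Vlasov FBSDE arising from the stochastic maximum principle.
% \mathcal{L}(X_t) denotes the law of X_t, H the Hamiltonian of the problem,
% and \hat{\alpha} a minimizer of H; this is a sketch, not the paper's exact system.
\begin{aligned}
dX_t &= b\bigl(t, X_t, \mathcal{L}(X_t), \hat{\alpha}_t\bigr)\,dt + \sigma\,dW_t, \\
dY_t &= -\,\partial_x H\bigl(t, X_t, \mathcal{L}(X_t), Y_t, \hat{\alpha}_t\bigr)\,dt + Z_t\,dW_t, \\
Y_T &= \partial_x g\bigl(X_T, \mathcal{L}(X_T)\bigr),
\end{aligned}
```

where the forward equation describes the controlled state, the backward equation propagates the adjoint variable, and the coupling through $\mathcal{L}(X_t)$ encodes the mean-field interaction.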
All Science Journal Classification (ASJC) codes
- Control and Optimization
- Applied Mathematics
Keywords
- McKean-Vlasov forward-backward stochastic differential equations
- Mean-field games
- Propagation of chaos
- Stochastic maximum principle