Numerical Simulation of Feller’s Diffusion Equation
This article is devoted to Feller’s diffusion equation, which arises naturally in probability and physics (e.g., wave turbulence theory). If discretized naively, this equation can present serious numerical difficulties, since its diffusion coefficient is effectively unbounded and most of its solutions are weakly divergent at the origin. In order to …
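For context, one commonly quoted form of Feller’s singular diffusion equation on the half-line (Feller, 1951) is given below; the constants a, b, c here are generic and are not necessarily the notation or normalization adopted in this article:

\[
  \frac{\partial u}{\partial t}
    \;=\; \frac{\partial^{2}}{\partial x^{2}}\bigl(a\,x\,u\bigr)
    \;-\; \frac{\partial}{\partial x}\bigl((b\,x + c)\,u\bigr),
  \qquad x > 0,\ t > 0.
\]

In this classical setting the diffusion coefficient \(a\,x\) grows linearly in \(x\) and is therefore unbounded on the half-line, while near the origin solutions typically behave like \(x^{c/a-1}\), an integrable but divergent profile when \(0 < c < a\); this is the kind of singular behaviour the abstract alludes to.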