Decision problems for differential equations
Published online by Cambridge University Press: 12 March 2014
Extract
In this paper we shall consider some decision problems for ordinary differential equations. All differential equations will be algebraic differential equations, i.e. equations of the form P(x, y, y′, …, y(n)) = 0 (or P(x, y1, …, ym, y′1, …, y′m, …) = 0 in the case of several dependent variables), where P is a polynomial in all its variables with rational coefficients. (We call P a differential polynomial.)
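As a concrete illustration of the definition (this example is mine, not the paper's), the Airy equation y″ = xy is an algebraic differential equation: it arises from a differential polynomial with rational coefficients.

```latex
% Illustrative example (not from the paper): the Airy equation
% as an algebraic differential equation P(x, y, y', y'') = 0,
% where P is the differential polynomial
%   P(x, Y_0, Y_1, Y_2) = Y_2 - x Y_0,
% a polynomial in x and the indeterminates Y_0, Y_1, Y_2
% (standing for y, y', y'') with rational coefficients.
y'' - x\,y = 0
```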
Jaśkowski [6] showed that there is no algorithm to determine whether a system of algebraic differential equations (in several dependent variables) has a solution on [0, 1]. An easier proof of this, using results on Hilbert's tenth problem, is given in [2]. It is natural to restrict the problem in the hope of finding something which is decidable. Two ways to do this are the following: (a) One can ask only for the existence of solutions locally, say around x = 0. Here solution may mean (i) analytic functions, (ii) germs of C∞ functions, or (iii) formal power series. (b) One can consider only one equation in one dependent variable, but ask for the existence of a solution on an interval. In [4] we gave an algorithm for deciding (a)(ii) and (a)(iii) and showed that (a)(i) is undecidable. In §4 (Theorem 4.1) we shall show that (b) (for real analytic solutions) is undecidable. This partially answers Problem 9 of Rubel [13], which stimulated this investigation. We shall also prove (Theorem 4.2) that, given an analytic function f(x) which is the solution of an algebraic differential equation, determined by some initial conditions at x = 0, one cannot in general determine whether the radius of convergence of f is < 1 or ≥ 1.
- Type: Research Article
- Copyright: © Association for Symbolic Logic 1989