The paper studies discrete/finite-difference approximations of optimal control problems governed by continuous-time dynamical systems with endpoint constraints. Finite-difference systems, considered as parametric control problems with a decreasing step of discretization, occupy an intermediate position between continuous-time and discrete-time (fixed-step) control processes and play a significant role in both the qualitative and numerical aspects of optimal control. In this paper we derive an enhanced version of the Approximate Maximum Principle for finite-difference control systems, which is new even for problems with smooth endpoint constraints on trajectories and appears to be the first result in the literature that holds for nonsmooth objectives and endpoint constraints. The results obtained establish necessary optimality conditions for constrained nonconvex finite-difference control systems and justify the stability of the Pontryagin Maximum Principle for continuous-time systems under discrete approximations.
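The type of discrete approximation referred to above can be illustrated by the following minimal sketch (not taken from the paper): an Euler finite-difference scheme, the standard discretization in this setting, replacing the continuous-time dynamics by a parametric family of discrete systems whose step $h_N$ decreases as $N \to \infty$.

```latex
% Sketch (illustrative only): continuous-time control system and its
% Euler finite-difference approximation with step h_N = (T - t_0)/N.
\begin{align*}
  &\dot{x}(t) = f\bigl(x(t), u(t)\bigr), \qquad u(t) \in U,
    \quad t \in [t_0, T],\\[2pt]
  &x_N(t_{j+1}) = x_N(t_j) + h_N\, f\bigl(x_N(t_j), u_N(t_j)\bigr), \qquad
    t_j = t_0 + j h_N, \quad j = 0, \ldots, N - 1.
\end{align*}
```

As $h_N \downarrow 0$, each discrete problem approximates the continuous-time one, which is the sense in which such systems occupy an intermediate position between continuous-time and fixed-step discrete-time control processes.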