Journal of Applied Mathematics and Stochastic Analysis
Volume 2005 (2005), Issue 2, Pages 159-165
doi:10.1155/JAMSA.2005.159
Abstract
We study Davis-type theorems on the optimal rate of convergence of moderate deviation probabilities. In the case of martingale difference sequences, under the finite pth moment hypothesis (1 ≤ p < ∞), and depending on the normalization factor, our results show that Davis' theorems either hold if and only if p > 2 or fail for all p ≥ 1. This is in sharp contrast with the classical case of i.i.d. centered sequences, where both of Davis' theorems hold under the finite second moment hypothesis (or even less).