Let an estimated function belong to a Lipschitz class of order $\alpha$. Consider a minimax approach where the infimum is taken over all possible estimators and the supremum is taken over the considered class of estimated functions. It is known that, if the order $\alpha$ is unknown, then the minimax mean squared (pointwise) error convergence slows down from $n^{-2\alpha/(2\alpha+1)}$, attainable when $\alpha$ is given, to $(n/\ln n)^{-2\alpha/(2\alpha+1)}$. At the same time, the minimax mean integrated squared (global) error convergence is proportional to $n^{-2\alpha/(2\alpha+1)}$ whether $\alpha$ is known or unknown.
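To make the pointwise price of adaptation explicit, the ratio of the adaptive rate to the rate under known $\alpha$ follows directly from the two expressions above:
\[
\frac{(n/\ln n)^{-2\alpha/(2\alpha+1)}}{n^{-2\alpha/(2\alpha+1)}} = (\ln n)^{2\alpha/(2\alpha+1)} \to \infty \quad \text{as } n \to \infty,
\]
so not knowing $\alpha$ costs a logarithmic factor in the pointwise risk, while the global risk incurs no such factor.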
We show that a similar phenomenon holds for analytic functions, where the lack of knowledge of the maximal set to which the function can be analytically continued leads to the loss of a sharp constant. Surprisingly, in the more general adaptive minimax setting, where we consider the union of a range of Lipschitz classes and a range of analytic classes, neither the pointwise error convergence nor the global error convergence suffers an additional slowing down.