An inverse eigenvalue problem, in which a matrix is to be constructed from some or all of its eigenvalues, may not have a real-valued solution at all. An approximate solution in the sense of least squares is then often desirable. Two types of least squares problems are formulated and explored in this paper. Despite their different appearances, the two problems are shown to be equivalent. Consequently, a single new numerical method, modified from the conventional alternating projection method, is proposed. The method converges linearly and globally, and it can be used to generate good starting values for other faster but more expensive and locally convergent methods. The idea can also be applied to multiplicative inverse eigenvalue problems for the purpose of preconditioning. Numerical examples are presented.
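To make the alternating-projection idea concrete, the following is a minimal sketch in Python, not the method proposed in the paper. It assumes real symmetric matrices, a full prescribed spectrum, and an affine structural family A(c) = A_0 + sum_k c_k A_k; the function names, the parametrization, and the stopping rule are all illustrative assumptions. The iteration alternates between the set of symmetric matrices with the prescribed spectrum and the affine family, each step being a Frobenius-norm projection.

```python
import numpy as np

def project_spectral(X, target_eigs):
    """Nearest symmetric matrix (Frobenius norm) with the prescribed spectrum:
    keep the eigenvectors of X, replace its eigenvalues by the sorted targets."""
    w, Q = np.linalg.eigh(X)                 # eigenvalues returned in ascending order
    lam = np.sort(np.asarray(target_eigs, float))
    return Q @ np.diag(lam) @ Q.T

def project_affine(X, A0, basis):
    """Least-squares projection of X onto the affine family
    A(c) = A0 + sum_k c_k * basis[k] (Frobenius norm)."""
    M = np.column_stack([B.ravel() for B in basis])
    c, *_ = np.linalg.lstsq(M, (X - A0).ravel(), rcond=None)
    return A0 + sum(ck * Bk for ck, Bk in zip(c, basis)), c

def alternating_projection(A0, basis, target_eigs, c0=None, tol=1e-10, maxit=500):
    """Alternate the two projections until the affine iterates stabilize.
    A0 and every basis matrix are assumed symmetric."""
    c = np.zeros(len(basis)) if c0 is None else np.asarray(c0, float)
    A = A0 + sum(ck * Bk for ck, Bk in zip(c, basis))
    for _ in range(maxit):
        Z = project_spectral(A, target_eigs)       # "lift" to the spectral set
        A_new, c = project_affine(Z, A0, basis)    # project back to the affine family
        if np.linalg.norm(A_new - A, 'fro') < tol:
            A = A_new
            break
        A = A_new
    return A, c

# Illustrative usage: 3 free parameters, 5 prescribed eigenvalues,
# so only a least-squares fit can be expected in general.
rng = np.random.default_rng(0)
n, m = 5, 3
A0 = np.zeros((n, n))
basis = []
for _ in range(m):
    B = rng.standard_normal((n, n))
    basis.append((B + B.T) / 2)                    # keep the family symmetric
target = np.array([-2.0, -1.0, 0.0, 1.0, 3.0])
A, c = alternating_projection(A0, basis, target)
print(np.sort(np.linalg.eigvalsh(A)))              # close to target only if the residual is small
```

The sketch illustrates the structure of such an iteration (one cheap projection onto each constraint set per step) rather than the specific equivalence or convergence results established in the paper.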