This paper presents a model for describing the positional error of line segments in geographical information systems (GIS). The model is based on stochastic process theory, under the assumption that the errors of the endpoints of a line segment follow two-dimensional normal distributions. The distribution and density functions of the line segments are derived statistically. The uncertainty information matrix of line segments is derived to indicate the error of an arbitrary point on the line segment. The model covers the cases where the two endpoints are correlated with each other and where points on the line segment are stochastically continuous with one another. It is a more generic error band model than those previously developed and is called the G-Band model.
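The kind of error propagation the model formalizes can be illustrated with a minimal sketch. Assuming (as the abstract states) that the endpoint errors are two-dimensional normal and possibly correlated, an arbitrary point on the segment, p(t) = (1 - t)p1 + t*p2, has a covariance matrix obtained by standard propagation of the endpoint covariances. The function name, the numeric values, and the cross-covariance structure below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def point_covariance(t, cov1, cov2, cov12):
    """Covariance of the point p(t) = (1 - t)*p1 + t*p2 on a segment whose
    endpoints have 2-D normal errors with covariance matrices cov1 and cov2
    and cross-covariance cov12 (all 2x2 arrays). Standard linear error
    propagation: Var[(1-t)Z1 + t*Z2] expands into four bilinear terms."""
    return ((1 - t) ** 2 * cov1
            + t ** 2 * cov2
            + t * (1 - t) * (cov12 + cov12.T))

# Illustrative values: identical isotropic endpoint errors (variance 0.04
# per coordinate) with positive cross-covariance 0.02 between endpoints.
cov1 = np.eye(2) * 0.04
cov2 = np.eye(2) * 0.04
cov12 = np.eye(2) * 0.02

# Covariance at the midpoint (t = 0.5).
mid = point_covariance(0.5, cov1, cov2, cov12)
```

Evaluating the covariance over all t in [0, 1] traces out an error region around the segment, which is the intuition behind an error band.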