In information theory, the fundamental tool is the entropy function, whose upper bound is derived using Jensen's inequality. In this paper, we extend Jensen's inequality and apply it to derive useful lower bounds for various entropy measures of discrete random variables.
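The classical upper bound mentioned above follows from applying Jensen's inequality to the concave logarithm; a standard sketch, for a discrete random variable $X$ with distribution $(p_1,\dots,p_n)$:
\begin{align*}
H(X) &= \sum_{i=1}^{n} p_i \log \frac{1}{p_i}
      = \mathbb{E}\!\left[\log \frac{1}{p_X}\right] \\
     &\le \log \mathbb{E}\!\left[\frac{1}{p_X}\right]
      = \log \sum_{i=1}^{n} p_i \cdot \frac{1}{p_i}
      = \log n ,
\end{align*}
with equality if and only if $X$ is uniform. The lower bounds developed in this paper arise from refinements of the same Jensen step.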