In this provocative book, H.W. Brands confronts the vital question of why an ever-increasing number of Americans do not trust the federal government to improve their lives or heal major social ills. From the Revolution on, Brands argues, Americans have been chronically skeptical of their government.