
Is the way you see your life and the world grounded in Truth? In a culture where everyone operates from the same basic beliefs, understanding the common worldview may not matter much. America used to be that kind of place, but it is no longer...
Or is it just a Mirage?