God has not called us to church tradition and religion; He has, in fact, called us to advance and increase HIS Kingdom on the earth. The church was never intended to be merely a religious institution, but a place of education in Kingdom mandates, principles, and order. When the good news that "the Kingdom is at hand" is preached, deliverance, miracles, and prosperity will no longer be random events; they will become our expectation and our lifestyle...