While this world is full of counterfeit, so-called "faith healers," the Bible nevertheless explicitly reveals that God does heal people. The power to heal the sick is a gift from God through His Holy Spirit, and healing is something about which Jesus Christ carefully instructed His Church.