Is God obligated to heal you? It's a question I've been asking myself this past week. Well-meaning people have offered their advice about "how to get healed" by "having enough faith." The basic premise is that if you don't truly believe God can heal you, then you won't be healed. Or rather, the strength of your faith is what will heal you.
Watching my sister fight ovarian cancer, along with the myriad other issues she's had during this time, has led me to some serious questions about faith and healing. Mainly, are faith and healing directly correlated with each other? Some would say that you have to have "real" faith, one hundred and ten percent of it, and that if you want to be healed, the force of your faith in God is what will heal you.
Is that really the case, though? What does the Bible say about faith and healing? Even more to the point, is God obligated to heal us?
Join me as I talk about this very important topic. Call in at 646-668-8485 and press 1 to be live on air. You can also download Stitcher on your mobile device, or click the link here.