Why do so many Christians in America believe professional Christians are actually destroying Christianity? Clearly, no one person or group of leaders can truly destroy what God has instituted. But with the never-ending need to fundraise, to professionalize everything, and to treat the Gospel and the Church like a corporation, one has to ask: why do we need faith at all, when all they seem to want is more money? Even missionaries and faith-based nonprofits press on in this endless pursuit of more, with the underlying assumption that they cannot share the love of Jesus without first receiving a hefty donation. Did Jesus really build His mission and His church on the backs of dollar bills?

As American philosopher Sam Pascoe writes: “Christianity began in Palestine as a fellowship (a relationship), and then moved to Greece and became a philosophy (a way to think). Afterwards it moved to Rome and became an institution (a place to go), and then to Europe where it became a culture (a way of life). Finally, it settled in America, where it has become an enterprise (a business).”

Maybe it is time for generous givers and faithful tithers to rethink what they are really giving to in the name of God. What do you think?