Emerging Tech and Religious Freedom
Like their secular counterparts, religious organizations—churches and synagogues, ministries, and religious schools—have rapidly embraced digital and AI technologies. Some have placed them at the very core of their religious mission. Whether it’s sophisticated data analysis of donor behavior, faith-infused chatbots, or digital cloning of religious leaders, many organizations see technology as simply another tool: another way to spread their message, make disciples, minister to the faithful, educate the young, care for the poor, and build support for their missions.
Yet where technology goes, regulation soon follows. As lawmakers across the country increase their scrutiny of emerging technologies, tech-savvy religious organizations will have to navigate an increasingly contested boundary line between the requirements of law and the demands of faith.
At the moment, neither side seems aware of the problem. The tech regulations emerging out of state capitals, administrative agencies, and Congress are largely oblivious to the ways religious organizations use technology. For their part, religious organizations have yet to reckon with new legal restrictions on practices that are essential to their missions. Many are simply unaware that they’re subject to regulation at all. All this will change in the coming years.
Religion and Data Privacy
Legislators have lately devoted much of their attention to data privacy—how the personal information of individuals is gathered, stored, used, and shared. “[O]ur very selves have become reducible to a frightening degree to data,” psychologist Jordan Peterson recently told Congress. Data now makes up the very “image of our identity—an identity which can be and is increasingly bought and sold.”
Technology enables the inexpensive capture, manipulation, and sale of the personal data of millions of individuals, yielding new analytical insights that are at once exciting and frightening. For example, retailers are using video surveillance and AI to analyze customer behavior in their stores, enabling targeted promotions, better product placement, and inventory management. In healthcare, vast amounts of medical data allow for the creation of “digital humans”—complex mathematical models of the human body that can be used for medical experiments, such as predicting how different patients might react to a new drug. And charitable organizations are using data-driven AI to analyze and predict donor behavior, personalize communications, and optimize fundraising. These are just a few examples.
These technological advances come packaged with new risks because the data that powers them can also be misused. At some point, data becomes so personal, detailed, and intimate that it’s almost indistinguishable from the very human beings to whom it relates, opening new frontiers for manipulation and abuse.
As of this writing, nineteen states and the District of Columbia have enacted data privacy laws to limit the ways organizations gather, use, buy, and sell the personal data of “consumers.” Some laws expressly exempt nonprofit organizations. But many don’t, and applying these laws to religious groups will raise thorny constitutional questions.
Large religious denominations maintain data on tens or hundreds of thousands of members, congregants, visitors, donors, and other individuals. To take a simple example, a church will collect personal information from a Sunday morning congregant and then use that information in some way, perhaps emailing the congregant to invite her to a Wednesday night class or a weekend retreat. The technical label for this activity is “processing.” Processing refers to just about any act involving a person’s data, including gathering, storing, analyzing, using, sharing, and selling it.
Data privacy laws now require organizations to specify the purposes for the data they’re collecting and processing (usually through a privacy policy) and then stick to those purposes. If they want to use a person’s data for new or different purposes, they need consent.
Most data privacy laws also create a special category of “sensitive” data, which includes information that “reveals” a person’s religious beliefs. A lot of data maintained by religious groups may fit this category because it’s reasonable to infer that, if a person attends a church or donates to an organization, she shares its beliefs in some way. Because religious belief data is considered legally sensitive, it’s not enough to passively specify how the data will be used. Rather, an organization must obtain a person’s affirmative consent before collecting or processing it. This advance consent requirement subjects the data practices of religious organizations to extra regulatory burdens—burdens not imposed on non-religious data.
Take the Sunday morning congregant example above. The follow-up email the church wants to send may violate legal restrictions on “sensitive” data processing unless the church 1) specifically notifies the congregant, when it collects her data, that it will communicate with her about other ministry opportunities and 2) obtains her affirmative consent to do so. The law thus dictates both what the church can do with the congregant’s information and how the church can communicate with her. And the law bars the church from communicating with her unless she provides government-approved consent in advance.
These are obvious restraints on both the free exercise of religion and religious speech. Extend the scenario from a single congregant to thousands, and the restraints become severe.
The restraints are also selective. Organizations are free to collect and process personal data that reveals a host of non-religious information—political beliefs, social affiliations, buying decisions, and driving habits, for instance—without obtaining advance consent. Because the advance consent requirement applies only to religious belief data (and a few other narrow categories of “sensitive” information), data privacy laws effectively single out religious data for disfavored treatment, imposing unique burdens on organizations for whom religious data is critical. Legislators may not have intended these results, but they are the troubling effects of the data privacy laws they are enacting.
Similar issues arise for the disclosure of religious data. (Disclosure is another form of “processing.”) Religious organizations share data all the time, for reasons closely tied to religious exercise and speech. For example, two ministries may collaborate on a religious lecture series, merging their databases to promote the series to individuals they believe will be interested. A national college ministry may share student data with local churches so students can be “plugged in” to a local church body for community, mentoring, and discipleship. Individual churches may share their data “upward” with higher denominational bodies, and denominational bodies may share data “downward” with individual churches. Or a nationally known pastor may share his mailing list with a book publisher to promote a new book and sermon series.
All these real-world scenarios are potentially within the purview of data privacy laws. And religious entities now must think twice before they share and collaborate on information to further their ministry efforts. They must ask questions like:
What is the nature of the personal data we have?
What was our initial, specified purpose in collecting the data?
Can we now use the data for other purposes?
Can we share the data with our ministry partners so they can use it? And can they share their data with us?
Do we need to notify individuals and obtain their consent before we share their data or use the data in a new or different way?
Some of these questions can be addressed through well-drafted privacy policies. Others cannot, and the prospect of liability, official investigations, and lawsuits will inevitably chill some faith-driven data practices.
Regulatory burdens may be avoided in states like California and Virginia, whose data privacy laws don’t apply to nonprofits. But in states whose laws do apply, religious organizations must rely on other avoidance arguments. One argument is based on the size of the data sets. At present, data privacy laws regulate organizations only if they maintain relatively large amounts of data. (The numerical thresholds vary from state to state.) Thus, small religious organizations may not be covered under these laws. But laws have a tendency to expand, and lower numerical thresholds will bring even small religious organizations within the regulatory sweep.
Other avoidance arguments are based on statutory language. Religious organizations might assert that their constituents—congregants, donors, etc.—aren’t “consumers” as that term is defined in data privacy laws. (One does not typically call a parishioner a “consumer” of word and sacrament, for example.) Organizations also might argue that religious activities—worship, evangelism, etc.—don’t amount to “doing business” in a state so as to trigger legal requirements.
A final set of arguments will be based on the First Amendment. Religious organizations may argue that data privacy laws unconstitutionally restrict free speech and selectively burden data practices essential to the religious mission.
For now, though, all of this is writing on a blank slate. These arguments are untested. The scope and enforcement of data privacy laws, and whether and to what extent they apply to religious organizations and their data, remain uncertain.
Faith-Driven AI
Data privacy may be the biggest source of legal exposure for tech-savvy religious organizations, but it is hardly the only one. Regulation of other emerging technologies will present legal challenges.
In 2024, Colorado became the first state to enact a law that seeks to regulate harms from AI. Senate Bill 24-205, titled “Consumer Protections for Artificial Intelligence,” bans “algorithmic discrimination,” which occurs when an AI system makes important decisions—like educational enrollment, employment, healthcare, and housing—that result in differential treatment based on protected characteristics like race, sex, and religion.
The law does not take effect until 2026 (and may be amended before then). And most religious organizations aren’t using AI-automated screening tools anyway. But some large organizations could be affected. Christian universities, for example, might use AI for enrollment screening. Religious hospitals might use AI tools to screen patients or identify treatments. They’ll need to review their practices for legal compliance.
On the other hand, a law like Colorado’s could end up benefiting religious organizations. Some secular job posting sites ban organizations from expressing a religious preference when they advertise a job opening. Yet both federal and state laws protect this practice. A Jewish community center can limit its hiring to Jews; a Latter-day Saints organization can require employees to be in good standing with the LDS Church; and Christian ministries can enforce a Christians-only hiring policy. If sites are using AI to analyze and auto-reject job postings that express lawful religious preferences, they may run afoul of legal bans on algorithmic religious discrimination.
Forty-four other states are considering bills addressing various forms of AI regulation. Not all will survive judicial scrutiny: California’s law banning “deceptive” digitally modified content about political candidates was swiftly, and rightly, struck down late last year.
How old laws apply to new technologies presents another challenge. Take large language models (LLMs), the technology behind generative AI chatbots like ChatGPT. Many religious organizations are adopting and deploying this technology. It is at the forefront of AI-driven Bible translation, even though human input and oversight remain key to the work. Meanwhile, the use of religious chatbots has exploded: South Carolina megachurch pastor Ron Carpenter has created an AI version of himself, allowing congregants to engage in one-on-one interactions with a chatbot that will counsel and even pray with them. (Inquiring minds may wonder whether the prayers of a chatbot are religiously effectual.) As another example, apologetics ministry Catholic Answers created “Father Justin,” a chatbot that “provide[s] users with faithful and educational answers to questions about Catholicism.” And CV Global is exploring the use of chatbots “as a creative and conversational tool” to enhance evangelistic efforts.
These are a select few examples. Many other ministries are using or building chatbots for prayer, counseling, therapy, education, and other purposes. The legal risks of this technology are many, and they’re magnified when individuals interact with a ministry’s AI system in one-on-one contexts.
One set of risks involves user inputs, that is, what a user communicates to the AI system. If a user confesses his sins to a church’s chatbot, is the information protected by the clergy-penitent privilege? If a confession involves child or elder abuse, is there a mandatory duty to report? How should organizations handle protected health information or other legally sensitive materials?
Another set of risks involves AI system outputs. LLMs are known to hallucinate, generating inaccurate, misleading, or wholly fabricated information. The models can also be biased, depending on their training dataset. To what extent is a ministry responsible for chatbot falsehoods and prejudice? Is the chatbot an agent of the organization for legal purposes? Is speech by a chatbot the kind of speech the First Amendment protects?
Right now, these questions don’t have obvious answers. Legislators, regulators, and courts will be fleshing them out, in both religious and non-religious contexts, for years to come.
Legal risks are not the only risks that religious organizations take on when they embrace new technologies. Reputational, operational, and even theological risks abound. There are deeper issues, too—the displacement of humans by machines, the disembodiment of relationships, and whether artificial intelligence can effectively substitute for ancient wisdom and sacred tradition.
As these existential questions increasingly assume a legal dimension, resolving them will be a key area of uncertainty and risk for religious groups in the coming years.