Victims of ‘doxing’ could have right to sue under proposed bills

By Nika Schoonover and Andrew Adams, Capitol News Illinois

Sen. Julie Morrison (left, D-Lake Forest) and Sen. Mary Edly-Allen (middle, D-Libertyville) converse on the Senate floor with Sen. Laura Fine, D-Glenview, earlier this year. Morrison and Edly-Allen are the sponsors of two digital privacy bills allowing civil action for victims of “deepfake porn” and “doxing.” (Capitol News Illinois photo by Jerry Nowicki)


SPRINGFIELD – Victims of “deepfake porn” and “doxing” would have a legal pathway to sue their perpetrators in Illinois under a pair of digital privacy measures that have so far received unanimous support in the General Assembly.

House Bill 2954, which would allow victims of “doxing” to pursue civil litigation, needs only a signature from the governor to become law after clearing both chambers of the General Assembly unanimously.

Doxing, as defined by HB 2954, occurs when an individual intentionally publishes another person’s personal information without their consent. For an offense to qualify as doxing, the person publishing the information must have acted with intent to “harm or harass” the victim with “knowledge or reckless disregard” that it could lead to “death, bodily injury, or stalking.”

Additionally, the published information must have caused the victim harm in some way, including economic injury or emotional distress. A person found by a court to have suffered from doxing would be eligible to recover damages and other relief such as attorney’s fees.

“It is absolutely critical for our laws to evolve with the changing nature of the cyber world,” bill sponsor Sen. Julie Morrison, D-Lake Forest, said in a news release. “This legislation provides a necessary solution to the dangerous practice of doxing, by both helping victims and deterring future bad actors.”

The bill creates some exemptions for those publishing others’ private information, including if the information was shared to report criminal activity or if it was shared to the news media and was constitutionally protected.

The measure would also allow a court to grant temporary restraining orders and injunctions ordering the defendant to cease publication of the identifying information while the case is pending.

The ACLU of Illinois opposed the bill, saying it is concerned about what it considers overbroad definitions of “publish” and “personally identifiable information.”

The measure defines “publish” as making information available to another person.

“Personally identifiable information” is defined as any identifying information combined with other information such as the person’s Social Security number, social media accounts, education and employment information, information about gender identity or sexual orientation, or information about how to enter a person’s teleconference meeting.

Angela Inzano, policy and advocacy strategist for the ACLU of Illinois, said the organization is concerned the definitions would allow individuals to file suit even if the information is shared privately or is already publicly available.

“They did narrow the definition of ‘publish’ to make it clear that it did not apply to two people texting back and forth,” Inzano said in an interview. “That was useful, but we feel like it’s still not far enough.”

‘Deepfake porn’

House Bill 2123, a measure allowing victims of so-called “deepfake porn” – or digitally altered, nonconsensual sexual images – to sue the creator of those images, passed the Senate unanimously last week and cleared the House unanimously on Wednesday, May 17.


The bill would give victims of deepfake porn the same legal footing as victims whose actual sexual images have been shared without their consent. It does so by amending a 2019 law that created a pathway to civil action for people harmed by the unwanted distribution of their sexual images. Such images are sometimes called “revenge porn,” a term describing situations in which someone who obtained sexual images consensually later shares them to harm the subject.

HB 2123 would add “intentionally digitally altered” sexual images to the existing law. It also expands the remedies in the existing law to allow a court to grant the victim temporary restraining orders and injunctions ordering the defendant to cease publication of the images in question.

A previous version of the bill passed the House unanimously in March. Because the Senate amended the measure, it returned to the House for a final concurrence vote, which it received Wednesday.

Sen. Mary Edly-Allen, D-Libertyville, sponsored the bill and said deepfake porn is part of a long history of gender-based harassment.

“This is a way to silence women,” Edly-Allen said of deepfakes.

Republicans voiced support for the bill and its previous versions in both chambers of the Statehouse.

“We have to ensure that people’s privacy rights are protected and this bill helps to do that,” Sen. Steve McClure, R-Springfield, said during debate.

Edly-Allen said in an interview that the bill is a first step in a broader conversation about the impact of evolving technologies like artificial intelligence. She said she hopes to work with others in the legislature to hold a hearing on the subject later this year.

“It can’t just be legislation,” Edly-Allen said in an interview. “We also need literacy in schools. How do you tell what’s true and not true?”

The ACLU of Illinois also opposed the deepfake porn bill. As introduced, the measure would have allowed the subjects of any digital forgery to sue the forgeries’ creators if the creators intended to cause harm or incite violence, or acted with reckless disregard toward the subject. Amendments to the bill have since narrowed its scope to cover only sexually explicit images and videos.

“The bill has been markedly improved since it was introduced, but we still have some concerns,” ACLU spokesman Ed Yohnka told Capitol News Illinois. Yohnka said one potential free speech concern stems from the proposal’s provision allowing temporary restraining orders prior to adjudication.

Generative AI

The Senate last week also passed House Bill 3563, which would create a task force to investigate natural language processing and generative artificial intelligence. These technologies have drawn significant attention in recent months thanks to services like OpenAI’s ChatGPT and Google’s Bard.

The task force would be managed by the state’s Department of Innovation and Technology and would include representatives from the agency, the legislative caucuses, the state board of education, business associations, labor associations and the attorney general’s office, among others.

“It is a broad, broad task force because it is a broad, broad issue,” bill sponsor Sen. Robert Peters, D-Chicago, said.

The measure passed both chambers unanimously and will head to the governor.

Editor’s note: This story has been updated to reflect the most recent action on all of the bills. They have all cleared both chambers of the General Assembly.

nschoonover@capitolnewsillinois.com
aadams@capitolnewsillinois.com

Capitol News Illinois is a nonprofit, nonpartisan news service covering state government. It is distributed to hundreds of print and broadcast outlets statewide. It is funded primarily by the Illinois Press Foundation and the Robert R. McCormick Foundation, along with major contributions from the Illinois Broadcasters Foundation and Southern Illinois Editorial Association.