Twitter’s taking another step in its bid to weed out bad behavior on its online messaging service.
The company said Monday it’s enlisting help from several academic researchers to study how users behave when sending each other tweets or commenting on political topics.
For years, Twitter has been dealing with people posting offensive messages on its service, which has led to criticism that the company fails to take the issue of online harassment and bullying seriously. The study is part of Twitter CEO Jack Dorsey’s grand plan to create a set of metrics that the company can use to determine the “health” of conversations on the Twitter platform.
Dorsey has made several public comments regarding Twitter’s plans to stop bad behavior over the past year, coinciding with rising criticism toward some of the tech industry’s biggest companies that some refer to as the “techlash.” The failure of companies like Twitter, Facebook (FB), and Google’s (GOOG) YouTube to prevent Russian entities from spreading misleading information on their respective platforms prior to the 2016 U.S. Presidential election, as well as Facebook’s Cambridge Analytica scandal, are just some of the latest developments that have sowed public distrust in technology companies.
A group of researchers led by Dr. Rebekah Tromble, an assistant professor of political science at Leiden University, will help Twitter study how people using the service form “communities” geared around political topics. The point of the study is to determine how “echo chambers” are formed, in which people only seek and rely on information that matches their beliefs.
Twitter said that the researchers’ previous work on echo chambers discovered that they “form when discussions involve only like-minded people and perspectives,” which “can increase hostility and promote resentment towards those not having the same conversation.”
“In the context of growing political polarization, the spread of misinformation, and increases in incivility and intolerance, it is clear that if we are going to effectively evaluate and address some of the most difficult challenges arising on social media, academic researchers and tech companies will need to work together much more closely,” Tromble said in a statement.
On the flip side, another research team from the University of Oxford and the University of Amsterdam will work with Twitter to learn “how exposure to a variety of perspectives and backgrounds can decrease prejudice and discrimination.” The hope is that if enough people are exposed to a wider variety of voices from people who don’t share their beliefs, they will be less inclined to hurl offensive insults or make hurtful and threatening comments to others.
Twitter executives acknowledged in a blog post that enlisting academic research groups to study user behavior “is a very ambitious task.” The company did not say how long the studies will take or if Twitter is planning to pay the researchers for their work.
This is not the first effort by Twitter to curb discord on its platform. In March it partnered with the non-profit Cortico to help Twitter (TWTR) brainstorm a possible framework that could “help encourage more healthy debate, conversations, and critical thinking.”