When people use AI for writing assistance, it can shift their political attitudes by autocompleting sentences in biased ways.
Yet people are often unaware of the AI's bias and its influence on them.
And this is not merely about the facts presented: attitudes shifted less when the same information was shown as static text.
This could pose a real problem if AI chatbots are socially and politically biased: science.org/doi/10.1126…