Updated: Chatbots — propaganda is being taught how to speak

Bots, “simple computer scripts,” were “originally designed to automate repetitive tasks … sparing humans hours of tedium.”  But bots “can also be used to operate large numbers of fake [social media] accounts, which makes them ideal for manipulating people.”

As became evident after the 2016 election in the United States, among many other examples, bot-operated fake social media accounts not only “broadcast extremist viewpoints” but also amplify similar views from authentic human accounts by “liking, sharing, retweeting,” and so on.  They game the ranking algorithms, giving those posts and tweets more visibility.
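The amplification loop described above can be sketched in a few lines.  This is a hypothetical illustration, not a real social-media API: the `FakeAccount` class and its `like`/`retweet` methods stand in for whatever client library a bot operator would actually use.

```python
# Hypothetical sketch of brute-force amplification: many fake accounts
# all engage with the same post, inflating the signals that ranking
# algorithms read as popularity.  Names and methods are illustrative only.

class FakeAccount:
    """Stand-in for one bot-controlled social media account."""
    def __init__(self, handle):
        self.handle = handle

    def like(self, post):
        post["likes"] += 1

    def retweet(self, post):
        post["shares"] += 1

def amplify(post, botnet):
    """Every bot in the network likes and shares the same post."""
    for bot in botnet:
        bot.like(post)
        bot.retweet(post)

post = {"text": "an extremist viewpoint", "likes": 0, "shares": 0}
botnet = [FakeAccount(f"user{i}") for i in range(1000)]
amplify(post, botnet)
# The post now carries 1,000 likes and 1,000 shares from "accounts"
# that a ranking algorithm cannot easily tell apart from people.
```

The point of the sketch is that no sophistication is required: influence comes purely from volume, which is why detection efforts focus on coordinated behavior rather than on any single account.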

Current bot technology relies on brute force (large numbers of bots) to exert influence, and some progress has been made in limiting it, but the next generation of bots will be harder to recognize and control.  Like Alexa, Cortana, and Google Assistant, the new bots (chatbots) will behave and talk much more like real people.

While Alexa and other current chatbots (also used by various companies for customer service purposes) declare themselves to be automated, chatbots used for propaganda will not do that.  They will present themselves as humans participating in online comment sections, group chats, message boards, and private chat channels.  Will you be able to tell that you are talking to a machine?

The technology to make chatbots indistinguishable from humans in speech is not there yet, but it is getting close.  “Some simple preprogrammed bot scripts have been successful at misleading users.”  The open development strategies used by companies like Google and Amazon to improve natural-language processing by machines, “opening their language-processing algorithms to the public” via APIs, also help the developers teaching chatbots to spread propaganda.
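The “simple preprogrammed bot scripts” the quote refers to are typically rule-based, in the tradition of ELIZA: the bot matches a message against canned patterns and replies from a script, with no understanding involved.  A minimal sketch, with patterns and replies invented purely for illustration:

```python
import random
import re

# A tiny rule-based chatbot in the ELIZA tradition: regex patterns map
# to canned replies.  No language understanding is involved, yet
# scripts like this have been enough to mislead some users into
# thinking they were talking to a person.
RULES = [
    (re.compile(r"\b(hello|hi|hey)\b", re.I),
     ["Hey! What's on your mind?",
      "Hi there, what do you think about the news?"]),
    (re.compile(r"\bI think (.+)", re.I),
     ["Why do you think {0}?",
      "A lot of people are saying {0} too."]),
    (re.compile(r"\?$"),
     ["Good question. What do *you* believe?"]),
]
DEFAULT = ["Interesting. Tell me more.", "I hear that a lot lately."]

def reply(message):
    """Return a scripted reply: first matching rule wins."""
    for pattern, responses in RULES:
        match = pattern.search(message)
        if match:
            # Echo captured text back, a classic ELIZA trick that
            # makes the bot seem attentive.
            return random.choice(responses).format(*match.groups())
    return random.choice(DEFAULT)
```

For example, `reply("I think the vote was rigged")` echoes the claim back (“Why do you think the vote was rigged?”), which is exactly the kind of cheap engagement that keeps a human talking.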

“There’s still a long way to go before a bot will be able to spoof a human in one-on-one conversation.  Yet as the algorithms evolve, those capabilities will emerge” and probably sooner than we think.
