Beware of small chatbot data leaks
May recommends that IT determine, sooner rather than later, who has permission to install bots. “It’s one thing to install a bot that does a lunch poll,” he says, but “a lot of these bots are going to use credentials to connect to systems. How do you monitor that?”
To further limit ad hoc implementation, remind employees that a series of small data leaks can cause just as much damage as a major breach. May shares the story of an attempted hack at Backupify, a cloud-to-cloud backup provider he owned before founding Talla: Someone used his CFO’s email address to unsuccessfully try to transfer funds from the company’s bank account. “If a bot knows things about your company,” he says, “when people ask a series of questions and retrieve pieces of information and you add it all together, they find out something they shouldn’t find out.”
Don’t scare your colleagues into submission
Of the many business operations that bots currently touch, HR is the most affected. Enterprise chatbots are used to make recruiting more efficient, to onboard new hires, and to predict how likely an employee is to leave. Talla itself sells to the HR space, answering common questions like “How much vacation do I have left?” so HR reps don’t have to. For any chatbot—and especially HR chatbots—to work, employees must be comfortable talking to them. O’Neill says, “PII, PHI, and all this information is going to be there. Don’t try to skirt it; embrace it. If you get comfortable that this data’s going to be there, and you trust the companies you’re doing your bot integrations and chat interactions with, then you can trust that you’re going to have better results. You’re going to get better data and make better-informed decisions.”