August 1, 2018 at 11:45PM
Automating ethics? Why tech supports bias training, monitoring — to a point
As HR systems become increasingly sophisticated, you might be tempted to delegate training around harassment, bias and other sensitive issues to technology. Don’t do it, warn PwC’s Jennifer Allyn and Bhushan Sethi. Allyn, the firm’s diversity strategy leader, and Sethi, who leads PwC’s people and organizational consulting practice for the financial services sector, recently discussed the value and limits of tech in this realm with Employee Benefit News. Edited highlights follow.
Employee Benefit News: How has cultural sensitivity and anti-harassment training evolved with the advent of advanced information technology?
Jennifer Allyn: As a foundation, without using technology, we want our leaders to make it clear that sexual harassment is never acceptable. We also want people who may have been on the receiving end of it to know how to escalate their concerns. And we have rolled out mandatory online training for supervisors. With that approach, we will know when people have received the information that they need.
EBN: What about actual in-person discussions?
Allyn: Absolutely. I have been leading sessions around the country that we call a “candid conversation” about respect in the workplace. Some of this is in light of the #MeToo movement, making us think about our diversity strategy. We also want to demystify our investigations process. The effort is a collaboration between diversity and our ethics team. But these are supplemented by online training.
EBN: Is there a way for you and employees to assess the degree of learning that has actually occurred with the online training system?
Allyn: Yes, there are questions at the end, and if you don’t get enough right, you’re going to have to take it again.
Bhushan Sethi: We also sometimes model our programs after those of our clients, particularly in highly regulated industries like financial services. For example, after the financial crisis 10 years ago, banks implemented new culture change programs, and we have learned from their experience. It’s broad in scope, and includes how you can use social media, what you can tweet and re-tweet, for example.
EBN: Aren’t a lot of the problem behaviors unconscious, and therefore hard to change?
Allyn: PwC has invested in unconscious bias training for many years to try to identify the blind spots people have, the unconscious assumptions we might make, and how we can evaluate all of our talent management systems with an eye toward reducing that subjectivity. We've done quite a bit to objectify our criteria for leadership competencies, so that everyone knows what we're talking about. When you get rated on a scale of one to five for leadership performance, you can be confident the criteria are being applied objectively and consistently.
EBN: But how is technology used to identify decisions based on biased assumptions?
Allyn: We track all of our hiring, retention, and promotion data by race and gender because we are looking for any patterns that look irregular or problematic. We advise our clients on those kinds of systems as well.
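For illustration, one common way to screen that kind of data for irregular patterns is the "four-fifths" adverse-impact rule of thumb. This is a hedged sketch, not PwC's actual method; the group names, numbers, and threshold default are invented.

```python
# Hypothetical sketch of a pattern check on selection data (e.g., promotions):
# compare each group's selection rate to the highest group's rate and flag
# groups falling below 80% of it (the four-fifths rule). Data is invented.

def selection_rates(outcomes):
    """outcomes: {group: (selected, total)} -> {group: selection rate}"""
    return {g: sel / tot for g, (sel, tot) in outcomes.items()}

def flag_adverse_impact(outcomes, threshold=0.8):
    """Return groups whose selection rate falls below `threshold` times
    the highest group's rate."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return sorted(g for g, r in rates.items() if r < threshold * top)

promotions = {"group_a": (30, 100), "group_b": (18, 90)}  # invented numbers
print(flag_adverse_impact(promotions))  # group_b: 0.20 < 0.8 * 0.30 -> flagged
```

A real HR analytics system would add statistical significance testing; a raw ratio on small samples flags noise as easily as bias.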
EBN: There are plenty of systems you can buy or that are built into HCM platforms that analyze those patterns and flag the aberrations. Do you believe they’re effective?
Allyn: The hard thing about algorithms is that they are made by people and can carry bias, too, right? For example, I heard about facial recognition software that recognizes white faces more accurately than the faces of people of color. So you need a balance between human oversight and the algorithm that helps you analyze the big data.
Sethi: I would add that when monitoring employees for compliance purposes, you can use teams of people and natural language processing techniques to understand some of the key words, to evaluate the level of exposure to the risk or the issue. There are other patterns of behavior to look at, like when people have been in the office, how much vacation they’ve taken, who they’ve spoken to, who they haven’t spoken to.
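The keyword screen Sethi mentions can be sketched minimally as below. The watch terms and messages are invented; a production system would use real natural language processing (tokenization, context, ML scoring) rather than plain substring matching.

```python
# Minimal, hypothetical sketch of keyword-based compliance monitoring:
# scan messages for watch-list terms and return which messages matched.

WATCH_TERMS = {"guarantee", "off the record", "delete this"}  # invented list

def flag_messages(messages, terms=WATCH_TERMS):
    """Return (message index, matched terms) for messages containing a term."""
    hits = []
    for i, msg in enumerate(messages):
        text = msg.lower()
        matched = sorted(t for t in terms if t in text)
        if matched:
            hits.append((i, matched))
    return hits

msgs = ["Please delete this thread", "Quarterly numbers attached"]
print(flag_messages(msgs))  # [(0, ['delete this'])]
```

As the interview notes, output like this is only a starting point: the flags feed a human review team, which evaluates actual risk exposure.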
EBN: How do you analyze patterns of employee complaints about what they perceive as harassment or bias?
Allyn: Our ethics team is always evaluating the volume of complaints we receive against an industry benchmark. We try to learn what the norm is for an organization of our size. If it seems very low, that could be good news, but it could also mean you have an intimidating culture where people feel silenced, and they're not speaking up. And obviously if it's well above that benchmark, you know you have an ethics issue to look at. I know that in real time they are monitoring where complaints are coming from to see if they can isolate it: Is it a region? A practice group? Are there any themes or patterns?
Sethi: The monitoring and categorization of incidents and complaints is important as a starting point, but getting to the root cause is key. You can categorize based upon what you think the symptoms are, but when you get into it, it could be something else.
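The benchmark comparison described above, where both unusually low and unusually high complaint volumes are treated as signals, can be sketched as follows. The tolerance band and all numbers are assumptions for illustration, not figures from the interview.

```python
# Hedged sketch: compare complaints-per-employee against an industry
# benchmark rate, flagging both unusually low volume (possible silence)
# and unusually high volume (possible ethics issue). Band is invented.

def assess_complaint_rate(complaints, headcount, benchmark_rate, band=0.5):
    """Classify a complaint rate relative to a benchmark.
    band: fractional tolerance around the benchmark (assumed value)."""
    rate = complaints / headcount
    if rate < benchmark_rate * (1 - band):
        return "unusually low: possible underreporting"
    if rate > benchmark_rate * (1 + band):
        return "unusually high: possible ethics issue"
    return "within expected range"

# 4 complaints across 10,000 employees vs. a 0.1% benchmark rate
print(assess_complaint_rate(4, 10_000, 0.001))
# -> unusually low: possible underreporting
```

The real-time slicing Allyn mentions (by region, practice group, theme) would group this same calculation per segment rather than firm-wide.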
EBN: Talking about root causes brings us back to the front end training part. What have you learned about what’s most effective, whether it’s online or face-to-face?
Allyn: From my perspective, online training is really effective for creating a shared vocabulary to define the problem, identify the boundaries, articulate the policy. It's helpful to use vignettes to talk about how you would resolve an issue like this, how you would react when you experience or witness the behavior. But if you really want to get into the nuances of these issues, and to put the topic into the context of the organization's bigger purpose and mission, you also need in-person discussion. You can't do it just with someone sitting at a laptop.
Sethi: I agree. Technology is great for analyzing, for communicating your expectations, for tracking case management and making sure that everything is getting responded to and categorized. But organizational culture is a critical piece. If we want to have meaningful conversations about race or other sensitive topics that people are usually uncomfortable discussing, you need to get the leaders to talk about them, focusing on what the organization stands for. Then you can use technology to reinforce the message and get people to take some mandatory training.
EBN: Technology also isn’t capable of disciplining offenders. At least I don’t think it is — yet. Can it give employers a false sense of confidence that harassment and cultural insensitivity can be addressed without serious supervisory intervention?
Sethi: Technology can help you analyze which kinds of remediation and consequences actually have an impact on behavior, whether they are financial penalties or, ultimately, termination and legal consequences. It's really important to track how cases get escalated and resolved.
EBN: To give credence to the policy, don’t employees need to know that people — without the specifics — actually do suffer consequences when the anti-harassment policy is violated?
Allyn: I think what we’re trying to do in these dialogues is to say there’s a range of penalties and that there’s a range of people who make decisions on imposing these penalties. A lot of people are also worried about due process — how investigations are conducted, what kind of evidence we look at, who is interviewed, how we think about confidentiality. In our training sessions we do go through case studies, like one involving an incident at a client’s location. In that case we collaborated with the client’s ethics team, and the result was a termination. We need to get the point across, but keep people feeling confident in the process. That takes leadership, not technology.