HireVue recognizes the impact our software can have on individuals and the community, and we act upon this responsibility with a deep commitment to fairness and equality.
When HireVue creates an assessment model or algorithm, a primary focus of the development and testing process is finding and removing factors that may cause bias (or "adverse impact") against protected classes. The HireVue team carefully tests for bias related to age, gender, ethnicity, and any other demographic data we have, at every stage—before, during, and after development of the assessment model. Thorough testing is completed before candidates take interviews, and monitoring for bias continues for as long as the model is used to assess candidates.
At HireVue, our goal is not just to use our technology to eliminate bias in employment decisions, but to actively promote diversity and work toward the common goal of equal opportunity for everyone—regardless of gender, ethnicity, age, or disability status. To that end, HireVue will continue to evolve our practices as we work with our customers, job seekers, technology partners, ethicists, legal advisors, and the community at large to ensure we always hold ourselves to the highest possible standards.
Learn more about how HireVue uses artificial intelligence to prevent and reduce the introduction of both conscious and unconscious bias against any group or individual in this article.