You are developing HR software with GitHub Copilot. While reviewing code suggestions for a candidate filtering feature, you notice that the suggestions may be biased, favoring candidates based on demographic attributes such as gender or ethnicity. You want to ensure the AI-generated code promotes fairness. What is the most responsible action to address this bias?
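For context, here is a minimal Python sketch of what such a biased suggestion might look like, alongside an attribute-agnostic rewrite. All function names and candidate fields are hypothetical illustrations, not part of any real Copilot output:

```python
# Hypothetical biased suggestion: the filter decides on a protected
# attribute (gender), which is discriminatory and should be rejected.
def filter_candidates_biased(candidates):
    return [c for c in candidates if c["gender"] == "male"]  # biased: do not ship

# Fairness-oriented rewrite: score only on job-relevant criteria and
# keep protected attributes out of the decision entirely.
def filter_candidates_fair(candidates, min_years=3):
    return [c for c in candidates if c["years_experience"] >= min_years]

candidates = [
    {"name": "A", "gender": "female", "years_experience": 5},
    {"name": "B", "gender": "male", "years_experience": 2},
]
print([c["name"] for c in filter_candidates_fair(candidates)])  # → ['A']
```

The key design choice is that protected attributes never enter the filtering logic, so the output cannot vary with them.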