
PayrollOrg Encourages United Nations to Develop AI Guidelines in Payroll to Identify Potential Biases

Checkpoint Payroll Update Staff  

· 5 minute read

In a letter to the United Nations (UN) Office of the Secretary-General’s Envoy on Technology, PayrollOrg (PAYO) encouraged the global organization to develop guidelines on artificial intelligence (AI) to help payroll management software developers and users identify potential biases, especially hidden disparities.

The December 18, 2023, letter was in response to the UN convening a multi-stakeholder High-Level Advisory Body on AI to analyze and advance recommendations for the international governance of AI. The letter noted three AI concerns that the UN Secretary-General raised at the United Kingdom’s AI Safety Summit in November 2023: (1) the safety and security of users, (2) long-term negative consequences, and (3) potential inequalities.

AI disparity and discrimination concerns.

PayrollOrg’s letter described several AI applications that may be used in payroll processes, such as updating tax withholding formulas. It explains that AI systems depend on input data to identify patterns, and that if historical data is inaccurate or includes discriminatory information, the system could learn from and perpetuate that bad data through machine learning.

Also, if biases are built into an AI system, the system could miscalculate the pay of certain workers or fail to correctly apply new worker situations and law changes to all covered workers. For example, if managers use an algorithm to determine which workers should receive bonuses or pay increases, and the AI system is biased, workers will not be treated fairly and equally.
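To make the bonus example concrete, here is an illustrative sketch (not drawn from PayrollOrg’s letter) of how a payroll team might check an algorithm’s bonus recommendations for group-level gaps before they are applied. The data fields, the 10% threshold, and the flag_bonus_disparities helper are hypothetical.

from statistics import mean

def flag_bonus_disparities(recommendations, group_key="department", threshold=0.10):
    """Group algorithmic bonus recommendations and flag groups whose average
    differs from the overall average by more than the threshold.

    recommendations: list of dicts such as
        {"worker_id": 1, "department": "A", "bonus": 1200.0}
    """
    overall = mean(r["bonus"] for r in recommendations)
    groups = {}
    for r in recommendations:
        groups.setdefault(r[group_key], []).append(r["bonus"])

    flagged = {}
    for group, bonuses in groups.items():
        avg = mean(bonuses)
        if overall and abs(avg - overall) / overall > threshold:
            flagged[group] = avg  # surface the gap for human review
    return flagged

# Example: both groups diverge sharply from the overall average, so the gap
# is flagged for review rather than silently applied to payroll.
recs = [
    {"worker_id": 1, "department": "A", "bonus": 1500.0},
    {"worker_id": 2, "department": "A", "bonus": 1450.0},
    {"worker_id": 3, "department": "B", "bonus": 900.0},
    {"worker_id": 4, "department": "B", "bonus": 950.0},
]
print(flag_bonus_disparities(recs))  # -> {'A': 1475.0, 'B': 925.0}

A check like this does not explain why a gap exists, but it keeps a biased recommendation from flowing straight into workers’ pay without review, which is the kind of safeguard the letter asks developers and users to build in.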

Disparities also can arise if payroll data is not carefully managed. PayrollOrg explains that although countries have different laws and regulations on data privacy, personal information about workers should not be made publicly available or shared with any entity not partnered with or authorized by the employer.

Potential to eliminate AI discrimination.

Despite these concerns, PayrollOrg believes that if an AI system is trained to recognize disparities, there is an opportunity to eliminate discrimination going forward. Machine learning could be used to help eradicate discriminatory practices. The letter says that while some discriminatory practices are easily recognized, other biases are inadvertent and harder to detect.

PayrollOrg urges that AI system development include diverse teams and consultants who are experts in identifying discrimination in coding processes. Before an AI system is put to use, testing should cover not only system functionality for customer and business needs but also the processes and procedures that prevent disparities from being built into the system.

In addition, PayrollOrg thinks that developers, users, payroll professionals, and employers should build data-use and data-sharing safeguards into payroll management systems to prevent workers’ personal information from being leaked or used by unauthorized individuals.

 
