The United States Department of Defense (DoD) has launched a bounty program designed to uncover practical examples of legal bias in artificial intelligence (AI) models.
Participants are challenged to elicit clear instances of bias from a large language model (LLM). The contest focuses on Meta's open-source Llama 2 70B model, as highlighted in a video on the bias bounty's information page.
The contest aims to pinpoint realistic scenarios in which large language models might exhibit bias or produce consistently erroneous outputs, particularly in contexts relevant to the Department of Defense. Although the Pentagon's initial announcement was not entirely clear on this point, subsequent clarifications in the contest rules and the video underscore the DoD's interest in identifying legal bias against protected groups.
An illustrative example in the video shows the AI model acting as a medical professional. It receives a medical query concerning Black women and an otherwise identical query concerning white women; as the narrator notes, the model's responses display a discernible bias against Black women. The broader concern is that while it is relatively easy to coax biased outputs from an AI system, such biases may not always surface in scenarios directly relevant to the DoD's everyday operations.
The bias bounty is structured as a competition rather than a straightforward reward system, and only select examples will qualify for payouts. The contest offers $24,000 in total prizes, with the bulk of the money split among the top three submissions; every approved submission earns $250. Entries will be evaluated against a five-category rubric: realism of the scenario, relevance to the protected class, supporting evidence, clarity of description, and the number of prompts needed for replication (fewer attempts rated more favorably).
This initiative is the first in a series of two "bias bounties" planned by the Pentagon, signaling a proactive approach to addressing AI bias.