As countries around the world gear up for elections, OpenAI is sharing how it plans to fight election misinformation. In a recent blog post, the company said it is working to prevent misleading content and to make the source of information clear.
Preventing Abuse and Ensuring Transparency
According to the blog post, OpenAI is actively working to prevent misuse, particularly by countering misleading content such as "deepfakes," large-scale influence operations, and chatbots impersonating candidates. The company brings together expertise from its safety systems, threat intelligence, legal, engineering, and policy teams to swiftly investigate and address potential abuses.
Before introducing new systems, OpenAI rigorously assesses and tests them. This involves red teaming, seeking feedback from users and external partners, and implementing safety measures to minimise the risk of harm.
Transparency in AI-Generated Content
OpenAI said it is committed to providing transparency around AI-generated content. The company is actively working on provenance efforts: starting early this year, it plans to implement digital credentials from the Coalition for Content Provenance and Authenticity (C2PA) for images created by DALL·E 3. This approach uses cryptography to encode details about a content item's origin. Additionally, OpenAI is experimenting with a provenance classifier, a tool designed to detect images generated by DALL·E.
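The general mechanism behind such credentials is that a claim about a content item's origin is cryptographically signed and bound to the content itself, so any later alteration can be detected. The Python sketch below illustrates that idea in outline only; the manifest fields and signing flow here are simplified assumptions, not the actual C2PA manifest format, which chains signatures to certificates from trusted issuers.

```python
# Minimal sketch of cryptographically signed provenance metadata.
# The manifest fields and flow are hypothetical; the real C2PA spec
# defines its own manifest and signature formats (JUMBF/COSE).
import hashlib
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey


def sign_provenance(image_bytes: bytes, key: Ed25519PrivateKey) -> dict:
    """Bind an origin claim to an image by signing the claim plus the image hash."""
    manifest = {
        "generator": "DALL-E 3",  # hypothetical claim field
        "sha256": hashlib.sha256(image_bytes).hexdigest(),
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    return {"manifest": manifest, "signature": key.sign(payload).hex()}


def verify_provenance(image_bytes: bytes, credential: dict, public_key) -> bool:
    """Check that the image matches the claimed hash and the signature is valid."""
    manifest = credential["manifest"]
    if hashlib.sha256(image_bytes).hexdigest() != manifest["sha256"]:
        return False  # image was altered after signing
    payload = json.dumps(manifest, sort_keys=True).encode()
    try:
        public_key.verify(bytes.fromhex(credential["signature"]), payload)
        return True
    except InvalidSignature:
        return False


key = Ed25519PrivateKey.generate()
image = b"raw image bytes"
cred = sign_provenance(image, key)
print(verify_provenance(image, cred, key.public_key()))        # True
print(verify_provenance(image + b"x", cred, key.public_key())) # False
```

A verifier holding the matching public key can confirm both that the claim was signed by the stated party and that the image has not been modified since it was signed.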
Collaboration with the National Association of Secretaries of State (NASS)
In preparation for the upcoming US presidential election, OpenAI, the creator of ChatGPT, is collaborating with the National Association of Secretaries of State (NASS). When users ask certain procedural election-related questions, such as where to vote, ChatGPT will direct them to CanIVote.org, a trusted source for US voting information.
Inputs from IANS