The Associated Press has issued guidelines on AI, setting rules for how its journalists may integrate tools such as ChatGPT into their work.
The news organisation's aim is to tell a good story. Generative AI can create text, images, audio and video on command, but it is not yet fully capable of distinguishing fact from fiction. AP therefore advises that AI-generated material be vetted as carefully as material from any other source, and that an AI-generated photo, video or audio segment not be used unless the altered material is itself the subject of the story. This aligns with the policy of the tech magazine Wired.
AI-generated content is prone to 'hallucinations', or made-up facts. Consumers must know that standards are in place to ensure that the content they read is verified, credible and fair.
Generative AI can assist publishing in several ways: summarising stories or compiling them into digests, creating headlines, generating story ideas, making a story more concise and readable, and suggesting possible questions for an interview.
AP has in fact used AI for a decade, creating short news stories from sports box scores and financial data. It wants to enter the new phase of generative AI with caution, mindful of its credibility. OpenAI, the maker of ChatGPT, has struck a deal with AP to license its news stories for training purposes.
News organisations are also concerned about AI companies using their content without permission or payment. Their industry associations are negotiating deals with AI and Big Tech companies to protect the IP rights of their members.