MacDirectory Magazine

Régis Mathias

In the above example, where the frame of the photo stops around Purcell's hips, Photoshop simply extends the dress, much as you might expect. But if you use generative expand on a more tightly cropped or composed photo, Photoshop has to "imagine" more of what is going on in the image, with variable results.

Is it legal to alter someone's image like this?

It's ultimately up to the courts to decide. It depends on the jurisdiction and, among other aspects, the risk of reputational harm. If a party can argue that publication of an altered image has caused, or could cause, them "serious harm", they might have a defamation case.

How else is generative AI being used?

Generative fill is just one way news organisations are using AI. Some are also using it to make or publish images, including photorealistic ones, depicting current events. An example of this is the ongoing Israel-Hamas conflict. Others use it in place of stock photography or to create illustrations for hard-to-visualise topics, like AI itself.

Many adhere to institutional or industry-wide codes of conduct, such as the Journalist Code of Ethics from the Media, Entertainment & Arts Alliance of Australia. This states journalists should "present pictures and sound which are true and accurate" and disclose "any manipulation likely to mislead." Some outlets do not use AI-generated or augmented images at all, or do so only when reporting on such images after they go viral.

Newsrooms can also benefit from generative AI tools. For example, a journalist can upload a spreadsheet to a service like ChatGPT-4 and receive suggestions on how to visualise the data, or use it to help create a three-dimensional model that illustrates how a process works or how an event unfolded.

What safeguards should media have for responsible generative AI use?

I've spent the last year interviewing photo editors and people in related roles about how they use generative AI and what policies they have in place to use it safely. I've learned that some media outlets bar their staff from using AI to generate any content. Others allow it only for non-realistic illustrations, such as using AI to create a bitcoin symbol to illustrate a story about finance.

News outlets, according to the editors I spoke to, want to be transparent with their audiences about the content they create and how it is edited. In 2019, Adobe started the Content Authenticity Initiative, which now includes major media organisations, image libraries and multimedia companies. This has led to the rollout of content credentials: a digital history of what equipment was used to make an image and what edits have been made to it. This has been touted as a way to be more transparent
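For readers who want to see what that edit history looks like in practice, below is a minimal sketch of checking an image for content credentials. It assumes the Content Authenticity Initiative's open-source c2patool command-line utility is installed and on the PATH; the tool name, its default behaviour of printing any embedded manifest, and the helper function shown are assumptions made for illustration rather than anything prescribed in this article, and the exact output varies by version.

```python
# A minimal sketch of reading an image's content credentials, assuming the
# Content Authenticity Initiative's open-source `c2patool` CLI is installed
# and on the PATH. Its default behaviour (printing any embedded manifest as
# a report) may differ between versions, so treat this as illustrative only.
import subprocess
import sys


def read_content_credentials(image_path: str) -> str | None:
    """Return the raw manifest report for an image, or None if none is found."""
    result = subprocess.run(
        ["c2patool", image_path],  # default invocation reports the manifest
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        # Either no credentials are embedded or the file could not be parsed.
        return None
    return result.stdout


if __name__ == "__main__":
    report = read_content_credentials(sys.argv[1])
    if report is None:
        print("No content credentials found.")
    else:
        # The report records, among other things, which device or software
        # produced the image and what edits (actions) were applied to it.
        print(report)
```

A newsroom could run a check like this as part of its publishing workflow, flagging images that arrive with no provenance information at all.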
