Where Memory Ends and Generative AI Begins

In theory, these cryptographic standards ensure that if a professional photographer snaps a photo for, say, Reuters and that photo is distributed across Reuters' international news channels, both the editors commissioning the photo and the consumers viewing it would have access to a full history of provenance data. They'll know if the cows were punched up, if police cars were removed, if someone was cropped out of the frame. These are the elements of photos that, according to Parsons, you'd want to be cryptographically provable and verifiable.
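The idea behind such provenance standards can be illustrated with a toy hash chain: each edit record commits to the hash of the record before it, so altering or deleting any step in the history breaks verification. This is a minimal sketch of the general technique, not the actual C2PA/Content Credentials implementation, which uses signed manifests embedded in the image file.

```python
import hashlib
import json

def record_edit(history, description):
    """Append an edit record whose hash covers the previous record,
    forming a tamper-evident chain (a toy stand-in for a signed manifest)."""
    prev_hash = history[-1]["hash"] if history else "genesis"
    payload = {"edit": description, "prev": prev_hash}
    digest = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()
    ).hexdigest()
    history.append({**payload, "hash": digest})
    return history

def verify(history):
    """Recompute every hash in order; any altered, reordered,
    or removed edit record makes verification fail."""
    prev_hash = "genesis"
    for record in history:
        expected = hashlib.sha256(
            json.dumps({"edit": record["edit"], "prev": prev_hash},
                       sort_keys=True).encode()
        ).hexdigest()
        if record["hash"] != expected or record["prev"] != prev_hash:
            return False
        prev_hash = record["hash"]
    return True

history = []
record_edit(history, "original capture")
record_edit(history, "cropped person out of frame")
print(verify(history))           # True: chain intact
history[1]["edit"] = "no edits"  # quietly rewrite the history
print(verify(history))           # False: tampering detected
```

A real standard additionally signs these records with the camera's or editor's private key, so a verifier can check not just that the history is internally consistent but who attested to each step.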

Of course, all of this is predicated on the notion that we—the people who look at photos—will want to, or care to, or know how to, verify the authenticity of a photo. It assumes that we are able to distinguish between social media, culture, and news, and that those categories are clearly defined. Transparency is great, sure; I still fell for Balenciaga Pope. The image of Pope Francis wearing a stylish jacket was first posted in the subreddit r/Midjourney as a kind of meme, spread among Twitter users, and was then picked up by news outlets reporting on the virality and implications of the AI-generated image. Art, social, news—all were equally blessed by the Pope. We now know it's fake, but Balenciaga Pope will live forever in our brains.

After seeing Magic Editor, I tried to articulate something to Shimrit Ben-Yair without assigning a moral value to it, which is to say I prefaced my statement with, “I’m trying to not assign a moral value to this.” It is remarkable, I said, how much control of our future memories is in the hands of giant tech companies right now simply because of the tools and infrastructure that exist to record so much of our lives.

Ben-Yair paused a full five seconds before responding. “Yeah, I mean … I think people trust Google with their data to safeguard. And I see that as a very, very big responsibility for us to carry.” It was a forgettable response, but thankfully, I was recording. On a Google app. 

After Adobe unveiled Generative Fill this week, I wrote to Sam Lawton, the student filmmaker behind Expanded Childhood, to ask if he planned to use it. He’s still partial to AI image generators like Midjourney and DALL-E 2, he wrote, but sees the usefulness of Adobe integrating generative AI directly into its most popular editing software. 

“There’s been discourse on Twitter for a while now about how AI is going to take all graphic designer jobs, usually referencing smaller gen AI companies that can generate logos and whatnot,” Lawton says. “In reality, it should be pretty obvious that a big player like Adobe would come in and give these tools straight to the designers to keep them within their ecosystem.”

As for his short film, he says the reception to it has been “interesting,” in that it has resonated with people much more than he thought it would. He’d thought the AI-distorted faces, the obvious fakeness of a few of the stills, compounded with the fact that it was rooted in his own childhood, would create a barrier to people connecting with the film. “From what I’ve been told repeatedly, though, the feeling of nostalgia, combined with the uncanny valley, has leaked through into the viewer’s own experience,” he says. 

Lawton tells me he has found the process of being able to see more context around his foundational memories to be therapeutic, even when the AI-generated memory wasn’t entirely true.
