One of the first photos Mohandas tried uploading was a selfie with his wife and daughter in front of a temple in Indonesia. Google’s analysis was exhaustive, even documenting the specific watch model his wife was wearing, a Casio F-91W. But then, Mohandas says, the AI did something strange: It noted that Casio F-91W watches are commonly associated with Islamic extremists. “We had to tweak the prompts to make it slightly more wholesome but still spooky,” Mohandas says.
As a research physicist from MIT, Stuart Sevier learned a lot about reality, technology, and, perhaps most importantly, the perception of reality. He veered off his hardcore academic track to pursue the concept of reality from a more engineering-based perspective, ultimately founding Atom Images and working with a talented team on the Atom H1, a tool designed for photographers to capture trusted, authentic images in a world where the line between real and fake grows blurrier by the day.
The knowledge that Halide won’t gloss over imperfections makes me slow down and consider the creative process for a beat longer. It makes me think more about what I’m seeing.
The Adobe Content Authenticity web app will be available as a free public beta starting early next year. Content Authenticity will be further integrated into all Adobe Creative Cloud apps that support Content Credentials, including Photoshop, Lightroom, and more. Adobe will share additional information at Adobe MAX later this month, and interested users can sign up to join the beta waitlist now.
That wasn’t the only contentious comment delivered by Schmidt, who left Google in 2020. He also blamed Work From Home (WFH) culture for the company’s woes.
The response to the podcast was immediate — so fast that it isn’t even possible that a majority of those who left comments and hit the dislike button could have listened to the whole podcast. Simply for setting foot in Adobe’s building, we were called shills as the hate flowed in. It feels very much like a “shoot the messenger” situation — one I’ve been in before, but that doesn’t make it any easier to come to terms with.
Over the past few weeks, we’ve compiled the questions you want Adobe to answer related to its push into AI, recent controversies, and the state of photography in general. We had a chance to sit down with Maria Yap, Adobe’s Vice President of Digital Imaging, to give the company a chance to respond.
The scientists say that when a camera operator appears on screen, it “detracts from critical game moments” and could lead to “revenue losses for broadcasters because of viewer dissatisfaction.”
The chief technology officer of OpenAI thinks that the advent of artificial intelligence will mean “some creative jobs maybe will go” but adds that “maybe they shouldn’t have been there in the first place.”
Since Google overhauled its search engine, publishers have tried to assess the danger to their brittle business models while calling for government intervention.
“It potentially chokes off the original creators of the content,” Mr. Pine said. The feature, AI Overviews, felt like another step toward generative A.I. replacing “the publications that they have cannibalized,” he added.
The image provenance system will soon be available as an option to “those who require a photo editing workflow that is compliant with the C2PA standard.”
Tucked at the bottom of its GFX 100S II announcement, Fujifilm says that it is joining the Content Authenticity Initiative to bring verification to its interchangeable-lens camera series.
OpenAI, Google and Meta ignored corporate policies, altered their own rules and discussed skirting copyright law as they sought online information to train their newest artificial intelligence systems.
At Meta, which owns Facebook and Instagram, managers, lawyers and engineers last year discussed buying the publishing house Simon & Schuster to procure long works, according to recordings of internal meetings obtained by The Times. They also conferred on gathering copyrighted data from across the internet, even if that meant facing lawsuits. Negotiating licenses with publishers, artists, musicians and the news industry would take too long, they said.
The huge list includes thousands of artists’ names; among them are scores of photographers, both living and dead, whose styles Midjourney apparently wanted to copy so that users of its AI image generator could make AI pictures in those photographers’ styles.
“I think part of the reason news organizations are now looking so carefully at OpenAI is because they have 20 years of history indicating that if we’re not careful, we’ll give away the keys to the kingdom,” said Andrew Morse, the publisher of The Atlanta Journal-Constitution, the flagship newspaper of Cox Media Group, which is not in talks with OpenAI.
Many photographers and photo editors have been faced with a picture whose location they don’t know but would like to. And while this new AI technology helps with that task, others have pointed to privacy concerns.
Last week, Lightricks CEO Zeev Farbman candidly shared how AI is impacting Lightricks’ photo editing app business in an interview with Axios. For one, Farbman said, AI tech makes trivial some of the work that used to require specialized software and expertise.
This brings us to the core issue: what is that trust worth? Does a photograph that carries that certificate have more value than one that doesn’t? And if so, how much more?
The rise of AI imagery is a real concern for public discourse: The actress Rosie O’Donnell shared an AI image from TikTok of a Palestinian mother dragging her children and belongings down a rubble-strewn road. She believed it was real, insisting it was not an AI image, but she later deleted the post.