Given that the detection of manipulation in 2014 and 2015 occurred in different rounds, it’s impossible to tell whether there has been an aggregate increase, but a few key issues remain:
I think most would agree that “material addition or subtraction” from a still frame is a blatant affront to viewers and to the truth. We should all be alarmed that twenty percent of final-round images had some element of outright fabrication.
A large number of entries were found to have been manipulated or post-processed carelessly.
In a statement, John Moody, the executive editor of Fox News, said that after careful consideration the network felt that giving its audience “the option to see for themselves the barbarity of ISIS outweighed legitimate concerns about the graphic nature of the video. Online users can choose to view or not view this disturbing content.”
So look. Then turn away. If you don’t need to look, I’m with you. If you need to look, you don’t have to apologize. Whatever you do, realize that the stakes are higher than had been imagined. For the same reason, know that it becomes all the more important to understand why ISIS exists at all, and how to break the cycle of violence and the downward spiral that serves them all too well. For that, we need many other images, and much more as well. Not least, we need to appreciate how civilization is a way of seeing.
99% of the conversation regarding what can and cannot be done to a photograph is about post-processing, after the image has been taken. Little, or none, is about before or when the image is taken.
Cry Me an Exoneration: After Officer Kills Civilian, Police Use Video as Sympathy Propaganda, Media Bait
A police shooting in Montana and the subsequent use of that video, however, not only raises disturbing questions but opens a Pandora’s Box of new concerns over how and how much these videos can be selectively edited and distributed as propaganda … and how much the media can collude with the state to distribute these versions.
Slate’s Ben Mathis-Lilley points out that the image—used to adorn multiple stories this month on the NRA’s political advocacy site—sure looks like this stock photo of a “campaign rally” from Getty.
One ultra-orthodox Jewish newspaper decided to cover the story a little differently, though: its front-page photo was a manipulated one that left out female world leaders.
Most legacy news media organizations said Wednesday that they have no plans to publish or broadcast photos of Charlie Hebdo cartoons portraying the Muslim prophet Mohammed, while many new digital outlets are running the images.
Instead of giving snapshots of what the industry is doing and how policy varies from one desk to another, why doesn’t World Press follow up with a five-point document that clearly defines what is and is not acceptable in photojournalism today and tomorrow, and politely ask everyone making a living (or not) from this profession to approve and implement it?
In response to the increasing ambiguity over acceptable levels of manipulation in photojournalism contests, World Press Photo commissioned a report entitled “The Integrity of the Image: Current practices and accepted standards relating to the manipulation of still images in photojournalism and documentary photography.” It’s 20 pages long, so here’s the tl;dr:
What is current practice, and what are the accepted standards internationally, when it comes to the manipulation of still images in photojournalism? Earlier this year, the World Press Photo Academy commissioned Dr. David Campbell to conduct research on “The Integrity of the Image”, and to assess contemporary industry standards worldwide. The report of his findings is now available.
Every digital image must be touched by software before you see it. But when each pixel is affected, who decides what is true?
The picture is gut-wrenching. It also tells the story. That’s why we chose to run it.
Feminists in France are demanding that a statue based on Alfred Eisenstaedt’s iconic ‘V-J Day in Times Square’ photo be taken down. They say that the original image it was based on is one that portrays sexual assault.
Its unusual composition and the fact that the militants’ silhouettes seem out of proportion to other elements led some AFP clients to call the agency to check it was real. Of course, it was.
So Google’s algorithms took the two similar photos and created a moment in history that never existed, one where my wife and I smiled our best (or what the algorithm determined was our best) at the exact same microsecond, in a restaurant in Normandy.
The episode “should be treated like a sex crime, a privacy invasion taken to an extreme,” said Jules Polonetsky, executive director of the Future of Privacy Forum, an advocacy group based in Washington. “Sites allowing the sharing of these pictures can and should be taking proactive action to remove these pictures.”