Photographer Defends Hyper-Real Radio Times Covers Against AI-Generation Accusations
British photographer Robert Wilson shot 210 RAW stills at 30 frames per second to create hyper-real Radio Times covers that viewers widely accused of being AI-generated.

British photographer Robert Wilson is pushing back against a wave of AI-generation accusations after his 12-cover Radio Times campaign for the TV show Last One Laughing went viral for all the wrong reasons. Wilson, who produced a looping video assembled from his hyper-realistic portrait stills, says the assumption that his work is machine-generated is "frustrating" and "categorically not the case."
The campaign, promoted by Radio Times on March 9, features comedians from Last One Laughing, a UK show in which performers try to keep straight faces while their colleagues do everything possible to make them crack. Radio Times billed it as "this year's legendary line-up of comedians spotlit across 12 covers, as they compete to be the Last One Laughing." The resulting portraits, shot individually against a Radio Times backdrop, include a woman with red hair, a man with glasses, and a man in a blue suit, each holding an expression somewhere between composed and about to break.
The controversy stems from the looping video Wilson assembled from those stills. Its razor-sharp, hyper-real quality triggered widespread accusations online that the images were AI-generated. Wilson's technical explanation is worth reading closely, because the workflow is genuinely unusual for a stills photographer.
He locked off the camera and shot RAW photos at 30 frames per second, treating the session as a high-volume still photography capture rather than video production. The key step came in post: rather than grading the files as video footage, he processed them in photography software specifically to exploit the wider dynamic range available in RAW files. "I graded the RAW files with photography software rather than for moving footage as there's so much more latitude in the RAW file," he told PetaPixel. Once graded, those individual frames went back into a 30 FPS timeline. "Once graded, the images were placed back into a 30 FPS timeline and voilà, you have moving footage that looks exactly like the stills/print images that have a slightly hyperreal feel."
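For readers curious what that reassembly step looks like in practice, the stills-to-timeline stage can be approximated with a standard ffmpeg invocation. The frame filenames, codec, and quality settings below are assumptions for illustration, not details from Wilson's actual pipeline; the grading itself happens upstream in photography software.

```shell
# Sketch of the stills-to-video step, assuming the 210 graded frames
# were exported as sequentially numbered TIFFs (frame_0001.tif, ...).
# Codec and quality settings here are illustrative choices, not Wilson's.
ffmpeg -framerate 30 -i frame_%04d.tif \
       -c:v libx264 -pix_fmt yuv420p \
       -crf 16 \
       output.mp4
```

Because ffmpeg only sequences frames that were already graded as photographs, the resulting clip keeps the stills' tonality rather than a video-grade look, which is the effect Wilson describes.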
In total, Wilson said he took 210 high-resolution photographs to build the video, a detail he shared directly with a skeptic on Instagram. The response he got back was blunt: "Then that's an awful lot of work to produce something that looks exactly like AI."
That exchange captures the bind Wilson is in. The very quality that makes the work technically impressive, the crispness and tonal richness that RAW-graded stills retain and that standard video grading can't match, is precisely what triggers the AI suspicion. The hyperreal feel Wilson describes is an artifact of the photographic process, not a departure from it. For working photographers, the irony is sharp: a workflow that demands more technical precision than most editorial shoots is being dismissed as a shortcut.
