Skynet Merges with Photoshop: The Future is Calling
No, Adobe isn't sending Terminators back in time to crush the competition, but they are adding more artificial intelligence features to Photoshop. Does the future belong to AI?
According to the Terminator movie timeline, Skynet first achieved consciousness at 2:14 am Eastern Time on August 29th, 1997. The following year, Adobe released Photoshop version 5, which brought several new features, including the Magnetic Lasso tool, that made creating selections much quicker and more intuitive. It was uncanny. Somehow, it knew where the edges were in an image. By 2003, Adobe had introduced the Creative Suite, which linked multiple Adobe applications on your desktop to share settings and integrate capabilities, becoming stronger than the sum of its parts. Ten years later, through the Creative Cloud subscription model, Adobe connected every Photoshop copy on the planet to the Internet. Skynet now spanned the globe.
By 2029 in the movie franchise, Tech-Com and John Connor were on the verge of winning the war, but Skynet sent a T-800 Terminator back in time to 1984 to kill Sarah Connor. Less than two years earlier, John Warnock had founded Adobe in his garage with Charles Geschke after both left Xerox Palo Alto Research Center, the lab renowned for pioneering innovations in computing technology and hardware.
Coincidence? Well, yes, of course. But whether Adobe or whoever your current favorite purveyor of digital darkroom editing software is, artificial intelligence is popping up all over. Computers are becoming more powerful, and competition drives companies to find ways of leveraging them.
Somewhat overshadowed by the pandemic coverage and the US presidential election this week, Adobe has just released a major update to Photoshop. Yes, to ring in 2021, Adobe is giving us a long list of really cool new features. If you're on the Creative Cloud subscription plan, you should have already gotten the update for free. If you're not on a subscription, naughty you. Adobe doesn't really give this stuff away for free, you know. We pay via our subscription fees.
For photographers, the top feature is probably Sky Replacement. With just a few simple clicks, you can create an amazingly good selection mask of the sky in an image. From there, you can either replace it with one of your own or choose the perfect one for the occasion from Adobe's preset library. Expect to see those skies pop up in shared photos everywhere soon. Skylum Luminar has had an AI sky filter for a while now, but Adobe seems to have taken the idea to the next level. Skynet, here we come. But for now, we have Adobe Sensei, the AI engine that powers all of these new tools.
But for sheer media buzz, the top feature has to be Neural Filters. Simple sliders perform intelligent edits, driven by Sensei, in ways that would be impossible for all but the most experienced users. Most sliders are targeted toward portrait photographers, although some are more generally applicable, and it's clear that Adobe intends to add more Neural Filters in subsequent releases. For now, I played with a photo of myself, quickly turning my handsome visage into that of an extremely old man with long hair and a big grin. The sliders are even labeled with names like "Happiness," "Facial Age," "Surprise," and "Anger." They mostly do what you would expect, with no manual selection. At the extremely comical ends of the spectrum, it did get confused a bit over my glasses. But Adobe is working on a slider to remove glasses. I'm personally not sure what I would do with many of these, but they are fun regardless.
Adobe Sensei relies heavily on internet access to feed its AI engine, but response times seem quite acceptable regardless, especially given the quality of the results. Even if I might sometimes do better by hand, the truth is, it's frighteningly good, and the manual method would take me far, far longer to achieve similar results. As a technology, call me impressed.
Careful readers have no doubt picked up on the fact that I have mixed feelings about Sensei and AI editing in general. Such things can and surely will be abused, misused, and overused. Just stare in wonderment at some of the "HDR" images out there if you don't believe me. But AI can also unquestionably be a powerful tool that makes fast work of nearly impossible, time-consuming editing drudgery in front of a computer screen when you could be out shooting. When used with restraint and a skilled eye, these tools will be revolutionary.
And photographers have been modifying their images in various ways since the first shutters were clicked. For example, read about how Ansel Adams changed the way he shot, knowing what he could later do with a negative in the darkroom. Photographers have always looked for ways to overcome limits and take their craft to the next level. When I shot film, I carried colored filters to alter how specific images were rendered. One of my early purchases was a set of Cokin filters, including the dreaded sunset filter. Once you've seen what it does, you can easily spot other photos taken with one: they all have the same unnatural orange cast, even in the shadows. By the time we got to the digital darkroom, the tools had improved, but the concept remained the same. Each new generation of software gives us even more. Sensei is just the next step. It all depends on what each of us does with the tools we have, tempting though it may be to push things too far. It's been said that if you're good enough at Photoshop, you don't even need a camera. But what would be the fun in that, I ask you?
Anyway, just some thoughts I've had this week. It will likely take some time to work out how useful these tools are for real editing. And in the meantime, Adobe is sure to be secretly working on their next updates. To paraphrase from the Terminator movies: A person can go crazy thinking about this. The unknown future rolls toward us, but I face it with a sense of hope. We are in uncharted territory now, making up history as we go along.
AI, here we come.