Happy New Year and welcome back to the Photo AI newsletter! I’ve been on hiatus for a few reasons I’ll get into further down, but first let’s tackle the latest computational photography kerfuffle that has (re)ignited photographers’ ire with Adobe.
Adobe Using Your Photos for Its AI Isn’t New
This week’s uproar in the photography world is over news that Adobe uses images uploaded to its cloud services to help train its machine learning features. I was a little surprised by the reaction, to be honest, because this isn’t something new: Adobe has been doing it since at least June 27, 2018, based on this Wayback Machine archived version of Adobe’s Machine Learning FAQ. Let’s quickly break down what’s going on, why photographers are upset, and how to easily opt out of the feature if it makes you uncomfortable.
Feeding the Machine (Learning)
I first saw a note about this in a Mastodon post by Baldur Bjarnason, as well as articles at DPReview and PetaPixel.
I think the concern for most people is twofold: that their photos will become the basis of generative AI applications like DALL-E, which draw on existing imagery to create photorealistic scenes that don’t exist; and that the feature is opt-out, meaning your photos are already being scanned if they touch Adobe’s cloud services in any way. (The option is automatically enabled for customers in the United States; some people in other countries report that the feature is not enabled by default.)
Which images does this include? Lightroom desktop (the newer one) and the Lightroom mobile apps sync photos to Creative Cloud by default, which enables you to view and edit your images easily on any device. Lightroom Classic can sync individual collections (albums) to the cloud, but you have to specifically mark them for upload. Photoshop also has the ability to save its files to the cloud (and it annoys me how aggressively the app pushes the option, but that’s a conversation for another time). Also, Adobe includes anything you share to Adobe Stock or Behance, or content you submit as Lightroom tutorials or to Adobe Express or Adobe Live.
One key part of machine learning systems is data... lots of data. You and I have been ingesting visual data all our lives and discerning objects and scenes from blobs of light and color. We look out a window and see shapes we identify as “sky” and “trees” and “dog” and “that damn raccoon again.” At some point someone told us what those things are, or we deduced them based on interactions with other similar things.
Computational photography needs to do the same thing, but hasn’t been at it for as long as we have. In my first Smarter Image column for Popular Photography, I wrote:
In ML [machine learning], a software program is fed thousands or millions of examples of data—in our case, images—as the building blocks to “learn” information. For example, some apps can identify that a landscape photo you shot contains a sky, some trees, and a pickup truck. The software has ingested images that contain those objects and are identified as such. So when a photo contains a green, vertical, roughly triangular shape with appendages that resemble leaves, it’s identified as a tree.
“Machine learning” really is a superior term to “AI” (artificial intelligence), because it describes what actually happens: computer models are fed thousands or millions of images, and developers identify the items in them. Then, as the model processes more information, it’s able to pick up on similarities (a pine tree and a maple tree look different, but they have similar structures, for instance) to expand its “knowledge.”
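If you’re curious what “feeding the model labeled images” looks like in practice, here’s a minimal sketch in Python using PyTorch and torchvision. It’s purely illustrative: the labeled_photos folder (with subfolders named for what’s in the pictures) is a hypothetical stand-in for the enormous datasets companies actually train on, and real pipelines involve far more data, hardware, and tuning. But the basic loop of “show labeled examples, nudge the model toward the right answers” is the same idea.

```python
# Illustrative sketch: "teach" a model to recognize labeled photos.
# Assumes a hypothetical folder layout like:
#   labeled_photos/tree/*.jpg, labeled_photos/sky/*.jpg, labeled_photos/truck/*.jpg
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Resize every photo the same way and convert it to numbers the model can read.
prep = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Each subfolder name ("tree", "sky", "truck", ...) becomes the label for its photos.
data = datasets.ImageFolder("labeled_photos", transform=prep)
loader = DataLoader(data, batch_size=32, shuffle=True)

# Start from a network that has already seen millions of images,
# then retrain its final layer to distinguish just our labels.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(data.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:            # show the model labeled examples...
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()                      # ...and adjust it toward correct guesses
    optimizer.step()
```

Multiply that loop by millions of images and many passes over the data, and you get a model that can tell a tree from a truck, which is the kind of capability Adobe and others are after.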
The most exciting developments in photography right now are ML-assisted features: the new masking tools in Lightroom and Lightroom Classic; portrait retouching in Luminar Neo, ON1 Photo RAW, and Retouch4Me; and preprocessing in DxO PureRAW 2, to name just a few. Those features work because they can identify that you want to illuminate a person’s eyes and not their whole face, or adjust the exposure of a sky interrupted by trees. And they do that because the ML models have been fed lots of photos.
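Once the model has produced a mask, the editing side is comparatively simple. Here’s a toy sketch of that last step, assuming NumPy and Pillow; the sky_mask.png file is a hypothetical black-and-white image standing in for what an ML model would generate inside an editor like Lightroom. The point is that the quality of the selective edit depends entirely on the quality of the mask.

```python
# Toy example: apply an exposure boost only where a mask says "sky".
# landscape.jpg and sky_mask.png are hypothetical input files;
# in a real editor the mask comes from the ML model, not from disk.
import numpy as np
from PIL import Image

photo = np.asarray(Image.open("landscape.jpg"), dtype=np.float32)
mask = np.asarray(Image.open("sky_mask.png").convert("L"), dtype=np.float32) / 255.0

# Brighten masked pixels by one stop (2x), blend smoothly at the mask edges,
# and clip back into the valid 0-255 range before saving.
boosted = photo * 2.0
result = photo * (1.0 - mask[..., None]) + boosted * mask[..., None]
out = np.clip(result, 0, 255).astype(np.uint8)

Image.fromarray(out).save("landscape_brighter_sky.jpg")
```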
I’ve been meaning to write about where all this imagery comes from, but it’s not always easy to pin down. Adobe uses the millions of photos synced via Lightroom (which automatically uploads your images to Creative Cloud) and Lightroom Classic (collections you’ve chosen to sync so you can edit the images on a phone or iPad), which is what this current brouhaha is about. I assume Apple does the same thing with iCloud Photos, but my efforts to get an answer from Apple haven’t proved fruitful (I’m ever optimistic though). (And no, I didn’t realize I was making a fruit pun until I re-read that sentence.) Some companies use openly licensed image collections, such as photos shared under Creative Commons licenses on Flickr, or purchase image libraries for machine learning training. I assume Facebook harvests everything, but haven’t yet dug into the fine print of its legalese to confirm.
Why Photographers Are Upset
The snarky side of me wants to say, “Well, it’s Adobe.” Photographers seem to be either all-in on Adobe, or forever resentful that the company switched to a subscription model (which has turned out to be enormously lucrative). The comments on the DPReview article above include predictable entries like, “I’m still using Lightroom 6 and I’ll never contribute any more money to Adobe’s evil machine!” (I’m paraphrasing).
I think there’s some photographer ego involved, too, because people think Adobe is directly stealing and repurposing their precious images. Will that awesome sunset you shot appear in a big Adobe ad campaign? Highly doubtful. Will pieces of your images show up in AI-generated compositions? Also highly doubtful, but not entirely out of the question!
This is where things get muddied, because even if it’s unlikely that a person’s photo or a recognizable piece of one will appear in a generated image, that photo is still being used outside the boundaries of what the photographer intended. This article from The Verge includes a section called “The input question: can you use copyright-protected data to train AI models?” that is a good read.
So while I want to wave my hand and say that, practically, having my photos included in a massive training dataset won’t affect me, there’s still the principle of it. It’s Adobe using its customers’ images in ways most probably don’t realize. And I’m sure a lot of the ire stems from the fact that the usage is already happening without photographers opting in to the program, even if Adobe has been doing it since 2018.
How to Opt Out
If you’re uncomfortable with Adobe’s usage of your imagery, you can take action. First, log in to your Adobe Account and click the “Account and security” heading. Then click “Privacy and personal data” in the sidebar. Under “Content analysis,” turn OFF “Allow my content to be analyzed by Adobe for product improvement and development purposes.”
Clicking “Learn more” brings up Adobe’s Content Analysis FAQ, which describes in more detail how the company is using the data.
Photo AI and The Smarter Image
This is the first Photo AI email since September. Initially the newsletter served as a quick announcement whenever I published a new Smarter Image column at Popular Photography. Unfortunately, in October the owners of PopPhoto killed all freelance contributions, laid off my fabulous editor Dan Bracaglia with one day’s notice, and shifted the mag’s focus more toward revenue-generating link lists and some photo news. So that was the end of The Smarter Image at PopPhoto.
However, when the column began I made a point of retaining The Smarter Image name, so I plan to rebrand this newsletter with that (better) title and turn it into its own thing. Thank you to my existing paid subscribers for supporting this effort!
As that gets spun up, I’d love to get your opinions on the types of material you’re most interested in for this newsletter. Do you read this in email, or in the Substack app? And, shamelessly, are there topics, focuses, or features that would encourage you to be a paying subscriber? Having The Smarter Image stand on its own as a revenue source is a primary goal for me as an independent freelancer.
Email me at jeff@jeffcarlson.com and let me know what you think. Thank you!