More companies should follow Adobe’s lead on AI

OPINION: Adobe recently jumped on the generative AI bandwagon with its own text-to-image generator, called Adobe Firefly. Here’s why I think more companies need to take a leaf out of Adobe’s book when it comes to AI art and generative AI.

Whether you think it’s a good thing or the beginning of the end, it seems that we’re fast heading toward a future in which AI is interwoven into both our personal and professional lives. 

Generative AI in particular has seen a huge increase in popularity with the launch of the OpenAI-developed text generator ChatGPT and text-to-image generator Dall-E 2 in 2022, making it possible for anyone to create a story, essay or artwork in just a few clicks.

Since then, a large number of companies and apps, from Bing and Google to Snapchat and now Adobe, have launched their own AI chatbots and image generators, some of them built on OpenAI’s models and others not. 

Adobe Firefly’s biggest competition in the AI art space is unarguably Dall-E 2, but where OpenAI has sparked debate by training its model on existing works scraped from the web, Adobe hopes to introduce a level of consent that I think should be the bare minimum for all generative AI models. 

Adobe Firefly

Before continuing, it’s important to note that AI is nothing new for Adobe. Photoshop has been benefitting from Adobe Sensei-powered features like the Sky Replacement tool and Content-Aware Fill for several years now. The same goes for Premiere Pro, After Effects and other Creative Cloud programmes that benefit from Sensei’s AI frameworks.

However, Firefly is Adobe’s first set of generative AI models. This means that users – regardless of experience or skill – will now be able to go a step further and use a string of words to describe and generate images, vectors, audio, video and 3D content, as well as tools like brushes and gradients that can be used to produce works. 

The most interesting part of Firefly is Adobe’s “Do Not Train” content credential tag, which allows artists to request that their content not be used to train models. Adobe says that this tag will stay associated with the artwork wherever it is used, stored or published. 

“With industry adoption, this will help prevent web crawlers from using works with ‘Do Not Train’ credentials as part of a dataset,” explained Adobe during its 2023 Summit.

Not only that, but Adobe will also automatically mark works created or modified with generative AI as such, allowing viewers to more easily identify AI-generated designs.
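To make the idea concrete, here is a minimal sketch of how a dataset-building crawler could honour a tag like this by checking an asset’s content credentials before adding it to a training set. The manifest structure, assertion label and field names below are assumptions made for illustration only, not Adobe’s or the C2PA’s actual schema.

```python
# Hypothetical sketch: a crawler checking an image's content credentials
# before including it in a training dataset. All labels and field names
# here are illustrative assumptions, not a real schema.
import json


def allows_training(manifest_json: str) -> bool:
    """Return False if the asset's (hypothetical) credentials opt out of AI training."""
    manifest = json.loads(manifest_json)
    for assertion in manifest.get("assertions", []):
        # Assumed assertion label and field; a real manifest would differ.
        if assertion.get("label") == "training-mining":
            if assertion.get("ai_training") == "notAllowed":
                return False
    return True


# Example: an asset whose creator has opted out of AI training
example = '{"assertions": [{"label": "training-mining", "ai_training": "notAllowed"}]}'
print(allows_training(example))  # False -> the crawler should skip this image
```

The point of the sketch is simply that the opt-out travels with the work as metadata, so any crawler that chooses to respect it can filter the work out before training ever begins.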

While Adobe claims that Firefly is structured to help creators work more efficiently, there are still obvious points of contention over whether AI art should be considered and treated as art at all. This is especially pertinent when you consider that generative art models can take work away from working artists. 

However, if we’re heading toward an AI-powered future, the least companies can do is help people identify when works are AI-generated and ensure that AI models aren’t trained on the work of non-consenting artists, effectively stealing and transforming their works for profit.
