Adobe enters the age of generative AI with creative and enterprise apps

Reality? What reality?

Soon, apparently, everything from brand assets to corporate videos to illustrations will be created differently using Adobe’s creative and enterprise apps and its generative AI tool Firefly. You’ll literally be able to speak to paint (or make video).

If you can say it, you can draw it

At Adobe Summit, the company rolled out a new content creation tool for enterprise users: Adobe Express for Enterprise, which lets teams create, share, and collaborate on high-quality brand content such as multimedia assets, social posts, and more. It also plans to introduce generative AI tools within Photoshop, After Effects, Premiere Pro, and its customer experience tools.

For many users, the most interesting set of announcements involves the company’s plans for artificial intelligence (AI) augmentation within its creative products.

AI isn’t new to Adobe. Its Sensei platform already drives powerful tools such as the Neural Filters in Photoshop, but the new Firefly tools promise to raise that game considerably.

The company says its tools will let creative types speak to create images, video, illustrations, and 3D imagery using Firefly. It will generate vectors, brushes, textures, graphic designs, video, and social media posts. It can also work with 3D models, produce brand assets, and probably more.

In the future, Adobe hopes to add more contextual intelligence to its machine.

The company also said it hopes the tools will let people with creative ideas bridge any gap in technical or drawing skills by putting all the power of its creative apps in a visual form. It isn’t clear whether the software will run faster on Macs, thanks to the dedicated on-chip machine learning resources of Apple’s Neural Engine.

Creativity at the speed of speech

“Generative AI is the next evolution of AI-driven creativity and productivity, transforming the conversation between creator and computer into something more natural, intuitive and powerful,” David Wadhwani, president of Adobe’s Digital Media Business, said in a statement.

Adobe has been quite open about how it trained Firefly.

  • It used images held in the Adobe Stock catalog, to which it has a license.
  • It did not make use of unlicensed images of any kind.

The news clarifies why Adobe founded the Content Authenticity Initiative four years ago. And in support of the initiative, it will introduce a “Do Not Train” tag creatives can add to their work to prevent the AI from analyzing it. (This reminds me a little of Steve Jobs, who famously said, “Great artists steal.” This time, Adobe is saying, “Great artists don’t steal.”)

Though they may use AI.

Toward an ethical AI?

There are many concerns around AI, and to its credit Adobe has been brave enough to discuss some of them. To support this work, it has set out a framework of AI ethics and is applying a formal review process within its engineering teams to try to ensure the AI it builds into its products reflects company needs and human values.

A blog post on the company site notes some of the challenges that still surround AI — particularly around inherent bias within AI models. And it’s committed to constant testing of the models it creates to check for “safety and bias internally and provide those results to our engineering team to resolve any issues.”

Pledging its commitment to developing creative generative AI “responsibly,” the company said: “Our mission is to give creators every advantage — not just creatively, but practically. As Firefly evolves, we will continue to work closely with the creative community to build technology that supports and improves the creative process.”

Adobe will integrate Firefly directly into its industry-leading tools and services, so users can include it within existing workflows. The company says it is also working to add context-aware image generation to the software.

But we need to regulate this stuff

With generative AI seemingly exploding into wider consciousness faster than the internet itself spread, Adobe’s warning that all stakeholders should work together to develop these technologies responsibly deserves to be heard. “As it continues to evolve, generative AI will bring new challenges and it’s imperative that industry, government, and community work together to solve them,” the company said.

“By sharing best practices and adhering to standards to develop generative AI responsibly, we can unlock the unlimited possibilities it holds and build a more trustworthy digital space.”

It’s also interesting that Adobe’s jump into the now hugely hot generative AI space is happening today. Given that Adobe and Apple have long worked together on augmented reality through Project Aero, might Apple have been quietly developing its own implementations of these exciting (if also frightening) new technologies for some time?

All eyes will be on WWDC to find out.

Adobe is currently beta testing Firefly. If you’re interested in finding out what it can do, you can join the beta here.

Please follow me on Mastodon, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.

Copyright © 2023 IDG Communications, Inc.

