CMU School of Drama


Thursday, February 08, 2024

Apple made an AI image tool that lets you make edits by describing them

The Verge: Apple researchers released a new model that lets users describe in plain language what they want to change in a photo without ever touching photo editing software. The MGIE model, which Apple worked on with the University of California, Santa Barbara, can crop, resize, flip, and add filters to images all through text prompts.

4 comments:

Theo K said...

As a person in prep week right now, I cannot help but think what a big difference MGIE would make to my paperwork and rehearsal information. The most obvious use I can think of for this technology is editing out large margins or other issues in scans. I think MGIE has wonderful potential applications for the theatrical world. As with any new AI software, though, I am concerned about how this technology will affect artists. Like ChatGPT, I believe MGIE can be used responsibly in our industry to make menial tasks more efficient; however, I also see the potential for this technology to be easily abused by the public. I hope Apple installs some protective measures to keep artists' work from being altered by AI without their permission before this technology is released to the public. I am looking forward to seeing how this technology is handled.

Karter LaBarre said...

Okay, this is iconic for Apple. I'm the type of person who can kind of describe what I want changed in an image but not be able to change it myself. I genuinely think that if I could tell other people to edit videos for me and have them do exactly what I want down to the letter, it would work out well, and Apple has made an AI for that! For more selfish reasons, this sounds like it would be fantastic! However, I also have to think about what this application means for others. When I read this, I immediately thought of my parents, who are both blind. This would be really helpful for them if they ever wanted to edit a photo or anything; they could describe what they want instead of having to go through all the different tools and learn everything. I hope it's accessible and everyone can use it equally. Because my dad was a lawyer, I remember so many different apps and formats that are just inaccessible for people with disabilities, specifically blindness.

Carolyn Burback said...

I never know how to feel about the AI articles because it's always the same "wow, that's cool," and then as the article goes on to describe how advanced the technology is getting, I go, "that's horrifying." I think AI is going to consume our lives in the next few decades the way portable devices and technology have from the early 2000s to now. Compared to other image-generation technologies, I like that Apple's, as of now, has the least amount of plagiarism built into it. By this I mean that with some AI tools you can demand a photo be made from scratch, which is usually the result of the AI scraping the internet for other people's art and photographs to create a new one. Apple's technology seems to be more about editing what you already have. However, I'm sure it is still pulling from other people's intellectual property, because in the end AI is artificial and requires intelligence that has already been established... at the moment, anyway.

Gemma said...

Another week, another AI article about a new technology pushing the limits of things we only dreamed about a handful of years ago. The concept behind this AI tool is definitely interesting, but as with the vast majority of technologies nowadays, its potential scope of use is entirely in the hands of the end user, and tools like this are going to need to be carefully safeguarded against bad actors. As with any AI tool, this is a powerful concept, and the way its rules work is going to define what abuses of its system and structure it is able to protect against. Something that can edit photos based on a description alone is incredibly powerful and would be a very cool and intuitive photo editing tool, but in the same vein, if the descriptions guide it toward falsifying information or altering sensitive information, that would be an incredibly negative contribution of the tool. Like most AI tools, it's standing on the edge of a sword, and I'm curious to see how Apple navigates this double-edged one.