Photoshop fans will be glad to know that a group of talented researchers from Google has unveiled a very interesting new AI imaging tool, and it’s here to revolutionize photo editing. Working alongside the Max Planck Institute for Informatics, they have developed DragGAN, a user-friendly point-based image manipulation tool.

The best part is that you can now edit photos like a pro, even without any prior experience. With DragGAN, you can easily drag multiple points of an image along paths you define. And here’s the really cool part: the AI ensures that the final result looks realistic and natural, so no more worrying about distorted images.

All it takes is a simple click of your cursor, and voila: your photos can be transformed exactly the way you want them to be. Currently, DragGAN is still in the research phase, but the overwhelming interest in this tool has caused the team’s homepage to crash multiple times over the past couple of days — yup, that’s how excited people are about it.

Unlike the traditional photo editors that simply pull and warp images, DragGAN takes a whole new approach: it regenerates the entire object in the image to accommodate the changes you make. This means smoother edits and a more seamless editing experience overall.
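To give a rough feel for the idea: rather than dragging pixels around, DragGAN nudges the generator’s latent code until the regenerated image places the user’s handle point where they dropped it. The toy sketch below illustrates that optimize-the-latent loop on a made-up, purely mathematical “generator” — `toy_generator` and `drag_edit` are stand-ins invented for this example, not part of DragGAN, whose actual method (motion supervision and point tracking on StyleGAN feature maps) is far more sophisticated.

```python
import math

def toy_generator(z):
    """Stand-in 'generator': maps a latent vector to a handle-point position.

    In DragGAN the generator is a full StyleGAN producing an image; here it
    is just a bounded, differentiable toy mapping so the loop is runnable.
    """
    return [10.0 * math.tanh(v) for v in z]

def drag_edit(z, target, lr=0.005, steps=300):
    """Gradient-descend the latent so the generated point reaches `target`.

    This mirrors the core trick: the edit happens in latent space, and the
    whole output is regenerated each step, so it stays self-consistent.
    """
    z = list(z)
    for _ in range(steps):
        p = toy_generator(z)
        for i in range(len(z)):
            # Analytic gradient of 0.5 * (p_i - target_i)^2 w.r.t. z_i,
            # using d/dz [10*tanh(z)] = 10 * (1 - tanh(z)^2).
            z[i] -= lr * (p[i] - target[i]) * 10.0 * (1.0 - math.tanh(z[i]) ** 2)
    return z

# User clicks a handle point and drags it toward (3, 4):
z0 = [0.1, -0.2]
z_edited = drag_edit(z0, target=[3.0, 4.0])
print(toy_generator(z_edited))  # handle point lands near (3.0, 4.0)
```

The point of the sketch is the design choice, not the math: because every iteration re-runs the generator, the result is always something the model considers a plausible output, which is why DragGAN’s edits avoid the stretched-pixel look of warp-based tools.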

But that’s not all! DragGAN can also work hand-in-hand with text-to-image generative AI tools like Midjourney or Runway. If the results from those tools don’t quite meet your expectations, you can use DragGAN to tweak them quickly and efficiently, without ever opening a professional editing suite.

The research paper showcases some fascinating examples of what DragGAN can do: you can change the height of a mountain, move models into different poses, resize their clothes, and even make a lion appear to roar by opening or closing its mouth. And if something is missing from an image, don’t worry — the AI can cleverly fill in the gaps for you.

While there’s no specific release date for DragGAN’s mainstream availability just yet, the team has hinted that the code will be made available on their GitHub page in June 2023. I hope you’re as excited as we are to try it out!
