I work for myself and I am a cruel boss, so I don’t take too much time off. But I make an exception on Thanksgiving, so yesterday I let myself get creative and play around a bit with some machine learning models. The end result is the image above, which composites a still from the movie JAWS into a 1778 painting of a shark attack by John Singleton Copley called Watson and the Shark. You can see the two source images below.
Like lots of people, sharks freak me out. Copley’s “Watson and the Shark” is a large painting (72 1/4 x 90 3/8 in.), and it made an equally large impression on me as a child visiting the museum. Learning that the person being attacked by the shark in the painting was a 14-year-old boy didn’t make me feel much better at the time. Interestingly, the boy (Brook Watson) survived, minus one leg, and eventually became Lord Mayor of London; he was also an original committee member of Lloyd’s of London and later one of its early chairmen. It was Watson who commissioned the epic painting from Boston-born painter John Singleton Copley.
Around the same time I saw this painting, I also watched JAWS for the first time. The two experiences have been intermingled in my mind ever since, so I thought it would be fun to try using a combination of neural style transfer and inpainting to combine them.
The first step was to find a film still from JAWS that looked like it would fit with the image. I lucked out: the first image that came up on Google felt like it could be a good fit. I then opened the images in a software package called Runway ML, created by my friend Cristóbal Valenzuela to “bring the power of artificial intelligence to your creative projects with an intuitive and simple visual interface.” In Runway I selected the AdaIN style transfer model, loaded the JAWS film still first, and then loaded the Copley painting as the source for the style transfer. The results were pretty solid right off the bat!
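If you’re curious what AdaIN (adaptive instance normalization) actually does under the hood: the core trick is just re-scaling the content image’s feature channels so their statistics match the style image’s. In the real model this happens on deep network features, not raw pixels, but the operation itself is tiny. Here’s a minimal NumPy sketch of that one step (the function name and shapes are my own illustration, not Runway’s code):

```python
import numpy as np

def adain(content_feat, style_feat, eps=1e-5):
    """Adaptive Instance Normalization (Huang & Belongie, 2017).

    Normalizes each channel of the content features, then re-scales
    it to have the mean and standard deviation of the matching style
    channel. Inputs are arrays of shape (channels, height, width).
    """
    # Per-channel statistics over the spatial dimensions.
    c_mean = content_feat.mean(axis=(1, 2), keepdims=True)
    c_std = content_feat.std(axis=(1, 2), keepdims=True)
    s_mean = style_feat.mean(axis=(1, 2), keepdims=True)
    s_std = style_feat.std(axis=(1, 2), keepdims=True)
    # Whiten the content channel, then apply the style statistics.
    return s_std * (content_feat - c_mean) / (c_std + eps) + s_mean
```

In the full model this output is fed to a learned decoder that turns the restyled features back into an image; the statistic-matching above is what carries the “style.”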
I then opened the original image of “Watson and the Shark” in Photoshop, extended the canvas on both sides (which was initially blank), and crudely placed the two men on the left side and the shark on the right. Though, as you can see from the screenshot of Runway ML, the style transfer did a great job, I tweaked the levels and saturation a bit to get it even closer to the painting - but honestly not very much.
Then I used Photoshop’s Content-Aware Fill tool, which is essentially an inpainting algorithm that pulls information from surrounding areas to fill in holes or regions with no content. To do this you just mask off the areas you want to fill and then activate the Content-Aware Fill tool. I like to mask the seams between images to smooth out the transitions.
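Photoshop’s version is proprietary and much smarter, but the basic idea of inpainting is easy to demo: treat the masked pixels as unknowns and let the surrounding pixel values diffuse into the hole. Here’s a toy NumPy sketch of that idea (my own simplified illustration - real inpainting algorithms also copy texture, not just smooth color):

```python
import numpy as np

def diffusion_inpaint(img, mask, iters=200):
    """Naive diffusion inpainting on a 2-D grayscale image.

    Repeatedly replaces each masked pixel with the average of its
    four neighbors, so information from the surrounding area flows
    into the hole. `mask` is a boolean array: True = pixel to fill.
    """
    out = img.astype(float).copy()
    out[mask] = out[~mask].mean()  # rough initial guess for the hole
    for _ in range(iters):
        padded = np.pad(out, 1, mode="edge")
        # Average of up/down/left/right neighbors for every pixel.
        avg = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
               padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
        out[mask] = avg[mask]  # only the masked pixels ever change
    return out
```

Masking a seam between two composited images, as described above, amounts to setting `mask` to a thin strip along the join and letting both sides blend into it.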
And that is basically it! Actually, I ran the whole thing through the style transfer one more time, with my new collaged image as the content and the original painting as the style source. This helped a lot in making the whole image look and feel unified, covering up color and texture differences. Final image below.
I did some other cool/fun stuff with Runway and Photoshop, including this layered inpainting piece, which is built from a few dozen layers and produces a really cool output image.
I don’t usually do tutorials, but if you dig this let me know and maybe I will do some more (or bug my friends who are real experts to see if they’d be willing to share a tip or two).