A creative technologist at The MET is finding intimate new uses for neural networks.
GIFs via Vimeo
What if an oil painting could come to life? Thanks to increasingly accessible machine learning software, Metropolitan Museum of Art creative technologist Chewy Wu has implemented a style transfer neural network that does just that—without the art industrial complex behind a film like Loving Vincent. This cousin to Google's puppyslug-oozing DeepDream allows Wu to endow a potpourri of beach, timelapse, and performance footage with the color and texture of oil paints.
As a creative developer at AKQA NY, a founder of OFCOURSE.io, former participant at ArtAHack, and Parsons and School for Poetic Computation alum, Wu was able to use code and instructions found on the internet to give her film the texture of a Monet and the iridescent color palette of a glitch artist. Applying style transfer to timelapse and custom performance footage is a fairly new step for the technique, which has been applied in spades to famous paintings and iconic photos. Wu's transformations of her friends, her hometown of Shanghai, and her own body are intimate and refreshing. Enjoy her video art experiment below.
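For the curious, the core idea behind style transfer methods like the one Wu used is surprisingly compact: a painting's "style" is often captured as the Gram matrix of a neural network's feature activations, and the algorithm nudges an image until its Gram matrices match the painting's. The sketch below (not Wu's actual code) illustrates that style representation with hypothetical array shapes, using NumPy in place of a real network's features:

```python
import numpy as np

def gram_matrix(features):
    """Style representation: correlations between feature channels.

    features: a (channels, height, width) activation map, as a conv
    layer of a network like VGG would produce for an image.
    Returns a (channels, channels) Gram matrix, normalized by size.
    """
    c, h, w = features.shape
    flat = features.reshape(c, h * w)
    return flat @ flat.T / (c * h * w)

def style_loss(generated_features, painting_features):
    """Mean squared difference between the two Gram matrices.

    Style transfer iteratively updates the generated image to
    drive this loss toward zero.
    """
    g1 = gram_matrix(generated_features)
    g2 = gram_matrix(painting_features)
    return float(np.mean((g1 - g2) ** 2))

# Hypothetical activations standing in for real network features.
rng = np.random.default_rng(0)
painting = rng.standard_normal((8, 16, 16))
frame = rng.standard_normal((8, 16, 16))

print(style_loss(painting, painting))  # identical features → 0.0
print(style_loss(frame, painting) > 0)  # differing styles → positive loss
```

In a full implementation this loss would be computed across several layers of a pretrained network and minimized frame by frame, which is what gives Wu's footage its painterly grain.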
See more of Chewy Wu's work on her website.