Arthur I. Miller applies the Page 99 Test to ‘The Artist in the Machine’

Amazing concept, as defined by Ford Madox Ford: ‘Open any book at page 99 to reveal the quality of the whole’!

Here’s what happens when you open The Artist in the Machine at p. 99 …

“Translating one image into another … is like translating between languages, like between English and French. They are two different representations of the same world,” says Phillip Isola.

Isola and his coworkers invented a variation on GANs that he calls conditional generative adversarial networks (cGANs). They are conditional because instead of starting the generator network (G) from noise, from nothing, they condition it on an actual image. Rather than feeding the discriminator network (D) huge caches of single images, they give it pairs of images, such as a black-and-white image of a scene and the same scene in color. Then they input a new black-and-white scene into the generator network. Initially D rejects G’s output, so G learns to colorize the scene convincingly. In other words, the output is conditioned by the input, which is what conditional GANs are all about. As a result, Pix2Pix, as Isola calls his system, requires far less training data than other supervised learning algorithms.
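
To make the mechanism concrete, here is a minimal sketch of a conditional GAN for paired image-to-image translation. It is written in PyTorch with toy tensors standing in for a real paired dataset; the layer sizes, loop, and data are illustrative assumptions, not Isola’s actual Pix2Pix code, which is considerably more elaborate. The sketch shows only the two points described above: the generator starts from an input image rather than noise, and the discriminator judges (input, output) pairs rather than lone images.

```python
# Minimal conditional-GAN sketch (illustrative, not Pix2Pix itself).
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps an input image (e.g. black-and-white) to an output image (e.g. color)."""
    def __init__(self, in_ch=1, out_ch=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, out_ch, 3, padding=1), nn.Tanh(),
        )
    def forward(self, x):          # x is the conditioning image, not random noise
        return self.net(x)

class Discriminator(nn.Module):
    """Judges an (input, output) pair: a real pair vs. a pair produced by G."""
    def __init__(self, in_ch=1, out_ch=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch + out_ch, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, stride=2, padding=1),
        )
    def forward(self, x, y):       # concatenate condition and candidate output
        return self.net(torch.cat([x, y], dim=1))

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

# Stand-ins for a dataset of paired images (black-and-white input, color target).
bw = torch.rand(4, 1, 32, 32)
color = torch.rand(4, 3, 32, 32) * 2 - 1   # targets scaled to [-1, 1] to match Tanh

for step in range(3):  # a few illustrative steps
    # --- train D: accept real pairs, reject pairs made by G ---
    fake = G(bw).detach()
    d_real, d_fake = D(bw, color), D(bw, fake)
    loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # --- train G: make D accept the pairs it produces ---
    d_out = D(bw, G(bw))
    loss_g = bce(d_out, torch.ones_like(d_out))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    print(f"step {step}: loss_D={loss_d.item():.3f} loss_G={loss_g.item():.3f}")
```

The real Pix2Pix refines this template in several ways, but the conditioning idea, pairing each generated image with the input that produced it, is the same.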

Thus Isola discovered how to translate an image of one sort into another sort: Pix2Pix, pixels to pixels. As he puts it, all those “little problems in computer vision were just mapping of pixels to pixels.” While style transfer transfers the style of one image onto another, creating an image “in the style of” a painting by Picasso, for example, Pix2Pix goes further. Like Leon Gatys, who invented style transfer, Isola is interested in perception, how we see. […]

Read full article at The Page 99 Test