Multi-platform web experiment (WebGL, WebRTC, HTML5) that generates a Mondrian-esque world to take snapshots from
Infinite Mondrian
"By virtue of the grid, [Mondrian's work] is presented as a mere fragment, a tiny piece arbitrarily cropped from an infinitely larger fabric." —Krauss. Infinite Mondrian marries the infinity supposed by computation and Mondrian.
Infinite Mondrian is a multi-platform (desktop, mobile, desktop + mobile, Google Cardboard VR) web experiment that uses WebGL (THREE.js), WebRTC (PeerJS), and other newer web APIs (gyroscope, device orientation, fullscreen, pointer lock, etc.) to create a procedurally generated 3D world that resembles a Mondrian painting. Infinite Mondrian lets users create their own "Mondrian painting" by taking a picture of a cross section of this infinite world, then claiming it as their own and sharing it on social media.
This page covers the technical aspects of Infinite Mondrian. For the art aspect, see the blog post.
Example shots taken in the app
Usage
- On a desktop: use the mouse to navigate the world and the keyboard to manipulate the camera (the keys are the letters shown on the buttons below the instructions).
- On a mobile device: use the phone's gyroscope to navigate the world.
- On a desktop + mobile: open the app on a desktop, then on a mobile device visit the URL shown under Use a Remote Controller (or open it in another tab to test if a mobile device is unavailable). The mobile device remotely controls the app on the desktop: its gyroscope navigates the world and its on-screen buttons manipulate the camera. Requires Android (perhaps other platforms will support WebRTC one day). The mobile device acts as an avatar for a "real camera" into the world.
- On a Google Cardboard: on a mobile device, open the app and tap the enable button. Make sure the device is in landscape mode.
How it works
THREE.js is used to generate blocks and bars at random. The camera is placed inside a parent object, referred to here and in the code as the cameraman, whose quaternion can be manipulated with the mouse or by a mobile device's gyroscope. WebRTC, through PeerJS, is used to transfer a mobile device's gyroscope values with low latency when it acts as a remote control.
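As an illustration only (not the project's actual code), the cameraman pattern and the PeerJS relay might look like the sketch below; the peer id, message payload, mouse sensitivity, and PeerJS host/port are assumptions.

```javascript
// Sketch: a camera nested inside a "cameraman" object whose quaternion is driven
// by the mouse, plus a PeerJS data channel relaying gyroscope values.
// Assumes the global THREE and Peer objects from THREE.js and PeerJS.
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);

var cameraman = new THREE.Object3D();
cameraman.add(camera);
scene.add(cameraman);

// Mouse look: accumulate yaw/pitch and write them into the cameraman's quaternion.
var euler = new THREE.Euler(0, 0, 0, 'YXZ');
document.addEventListener('mousemove', function (e) {
  euler.y -= (e.movementX || 0) * 0.002;   // sensitivity is an arbitrary choice
  euler.x -= (e.movementY || 0) * 0.002;
  cameraman.quaternion.setFromEuler(euler);
});

// Remote control: a phone sends its orientation over a PeerJS data channel.
// 'desktop-peer-id' and the payload shape are hypothetical.
var peer = new Peer({ host: 'localhost', port: 9000 });
var conn = peer.connect('desktop-peer-id');
window.addEventListener('deviceorientation', function (e) {
  if (conn && conn.open) conn.send({ alpha: e.alpha, beta: e.beta, gamma: e.gamma });
});
```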
The cameraman is moved forward every frame along its local z-axis, which is oriented by the aforementioned quaternion. Every frame, all 400 blocks/bars are checked against the camera's position. If a block is behind the camera (based on the angle between the camera's forward vector and the direction to the block) and farther from the camera than the fog distance, it is moved to a random position in front of the camera beyond the fog.
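A minimal sketch of that recycling step, with assumed names and a made-up fog distance, might look like this:

```javascript
// Sketch: move the cameraman forward along its view direction, then respawn any
// block that is both behind it and farther away than the fog.
var FOG_DISTANCE = 100;   // assumed value; the real one comes from the scene's fog settings
var forward = new THREE.Vector3();
var toBlock = new THREE.Vector3();

function updateWorld(cameraman, blocks, speed) {
  forward.set(0, 0, -1).applyQuaternion(cameraman.quaternion);
  cameraman.position.addScaledVector(forward, speed);

  blocks.forEach(function (block) {
    toBlock.subVectors(block.position, cameraman.position);
    var behind = toBlock.dot(forward) < 0;              // more than 90° off the view direction
    var farEnough = toBlock.length() > FOG_DISTANCE;    // already hidden by the fog
    if (behind && farEnough) {
      // Respawn at a random position in front of the camera, beyond the fog.
      block.position.copy(cameraman.position)
        .addScaledVector(forward, FOG_DISTANCE + Math.random() * FOG_DISTANCE)
        .add(new THREE.Vector3((Math.random() - 0.5) * 200,
                               (Math.random() - 0.5) * 200,
                               (Math.random() - 0.5) * 200));
    }
  });
}
```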
Cardboard view is done using THREE.js' stereo effect.
Gyroscope control is based on THREE.js' device orientation controls.
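Assuming the stock helpers from the THREE.js examples folder (StereoEffect and DeviceOrientationControls), the wiring might look roughly like this; the project may combine them differently:

```javascript
// Sketch: StereoEffect renders a left/right eye pair for Cardboard, and
// DeviceOrientationControls copies the phone's orientation into the cameraman.
// scene, camera, and cameraman are as in the earlier sketch.
var renderer = new THREE.WebGLRenderer();
var effect = new THREE.StereoEffect(renderer);
effect.setSize(window.innerWidth, window.innerHeight);

var controls = new THREE.DeviceOrientationControls(cameraman);

function animate() {
  requestAnimationFrame(animate);
  controls.update();               // pull the latest device orientation
  effect.render(scene, camera);    // swap for renderer.render(scene, camera) outside Cardboard mode
}
animate();
```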
Blurring is done with an effect composer using vertical and horizontal blur passes, based on this StackOverflow question.
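The composer setup might look roughly like the following, assuming the example passes and blur shaders that ship with THREE.js (EffectComposer, RenderPass, ShaderPass, HorizontalBlurShader, VerticalBlurShader):

```javascript
// Sketch: render the scene, then blur it horizontally and vertically.
// renderer, scene, and camera are as in the earlier sketches.
var composer = new THREE.EffectComposer(renderer);
composer.addPass(new THREE.RenderPass(scene, camera));

var hblur = new THREE.ShaderPass(THREE.HorizontalBlurShader);
hblur.uniforms.h.value = 1 / window.innerWidth;
composer.addPass(hblur);

var vblur = new THREE.ShaderPass(THREE.VerticalBlurShader);
vblur.uniforms.v.value = 1 / window.innerHeight;
vblur.renderToScreen = true;       // the last pass draws to the canvas
composer.addPass(vblur);

// In the render loop, call composer.render() instead of renderer.render(scene, camera).
```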
Running your own
- Download the files:
$ git clone https://github.com/jssolichin/infinite-mondrian
- Install dependencies:
$ npm install; bower install
- Edit public/js/config.js and change the host and port as necessary (see the sketch after these steps). The defaults should work fine on a local machine.
- Run the app:
$ node app.js
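For reference, a hypothetical sketch of what public/js/config.js might contain; only the host and port fields are mentioned above, and the actual names and values may differ.

```javascript
// Hypothetical config.js — field names are assumptions, not the project's actual schema.
var config = {
  host: 'localhost',   // where app.js (and the PeerJS signalling server) is reachable
  port: 3000           // port app.js listens on
};
```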
Special Thanks
Originally done for Jennifer Steinkamp, with John Brumley as T.A., at UCLA DMA.