Format of sequence: MINERAL-MAMMAL-MACHINE, delineating the evolution from incoherent scratches, to mammalian cries and human-level language, to machine language.
Critters. Fauna. Fibonacci.
A baked shadow render bestiary. Alphabet ecosystem. Cross-referenced between languages. Technically: how many polygons can Unity hold if textures are baked? Will a smaller version play in Yurt if exported in FBX format to VRUI?
Source files available on request.
So as it is…
Clumps, clots, clouds, flocks, herds, cells, knots
in the blood brain earth sky stars.
In between the tube videos will be clump videos. These will occasionally be easier technically, since each cloud/flock/knot can be isolated on its own screen; no synchronized parallel renders are needed unless a clump crosses the edge of a screen.
Prototype Image: O meets E
In this image a clump of Os meets a clump of Es. Animation occurs by placing a static bend deformer on each cloud, then moving the clouds together through their constraining deformers (instead of animating the deformer, the object is moved relative to its deformer). Ambient occlusion with global illumination and final gather. 220 passes. No default lights. Source file available on request.
Organisms are tubes that eat and excrete, inhale and exhale, thinking using neuron paths, absorbing nutrients through gut tunnels. Toothpaste comes in tubes, as does astronaut food. Electricity and water travel through tubes. Languages and digital info travel through fiber-optic tubes between people.
Tubes made of language can therefore be playfully considered as organisms. Worms made flesh.
Collier and MK will apply erasure techniques to Basic Law Articles 1-23, Chapters 1-2, as a first test and give Word documents to Leoson.
(Basic Law) Technical solution for Collier:
Capture each chapter/poem as one texture and map it onto a 3D plane. I then split the plane into separate 3D objects, so that each word is treated as one object and can be animated independently: following a curve, more random, dynamic, etc.
The unselected words will disappear and the selected words will be animated in many ways: floating, waving, etc.
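The selection-and-animation step above can be sketched in plain Python. This is a minimal sketch, not the actual scene logic: the whitespace tokenizer, the `selected` word set, and the animation style names (`float`, `wave`, `follow_curve`) are hypothetical stand-ins for whatever the 3D scene actually applies.

```python
import random

# Hypothetical animation style names for the kept words.
ANIM_STYLES = ["float", "wave", "follow_curve"]

def erasure(poem, selected, seed=0):
    """Split a poem into word tokens, one object per word: words not in
    `selected` are erased (will fade out); selected words get a style."""
    rng = random.Random(seed)
    tokens = []
    for i, word in enumerate(poem.split()):
        kept = word.lower().strip(".,;:!?") in selected
        tokens.append({
            "index": i,
            "word": word,
            "kept": kept,
            # Erased words disappear; kept words are assigned an animation.
            "anim": rng.choice(ANIM_STYLES) if kept else None,
        })
    return tokens

# Usage with an illustrative (invented) line and selection:
toks = erasure("All residents shall be equal before the law",
               {"residents", "law"})
```

Each returned record maps directly onto the one-object-per-word idea: the renderer would hide the erased tokens and key the kept ones.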
Cantonese: je1: 遮
Putonghua: yue saan, used for the Umbrella Movement because saan also means separate/together: 雨傘
Saan tsui: separate/together, made of Putonghua umbrella character and together: 傘聚
Chater Garden: 遮打花園 je1 da2 fa1 yuen4. “Chater” is je1 (umbrella) da2 (hit): the park in Central where Occupy Central was originally supposed to be.
Policeman: tsi1 mo5 (慈母, literally “kind mother”); “bringing weapons” has the same sound: 持武. After the Umbrella Movement, the government says “the police are like a kind mother!” Protesters say “the police are bringing weapons!”
Use emPolygonizer to morph & mutate a short poem.
Begin to learn the software with a set of words based on a mutated great chain of being. Distribute the following words around the cylinder and morph between the English and Cantonese versions. Five-second loop video. To be placed in between other poems.
Mineral – Plants – Animals – Humans – Ghosts – Gods – Multiverse
矿物 – 植物 – 动物 – 人类 – 鬼 – 神 – 多元宇宙
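The even distribution of the seven words around the cylinder can be sketched as follows. The radius and the coordinate convention (y up, ring in the x-z plane) are assumptions for illustration, not values from the project:

```python
import math

def ring_positions(words, radius=5.0, y=0.0):
    """Place words evenly around a cylinder of the given radius,
    returning (word, x, y, z, angle_in_degrees) for each."""
    n = len(words)
    out = []
    for i, w in enumerate(words):
        theta = 2 * math.pi * i / n          # even angular spacing
        out.append((w,
                    radius * math.cos(theta),
                    y,
                    radius * math.sin(theta),
                    math.degrees(theta)))
    return out

chain = ["Mineral", "Plants", "Animals", "Humans",
         "Ghosts", "Gods", "Multiverse"]
placed = ring_positions(chain)
```

Each tuple gives a world position on the cylinder wall where the English word (and its Cantonese morph target) would be anchored.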
- I have a single-seat license for a previous version of emPolygonizer. Upgrade and install on the Toshiba.
- Produce simple film as a test using “Mineral – Plants – Animals – Humans – Ghosts – Gods – Multiverse 矿物 – 植物 – 动物 – 人类 – 鬼 – 神 – 多元宇宙”
- Experiment with interactive jumps in the film: is it smooth, etc.
- See https://vimeo.com/89911203
Use appropriated images (from http://jhave.tumblr.com/ ) to form stacked triptychs with Cantonese and English words in the gaps between the images (see https://ello.co/jhave2 for examples and sample phrases).
Same technique for remixing the images and words from http://jhavehk.tumblr.com/ . Example: the healer image uses the title of the photo beneath it. Stanzas could go on either side.
Similar technique for words from https://twitter.com/jhave2 . Note: tweets since November 2014 are stored here in a doc file that automatically appends recent tweets. No images needed. Professional translation required.
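The download step might begin with a paginated URL builder. This is a sketch rather than a working scraper: the tumblr-style `/page/N` pattern is an assumption about how these blogs paginate their archives, and fetching/parsing is left out.

```python
from urllib.parse import urljoin

def page_urls(base, n_pages):
    """Build paginated archive URLs (assumed tumblr-style /page/N):
    page 1 is the base URL itself, then /page/2, /page/3, ..."""
    urls = [base]
    for n in range(2, n_pages + 1):
        urls.append(urljoin(base, f"/page/{n}"))
    return urls

# Usage: the first three archive pages of one of the source blogs.
urls = page_urls("http://jhave.tumblr.com/", 3)
```

A real extractor would then fetch each URL and pull out post images and titles; this only shows the enumeration.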
- Use some sort of automated extraction to download all of these sites.
- Use batch auto-translate to perform rough English-to-Cantonese translation
- Dynamic load images
- Dynamic resize
- Simple animation slides
- Dynamic text
- Randomize load order but empty the array (show every image and every text before any repeat)
- Interactive trigger to shuffle image-text
- Add soundtrack from https://glia.bandcamp.com/ (experiment in consultation with Jhave)
- Long-range goal (Jhave + Leoson todo together): run a machine-learning Python script to extract patterns in the text (see http://bdp.glia.ca/t-sne-classification-of-10557-poems/ ) and a similar script to analyze the images. Then dynamically load based on proximity.
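The "randomize load but empty the array" item is the classic shuffle-bag pattern. A minimal Python sketch; the class name and seed handling are my own, not from the project:

```python
import random

class ShuffleBag:
    """Draw items in random order, exhausting every item before any
    repeat, so every image and every text is seen once per cycle."""

    def __init__(self, items, seed=None):
        self._items = list(items)
        self._rng = random.Random(seed)
        self._bag = []

    def draw(self):
        if not self._bag:                 # bag is empty: refill and reshuffle
            self._bag = self._items[:]
            self._rng.shuffle(self._bag)
        return self._bag.pop()

# Usage: an interactive trigger would call draw() for the next image-text pair.
bag = ShuffleBag(["a", "b", "c"], seed=1)
```

The same bag can back the interactive shuffle trigger: every `draw()` is random, but no item repeats until the whole set has been shown.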
Idea for micro-prototype. Status: Early test.
- Render from Maya a video loop of the word “Curved 弯” so that the word appears to be circling the 360° screen.
- Insert into Unity using video plugins
- Remap onto cylinder geometry using a material texture captured every frame.
- Render a longer video with variations of speed (acceleration-deceleration)
- Render with English-Cantonese morphs.
- Add sound synchronized to speed.
- Make video documentation using a camera on a tripod (or on a turntable rotating in the centre of the gallery). Potentially add a few people walking in the opposite direction. Change the people as the camera rotates so that the swapping is not visible. Single long shot: 1 minute. Cast: core collaborators. Date: last week of August.
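The speed variation (acceleration-deceleration) could be driven by a cosine ease. A minimal sketch, assuming the circling is expressed as a 0-1 horizontal UV offset per frame; the frame count and easing curve are assumptions:

```python
import math

def ease_in_out(t):
    """Cosine ease: maps t in [0, 1] to [0, 1], accelerating
    at the start and decelerating at the end."""
    return 0.5 - 0.5 * math.cos(math.pi * t)

def scroll_offsets(n_frames, loops=1.0):
    """Per-frame horizontal UV offset so the word circles the screen,
    speeding up then slowing down over one loop."""
    return [loops * ease_in_out(f / (n_frames - 1))
            for f in range(n_frames)]

# Usage: offsets for an 11-frame test loop.
offs = scroll_offsets(11)
```

Swapping in a different easing function changes the feel of the acceleration without touching the texture-remapping step.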
AVPro Windows Media, a multi-threaded video decoder: http://www.renderheads.com/portfolio/UnityAVProWindowsMedia/