DAT501 – t3X: 3D Interactive Twitter Visualisation
Brief: You are asked to create and present an artwork or performance in public exhibition. Your project should explore a creative use of technology, be it digital, mechanical, biological or not yet invented. It should be a response to work and ideas which you have encountered during the module either in the presentation sessions, on the field trip, or in your own research.
Artist’s Statement
t3X is an interactive, audio-responsive, three-dimensional visualisation of Twitter data. The tweets displayed are based on a randomly selected search criterion relating to the theme of ‘chance events and randomness’ as these ideas might be applied to art. Users can manipulate the three-dimensional space using physical gestures and an analogue controller.
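The statement does not detail how the random search criterion is chosen. A minimal sketch, assuming a hand-picked pool of theme-related keywords (the terms and function name below are illustrative assumptions, not the ones t3X actually uses), might look like this:

```python
import random

# Hypothetical keyword pool on the theme of 'chance events and randomness';
# the real terms used by t3X are not given in the statement.
THEME_TERMS = ["chance", "randomness", "aleatory", "generative art", "serendipity"]

def pick_search_criterion(terms=THEME_TERMS):
    """Select one search term at random, e.g. when the piece starts up."""
    return random.choice(terms)

query = pick_search_criterion()
```

The selected `query` would then be passed to the Twitter search API to fetch the tweets that populate the scene.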
The aim of t3X is to provide a unique and playful representation of information that we are accustomed to interacting with via a flat, two-dimensional interface and manipulated through the ubiquitous computer peripheral/mobile device screen. A Kinect camera and dial controller capture the user’s physical movements, allowing them to navigate the three-dimensional interface using gestures. t3X is, in part, an aleatory piece with the final presentation of the work being left to chance. In this case, the random motion of the tweets and the appearance of the structural geometry is influenced by the level of sound in proximity to the piece.
t3X is designed to captivate and engage an audience. The dial controller unit features a flashing LED which invites interaction and experimentation, allowing the user to zoom in and out of the three-dimensional scene. The Kinect camera establishes and then tracks the closest point to it; software then uses this information to drive the data visualisation. This technique means that any body part (or any object) can be used as a controller. Hold out your hand and you can manipulate the Twitter visualisation à la ‘Minority Report’. Incline your head towards the camera and you can look around the scene, peering behind, above and below the gently swaying tweets. Clapping your hands or shouting will cause the tweets to jump out at you before slowly falling back away from you. At the same time, the structural geometry used to link temporally-related tweets and define the central status box will appear to vibrate in time with the sound, much like the vibration of guitar strings.
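The closest-point and audio-responsive behaviour described above can be sketched roughly as follows. This is an illustrative reconstruction, not the actual t3X code: the function names, the depth units (Kinect depth frames report millimetres, with zero marking an invalid reading), and the thresholds are all assumptions.

```python
import numpy as np

def closest_point(depth_frame, min_valid=400):
    """Find the pixel nearest the camera in a Kinect-style depth frame.
    Values below min_valid (including zeros) are treated as invalid
    readings and masked out before taking the minimum."""
    sentinel = np.iinfo(depth_frame.dtype).max
    valid = np.where(depth_frame >= min_valid, depth_frame, sentinel)
    idx = np.unravel_index(np.argmin(valid), valid.shape)
    return idx, int(valid[idx])  # (row, col) and depth in mm

def audio_displacement(sound_level, threshold=0.6, strength=120.0):
    """Map a normalised 0..1 sound level to a z-axis 'jump' for the
    tweets; below the threshold the tweets stay at rest."""
    if sound_level < threshold:
        return 0.0
    return (sound_level - threshold) / (1.0 - threshold) * strength

# A toy 4x4 depth frame: the nearest valid point is 500 mm at (1, 2),
# and the zeros represent invalid Kinect readings.
frame = np.array([[0, 900, 800, 700],
                  [850, 600, 500, 0],
                  [0, 950, 760, 640],
                  [880, 0, 720, 990]], dtype=np.uint16)
pos, depth = closest_point(frame)
```

Each new depth frame yields a tracked point that steers the camera or cursor, while the microphone level feeds `audio_displacement` to push the tweets towards the viewer when someone claps or shouts.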
The inspiration for this project derives from research into the work of artists such as John Cage, Jonathan McCabe, Paul Prudence and Sabrina Raaf. Specific Twitter visualisations that inspired me were ‘Just Landed’ (Thorp, 2009) and XZZ’s ‘We Day’ (XZZ, 2014). All of these artists incorporate elements of chance and randomness in their work.
John Cage (Cage, 2017) is well known for creating works of art and performances in which the audience becomes part of the composition and random factors, such as background sounds, determine the finished piece.
Jonathan McCabe is a generative artist living in Canberra, Australia. He is particularly interested in theories of natural pattern formation and their application to computer art and design. “Computers are seen to be logical and deterministic, but I believe there is potential for surprise there. It is known that completely deterministic systems can be unpredictable” (McCabe, 2017).
Paul Prudence is an audio-visual performer working with generative video environments and abstract soundscapes. “I can control the amount of randomness, I drive it, but it also has a mind of its own. I like to be surprised during the performance” (Prudence, 2017).
Sabrina Raaf’s ‘Translator II: Grower’ (Raaf, 2017) is a work activated by chance factors. A robot draws green lines at the base of a wall, and the height of each line is determined by the level of CO2 in the room. The act of observing the artwork provides the chance stimulus that drives the artistic process.
Ultimately, the aim of t3X is to engage the audience. During the exhibition, I hope to see people having fun experimenting with the gestural control mechanism and the audio-responsive aspects. The output will be projected onto a side wall, which I hope will attract passers-by and encourage others to have a go with the t3X Twitter visualisation.
References
- Cage, J. (2017). John Cage :: Official Website. [online] Johncage.org. Available at: http://johncage.org/ [Accessed 18 May 2017].
- McCabe, J. (2017). Jonathanmccabe.com. [online] Available at: http://www.jonathanmccabe.com/ [Accessed 18 May 2017].
- Prudence, P. (2017). Paul Prudence. [online] Paulprudence.com. Available at: http://www.paulprudence.com/ [Accessed 18 May 2017].
- Raaf, S. (2017). Sabrina Raaf :: New Media Artist. [online] Raaf.org. Available at: http://raaf.org/projects.php?pcat=2&proj=4 [Accessed 18 May 2017].
- Thorp, J. (2009). Just Landed: Processing, Twitter, MetaCarta & Hidden Data | blprnt.blg. [online] Blog.blprnt.com. Available at: http://blog.blprnt.com/blog/blprnt/just-landed-processing-twitter-metacarta-hidden-data [Accessed 21 May 2017].
- XZZ (2014). XZZ – WeDay Twitter Visualization. [online] Xzz.ca. Available at: http://www.xzz.ca/projects/weday/ [Accessed 21 May 2017].