System Diagram:
Interaction Flow:
Dimensions:
Prototype Testing
ICM Framework
Yuguang Zhang’s Life During and After ITP
An After Effects animation project with fellow ITP student Xiaotong Ma.
For the next animation project, I’ll be working with fellow ITP student Xiaotong Ma to tell the story of a dinosaur. The dinosaur was born in, and later broke away from, ITP. It escaped from the Tisch building, crashed its way through New York City, and eventually took down the Statue of Liberty.
In telling this story, we are trying to mimic scenes from different video games: we recreate them using photos and Google Earth imagery, incorporate them into the animation, and have our dinosaur travel through them. Below are the storyboards for this story:
Note: the shape with the letter D inside represents the main character, the dinosaur, in our storyboards.
Board 1
Board 2
Board 3
Board 4
Board 5
Board 6
Board 7
Board 8
Board 9
Board 10
Board 11
Board 12
Board 13
Board 14
Board 15
Board 16
Board 17
Board 18
Board 19
Board 20
Board 21
After collecting feedback on the second version of the bird cage device, we discovered the following issues with its design:
After some discussion with Nick over the weekend, we decided to abandon the idea of making a physical representation of the Twitter API by weighing how many people tweet about two given topics, and to focus instead on one of the most essential actions of owning a bird cage: putting a bird into it. Based on this, we made a cardboard prototype to simulate a bird-luring process. By throwing in hashtags of a topic on Twitter, we lure an invisible bird, one that represents the Twitter data on that topic, into the bird cage.
Following this design and a quick round of feedback from Tom, we further discussed which possible interactions could both take advantage of people’s bird-keeping behavior and, at the same time, connect it to intellectual activities that cannot be achieved by merely keeping a physical bird from nature. I realized that since the bird is a representation of Twitter data, it is also a representation of the public’s opinion on a topic.
In the real world, by feeding a bird some food and observing how it reacts, we can tell whether the bird likes it or not. In the same sense, if we feed the public data bird a particular kind of thinking, then by observing the bird’s reaction, we can get a physical glimpse into the public’s opinion of, or attitude toward, that thinking.
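The feeding mechanic described above can be illustrated with a small sketch. Everything here is invented for illustration: the stance values, the sentiment scores, and the reaction labels are placeholders, not the project’s actual implementation.

```python
# Sketch: gauge the "data bird's" reaction to a fed thought by comparing
# the thought's stance against an aggregated public sentiment score.
# Scores range from -1 (against) to +1 (in favor); all values are invented.

def bird_reaction(fed_stance, public_sentiment, threshold=0.3):
    """Return how the bird 'reacts' to a fed thought.

    fed_stance: +1 if the fed thought supports the topic, -1 if against.
    public_sentiment: aggregated sentiment of the public's tweets.
    """
    agreement = fed_stance * public_sentiment
    if agreement > threshold:
        return "eats happily"      # public largely agrees with the thought
    if agreement < -threshold:
        return "rejects the food"  # public largely disagrees
    return "pecks hesitantly"      # public is divided

# Example: feeding a supportive thought when public sentiment is -0.6
reaction = bird_reaction(+1, -0.6)  # the bird rejects the food
```

The threshold is what makes the bias-revealing moment possible: a viewer who expects the bird to eat happily, but watches it reject the food, sees the gap between their own community’s opinion and the wider public’s.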
Further, I realized that this provides an opportunity for us to measure, or rethink, how our perceptions of social phenomena differ from those of the larger public. Living in a physical environment, the opinions we hold are inevitably influenced by the people we engage with on a daily basis. For instance, the result of the recent midterm election in New York turned out to be quite different from many people’s predictions at NYU. For someone living in a liberal community inside a liberal state, it is not uncommon for their gauging of the public’s attitude to be somewhat biased. This gives us a good opportunity to reveal that bias through the act of feeding a public data bird.
So for version 3, we’re planning to create an invisible bird that lets users guess at, and reflect on their guesses about, the public’s attitudes toward controversial topics on social media, as shown below:
After collecting feedback from the lovely ITP community, I realized that the previous design, which aligned AR with the physical world, is a bit far-fetched, for the following reasons:
Given the above comments, I realized that emphasizing the “interaction between moving AR objects and physical objects” is perhaps not a very good idea; at the very least, it requires other design elements to make the two work well together.
After a discussion with my ITP classmate Nick Tanic, I realized that instead of focusing on the “movement relationship” between AR and the physical world, focusing on their difference in “visibility” might be a better idea. As we know, physical objects are present all the time, while AR objects still require an interface or medium, like an AR-enabled phone, to actually be seen. Since Nick had an idea about using bird cages to visualize tweets, it rang a bell for me: the bird cage could become a wonderful stage for playing with this difference in visibility. So we decided to collaborate on our finals, and here comes the design: an AR-enabled bird cage system that visualizes the quantitative difference between any two given topics/items/phenomena and reveals trends in people’s topic selection.
Major Functional Elements
The concept of this installation originates from an idea about making a tool to measure social phenomena. It consists of four major parts:
How it works
The bird cage installation is made to measure social trends on Twitter. The audience comes up with two hashtags they want to compare (e.g. pizza vs. burger, or Trump vs. Hillary) and assigns each hashtag to a cage. The up-and-down movements of the two bird cages are synchronized with the number of people tweeting about each topic. To see what is being compared visually, the audience pulls out their phones and enables AR to reveal what is inside the bird cages.
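The synchronization between tweet counts and cage movement can be sketched roughly as follows. This is a minimal illustration, not the installation’s actual code: the travel range is an assumed value, and fetching the counts from the Twitter API is left out.

```python
# Sketch: map the relative tweet volume of two hashtags to cage heights,
# so the more-tweeted topic's cage rises and the other's sinks.
# MAX_TRAVEL_CM is an assumed vertical range for each cage.

MAX_TRAVEL_CM = 30.0

def cage_heights(count_a, count_b, max_travel=MAX_TRAVEL_CM):
    """Return (height_a, height_b) in cm, proportional to tweet share."""
    total = count_a + count_b
    if total == 0:
        # No tweets on either topic: rest both cages at the midpoint.
        return max_travel / 2, max_travel / 2
    share_a = count_a / total
    return max_travel * share_a, max_travel * (1 - share_a)

# Example: #pizza tweeted 300 times, #burger 100 times
h_pizza, h_burger = cage_heights(300, 100)  # -> (22.5, 7.5)
```

Mapping the *share* of tweets rather than raw counts keeps the cages within their physical travel range no matter how busy the topics get.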
Since the audience performs two tasks, the topic assignment and the AR observation, doing them in that order would be boring and less surprising, because the assignment process somewhat gives away the AR content. It would be more interesting to reverse the sequence and interpret it in the context of a larger audience. A possible workflow would be:
This creates an interaction between the installation and the audience flow over time.