
Magic Windows & Mixed up Realities

Week 3

How would I augment my space?

I imagine an interactive exhibit that features characters and objects that together tell a specific story. There would be a large gallery space featuring both 2-D images (photographs and collages) and 3-D printed objects and sculptures. The images on the walls would act as framed scenes. When a person wearing a headset approaches a specific image, that image would become augmented, creating an experience where the scene comes alive and begins to perform. When objects are approached, a brief explanation and example of the object's purpose would be shown. In order to make the installation interactive beyond the AR, in another space within the gallery, visitors would be
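If this concept were prototyped in Unity (as in the later weeks), the approach-to-activate behavior could be as simple as comparing the headset camera's distance to each framed scene. The sketch below is purely illustrative; the component name, the "Perform" animator parameter, and the two-metre threshold are all assumptions, not part of an existing build.

```csharp
using UnityEngine;

// Hypothetical sketch: play a framed scene's animation while the viewer
// (the headset camera) stands within a trigger distance of the frame.
public class FramedSceneTrigger : MonoBehaviour
{
    [SerializeField] Animator sceneAnimator;      // the animated "scene" parented to this frame
    [SerializeField] float triggerDistance = 2f;  // metres; tune per installation

    void Update()
    {
        // Camera.main is the headset's tracked camera in a typical XR rig.
        float distance = Vector3.Distance(Camera.main.transform.position, transform.position);
        bool viewerIsClose = distance < triggerDistance;

        // The scene "performs" only while someone is standing in front of the frame.
        sceneAnimator.SetBool("Perform", viewerIsClose);
    }
}
```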

Reference video here.

This video depicts a gallery space filled with 2-D images on the walls, and people viewing those images through AR on their phones. On the wall, the images appear static, with no movement. When the images are viewed through the phones, each one becomes digital, with motion, glitching, and so on. I think this idea can be pushed further by literally having the images come off the wall and into the space.

Here is another reference for how this idea could be expanded, along with stills from the video.




 

[Three stills from the reference video]

Week 4

Practicing with AR Foundation and Image Tracking

After days of getting the AR Foundation image-tracking scene in Unity to work, I wanted to try tracking the image below.

[Target image: Justice.jpg]

The idea was to set the above image as the tracked target and have text appear to emerge from the megaphone. The text is shown below.
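In broad strokes, the image-tracking logic looks like the sketch below: a script listens for the ARTrackedImageManager to recognize the target and then parents a text prefab to the tracked image. The component and field names are placeholders, and the vertical offset is just an illustrative value, not the exact setup from my scene.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical sketch: when the tracked image is recognized, spawn a text object
// and parent it to the image so it appears to rise out of the megaphone.
public class MegaphoneText : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager trackedImageManager; // Justice.jpg lives in its reference image library
    [SerializeField] GameObject textPrefab;                     // e.g. a 3D text object with the phrase

    void OnEnable()  { trackedImageManager.trackedImagesChanged += OnTrackedImagesChanged; }
    void OnDisable() { trackedImageManager.trackedImagesChanged -= OnTrackedImagesChanged; }

    void OnTrackedImagesChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (ARTrackedImage trackedImage in args.added)
        {
            // Offset slightly above the image plane so the text floats over the megaphone.
            GameObject text = Instantiate(textPrefab, trackedImage.transform);
            text.transform.localPosition = new Vector3(0f, 0.05f, 0f);
        }
    }
}
```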

[Screenshot of the text that appears from the megaphone]

Week 6

Final Project: MWMUR

For our final project, Patrick and I worked on plane mapping using Vuforia Engine inside Unity. Initially, Patrick was interested in using AR video portraits to tell a story around gentrification.

By using image targets to create ghost-like images, we wanted to create a space for displaced people who once occupied historically impacted neighborhoods. We decided to reframe the idea by using image targets to display video portraits of students on the ITP floor. We were interested in filming students occupying their workspaces, and pairing the video with audio that narrates their inner thoughts about being on school grounds under the circumstances of Covid-19.

We continued with this idea by setting up a short mock-up to illustrate our narrative. We used a 5D Mark III to shoot video footage of me working at the ER, and synced it with audio from the voice memos on my iPhone. In the long run, the concept would develop by featuring more students who frequent the floor, and using AR to digitally populate the floor with ghost-like video.
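As a rough sketch of how each portrait could work in Unity: a VideoPlayer on a quad parented to the image target plays the footage, and a separate AudioSource carries the voice-memo narration. The component below is a hypothetical illustration; its PlayPortrait/PausePortrait methods would be wired to the image target's found/lost events in the Inspector (the exact event names depend on the Vuforia version), and the ghost-like look would come from a semi-transparent material on the quad.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Hypothetical sketch: a ghost-like video portrait anchored to an image target.
// PlayPortrait / PausePortrait are meant to be hooked up to the target's
// found / lost events in the Inspector.
public class GhostPortrait : MonoBehaviour
{
    [SerializeField] VideoPlayer portraitVideo;   // the 5D footage, rendered on a quad over the target
    [SerializeField] AudioSource narrationAudio;  // the voice-memo "inner thoughts" track

    void Awake()
    {
        portraitVideo.playOnAwake = false;
        portraitVideo.isLooping = true;
        narrationAudio.playOnAwake = false;
        narrationAudio.loop = true;
    }

    // Called when the image target is found.
    public void PlayPortrait()
    {
        portraitVideo.Play();
        narrationAudio.Play();
    }

    // Called when the image target is lost.
    public void PausePortrait()
    {
        portraitVideo.Pause();
        narrationAudio.Pause();
    }
}
```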

 
