Samantha Lam


This application offers a set of five AR widgets that render themselves on specified target images from Vuforia. Pointing the camera at a target image renders the corresponding widget on the image. The widgets display real-time information that updates every 30 seconds. The five widgets are: a calendar, a clock, a thermometer and hygrometer, a weather vane, and a weather condition display. The calendar and clock widgets display the system date and time, respectively. The calendar widget is controlled by the astronaut marker, and the clock widget uses the drone marker. The other three widgets get their information online via an API. The thermometer and hygrometer widget displays the current temperature (Fahrenheit) and humidity (percentage) both in 3D and as text. This widget is rendered on the fissure marker. The weather vane shows wind direction (one of the cardinal or ordinal directions) and wind speed (mph), and it is controlled by the oxygen marker. To use it, line up the red arrow so that it points north. The last widget is the weather condition display, which uses the Mars marker. It shows one of nine main conditions as a 3D icon. Each condition has a unique sound that loops, and animated rain falls for three of the conditions. Pressing the left and right arrow keys on a keyboard cycles the 3D icon through all nine conditions.
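The weather vane's direction label and the arrow-key cycling can be sketched in plain Python. This is only an illustration (the project itself is C# in Unity), and the direction and condition names here are assumptions, not the project's actual identifiers:

```python
# Illustrative sketch only; the real project is written in C# for Unity.
# Snaps a wind bearing (degrees) to one of the eight cardinal/ordinal
# directions, and cycles through the nine weather conditions with wrapping.

DIRECTIONS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

def bearing_to_direction(degrees: float) -> str:
    """Return the compass direction nearest to a 0-360 degree bearing."""
    index = round((degrees % 360) / 45) % 8
    return DIRECTIONS[index]

# Hypothetical condition names; the widget shows one 3D icon per condition.
CONDITIONS = ["clear", "cloudy", "rain", "drizzle", "thunderstorm",
              "snow", "mist", "fog", "windy"]

def cycle_condition(current: int, step: int) -> int:
    """Move left (-1) or right (+1) through the conditions, wrapping around."""
    return (current + step) % len(CONDITIONS)
```

In Unity, `cycle_condition` would be driven by the left/right arrow key events, with the returned index selecting which 3D icon, loop sound, and rain animation to enable.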


The project source code can be found on GitHub here.

The application was made with Unity and Vuforia Engine. The supported versions are Unity 2019.4.1f1 and Vuforia 9.4. To build the application, download Unity Hub and install the supported Unity version through it. Then, download the supported Vuforia version from its website and run the installer. Open the project in Unity Hub and press Play at the top middle of the screen. Pointing the webcam at a target image will then render a widget on top of it.



Once people are regularly wearing AR glasses, these widgets could become quite popular. They save space, could be highly customizable, and could potentially reduce spending. Replacing a physical object with an AR widget that does the same job may eliminate some clutter and make spaces look a lot neater. Other objects can pass through the AR widgets, so a virtual desk clock won't block mouse movement or need to be pushed aside to set a cup down. The AR objects are also collapsible, making them extremely portable. Instead of carrying a weather vane around, carrying a piece of paper with the AR target would suffice. People would not have to carry several bulky instruments to measure things in the world around them.

In addition, these widgets are highly customizable because they are virtual. The entire look of a widget could be changed with a click of a button. This allows the same widget to grow with someone as their tastes change. The objects could also be user-specific or communal, with everyone being able to see them. Different objects could appear in the same place for different people. If everyone could customize how objects looked to them, they could easily alter the aesthetics and atmosphere of rooms to suit their tastes.

The AR widgets are also harder to break. Knocking the target image onto the floor won't break a decorative vase or mercury thermometer widget; it just moves the widget to the floor. The item would not need to be replaced, and people would not have to spend money unnecessarily after that small incident. AR widgets also reduce the overall number of objects that people need to physically own. Beyond just changing how a clock looks, people could use the same space and target image to render a different object the next day. This would impact consumerist culture in that people could alter what they already own instead of buying new things. In addition, because the AR widgets would be entirely virtual, material sourcing and production labor would be affected as well.

For me, how useful these widgets are depends on the way they display information. I think that widgets that just show information textually like the calendar or clock are not that useful to me. This is mostly because I am usually on my laptop or have my phone nearby, so the widgets would not be providing me anything new. However, they do have one great advantage over physical objects in that their appearance can easily be altered depending on the mood. On the other hand, I would actually like widgets that show 3D representations of the information. The weather one would be nice to have because it shows rather than tells what the weather is with 3D models. It is also unique in that a physical item that shows the weather and updates in real time is hard to create. Another widget I would like to have is a timer that would show a physical representation of how much time remained, like an hourglass. As a physical item, an hourglass would be impractical, especially if a specific time cannot be set easily. However, it would make a perfect 3D widget. I would mostly use these widgets at my desk, but they could easily be used anywhere.


This application provides a virtual playset with a figure in her bedroom. The playset itself renders on top of the Astronaut Target Image from Vuforia. Pointing the camera at the target renders the figure and all the bedroom furniture. The figure was made using MakeHuman, with added textures for clothing. The bedroom has two chairs, a desk, a bed, a bedside table, a table lamp, a floor lamp, a bookshelf, a dresser, and an ottoman, all built out of Unity primitives and using the textures listed in the sources. The figure talks on the phone in her bedroom indefinitely. When the virtual button by her feet is pressed, there is a crash and the figure is startled. She tentatively asks “What was that?” before looking around. Finding nothing of too much concern, she shrugs and goes back to talking on the phone. All of the animations were imported from Mixamo, as listed in the sources. In addition, whenever the scene is clicked, red balls fall out of the sky. If any of those balls happen to collide with the figure or any of the furniture, they bounce off and roll away.

Another target image, the Drone from Vuforia, can also be used to define the edges of the table. Lining the image up with the bottom left corner of the table allows the red balls to roll off the edge of the table relatively organically, since the application can tell where the edge is. This can be used in combination with the table top playset scene so that the balls hit the ‘floor’ of the bedroom instead of falling into oblivion every time.
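The edge-of-table logic can be sketched in plain Python. This is a minimal illustration, not the project's actual code (which is C# in Unity); the table dimensions and function names are hypothetical:

```python
# Illustrative sketch only; the real project is C# in Unity, and these
# dimensions are made up. The Drone marker at the table's bottom left
# corner anchors the table rectangle; a ball is supported by the table
# surface only while it is inside that rectangle, and falls otherwise.

TABLE_WIDTH = 1.2   # meters from the marker along +x (hypothetical)
TABLE_DEPTH = 0.8   # meters from the marker along +z (hypothetical)

def on_table(ball_x: float, ball_z: float,
             corner_x: float, corner_z: float) -> bool:
    """True if the ball is over the table, so it should not fall off."""
    return (corner_x <= ball_x <= corner_x + TABLE_WIDTH and
            corner_z <= ball_z <= corner_z + TABLE_DEPTH)
```

In the Unity scene the same effect is achieved with a collider sized to the table, so balls that roll past its bounds simply keep falling under physics.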

There is also the life-sized playset, a recreation of the table top playset at a larger scale. This playset is rendered on the ground using Vuforia's ground plane mechanic. After running the application on a phone, tapping a spot on the ground will render the life-sized playset there. The life-sized playset currently does not have the same functionality as the table top playset.


The project source code can be found on GitHub here.

The application was made with Unity and Vuforia Engine. The supported versions are Unity 2019.4.1f1 and Vuforia 9.4. To build the application, download Unity Hub and install the supported Unity version. Then, download the supported Vuforia version from its website and run the installer. Open the project in Unity Hub and choose which scene to play: the table top playset or the life-sized version. Then, press Play at the top middle of the screen.

When using the Playset Scene, pointing the webcam to the Astronaut Target Image will make the table top playset appear on top of it. The Drone Target Image, when placed at the bottom left corner of the table, will map the edges of the table so that balls can roll off the edge properly.

With the Life Sized scene, download the appropriate phone build support from Unity Hub by adding the correct module to the version of Unity. Then, go to File/Build Settings in the project in Unity. Choose the correct platform and click Switch Platform. For Android, go to Edit/Project Settings/Player/Other Settings and select Android 4.4 KitKat as the Minimum API Level. Connect the Android phone to the computer. On the phone, enable USB debugging under Developer options. Then, back in Unity, go to File/Build Settings and Refresh. Select the phone in the dropdown menu and hit Build And Run. The application should run on the phone automatically. Using the phone, look around until the reticle appears. Tapping it should render the life-sized playset.



I think these playsets have the potential to be very popular several years in the future, when people are regularly wearing AR glasses. They are very easily portable, so children could have interactive playsets with them anywhere. I can see these playsets being very similar to iPads now, except the playsets are in AR. The interaction level between children and these playsets is about the same as with an iPad, as both provide only visual and auditory feedback. The themed sets could function like different apps, and pressing virtual buttons could mimic tapping on an iPad screen. The playsets have an added feature in that they display things in 3D rather than just 2D. This combines normal playsets with applications that have a lot more functionality than regular playsets do. For example, having a Paw Patrol playset is great, but being able to press a button and actually see Chase running around and barking in a virtual playset would be a lot more interactive and fun.

Children would definitely be interested in interacting with a little figure that looks like them. Much like American Girl dolls that are tailored to resemble the children who get them, the high customizability of these AR figures would be a big hit. This is also really great for representation: because the figurines are customizable, children would not have to search for dolls that resemble them. Also, children love playing make-believe, and having their own little figure that they can put into any scene is a perfect toy. The only problem is that, right now, it does not provide any physical feedback. This could be a downside, since younger children might not have as much fun interacting with something virtual. Increasing the interactivity beyond just one button press, though, could definitely make it more appealing, especially since it could mimic how young children play with iPads by just tapping things on screen.

In addition to being children's toys, these playsets could be big collector's items as well. Just as people collect Gundam models or anime figurines, these playsets could serve as displays or trinkets that are interactive. For example, with the press of a virtual button, a magical girl figurine could transform.

Seeing myself as a small figure is a lot different from seeing myself life-sized in AR. This is mostly because the model I created is not the most accurate to how I look in real life, and making the figure bigger makes those flaws more apparent. Increasing the realism of AR figures is a tricky task, since people are really good at distinguishing what is and is not human. AR figures have gotten really close to what actual humans look like, but any slight disproportion in the human frame could lead to an unwanted trip to the Uncanny Valley. I want the rendering of my image to be more lifelike, but not so much so that my doppelganger hailing from the Uncanny Valley haunts me.


This application allows the user to walk around and explore a classroom completely in virtual reality. In a corner of the classroom, there is a little library with three bookshelves and a table with many books. There are also two comfy chairs in the room, a jukebox that plays accordion music when touched, and a flower pot.

A model of me sits idly at a desk in the classroom. Around me are ten small objects that the user can grab and that fall when dropped: my laptop, notebook, phone, textbook, skateboard, soda, food, radio, mouse, and cactus. As the user gets closer to my desk, I can be heard tuning the radio, trying to find something good to listen to.

There are also five other characters in the scene: the professor and four other students (Jason, Jake, Jocelynn, and Jade). Jason sits in the library writing notes, Jocelynn stands in the back of the classroom pondering the material, Jade sits idly in the corner, and Jake stands in the VIVE area talking on the phone. They are all appropriately socially distanced and have their own colliders.

Within the scene, there are two sets of lights: the overhead ones and the four lamps scattered throughout the room. The overhead lights are controlled by two light switches at the front of the room, while the four lamps are controlled by the switches at the back of the room. The lamps give off an orange glow to contrast with the white overhead lights. These two sets of lighting work independently of each other.


The project source code can be found on GitHub here.

The application was implemented in Unity version 2019.4.1f1 with VRTK 3. To build the application, download Unity Hub and get the proper Unity version. This can be accomplished by going to the Unity website, downloading UnityHubSetup, and launching it. Then, within the Hub, click Installs and add a new Unity version. Find Unity 2019.4.1f1 in the download archive and install it. Once Unity Hub and the correct version of Unity are both downloaded, open the project in Unity Hub and open the EVL scene from Assets/virtualuic-evl/Scenes by dragging it in next to the default scene. Then, press Play at the top middle of the screen.

Within the scene, the controls are listed in the top-left of the screen. Movement and interaction are both available with keyboard and mouse, so no VR headset is required. WASD moves the user, and the mouse looks around the room. Holding the Alt, Ctrl, and Shift keys toggles what the mouse buttons do, allowing the user to reposition their 'hands' and actually interact with the various objects in the scene.



Seeing and interacting with other people and objects in a virtual environment is something that can already be done with the many virtual reality games and applications available. As the realm of virtual reality technology advances, these applications may become more integrated into modern life as a standard for certain interactions. This comes with many advantages as well as disadvantages.

One great advantage of this VR technology is that it allows people to interact with each other in 3D. This is especially advantageous in a time of social distancing due to the pandemic. VR can simulate an actual in-person meeting while individuals remain safely quarantined. This expands on typical Zoom or phone calls so that interactions may feel more real and interactive. A team could work on or manipulate the same thing as if it were in real life. One application that could benefit from a team virtual experience is Quill, which allows users to create 3D animations while in VR. A team could work together virtually on creating 3D animations, regardless of physical location.

Another great advantage of VR interactions is that improbable situations become readily available. This allows people to experience and do things they might not otherwise be able to. Especially in times of quarantine, public activities are extremely restricted. A couple could get a short preview of their dream honeymoon destination before committing to it. People could swim with extinct animals without actually knowing how to swim. Gamers could team up and run games as a virtual party. VR takes away the need to be physically present in all of these interactions and experiences. It can connect people to limitless possibilities and allow them to experience a whole world they could otherwise never know in real life.

In addition, experiences can be repeated as many times as wanted in VR. This technology could help people relive some of their fondest memories, act as exposure therapy, or help give specialized training using minimal resources. People could experience and interact with objects within those synthetic worlds together in a more natural way than if they were restricted to watching through a screen.

While these are great benefits of VR interactions, there are still many disadvantages to be addressed. One is that, as technology advances, it may become increasingly difficult to differentiate what is real from what is virtual. This could lead to reckless behavior that people would be comfortable engaging in within virtual reality, but not in real life. For example, someone might get away with robbing a bank in virtual reality and then think they can do it in real life. Even worse, they could run the simulation over and over until they do get away with robbing the bank in VR, and then, with all that practice, successfully pull it off in real life as well.

Another disadvantage is that people may grow accustomed to virtual reality interactions and prefer them over real-life ones. Just as people have more anonymity behind a computer screen, people could prefer that over in-person experiences. This may be a grave problem in that the health of people living in VR could be negatively impacted. There are also some real-life experiences that cannot be adequately recreated in virtual reality.

Finally, a negative of complete VR interactions is that there is limited haptic feedback. Currently, the majority of VR is limited to visual and auditory feedback. This takes away a huge part of what makes real-life interactions great. Moving certain experiences to a completely virtual space would strip away that connection.

These are just some of the biggest advantages and disadvantages to VR interactions with other people and objects. While it could be great for certain things, it definitely should not be the only interaction that people have in their entire lives. While virtual reality has many great uses, there is still something that real life currently has to offer that cannot be emulated in VR.


Quill is a painting and animation tool that runs on Oculus Rift headsets. The application allows users to create paintings and animate in a virtual reality space. It was originally developed by Oculus Story Studio for their production of Dear Angelica, a story that was painted and plays out in VR. It was then developed beyond its initial internal usage to be commercially available on the Oculus Rift store.

Along with the Oculus Rift headset, hardware requirements include the Oculus Touch controllers, which allow users to utilize the virtual tools and interact with the objects on the infinite canvas provided.

Within the application, many typical tools of digital art are available. Like other platforms, Quill allows users to create different layers to build upon elements while keeping them separate. It also has a variety of different brushes available. These are elevated from the usual ones found on other platforms to allow painting and creating models in a virtual 3D space. In addition to painting tools, the application offers editing tools such as colorizers to help create works of art.

Traditional animation tools are also available within the application, some of which were elevated to work better in a 3D canvas space. Quill supports frame-by-frame animation, keyframes, and puppeteering. Editing tools for animation creation are also available, allowing users to zoom or fade in and out, cut and paste different scenes, and edit the video just like on a traditional 2D animation platform. Audio can also be separated into different spaces, mimicking a real-life experience. Different regions of space can provide different soundtracks, playing them directionally. For example, looking at different sides of a building plays different audio in accordance with what the audience can see from that side.

For animation, Quill allows users to move and pose drawn 3D models manually after they have been established. This can mimic a frame-by-frame approach to animation without having to redraw each of them.

Within Quill itself, users can capture videos and photos. The entirety of production can be done within the application, from sketching and storyboarding to final editing and viewing. Even so, if users want to use external applications for functionality not provided by Quill, it supposedly integrates well with other applications such as Photoshop, Unity, and Maya, allowing easy import and export of assets.

 Dear Angelica


Quill is a good use of VR because it allows creation, editing, and viewing of virtual reality animations all in one application. Because artists can create their works within Quill, they can see exactly what the audience will be able to see. They can imagine and create worlds and settings easily because they are immersed within them. Being within a setting can help with more expressive painting because artists could understand the mindset of the characters and their interaction with the world better.

They can also view their works from several vantage points to ensure cohesive modeling and to perfect elements that may otherwise be difficult to get right. After working on something for an extended period of time, it is sometimes difficult to see mistakes; being able to move around the space gives a fresh point of view with just a tilt of the head. It also gives artists the freedom and flexibility to create animations that can be viewed from any angle, and may even inspire them to try angles they would not have initially thought of.

This way of animating and storytelling is super immersive for the audience, as all VR can be. Even if the models are not super realistic, people can feel as if they are actually within a cartoon or comic book while watching the animations. There is depth within every scene that is shown because the audience is allowed to pause and explore things in their own time, creating a unique experience from each person’s own vantage points. In this way, the animations created can bring an experience beyond just a movie or story.

There is also an added level of detail that artists can put into their works. Because the audience can explore what is presented to them on their own, artists can hide little details for viewers to find or notice at their own pace. They can also add these details more easily by, for example, pausing the foreground and working solely on the background, or focusing on a specific layer.

Work by Goro Fujita from Quill website


The intended users of this application are artists, animators, and storytellers who want to create 3D virtual works to share with others. While it can be used by anyone who wants to create 3D animations, this application seems aimed at more casual artists and independent animators rather than companies like Disney that undertake large projects. Artists as well as people with little drawing experience can use Quill to create little 3D animations, but it may be difficult to create the highly polished works that animation companies want.

"Alex's Sci-Fi World" by Matt Schaefer from Quill website


Quill is free on the Oculus Rift store. This is great because it is widely accessible to everyone who would want to try it out. However, users are limited by the hardware requirements: the application is only available on the Oculus Rift. The headset and Touch controllers are about $300 together, with varying prices and shipping costs in other countries. If users did not have an Oculus Rift to begin with, it is unlikely that they would buy one just to use Quill. The headset also requires an additional device to connect to that runs the actual Quill application. While many people own a computer, phone, or console, those devices may not run the application well enough to meet their expectations.

An advantage of using Quill is that it eliminates the need for traditional CG knowledge in creating 3D models and animations. This is great for independent artists who may be interested in creating 3D animations but do not have the interest in or time for learning the rigging or curve manipulation needed in traditional CG animation. More casual artists can pick up Quill and create 3D animations without extra training in different applications for similar effects. Smaller teams can also benefit, since they may not have a 3D modeling team to collaborate with or hand the project off to.

According to reviews, Quill is easy to use and quite intuitive. From screenshots of the application, the interface and menus look very similar to other digital art and video editing software available. However, people who have absolutely no experience with software like Photoshop or video editors may initially find some difficulty using Quill to its full functionality. Some reviewers say that having a tutorial to get started is extremely helpful. As with any new piece of software, there are going to be difficulties learning the full extent of what can be accomplished with it. Other reviewers complain that the interface is confusing and hard to use, ruining the experience of the application. Overall, it seems that once people get some experience using Quill, they enjoy the application, even if they are not great at painting or drawing.

Just as with any other application, users have different needs and backgrounds that influence their experience with it. Animations made with Quill have a unique visual style, which might not suit certain projects. More professional and polished animations may be especially difficult to achieve because of the sketchy feel that paintings in Quill tend to have. Some reviews also say that very detailed work is hard to achieve. In my own personal experience, it is hard to be incredibly specific and precise while doing anything in virtual reality, so it is easy to see how drawings may not have lines as clean as some may expect.

Whenever I use virtual reality, I get incredibly motion-sick, which is something that reviewers have pointed out. This is a problem with a lot of virtual reality games and applications for me, so it may be difficult for people who are prone to motion sickness to use Quill for creating 3D animations and watching them in VR.

In conclusion, Quill is a really cool application for storytelling, painting, and creating animations. There are still many issues with it, but with more development, it could be a great medium for artists and animators to create captivating and immersive VR experiences for audiences. One feature that would really add to the VR animation experience is a way to cater to senses other than hearing and sight; providing haptic feedback when touching things could add to how real and immersive these stories feel. As with a lot of technology, there are always many improvements to be made, and VR is no exception. However, Quill could shake up the world of entertainment creation.