Experiential Design - Task 3: Project MVP Prototype

03.06.2025 - 06.07.2025 (Week 7 - Week 11)
Lew Guo Ying / 0365721 / Bachelor of Design in Creative Media
Experiential Design
Task 3: Project MVP Prototype

Index

    1. Lectures
    2. Instructions
        2.1 Creating MVP Features of MYSTIAR
        2.2 Progression
        2.3 Submission
    3. Feedback
    4. Reflections

Lectures

Week 7

Fig1.1 Week 7 Lectures

This week’s session focused on exporting Unity projects to iOS and Android devices for testing AR experiences. Mr. Razif walked the class through the entire setup process, including switching build platforms, configuring Player Settings, and handling device connections.

For iOS, he demonstrated how to switch platforms in Build Settings, set the company and app names, and adjust the minimum iOS version (at least iOS 15 due to Vuforia requirements). He emphasized disabling Metal API Validation and adding an AR Camera usage description. Then in Xcode, he showed how to open the exported project, sign in with an Apple Developer account, and correctly set the Bundle Identifier (e.g., com.name.app) before building to the device.

For Android, the session covered enabling USB Debugging through Developer Mode, ensuring the Android device is recognized in Unity, and removing the Vulkan API from Graphics settings. The minimum API Level (such as 31) must be selected, and key settings like IL2CPP scripting backend and ARM64 architecture were reviewed. He also advised avoiding “Both” input handling to prevent export errors.

Overall, this was a technically intensive class emphasizing that early testing is crucial. Platform-specific issues like SDK version mismatches, permission requests, and build errors are common, so it’s important to troubleshoot ahead of time. Mr. Razif encouraged students to test on actual devices as early as possible to avoid last-minute surprises.


Week 8

Fig1.2 Week 8 Lectures

1. Importing Vuforia Package and Target Database

The session began by walking us through the steps to set up a new Unity project and re-import the Vuforia Engine SDK. Two key components are required:

  • The Vuforia package itself (Unity-compatible)

  • The Image Target database downloaded from the Vuforia developer portal

Once imported, both the SDK and the database become the foundation for enabling either Image Target or Ground Plane features.

“Remember, both the package and the database need to be installed before proceeding.”

2. Setting Up the AR Camera and License Key

After importing the packages, we inserted the AR Camera into the scene (via Vuforia Engine → AR Camera) and applied the developer license key in the inspector under the Vuforia Configuration section.
This key is essential for enabling AR functionalities within the app.

3. Ground Plane Setup in Unity

Next, the tutorial focused on implementing the Ground Plane feature using two components:

  • Plane Finder

  • Ground Plane Stage

After adding both elements into the scene hierarchy, the Ground Plane Stage must be assigned to the Anchor Stage field inside the Plane Finder’s Inspector. This tells the system where to anchor the virtual content.

To test object placement, a small 3D cube was added as a child under the Ground Plane Stage, and its scale was reduced to fit naturally within the AR environment.
“Only objects parented to Ground Plane Stage can be spawned onto the ground when tapped.”

4. Build and Test on Mobile Devices

For iOS:

  • Switch platform to iOS in Build Settings

  • Update Player Settings:
    • Set company and product name (no spaces)
    • Disable Metal API validation
    • Input a short Camera Usage Description
    • Set minimum iOS version to 15 or above

  • Upon building, export to Xcode, complete signing via an Apple Developer account, and run on a physical device.

For Android:

  • Switch platform to Android and connect your device with USB debugging enabled

  • In Player Settings:
    • Remove Vulkan from Graphics APIs
    • Set minimum API level to 31
    • Enable ARM64 architecture
    • Set Input Handling to either "Old" or "New" (do not select “Both”, which causes export errors)

  • Use Build and Run to export the APK directly to the device



Week 9

Fig1.3 Week 9 Lectures

In Week 9, we advanced the AR interaction by adding tap-based scaling and ensuring stable placement using real-device testing. First, we implemented a RoomScaler script that defines both scale-up and scale-down functions, using Vector3 values for precise control. This script was attached to a Canvas, and its methods were linked to two UI buttons via the OnClick inspector.
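The RoomScaler described above might look roughly like this. This is a minimal sketch, not our exact class: the field names and the scale step value are illustrative assumptions; only the overall shape (two public methods driven by UI button OnClick events, Vector3-based scaling) comes from the write-up.

```csharp
using UnityEngine;

// Minimal sketch of the RoomScaler idea: two public methods, each linked
// to a UI button's OnClick in the Inspector, that nudge the anchored
// room's scale up or down by a fixed Vector3 step.
public class RoomScaler : MonoBehaviour
{
    // The anchored room object under the Ground Plane Stage (assumed name).
    public Transform roomRoot;

    // How much each button press changes the scale (assumed value).
    public Vector3 scaleStep = new Vector3(0.1f, 0.1f, 0.1f);

    // Linked to the scale-up button's OnClick.
    public void ScaleUp()
    {
        roomRoot.localScale += scaleStep;
    }

    // Linked to the scale-down button's OnClick.
    public void ScaleDown()
    {
        Vector3 next = roomRoot.localScale - scaleStep;
        // Clamp so the room never collapses to zero or inverts.
        if (next.x > 0f && next.y > 0f && next.z > 0f)
            roomRoot.localScale = next;
    }
}
```

Attaching this to the Canvas and wiring both buttons through the OnClick inspector, as described above, keeps all scaling logic in one place.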

A critical improvement was unchecking "Duplicate Stage" on the Plane Finder, which resolved issues where scaling didn’t apply to the correct anchored object. We also included a floor plane under the room prefab and reset the walls' anchor positions to avoid floating.

During testing, the team discovered and fixed a missing TouchEnabler script on the AR Camera to ensure tap gestures spawned the content correctly on ground detection. We verified the functionality both in the Unity Editor with the Vuforia emulator and on iOS/Android devices. The ground-plane-anchored room scaled reliably, and the experience felt smooth and spatially consistent, a big step toward our MVP.


Week 10

Fig1.4 Week 10 Lectures

In Week 10, we introduced video playback within AR scenes and gaze-based interaction. Here are the key steps:

  1. Video Player Setup

    • Imported the video clip into Unity, created a Plane inside the AR “room,” added a VideoPlayer component, and assigned the clip.

    • Initially enabled Play on Awake to test playback, then disabled it for manual control.

  2. World‑Space UI Button

    • Created a World‑Space Canvas and a "Play Video" button.

    • Attached the button to the video plane, with OnClick → VideoPlayer.Play() and set the button to deactivate itself (GameObject.SetActive(false)).

  3. Gaze‑based Activation

    • Attached a custom RaycastDetectObject script to the AR Camera.

    • Set up a LayerMask (“interactable”) on the video plane (and optionally walls).

    • The raycast detected when looking at the video plane, printed to console, and activated the Play button when pointed at.

  4. Highlighting Interaction

    • Imported improved scripts (InteractableBase, InteractableObject, CameraRaycast) to modularize behavior.

    • Assigned InteractableObject to wall or plane objects—when gazed at, they changed color or triggered button visibility, via OnInteract() and OnRevert() methods.

  5. Flow and Logic Control

    • Player must first aim at the video plane → Play button appears → tap to start video → button auto-hides.

    • Optionally, looking away stops the video and hides the button, using added logic in InteractableObject2.

This week emphasized layered interaction flow ("look → reveal UI → act"), bringing AR tutorials closer to real-world UX and enabling efficient prototyping before exporting to real devices.
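The "look → reveal UI → act" flow can be sketched as a single camera-mounted script. This is a simplified reconstruction of the RaycastDetectObject idea, not the tutorial's exact code: the field names, distance, and SetActive-based reveal are assumptions, while the centre-of-view raycast against an "interactable" LayerMask comes from the steps above.

```csharp
using UnityEngine;

// Simplified gaze-detection sketch: every frame, cast a ray straight out
// of the AR Camera and show/hide the Play button depending on whether the
// video plane (on the "interactable" layer) is in view.
public class RaycastDetectObject : MonoBehaviour
{
    public LayerMask interactableMask;  // set to the "interactable" layer
    public GameObject playButton;       // the world-space "Play Video" button
    public float maxDistance = 10f;     // assumed gaze range

    private bool wasLooking;

    void Update()
    {
        Ray ray = new Ray(transform.position, transform.forward);
        bool isLooking = Physics.Raycast(ray, out RaycastHit hit,
                                         maxDistance, interactableMask);

        if (isLooking && !wasLooking)
        {
            // Equivalent to OnInteract(): reveal the button on first gaze.
            Debug.Log("Looking at: " + hit.collider.name);
            playButton.SetActive(true);
        }
        else if (!isLooking && wasLooking)
        {
            // Equivalent to OnRevert(): hide it when the gaze leaves.
            playButton.SetActive(false);
        }
        wasLooking = isLooking;
    }
}
```

The modular InteractableBase/InteractableObject scripts mentioned above generalize this same pattern, moving the OnInteract()/OnRevert() responses onto the gazed-at objects themselves.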


Instructions

MIB For Experiential Design Module

Requirement:

For Task 3, we are required to develop a working MVP prototype to test the key features of our experiential design project. The prototype doesn't need to be fully visually designed, but it must demonstrate core functionality and help us identify potential issues and solutions.

Deliverables include:
  • A screen prototype created in Figma
  • A basic interactive MVP version of the app
  • A walkthrough video and presentation
  • A reflective post on our e-Portfolio


Figma File


MYSTIAR Prototype


Creating MVP Features of MYSTIAR

Here’s a breakdown of the MystiAR MVP prototype I’m developing together with my teammate Ho Winnie. Below are the features I personally worked on:

1️⃣ Feeding the Pet (Done by Me)
Feeding is a core interaction. Users can tap to spawn a food bowl, and the pet will happily approach and eat. This simple action fosters daily emotional connection, as the pet’s health or mood bar increases, reinforcing user impact.

2️⃣ Playing Fetch (Done by Me)
Tapping the ball icon spawns a 3D ball. Users can swipe or drag to throw it in AR, and the pet will chase after it, wagging its tail or jumping with excitement. This joyful interaction boosts the pet’s mood bar and deepens the bond between user and pet.

3️⃣ Sleep Mode with Music (Done by teammate Ho Winnie)
When users want quiet companionship, they can switch to Sleep Mode. The pet lies down and softly sleeps, with calming music in the background—ideal for studying or relaxation. The music can be toggled on or off.

4️⃣ Potion Crafting (Done by teammate Ho Winnie)
MystiAR includes a fantasy-inspired potion system. Users collect and drag three ingredients into a virtual cauldron. With sparkly animations, a potion is brewed and can be fed to the pet to improve its health, mood, or traits.


What’s Still in Progress:

1️⃣ Voice Command Feature
We plan to add a voice button so users can call the pet by name, making it walk into the visible AR scene—no scanning required. This will make the interaction smoother and more intuitive.

2️⃣ Sound & Particle Effects
We aim to add more audio and particle feedback across features. For instance, heart particles after feeding, or “Zzz” visuals in Sleep Mode, to enhance immersion.

3️⃣ Dynamic Mood & Health UI Bars
Currently, these UI bars are static. The next step is to implement scripts that update them in real time based on interactions like feeding or playing fetch.
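A likely starting point for those real-time bars is driving a Filled Image's fillAmount from a 0–1 stat value. This is a hedged sketch of a possible approach, not existing project code: the class name, fields, and lerp smoothing are all hypothetical.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch of a dynamic stat bar: the coloured fill Image
// (Image Type set to Filled) smoothly tracks a 0-1 value that gameplay
// code can change, e.g. after feeding or playing fetch.
public class StatBar : MonoBehaviour
{
    public Image fillImage;                  // the coloured fill layer
    [Range(0f, 1f)] public float value = 1f; // current stat, 0-1
    public float lerpSpeed = 5f;             // assumed smoothing speed

    // Called by interaction scripts, e.g. AddValue(0.1f) after feeding.
    public void AddValue(float delta)
    {
        value = Mathf.Clamp01(value + delta);
    }

    void Update()
    {
        // Animate the visible fill toward the target value each frame.
        fillImage.fillAmount = Mathf.Lerp(fillImage.fillAmount, value,
                                          Time.deltaTime * lerpSpeed);
    }
}
```

With something like this attached to each bar, the feeding and fetch scripts would only need to call AddValue() to make the UI respond.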


Progression:

A. Plane Finder

Fig2.1 Plane Finder

For the initial AR placement in MystiAR, I utilized Vuforia's Plane Finder and Ground Plane Stage to serve as the entry point for summoning the pet.

At first, the plan was simple: the user taps anywhere on the detected surface, and the cat appears at that point. However, several key issues emerged:

  1. Unwanted Object Duplication
    By default, Vuforia's Plane Finder has the "Duplicate Stage" option enabled. This caused the Ground Plane Stage to be instantiated repeatedly, leading to multiple versions of the cat and objects. I disabled this option to ensure that each object spawns only once, maintaining a single, clean environment.

  2. UI Buttons Not Responding
    Initially, none of the UI buttons worked. The root cause was that the Plane Finder's input listener was intercepting all screen taps, overriding interactions meant for UI elements. To solve this, I created a script that disables the Plane Finder right after the first tap and pet placement. This ensures all future taps are passed to UI controls.

  3. Precise Placement at Center
    To avoid cats spawning off-center or out of view, I placed the cat directly at the center of the Ground Plane Stage, using transform.position + offset. This creates a more intuitive experience where the pet appears exactly where the user expects.

This implementation resolved multiple interaction conflicts and laid the foundation for a smooth and controlled AR environment.
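The fix in point 2 can be sketched as follows. The class and field names are assumptions; the key idea from the write-up is simply deactivating the Plane Finder object after the first successful placement so subsequent taps reach the UI.

```csharp
using UnityEngine;

// Sketch of the one-tap placement fix: once the first ground placement
// succeeds, reveal the (initially inactive) cat and turn the Plane Finder
// off so it stops intercepting screen taps meant for UI buttons.
public class PlacementController : MonoBehaviour
{
    public GameObject planeFinder;  // the Vuforia Plane Finder object
    public GameObject cat;          // kept inactive until placement

    private bool placed;

    // Hooked to the Plane Finder's placement event in the Inspector.
    public void OnContentPlaced()
    {
        if (placed) return;
        placed = true;

        cat.SetActive(true);           // pet appears at the stage centre
        planeFinder.SetActive(false);  // all later taps go to the UI
    }
}
```

Guarding with the `placed` flag also doubles as protection against the duplication problem from point 1, since the handler can only ever run once.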


B. Ground Plane Stage & CatController

Fig2.2 Cat

I. Ground Plane Stage & Core Cat Behavior with CatController

In MystiAR, the Ground Plane Stage is the centralized stage where all major AR objects—including the cat, food bowl, ball, and potion cauldron—are placed. To ensure spatial consistency and easier coordination, all objects are located near the cat by default.

🐾 CatController: Main Pet Behavior Script

Fig2.3 CatController

I developed a custom script called CatController to manage all the cat’s behaviors, and paired it with an Animator Controller to handle animation transitions.

To avoid issues with early object placement, I set the cat and all related objects to inactive on start, and only activated them via script after a successful Plane Finder placement. This ensures the pet always appears exactly where the user expects.

The Animator uses four key parameters to control behavior:

  • Speed – for walking and running

  • IsEating – when the cat is eating

  • IsDead – sleep/rest state

  • IsDrunk – special effect after drinking a potion

By default, the cat starts idle and immediately begins patrolling randomly within a set radius. This movement is handled by scripts that pick random targets and animate transitions between idle, walk, and run seamlessly.
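The random patrol can be sketched like this. It is a simplified version of the behavior described above, assuming the radius, speed, and exact wiring; only the Animator's Speed parameter and the pick-a-random-target loop come from the text.

```csharp
using UnityEngine;

// Simplified patrol sketch: pick a random point within a radius of the
// cat's starting position, walk toward it while driving the Animator's
// "Speed" parameter, then idle briefly and pick a new target.
public class CatPatrol : MonoBehaviour
{
    public Animator animator;
    public float patrolRadius = 0.5f;  // assumed radius, in metres
    public float walkSpeed = 0.2f;     // assumed walking speed

    private Vector3 home;
    private Vector3 target;

    void Start()
    {
        home = transform.position;  // patrol around the spawn point
        PickNewTarget();
    }

    void Update()
    {
        Vector3 toTarget = target - transform.position;
        if (toTarget.magnitude < 0.02f)
        {
            animator.SetFloat("Speed", 0f);  // back to idle
            PickNewTarget();
            return;
        }

        transform.rotation = Quaternion.LookRotation(toTarget);
        transform.position += toTarget.normalized * walkSpeed * Time.deltaTime;
        animator.SetFloat("Speed", walkSpeed);  // blends idle/walk/run
    }

    void PickNewTarget()
    {
        Vector2 p = Random.insideUnitCircle * patrolRadius;
        target = home + new Vector3(p.x, 0f, p.y);
    }
}
```

Because the Animator blends on the same Speed float used for walking and running, transitions stay seamless without extra state triggers.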


II. Feeding Interaction & Animation Debug

Fig2.4 Food and Ball

When the user taps the Feed button, a food bowl appears in front of the cat. The cat walks toward it and begins the eating animation. After a few seconds, the food disappears, and the cat resumes patrolling.

A major issue I encountered was that the eating animation didn’t play correctly—the animation state would freeze, or only activate after a delay of several seconds. I tried multiple fixes, including adjusting transition conditions, toggling “Has Exit Time”, and tweaking other Animator settings, but nothing worked.

Eventually, I discovered that the conflict was between the Animator state transitions and how the script triggered them. After adjusting both the animation logic and coroutine timing, the issue was resolved. Now the animation plays smoothly, food disappears after a delay, and the cat returns to its normal idle/patrol state.


C. Feeding System

Fig2.5 Feeding

My original idea was to allow users to drag the food from a UI button (on Canvas) into the AR scene, so that the cat could chase and eat it. However, I encountered two major issues:

  1. Canvas vs World Space
    The cat lives in 3D world space (Ground Plane Stage), while the food placed on the UI button exists in 2D Canvas space. This made the cat unable to detect or interact with the food.

  2. Object Scaling Conflict
    Plane Finder scales all 3D children based on the floor size. Food placed in Canvas would have mismatched sizing, breaking visual consistency and causing interaction bugs.

So I simplified the interaction:

  • When the user taps the Feed button, food appears directly in front of the cat

  • The cat rotates toward it, plays the eating animation

  • After a few seconds (via coroutine), the food disappears and the cat resumes its idle or patrol state

This approach avoids all spatial and animation conflicts while maintaining a fluid and responsive experience.
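The simplified tap-to-feed flow can be sketched as a coroutine. Names, offsets, and the eat duration are assumptions; the spawn-in-front, IsEating toggle, and delayed cleanup follow the three bullets above.

```csharp
using System.Collections;
using UnityEngine;

// Sketch of the simplified feed flow: spawn the bowl just in front of the
// cat, face it, play the eating animation via the IsEating parameter,
// then hide the food after a delay so the cat returns to idle/patrol.
public class FeedManager : MonoBehaviour
{
    public Transform cat;
    public Animator catAnimator;
    public GameObject foodBowl;    // inactive child of the stage
    public float eatDuration = 4f; // assumed eating time in seconds

    // Linked to the Feed button's OnClick in the Inspector.
    public void Feed()
    {
        StartCoroutine(FeedRoutine());
    }

    private IEnumerator FeedRoutine()
    {
        // Place the bowl a short distance in front of the cat and show it.
        foodBowl.transform.position = cat.position + cat.forward * 0.2f;
        foodBowl.SetActive(true);

        // Face the bowl and start the eating animation.
        cat.LookAt(foodBowl.transform);
        catAnimator.SetBool("IsEating", true);

        yield return new WaitForSeconds(eatDuration);

        // Clean up: remove the food and return to idle/patrol.
        catAnimator.SetBool("IsEating", false);
        foodBowl.SetActive(false);
    }
}
```

Keeping both the food and the animation under one coroutine is also what resolved the timing conflicts described in the animation debugging section.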


D. Play Ball

Fig2.6 PlayBallManager

The fetch system is an interactive way for users to bond with the pet. I wrote a separate script PlayBallManager with the following key components:

🔹 Drag-and-Throw Mechanic

  • After tapping the “Play” button, a ball appears in front of the cat (within the Ground Plane Stage)

  • Users can click and drag the ball to simulate throwing

  • The drag uses raycasting with a horizontal plane constraint to keep the ball from flying off into the sky

🔹 Cat's Reaction and Animation

  • Once the user releases the ball, the PlayWithBall() method is called

  • The cat runs toward the target, then circles around the ball for a few seconds

  • After the animation, the ball disappears and the cat resets

❗ Issues and Fixes

  1. Cat didn’t chase the ball: caused by the ball being inactive or the drag not yet completing → fixed by ensuring MouseUp or TouchEnd triggers the behavior

  2. Ball floating in the air: due to an unrestricted drag direction → solved by locking movement to a flat horizontal plane

  3. Cat running off screen: caused by an unrealistic ball Y position → enforced a Y-lock during ball movement to prevent stray behavior
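The Y-lock and horizontal-plane constraint both fall out of one technique: raycasting the pointer against a mathematical Plane at the ball's height. This sketch assumes the class name, mouse-style input (Unity maps single touches to mouse events), and the exact handoff to CatController; PlayWithBall() itself is named in the text above.

```csharp
using UnityEngine;

// Sketch of the Y-locked drag: project the pointer ray onto a horizontal
// Plane through the ball's position, so dragging can never lift the ball
// into the sky. On release, hand the target position to the cat.
public class BallDrag : MonoBehaviour  // requires a Collider on the ball
{
    public Camera arCamera;
    public CatController catController;  // exposes PlayWithBall(Vector3)

    private Plane dragPlane;

    void OnMouseDown()
    {
        // A flat plane at the ball's current height: this IS the Y-lock.
        dragPlane = new Plane(Vector3.up, transform.position);
    }

    void OnMouseDrag()
    {
        Ray ray = arCamera.ScreenPointToRay(Input.mousePosition);
        if (dragPlane.Raycast(ray, out float enter))
        {
            // The intersection point already has the locked Y value.
            transform.position = ray.GetPoint(enter);
        }
    }

    void OnMouseUp()
    {
        // Drag finished: only now does the cat chase the ball (fix 1).
        catController.PlayWithBall(transform.position);
    }
}
```

Triggering the chase in OnMouseUp, rather than during the drag, is what guarantees the cat only reacts once the throw is complete.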


E. User Interface

Fig2.7 UI

I. Minimalist UI Layout for Focused Interaction

Our UI is purposefully kept clean and minimal, ensuring users stay focused on their pet rather than getting distracted by cluttered controls.

Layout Highlights:

  • Top Left: Pet name with stylized gradient panel and SDF font

  • Top Right: Status bars for health and mood

  • Bottom Center: Three main interaction buttons — Feed, Potion, and Play

  • On Start: A two-step instruction overlay guides users through basic actions


II. Canvas Configuration Details

To ensure responsiveness across devices, the Canvas is set as:

  • Render Mode: Screen Space - Camera

  • Render Camera: ARCamera (instead of Main Camera)

  • Scale Mode: Scale with Screen Size

  • Reference Resolution: 1080x1920

  • Match: 0.5 — supports both portrait and landscape

Although we considered a world-space UI, Screen Space - Camera gave us more convenient control over UI layout and scaling. Assigning ARCamera (instead of the default Main Camera) as the render camera also keeps the UI fully compatible with Vuforia's AR rendering.


III. Font & Graphic Settings


Fig2.8 Fonts

  • Font: Cinzel Decorative, converted via the TextMeshPro SDF importer

  • All UI assets (panels, bars) are exported as PNG

  • Set to Sprite (2D and UI) type

  • Used Sprite Editor (9-Slicing) to prevent corner distortion

  • Increased Max Size and removed Compression to avoid visual degradation on scaling

All text elements were managed using Content Size Fitter, allowing them to adapt dynamically to content length, which keeps the layout stable even during localization or dynamic text updates.


IV. Art

Fig2.9 Art
All graphic assets like panels, buttons, and status bars were custom-designed and exported as PNGs. These were imported into Unity as Sprite (2D and UI), and carefully sliced with Sprite Editor using 9-slice scaling to avoid distortion during resizing. Max size and compression settings were also adjusted to maintain visual clarity.


Fig2.10 Instruction Panel

Status bars were designed as dual layers: a grey background and a colored fill bar that dynamically adjusts to reflect the pet’s stats. Each bar was paired with a label and arranged using Vertical Layout Group and Layout Element to ensure consistent spacing and alignment.

Initially, the bottom main interaction buttons (Feed, Potion, and Play) were placed inside a horizontal Scroll View to allow future expansion. I added empty GameObjects at both ends to prevent edge clipping and restricted scrolling to horizontal only. However, I encountered scroll reset and vertical drift issues, so I eventually removed the scroll functionality and replaced it with script-based control. This not only improved stability but also made future feature handling more flexible.

The instruction flow is managed through a script called InstructionUIManager, which switches between two animation-driven stages. Step 1 shows a welcome message and basic information, while Step 2 prompts the user to tap the screen to place the cat. Both stages include fade-in/out animations controlled by Animator, creating a clearer and smoother user experience.
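InstructionUIManager essentially just swaps between two Animator-driven panels. This is a minimal sketch of that idea; the trigger names and the exact events that call each method are assumptions.

```csharp
using UnityEngine;

// Minimal sketch of InstructionUIManager: two animation-driven panels,
// advanced by a button on step 1 and dismissed once the cat is placed.
// Trigger and field names are illustrative assumptions.
public class InstructionUIManager : MonoBehaviour
{
    public Animator step1Panel;  // welcome message and basic information
    public Animator step2Panel;  // "tap the screen to place the cat"

    // Linked to the button on the step 1 panel.
    public void ShowStep2()
    {
        step1Panel.SetTrigger("FadeOut");
        step2Panel.SetTrigger("FadeIn");
    }

    // Called once the cat has been placed on the ground plane.
    public void Dismiss()
    {
        step2Panel.SetTrigger("FadeOut");
    }
}
```

Because the Animators drive Canvas Group alpha (see the Trigger section below), neither panel ever needs its active state toggled mid-fade.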


V. Trigger

Fig2.11 Animation Trigger

To manage the appearance and disappearance of the instruction panels, we used Unity’s Canvas Group component to control the panel’s alpha transparency. This allowed us to animate fade-ins and fade-outs smoothly, without having to constantly toggle the active state of UI objects. Combined with Animator, this approach made the transitions look more polished and visually appealing.

Interaction-wise, we designed two button types: one that triggers an action and ends automatically (like proceeding to the next step), and another that acts as a toggle, where one click activates and another click deactivates the feature. All of these actions are linked using onClick listeners in the scripts. To optimize performance, we made all target objects public so they can be directly linked through Unity’s Inspector using drag-and-drop. This not only simplifies the workflow but also reduces the overhead of runtime object searching, resulting in a smoother and faster app experience.
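The toggle-style button described above can be as simple as flipping a flag inside the onClick handler. The class and field names here are assumptions; the public, Inspector-linked target field reflects the drag-and-drop workflow mentioned in the text.

```csharp
using UnityEngine;

// Sketch of the toggle-style button: one click activates the feature,
// the next click deactivates it. The target is assigned via a public
// field in the Inspector, avoiding runtime object searches.
public class ToggleFeatureButton : MonoBehaviour
{
    public GameObject target;  // e.g. the sleep-mode music object

    private bool isOn;

    // Linked to the button's onClick in the Inspector.
    public void Toggle()
    {
        isOn = !isOn;
        target.SetActive(isOn);
    }
}
```

The fire-once button type is even simpler: its handler performs its action (such as advancing an instruction step) and then disables its own GameObject.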

By organizing all UI logic into dedicated manager scripts and tightly coupling UI buttons with their corresponding events, we ensured that every interaction remains consistent, responsive, and easy to maintain.


Submission

Google Drive Link: https://drive.google.com/drive/folders/1xlEuyDF4LUV_oF5FDu7LrbL311e7fsg3?usp=drive_link


Project MVP Video Walkthrough:

To make our contributions clear in the MVP video, here’s a breakdown of which scenes were done by me (Guo Ying) and my partner (Winnie):
0:00 – 0:08: Login scene animation and transition to homepage – Winnie
0:08 – 0:15: AR pet plane detection and ground placement – Guo Ying
0:18 – 0:33: AR pet feeding interaction – Guo Ying
0:37 – 0:48: AR pet playing fetch with the ball – Guo Ying
0:50 – 1:09: Ambient sleep mode with music toggle – Winnie
1:20 – 1:44: Potion brewing system with draggable ingredients – Winnie


Canva Presentation Slide:

Experiential Design Task 3 by Winnie Ho


Presentation Video:



Feedback

Week 10:

During our prototype review, Mr. Razif gave positive feedback on the current state of our project, mentioning that the prototype already feels quite complete in terms of functionality and visuals. He also suggested that for the final polish, we could further enhance the user experience by adding sound effects and particle animations to create more dynamic and immersive feedback during interactions.


Reflections

Experience

At the beginning of this project, I felt quite overwhelmed. I had very little experience in Unity or AR development, so most of the time I could only follow Mr. Razif’s instructions step-by-step, hoping to figure things out along the way. Whenever I got stuck, I turned to the internet—especially YouTube—for tutorials and real examples that helped me understand how to proceed.

We started by building the prototype in Figma, then gradually tried to translate those designs into Unity. But when I first encountered Plane Finder and Ground Plane Stage, I had no idea what to do. With the deadline drawing closer day by day and the class only just catching up on the relevant topics, I began to feel anxious. That’s when I decided to at least start integrating smaller parts of the system while continuing to explore the main functions.

The process was full of unexpected results. What we envisioned often didn’t match what we got, so we shifted our goal: simply create a functioning MVP first. Once we had that, we worked as a team, brainstorming and figuring out how to implement more of our ideas step by step. Debugging was by far the hardest part, especially because we were testing through the AR camera. We had to build and run the project on a real phone every time we wanted to test something, which added layers of delay and complexity.


Observation

One of the biggest lessons I learned is that small details matter more than we think. For example, getting the UI buttons and the pet's behavior to work together smoothly was more difficult than expected. Sometimes animations would freeze, or the button wouldn’t trigger correctly. A single misplaced state or incorrect variable could cause everything to break.

We also encountered several crashes due to unnoticed component conflicts or improper script bindings. These issues were tricky to spot but taught us how important clean structure and precise linking are in Unity development. Only by going through repeated trial-and-error could we start to understand how everything fits together under the hood.


Findings

Although we didn’t manage to implement everything—such as voice control, screenshot features, or live-updating status bars—we did complete most of the core interactive experience. When we showed our current build to Mr. Razif, we were surprised when he said our prototype was already “very close to the final product.” His feedback gave us a huge boost of confidence.

It made me realize that we had achieved more than we thought. And that’s the most rewarding part—knowing that our efforts, stress, and persistence have paid off. Debugging was definitely the most draining part, especially scripting conflicts and component errors. But overcoming each challenge gave me not just technical improvement, but also a deeper respect for what it takes to build a polished interactive experience.
