Experiential Design - Final Project: Completed Experience

08.07.2025 - 29.07.2025 (Week 9 - Week 14)
Lew Guo Ying / 0365721 / Bachelor of Design in Creative Media
Experiential Design
Final Project: Completed Experience

Index

    1.2 Submission

Instructions

MIB for Experiential Design

Requirement: 

For our final task, we are required to complete both a Final Project and an E-Portfolio, which together account for 40% of our individual grade. The Final Project involves integrating the knowledge from previous tasks to develop a fully functional product prototype. We need to refine the design, enhance the user experience, and deliver a working version of the application. Additionally, we must submit an E-Portfolio reflecting on our teamwork, social competencies, and learning journey throughout the design studio. This portfolio should include online posts, evidence of our reflections, and a video walkthrough to present the final product.

To complete the submission, we need to upload all required files to Google Drive and share the link, ensuring that the main folder is named after us. The submission package should include: our Unity project files and folders in a zipped format, the app build files for both Android (APK) and iOS, and any images used for our image targets. Additionally, we must provide a video walkthrough presenting our app, along with a direct blog post link to our project documentation. If we are working as a group, we should also include our partner’s blog link.


Recap of Completed Features from Task 3

My teammate Winnie Ho and I successfully built the MystiAR MVP prototype in Unity. Below is a breakdown of the features we completed:

1️⃣ Feeding the Pet (Implemented by Me)
Feeding is one of the core interactions in MystiAR. Users can tap to spawn a bowl and watch as their AR companion happily eats. This simple action builds a sense of daily care and bonding. As the pet eats, its health or mood bar visibly increases, giving immediate feedback and making the experience more engaging.

2️⃣ Sleep Mode – Soft Ambient Music (Implemented by Winnie Ho)
Companionship can also mean quiet moments together. In Sleep Mode, users can switch their pet to a peaceful resting state. The pet lies down and gently sleeps in AR while soft ambient music plays, creating a soothing environment—perfect for relaxation or focus. Users can toggle the music on or off with a button.

3️⃣ Potion Crafting – Find 3 Ingredients (Implemented by Winnie Ho)
To bring a touch of magic, MystiAR allows users to craft potions by collecting and dragging three mystical ingredients into an AR cauldron. Animated effects and sparkles make the brewing process immersive. Once completed, the potion can be given to the pet to boost health, mood, or special traits, enhancing the interactive care element.

4️⃣ Playing Fetch with the Pet (Implemented by Me)
Playful moments are essential to bonding. Users can spawn a ball and throw it in the AR space, prompting the pet to chase and retrieve it. The pet’s joyful reactions, such as wagging its tail or jumping with excitement, make the experience feel real and responsive while lifting its mood bar.


Final Enhancements Completed in This Task

1️⃣ Voice Command Feature (Developed by Me)
We successfully implemented a voice command button that lets users call their pet by name. When activated, the pet now automatically walks into the visible AR scene, removing the need to scan the surroundings. This improvement makes the interaction feel far more natural and seamless.

2️⃣ Enhanced Sound & Particle Effects (Developed by Winnie Ho)
All core features have been upgraded with sound cues and particle animations. For example, after the pet finishes eating, cute heart-shaped particles appear, while in sleep mode, soft “Zzz” effects and ambient visuals enhance the calming atmosphere. These additions make every interaction feel lively and immersive.

3️⃣ Dynamic Health & Hunger Bars (Developed by Me)
We completed the logic for the dynamic health and hunger bars. The bars now respond in real time: feeding restores hunger, while potions and playing fetch restore health. This feature adds meaningful feedback, making the pet feel more responsive and alive.

4️⃣ Screenshot Feature – Capture the Moment (Developed by Winnie Ho)
The app now includes a screenshot function, allowing users to capture special moments with their AR pet. By tapping the camera icon, a snapshot is saved directly to the device gallery, making it effortless to share memories with friends or on social media.

5️⃣ Detail Optimizations – Arrow Indicator, Voice Feedback, Lighting, and Reset Functionality (Developed by Me)

Several fine-tuned enhancements were added to improve usability and visual quality:

  • Arrow Indicator: A dynamic arrow is now fixed to the screen edge, constantly pointing toward the pet’s location. Since the AR pet moves freely, this feature ensures users can always find it quickly.
  • Voice Feedback with Text Display: To enhance voice command usability, a text status indicator appears on the left side of the screen when the voice button is activated. It informs users whether the app is listening or if the command failed.
  • Lighting Optimization on Startup: The opening scene lighting has been adjusted to better highlight the 3D pet. The new directional white light enhances contrast and makes the pet’s colors appear more realistic.
  • Resettable Features: Previously, potion crafting was a one-time interaction requiring an app restart to repeat. Now, all features can smoothly reset, allowing users to reuse them multiple times without interruption.

Progression of Final Project

1. Voice Command Feature

Fig1.1 Wit.ai

For the voice control feature in MystiAR, I integrated Wit.ai, a natural language processing platform developed by Meta. Wit.ai allows developers to build apps that understand voice or text commands and convert them into actions. In our project, Wit.ai serves as the bridge between user speech and AR pet behavior. By processing audio input, Wit.ai identifies user intent and sends structured data back to Unity, enabling our AR pet to respond accordingly.


Fig1.2 Intent and execution

The workflow of the voice command system is as follows:

  1. Intent Setup in Wit.ai

    • In the Wit.ai dashboard, I created an intent called move_pet.

    • Various utterances (e.g., "kiki come here", "come here", "kiki 来这边" — "kiki, come over here") were added and trained under this intent to teach the model how to recognize commands.

    • Using the Understanding page, I entered example phrases and used Train and Validate to improve recognition accuracy.

  2. Integration with Unity

    • The Wit.ai Server Access Token is required to authenticate API calls.

    • In Unity, I wrote a script named WitTest to handle voice input.

    • The script records the user’s speech, converts it to audio data, and sends it to Wit.ai’s speech API (https://api.wit.ai/speech).

    • Wit.ai processes the audio, identifies the intent (e.g., move_pet), and returns the result to Unity.

  3. Execution of Commands

    • Unity parses the response and executes the corresponding action—such as making the AR pet walk into the scene.

    • This process enables a hands-free and natural way to interact with the AR environment.
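The request side of this workflow can be sketched as below. This is a simplified illustration (class and field names are mine, not the actual WitTest script) of how recorded WAV audio might be posted from Unity to Wit.ai's speech endpoint using UnityWebRequest:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Illustrative sketch: POST recorded WAV audio to Wit.ai's speech endpoint
// and log the raw JSON response (which contains the detected intent).
public class WitSpeechRequest : MonoBehaviour
{
    [SerializeField] string serverAccessToken; // Wit.ai Server Access Token

    public IEnumerator SendAudio(byte[] wavBytes)
    {
        var request = new UnityWebRequest("https://api.wit.ai/speech", "POST");
        request.uploadHandler = new UploadHandlerRaw(wavBytes);
        request.downloadHandler = new DownloadHandlerBuffer();
        request.SetRequestHeader("Authorization", "Bearer " + serverAccessToken);
        request.SetRequestHeader("Content-Type", "audio/wav");

        yield return request.SendWebRequest();

        if (request.result == UnityWebRequest.Result.Success)
            Debug.Log(request.downloadHandler.text); // JSON with intents, e.g. move_pet
        else
            Debug.LogError("Wit.ai request failed: " + request.error);
    }
}
```

In practice the JSON response is then parsed for the intent name (such as move_pet) before Unity triggers the corresponding pet behavior.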


Fig1.3 Voice feedback UI

The next step in implementing the voice command feature was to connect Wit.ai with the in-game Voice Control Button in Unity. This button acts as the user’s gateway to activating voice recognition. When the button is tapped, it immediately turns red, visually signaling to the user that recording has started.

To make the feedback even clearer, I added text indicators that dynamically update during the entire process. Users can now see messages like "Recording…", "Processing voice input…", and the final transcribed command or error feedback directly on the screen. This ensures users always know whether the system is listening, processing, or if an error occurs.


Fig1.4 Script

The functionality is powered by two main scripts:

  • WavUtility: Handles audio data conversion, transforming Unity’s recorded AudioClip into the proper PCM WAV format required by Wit.ai’s API.

  • VoiceCommandManager: Manages the button’s behavior, handles the microphone recording, sends audio to Wit.ai, interprets the response, and executes the appropriate in-game action.

Once a voice command is successfully recognized, the app not only displays the recognized text back to the user but also triggers the AR pet to return to its spawn point. This feature elegantly addresses the common issue where the pet may wander too far or become hard to locate. The interaction simulates calling your cat in real life, creating a deeper sense of immersion and emotional connection in the AR world.
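The conversion that WavUtility performs can be outlined roughly as follows. This is an assumed sketch, not the actual project script: it reads float samples from an AudioClip and wraps them in a standard 16-bit PCM RIFF/WAVE header, which is the format Wit.ai's speech API accepts:

```csharp
using System.IO;
using System.Text;
using UnityEngine;

// Illustrative sketch of a WavUtility-style converter:
// AudioClip float samples -> 16-bit PCM WAV byte array.
public static class WavSketch
{
    public static byte[] FromAudioClip(AudioClip clip)
    {
        float[] samples = new float[clip.samples * clip.channels];
        clip.GetData(samples, 0);

        using (var stream = new MemoryStream())
        using (var writer = new BinaryWriter(stream))
        {
            int byteRate = clip.frequency * clip.channels * 2; // 16-bit = 2 bytes/sample
            int dataSize = samples.Length * 2;

            // RIFF/WAVE header
            writer.Write(Encoding.ASCII.GetBytes("RIFF"));
            writer.Write(36 + dataSize);
            writer.Write(Encoding.ASCII.GetBytes("WAVE"));
            writer.Write(Encoding.ASCII.GetBytes("fmt "));
            writer.Write(16);                         // fmt chunk size
            writer.Write((short)1);                   // PCM format
            writer.Write((short)clip.channels);
            writer.Write(clip.frequency);
            writer.Write(byteRate);
            writer.Write((short)(clip.channels * 2)); // block align
            writer.Write((short)16);                  // bits per sample
            writer.Write(Encoding.ASCII.GetBytes("data"));
            writer.Write(dataSize);

            // Convert each float sample (-1..1) to a signed 16-bit value
            foreach (float s in samples)
                writer.Write((short)(Mathf.Clamp(s, -1f, 1f) * short.MaxValue));

            return stream.ToArray();
        }
    }
}
```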


2. Dynamic Health & Hunger Bars

Fig1.5 Hierarchy and Asset

When building the final version of MystiAR, one of the critical enhancements I wanted to achieve was creating dynamic, responsive status bars to represent the pet’s Health and Hunger in a visually engaging and functional way. These bars are not just decorative UI elements; they play a central role in the gameplay loop, communicating the pet’s needs to the user and influencing how the user interacts with the AR environment.

During Task 3, I implemented status bars by stacking two rectangular layers—a background and a fill layer—while using a Horizontal Layout Group to handle alignment. Although this method was straightforward, it introduced significant limitations:

  • The fill animation was jerky and lacked smoothness, breaking immersion.

  • Adjusting the fill dynamically in response to pet behavior felt inefficient and difficult to fine-tune.

  • The overall UI lacked flexibility when scaling or reusing across different states.

These drawbacks became more apparent as I began connecting the bars to real-time events like feeding, potion crafting, and playing fetch. It became clear that a more refined approach was necessary to ensure both smooth animation and precise control over the bar’s behavior.


Fig1.6 Health and Hunger Bar

To overcome these issues, I redesigned the bars entirely in Figma, giving them a modern and polished look with rounded corners and subtle gradients. This not only improved aesthetics but also provided a foundation for better technical control.

After exporting the assets, I implemented them in Unity using:

  • Sliced backgrounds, ensuring the bar edges remain sharp while allowing the body to scale seamlessly.

  • Filled color layers driven by Unity’s Image.FillMethod and fillAmount, which enable smooth, real-time changes as the value adjusts.

This method proved far superior, as it allowed the fill to dynamically scale based on percentage values without distortion. Once I perfected the first bar, I duplicated it to create the second and converted them into prefabs, streamlining adjustments and guaranteeing consistency.
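The fill technique can be sketched like this (names are illustrative, not the project’s exact script): an Image whose Image Type is set to Filled exposes a fillAmount between 0 and 1, which can be eased toward a target each frame for a smooth, distortion-free animation:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: driving a Filled-type Image so a status bar animates smoothly
// toward a 0-100 stat value instead of jumping.
public class BarFillSketch : MonoBehaviour
{
    [SerializeField] Image fillImage;     // Image with Image Type set to "Filled"
    [SerializeField] float fillSpeed = 1f;

    float targetValue = 100f;             // current stat, 0-100

    public void SetValue(float value) => targetValue = Mathf.Clamp(value, 0f, 100f);

    void Update()
    {
        // Ease fillAmount toward the target fraction each frame
        float target = targetValue / 100f;
        fillImage.fillAmount = Mathf.MoveTowards(
            fillImage.fillAmount, target, fillSpeed * Time.deltaTime);
    }
}
```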


Fig1.7 Status bar manager

To make the bars functional, I developed a dedicated script, StatusBarManager, which controls their behavior dynamically. The script connects directly to the CatController, ensuring that changes in the pet’s state—such as hunger or health loss—are immediately reflected on the UI.

All critical parameters were intentionally kept public, allowing me to tweak values in real time during testing. This flexibility made balancing the game mechanics much smoother.

Core Parameters & Behaviors

  • Initial Values: Both Health and Hunger start at 100.

  • Hunger Depletion: Every 3 seconds, the hunger bar decreases by 5 points.

  • Sleep Trigger: When hunger drops to 20, the pet enters a temporary sleep state, signaling it needs to eat.

  • Critical State: If hunger reaches 0, the pet falls into a permanent sleep animation, and at this stage, the Health Bar starts depleting—5 points every 5 seconds—to indicate a worsening condition.

  • Health Depletion: Only begins when hunger is already at zero, emphasizing the urgency to feed the pet.

To balance the status decay, I implemented three recovery mechanics, each tied to one of our core features:

  • Feeding: Restores 50 Hunger and includes a hook for future eating effects (e.g., particle effects, animations).

  • Potion Crafting: Restores 20 Health, reinforcing its magical utility.

  • Playing Fetch: Restores 10 Health, rewarding playful interaction.

These recovery actions give players multiple strategies to care for their pet, making the game loop feel dynamic and interactive. The bars update in real-time as these actions are performed, offering instant feedback that feels satisfying.
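The decay and recovery rules above can be condensed into a small sketch. This is an assumed outline of the StatusBarManager logic, using the parameters listed (field and method names are illustrative):

```csharp
using UnityEngine;

// Sketch of the decay/recovery rules: hunger drops 5 points every 3 seconds;
// once hunger hits 0, health drops 5 points every 5 seconds.
public class StatusBarManagerSketch : MonoBehaviour
{
    public float health = 100f;         // both bars start at 100
    public float hunger = 100f;
    public float hungerInterval = 3f;
    public float healthInterval = 5f;

    float hungerTimer, healthTimer;

    // Temporary sleep state warning the player that the pet needs food
    public bool IsSleepy => hunger <= 20f;

    void Update()
    {
        hungerTimer += Time.deltaTime;
        if (hungerTimer >= hungerInterval)
        {
            hungerTimer = 0f;
            hunger = Mathf.Max(0f, hunger - 5f);
        }

        // Health only decays while hunger is fully depleted
        if (hunger <= 0f)
        {
            healthTimer += Time.deltaTime;
            if (healthTimer >= healthInterval)
            {
                healthTimer = 0f;
                health = Mathf.Max(0f, health - 5f);
            }
        }
    }

    // Recovery hooks tied to the three core features
    public void Feed()        => hunger = Mathf.Min(100f, hunger + 50f);
    public void DrinkPotion() => health = Mathf.Min(100f, health + 20f);
    public void PlayFetch()   => health = Mathf.Min(100f, health + 10f);
}
```

Keeping these values in public fields, as described above, means they can be tuned live in the Inspector while balancing the game.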


Fig1.8 Scripts

To make the experience more immersive, I prepared hooks for particle effects and animations. For instance, when the pet eats, I can easily add glowing particles or a “happy” animation above the bar. This modularity ensures the system is future-proof, allowing me to expand visual richness without rewriting core logic.

Additionally, the bar’s behavior was carefully synchronized with the pet’s animations and states. For example, when the pet is in sleep mode due to low hunger, the bar’s slow decrease reflects this passivity. When the user intervenes with food or potions, the bars jump back up in a way that feels both rewarding and natural.

Through multiple iterations, the dynamic Health and Hunger bars evolved from simple placeholders to a polished, functional, and immersive UI component that not only informs players but also drives the emotional core of the experience. Watching the bars slowly decline creates a sense of responsibility, while boosting them through care actions delivers positive reinforcement, enhancing the bond between the user and the AR pet.


3. Detail Optimizations – Arrow Indicator, Voice Feedback, Lighting, and Reset Functionality

A. Arrow Indicator

Fig1.9 Arrow Indicator
In MystiAR, the AR pet has complete freedom to move around, which adds to its lifelike personality. However, this freedom also creates a common problem: players often lose sight of their pet when it walks out of view. Searching for it by rotating the device or walking around can quickly become tiring, breaking the immersive and relaxing feeling of the game. To address this, besides the voice command feature that calls the pet back, I decided to add an arrow indicator that helps players visually track their pet’s location.

At first, I considered implementing a screen-edge glow effect, where soft gradients would appear on the side of the screen pointing toward the cat. Although this idea was visually appealing, it was technically difficult to execute cleanly across different screen sizes and AR scenarios. Therefore, I opted for a more feasible and dynamic solution: a gradient-colored arrow that orbits the edges of the screen and constantly points to the pet’s exact position. This keeps the interface elegant while providing clear guidance.

Fig1.10 Script
The feature is powered by the CatPointer script, which:

  • Continuously calculates the cat’s world position and converts it to screen coordinates.

  • Detects if the pet is off-screen and clamps the arrow to the edge while pointing in the correct direction.

  • Dynamically rotates the arrow to make its orientation intuitive and easy to read.

This approach ensures that the arrow always feels responsive and accurate, no matter where the pet moves. Combined with its gradient design, it adds a subtle yet effective visual cue without overwhelming the screen. Thanks to this feature, players never feel lost or anxious about their pet’s location. Instead, they can relax and enjoy the experience, knowing that they can always rely on this indicator to guide them.
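The three steps of the CatPointer logic can be sketched as below. This is an illustrative outline (not the exact project script): the cat's world position is projected to screen space, the arrow is clamped to the screen edge when the cat is off-screen, and the arrow is rotated to point toward it:

```csharp
using UnityEngine;

// Sketch of the CatPointer idea: project the cat into screen space,
// clamp the arrow to the screen edge, and rotate it toward the cat.
public class CatPointerSketch : MonoBehaviour
{
    [SerializeField] Transform cat;
    [SerializeField] RectTransform arrow;
    [SerializeField] float edgePadding = 50f;

    void Update()
    {
        Vector3 screenPos = Camera.main.WorldToScreenPoint(cat.position);

        // If the cat is behind the camera, mirror the projected point
        if (screenPos.z < 0f) screenPos *= -1f;

        bool offScreen = screenPos.x < 0 || screenPos.x > Screen.width ||
                         screenPos.y < 0 || screenPos.y > Screen.height;
        arrow.gameObject.SetActive(offScreen);
        if (!offScreen) return;

        // Keep the arrow on the screen edge with some padding
        arrow.position = new Vector3(
            Mathf.Clamp(screenPos.x, edgePadding, Screen.width - edgePadding),
            Mathf.Clamp(screenPos.y, edgePadding, Screen.height - edgePadding),
            0f);

        // Rotate the arrow so it points from screen centre toward the cat
        Vector3 dir = (screenPos - new Vector3(Screen.width / 2f, Screen.height / 2f, 0f)).normalized;
        arrow.rotation = Quaternion.Euler(0f, 0f, Mathf.Atan2(dir.y, dir.x) * Mathf.Rad2Deg);
    }
}
```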


B. Voice Feedback with Text Display

Fig1.11 Voice Feedback

While implementing the voice control system with Wit.ai, I realized that users could easily become confused if there wasn’t clear feedback about what the system was doing. For example, after tapping the voice button, users might wonder if the microphone is recording, whether the system is processing the command, or why nothing seems to be happening. To address this, I added voice feedback with text display to make the entire process transparent and user-friendly.


Fig1.12 Script
The improvements are simple yet effective:

  • When the voice button is tapped, it turns red to visually confirm that recording has started.

  • At the same time, a text label appears on the screen showing "Recording...", giving the user immediate confirmation.

  • Once the audio is sent to Wit.ai for transcription, the text changes to "Processing voice input...", indicating the system is working.

  • If the command is successfully recognized, the recognized phrase is displayed back to the user as confirmation.

  • In case of failure (e.g., unclear speech or network error), specific messages like "I didn’t catch that. Try speaking clearly." or "Network error. Please try again." appear, guiding the user on what went wrong and how to fix it.

This feature ensures that players always know what state the system is in—recording, processing, success, or error. By combining visual (button color) and textual (status messages) feedback, the experience becomes more interactive, intuitive, and stress-free.


C. Lighting Optimization on Startup and Resettable Features

Fig1.13 Light improvement

In the final stage of development, I focused on polishing the visual experience and solving usability issues related to feature reusability.

Firstly, I optimized the opening scene lighting. The cat model performs a smooth 3D movement toward the player to create the feeling of it leaping into the user’s arms. However, due to lighting inconsistencies, the cat’s white fur sometimes appeared discolored. To fix this, I added a Directional Light aimed directly at the cat, ensuring proper illumination and restoring its natural white tone.


Fig1.14 Font
Next, I addressed typography consistency. The project uses the Cinzel Decorative font, which is not included by default in Unity. I manually downloaded the font, imported it, and converted it into an SDF (Signed Distance Field) Font Asset to be compatible with TextMeshPro. This ensured that the game’s text maintained its distinctive style across all UI elements.


Fig1.15 Script
On the gameplay side, I solved the reusability problem for certain features:

  • Feeding and Playing Fetch already worked with reusable spawning mechanisms, so they required no changes.

  • However, Potion Crafting and Ingredient Collection were problematic. Initially, ingredients were randomly spawned, which sometimes caused them to disappear or become hidden behind objects. I redesigned this by spawning them at fixed elevated positions, requiring players to look up, down, or around to find them—adding a light physical engagement while avoiding bugs.

  • Previously, these objects became inactive after use and could not be reused without restarting the app. To solve this, I implemented reset functions (ResetPotion() and ResetIngredient()) that reposition objects to their original coordinates and reactivate them when needed. The cauldron also resets and disappears appropriately after potion use, ensuring smooth repeat interactions.

With these adjustments, all features became reusable and player-friendly. The system now allows endless interaction without forcing a restart, while also maintaining visual quality and user immersion.
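The reset pattern can be sketched as follows. This is an assumed illustration of what ResetIngredient()-style logic does (names are mine): each object caches its fixed spawn pose, deactivates when consumed, and is repositioned and reactivated on reset:

```csharp
using UnityEngine;

// Sketch of the reset idea: cache the fixed spawn pose, deactivate on use,
// then reposition and reactivate so the feature can be replayed endlessly.
public class ResettableIngredientSketch : MonoBehaviour
{
    Vector3 originalPosition;
    Quaternion originalRotation;

    void Awake()
    {
        // Remember the fixed elevated position set up in the scene
        originalPosition = transform.position;
        originalRotation = transform.rotation;
    }

    public void Consume()
    {
        // Ingredient has been dragged into the cauldron
        gameObject.SetActive(false);
    }

    public void ResetIngredient()
    {
        // Restore the original coordinates and make it collectable again
        transform.SetPositionAndRotation(originalPosition, originalRotation);
        gameObject.SetActive(true);
    }
}
```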


Final Submission:


Presentation Video:

Walkthrough Video:



Presentation Slide By Winnie: 

Feedback

Week 14:
In Week 14, we finally presented our completed version of MystiAR. During the review session, Mr. Razif provided very positive feedback, complimenting not only the technical completeness of the project but also how faithfully it delivered on the original vision we proposed at the beginning of the semester.

He highlighted that every planned feature—from core interactions like feeding, playing fetch, and potion crafting, to the enhanced mechanics such as voice command, arrow indicator, and dynamic status bars—was successfully implemented and polished. According to him, the project felt cohesive and immersive, offering exactly the interactive AR pet experience we aimed to create.

Receiving this feedback was extremely rewarding. It validated the countless hours of iteration, debugging, and design refinement we invested throughout the development process. More importantly, it confirmed that our efforts in combining technical features with user-friendly interactions paid off, resulting in an experience that feels alive and engaging.

This moment not only marked the successful completion of our final project but also became a milestone in our learning journey—proving that with persistence, teamwork, and creativity, we can turn an initial concept into a fully realized AR experience.


Reflections

Experience

At the start of the project, the development felt overwhelming. There were many different systems to integrate, from AR interactions to UI logic and audio processing. Initially, I struggled with understanding how all the parts would work together. However, by following the lecturer’s guidance step by step and supplementing it with online tutorials, I gradually built confidence. Over time, the project transformed from a rough prototype into a functioning AR experience. Each phase—planning, coding, testing, and refining—was a learning opportunity, teaching me not just technical skills but also patience and persistence.

Observations

During development, I observed that creating an AR experience is not just about visuals but about how each interaction feels to the user. I noticed that when certain features lacked feedback—such as the voice control not showing status messages—users would feel confused. Similarly, small design decisions, like adding the arrow indicator or ensuring the health and hunger bars update smoothly, had a huge impact on usability. I also realized that bugs and unexpected behaviors were inevitable, and solving them often required rethinking my approach rather than just fixing code.

Findings

From this project, I learned that experiential design focuses less on perfect visuals and more on creating a smooth and engaging flow. Our lecturer also emphasized this point, praising our project for achieving a complete and cohesive experience rather than just aesthetic appeal. Even though not every feature was flawless, the final product successfully combined technical functionality with user-friendly interactions, fulfilling the vision we had from the start. The biggest takeaway is that persistence and adaptability are essential when turning an idea into a working product.

