C++
Blueprints
JavaScript
UE4
Babylon.js
JSON
Analytics
VR
Previously, training for the PRISMAX 2 device was conducted in person by therapy specialists. However, Baxter recognized an opportunity to streamline their training process and enlisted the help of Cassette to develop virtual training tools. The objective of this virtual training system was to provide accessible training throughout Western Europe, without the need for Baxter training specialists or ICU nurses to be physically present.
To accomplish this, Cassette created nine fully interactive training and assessment modules that cover the six most commonly used dialysis therapies in Baxter’s Western European markets. The virtual training is available in ten languages and can be accessed through both web and VR platforms. The training is hosted within Baxter's LMS, where users select from a series of modules and assessments while their progress is tracked throughout their learning journey.
As one of the Unreal Engine developers on the team, I played a crucial role in many aspects of the project. My responsibilities included managing data, populating content, and working with the Art team to investigate optimizations. I was also responsible for publishing and delivering the project and for ensuring the smooth flow of UI behaviors. This project was particularly significant to me: it was the longest project I had worked on, and I carried substantial responsibility as the sole remaining developer towards the end of its lifetime.
Programming
UI
As the developer responsible for most of the UI, one of my primary duties was to use UMG to create UI behaviors and animations that would enhance the user experience. This project presented a unique challenge, however: all of the UI had to be operated directly with VR hands rather than a laser pointer. To make sure users received adequate feedback, we implemented sound effects, visual effects, and controller vibrations.
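As a rough illustration of that feedback loop, the sketch below shows the kind of press handler we attached to hand-driven UMG widgets. The class, asset, and member names (UTrainingButton, PressSound, PressHaptic, PressFeedbackAnim) are hypothetical stand-ins, not the project's actual code.

```cpp
// A minimal sketch, not the project's real classes; all names are invented.
#include "Blueprint/UserWidget.h"
#include "GameFramework/PlayerController.h"
#include "InputCoreTypes.h"
#include "Kismet/GameplayStatics.h"
#include "TrainingButton.generated.h"

class USoundBase;
class UHapticFeedbackEffect_Base;
class UWidgetAnimation;

UCLASS()
class UTrainingButton : public UUserWidget
{
    GENERATED_BODY()

public:
    UPROPERTY(EditDefaultsOnly) USoundBase* PressSound = nullptr;
    UPROPERTY(EditDefaultsOnly) UHapticFeedbackEffect_Base* PressHaptic = nullptr;
    UPROPERTY(Transient, meta = (BindWidgetAnim)) UWidgetAnimation* PressFeedbackAnim = nullptr;

    // Which hand touched the widget; set by the interaction system.
    EControllerHand PressingHand = EControllerHand::Right;

    void HandlePressed()
    {
        // Audible confirmation of the press.
        UGameplayStatics::PlaySound2D(this, PressSound);

        // Short vibration on the controller whose hand touched the widget.
        if (APlayerController* PC = GetOwningPlayer())
        {
            PC->PlayHapticEffect(PressHaptic, PressingHand);
        }

        // Visual confirmation via a designer-authored UMG animation.
        PlayAnimation(PressFeedbackAnim);
    }
};
```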
To achieve a seamless and intuitive user experience, we worked closely from the designs our UI designers provided. State machines determined the flow of the UI, which helped us create a cohesive and user-friendly experience, and much of the UI was driven dynamically by data retrieved from data tables, which added a further layer of complexity to the project.
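Much of that data-driven UI boiled down to looking up a row in a data table and pushing its values into widgets. The row struct and function below are a minimal sketch under assumed names, not the project's real data layout.

```cpp
// A sketch under assumed names: FScreenTextRow and PopulateScreen illustrate
// the pattern, not the project's actual row structure or helpers.
#include "Components/TextBlock.h"
#include "Engine/DataTable.h"
#include "ScreenTextRow.generated.h"

USTRUCT(BlueprintType)
struct FScreenTextRow : public FTableRowBase
{
    GENERATED_BODY()

    // Strings shown on a given screen, authored in the CSV source.
    UPROPERTY(EditAnywhere) FText Title;
    UPROPERTY(EditAnywhere) FText Body;
};

void PopulateScreen(UDataTable* ScreenTable, FName ScreenId,
                    UTextBlock* TitleBlock, UTextBlock* BodyBlock)
{
    // Look up the row driving this screen; a missing row fails soft.
    if (const FScreenTextRow* Row =
            ScreenTable->FindRow<FScreenTextRow>(ScreenId, TEXT("PopulateScreen")))
    {
        TitleBlock->SetText(Row->Title);
        BodyBlock->SetText(Row->Body);
    }
}
```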
Object Interaction
As part of a team, I collaborated with colleagues on specific screens that let users load different sections of the modules, adjust settings, and exit the experience. While I contributed to these interactions, the bulk of the responsibility for this aspect of the project fell to other team members.
Onboarding & State Machines
As part of my role, I used state machines to design the onboarding flow that guided users into the experience. This involved creating states that instruct users on how to interact with the screens and use the controllers, each with corresponding UI changes and voice-overs.
The main experience was driven primarily by a state machine that other team members and I developed. It managed object loading and unloading, set focus points, played voice-overs, updated the UI, and progressed the player through the experience. The data behind the system came from dynamic content population, discussed in the next section, and was essential to its success.
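In outline, it worked something like the sketch below. The step names and helper functions are invented for illustration; this shows the general shape of a step-driven state machine, not the project's actual implementation.

```cpp
// Illustrative only: step names and helpers are hypothetical.
#include "CoreMinimal.h"

enum class ETrainingStep : uint8
{
    ShowInstruction,   // display UI text and play the matching voice-over
    AwaitInteraction,  // wait for the user to perform the highlighted action
    LoadNextSection,   // swap in the objects for the next part of the module
    Complete
};

class FTrainingStateMachine
{
public:
    void Advance()
    {
        switch (Step)
        {
        case ETrainingStep::ShowInstruction:
            // UI text, focus point, and voice-over all come from the data
            // tables described in the next section.
            PlayVoiceOverForCurrentRow();
            SetFocusPointForCurrentRow();
            Step = ETrainingStep::AwaitInteraction;
            break;

        case ETrainingStep::AwaitInteraction:
            // Advance() is called again once the interaction system
            // reports that the user completed the required action.
            Step = ETrainingStep::LoadNextSection;
            break;

        case ETrainingStep::LoadNextSection:
            UnloadFinishedObjects();
            LoadObjectsForNextRow();
            Step = HasMoreRows() ? ETrainingStep::ShowInstruction
                                 : ETrainingStep::Complete;
            break;

        case ETrainingStep::Complete:
            break;
        }
    }

private:
    ETrainingStep Step = ETrainingStep::ShowInstruction;

    // Hypothetical helpers standing in for the real subsystems.
    void PlayVoiceOverForCurrentRow() {}
    void SetFocusPointForCurrentRow() {}
    void UnloadFinishedObjects() {}
    void LoadObjectsForNextRow() {}
    bool HasMoreRows() const { return false; }
};
```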
CSV, Data Tables & Content Population
As the sole person responsible for data management, I owned one of the most challenging and time-consuming aspects of the project. The experience relied on data tables to dynamically generate its text, images, audio, and animation. With content for over eight languages and multiple modules, each with its own experience flow, ensuring the data was formatted correctly for direct import from the CSV/JSON files overseen by the client team and producers was a difficult task. Once the data was imported, I optimized loading by writing a script that imported only the necessary data and removed everything else from the data table. Finally, I designed the state machines and systems that determine the user's flow, ensuring everything in the experience was set up appropriately and populated with the right data.
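The pruning step can be sketched roughly as below. The per-module row-name prefix and the function itself are assumptions made for illustration; UDataTable's CreateTableFromCSVString and RemoveRow are the underlying engine calls.

```cpp
// Sketch of the pruning idea: import the full CSV, then strip every row
// the selected module doesn't need. The prefix convention is an assumption.
#include "Engine/DataTable.h"
#include "Misc/FileHelper.h"

void ImportAndPrune(UDataTable* Table, const FString& CsvPath, const FString& ModulePrefix)
{
    FString CsvContent;
    if (!FFileHelper::LoadFileToString(CsvContent, *CsvPath))
    {
        return;
    }

    // Populate the table; malformed rows come back as problem strings.
    TArray<FString> Problems = Table->CreateTableFromCSVString(CsvContent);

    // Drop every row that doesn't belong to the module being loaded, so
    // only the data this experience actually needs stays resident.
    for (const FName& RowName : Table->GetRowNames())
    {
        if (!RowName.ToString().StartsWith(ModulePrefix))
        {
            Table->RemoveRow(RowName);
        }
    }
}
```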
Learning Management System, xAPI and cmi5
To monitor user progress and engagement, we implemented methods that report progress to the client's LMS using the xAPI standard for learning records, letting us retrieve information about users' module progress and scores. We also used cmi5 to handle module launch and authorization requirements.
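At its core, an xAPI statement is a small JSON document POSTed to the LMS's learning record store. The sketch below shows the general shape using Unreal's HTTP module; the endpoint, token, actor, and activity values are placeholders, not the client's actual configuration.

```cpp
// A minimal sketch, assuming a cmi5-provisioned LRS endpoint and auth token.
#include "HttpModule.h"
#include "Interfaces/IHttpRequest.h"

void SendCompletedStatement(const FString& EndpointUrl, const FString& AuthToken,
                            const FString& ActorMbox, const FString& ActivityId)
{
    // xAPI statements are JSON documents built from actor, verb, and object.
    const FString StatementJson = FString::Printf(TEXT(
        "{"
          "\"actor\":{\"mbox\":\"%s\"},"
          "\"verb\":{\"id\":\"http://adlnet.gov/expapi/verbs/completed\"},"
          "\"object\":{\"id\":\"%s\"}"
        "}"), *ActorMbox, *ActivityId);

    // POST the statement to the LRS's statements resource.
    auto Request = FHttpModule::Get().CreateRequest();
    Request->SetURL(EndpointUrl + TEXT("/statements"));
    Request->SetVerb(TEXT("POST"));
    Request->SetHeader(TEXT("Content-Type"), TEXT("application/json"));
    Request->SetHeader(TEXT("X-Experience-API-Version"), TEXT("1.0.3"));
    Request->SetHeader(TEXT("Authorization"), AuthToken);
    Request->SetContentAsString(StatementJson);
    Request->ProcessRequest();
}
```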
Other
Deployment & Packaging
As part of my responsibilities, I helped package and configure all Android builds for shipment to the Oculus Quest 2. I also supervised sideloading and deployment of the application onto the 100 Oculus Quest 2 headsets sent to our clients through Oculus for Business; the app was successfully installed on all 100 headsets, ensuring timely and efficient delivery.
Mixed Reality Capture
To produce captivating marketing material for the project, I collaborated with the Art team to generate mixed reality footage and images using Unreal Engine. We set up a green screen and a camera connected to machines running the project alongside the Composure compositing plugin, which let us seamlessly integrate the actors into the virtual environment. Since the actors had to perform the experience themselves, I modified the project slightly for recording purposes, such as concealing the virtual hands so the actors could use their real hands instead.
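That hand-concealing tweak amounts to little more than a visibility toggle. The console variable and function below are invented for illustration; they show one plausible way to flip the hand meshes off during capture, not the project's actual implementation.

```cpp
// Illustrative only: the cvar name and helper are hypothetical.
#include "Components/SkeletalMeshComponent.h"
#include "HAL/IConsoleManager.h"

static TAutoConsoleVariable<int32> CVarHideVirtualHands(
    TEXT("mrc.HideVirtualHands"), 0,
    TEXT("1 = hide the VR hand meshes during mixed reality capture."));

void UpdateHandVisibility(USkeletalMeshComponent* LeftHand, USkeletalMeshComponent* RightHand)
{
    // When recording, hide the virtual hands so the green-screened actor's
    // real hands read through the composite instead.
    const bool bHide = CVarHideVirtualHands.GetValueOnGameThread() != 0;
    LeftHand->SetVisibility(!bHide);
    RightHand->SetVisibility(!bHide);
}
```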