CAR-T Therapy Training

 
[Banner image: Kymriah]
C++
Blueprints
UE4
JSON
Analytics
VR
 

Synergy and Cassette joined forces to develop a virtual reality (VR) training tool for Novartis, with the goal of educating healthcare professionals (HCPs) on post-CAR-T therapy patient care. The team leveraged Cassette's Pathway platform to create four training scenarios for HCPs and two additional scenarios exclusively for nurses.

The VR tool equips users with the essential knowledge and tools to evaluate and treat patients, including administering medication and answering patient questions. To maximize return on investment (ROI) and extend reach, the training scenarios were also replicated as immersive web experiences.

A custom dashboard was also built to gather and analyze data from both the VR and web training programs.

As the newest member of the Cassette team, I contributed to many aspects of the project, from developing the user interface (UI) and gameplay to facilitating deployment. The project is particularly significant to me as it marked my first step into a professional work environment.


Programming

UI

As the UI developer, I created UI behaviors and animations in UMG and configured interaction between UI elements using the widget interaction component, so that click and scroll events register reliably, especially in VR. To address blurriness and aliasing around text borders on the original Oculus Quest, we also rendered key interface panels through Unreal Engine's stereo layers. A sketch of the interaction setup follows.
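
Below is a minimal sketch of that wiring in C++: a widget interaction component attached to a motion controller, with trigger and thumbstick input forwarded as pointer events. The class name, input hooks, and interaction distance are illustrative, not the project's actual code.

    // Sketch: a VR pawn driving UMG widgets through UWidgetInteractionComponent.
    // Class name and input bindings are illustrative.
    #include "GameFramework/Pawn.h"
    #include "Components/WidgetInteractionComponent.h"
    #include "MotionControllerComponent.h"
    #include "VRUIPawn.generated.h"

    UCLASS()
    class AVRUIPawn : public APawn
    {
        GENERATED_BODY()

    public:
        AVRUIPawn()
        {
            MotionController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("MotionController"));
            SetRootComponent(MotionController);

            // Raycasts from the controller and forwards pointer events to any widget it hits.
            WidgetInteraction = CreateDefaultSubobject<UWidgetInteractionComponent>(TEXT("WidgetInteraction"));
            WidgetInteraction->SetupAttachment(MotionController);
            WidgetInteraction->InteractionDistance = 500.f; // cm; assumed value
        }

        // Bound to the controller trigger: simulates mouse clicks on the hovered widget.
        void OnTriggerPressed()  { WidgetInteraction->PressPointerKey(EKeys::LeftMouseButton); }
        void OnTriggerReleased() { WidgetInteraction->ReleasePointerKey(EKeys::LeftMouseButton); }

        // Bound to a thumbstick axis: forwards scroll events to the hovered widget.
        void OnScroll(float AxisValue) { WidgetInteraction->ScrollWheel(AxisValue); }

    protected:
        UPROPERTY(VisibleAnywhere)
        UMotionControllerComponent* MotionController;

        UPROPERTY(VisibleAnywhere)
        UWidgetInteractionComponent* WidgetInteraction;
    };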

Throughout development we worked from designs provided by our UI designers and used state machines to determine the flow of the UI, which kept the experience cohesive and intuitive. Most of the UI was also driven dynamically from data tables, as illustrated in the sketch below.
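
As a rough illustration of the data-driven approach, this sketch populates a widget from a data table row. The FStepRow struct, widget class, and field names are hypothetical stand-ins for the project's actual tables:

    // Sketch: driving widget content from a data table row.
    // Struct, class, and field names are illustrative.
    #include "Blueprint/UserWidget.h"
    #include "Components/TextBlock.h"
    #include "Engine/DataTable.h"
    #include "StepWidget.generated.h"

    USTRUCT(BlueprintType)
    struct FStepRow : public FTableRowBase
    {
        GENERATED_BODY()

        UPROPERTY(EditAnywhere, BlueprintReadOnly)
        FText Title;

        UPROPERTY(EditAnywhere, BlueprintReadOnly)
        FText Instruction;
    };

    UCLASS()
    class UStepWidget : public UUserWidget
    {
        GENERATED_BODY()

    public:
        // Populate the widget from the row matching the current step.
        void ShowStep(FName StepRowName)
        {
            if (const FStepRow* Row = StepTable->FindRow<FStepRow>(StepRowName, TEXT("StepUI")))
            {
                TitleText->SetText(Row->Title);
                InstructionText->SetText(Row->Instruction);
            }
        }

    protected:
        UPROPERTY(EditAnywhere)
        UDataTable* StepTable;

        // Bound to TextBlocks of the same names in the widget Blueprint.
        UPROPERTY(meta = (BindWidget))
        UTextBlock* TitleText;

        UPROPERTY(meta = (BindWidget))
        UTextBlock* InstructionText;
    };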


Object Interaction

As a developer, I expanded our object interaction capabilities so that objects could relate to one another in more intricate ways, for example inserting a syringe into the infusion pump or dropping a medicine onto a tray at pre-defined locations. To achieve this, I introduced several methods of collision testing, including sphere collisions and line traces, to accurately determine the distance between objects; a sketch of this kind of test follows.
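
The sketch below shows the general shape of such a proximity test: a sphere overlap gathers nearby snap targets, then a distance comparison picks the closest. The function name, radius parameter, and the custom trace channel are assumptions for illustration:

    // Sketch: find the nearest snap target within SnapRadius of a held object.
    // Function name and trace channel are assumptions, not the project's code.
    #include "Engine/World.h"
    #include "Components/PrimitiveComponent.h"

    bool FindSnapTarget(UWorld* World, const FVector& ObjectLocation,
                        float SnapRadius, FVector& OutSnapLocation)
    {
        TArray<FOverlapResult> Overlaps;

        // Sphere overlap around the held object. ECC_GameTraceChannel1 stands in
        // for a custom "SnapTarget" collision channel configured in the project.
        World->OverlapMultiByChannel(
            Overlaps, ObjectLocation, FQuat::Identity,
            ECC_GameTraceChannel1, FCollisionShape::MakeSphere(SnapRadius));

        // Pick the closest overlapping target by squared distance.
        float BestDistSq = TNumericLimits<float>::Max();
        bool bFound = false;
        for (const FOverlapResult& Overlap : Overlaps)
        {
            if (const UPrimitiveComponent* Comp = Overlap.GetComponent())
            {
                const float DistSq = FVector::DistSquared(ObjectLocation, Comp->GetComponentLocation());
                if (DistSq < BestDistSq)
                {
                    BestDistSq = DistSq;
                    OutSnapLocation = Comp->GetComponentLocation();
                    bFound = true;
                }
            }
        }
        return bFound;
    }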

In addition, I developed a system for handling dropped objects: items that land on the floor automatically reset to their original position after a specified amount of time. These features helped us create a more realistic and immersive experience for our users.
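
A minimal sketch of that reset behavior, with hypothetical class and property names: the actor remembers its spawn transform, and a timer returns it there after it is dropped.

    // Sketch: a pickup that resets to its spawn transform after being dropped.
    // Class name and delay value are illustrative.
    #include "GameFramework/Actor.h"
    #include "TimerManager.h"
    #include "ResettablePickup.generated.h"

    UCLASS()
    class AResettablePickup : public AActor
    {
        GENERATED_BODY()

    public:
        virtual void BeginPlay() override
        {
            Super::BeginPlay();
            HomeTransform = GetActorTransform(); // remember the original placement
        }

        // Called by the interaction system when the player lets go of the item.
        void OnDropped()
        {
            GetWorldTimerManager().SetTimer(
                ResetTimer, this, &AResettablePickup::ResetToHome,
                ResetDelaySeconds, /*bLoop=*/false);
        }

        // Called when the player grabs the item again: cancel the pending reset.
        void OnGrabbed()
        {
            GetWorldTimerManager().ClearTimer(ResetTimer);
        }

    private:
        void ResetToHome() { SetActorTransform(HomeTransform); }

        UPROPERTY(EditAnywhere)
        float ResetDelaySeconds = 5.f; // assumed delay, tunable per item

        FTransform HomeTransform;
        FTimerHandle ResetTimer;
    };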


Onboarding & State Machines

As the developer in charge of the VR application's onboarding experience, I used a state machine to regulate the behavior of actors and UI elements within the scene. Users must complete predetermined activities in each state before advancing to the next instruction, which requires a range of interactions with widgets and objects in the environment. The onboarding flow also includes instructions for familiarizing users with the headset and controllers.
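
The sketch below shows the general pattern of an enum-driven state machine like this; the state names and task gating are illustrative rather than the project's actual onboarding steps.

    // Sketch: enum-driven onboarding flow. Each state gates on a task;
    // completing it advances to the next instruction.
    enum class EOnboardingState : uint8
    {
        Welcome,
        LearnControllers,
        GrabObject,
        OpenMenu,
        Complete
    };

    class FOnboardingFlow
    {
    public:
        // Called when the activity for the current state is finished;
        // advances to the next state so the next instruction can be shown.
        void CompleteCurrentTask()
        {
            switch (State)
            {
            case EOnboardingState::Welcome:          State = EOnboardingState::LearnControllers; break;
            case EOnboardingState::LearnControllers: State = EOnboardingState::GrabObject;       break;
            case EOnboardingState::GrabObject:       State = EOnboardingState::OpenMenu;         break;
            case EOnboardingState::OpenMenu:         State = EOnboardingState::Complete;         break;
            case EOnboardingState::Complete:         break; // terminal state
            }
            // In practice, each transition would also update the widgets and
            // actors in the scene for the new instruction.
        }

        EOnboardingState GetState() const { return State; }

    private:
        EOnboardingState State = EOnboardingState::Welcome;
    };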


Analytics

To fulfill the project's requirement of data tracking and analytics reporting to the client, I used a game save object to monitor the player's progress. At specific milestones, such as loading into the experience or completing a module, the data was serialized to JSON and transmitted to a remote server. If transmission failed, the data was stored temporarily on the headset until it could be resent.
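
A sketch of what that send-with-fallback flow can look like using Unreal's HTTP module; the endpoint URL and the CachePayloadForRetry helper are placeholders, not the project's real endpoint or code:

    // Sketch: POST a JSON analytics payload, caching it locally on failure.
    #include "HttpModule.h"
    #include "Interfaces/IHttpRequest.h"
    #include "Interfaces/IHttpResponse.h"

    void CachePayloadForRetry(const FString& Payload); // hypothetical helper, e.g. queues in a USaveGame

    void SendAnalyticsPayload(const FString& JsonPayload)
    {
        auto Request = FHttpModule::Get().CreateRequest();
        Request->SetURL(TEXT("https://example.com/api/analytics")); // placeholder endpoint
        Request->SetVerb(TEXT("POST"));
        Request->SetHeader(TEXT("Content-Type"), TEXT("application/json"));
        Request->SetContentAsString(JsonPayload);

        Request->OnProcessRequestComplete().BindLambda(
            [JsonPayload](FHttpRequestPtr /*Req*/, FHttpResponsePtr Response, bool bSucceeded)
            {
                // On any failure, hold the payload on-device until it can be resent.
                if (!bSucceeded || !Response.IsValid() || Response->GetResponseCode() != 200)
                {
                    CachePayloadForRetry(JsonPayload);
                }
            });
        Request->ProcessRequest();
    }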


CSV, Data Tables & Content Population

Data within the experience is managed through data tables that determine the states of actors in the scene. Because the tables are built on structs, converting rows to JSON was straightforward, and since all data was pre-defined, updates and additions were easy to implement. This approach also integrated cleanly with the project's other systems, keeping data management smooth throughout the development cycle.
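
As an illustration, the sketch below defines a hypothetical row struct and converts a row to JSON with FJsonObjectConverter, which reflects over the struct's UPROPERTYs; the field names are invented for the example:

    // Sketch: a data-table row struct (rows imported from CSV) and its
    // JSON conversion. Field names are illustrative.
    #include "Engine/DataTable.h"
    #include "JsonObjectConverter.h"
    #include "ModuleRow.generated.h"

    USTRUCT(BlueprintType)
    struct FModuleRow : public FTableRowBase
    {
        GENERATED_BODY()

        UPROPERTY(EditAnywhere, BlueprintReadWrite)
        FString ModuleName;

        UPROPERTY(EditAnywhere, BlueprintReadWrite)
        int32 StepCount = 0;

        UPROPERTY(EditAnywhere, BlueprintReadWrite)
        bool bNursesOnly = false;
    };

    // Look up a row by name and serialize it to a JSON string.
    FString ModuleRowToJson(UDataTable* Table, FName RowName)
    {
        FString Json;
        if (const FModuleRow* Row = Table->FindRow<FModuleRow>(RowName, TEXT("ModuleLookup")))
        {
            // Reflects over the UPROPERTYs declared above.
            FJsonObjectConverter::UStructToJsonObjectString(*Row, Json);
        }
        return Json;
    }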


Other

Deployment & Packaging

As part of my duties, I helped package and configure all Android builds shipped to the Oculus Quest, and I oversaw the sideloading and deployment of the application onto the headsets sent out to the client.


Mixed Reality Capture

I assisted in devising a mixed-reality demo for presenting our work to clients. This entailed configuring a green-screen setup, lighting the actors on stage, and synchronizing a virtual camera within Unreal Engine to a physical camera. The result created the illusion that the actors were physically standing inside the UE4 environment, producing a visually engaging and immersive experience for viewers.