MyMarble AI (2021)

In 2021, I co-founded MyMarble AI with neuropathologists at UHN with one question in mind: what if we could understand a child’s story before they take their first breath?

 

(0.1) Background

The University Health Network (UHN) hosts one of the world’s largest and most comprehensive datasets of in-utero sonograms. This resource provides an unparalleled foundation for applying advanced machine learning to fetal development monitoring and early diagnostic research.

In 2021, I co-founded MyMarble AI in collaboration with three neuropathologists at UHN to build a next-generation platform for real-time fetal development tracking and predictive diagnostics. By leveraging this large-scale dataset, the platform was designed to detect early biological signals associated with conditions such as triple-negative ovarian cancer, as well as a broad spectrum of genetic and neurological disorders, before birth.

The project sought to bridge state-of-the-art machine learning with clinical practice, creating tools that could turn early fetal imaging data into actionable insights for preventive care and early intervention.

 

(0.2) The Process

The project was designed in two distinct phases to integrate longitudinal clinical imaging data with real-time user data streams, creating a novel framework for early ovarian cancer risk detection.

Phase I: Retrospective Imaging Analysis

The first phase involved the development of an internal machine learning pipeline capable of analyzing the entire UHN in-utero sonogram dataset, which spans from 1987 to the present. This dataset represents one of the most comprehensive longitudinal fetal imaging archives in the world. The objective was to train and validate models that could identify subtle anatomical or developmental markers potentially correlated with ovarian pathology manifesting later in life.

This required establishing robust data preprocessing protocols, including image standardization, anonymization, and temporal indexing, followed by the application of advanced deep learning methods to detect subclinical patterns across decades of imaging data.
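The preprocessing steps above can be illustrated with a minimal sketch. This is not the production pipeline; the record fields, hash-based anonymization, and fixed-size normalization shown here are illustrative assumptions:

```python
import hashlib
from dataclasses import dataclass
from datetime import date

import numpy as np

@dataclass
class SonogramRecord:
    patient_id: str    # hypothetical identifier field
    scan_date: date
    image: np.ndarray  # raw grayscale frame

def anonymize(record: SonogramRecord, salt: str = "study-salt") -> SonogramRecord:
    """Replace the patient identifier with a salted one-way hash."""
    digest = hashlib.sha256((salt + record.patient_id).encode()).hexdigest()[:12]
    return SonogramRecord(digest, record.scan_date, record.image)

def standardize(image: np.ndarray, size=(128, 128)) -> np.ndarray:
    """Pad/crop to a fixed frame size and rescale intensities to [0, 1]."""
    out = np.zeros(size, dtype=np.float32)
    h, w = min(size[0], image.shape[0]), min(size[1], image.shape[1])
    out[:h, :w] = image[:h, :w]
    rng = out.max() - out.min()
    return (out - out.min()) / rng if rng else out

def temporal_index(records):
    """Group each (anonymized) patient's scans in chronological order,
    the ordering longitudinal models rely on."""
    indexed = {}
    for r in sorted(records, key=lambda r: r.scan_date):
        indexed.setdefault(r.patient_id, []).append(r)
    return indexed
```

In practice, each archived scan would pass through all three stages before model training, so that no raw identifiers reach the learning pipeline and every frame arrives in a uniform shape and time order.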

Phase II: Prospective Data Collection and Model Integration

The second phase focused on the development of a consumer-facing mobile application designed to collect prospective, real-time user data and integrate it with insights derived from the retrospective imaging models.

This phase was grounded in one simple biological premise: females are born with their ovaries, a fact that provides a unique window for early biomarker detection.

By linking real-time data from a living cohort with historical imaging, the system aimed to support longitudinal modeling, enabling early identification of ovarian cancer risk factors and other gynecological or genetic conditions well before clinical onset.

My Role

As an experience architect and project co-founder, I led the foundational research and design phase of MyMarble AI. Over the first three months, my work focused on deep user analysis, information architecture, and the systematic translation of clinical complexity into clear and testable frameworks.

This phase began with structured stakeholder interviews involving clinicians, researchers, and patients to map the human, technical, and ethical dimensions of the product. I used contextual inquiry, journey mapping, and service blueprinting to understand how fetal development data was collected, interpreted, and acted upon in clinical settings.

From these insights, I developed the core experience architecture of the platform using principles of human-centered design, minimal cognitive load, and high-trust user experience. These principles are critical in clinical environments where confidence and clarity are essential. I also applied Fitts’ Law, Hick’s Law, and Gestalt principles to structure information hierarchies and reduce friction in high-stakes diagnostic workflows.

In parallel, I collaborated closely with the machine learning team to define data interaction models and to establish a feedback loop between interface and algorithm. Using high-fidelity Figma prototypes, interaction simulations, and iterative usability studies, I shaped interfaces that could surface complex model outputs in interpretable and clinically meaningful ways.

My background as a usability lab facilitator and voice interface researcher informed the conversational structure of the system. I designed interaction flows that prioritized interpretability, error recovery, and transparent model behavior, aligning with emerging best practices in human-AI interaction.

Throughout this phase, the design strategy followed four guiding principles:

  1. Human trust over algorithmic opacity
  2. Simplicity over noise
  3. Context over abstraction
  4. Interpretable intelligence over black box automation

This approach ensured that every product decision, from data visualization to interaction models, was scientifically grounded, ethically sound, and clinically relevant.

(0.3) The Results

The following are the mobile application concepts designed for the consumer. Every element, from branding to design systems, was carefully reviewed and approved by UHN stakeholders to ensure clinical integrity and user trust. The goal was to build not just an application but an experience that could bridge complex medical science with the emotional realities of expectant parents.

The design process began with user journey mapping, translating the clinical workflow into simple and intuitive interaction patterns. The interface focused on clarity and calm, avoiding visual noise and prioritizing information that truly mattered. A modular design system was developed to support both clinical accuracy and scalability, ensuring that the interface could grow with new data models, diagnostic tools, and patient use cases.

The mobile app was structured around three pillars: trust, transparency, and empowerment.

Trust was established through clear visual hierarchies and language that avoided medical jargon. Transparency was achieved by making AI insights explainable and traceable. Empowerment came from giving users meaningful access to their health data without overwhelming them with technical complexity.

Every screen, from onboarding to real-time fetal development visualization, was designed with accessibility in mind. This included adaptive typography, simplified navigation, and gentle motion patterns to support emotional reassurance during what can be a deeply vulnerable time for parents.

The result was a high-fidelity prototype that combined scientific rigor with thoughtful design. It demonstrated how an intelligent mobile interface can make medical data feel understandable, approachable, and actionable, turning complex diagnostics into a clear narrative of care.
