places title img w.jpg

PLACES is a web app that streamlines the research process for Princeton University’s campus building performance evaluation. The tool guides researchers through complex, multi-method workflows, all in one place.

My Role

user experience design (web)

Team

Princeton PLACES project team @ ZGF Architects

(Consultants: the Decision Lab, Daylight, Nitsch, Branchpattern)

Timeline

Jun - Aug 2023

OVERVIEW
During my time at ZGF Architects LLP, my team and I built a research framework for Princeton University’s building performance evaluation (BPE), to be used across all campus facilities. I focused on translating that framework into an intuitive web app, from 0 to 1.

As the only UX designer at this phase of the project, I quickly developed an understanding of the wide range of research goals and methods, and synthesized intricate workflows into actionable, user-friendly experiences.

timeline p.png

project timeline (planned)

OUTCOME

Start a New Evaluation

The PLACES tool experience starts with a dashboard of all BPE projects at Princeton, which can be toggled between list and card views.

Researchers can start a new project from here.

two views-new.gif

Select Evaluation Objectives

The PLACES tool comes with 4 categories of evaluation objectives:

- occupant experience (E)

- occupant behavior (H)

- building performance & sustainability (B)

- site performance (S)

From this pool, researchers can choose a subset of objectives to include in their new evaluation, based on factors such as building age and function.

selecting objectives.png
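To make this selection step concrete, here is a minimal sketch of how the objective pool and a researcher’s chosen subset might be modeled in a web app. The type names and fields are hypothetical, not the actual PLACES data schema.

```typescript
// Hypothetical data model for the four objective categories and a
// researcher's selected subset (illustration only, not the PLACES schema).
type ObjectiveCategory = "E" | "H" | "B" | "S"; // experience, behavior, building, site

interface EvaluationObjective {
  id: string;
  category: ObjectiveCategory;
  label: string;
}

// Researchers pick a subset from the full pool, guided by factors such
// as building age and function.
function selectObjectives(
  pool: EvaluationObjective[],
  chosenIds: Set<string>
): EvaluationObjective[] {
  return pool.filter((obj) => chosenIds.has(obj.id));
}
```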

Generate Research Methods

Given the customized objective list, the PLACES tool auto-generates a set of methodologies for researchers. Each methodology indicates the objectives it attempts to evaluate and the estimated implementation time. 

Researchers have the freedom to edit the specifics within each method.

methods.gif
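As a rough illustration of the generation step, the sketch below assumes each methodology declares which objectives it covers and an estimated implementation time, and that the tool suggests every method covering at least one selected objective. The names and the matching rule are my assumptions, not the implemented PLACES logic.

```typescript
// Hypothetical methodology record: what it evaluates and how long it takes.
interface Methodology {
  name: string;                 // e.g. "occupant survey", "focus group interview"
  coversObjectiveIds: string[]; // objectives this method helps evaluate
  estimatedWeeks: number;       // rough implementation time
}

// Suggest every methodology that addresses at least one selected objective.
function generateMethods(
  selectedObjectiveIds: Set<string>,
  library: Methodology[]
): Methodology[] {
  return library.filter((m) =>
    m.coversObjectiveIds.some((id) => selectedObjectiveIds.has(id))
  );
}
```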

Synthesize Insights

Researchers reflect on the information they gather, then synthesize and record insights and observations on the PLACES platform.

Observations are paired with their corresponding objectives. Researchers can also link supporting evidence (survey responses, checklist results, interview quotes...) to each observation.

insights.gif
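One way to picture this pairing: an observation carries the objectives it addresses plus links to its supporting evidence. The field names below are hypothetical, shown only to illustrate the structure.

```typescript
// Hypothetical structure linking an observation to objectives and evidence.
type EvidenceType = "survey" | "checklist" | "interview" | "measurement";

interface Evidence {
  type: EvidenceType;
  sourceId: string; // reference to the raw record within the project
  excerpt?: string; // e.g. an interview quote
}

interface Observation {
  summary: string;
  objectiveIds: string[]; // the objectives this insight speaks to
  evidence: Evidence[];
}
```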

CONTEXT

Buildings can’t be evaluated by structure and materials alone. Evaluation is also about the human experience of the space.

PLACES aimed to innovate significantly beyond traditional BPE frameworks, driven by human-centered values. We first asked: what is important to the occupants of buildings? In collaboration with our consultants, our team developed an initial list of evaluation objectives.

BPE-POE_Mind-Map final-01.jpg

A DIVERGENT WORKFLOW

To measure these objectives, researchers need both numbers and human stories. The diversity of methodologies made them challenging to structure into a cohesive tool.

We proposed a pool of methodologies to capture both qualitative and quantitative data. Examples included focus group interviews, surveys, in-situ observations, and numerical measurements. Researchers could select unique combinations, tailored to each research objective.

methodologies.jpg

methodologies overview

engagement strategies example.jpg

example: a methodologies combo, with engagement activities

snapshot of survey.png

example: question list for survey method

CLIENT'S PAIN POINT

Our client? Excited, yet also concerned. 

In weekly client meetings, we presented our extensive list of objectives and research methods to the Princeton committee. Seeing the comprehensive framework, they began to worry that researchers would be overwhelmed by the tool and left unsure how to conduct the evaluation step by step on their own.

Committee said.jpg

DESIGNER’S SOLUTION

The tool needed to translate a dense, multi-layered framework into a clear human experience for researchers.

Just like our ambition for PLACES! Even though the project was at an early stage, as the designer, I wanted to structure the process to empower users from start to finish. 

i said.jpg

UNDERSTANDING OUR USERS

By mapping out a step-by-step process, I started transitioning from framework to user experience.

The framework is intended to give flexibility, which means users make decisions and take actions at each step.

44 Project Structure 1.jpg

Our team tested it ourselves. We conducted pilot studies on the Princeton campus to validate the framework and deepen user empathy.

By experiencing the process firsthand, with a heavy focus on qualitative methods like engagement activities, we were able to step into the shoes of future researchers and further refine the methodologies.

popup.jpg

pop-up engagement activity

loveletter.jpg

campus "love letter" activity

site.jpg

example site visit

42 Cross-Team View of Objectives-01.jpg

breaking down the test study from goals to granularity

campusdata.png

synthesis of collected data

flyer.png

on-campus survey flyer

engagement week schedule.jpg

engagement week schedule

WIREFRAMING

Considering users’ key interactions with the digital tool, I identified 8 stages of the user journey.

stages.jpg

Multiple user groups are involved in the research process. At a broader scope, the wireframe outlined how each group is assigned a different level of access.

While my subsequent screen-by-screen design focuses on the researcher flow, the broader goal of the project is to value feedback from all stakeholders. An external report would eventually be released to the entire Princeton community. This establishes a positive loop, where input from the public prompts real changes both in the process and on campus.

wireframe.jpg
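For illustration, here is one way those different access levels could be expressed. The role names and permissions are assumptions based on the groups mentioned in this case study, not the implemented PLACES permission model.

```typescript
// Hypothetical role-based access map for the user groups involved.
type Role = "researcher" | "committee" | "community";

const permissions: Record<Role, { edit: boolean; viewInternal: boolean }> = {
  researcher: { edit: true,  viewInternal: true  }, // runs the evaluation
  committee:  { edit: false, viewInternal: true  }, // reviews progress and findings
  community:  { edit: false, viewInternal: false }, // receives the external report
};
```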

OPTIMIZING FEATURES

To address the client's concern, I made design decisions that help reorient researchers whenever they risk losing sight of the bigger picture amid execution details:

Objectives and findings side-by-side

Keeping selected objectives constantly in view is the best way to prevent researchers from getting lost in the process. Displaying them alongside findings provides immediate context and enables faster, more informed decisions.

sidebyside.jpg

Progress bar

Real-time progress visualization helps researchers stay motivated.

progress bar.jpg
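A minimal sketch of how such a progress indicator might be computed from the 8 stages of the user journey; the calculation is an assumption for illustration.

```typescript
// Hypothetical progress calculation over the journey stages.
interface Stage {
  name: string;
  completed: boolean;
}

function progressPercent(stages: Stage[]): number {
  if (stages.length === 0) return 0;
  const done = stages.filter((s) => s.completed).length;
  return Math.round((done / stages.length) * 100);
}
```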

Revisit the framework anytime

Anticipating potential confusion along the way, I built in instant access to the framework so that each step stays aligned with the project’s goals.

check back.jpg

Team collaboration

As the tool is designed to support ongoing collaboration, it allows team members to log updates and track changes.

team updates.jpg
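A small sketch of what logging updates could look like under the hood; the entry shape is a hypothetical illustration, not the shipped feature.

```typescript
// Hypothetical append-only activity log for team collaboration.
interface ActivityEntry {
  author: string;
  timestamp: Date;
  action: string;          // e.g. "edited survey questions", "added observation"
  affectedItemId?: string; // optional link to the item that changed
}

function logUpdate(log: ActivityEntry[], entry: ActivityEntry): ActivityEntry[] {
  return [...log, entry]; // keep the full change history
}
```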
home.png
login.png
internal dashboard card.png
internal dashboard list.png
create new proj.png
selecting objectives.png
defining methodologies.png
survey.png
survey expanded.png
0 insights.png
new insights.png
insights.png

all screens
