Savvy by Cohesion
Building a Data Product for Real Estate Decision-Making
Executive Summary
https://www.cohesionib.com/products/savvy
Savvy is Cohesion’s analytics and AI platform designed to turn fragmented building data into clear, actionable insights for real estate leaders.
I led the development of Savvy as a data product, defining its strategy, data architecture, analytics layer, and AI capabilities. This included building the semantic data model, designing analytics workflows, shaping the AI assistant, and driving go-to-market execution.
The result was a platform that:
Centralizes data across building systems, sensors, and business tools
Enables both self-service analytics and AI-driven insights
Helps customers make decisions around cost, occupancy, space, and operations
Establishes a foundation for digital twin, predictive analytics, and automation
What Savvy Is
Savvy is a unified analytics product that combines dashboards, AI, and automated insights into a single interface.
At a product level, it consists of three core components:
1. Customizable BI Dashboards
Interactive dashboards embedded directly into the Cohesion platform.
Portfolio, building, and tenant-level views
Metrics across occupancy, space utilization, energy, and operations
Configurable per customer, with dashboards enabled based on use case
2. AI Assistant
A conversational interface that allows users to query their building data in plain language.
Backed by a custom AI agent
Translates natural language into structured queries against the datamart
Returns contextualized answers combining multiple data sources
3. Automated Insight Summaries
Pre-generated insights that surface key patterns and anomalies without requiring user input.
AI-generated summaries embedded directly into the platform
Highlight trends like underutilization, cost inefficiencies, or unusual activity
Designed to proactively guide users toward decisions
Why this matters
This structure allowed Savvy to serve both:
Analysts → deep, self-service exploration via dashboards
Non-technical users → fast answers and guidance via AI and insights
Instead of forcing users to “go find answers,” the product meets them at multiple levels of sophistication.
The Problem
Real estate teams weren't lacking data; they were overwhelmed by it.
Data lived across:
Access control systems
Occupancy sensors
Reservation platforms
Energy + utility systems
Financial and leasing tools
But:
It was inconsistent and fragmented
Required manual analysis
Was not accessible to decision-makers
Savvy’s core premise was simple: Bring all building and business data into one place, and make it usable for real decisions.
My Role
I operated as the Product Manager for “data as a product”.
That meant working across:
Product, engineering, data, and design
Executive leadership (CEO, CTO, VPs)
Sales, marketing, and implementation
Enterprise customers
Key responsibilities:
Defined data product strategy and roadmap
Designed semantic data layer + datamart
Led analytics and dashboard design
Contributed to AI assistant (prompt + query logic)
Drove GTM, demos, and customer adoption
Approach: Start With Decisions
Most analytics products start with data and build dashboards. My approach started with the user benefit:
Start with the decision (or automation) → work backward to the data
Example:
Decision: Should we reduce space?
Data Needed: Occupancy (badge swipes or occupancy sensors) + reservations + cost
Insight:
Department attendance
Cost per occupant
Underused space
Comparisons across office portfolio
Example:
Decision: Where are we overspending?
Data Needed: Energy + utility usage and spend + operational line items + occupancy
Insight:
Energy use per occupant (across portfolio)
Operational cost per occupant (rent, office investment, etc.)
Inefficiencies across portfolio normalized by occupancy
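The cost-per-occupant logic in the examples above can be sketched in a few lines. The buildings and figures here are hypothetical, purely to show the arithmetic:

```python
# Illustrative cost-per-occupant comparison (all numbers hypothetical).
buildings = {
    "HQ":     {"monthly_cost": 420_000, "avg_daily_occupants": 700},
    "Branch": {"monthly_cost": 150_000, "avg_daily_occupants": 120},
}

cost_per_occupant = {
    name: b["monthly_cost"] / b["avg_daily_occupants"]
    for name, b in buildings.items()
}
# Branch costs over 2x as much per occupant as HQ — a candidate
# for consolidation or renegotiation.
```

Normalizing by occupancy is what makes portfolio comparisons fair: a cheap but empty building is still expensive per person.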
Building the Data Foundation
The hardest part of Savvy wasn't dashboards or AI; it was building a reliable data layer underneath them.
We built a full analytics stack centered on Databricks as the warehouse, with structured pipelines, a governed data model, and a Power BI semantic layer on top.
Sources → Raw Tables → Datamart (Databricks) → Power BI → Dashboards + AI + Automation
Ingestion & Transformation
Databricks served as the central data platform.
We ingested data from:
APIs (access control, utilities, sensors)
IoT streams (occupancy, IAQ)
Internal systems + CSV uploads
We structured this using a medallion architecture:
Bronze: raw data
Silver: cleaned and standardized
Gold: analytics-ready datasets
This ensured data was consistent, traceable, and reusable across use cases.
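A minimal sketch of the bronze → silver → gold flow, using plain Python dictionaries in place of the actual Databricks pipeline (the badge-swipe schema here is illustrative, not the production one):

```python
# Sketch: medallion-style transformations on badge-swipe events.
from datetime import datetime

bronze = [  # raw events as received from an access-control API
    {"badge": "B-101", "ts": "2024-03-04T08:12:09Z", "bldg": "hq"},
    {"badge": "b-101", "ts": "2024-03-04T08:12:09Z", "bldg": "HQ"},  # dup, messy casing
    {"badge": "B-207", "ts": "2024-03-04T09:01:44Z", "bldg": "HQ"},
]

def to_silver(rows):
    """Clean and standardize: normalize casing, parse timestamps, dedupe."""
    seen, out = set(), []
    for r in rows:
        key = (r["badge"].upper(), r["ts"], r["bldg"].upper())
        if key in seen:
            continue
        seen.add(key)
        out.append({
            "badge_id": r["badge"].upper(),
            "event_time": datetime.fromisoformat(r["ts"].replace("Z", "+00:00")),
            "building_code": r["bldg"].upper(),
        })
    return out

def to_gold(rows):
    """Analytics-ready: unique daily occupants per building."""
    daily = {}
    for r in rows:
        key = (r["building_code"], r["event_time"].date())
        daily.setdefault(key, set()).add(r["badge_id"])
    return {k: len(v) for k, v in daily.items()}

silver = to_silver(bronze)
gold = to_gold(silver)
```

The point of the layering is that each downstream use case reads from silver or gold, never from raw, so cleaning logic lives in exactly one place.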
Datamart: Structured for Analytics
On top of this, we built a Databricks datamart with two layers:
Canonical Tables
Raw + normalized fact and dimension tables
Examples: badge swipes, reservations, service requests
Joined via keys like Building_Code, Company_Code, Local_Date
This formed a snowflake-style model with shared dimensions across all datasets.
Aggregated Tables
Pre-calculated datasets (occupancy, utilization, visitors, etc.)
Designed specifically for analytics performance
This pushed heavy computation upstream and made dashboards fast.
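The canonical-to-aggregated step can be sketched as a join-and-rollup on the shared keys (Building_Code, Local_Date). The schemas below are illustrative stand-ins for the real datamart tables:

```python
# Sketch: pre-aggregating canonical fact rows into an analytics-ready table.
badge_swipes = [  # canonical fact table (illustrative)
    {"Building_Code": "BLD-01", "Local_Date": "2024-03-04", "Badge_Id": "A"},
    {"Building_Code": "BLD-01", "Local_Date": "2024-03-04", "Badge_Id": "B"},
    {"Building_Code": "BLD-02", "Local_Date": "2024-03-04", "Badge_Id": "C"},
]
dim_building = {  # shared dimension table keyed by Building_Code
    "BLD-01": {"Capacity": 4},
    "BLD-02": {"Capacity": 10},
}

# Roll up once upstream so dashboards read a small, fast table.
grouped = {}
for row in badge_swipes:
    key = (row["Building_Code"], row["Local_Date"])
    grouped.setdefault(key, set()).add(row["Badge_Id"])

agg_daily_occupancy = [
    {
        "Building_Code": b,
        "Local_Date": d,
        "Occupants": len(badges),
        "Capacity": dim_building[b]["Capacity"],
    }
    for (b, d), badges in grouped.items()
]
```

In production this runs as a scheduled warehouse job; the BI layer only ever scans the small aggregated output, not the raw swipe events.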
Power BI: Semantic Layer & Measures
Power BI sat on top of Databricks and handled:
Table relationships (building, company, date)
Business logic via measures
Final aggregation and filtering
Key principle:
Databricks = heavy computation
Power BI = flexible analysis
For example, metrics like utilization were calculated as measures (e.g., total occupancy ÷ total capacity), so they scaled correctly across filters.
Data pipelines ran daily overnight, with selective near-real-time updates for specific use cases.
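Why utilization must be a measure rather than a pre-averaged column is easiest to see numerically. A small sketch with hypothetical rows:

```python
# Utilization as a ratio of sums (re-aggregated under any filter),
# versus the naive average of per-row ratios. Rows are hypothetical.
rows = [
    {"building": "BLD-01", "occupancy": 2, "capacity": 4},
    {"building": "BLD-02", "occupancy": 5, "capacity": 100},
]

# Correct: sum numerator and denominator over the filtered rows, then divide.
utilization = sum(r["occupancy"] for r in rows) / sum(r["capacity"] for r in rows)

# Wrong: averaging per-row ratios overweights small buildings.
naive = sum(r["occupancy"] / r["capacity"] for r in rows) / len(rows)
```

Here the correct portfolio utilization is about 7%, while the naive average reports 27.5% because the tiny half-full building counts as much as the large empty one. Defining the metric as a measure means the BI layer recomputes the ratio after every slice and filter.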
Turning Data Into Insight
The system was designed to balance:
Standardization (shared datasets across customers)
Flexibility (custom analysis per client)
We combined:
Pre-aggregated datamart tables
Dynamic Power BI measures
Dashboards were modular and enabled per customer, but all built on the same core data model.
AI Layer: Making Data Accessible
Savvy’s AI assistant made the platform usable for non-technical users.
Instead of dashboards alone, users could ask:
“Which buildings are underutilized?”
“How can I reduce energy costs?”
“What’s driving high operating costs?”
Behind the scenes:
Structured prompt engineering
Example query training (200+ cases)
Context-aware SQL generation
Validation + retry logic
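The validation and retry flow can be sketched as a small loop. The `generate_sql` and `run_query` callables below are hypothetical stand-ins for the real LLM call and warehouse client, and the guardrails are deliberately simplified:

```python
# Hedged sketch of an NL→SQL validate-and-retry loop (not the production agent).
ALLOWED_TABLES = {"agg_daily_occupancy", "dim_building"}

def validate(sql: str) -> bool:
    """Cheap guardrails before execution: read-only, known tables only."""
    lowered = sql.lower()
    if not lowered.lstrip().startswith("select"):
        return False
    return any(t in lowered for t in ALLOWED_TABLES)

def answer(question, generate_sql, run_query, max_retries=2):
    """Translate a question to SQL, validate, execute, retry with feedback."""
    feedback = None
    for _ in range(max_retries + 1):
        sql = generate_sql(question, feedback)
        if not validate(sql):
            feedback = "Query rejected: must be a SELECT on allowed tables."
            continue
        try:
            return run_query(sql)
        except Exception as exc:  # surface DB errors back to the model
            feedback = f"Query failed: {exc}"
    return None

# Demo with stubs: first attempt fails validation, second succeeds.
attempts = ["DROP TABLE x", "SELECT * FROM agg_daily_occupancy"]
gen = lambda q, fb: attempts.pop(0)
result = answer("Which buildings are underutilized?", gen, lambda sql: "ok")
```

Feeding the validation or database error back into the next generation attempt is what makes the retry productive rather than a blind repeat.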
Go-To-Market & Adoption
A strong product alone wasn't enough; we had to make it understandable.
I led and supported:
Product demos to customers
Early-access rollout + feedback loops
Sales training and enablement
Documentation, FAQs, and guides
Data storytelling for marketing
This ensured:
Sales could clearly explain value
Customers could quickly adopt the product
Leadership could position Savvy strategically
Impact
Product Impact
Established data as a core product pillar at Cohesion
Enabled cross-system analytics across buildings and portfolios
Created foundation for AI + digital twin strategy
Customer Impact
Reduced reliance on manual analysis
Enabled faster, better decision-making
Delivered insights across:
Cost
Occupancy
Space utilization
Operations
Technical Impact
Standardized data modeling + governance
Improved pipeline reliability and performance
Created scalable architecture for future analytics
Closing
This work sits at the intersection of:
Product strategy
Data architecture
Analytics
AI
My role was to connect all of those into a single, usable product.
And more importantly: Turn complex, messy building data into something people can actually use to make decisions.