



Improve Design Strategies Through Better Benchmarking
Data Analyst and Strategist Emily Gaines and Laboratory Planning Director George L. Kemper, AIA examine the role of benchmarking in early lab planning, and the conditions under which it supports better design decisions.
by George Kemper and Emily Gaines
Originally published in Commercial Construction + Renovation, April 2026
High-performing laboratory environments increasingly depend on the quality of data used to inform early planning decisions. Organizations such as Syngenta, BASF, The Ohio State University, and Duke University rely on operational data to guide laboratory design, improve efficiency, and build stronger stakeholder consensus during the planning process. Advances in digital tools and other technological trends in the industry now allow design teams and owners to move beyond assumptions and improve how space is measured, benchmarked, and evaluated.
Benchmarking helps improve design decisions, but it does not play the leading role. Benchmarking relies on historical data, and in an ever-evolving industry built on cutting-edge technology, historical data may not always reflect current trends and best practices. The value of benchmarking lies in merging historical data with expertise in current trends and best practices to holistically inform design decisions. Grounding decisions in data allows lab designers to reduce the number of design iterations required, because the data provides a foundation to build on. It also helps identify whether a client’s space falls within a standard range, provides insight into what makes a lab effective, and enables designers to compare similar projects and determine a given client’s particular needs.
Benchmarking strengthens laboratory design decisions by providing a data-driven foundation grounded in previous projects, while designers interpret that data through the lens of the owner’s goals, vision, and current industry trends and best practices. For example, historical data establishes realistic baselines such as typical lab-to-support ratios, module sizes, and researcher densities. These baselines allow designers to begin planning with informed assumptions instead of starting from scratch. When comparing a client’s program to these benchmarks, teams can quickly determine whether proposed space allocations fall within common industry ranges or are outliers that require further investigation. In some cases, these differences are justified based on the type of research or operational model.
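The screening step described above can be sketched in a few lines of code. This is a minimal illustration only: the metric names and benchmark ranges below are hypothetical placeholders, not published industry standards, and a real benchmarking dataset would draw its ranges from a firm's own project history.

```python
# Illustrative sketch: screening a client's program against benchmark ranges.
# All metric names and ranges here are hypothetical, for demonstration only.

BENCHMARK_RANGES = {
    "lab_to_support_ratio": (0.8, 1.5),   # hypothetical range
    "module_width_ft": (10.0, 11.5),      # hypothetical range
    "researchers_per_module": (2, 4),     # hypothetical range
}

def screen_program(program: dict) -> dict:
    """Flag each program metric as within range or an outlier to investigate."""
    findings = {}
    for metric, value in program.items():
        low, high = BENCHMARK_RANGES[metric]
        if low <= value <= high:
            findings[metric] = "within benchmark range"
        else:
            findings[metric] = f"outlier ({value} vs. {low}-{high}); investigate"
    return findings

client_program = {
    "lab_to_support_ratio": 2.1,
    "module_width_ft": 10.5,
    "researchers_per_module": 3,
}
for metric, finding in screen_program(client_program).items():
    print(f"{metric}: {finding}")
```

An outlier flag is not a verdict; as the article notes, a ratio outside the range may be fully justified by the research type or operational model, and the flag simply tells the team where to look.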


Digital tools can be customized to accept only the data values identified by the team, company, or client, thereby adding a layer of data validation. Bad data is a significant barrier to the successful use of data to inform decisions. Designers can collect cleaner data and ultimately streamline the data transformation process by building data validation practices into their digital tools. In turn, a more efficient data transformation process creates more time for data analysis and increases confidence in the data.
Data analysts are commonly estimated to spend about 80% of their time cleaning and preparing data and only 20% analyzing it. The most value comes from the 20%, but those insights are only beneficial if the other 80% is performed correctly. Using digital tools for data validation eliminates common causes of bad data that arise with manual data entry, reduces the time required to clean the data, and shifts some of that 80% back toward more in-depth analysis.
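The entry-time validation described above can be sketched simply: a record is checked against the team's approved values before it ever enters the dataset, so errors are rejected at the source rather than cleaned up later. The field names and allowed categories below are hypothetical, chosen only to illustrate the pattern.

```python
# Illustrative sketch of entry-time validation: rejecting bad values before
# they reach the dataset. Field names and allowed categories are hypothetical.

ALLOWED_SPACE_TYPES = {"wet lab", "dry lab", "lab support", "office"}

def validate_record(record: dict) -> list:
    """Return a list of validation errors; an empty list means the record is clean."""
    errors = []
    if record.get("space_type") not in ALLOWED_SPACE_TYPES:
        errors.append(f"unknown space_type: {record.get('space_type')!r}")
    area = record.get("area_sf")
    if not isinstance(area, (int, float)) or area <= 0:
        errors.append(f"area_sf must be a positive number, got {area!r}")
    return errors

# A typo'd category and a negative area are both caught at entry time:
bad = {"space_type": "wetlab", "area_sf": -120}
print(validate_record(bad))
```

The same checks a designer would otherwise perform during cleanup run automatically at entry, which is how validation shifts time from the 80% back toward the 20%.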
Heat mapping provides invaluable insights as well. Creating heat maps from occupancy analytics can help designers understand how people move through a given space, compared to how designers planned for people and materials to move through it. Heat maps can measure the effectiveness of designs and inform potential modifications to future designs. It’s essential for design firms to consider what data exists in the organization’s day-to-day work. For example, the design industry doesn’t inherently generate data the same way the sales industry might, nor does it utilize data in the same way the financial sector does. Clients can consider the project’s goals and evaluate whether existing data related to those goals can be leveraged to inform design solutions.
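At its core, the heat mapping described above is an aggregation problem: detections from occupancy sensors are counted per location and rendered as intensity. The sketch below shows that aggregation step with made-up sample readings on a small grid; real occupancy analytics platforms add time windows, floor-plan overlays, and visualization on top of the same idea.

```python
# Illustrative sketch: aggregating occupancy-sensor readings into a simple
# grid "heat map". Sensor positions and counts are made-up sample data.

from collections import Counter

# Each reading is a (grid_x, grid_y) cell where a person was detected.
readings = [
    (0, 0), (0, 0), (1, 0), (1, 1), (1, 1), (1, 1), (2, 1),
]

def build_heat_map(readings, width=3, height=2):
    """Count detections per cell and return a row-major grid of counts."""
    counts = Counter(readings)
    return [[counts[(x, y)] for x in range(width)] for y in range(height)]

for row in build_heat_map(readings):
    print(row)
# prints [2, 1, 0] then [0, 3, 1]
```

Comparing such a grid against the circulation paths the designers intended is what lets a team measure whether people and materials actually move through the space as planned.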
The design industry generates significant data across the sources that architects, designers, planners, engineers, and contractors use daily. Construction management software, enterprise resource planning (ERP) software, and building information modeling (BIM) software, for instance, all store immense amounts of data, in addition to traditional benchmarking data contained in spreadsheets. It’s crucial for organizations to consider questions such as: how does this data facilitate better benchmarking and decision-making, and does knowing that these are significant data sources change how thoroughly items are documented in each software system?


Data is not part of the traditional design and planning process, so integrating it into the design stage requires strong leadership support and a structured change-management approach. From a client standpoint, it’s vital to ensure leadership is aligned with using data to inform decisions, because insights from analysis may lead teams to rethink previously held assumptions. Openness to uncovering unique insights hidden within the data starts from the top down.
For design firms, organizations need to determine how data-related efforts are accounted for in the design fee, since they are not typically included at this stage. The standard fee structure from a decade ago does not capture the time spent collecting data or incorporating benchmarking insights into design solutions. Identifying how to accommodate this additional layer of detail requires strategic consideration from firm leadership to align fees and timelines with the demands of a more data-centric environment. Understanding the value proposition of incorporating data into a project’s design requires buy-in from leadership and project stakeholders, including the design team, clients, and contractors. Value-engineering a design aspect that was supported by data can undermine the purpose of data-driven decision-making.
Ultimately, making decisions based on data has a positive impact on end users by reassuring clients that they’re making informed choices, especially when those decisions carry financial implications. Designs driven solely by data, however, fail to account for best practices. The prevalence of bad data presents another ongoing challenge. Combining utilization analysis, benchmarking, and structured user feedback creates a more accurate picture of how laboratories function in practice. Understanding patterns from past projects, combined with current industry trends and best practices, supports purpose-driven lab planning. The value of data-informed decisions is realized when quantitative data is layered with qualitative insights to meet the unique needs of each client and lab. When integrated into the planning process, these approaches strengthen decision-making, improve operational outcomes, and help teams achieve alignment on complex design choices.








