Brian Gue, manager of data science at PCL, speaks with Digital Journal CEO Chris Hogg at Edmonton Unlimited. – Photo by Jennifer Friesen, Digital Journal
A dense block of alphanumeric product numbers spans the screen. At first glance, they look nearly identical. They aren’t.
“There is no industry standard for the description of these components,” said Rowan Andruko, a computer science graduate working on PCL’s industrial data science team.
Andruko pulls up two descriptions of the same piping component. One comes from the engineering firm that designed the facility. The other has been standardized for use in PCL’s fabrication shop in Nisku, Alberta.
“You can see they’re quite different, but they contain the same information,” he said.
The example was shared during a community coffee at Edmonton Unlimited, in a conversation led by Built World Tech lead Zack Storms.
Brian Gue, manager of data science at PCL, and members of his team walked through how a large construction firm builds and tests internal digital tools. Headquartered in Edmonton, PCL is one of Canada’s largest employee-owned construction companies.
I’ve attended many digital transformation briefings, but this one stood out for the mix of people presenting.
Gue, who studied mathematics before moving into applied data science in industrial construction, was joined by a team spanning applied math; civil, electrical, and mechanical engineering; product development; and design.
One member spoke about a background in fine arts and how that perspective informs visualization and usability. Another described building a machine learning model to simulate wildfire spread before applying similar techniques to construction data.
Gue’s team includes experienced professionals, but many of the presenters at this event were students from MacEwan University and the University of Alberta, and the quality of both their work and their delivery was hard to miss.
The discussion focused on a common problem in industrial construction: big projects generate massive amounts of technical information that must be translated, standardized, and reconciled before any work can move forward.
In Canada, that process underpins billions of dollars in industrial work.
According to Statistics Canada, investment in building construction reached $24.5 billion in November 2025 alone. Of that, $6.9 billion was in non-residential construction, including industrial projects.
Billions of dollars move from design to procurement, fabrication, and installation every month.
For Gue, that scale reframes the industry.
“When we look down [at] what the priorities for the country are, the priorities for the cities that we live in, everything passes through a capital project funnel,” he said.
The performance of that funnel impacts how quickly Canada can build what it says it needs. And that’s important because across the construction industry, productivity has not kept pace.
A June 2025 report from KPMG and the Canadian Construction Association found the industry has averaged just 0.4% annual growth since 1997 and fell to a near 30-year low in 2023. That means more spending and more labour are not producing proportionate output on site.
Labour pressures are also compounding the challenge.
Statistics Canada data shows that construction jobs are still going unfilled at higher rates than before the pandemic, and BuildForce Canada projects 270,000 experienced tradespeople will retire between 2025 and 2034.
A perfect storm forms when you also factor in the lag that comes from construction processes still being manual.
StatsCan data from 2025 shows fewer than 10% of construction businesses plan to adopt AI over the next year, suggesting most still rely on manual data reconciliation and spreadsheets.
That’s the backdrop for the work Gue and his team are trying to change.
If productivity has stalled and labour is tightening, the way information is managed inside a project becomes more than an administrative concern.
He described the problem as “information reconstruction from one party to the next,” adding that projects involve “many, many, many handoffs.”
Each handoff requires someone to reinterpret what came before. Across large industrial builds, that repetition compounds.
For Gue’s team, bringing order to project data is one way to reduce that drag, and the PCL team shared many examples of tools being built to standardize components, streamline reporting, and simplify complex plans.
From translation to schedule impact
The first presentation made the issue tangible.
Industrial piping systems are often fabricated in specialized shops before being transported to site. Fabrication teams have to interpret engineering intent and convert it into formats that align with procurement and shop systems.
The PCL team in Edmonton shared a slide titled “Boyle.ai — Universal Technical Translation,” showing how thousands of technical descriptions have to be converted into a consistent format before materials can be ordered.
Today, this work is manual.
“Normally done by hand, with a couple of people sitting in front of Excel sheets for hours, days, or even weeks translating one component at a time,” the slide reads.
Material can’t be purchased until the descriptions are aligned, so the PCL team is now using machine learning and AI to accelerate classification and standardization.
To bridge the gap between an algorithm and a project coordinator’s intuition, the team provides a “confidence score” for every automated translation.
“This confidence score is important for a user’s sense of trust for the tool,” Andruko said.
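For readers curious how a score like that can be produced, the sketch below shows one generic approach: a text classifier that maps raw component descriptions to standardized codes and reports the probability of its top prediction as the confidence. The model choice, feature pipeline, and example codes are assumptions for illustration, not details of PCL’s Boyle.ai system.

```python
# Minimal sketch of confidence-scored component translation.
# The model, features, and labels are illustrative assumptions,
# not PCL's actual Boyle.ai implementation.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: raw descriptions paired with standardized codes.
raw_descriptions = [
    "PIPE, CS, SMLS, A106-B, 2IN, SCH 80, BE",
    '2" SCH80 SEAMLESS CARBON STEEL PIPE ASTM A106 GR B',
    "ELBOW 90 LR, 3IN, SCH 40, A234-WPB, BW",
]
standard_codes = ["PIPE-CS-2-S80", "PIPE-CS-2-S80", "ELB90-CS-3-S40"]

model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
model.fit(raw_descriptions, standard_codes)

def translate(description: str):
    """Return the predicted standard code plus a confidence score."""
    probs = model.predict_proba([description])[0]
    best = probs.argmax()
    return model.classes_[best], float(probs[best])

code, confidence = translate('PIPE SMLS A106 GR.B 2" SCH 80')
print(code, f"confidence={confidence:.2f}")  # low scores can be routed to a human reviewer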
Gue’s team also regularly retrains its machine learning models and treats its algorithms like athletes in a playoff, constantly replacing weak performers with stronger ones.
“We run a tournament for our machine learning models,” said Andruko. “Whenever we retrain them, we compare each [new model] with their previous iterations, and we allow the best-performing ones to continue forward, and we toss out the lower-performing ones.”
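As a rough illustration of that promote-or-discard cycle, the sketch below compares a retrained candidate against the incumbent model on a shared hold-out set and keeps whichever scores better. The metric, data split, and function names are assumptions, not PCL’s actual pipeline.

```python
# Sketch of a "tournament" between a retrained model and its previous iteration.
# The accuracy metric and hold-out set are illustrative assumptions.
from sklearn.metrics import accuracy_score

def run_tournament(candidate, champion, X_holdout, y_holdout):
    """Keep whichever model scores better on the shared hold-out set."""
    candidate_score = accuracy_score(y_holdout, candidate.predict(X_holdout))
    champion_score = accuracy_score(y_holdout, champion.predict(X_holdout))
    if candidate_score > champion_score:
        return candidate, candidate_score  # promote the retrained model
    return champion, champion_score        # keep the incumbent, discard the candidate
```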
While things move quickly once the models are in place, assembling the training data was a long, Herculean effort.
“It took us probably a year to get 80,000 records, and that is not very much data for machine learning models,” Andruko said. “So we had to do some work in synthesizing data.”
Synthesizing data involves creating artificial datasets that mimic real-world patterns to give the algorithms enough “practice” to improve their accuracy when historical records are sparse. Andruko said data acquisition is a major challenge in construction, and most of what PCL uses is internal data because there isn’t much else to go on.
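Rule-based augmentation is one common way to stretch a small set of real records: perturb genuine descriptions in ways that mirror how vendors actually vary them, and keep the original label. The abbreviation table and perturbations below are illustrative assumptions, not PCL’s method.

```python
# Sketch of rule-based synthetic data augmentation for sparse component records.
# The abbreviation table and perturbations are illustrative assumptions.
import random

ABBREVIATIONS = {"CARBON STEEL": "CS", "SEAMLESS": "SMLS", "SCHEDULE": "SCH", "INCH": "IN"}

def synthesize(description: str, label: str, n: int = 5):
    """Generate n plausible variants of a real record, keeping its label."""
    variants = []
    for _ in range(n):
        text = description
        for long_form, short_form in ABBREVIATIONS.items():
            if long_form in text and random.random() < 0.5:
                text = text.replace(long_form, short_form)
        tokens = text.split(", ")
        random.shuffle(tokens)  # vendors list attributes in different orders
        variants.append((", ".join(tokens), label))
    return variants

print(synthesize("PIPE, CARBON STEEL, SEAMLESS, 2 INCH, SCHEDULE 80", "PIPE-CS-2-S80"))
```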
On another slide in the presentation, project reporting and forecasting were positioned as “a lifeblood of capital projects’ project management,” covering cost, schedule, hours, and quantities.
Michelle Fribance, a data scientist and engineer who also has an arts background, is responsible for making that reporting “modern, insightful, and cognitively easier for its audiences,” so project leaders can see emerging risks sooner and adjust before small data gaps turn into field problems.
She shared a simple tip she uses to help keep people focused on the data, and not how pretty the dashboard is.
“These are our mock-ups,” Fribance said, pointing to the screen. “You’re probably looking at these and wondering, ‘How does this chick with a fine arts degree create this and think it’s nice? She can’t draw straight lines, and can’t colour in the lines.’ This is actually intentional,” she said to a roomful of laughter.
Fribance said she designs with low fidelity because she doesn’t want construction leaders seeing a product in development and thinking it’s complete.
At this stage of development, experience tells her that people will focus on things they don’t like, such as colour choice, when what she needs is input on how data is visualized.
“We want our clients to focus on things that matter, and things we need input on,” said Fribance. “We need input on metrics, KPIs, and basics about the domain. That’s why we keep things intentionally low-fidelity, and then we build up from there when we have more input. Eventually we end up with the final product.”
Outside of design, some of the tools PCL is building originate from immediate project demands rather than long-term product roadmaps.
One example involved a system built by a summer intern for his own project work. The tool geolocates devices within a drawing set and links them to a digital twin of the facility. Instead of searching through layers of drawings, teams can see exactly where equipment is installed and how it connects to surrounding components.
The value comes from cutting the time spent figuring out what the original design was meant to do.
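Conceptually, a tool like that reduces to an index that maps each device tag to its drawing, coordinates, and connections. The sketch below is a hypothetical version of such a lookup, not the intern’s actual implementation.

```python
# Sketch of linking device tags found in a drawing set to a digital-twin record.
# Field names, tags, and the lookup structure are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class DeviceRecord:
    tag: str                      # instrument or equipment tag from the drawing
    drawing: str                  # sheet where the device appears
    x: float                      # plan coordinates extracted from the drawing
    y: float
    connected_to: list = field(default_factory=list)

# Index keyed by tag so field teams can jump straight to location and context.
twin_index = {}

def register(record: DeviceRecord) -> None:
    twin_index[record.tag] = record

def locate(tag: str) -> Optional[DeviceRecord]:
    """Return where a device sits and what it connects to, without paging through drawings."""
    return twin_index.get(tag)

register(DeviceRecord("PT-1047", "P&ID-203", x=41.2, y=88.6, connected_to=["V-210", "PIC-1047"]))
print(locate("PT-1047"))
```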
Another slide, “Beeline: Precision Electrical Optimization,” outlined a system designed to optimize, visualize, and communicate electrical plans at scale.
Beeline was born from the need to plan cable routing and fix a process that often leads to delays. Large industrial projects can involve hundreds or even thousands of kilometres of electrical runs that are divided into tens of thousands of segments.
By mapping and visualizing those routes before installation, the team aims to reduce cost and manage risk in electrical and instrumentation work.
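At its core, that kind of planning can be framed as a shortest-path problem over the network of cable trays. The toy example below uses a standard Dijkstra search over a made-up tray graph; the segment data, lengths, and naming are illustrative assumptions, not Beeline’s implementation.

```python
# Sketch of cable-route optimization as a shortest-path problem over tray segments.
# The graph, segment lengths, and endpoints are illustrative assumptions.
import heapq

def shortest_route(graph, start, end):
    """Dijkstra over a dict of {node: [(neighbour, length_m), ...]}."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        length, node, path = heapq.heappop(queue)
        if node == end:
            return length, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, segment in graph.get(node, []):
            if neighbour not in visited:
                heapq.heappush(queue, (length + segment, neighbour, path + [neighbour]))
    return float("inf"), []

# Tray network: junction boxes connected by tray segments with lengths in metres.
trays = {
    "MCC-1": [("J1", 40.0), ("J2", 65.0)],
    "J1": [("J3", 30.0)],
    "J2": [("J3", 15.0)],
    "J3": [("MOTOR-7", 25.0)],
}
print(shortest_route(trays, "MCC-1", "MOTOR-7"))  # (95.0, ['MCC-1', 'J1', 'J3', 'MOTOR-7'])
```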
Gue views these immediate problems as the entry point to solving massive industry-wide gaps.
“It’s usually a tip-of-the-iceberg situation,” he said. “People can say, ‘I have a problem here that is going to take me a month or a year to solve, so can you help?’ But by the time we’ve solved it, we’ve found something that’s going to help the whole industry.”
Small teams inside a large enterprise
Once the demo ended, Gue shared details on how the team operates, saying a lot of the approach is similar to startup thinking.
“Work out in the open,” one slide reads.
“Forward-deploy helps: be as close to our clients/users as possible, work shoulder-to-shoulder,” reads another point.
And “Speed matters and momentum matters.”
Gue said roughly 80% of the team’s time is allocated to prioritized business needs, with about 20% reserved for experimentation and research.
That structure places most of the team’s effort inside defined project priorities rather than open-ended exploration.
The emphasis on working in the open and staying close to users means tools are introduced where they will be used, and their value is judged by whether they reduce manual effort or clarify decisions.
“We still have a long way to go in terms of making a digital-native decision process within a very traditional industry,” Gue said.
“But that’s up to us to make compelling solutions for our industry. The priority is bringing more of this algorithmic thinking, algorithmic support to our field teams in order to bring more certainty, and contextual and visual information to the field.”
Final shots
- Digital transformation in capital projects starts with disciplined data. If inputs are inconsistent, automation and analytics cannot scale.
- In complex environments, performance is shaped by how information moves between teams. Standardization and clarity upstream influence cost, schedule, and risk downstream.
- Small, cross-functional teams with protected experimentation time can drive meaningful change inside large enterprises when their work is tied directly to operational priorities.