First Proposal Automation: From 48 Hours to Real-Time at GAF Energy
The Stakes
GAF Energy sells solar roof shingles through roofing partners. To pitch solar to a homeowner, a roofer needs a proposal: a document showing the solar design for that specific roof, the expected energy production, warranty data, and the pricing. Solar production calculations are uniquely complex because even two identical roofs at slightly different latitudes or longitudes will produce different amounts of energy. The angle of the sun, local weather patterns, seasonal changes in the sun's path, nearby tree coverage and the shade it produces, and even voltage drop across cable runs all factor into accurate production estimates. Each proposal had to account for all of these variables to generate reliable production figures and the warranties built on top of them.

Before this project, generating a single proposal took the Design Engineering team approximately 48 hours. The process was almost entirely manual. Design engineers worked primarily out of Excel worksheets that handled the bulk of the production calculations and design constraint logic. These worksheets were partially integrated with both Aurora (the solar design software) and Salesforce, but the integration was incomplete, so engineers still had to manually move between all three systems throughout the process.
At 48 hours per proposal, the bottleneck was obvious. A roofing salesperson visiting a homeowner's property could not get a solar proposal ready during the conversation. They had to request one, wait two days, then follow up. In a business where the sale often happens (or doesn't) during that first site visit, a 48-hour delay meant lost deals. The Design Engineering team was also struggling to keep up with growing demand. Every new roofing partner meant more proposals, and the manual process did not scale.
My Role
I was the Technical Product Manager responsible for defining the automation requirements, designing the workflow architecture, and managing the engineering team through execution. This meant mapping every step of the existing manual process with the Design Engineering team, identifying which steps could be automated and which needed human judgment, defining the integration points between Salesforce and Aurora's API, and coordinating with the design engineers who would ultimately use the new system daily.
The project also required building trust with the Design Engineering team. These were experienced engineers who had developed their own workflows over months of hands-on work. Telling them "we are going to automate your process" could easily land as "we are going to replace you." The framing that worked was different: automation handles the repetitive data movement so you can focus on the design decisions that actually require your expertise.
The Real Problem
The manual process had roughly 12 to 15 discrete steps between receiving a proposal request and delivering a finished PDF. When I mapped them out with the design engineers, it became clear that the majority of those steps were not design work at all. They were data movement: copying information from one system to another, running calculations in Excel, formatting fields, triggering workflows, and exporting documents. The actual design decisions (reviewing the solar layout, adjusting for roof characteristics, validating that the system was buildable) represented maybe 20 to 30 percent of the total time. The rest was operational overhead that existed because our systems were not properly connected.
The root cause was that Salesforce, Aurora, and the Excel worksheets were operating as loosely coupled systems with design engineers filling the gaps. Salesforce held the lead and property data. Aurora held the solar design tools and simulation models. Excel handled the heavy calculation work (production estimates, design constraint logic, determining which rules applied to which systems) and served as a partial bridge between the other two tools. There was no fully automated data flow between them. Design engineers were the human integration layer, manually shuttling data across all three systems for every proposal.
The 48-hour turnaround was only for the first proposal, and it did not include changes. If a design engineer made an error, or if a roofing salesperson reviewed the proposal and noticed something off (new trees on the property causing shade, a system layout that did not match current site conditions), a change request could add another 12 to 24 hours. This compounded the capacity problem significantly. Every changed proposal was a proposal that could have been a new one. If a design engineer had a rough day and a few proposals needed corrections, the queue backed up for the entire team. The roofing salespeople felt this acutely: they were used to the speed of pitching asphalt roofs, and the multi-day turnaround for solar proposals made the sales conversation much harder to sustain.
It is also worth noting that these proposals were not informal estimates. They got signed. They were an official part of the sales process where GAF Energy made guarantees to the homeowner about expected system production and warranty coverage. That meant the automation challenge was twofold: proposals had to be generated fast, and they had to be accurate.
The proposal PDF itself was another bottleneck. It was assembled manually from exported data, which meant formatting inconsistencies, occasional errors in production numbers or pricing, and limited ability to regenerate a proposal quickly if something changed.
This project later became the foundation for two related initiatives. The first was the Redesign Tool, which gave roofers the ability to make design adjustments themselves and generate updated proposals in near real-time. We built an interface on top of Aurora's SDK inside the roofer portal, simplifying the tooling so roofing partners could make design changes on the fly and generate a revised proposal in roughly 20 minutes. This transformed the sales motion from a multi-day back-and-forth into an interactive conversation with the homeowner. The second was the Design Automation system (ADC/ADV), where we automated the solar roof design process itself. ADC (Automated Design Creation) ingested address and roof data from Aurora and filled the roof with as many custom shingles as possible using a variation of Kadane's algorithm. ADV (Automated Design Validation) then checked the design against a large set of hardware and locality-based design rules; if a design failed validation twice, it was routed to the Design Engineering queue for manual review. Over time, this system automated nearly 90% of incoming designs. Both will have their own case studies.
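The case study names only "a variation of Kadane's algorithm" for the shingle fill, without specifics. One plausible 2D extension of that idea, shown purely as an illustrative sketch (not GAF Energy's actual implementation), treats each roof plane as a grid of usable/unusable cells and repeatedly places shingles in the largest all-usable rectangle, found via the classic histogram technique:

```python
from typing import List, Tuple

def largest_rectangle(heights: List[int]) -> Tuple[int, int, int]:
    """Largest rectangle under a histogram: returns (area, left_index, width)."""
    stack = []  # (start_index, height) pairs
    best = (0, 0, 0)
    for i, h in enumerate(heights + [0]):  # trailing 0 flushes the stack
        start = i
        while stack and stack[-1][1] >= h:
            s, sh = stack.pop()
            area = sh * (i - s)
            if area > best[0]:
                best = (area, s, i - s)
            start = s
        stack.append((start, h))
    return best

def best_shingle_block(usable: List[List[int]]) -> Tuple[int, int, int, int]:
    """Find the largest all-usable rectangle in a roof-plane grid.
    Returns (row, col, height, width) of the block to fill with shingles.
    Run repeatedly (marking placed cells unusable) to pack the plane."""
    cols = len(usable[0])
    heights = [0] * cols
    best = (0, 0, 0, 0, 0)  # (area, row, col, height, width)
    for r, row in enumerate(usable):
        for c in range(cols):
            # Each column accumulates consecutive usable cells, Kadane-style.
            heights[c] = heights[c] + 1 if row[c] else 0
        area, left, width = largest_rectangle(heights)
        if area > best[0]:
            h = area // width
            best = (area, r - h + 1, left, h, width)
    return best[1:]
```

A real implementation would work in physical shingle dimensions and respect setback and obstruction rules, but the core search pattern is the same: scan row by row, maintaining per-column runs of usable space.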
Approach
The solution had three components, each designed to eliminate a category of manual work.
Salesforce as the single workspace
We moved the entire proposal workflow into Salesforce so that design engineers could manage everything from one place. This eliminated the need for the Excel worksheets that had been handling calculations and bridging data between Aurora and Salesforce. All lead data, property information, design parameters, production calculations, and proposal status now lived in Salesforce, and the workflow stages were enforced through custom objects and automation rather than managed manually across three separate tools.
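Enforcing workflow stages through custom objects and automation amounts to making illegal stage jumps impossible. As a rough sketch of the idea (the stage names here are hypothetical, not the actual Salesforce configuration), the enforcement logic is a small transition table, the kind a Salesforce validation rule or Flow would encode:

```python
# Hypothetical proposal stages; the real custom objects and stage
# names at GAF Energy are not specified in this case study.
ALLOWED = {
    "Requested": {"Design In Progress"},
    "Design In Progress": {"Design Review", "Requested"},
    "Design Review": {"Proposal Generated", "Design In Progress"},
    "Proposal Generated": {"Delivered", "Design Review"},
    "Delivered": set(),  # terminal stage
}

def advance(current: str, target: str) -> str:
    """Reject any stage transition not in the allowed table,
    so a proposal cannot skip review or move backward arbitrarily."""
    if target not in ALLOWED.get(current, set()):
        raise ValueError(f"Illegal transition: {current} -> {target}")
    return target
```

The point of centralizing this in Salesforce is that no engineer has to remember the rules: the system simply refuses to record an out-of-order step.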
Aurora API integration
We built an integration that pulled solar design data from Aurora into Salesforce via Aurora's API. Instead of a design engineer manually exporting a design from Aurora and entering the specifications into an Excel worksheet for calculation, the system pulled the relevant data (system size, expected production, panel layout, shading analysis) directly into the Salesforce record. This was the most technically complex piece of the project because Aurora's API had its own quirks and rate limits that we needed to work around. Some computationally intensive requests could time out when processing large or complex calculations, so we added a caching layer to ensure that if a connection dropped mid-request, we could resume without losing progress. We also established proxies between our systems and Aurora during this project. The original integrations had no proxy layer, which meant there was no reliable way to track API events or ensure secure connections. Adding proxies gave us event tracking, security, and better observability. This became a new company standard: any third-party connection would go through a proxy for security and monitoring.
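The caching-and-resume behavior described above can be sketched as follows. This is an illustrative pattern only, assuming hypothetical per-design endpoints (`summary`, `production`, `layout`, `shading`); Aurora's actual API paths, authentication, and rate limits are different and not reproduced here:

```python
import time
from typing import Callable, Dict

class ResumableFetcher:
    """Pull a design's data piece by piece, caching each completed piece
    so a dropped connection resumes where it left off instead of
    restarting the whole pull. Sketch only; endppoint names are invented."""

    def __init__(self, fetch: Callable[[str], dict],
                 retries: int = 3, backoff: float = 0.5):
        self.fetch = fetch              # e.g. a thin HTTP GET wrapper
        self.retries = retries
        self.backoff = backoff
        self.cache: Dict[str, dict] = {}  # endpoint -> completed payload

    def get(self, endpoint: str) -> dict:
        if endpoint in self.cache:      # already fetched: resume from cache
            return self.cache[endpoint]
        for attempt in range(self.retries):
            try:
                payload = self.fetch(endpoint)
                self.cache[endpoint] = payload
                return payload
            except TimeoutError:
                # Exponential backoff before retrying a timed-out request.
                time.sleep(self.backoff * (2 ** attempt))
        raise TimeoutError(f"gave up on {endpoint} after {self.retries} tries")

    def pull_design(self, design_id: str) -> dict:
        # Hypothetical per-design endpoints, not Aurora's real paths.
        pieces = ["summary", "production", "layout", "shading"]
        return {p: self.get(f"designs/{design_id}/{p}") for p in pieces}
```

If the connection drops midway through `pull_design`, the pieces already in the cache are not re-requested on the next attempt, which also keeps the integration friendlier to rate limits.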
Automated PDF generation
We spun up a dedicated service in GCP and built custom Salesforce packages that could generate a complete proposal PDF from the data already in Salesforce. The PDF was templated: it pulled field values (system size, production estimates, pricing, warranty data, homeowner information) from the Salesforce record and populated a designed template automatically. This was the first time GAF Energy had a system-generated proposal. Previously, every PDF was hand-assembled from exported data.
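The template-filling step is conceptually simple: map record fields into a fixed layout and refuse to render if anything is missing. A minimal sketch of that idea, with invented field names and a plain-text stand-in for the designed PDF layout:

```python
from string import Template

# Toy stand-in for the designed proposal layout; field names are
# illustrative, not the actual Salesforce schema.
PROPOSAL = Template(
    "Solar Proposal for $homeowner\n"
    "System size: $system_kw kW\n"
    "Estimated year-1 production: $production_kwh kWh\n"
    "Price: $$${price}\n"          # "$$" renders a literal dollar sign
    "Warranty: $warranty_years years"
)

REQUIRED = ("homeowner", "system_kw", "production_kwh", "price", "warranty_years")

def render_proposal(record: dict) -> str:
    """Populate the proposal template from a Salesforce-style record,
    failing loudly on missing fields rather than shipping a
    half-filled document to a homeowner."""
    missing = [f for f in REQUIRED if f not in record]
    if missing:
        raise KeyError(f"record missing fields: {missing}")
    return PROPOSAL.substitute(record)
```

Because these proposals carried production and warranty guarantees, the fail-loud check matters as much as the templating itself: a blank field should block generation, not slip through into a signed document.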
Execution
The project ran in phases across roughly four two-week sprints rather than as a single large release. The first phase focused on getting the Salesforce workflow operational so that design engineers could stop relying on the Excel worksheets. This was a quick win that reduced friction and built confidence in the new system before the more complex integration work landed. It also served as an important test of the working relationship between my team and the Design Engineering team. Early on, the feedback loop was too loose: design engineers would message us in Google Chat when they noticed an issue, and there was no guarantee we would see it quickly or have a clear sense of priority. By the end of this first phase, we formalized a cadence of starting and ending each sprint with dedicated sessions to gather feedback, review open issues, and align on priorities for the next sprint. That structure made a significant difference because the design engineers were the true experts of this process. They were the ones who noticed when something was slightly off or could identify improvements that were not obvious from the outside.
The Aurora API integration came next. We worked closely with Aurora's documentation and support team to map every field we needed, handle authentication, and build error handling for cases where the API returned incomplete data or timed out. We tested the integration extensively against real proposals to ensure the data flowing into Salesforce matched what design engineers were accustomed to seeing when they pulled the information manually.
The PDF generation phase was the most iterative. The first version of the automated proposal was functional but not polished. Design engineers and the sales team reviewed multiple drafts, and we went through several rounds of refinement on layout, wording, and data presentation before the output was at a level the team felt comfortable sending to homeowners. This was expected: a templated document needs to work for every proposal, which means edge cases in data formatting, varying system sizes, and different financing options all need to render correctly.
Throughout execution, I ran the rollout as a gradual transition rather than a hard cutover. We selected one design engineer to pilot the entire new process before rolling it out to the rest of the team. This made it easier to manage expectations and respond quickly when issues surfaced. If a bug made it to production, I could message the pilot engineer directly and give them a heads-up before it affected their work. Once confidence was high enough from the pilot, we expanded to the full team, and eventually deprecated the manual workflow.
Results
| Metric | Before | After |
|---|---|---|
| Proposal turnaround time | ~48 hours | ~2-3 hours initially, then ~1 hour |
| Design Engineering efficiency | Majority of time spent on data movement | Focus shifted to actual design decisions |
| Proposal PDF generation | Manual Excel export and formatting | System-generated from Salesforce data |
| Data entry between systems | Manual (Salesforce, Aurora, Excel) | Automated via Aurora API |
| Proposal accuracy | Prone to manual errors | Template-driven, consistent output |
The initial automation brought turnaround down from roughly 48 hours to about 2 to 3 hours. The remaining time was mostly the design review itself, which still required human judgment. Later iterations of the automation (including the Design Automation system that followed this project) further reduced the time to around 1 hour for straightforward proposals.
The business impact extended beyond time savings. With proposals available in roughly an hour, roofing salespeople could request a proposal in the morning and have it ready for an afternoon site visit. In some cases, the turnaround was fast enough that a roofer could request a proposal while on-site with a homeowner and pitch solar during the same conversation. That was not possible at 48 hours. The speed changed the sales motion entirely, and the dramatically shorter turnaround for change requests meant that roofers could iterate on proposals with homeowners rather than waiting days for each revision.
For the Design Engineering team, the shift was meaningful. Engineers who had been spending the majority of their time on data entry and document formatting could now focus on the work that actually required their expertise: evaluating roof geometry, adjusting panel layouts for shading, and ensuring designs were installable. As proposal demand grew with each new roofing partner, the team could scale without proportionally scaling headcount because the automation absorbed the operational overhead.
What I'd Do Differently
I would have invested more in error monitoring from day one. The Aurora API integration worked well in the happy path, but edge cases (timeouts, partial data returns, unexpected field formats) were harder to detect than they should have been. We caught most issues through manual QA during the first few weeks, but an automated alerting system would have surfaced problems faster and reduced the burden on design engineers who were reporting issues as they encountered them.
I also would have involved the sales team earlier in the PDF design process. We optimized the proposal document for accuracy and completeness, but the salespeople who actually presented it to homeowners had strong opinions about ordering, emphasis, and language that we only discovered during the later refinement rounds. Getting their input at the wireframe stage rather than the review stage would have saved at least one full iteration cycle.
So What
The design engineers on this project were the deepest experts on the proposal process, and that expertise was essential to getting the automation right. The biggest lesson from this project was about feedback loops and what it takes to migrate a complex, high-volume manual process while the team is still actively working in the existing system. It is something like fixing a plane while flying it, except the plane is already under heavy load and you are rebuilding the fuselage while the passengers are still on board. Every change to the system had to be introduced carefully because the team could not stop shipping proposals while we built the replacement.
Expectation management turned out to be one of the most important parts of the project. On any team going through a significant workflow change, there will be people who are cautious about it. When a bug inevitably ships and it happens to affect the person who was already hesitant, it can hit team morale and create friction between the teams driving the change and the teams living with it. In a hardware-meets-software company like GAF Energy, that dynamic is especially important to manage. Choosing a single design engineer to pilot the system first gave us a controlled environment to catch issues early and build trust before expanding to the full team. It also gave me a direct line to flag known issues before they disrupted anyone's workflow.
The deeper pattern was this: roughly 70 percent of the proposal workflow had been consumed by operational overhead, specifically data movement between disconnected systems that had nothing to do with the actual design work. Once we connected those systems and automated the mechanical steps, the same team could handle significantly more volume at higher quality. That pattern (identify the operational overhead, replace it with actual integrations, and let skilled people focus on the work that requires judgment) became a template I applied to every platform project that followed.