Five key takeaways from Solarplaza Summit Asset Management North America 2026
Will Troppe, Senior Director of Product at Power Factors, shares insights from the Solarplaza Summit Asset Management North America, which took place April 1-2, 2026, in San Diego, California.
I recently attended Solarplaza’s Asset Management North America Summit in San Diego. I left more energized than ever, inspired by the people who power our industry.
Here are my five key takeaways.
From dashboards to dialogue: Generative AI is bringing a paradigm shift to the user experience
Another year, another conference, another panel on AI, this time titled, “AI Meets O&M: Smarter Operations, Real Results.” The topic felt materially more substantive this time, but I’m admittedly biased: I joined the panel alongside senior leaders from Madison Energy Infrastructure, 60 Hertz, Raptor Maps, and NovaSource.
AI is still a buzzword, and it’s still following hype-cycle dynamics. But it’s also materially changing how we run companies, build products, and use software. It's hype, but it's not just hype, and that distinction matters.
When people say "AI" today, they typically mean large language models, generative AI, and AI agents. That's distinct from the traditional, domain-tailored data science and ML analytics that have guided our industry for years: soiling analysis, full loss disaggregation, economic yield optimization, and predictive failure detection. Those approaches are still enormously valuable and improving linearly.
In contrast, generative AI represents a paradigm shift in how users expect to interact with their software. It’s a total re-imagination of the human-machine interface. Forms and Excel tables were never exciting, but they've been the best way to update database content. For visualizing data, reports were great in the 90s and 00s, and BI Dashboards took the mantle in the 2010s. Now, generative AI is powering the next wave, thanks to its unique ability to deliver bidirectional, natural language interactions. It’s (somewhat ironically) more human-centered than anything we've built before. People no longer need to learn software; it meets them where they are.
Garbage in, garbage out: Data trust is the true AI differentiator
It’s an old maxim in our industry, even more applicable in the age of AI. Just as AI amplifies a skilled software developer (and proliferates spaghetti for the less structured), AI systems are only as good as the data they’re trained on and the context they draw from.
On our AI panel, we discussed our personal experiences "vibe coding": building small prototypes and functional local applications. I told my own tale of two prototypes: one a surprising success, the other a disappointing letdown. The only difference? The first had a solid foundation: quality inputs, documentation, user stories, requirements, and instructions. The other was merely a backlog of promising ideas. The successful result is already in our product; the failure was a good lesson learned on a Saturday morning.
Power Factors’ Senior Director of Product, Will Troppe, speaking at SAMNA 2026 (image courtesy of Solarplaza)
We’ve all heard that “AI is data-hungry.” But more isn’t always better. Bad data clouds models and muddies results. Providing too much context can counterintuitively overwhelm AI systems, wasting power, diluting meaning, and yielding far less useful results. When building with AI, and when building AI-powered products, consider the right context for the job. You must be intentional about the agents, skills, tools, rules, and data accessed by AI models.
“Systems of Record” are becoming “Systems of Context.” In this new world, high-quality, structured, always-available, AI-ready datasets designed to scale are even more essential than before.
For software vendors in the renewable energy management space, this is a moat. 300+ gigawatts of high-quality operational data isn't something a startup can replicate in six months. The companies that have invested in platform integrity over years are the ones that will get the most out of AI.
Commiseration is therapeutic
The most popular breakout session of the week was a workshop, provocatively titled, “Solar F*** Ups.” The moderators invited participants to anonymously write their stories on slips of paper to encourage candor, then filtered the submissions, read their favorites aloud, and invited the instigators to self-identify.
Like the room, the stories were overflowing. Modules installed upside down because a carport owner thought it looked better. Mallets used to align modules on racking, cracking them in the process. Mice creating homes in containers. Inverters' front panels exploding into schoolyards. A sewer main unknowingly encountered during construction and accidentally filled with concrete — discovered the hard way.
Why did this session draw such a crowd? Simple: field teams want to feel heard. Operations staff carry a lot of institutional knowledge about what goes wrong and why, and they don't always have a direct line to the people making procurement and design decisions. Getting in a room with peers and swapping war stories is part of how this industry learns. It gives them hope that their expertise will have a meaningful outlet.
There's a product lesson here, too. At its best, our software is a conduit between the field and the boardroom. It makes sure cloud truth equals ground truth. It shortens the distance between someone noticing a problem on-site and someone with budget authority knowing it needs to be fixed. As AI systems allow less-technical executives to synthesize raw field data better than ever before, the popularity of "Solar F*** Ups" is a signal that the industry still has distance to close.
BESS is a mess: Owners seek proven partners for success
Battery Energy Storage is no longer in its childhood. It’s reached its adolescence, with all the angst, growing pains, and existential drama that comes with it.
Historically black-box OEMs are beginning to open up to enable continued scaling, sharing the “servicing” love and creating market opportunities for advanced technician teams.
Owners are increasingly successful at negotiating access to operational data, no longer locked out of the feeds needed to calculate the metrics that dictate the terms of their contracts and warranties. Counterparties know what to look for, what to expect, and what to demand.
Still, there are struggles. Sites that are several years old no longer align with today’s standards, constantly weighing down operating portfolios for their remaining decades of life. Operators are beginning to standardize KPI definitions and contract terms, hungry to translate them into operational workflows that scale.
Power Factors has over a decade of BESS experience across our Asset Performance Management and SCADA/EMS offerings. We've seen a lot. We've navigated the data access negotiations, the integration challenges, operational optimization, and evolving standards. For operators trying to establish what good looks like, having a partner who's been through it matters. BESS doesn’t have to be a mess.

Building in an AI-first world
I've been thinking about a pattern I keep observing among those on the cutting-edge of AI tool adoption: novice → overshoot → stabilize.
When you first start building with AI, you're a novice. You're prompting in chat, getting outputs, copying and pasting results between windows, and marveling at what's possible. But you’re still building line-by-line, step-by-step, component-by-component.
Then you see the light, and you overshoot. Multi-step agentic workflows take hold. You push further, one more tweak, one more feature, one more pull of the prompt slot machine. Variable outputs, variable rewards; an “infinite scroll” of building. I’ll go to bed soon. Wait, how did it get so late?!
You begin to generate faster than you can evaluate. You forget what grass feels like. Reviewing code becomes fatiguing. Outputs overwhelm. Quality and supportability threaten to degrade quietly. You begin to approach burnout as you burn through tokens. This is not sustainable.
Let’s stabilize. You optimize for outcomes rather than outputs. You re-balance between speed and quality. You use the right model for the job, structure your context carefully, tailor your agents, and review with discipline. This is where the real productivity gains live.
In this new world, judgment is the precious commodity. Taste dominates. The meeting about the feature takes more time than building it, but is even more essential. User knowledge and industry context matter as much as AI context, if not more. Understanding what customers actually need, navigating organizational complexity, interoperating with real infrastructure, staying secure and compliant, making good architectural decisions, and maintaining trust across systems and teams become even more essential.
The Jevons paradox applies here: making something cheaper tends to increase total consumption, not reduce demand for the underlying expertise. That expertise is exactly where we've invested: over 300 gigawatts of operational history, decades of customer relationships, deep SCADA integration, and a team that's seen every failure mode in the industry.
There's never been a better time for smart, motivated people to learn and build. Asset Management North America reminded me that our industry is full of exactly these types of people. See you out there.