If I’m using STC, I give a tolerance of 30% for inefficiency. I believe that takes into account atmospheric conditions, possible shading, etc. But at a training I attended, we were told to use NOCT instead. That would mean a bigger panel array in terms of area, and more cost. Thoughts?
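Roughly, here's how the two sizing approaches compare in a quick Python sketch. All numbers are illustrative assumptions, not from any datasheet; the 73% NOCT-to-STC fraction is a typical ballpark for common modules, and the load and sun-hour figures are made up:

```python
# Comparing array sizing: STC nameplate + blanket derate vs. NOCT basis.
# Every number here is an illustrative assumption, not real site data.

DAILY_LOAD_WH = 3000        # assumed daily energy need
PEAK_SUN_HOURS = 4.5        # assumed site insolation

# Method 1: STC nameplate with a blanket 30% derate
stc_derate = 0.70
array_w_stc = DAILY_LOAD_WH / (PEAK_SUN_HOURS * stc_derate)

# Method 2: size on the NOCT rating (typically ~73% of STC),
# applying only a smaller residual derate (e.g. wiring losses)
noct_fraction = 0.73
residual_derate = 0.90
array_w_noct = DAILY_LOAD_WH / (PEAK_SUN_HOURS * residual_derate)
# Convert NOCT watts back to the STC nameplate watts you'd have to buy:
array_w_stc_equiv = array_w_noct / noct_fraction

print(f"STC + 30% derate: {array_w_stc:.0f} W nameplate")
print(f"NOCT basis:       {array_w_stc_equiv:.0f} W nameplate")
```

With these assumed numbers the NOCT-based sizing does call for a somewhat larger (and costlier) nameplate array, which matches what the trainers implied.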
Derates are applied with the best data available for conditions and site characteristics. Long-term weather data is translated into insolation received, expressed in peak sun hours, which puts us right at being able to use STC in reference to module output over time. System architecture matters too.
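In code, that translation from long-term insolation to an output estimate is just a couple of multiplications. The figures below are made-up illustrations, not site data:

```python
# Peak sun hours (PSH): daily insolation in kWh/m^2 divided by the
# STC reference irradiance of 1 kW/m^2, so the two numbers are equal.
daily_insolation_kwh_m2 = 4.2   # assumed long-term site average
peak_sun_hours = daily_insolation_kwh_m2 / 1.0

def daily_energy_wh(nameplate_w, psh, derate):
    """STC nameplate scaled by peak sun hours and an overall system derate."""
    return nameplate_w * psh * derate

# Illustrative: 1200 W array, 20% overall system losses assumed
print(daily_energy_wh(1200, peak_sun_hours, 0.80))  # -> 4032.0
```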
Without long-term data from reasonably near the site, you pretty much need to place a test module or array and log it.
While a 30 percent derate may work in some areas, and at some sites in those areas, a blanket derate based on presumption would be irresponsible unless it's a DIY job.
When I look at my main array output (nameplate 1200 W), I am getting 1140 W or better at the controller. But that is energy at one moment. I could never tell you how weather or sun angle would affect that over a day, a month, or a year without long-term data, or without simply firing it up for a few years and logging it.
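For what that snapshot is worth as a number, here's the arithmetic (nameplate and reading from above; the logged samples are purely hypothetical to show why one reading is not a derate):

```python
nameplate_w = 1200   # array nameplate (STC)
measured_w = 1140    # reading at the controller, one moment in time

snapshot_ratio = measured_w / nameplate_w
print(f"{snapshot_ratio:.0%} of STC at that instant")  # -> 95%

# A real derate needs many logged samples across seasons; averaging
# some hypothetical readings shows how far the picture can shift:
logged_w = [1140, 980, 610, 1100, 250, 890]   # made-up sample readings
avg_ratio = sum(logged_w) / len(logged_w) / nameplate_w
print(f"{avg_ratio:.0%} of STC on average over the log")
```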
Even that only goes so far. There are places here in Michigan where a 20-minute drive puts you in an entirely different insolation zone, a snow and cloud belt, leaving long-term data for the region not very useful. Somebody has to have a system or sensor in that area in order to accurately (or, in all honesty, approximately) determine array sizing, or a derate to sizing, for those sites.