r ransom wrote:
If we were further north or used high-efficiency panels, angle would matter more. But those panels don't work here from the middle of October to the start of May, so there's no point changing the angle for winter. There is not enough sunlight for the panels no matter how perfectly they are angled. It is simply too overcast.
So the type of panel is a big part of answering whether it's worth changing the angle.
Personally, I find so much solar power information is useless to my situation that the best thing to do is experiment before investing in a full roof installation.
As a side note, it used to be suggested that the angle of the roof follow the latitude too, since that helps with rain, snow, and passive heating and cooling. But we don't do that so often anymore.
Most of my work is off-grid or grid-tied with batteries in rural areas. At 45 degrees latitude I almost always go with a 45-degree fixed-angle ground mount due to costs, regulation, snow, and year-round efficiency. If you were only net metering, a roof pitch of 4/12 to 6/12 is ideal because it maximizes yearly returns, but by the time you get engineering on the roof and municipal permits to modify the structure, and you factor in increased insurance costs and losses to snow cover, the production gain and cost savings of a roof mount are gone here. Adjustable racks are more expensive than fixed ones as well, so at today's panel prices you just add panels to a fixed-angle array to compensate. That is where my thinking is these days.

Douglas Campbell wrote: Hi;
The standard recommendation for solar panel angle is your latitude;
ex. 45 latitude; 45 angle from horizontal.
Panels work best when perpendicular to the sun's rays, so 45 from horizontal is a
compromise between the summer (45 + 23 = 68) high-angle sun, with an optimal panel angle of 22 from horizontal, vs.
the winter (45 - 23 = 22) low-angle sun, with an optimal panel angle of 68 from horizontal.
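The compromise above can be made concrete with the solar-noon approximation. This is a rough sketch only: it uses solstice declination of ±23.5 degrees and ignores atmosphere, time of day, and diffuse light.

```python
# Rough seasonal sun elevation and optimal panel tilt at a given latitude,
# using the solar-noon approximation: elevation = 90 - latitude + declination.
# Illustrative only, not a design tool.

def noon_elevation(latitude_deg, declination_deg):
    """Sun elevation above the horizon at solar noon, in degrees."""
    return 90.0 - latitude_deg + declination_deg

def optimal_tilt(elevation_deg):
    """Tilt from horizontal that puts the panel face-on to the noon sun."""
    return 90.0 - elevation_deg

lat = 45.0
seasons = [("summer solstice", 23.5), ("equinox", 0.0), ("winter solstice", -23.5)]
for name, decl in seasons:
    elev = noon_elevation(lat, decl)
    print(f"{name}: sun at {elev:.1f} deg, optimal tilt {optimal_tilt(elev):.1f} deg")
```

At 45 latitude this gives tilts of roughly 22 (summer), 45 (equinox), and 68 (winter) degrees, matching the numbers above; a fixed latitude-angle mount splits the difference.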
But I do not think the latitude angle works best for typical situations.
i) Grid tie: This depends upon the utility deal.
My jurisdiction pays 1:1 for energy exchange, and cancels any excess export above consumption to 0 at the end of the year.
I should maximize annual production up to consumption, at minimum capital cost.
In my climate that means a lower panel angle, towards summer optimum, because that maximizes annual production per panel & racking.
Coincidentally, bungalow roofs are less than 45, and so approximate a good summer angle for 1:1 grid tie.
From the utility point of view, they would prefer me to maximize winter production when demand is higher here; eventually regulations will likely push that way.
ii) Summer use off grid: similar to grid tie; optimize near the summer angle to minimize investment in panels & racking required.
iii) Year round off grid: panels near winter angle will maximize scarce winter production, and shed snow better.
The cost of panels is now low, so 'wasting' panels in summer is not a big detriment.
But panels need racks, and rack costs have not decreased, even if home built.
So, there are a lot of variables. One thing I would try to figure out is whether you are better off getting a second charge controller or upgrading your all-in-one to a larger unit that has more MPPT capacity. Sell off the old one to help offset the cost. If you upgrade, try for one with closed-loop communication to get the most out of your panels and your battery.

As to brands, that changes depending on where you are in the world. Most of the higher-voltage chargers are made by two or three Chinese companies and then rebranded under dozens of different names. I stick to gear that has been certified to a certain standard, but that adds cost.

As to losses, the way the all-in-one units work means you will burn 50-100 W continuously just keeping the unit running. The MPPT adds very little to that; it's the inverter portion of the unit that accounts for most of the standby losses. Good luck on the journey.

Ahmet Oguz Akyuz wrote: Hi David,
I am using lithium (LiFePO4 to be specific) at 48 V. The battery box has a BMS in it and I can monitor the individual cells (there are 16 of them) using an app. But the inverter has no communication with the BMS -- it is a hybrid inverter rated at 6.2 kW. So I think my inverter is already relying on the overall battery voltage read at the battery terminals. Is this a problem?
If I add a second charge controller, which in the meantime I have realized is quite an expensive option, it would also connect directly to the battery terminals and would sample the battery voltage from there. But I will make sure to enter the same battery settings for the existing inverter and the new MPPT.
I do have a question regarding the MPPT choice. Given that my panel specs are 320-410 W, 32-40 V, 7.7-10 A, and I will connect 6 of these in series, can you recommend a budget-friendly MPPT that will work with them? Victron models are unfortunately very expensive.
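For anyone weighing in on that question, a quick string-voltage estimate may help. The Voc markup and cold-weather factor below are assumptions (Voc is typically some 20% above operating voltage, and rises further in the cold); the actual datasheet Voc and the coldest local temperature should be used before buying anything.

```python
# Rough check of string voltage for 6 panels in series, to size an MPPT input.
# The per-panel numbers are taken from the quoted 32-40 V spec range; the
# Voc markup and cold-weather factor are assumptions, not datasheet values.

panels_in_series = 6
vmp_max = 40.0                  # highest operating voltage per panel in the range
voc_estimate = vmp_max * 1.2    # Voc assumed ~20% above Vmp (assumption)
cold_factor = 1.12              # low-temperature Voc correction (assumption)

string_vmp = panels_in_series * vmp_max
string_voc_cold = panels_in_series * voc_estimate * cold_factor

print(f"String operating voltage: {string_vmp:.0f} V")
print(f"Estimated worst-case cold Voc: {string_voc_cold:.0f} V")
```

With those assumptions the string lands around 240 V operating and over 300 V open-circuit in the cold, so a controller rated for only 250 V PV input would be too small; a higher-input-voltage class (or a shorter string) would be needed.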
Also, I noticed that inverters can consume a non-negligible amount of battery capacity at night. The aforementioned 6.2 kW inverter easily eats up around 15% of my 100 Ah battery. Do you know if charge controllers also consume that much?
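For scale, that 15% overnight figure works out to roughly the standby draw mentioned earlier in the thread. A quick check, assuming a nominal 48 V pack and a 12-hour night (both assumptions):

```python
# Back-of-envelope check of the ~15% overnight battery drain figure.

battery_ah = 100
battery_v = 48                            # nominal LiFePO4 pack voltage (assumption)
capacity_wh = battery_ah * battery_v      # 4800 Wh total
overnight_drain_wh = 0.15 * capacity_wh   # 15% of capacity = 720 Wh
night_hours = 12                          # assumed length of the night
avg_standby_w = overnight_drain_wh / night_hours

print(f"Average overnight standby draw: {avg_standby_w:.0f} W")  # 60 W
```

That lands squarely in the 50-100 W standby range quoted for all-in-one units, which supports the point that the inverter stage, not the MPPT, dominates the idle losses.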
Thanks.
Ahmet Oguz Akyuz wrote: Hi there,
In my setup, I have two strings of panels, on the east and west sides of my roof. Each string has 6 approximately 400 W panels connected in series. I would like to ensure that each string gets its own MPPT for optimal performance. I previously found that when I connected these strings directly in parallel, I ran into a malfunction in one of the strings (most likely the bypass diodes). So I now want to do things right to avoid a similar problem in the future.
The issue is that my solar hybrid inverter has a single MPPT module. How can I best add a second MPPT to my system? It seems to be suggested that the second MPPT's output should be connected directly to the battery. Doesn't that create a weird configuration where one string goes to the inverter and then to the battery, while the other string goes directly to the battery (after its MPPT)? I guess if I make this connection, the inverter would be fully unaware of my second string. Wouldn't that cause a problem?
Or would you suggest that using blocking diodes in a combiner box is a better option for my system?
Thanks for any insights.
William Bronson wrote: So a heat pump works better when the gas or liquid it is stealing heat from is warmer.
A brief search shows that the average temperature of municipal sewage is 50°F to 70°F.
I'm gonna guess that the air in the system is a similar temperature.
This gives me some crazy ideas.
-An air sourced heat pump that gets its air source from sewage vents.
To avoid breaking the liquid seals in the traps we would put in as much air as we extract.
-A liquid-sourced heat pump that draws heat from grey water held in an insulated tank.
Held greywater tends to turn into black water, but aeration can prevent that.
-A liquid-sourced heat pump that draws from a counterflow heat exchanger. This could be the least efficient.
I think the air sourced pump could be better because it will get cooler temps overall which will help for cooling, plus no tank of dirty water to deal with.
In a house that has city sewage but doesn't use it, there is way more leeway for such a system.
In a house with a septic system, there is already a giant underground container of dirty water.
This could be a place to put a coil.
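The "warmer source helps" intuition can be sanity-checked against the ideal (Carnot) COP limit. The delivery temperature below is an assumption, and real heat pumps only achieve a fraction of the ideal figure, but the relative advantage of the warmer source carries over.

```python
# Ideal (Carnot) heating COP for two heat sources, to illustrate why a
# 50-70 F sewage source beats cold winter air. Real machines reach maybe
# 40-50% of this limit; the ratio between sources is the point here.

def f_to_k(temp_f):
    """Convert Fahrenheit to Kelvin."""
    return (temp_f - 32) * 5 / 9 + 273.15

def carnot_cop_heating(t_source_f, t_delivery_f):
    """Upper bound on heating COP between a source and a delivery temperature."""
    t_hot = f_to_k(t_delivery_f)
    t_cold = f_to_k(t_source_f)
    return t_hot / (t_hot - t_cold)

delivery_f = 95.0  # assumed supply temperature to the house
print(f"Sewage source (60 F): COP limit {carnot_cop_heating(60, delivery_f):.1f}")
print(f"Winter air (20 F):    COP limit {carnot_cop_heating(20, delivery_f):.1f}")
```

With those assumed temperatures the 60°F source roughly doubles the theoretical ceiling compared with 20°F outdoor air, which is why sewage and held greywater are tempting sources.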
Hi Doug,

Douglas Campbell wrote: Hi David;
The newer inverters can handle the output, but you would need a large PV array to drive Level 2 charging at 7200 W off-grid.
Current EV batteries are ~75 kWh capacity, so they require ~10 h of Level 2 at 7200 W for a complete charge; ~10%/h added to the EV.
A sizable offgrid PV array of 7000 W nominal might take ~ 10 h of full sun to fully charge an EV battery.
In contrast, Level 1 charging at ~1500 W can often run for ~5 h/day from a 7000 W off-grid PV array, in parallel with domestic usage, giving about 2% charge/h or 10%/day to the EV.
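The percentages above follow from simple arithmetic on a ~75 kWh pack (nameplate watts, charging losses ignored for simplicity):

```python
# Percent of a 75 kWh EV pack added per hour at each charging level,
# and hours for a full charge. Nameplate power only; losses ignored.

pack_kwh = 75.0
levels = {"Level 1 (1.5 kW)": 1.5, "Level 2 (7.2 kW)": 7.2}

for name, kw in levels.items():
    pct_per_hour = kw / pack_kwh * 100
    hours_full = pack_kwh / kw
    print(f"{name}: {pct_per_hour:.1f}%/h, ~{hours_full:.0f} h for a full charge")
```

Level 1 works out to 2%/h (50 h for a full charge, hence the ~10% gained in a 5 h solar window), and Level 2 to just under 10%/h (~10 h for a full charge), matching the figures above.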
This all comes from my experience.
At home, with grid-tied solar, Level 2 charging generally outruns our instantaneous PV (11400 W nominal).
At an off-grid place (6700 W nominal PV) we use Level 1 to gain ~10% EV charge daily, once the domestic battery bank is full.
A large off-grid battery bank is ~30 kWh capacity, so charging the 75 kWh car from the bank alone would drain it in less than a day, but it is useful for evening out cloudy patches etc. during charges.
This comes down to use cases, and adjusting expectations from "drive to the gas station on empty and get 600 km of range for ~$75".
~ 5 h of Level 1 is sufficient for most users, most days, to get back ~ 40 km or so.
~ 10 h of Level 2 gives the convenience of a 'fill up' over night or in a day, ~ 400 km or so.
Level 3 is (often) expensive but only used on road trips, ~ 300 km in ~ 30 min or so.
Pee break plus snacks.
Some people drive 100's of km a day and have no /limited charging infrastructure, and large bladders :)
But most people do not.
cheers Doug
There are lots of newer grid-tied solar inverters coming online now that support high-voltage DC charging, or at the very least can support a Level 2 charger.

Douglas Campbell wrote: Interesting discussion.
I charge my EV at home with a Level 2 240V, powered by grid-tied solar (87% of our total household consumption is solar over the year).
Off grid, daytime solar can support Level 1 120V, drawing about 1500 W. Level 1 is slow, so offgrid feasibility depends upon use-case.
The fix-it-yourself arguments for locally common older cars are strong, but going forward both EV & ICE cars are becoming complicated.
And in my climate, most consumer vehicles older than 15 years rust out, including, sadly, my previous Honda CR-V.
Thumbs up for a Honda Fit/Jazz, one of our previous favourite cars, although a bit dinky in backroad snow.
I was tempted by the Chevy Silverado EV WT or the Ford F-150 Lightning; quiet power stations on wheels for remote use, but mileage per kWh is poor and charging is long.
John Weiland wrote:
larry kidd wrote: It got down to about 20°F last night and I had never insulated or heated the batteries. Lost power about 2:30 am; it took until about noon to get the cells warmed up to about 35°F (2°C) and get power back online. I spent the better part of the day after that wrapping the cells with heat tape for pipes and putting insulation under and over; I still need to go back and insulate the sides. Used 30 feet of heat tape with a 90 W draw. It has its own thermostat: on at 35, off at 50, if I remember correctly.
Living where we do in the central US just below the Canadian border, an experience like this is what causes me to hesitate to dive into LiFePO4. I probably will anyway and just keep the investment small to modest. My wife is still tooling around the farmyard in recent ~10°F weather using lead-acid batteries in a Polaris Ranger EV, and we are grateful for the robustness of the time-tested tech, even with the known power deficits of these batteries in cold weather.
There was mention recently of Canada leaning more towards solid-state/sodium-ion technology, partially because it may be a less expensive battery to produce, but also in large part due to its greater resilience to cold temperatures. Still, that battery too will use a battery management system (BMS), and one hopes these don't turn out to be a weak link in the technology.

Larry K, I always wondered if a seedling heating mat would be enough to prevent a severe temperature drop in such situations. Clearly, if the location is too cold and the batteries unprotected, the BMS will do best to shut down the battery. But where the batteries are housed in an insulated container of some sort, a seedling mat seems designed to deliver low-temperature, low-wattage heat to whatever sits on it. Perhaps this would be a safe solution for many out there?

Also a question for those who installed LiFePO4 batteries a decade or two ago: have you experienced or heard of situations where either the cells or the BMS itself failed, requiring battery or cell replacement? If the BMS goes bad and the cells are otherwise good, can the BMS be replaced fairly easily (assuming a battery case whose contents can be accessed)? Thanks!
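On the seedling-mat question, a very rough steady-state heat-loss estimate gives a feel for whether a ~20 W mat could keep up. Every number below (box size, insulation level, temperatures) is an assumption for illustration; an actual box and climate would need its own numbers.

```python
# Rough steady-state heat loss for a hypothetical insulated battery box,
# using Q = A * dT / R. All inputs are assumptions, not measurements.

side_m = 0.6                   # cube-ish box, 0.6 m per side (assumption)
area_m2 = 6 * side_m ** 2      # ~2.16 m^2 of surface area
r_si = 1.76                    # ~R-10 (imperial) foam board in SI units, m^2*K/W
delta_t = 25.0                 # hold cells at 5 C when it is -20 C outside

heat_loss_w = area_m2 * delta_t / r_si
print(f"Steady-state heat loss: {heat_loss_w:.0f} W")
```

With those assumptions the loss is about 31 W, so a single ~20 W mat would fall a bit short on the coldest nights; doubling the insulation, shrinking the box, or adding a second mat would close the gap. The 90 W heat tape Larry used has comfortable margin by this estimate.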