John61CT wrote:Socal Tom wrote:My fridge uses less than 9 watts per hour, so roughly 200 watts over 24 hours ( 17 amps). I could theoretically go a regular 2 night trip without solar and stay above 50% amperage. Lighting etc brings me to about 300 watts per day, so 50 watts and good sun works out ok.

Watts per hour or day has no meaning, nor does amps per any time period.

Perhaps you mean watt hours and amp hours, Wh and Ah, per time period?

Even very efficient 12V compressor fridges burn 3-4 amps while the compressor is running.

In very hot weather, running as a freezer at 15°F, the duty cycle is a much higher percentage than as a fridge at 38°F in cooler ambient temps.

Say from 15 Ah per day up to 60 Ah.
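The duty-cycle arithmetic behind that range can be sketched as follows; a minimal illustration assuming running currents and duty cycles in the ranges mentioned above (the function name and the specific figures are my own, not from the thread):

```python
# Hypothetical sketch: daily amp-hour draw of a 12 V compressor fridge
# from its running current and its duty cycle (fraction of time running).

def daily_amp_hours(run_current_a: float, duty_cycle: float) -> float:
    """Ah per day = current while running x fraction of time running x 24 h."""
    return run_current_a * duty_cycle * 24

# Mild weather, fridge duty: ~3.5 A running, ~20% duty cycle
print(daily_amp_hours(3.5, 0.20))  # -> 16.8 Ah/day
# Hot weather, freezer duty: ~4 A running, ~60% duty cycle
print(daily_amp_hours(4.0, 0.60))  # -> 57.6 Ah/day
```

Those two illustrative cases land near the low and high ends of the 15 to 60 Ah/day range.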

I usually size banks to go without inputs for 3 days; with lead-acid, of course, trying to never go below 50%.
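That sizing rule (N days of load, lead-acid kept above 50%) works out to a simple formula; a rough sketch with an assumed 25 Ah/day load as the example:

```python
# Sketch of the bank-sizing rule described above: the bank must cover
# N days of load with no charging inputs, while only a fraction of its
# capacity (the max depth of discharge) is actually usable.
# The daily load figure below is illustrative, not from the thread.

def required_bank_ah(daily_load_ah: float, days_autonomy: int = 3,
                     max_dod: float = 0.50) -> float:
    """Total capacity = days of load / usable fraction."""
    return daily_load_ah * days_autonomy / max_dod

# e.g. a 25 Ah/day load, 3 days of autonomy, 50% DoD lead-acid bank:
print(required_bank_ah(25))  # -> 150.0 Ah nominal capacity
```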

FWIW

Watts per hour is the same as watt hours. Watts and amps both have standard definitions.

From Wikipedia

The ampere is that constant current which, if maintained in two straight parallel conductors of infinite length, of negligible circular cross-section, and placed one metre apart in vacuum, would produce between these conductors a force equal to 2×10−7 newtons per metre of length.[6][3][7]

The watt (symbol: W) is a derived unit of power in the International System of Units (SI) defined as 1 joule per second[1] and can be used to quantify the rate of energy transfer. Power has dimensions of ML²T⁻³.

Since I gave a unit of time to go with the amps/watts, it should all make sense.

Watts give a rate of usage; watt hours are the total. It’s similar to distance: 55 miles per hour is a rate of speed, but if I drive 55 mph for 2 hours, I’ve gone 110 miles.
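The speed/distance analogy maps directly onto power and energy; a small sketch assuming the 9 W fridge load quoted at the top of the thread and a 12 V system:

```python
# Power (W) is a rate; energy (Wh) is rate x time, like mph vs miles.
# Dividing energy by system voltage gives amp hours (Ah).

def energy_wh(power_w: float, hours: float) -> float:
    """Energy in watt hours = average power x time."""
    return power_w * hours

def amp_hours(energy: float, volts: float = 12.0) -> float:
    """Amp hours = watt hours / system voltage."""
    return energy / volts

# A 9 W average fridge load over 24 hours:
e = energy_wh(9, 24)
print(e, amp_hours(e))  # -> 216.0 Wh, 18.0 Ah at 12 V
```

That is in the same ballpark as the "roughly 200 watt hours, 17 Ah" figure quoted above.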

Sent from my iPad using Tapatalk