I am planning to suspend LED strip lights over my layout, and all the materials have now arrived on the slow boat from Asia.
I am using a 5 m strip of double-density (600 LED) type 5050 LEDs. Based on the specs, the maximum draw would scale to 144 W (12 A @ 12 V) for the full string. I purchased a dedicated 120 W 12 V power supply, counting on the real-world draw coming in roughly 20% under the theoretical maximum.
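For anyone checking my math, here's the back-of-envelope version. The 0.24 W per-LED figure is the usual 5050 datasheet number, not something I measured:

```python
# Full-spec draw for the strip, assuming the typical 5050 rating
# of ~0.24 W per LED at full brightness (datasheet figure, not measured).
leds = 600
watts_per_led = 0.24
supply_voltage = 12.0

max_power = leds * watts_per_led             # theoretical maximum, watts
max_current = max_power / supply_voltage     # amps the wiring must carry
print(f"{max_power:.0f} W, {max_current:.0f} A")
```

That 12 A is what drove the choice of a beefy supply rather than a hobby power pack.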
So, some tests. No pictures, because those LEDs are bright. I can't believe I ever thought 1 amp of current illuminated them!
The power supply is operating to spec at 12.2 V. I wired the string to the leads, taking care to match polarity. They light up like no tomorrow, almost painful to look at directly. However, my power meter indicates the supply is only drawing 35 W from the mains, so ignoring supply inefficiency for the moment, that's only about a third of the potential current and power.
After thinking about it a bit, I suspect current to the strip is limited by conductor resistance: roughly 20 ga leads to the strip, and then the thin copper traces of the flexible PCB carrying the current down the strip's length.
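Even the leads alone lose a meaningful chunk of voltage at full current. A quick sketch, assuming about 1 m of 20 AWG each way (the lead length is my guess; the resistance figure is from standard wire tables):

```python
# Voltage lost in the lead wires alone at full-spec current,
# assuming ~1 m of 20 AWG each way (length is an assumption).
AWG20_OHM_PER_M = 0.033   # standard copper wire-table value
round_trip_m = 2.0        # 1 m out + 1 m back
current = 12.0            # full-spec strip draw, amps

r_leads = AWG20_OHM_PER_M * round_trip_m
v_drop = current * r_leads
print(f"{v_drop:.2f} V lost before the strip even starts")
```

And LED strips are very sensitive to voltage: a volt or so of drop cuts the current (and brightness) noticeably, because each segment is LEDs plus a small resistor, not a constant-power load. The PCB traces then stack further drop on top of that along the strip's length.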
A second test: wire both ends of the strip to the supply (as a loop), double-check polarity, and turn it on. Radically brighter to the eye, and now consuming 86 W at the mains. These things are bright, wow. People who light them with the 1 A output of a power pack wouldn't recognize them.
Am I correct in assuming that my conductors were limiting current? And that if I provide enough copper to deliver the current, the system will approach the supply's rating (120 W)? I'm no stranger to electronics, but I'm not used to working with high-power circuits.
My installation plan is to cut the strip into three (unequal) lengths and feed current to the middle of each. I was going to use 16 ga feeders, just because that's what I have on hand, but I could change that if needed.
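On paper the center-feed plan looks good: feeding the middle halves the worst-case run along the traces compared to end-feeding the same piece, and 16 AWG feeders barely drop anything. A quick check, assuming ~2 m one-way runs and an equal third of the full-spec 12 A per strip (both are my assumptions):

```python
# Feeder sizing check for the center-feed plan. Run length and
# per-strip current split are assumptions, not measurements.
AWG16_OHM_PER_M = 0.0132   # standard copper wire-table value
run_m = 2.0                # assumed one-way feeder length
strip_current = 12.0 / 3   # ~4 A per strip at full spec

feeder_r = AWG16_OHM_PER_M * run_m * 2     # out and back
feeder_drop = strip_current * feeder_r     # volts lost in the feeder
print(f"{feeder_drop:.2f} V dropped in each 16 AWG feeder")
```

A fifth of a volt or so in the feeders is negligible next to what the strip traces themselves eat, so 16 ga on hand should be plenty.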
Any thoughts? Advice? Thanks in advance.