
· Registered
Joined
·
36 Posts
Discussion Starter · #1 ·
There are many ballast-compatible LED tubes on the market for retrofitting, and I'm wondering whether people prefer constant current or constant voltage. This is just for research; I'd appreciate everyone's opinion.
Thank you!
 

· Registered
Joined
·
71 Posts
Both have their advantages. A constant current driver prevents burnout by regulating the current, so you don't have to worry about thermal damage. Constant voltage drivers, on the other hand, simply supply a fixed voltage and leave current regulation to the load. From what I've observed, most people use constant current because it offers more flexibility.

Welcome
 

· Registered
Joined
·
4 Posts
To avoid the burnout risk, constant voltage systems (like LED strips) use resistors to limit the current. The downside is that you lose efficiency: a 12 V LED strip actually has only about 9 V worth of LEDs, and the rest of the supply voltage is just burned off as heat in the resistors.
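As a rough illustration of that efficiency hit (the 20 mA segment current, the 3.0 V per LED, and the three-LED segment are assumed numbers for a typical 12 V strip, not figures from the post):

```python
# Rough sketch of one resistor-limited segment of a 12 V LED strip (assumed values).
V_SUPPLY = 12.0       # volts, strip supply rail
V_LED = 3.0           # volts, assumed forward drop of one white LED
LEDS_PER_SEGMENT = 3  # typical 12 V strip segment: 3 LEDs + 1 resistor in series
I_SEGMENT = 0.020     # amps, assumed target segment current (20 mA)

v_leds = V_LED * LEDS_PER_SEGMENT        # ~9 V "worth" of LEDs
v_resistor = V_SUPPLY - v_leds           # ~3 V dropped across the resistor
r_ballast = v_resistor / I_SEGMENT       # resistor value that sets ~20 mA
efficiency = v_leds / V_SUPPLY           # fraction of input power reaching the LEDs

print(f"Resistor: {r_ballast:.0f} ohms")                              # -> 150 ohms
print(f"Resistor loss: {v_resistor * I_SEGMENT * 1000:.0f} mW/segment")  # -> 60 mW
print(f"Electrical efficiency: {efficiency:.0%}")                     # -> 75%
```

So with these assumed numbers, roughly a quarter of the input power never reaches the LEDs at all.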
 

· Registered
Joined
·
414 Posts
With LEDs, as the temperature goes up, the forward voltage at a given current goes down.

So if you use constant voltage drive, more current flows as the temperature goes up. The higher current increases the LED's power dissipation, which raises the temperature further and lowers the forward voltage again. This is the main mechanism behind thermal runaway.

The first-level fix is to add a current-limiting resistor, which provides negative feedback against the thermal runaway effect. Temperature goes up and the forward voltage drops as before, but the current barely changes because it is set mostly by the resistor.

With constant current drive, thermal runaway does not happen: even though a rise in temperature lowers the forward voltage, the current is what the driver regulates, so it does not increase.
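To make that feedback argument concrete, here is a small sketch comparing the three cases. The −2 mV/°C temperature coefficient, the 0.5 Ω dynamic resistance, and the supply/setpoint values are assumptions chosen purely for illustration:

```python
# Simplified LED model: a forward drop Vf(T) in series with a small dynamic
# resistance r_d.  All numbers are assumptions picked to illustrate the trend.
VF_25C = 3.0     # volts at 25 C
TEMPCO = -0.002  # volts per degree C (forward voltage falls as the junction heats)
R_DYNAMIC = 0.5  # ohms, LED dynamic resistance near the operating point

def vf(temp_c):
    """Forward voltage of the LED junction at a given temperature."""
    return VF_25C + TEMPCO * (temp_c - 25.0)

def i_constant_voltage(temp_c, v_supply=3.2):
    """Bare constant-voltage drive: all excess voltage lands on r_d."""
    return (v_supply - vf(temp_c)) / R_DYNAMIC

def i_with_resistor(temp_c, v_supply=12.0, r_series=45.0):
    """Constant voltage plus series resistor: the resistor dominates the loop."""
    return (v_supply - vf(temp_c)) / (r_series + R_DYNAMIC)

def i_constant_current(temp_c, i_set=0.200):
    """Constant-current drive: the driver holds its setpoint regardless of Vf."""
    return i_set

for t in (25, 50, 75, 100):
    print(f"{t:>3} C   CV: {i_constant_voltage(t) * 1000:4.0f} mA   "
          f"CV+R: {i_with_resistor(t) * 1000:5.1f} mA   "
          f"CC: {i_constant_current(t) * 1000:3.0f} mA")
```

With these assumed numbers the bare constant-voltage current climbs from about 400 mA to 700 mA over the temperature sweep, the resistor-limited current creeps up by only a few milliamps, and the constant-current case stays put, which is exactly the negative-feedback behaviour described above.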

Constant current drive makes sense from another angle too: light output is almost directly proportional to forward current, so it is ideal for intensity control. :thumbsup:
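For example (the 700 mA nominal current and the roughly linear flux-versus-current relationship are illustrative assumptions), a constant-current driver can dim simply by scaling its current setpoint:

```python
# Dimming with a constant-current driver: since light output is roughly
# proportional to forward current, scaling the setpoint scales the brightness.
I_NOMINAL = 0.700  # amps at 100 % brightness (assumed example value)

def current_setpoint(brightness_fraction):
    """Return the drive current for a target brightness between 0.0 and 1.0."""
    return I_NOMINAL * max(0.0, min(1.0, brightness_fraction))

for level in (1.00, 0.75, 0.50, 0.25):
    print(f"{level:.0%} light -> set the driver to {current_setpoint(level) * 1000:.0f} mA")
```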
 

· Registered
Joined
·
36 Posts
Discussion Starter · #5 ·
Yes, exactly. Constant current seems the better choice, I think, especially in countries with an unstable power supply...
 