**A little theory**

Here we have a simple LED circuit using a battery and a resistor. This is a very common way of connecting (or biasing) an LED.

LEDs are diodes, and they are driven by current rather than voltage. Simply put, when an LED is “fed” with current in the forward direction (plus to minus, or anode to cathode, or top to bottom in this example), it starts to emit light above some minimum current. A typical red LED requires about 10mA to 20mA of current for decent brightness. Any more may not help much: an LED is stressed when pushed beyond its limits and may be destroyed.

Since LEDs are current-driven devices, a voltage should not be applied directly across one unless you know the voltage it needs to produce the desired current, i.e. its forward (biased) voltage at, say, 20mA. A typical forward voltage at 20mA is about 1.5V. The 5V supply here, being much greater than the LED’s 1.5V, would otherwise force so much current through the LED that it would be damaged. Therefore, we place a resistor into the circuit in series with the LED. The resistor limits the current and drops the excess voltage to a manageable level.

**Custom Design**

So, how do we figure out what value resistor to use? We shall use Ohm’s law for this. Ohm’s law states that voltage is the product of the current and the resistance, or **V = IR**, where “I” is the current.

To analyze the circuit, we assume that it is operating in steady state, with the LED emitting light at the specified operating current. In that case, the voltage across the resistor and the LED together is the 5V from the battery (V1). We also said that a typical LED drops about 1.5V when biased (operating). That means the voltage across the resistor is 5V – 1.5V = 3.5V.

**Calculations**

Well, we’re trying to find out what resistor value to use, so going back to Ohm’s law, V = IR, a little algebra gives **R = V/I**. That is, resistance equals voltage divided by current. We know the voltage across the resistor to be 3.5V, and we want the current at the spec of 20mA (which is what we assumed for steady state). 20mA is 0.02A.

If you do R = V / I, which is R = 3.5V / 0.02A, you get 175 ohms. The nearest standard value resistor you can find (e.g. 180 ohms) will work. Now you can calculate the resistor value for any source voltage (to a point). One thing to remember is that the higher the source (battery) voltage, the more heat the resistor will dissipate. Try to stay below 12V; otherwise, a typical 1/4W resistor will get quite warm.
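The calculation above is easy to wrap in a small helper. This is a minimal sketch: `led_resistor` is a hypothetical function name, and the values plugged in are the ones assumed in this article (5V battery, ~1.5V red LED, 20mA target current).

```python
def led_resistor(v_supply, v_led, i_led):
    """Series resistance (ohms) from Ohm's law: R = (Vsupply - Vled) / I."""
    return (v_supply - v_led) / i_led

# Values from the article: 5 V battery, ~1.5 V red LED, 20 mA target.
r = led_resistor(5.0, 1.5, 0.020)
print(r)  # ~175 ohms; round to the nearest standard value you have on hand
```

Nothing here is specific to red LEDs: swap in a different forward voltage and the same formula applies.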

**Powering multiple LEDs**

In order to power more than one LED, the setup we’re using here will require a higher source voltage. It is more efficient to keep the difference between the source voltage and the sum of the LED voltages small; again, try to keep that difference below 12V. We said that LEDs drop about 1.5V when biased, so four of them in series means 6V across the chain. Since the source is 12V, that leaves 12V – 6V = 6V across the resistor (R2). Going back to Ohm’s law again:

R = V / I = 6V / 0.02A = 300 ohms.
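The same arithmetic extends to any chain of LEDs in series. A minimal sketch, assuming the article’s values (12V source, four 1.5V LEDs, 20mA); `chain_resistor` is a hypothetical helper:

```python
def chain_resistor(v_supply, v_led, n_leds, i_led):
    """Sum the LED drops, then apply Ohm's law to the remaining voltage."""
    return (v_supply - n_leds * v_led) / i_led

# Article's example: 12 V source, four ~1.5 V LEDs in series, 20 mA.
r2 = chain_resistor(12.0, 1.5, 4, 0.020)
print(r2)  # ~300 ohms
```

Note that if the summed LED drops exceed the supply voltage, the result goes negative: that simply means the supply can’t drive that many LEDs in series.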

To calculate power dissipated in the resistor (turned into heat):

P = I^2 * R (that’s current squared multiplied by R)

**P = (0.02A)^2 * 300 ohms = 0.12W = 120mW**

That’s actually quite a bit of power lost as heat in the resistor. It’s still under the 1/4W rating of a typical leaded resistor, though.
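The power check can be sketched the same way; `resistor_power` is a hypothetical helper applying the P = I^2 * R formula above:

```python
def resistor_power(i, r):
    """Power dissipated in a resistor (watts): P = I^2 * R."""
    return i * i * r

p = resistor_power(0.020, 300.0)
print(p)         # ~0.12 W (120 mW)
print(p < 0.25)  # True: still within a typical 1/4 W resistor's rating
```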

**Afterthoughts**

You can experiment with different colors of LEDs; they have different forward (biased) voltages. White LEDs drop about 3V when biased. You can also buy blinking LEDs for some neat effects. You don’t need extra circuitry, since it’s built into the LED. There are even “rainbow color” LEDs where red/blue/green colors alternate for some interesting effects. There are also high-powered LEDs out now which are rated in watts. They’re very bright and require a lot more current; some may need heat sinks.