CAVEAT: Electronics and electrical theory are not my strong suit, and this was written mostly for my own benefit, so if I am wrong please let me know.
ALSO: All the information for this post was taken from the Adafruit post here:
https://learn.adafruit.com/all-about-leds?view=all
If you have the time, I recommend you just read that.
LEDs can be powered from a power source, assuming you supply the correct voltage. I have in mind a project that would run off USB (5V), and most LEDs want something less than that. So we need to add a resistor to drop the 5V down to something the LED can handle without burning out. This requires some light (ha!) math that is better explained in the Adafruit post.
LEDs have a short leg and a long leg. The SHORT one is NEGATIVE (GROUND/BLACK). The LONG one is POSITIVE (THIS IS THE RED WIRE).
The resistor goes in between the SHORT LEG and GROUND. (It's a simple series circuit, so the resistor actually works on either side of the LED; between the short leg and ground is just the convention I'm using here.)
Your LED also has something called a FORWARD VOLTAGE, which for mine is listed on the supplied datasheet as “2.8~3.2V”, so we’ll call it 3V.
Here is some MATH. In a series circuit, the voltage drops across each component add up to the supplied voltage, so:
SuppliedVoltage = ForwardVoltage + ResistorVoltage
Rewritten as…
ResistorVoltage = SuppliedVoltage – ForwardVoltage
Therefore…
ResistorVoltage = 5V (supplied via USB) – 3V
So the “ResistorVoltage” is 2V.
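If you like sanity-checking arithmetic with code, here's a tiny Python sketch of that step (the variable names are mine, not anything standard, and the values are just the ones from this post):

```python
supplied_voltage = 5.0  # volts, from USB
forward_voltage = 3.0   # volts, the middle of the datasheet's 2.8~3.2V range

# Whatever voltage the LED doesn't drop, the resistor has to.
resistor_voltage = supplied_voltage - forward_voltage
print(resistor_voltage)  # 2.0
```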
Now we use Ohm’s law:
V (volts) = I (amps) × R (ohms)
Where…
V is the “ResistorVoltage” (2V from above)
I is the current drawn by the LED, which should also be on the supplied datasheet but is probably 20 mA (0.02 amps)
R is resistance measured in ohms (“Ω”), which is what we’re trying to solve for.
Rewritten to solve for R, it looks like this:
R = V / I
Therefore:
R = 2V / 0.02A
So the required resistance is 100 Ω. Pop in a 100 Ω resistor and the LED gets its 20 mA. One caveat: 20 mA is usually the LED’s maximum rated current, so rounding up to the next standard resistor value above 100 Ω just makes the LED slightly dimmer and buys you some safety margin.
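Putting the whole calculation in one place, here's a minimal Python sketch. The function name and parameters are my own invention, and the assumed values (5V USB supply, 3V forward voltage, 20 mA) are just the ones from this post, so swap in whatever your datasheet says:

```python
def led_resistor_ohms(supplied_voltage, forward_voltage, led_current_amps):
    """Series resistance (in ohms) needed for an LED.

    The resistor must drop whatever voltage the LED doesn't,
    so by Ohm's law: R = (supplied - forward) / current.
    """
    resistor_voltage = supplied_voltage - forward_voltage
    return resistor_voltage / led_current_amps

# Values from this post: 5V USB supply, ~3V forward voltage, 20 mA.
print(led_resistor_ohms(5.0, 3.0, 0.02))  # 100.0
```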