Jan 19 2004, 12:54 AM
OK, I'm doing a simple HD light mod. I have a 5V source going to an LED (I'm not sure of its voltage rating; it's a diagnostic light for USB ports, the kind that comes with MSI boards) through a 220 ohm resistor. I tested the resistor and it reads 220 ohms. If I have 5V going into the resistor, shouldn't the other side read something like 3V? I'm not sure how you figure that out, but when I get out the good ol' multimeter, it shows 5V on both sides of the resistor. Is that right?
That's my only question: should there be 5V coming out of a 220 ohm resistor if I have a 5V source going into it?
Jan 19 2004, 01:03 AM
Umm, I'm not completely sure, but: [5V] ---> [100 ohm resistor] ---> [3.3V], and I think 3.3V is what most LEDs are designed to run at. I'm not completely sure, but this is what lsdiodes said, I think.
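For what it's worth, here's a minimal sketch of the usual series-resistor calculation. The 3.3V forward drop and 10mA target current are assumptions, not values from this thread; check the LED's datasheet for the real numbers:

```python
# Minimal sketch of the standard series-resistor calculation for an LED.
# The forward voltage (3.3 V) and target current (10 mA) are assumptions;
# check your LED's datasheet for the actual values.

v_source = 5.0      # supply voltage (V)
v_forward = 3.3     # assumed LED forward voltage (V)
i_target = 0.010    # assumed target LED current (A)

# The resistor has to drop whatever the LED doesn't: R = (Vs - Vf) / I
r = (v_source - v_forward) / i_target
print(f"Series resistor: {r:.0f} ohms")  # -> 170 ohms
```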
Jan 19 2004, 01:47 AM
The reason you're probably seeing the same voltage on both sides depends on whether you wired a series or a parallel circuit: in a parallel circuit the voltage across each branch is the same, while in a series circuit the current is the same everywhere and the voltage gets divided up by the resistors.
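As a quick illustration of that series behaviour, here's a sketch of two resistors dividing a 5V supply (the 220 and 330 ohm values are just made-up numbers for the example):

```python
# Two resistors in series share the supply voltage in proportion to
# their resistance, because the same current flows through both.
v_source = 5.0
r1, r2 = 220.0, 330.0            # example values, not from this thread

i = v_source / (r1 + r2)         # series current (Ohm's law)
v_r1 = i * r1                    # drop across R1
v_r2 = i * r2                    # drop across R2
print(f"I = {i*1000:.1f} mA, V_R1 = {v_r1:.2f} V, V_R2 = {v_r2:.2f} V")
# -> I = 9.1 mA, V_R1 = 2.00 V, V_R2 = 3.00 V
```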
Jan 22 2004, 12:48 PM
Series or parallel... umm... I just have one LED. How can it be in series or in parallel?
Jan 22 2004, 03:30 PM
If you're doing what's known as 'open circuit' testing, i.e. putting a multimeter where the LED will be, then you'll see 5V. A multimeter draws bugger all current, so there is no voltage drop across the resistor.
If you wire it up and the LED is drawing current, then putting the multimeter across the LED (i.e. one lead on -, one on +) will show you your lower voltage.
A resistor drops voltage according to how much current goes through it:
V = I x R
For simplicity, a multimeter can be considered to have zero current draw, i.e. effectively infinite resistance.
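To put numbers on that for the original poster's circuit, here's a small sketch. The 10mA LED current is an assumed ballpark, not a measured value; the real figure depends on the LED:

```python
# Why both sides of the 220-ohm resistor read 5 V with only a multimeter
# attached, but not once the LED actually conducts. The 10 mA LED current
# is an assumed ballpark, not a measurement from this thread.
v_source = 5.0
r = 220.0

i_meter = 0.0      # multimeter alone: effectively zero current draw
i_led = 0.010      # assumed current once the LED conducts (A)

for label, i in [("open circuit (meter only)", i_meter),
                 ("LED conducting", i_led)]:
    v_drop = i * r                # V = I x R
    v_out = v_source - v_drop     # voltage left on the far side
    print(f"{label}: drop = {v_drop:.2f} V, other side reads {v_out:.2f} V")
# -> open circuit (meter only): drop = 0.00 V, other side reads 5.00 V
# -> LED conducting: drop = 2.20 V, other side reads 2.80 V
```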
This is a "lo-fi" version of our main content. To view the full version with more information, formatting and images, please click here