Question: A voltage, V volts, applied to a resistor of R ohms produces an electric current of I amps, where V = IR. As the current flows, the resistor heats up and its resistance falls. If 100 volts is applied to a resistor of 1000 ohms, the current is initially 0.1 amps but rises by 0.001 amps/minute. At what rate is the resistance falling if the voltage remains constant?
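
This is a related-rates problem: differentiating V = IR with respect to time while V is held constant gives 0 = R(dI/dt) + I(dR/dt), so dR/dt = -(dI/dt)·R/I. The sketch below is one way to check the arithmetic with the numbers given in the question; the variable names are my own.

```python
# Related-rates check for V = I * R with V held constant.
# Differentiating in time: 0 = R * dI/dt + I * dR/dt,
# hence dR/dt = -(dI/dt) * R / I.

V = 100.0      # volts, held constant
R = 1000.0     # ohms, initial resistance
I = V / R      # amps, initially 0.1
dI_dt = 0.001  # amps per minute, rate of current increase

dR_dt = -dI_dt * R / I
print(dR_dt)   # -10.0, i.e. the resistance falls at 10 ohms per minute
```

So the resistance is falling at 10 ohms per minute at the initial instant; note the rate is not constant, since R and I both change as time goes on.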