What Is Ohm's Law?
Ohm's Law, published by German physicist Georg Simon Ohm in 1827, is one of the most fundamental relationships in electrical engineering. It states that the current through a conductor between two points is directly proportional to the voltage across the two points, expressed as V = IR. Here, V is voltage measured in volts (V), I is current in amperes (A), and R is resistance in ohms (Ω). This simple equation forms the basis of most circuit analysis and design.
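The relationship can be rearranged to solve for any of the three quantities. A minimal sketch in Python (the function names and example values are illustrative, not from any particular library):

```python
def voltage(current_a, resistance_ohm):
    """Ohm's Law: V = I * R."""
    return current_a * resistance_ohm

def current(voltage_v, resistance_ohm):
    """Rearranged: I = V / R."""
    return voltage_v / resistance_ohm

def resistance(voltage_v, current_a):
    """Rearranged: R = V / I."""
    return voltage_v / current_a

# A 2 A current through a 5 Ω resistor produces a 10 V drop:
print(voltage(2.0, 5.0))  # 10.0
```

Any two known quantities determine the third, which is why the law is so useful in practice.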
The Power Relationship
Electric power (P), measured in watts (W), describes the rate of energy transfer in a circuit. Three equivalent formulas relate power to voltage, current, and resistance: P = V × I gives power from voltage and current; P = I² × R gives power from current and resistance; P = V² / R gives power from voltage and resistance. These formulas are essential for determining component ratings, heat dissipation, and energy consumption in any electrical system.
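Because the three power formulas are all derived from P = V × I by substituting Ohm's Law, they must agree for any ohmic circuit. A quick check with assumed example values (12 V across 6 Ω):

```python
def power_vi(v, i):
    """P = V * I (watts from voltage and current)."""
    return v * i

def power_ir(i, r):
    """P = I^2 * R (watts from current and resistance)."""
    return i ** 2 * r

def power_vr(v, r):
    """P = V^2 / R (watts from voltage and resistance)."""
    return v ** 2 / r

v, r = 12.0, 6.0
i = v / r  # Ohm's Law gives I = 2 A
# All three formulas yield the same 24 W:
print(power_vi(v, i), power_ir(i, r), power_vr(v, r))
```

Substituting I = V/R into P = V × I directly yields P = V²/R, and substituting V = IR yields P = I²R, so the agreement is no coincidence.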
Practical Applications
Ohm's Law is used every day in circuit design, from choosing resistor values for LEDs to sizing wire gauges for building wiring. Electronics engineers use it to calculate current draw, voltage drops across components, and power dissipation. Electricians rely on it for load calculations and safety compliance. Even hobbyists building Arduino or Raspberry Pi projects use Ohm's Law to protect components from overcurrent damage.
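The LED case is a classic worked example. Assuming a 5 V supply, a 2.0 V LED forward voltage, and a 20 mA target current (typical hobbyist values, not from the original text), the current-limiting resistor drops the remaining voltage:

```python
# Sizing a current-limiting resistor for an LED.
# Assumed values: 5 V supply, 2.0 V forward drop, 20 mA target current.
supply_v = 5.0
led_forward_v = 2.0
target_current_a = 0.020

# The resistor must drop the voltage the LED does not:
resistor_ohm = (supply_v - led_forward_v) / target_current_a
print(resistor_ohm)  # 150.0 Ω

# Power dissipated in the resistor, for choosing its wattage rating:
resistor_power_w = (supply_v - led_forward_v) * target_current_a
print(resistor_power_w)  # 0.06 W, so a standard 1/4 W part is ample
```

In practice one rounds up to the nearest standard resistor value (here 150 Ω is itself a standard E12 value).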
Limitations and Real-World Considerations
Ohm's Law assumes a linear (ohmic) relationship between voltage and current, which holds for most metallic conductors at constant temperature. Non-ohmic devices like diodes, transistors, and thermistors do not follow this simple relationship. Temperature changes affect resistance in most materials. In AC circuits, impedance replaces resistance and includes the effects of capacitance and inductance. Despite these limitations, Ohm's Law remains the starting point for virtually all electrical analysis.
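The AC generalization, I = V / Z, can be sketched with Python's built-in complex numbers. The circuit below is an assumed example (a series RL load on a 120 V, 60 Hz source), not a case from the text:

```python
import math

# Series RL circuit: impedance Z = R + jωL
f_hz = 60.0            # source frequency (assumed)
r_ohm = 100.0          # resistance (assumed)
l_henry = 0.25         # inductance (assumed)
v_rms = 120.0          # source voltage (assumed)

omega = 2 * math.pi * f_hz
z = complex(r_ohm, omega * l_henry)   # Z = R + jωL

# Ohm's Law generalized to phasors: I = V / Z
i = v_rms / z
print(abs(i))                          # current magnitude in amperes
print(math.degrees(math.atan2(-i.imag if False else i.imag, i.real)))  # phase in degrees
```

The magnitude of Z grows with frequency for an inductive load, so the same formula that gives a fixed current in DC predicts a frequency-dependent current in AC.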





