- Why does the current in the primary coil change when I add a load to the secondary?
- In an ideal transformer, power in equals power out (U₁I₁ = U₂I₂). With no load, I₂ is zero, so I₁ is also essentially zero (just a tiny magnetizing current, neglected in the ideal model). Connecting a load lets I₂ flow, drawing power from the secondary. The secondary current's magnetic field would weaken the core flux, so the primary draws a compensating current I₁ from the source, exactly supplying the power delivered to the load. The simulator shows this direct relationship.
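This relationship can be sketched numerically. The function and parameter names below (`primary_current`, `u1`, `n1`, `n2`, `r_load`) are illustrative, not taken from the simulator; it assumes a purely resistive load on an ideal transformer.

```python
# Ideal transformer: primary current drawn once a resistive load is connected.
# All names are illustrative; this is a sketch of the ideal-model equations,
# not the simulator's actual code.

def primary_current(u1, n1, n2, r_load=None):
    """Return (i1, i2) for an ideal transformer.

    u1: primary RMS voltage; n1, n2: primary/secondary turns;
    r_load: secondary load resistance in ohms (None = open circuit).
    """
    u2 = u1 * n2 / n1          # voltage ratio follows the turns ratio
    if r_load is None:
        return 0.0, 0.0        # no load: I2 = 0, so ideal I1 = 0 as well
    i2 = u2 / r_load           # Ohm's law on the secondary
    i1 = i2 * n2 / n1          # power balance: U1*I1 = U2*I2
    return i1, i2

# 10:1 step-down, 230 V primary, 23-ohm load:
print(primary_current(230, 100, 10, r_load=23))   # (0.1, 1.0)
print(primary_current(230, 100, 10))              # open circuit: (0.0, 0.0)
```

With the load connected, the output power is U₂I₂ = 23 V × 1 A = 23 W, and the primary indeed draws U₁I₁ = 230 V × 0.1 A = 23 W from the source.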
- Can a transformer work with direct current (DC)?
- No. A transformer relies on a changing magnetic flux to induce a voltage in the secondary coil, as described by Faraday's law. A constant DC voltage creates a steady magnetic field, not a changing one, so no voltage is induced in the secondary after the initial switch-on transient. This simulator uses AC for this reason.
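Faraday's law (e = −N·dΦ/dt) can be illustrated with a finite-difference sketch. The flux amplitude, frequency, and turn count below are invented round numbers for illustration only.

```python
import math

# Faraday's law, e = -N * dPhi/dt, approximated by a finite difference.
# With sinusoidal (AC) flux the induced EMF is nonzero; with constant (DC)
# flux the derivative, and hence the EMF, is zero. Numbers are illustrative.

N = 100          # secondary turns (assumed)
dt = 1e-6        # small time step for the numerical derivative

def emf(phi, t):
    """Approximate induced EMF -N * dPhi/dt at time t."""
    return -N * (phi(t + dt) - phi(t)) / dt

ac_flux = lambda t: 0.01 * math.sin(2 * math.pi * 50 * t)  # 50 Hz AC flux
dc_flux = lambda t: 0.01                                   # steady DC flux

print(round(emf(ac_flux, 0), 1))    # ≈ -314.2 V: flux is changing fastest here
print(emf(dc_flux, 0) == 0)         # True: constant flux induces nothing
```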
- What does 'ideal' mean in this context, and how do real transformers differ?
- 'Ideal' means the model assumes perfect efficiency with no energy losses. Real transformers have losses from wire resistance (copper losses), magnetic hysteresis, and eddy currents in the core (iron losses). They also have leakage flux (not all flux links both coils) and require a magnetizing current to establish the core's magnetic field. This simulator ignores these effects to focus on the fundamental voltage, current, and power relationships.
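The effect of these losses on efficiency can be estimated with a simple power balance. The loss figures below are invented round numbers, not measurements of any particular transformer.

```python
# Rough efficiency estimate for a real (non-ideal) transformer.
# Loss values are illustrative placeholders.

def efficiency(p_out, p_copper, p_iron):
    """Efficiency = useful output power / total input power."""
    p_in = p_out + p_copper + p_iron   # the input must also supply the losses
    return p_out / p_in

# e.g. 1000 W delivered, 30 W copper (I^2*R) loss, 20 W iron (core) loss:
print(round(efficiency(1000, 30, 20), 3))   # 0.952 -> about 95 % efficient
```

In the ideal model both loss terms are zero and the efficiency is exactly 1, which is why the simulator can use U₁I₁ = U₂I₂ directly.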
- If I step up the voltage, why does the current step down?
- This is a direct consequence of energy conservation. For a given power output (P = U₂I₂), if the voltage U₂ is increased, the current I₂ must decrease proportionally to keep the product constant. Since the input power must equal the output power (U₁I₁ = U₂I₂), the primary current I₁ adjusts accordingly. High-voltage transmission lines use this principle to reduce current, minimizing resistive power losses over long distances.
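The transmission-line benefit follows from P_loss = I²R: for a fixed delivered power, raising the voltage tenfold cuts the current tenfold and the resistive loss a hundredfold. The power, voltage, and resistance values below are illustrative round numbers.

```python
# Why grids transmit at high voltage: line loss is P_loss = I^2 * R, and for
# a fixed delivered power P, the current I = P / U falls as U rises.
# All numbers are illustrative.

def line_loss(p_delivered, u_line, r_line):
    """Ohmic loss in a line of resistance r_line delivering p_delivered at u_line."""
    i = p_delivered / u_line        # current needed at this voltage
    return i ** 2 * r_line          # resistive (I^2 * R) loss in the line

P = 1_000_000    # 1 MW to deliver
R = 5            # line resistance in ohms

print(line_loss(P, 10_000, R))     # at 10 kV:  I = 100 A -> 50000.0 W lost
print(line_loss(P, 100_000, R))    # at 100 kV: I = 10 A  ->   500.0 W lost
```

A 10× voltage increase yields a 100× reduction in line loss, which is exactly why step-up transformers sit at the generating end of long transmission lines.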