Milliampere (mA)
Definition
The milliampere (mA) is a unit of electric current equal to one-thousandth of an ampere (A). It is formed by applying the SI prefix milli- (10⁻³) to the ampere, the SI base unit of electric current, so 1 mA = 0.001 A.
History
The term "ampere" was named after André-Marie Ampère in the early 19th century. The milliampere emerged as a practical unit in the late 19th century, becoming essential for measuring small currents in various applications, especially in electronics and telecommunications.
Uses
The milliampere is widely used in electronics to specify the current ratings of components, devices, and circuits. Common applications include power supplies, batteries, and medical equipment such as electrocardiographs (ECG machines). It is essential in both consumer electronics and industrial equipment.
Conversions
- 1 mA = 0.001 A
- 10 mA = 0.01 A
- 100 mA = 0.1 A
- 1000 mA = 1 A
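Because the conversion is a fixed factor of 1000, it is straightforward to express in code. Below is a minimal Python sketch; the helper names milliamps_to_amps and amps_to_milliamps are illustrative, not part of any standard library.

```python
def milliamps_to_amps(milliamps: float) -> float:
    """Convert a current in milliamperes to amperes (1 mA = 0.001 A)."""
    return milliamps / 1000.0

def amps_to_milliamps(amps: float) -> float:
    """Convert a current in amperes to milliamperes (1 A = 1000 mA)."""
    return amps * 1000.0

# Example: a 250 mA load expressed in amperes, and 0.1 A in milliamperes.
print(milliamps_to_amps(250))    # 0.25
print(amps_to_milliamps(0.1))    # 100.0
```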
Fun Facts
- A common misconception is that mA and A are interchangeable; they differ by a factor of 1000, so confusing them can lead to significant errors in circuit design.
- Many household devices, such as phone chargers, specify their output in mA to indicate the maximum current they can supply.