Convert Micrometers to Centimeters (µm to cm)
How to convert
1 centimeter (cm) = 10,000 micrometers (µm). Both the centimeter and the micrometer are units of length in the metric system.
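The factor above (1 cm = 10,000 µm) is all that is needed to convert in either direction. A minimal sketch in Python; the function names are illustrative, not from any particular library:

```python
def um_to_cm(micrometers: float) -> float:
    """Convert micrometers to centimeters (1 cm = 10,000 um)."""
    return micrometers / 10_000

def cm_to_um(centimeters: float) -> float:
    """Convert centimeters to micrometers."""
    return centimeters * 10_000

print(um_to_cm(25_000))  # 25,000 um -> 2.5 cm
print(cm_to_um(1.5))     # 1.5 cm -> 15,000.0 um
```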
Centimeter: A Unit of Length Used in the Metric System
The centimeter (cm) is a unit of length in the metric system, the most widely used system of measurement in the world. It is equal to one hundredth of a meter, the SI base unit of length, and takes its name from the prefix centi-, derived from the Latin centum ("hundred"). The symbol for centimeter is cm. The centimeter is used for measuring small distances and dimensions, such as the width of a fingernail or the diameter of a coin, and it also appears in everyday measurements of area and volume, such as the area of a sheet of paper or the volume of a water bottle. In this article, we will explore the definition, history, usage and conversion of the centimeter as a unit of length.
Definition of the Centimeter
The centimeter is a unit of length equal to one hundredth (1/100) of a meter. The meter, in turn, is defined as the length of the path travelled by light in vacuum during a time interval of 1/299,792,458 of a second.
The definition of the centimeter as one hundredth of a meter has not changed since its introduction by the French Academy of Sciences in 1795, as part of the decimal metric system adopted after the French Revolution. However, the definition of the meter itself has changed several times, as different standards and methods of measurement were developed by various countries and organizations. The current definition of the meter, based on the speed of light, was agreed upon internationally in 1983.
History of the Centimeter
The origin of the centimeter as a unit of length can be traced back to 1795, when the French Academy of Sciences proposed a new system of measurement that was based on decimal fractions and natural constants. The system was called the metric system, and it was intended to replace the old and diverse systems of measurement that were used in France and other countries at that time. The metric system was designed to be simple, universal and rational.
The base unit of length in the metric system was the meter, which was defined as one ten-millionth of the distance from the equator to the North Pole along a meridian through Paris. The meter was divided into ten decimeters, each decimeter into ten centimeters, and each centimeter into ten millimeters. The prefixes deci, centi and milli indicated that they were one tenth, one hundredth and one thousandth of a meter respectively.
The metric system was officially adopted by France in 1799, and gradually spread to other countries over the next century. In 1875, an international treaty called the Metre Convention was signed by 17 countries to establish a common standard for measuring length and mass. The treaty also established an international organization called the International Bureau of Weights and Measures (BIPM) to maintain and improve the metric system.
In 1889, a new standard for the meter was created by using a platinum-iridium bar that was kept at BIPM. This bar was called the International Prototype Metre, and it was divided into ten equal parts to make standard centimeters. The bar was also compared with other national standards to ensure accuracy and consistency.
In 1960, the General Conference on Weights and Measures (CGPM) adopted a new system of measurement called the International System of Units (SI), initially built on six base units (a seventh, the mole, was added in 1971). The meter was redefined as 1,650,763.73 wavelengths of light emitted by a krypton-86 atom in a vacuum. The centimeter remained in use as a decimal submultiple of the meter, although SI practice came to favor the meter and millimeter in many scientific and technical fields.
In 1983, another CGPM conference redefined the meter once more, as the length of the path travelled by light in vacuum during a time interval of 1/299,792,458 of a second. This definition is based on the speed of light, a universal constant that can be measured with high precision. The centimeter, as one hundredth of the meter, changed accordingly.
Usage of the Centimeter
The centimeter is convenient for measuring small distances and dimensions, such as the width of a fingernail or the diameter of a coin, and for everyday areas and volumes, such as the area of a sheet of paper or the volume of a water bottle. It is widely used in daily life in countries that follow the metric system, for example for body height, clothing and paper sizes, and rainfall measurements. It also appears in some scientific and technical fields, notably in the centimeter-gram-second (CGS) system of units used in parts of physics.
How to Convert Centimeters
The centimeter can be converted to other units of length by using conversion factors or formulas. Here are some examples of how to convert centimeters to other units of length in the SI system, the US customary system and other systems:
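Such conversions can be expressed as a table of multiplication factors. A sketch in Python, using the exact definitions 1 in = 2.54 cm and 1 ft = 30.48 cm; the dict and function names are ours, not from any standard library:

```python
# Factors to multiply a centimeter value by, per target unit.
CM_TO = {
    "m":  0.01,        # 1 cm = 0.01 m
    "mm": 10.0,        # 1 cm = 10 mm
    "um": 10_000.0,    # 1 cm = 10,000 um
    "in": 1 / 2.54,    # 1 in = 2.54 cm exactly
    "ft": 1 / 30.48,   # 1 ft = 30.48 cm exactly
}

def convert_cm(value_cm: float, unit: str) -> float:
    """Convert a length in centimeters to the given unit."""
    return value_cm * CM_TO[unit]

print(convert_cm(100, "m"))               # 100 cm -> 1.0 m
print(round(convert_cm(2.54, "in"), 6))   # 2.54 cm -> 1.0 in
```

Adding a new target unit only requires adding one factor to the table, which keeps the conversion logic in a single place.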
Micrometer: A Unit of Length
Definition of the micrometer
The micrometer, also known as the micron, is a unit of length in the International System of Units (SI) that equals one millionth of a meter. Its symbol is µm.
History of the micrometer
The name micron and the symbol µ were officially accepted as a standalone designation for this unit in 1879, but were removed from official use in 1967. This became necessary because the older usage was incompatible with the adoption of the unit prefix micro-, also denoted µ, during the creation of the SI in 1960. In the SI, the systematic name micrometre became the official name of the unit, and µm became the official unit symbol.
How to convert micrometer
The micrometer can be converted to other units of length using simple multiplication or division by powers of 10. For example, one micrometer is equal to 0.001 millimeters, 0.000001 meters, or 0.000000001 kilometers in the SI system. One micrometer is also equal to about 0.000039 inches, 0.0000033 feet, or 0.00000000062 miles in the US customary system.
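The factors above can be checked by going through meters first. A small sketch, assuming the exact definitions 1 in = 0.0254 m and 1 mi = 1,609.344 m; the function name is illustrative:

```python
UM_PER_M = 1e6  # micrometers per meter

def um_to(value_um: float, unit: str) -> float:
    """Convert a length in micrometers to the given unit via meters."""
    meters = value_um / UM_PER_M
    per_meter = {
        "mm": 1e3,
        "m":  1.0,
        "km": 1e-3,
        "in": 1 / 0.0254,    # 1 in = 0.0254 m exactly
        "ft": 1 / 0.3048,    # 1 ft = 0.3048 m exactly
        "mi": 1 / 1609.344,  # 1 mi = 1,609.344 m exactly
    }
    return meters * per_meter[unit]

print(um_to(1, "mm"))  # ~0.001
print(um_to(1, "in"))  # ~3.94e-05
print(um_to(1, "mi"))  # ~6.2e-10
```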
Where micrometer is used
The micrometer is a common unit of measurement for wavelengths of infrared radiation, for the sizes of biological cells and bacteria, and for grading wool by the diameter of the fibers. The width of a single human hair ranges from approximately 20 to 200 µm. The micrometer is also used across many countries and applications, for example in microscopy, in precision machining tolerances, and in semiconductor manufacturing, where device feature sizes were long quoted in micrometers.
Example conversions of micrometer to other units
Here are some sample conversions of micrometers to other units:
1 µm = 0.0001 cm = 0.001 mm = 1,000 nm
100 µm = 0.01 cm = 0.1 mm
10,000 µm = 1 cm