Microinches to Centimeters Converter (µin to cm)

Conversion Table

Microinches (µin) to centimeters (cm)
100000 µin 0.254 cm
200000 µin 0.508 cm
300000 µin 0.762 cm
400000 µin 1.016 cm
500000 µin 1.27 cm
600000 µin 1.524 cm
700000 µin 1.778 cm
800000 µin 2.032 cm
900000 µin 2.286 cm
1000000 µin 2.54 cm
1100000 µin 2.794 cm
1200000 µin 3.048 cm
1300000 µin 3.302 cm
1400000 µin 3.556 cm
1500000 µin 3.81 cm
1600000 µin 4.064 cm
1700000 µin 4.318 cm
1800000 µin 4.572 cm
1900000 µin 4.826 cm
2000000 µin 5.08 cm

How to convert

1 microinch (µin) = 0.00000254 centimeter (cm). The microinch (µin) is a unit of length used in the US customary (Standard) system. The centimeter (cm) is a unit of length used in the metric system.
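The relation above can be expressed as a pair of helper functions. A minimal Python sketch (function names are illustrative, not part of any library):

```python
# 1 in = 2.54 cm exactly, so 1 µin = 2.54e-6 cm.
CM_PER_MICROINCH = 2.54e-6

def microinches_to_centimeters(uin: float) -> float:
    """Convert a length in microinches to centimeters."""
    return uin * CM_PER_MICROINCH

def centimeters_to_microinches(cm: float) -> float:
    """Convert a length in centimeters to microinches."""
    return cm / CM_PER_MICROINCH

# 1,000,000 µin is exactly one inch, i.e. 2.54 cm.
print(microinches_to_centimeters(1_000_000))
```

Because the inch is defined as exactly 2.54 cm, a single constant covers both directions of the conversion.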

Microinch: A Unit of Length

The microinch is a unit of length that is equal to one millionth of an inch (0.000001 inch). It is a non-SI unit of measurement that is mainly used in engineering and manufacturing fields. The symbol for microinch is µin or µ". The microinch is also a derived unit in the British imperial and US customary systems of measurement.

The microinch is most commonly used when expressing small distances or dimensions, such as the surface roughness or flatness of materials and parts. The microinch is also used for measuring the wavelength of light and other electromagnetic waves.

In this article, we will explore the definition, history, usage and conversion of the microinch as a unit of length.

Definition of the Microinch

The microinch is a unit of length that is equal to one millionth of an inch (0.000001 inch). Since the inch is defined as exactly 25.4 millimeters (0.0254 meter), one microinch equals exactly 25.4 nanometers, or 2.54 × 10^-8 meters.

The definition of the microinch has changed over time, as different standards and methods of measurement were developed by various countries and organizations. The current definition, based on the meter, was fixed by the international yard and pound agreement of 1959.

History of the Microinch

The origin of the microinch as a unit of length can be traced back to the early 20th century, when it was introduced by the American Society for Testing and Materials (ASTM) as a standard for measuring surface roughness. Surface roughness is a measure of how smooth or irregular a surface is, which affects its friction, wear and corrosion properties.

The microinch was adopted by other countries and industries that followed the American system of measurement, such as Canada and Japan. It was also incorporated into the ANSI/ASME B46.1 standard for surface texture in 1985.

The microinch was also used by some optical scientists and engineers to measure the wavelength of light and other electromagnetic waves. For example, the visible spectrum of light ranges from about 16 to 28 microinches (roughly 400 to 700 nanometers).

Usage of the Microinch

The microinch is a unit of length that is used for measuring small distances or dimensions, such as the surface roughness or flatness of materials and parts. For example:

  • Measuring the smoothness or roughness of metal surfaces, such as steel, aluminum and copper.
  • Measuring the flatness or curvature of glass surfaces, such as lenses, mirrors and windows.
  • Measuring the thickness or diameter of thin films, coatings and wires.
  • Measuring the accuracy or tolerance of machined parts, such as gears, bearings and shafts.

The microinch is commonly used in engineering and manufacturing fields, especially in precision machining, metrology and quality control. Some examples are:

  • Measuring the surface finish of machined parts, such as turned, milled and ground surfaces.
  • Measuring the surface profile of textured surfaces, such as honed, lapped and polished surfaces.
  • Measuring the surface geometry of complex surfaces, such as grooved, fluted and serrated surfaces.
  • Measuring the surface defects of defective surfaces, such as scratches, pits and dents.

The microinch is also used for measuring the wavelength of light and other electromagnetic waves. For example:

  • Measuring the color or frequency of visible light, infrared light and ultraviolet light.
  • Measuring the polarization or phase of coherent light, such as laser light and holographic light.
  • Measuring the interference or diffraction patterns of interfering light, such as interferometric light and diffractive light.

Example Conversions of Microinch to Other Units

The microinch can be converted to other units of length by using different factors and formulas. Here are some examples of conversion for different types of units:

  • To convert a microinch to inches, divide by 1,000,000:

1 µin / 1,000,000 = 0.000001 in

  • To convert a microinch to feet, divide by 12,000,000:

1 µin / 12,000,000 = 8.333 × 10^-8 ft

  • To convert a microinch to yards, divide by 36,000,000:

1 µin / 36,000,000 = 2.778 × 10^-8 yd

  • To convert a microinch to meters, multiply by 2.54 × 10^-8:

1 µin x 2.54 × 10^-8 = 2.54 × 10^-8 m

  • To convert a microinch to kilometers, multiply by 2.54 × 10^-11:

1 µin x 2.54 × 10^-11 = 2.54 × 10^-11 km

  • To convert a microinch to nanometers, multiply by 25.4:

1 µin x 25.4 = 25.4 nm

  • To convert an inch to microinches, multiply by 1,000,000:

1 in x 1,000,000 = 1,000,000 µin

  • To convert a foot to microinches, multiply by 12,000,000:

1 ft x 12,000,000 = 12,000,000 µin

  • To convert a yard to microinches, multiply by 36,000,000:

1 yd x 36,000,000 = 36,000,000 µin

  • To convert a meter to microinches, divide by 2.54 × 10^-8:

1 m / 2.54 × 10^-8 = 39,370,078.74 µin

  • To convert a kilometer to microinches, divide by 2.54 × 10^-11:

1 km / 2.54 × 10^-11 = 39,370,078,740.16 µin

  • To convert a nanometer to microinches, divide by 25.4:

1 nm / 25.4 = 0.03937 µin
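The factors above can be collected into a small lookup table so that one function handles every target unit. A minimal Python sketch (the dictionary and function names are illustrative):

```python
# Length of one microinch expressed in each target unit.
# All factors follow from 1 µin = 1e-6 in and 1 in = 2.54 cm exactly.
MICROINCH_IN = {
    "in": 1e-6,            # 1 µin = 0.000001 in
    "ft": 1 / 12_000_000,  # ~8.333e-8 ft
    "yd": 1 / 36_000_000,  # ~2.778e-8 yd
    "m":  2.54e-8,
    "km": 2.54e-11,        # 1 km = 1000 m
    "nm": 25.4,
}

def convert_microinches(value: float, unit: str) -> float:
    """Convert a length given in microinches to the requested unit."""
    return value * MICROINCH_IN[unit]
```

Going the other way (e.g. meters to microinches) is just division by the same factor, which avoids keeping two sets of constants in sync.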

Centimeter: A Unit of Length Used in the Metric System

The centimeter (cm) is a unit of length in the metric system, the most widely used system of measurement in the world. It is equal to one hundredth of a meter, the SI base unit of length. The symbol for centimeter is cm, and the prefix centi- comes from the Latin centum, meaning hundred.

The centimeter is used for measuring small distances and dimensions, such as the width of a fingernail or the diameter of a coin. It also appears in derived area and volume units, such as the square centimeter (cm²) of a sheet of paper or the cubic centimeter (cm³) of a water bottle.

In this article, we will explore the definition, history, usage and conversion of the centimeter as a unit of length.

Definition of the Centimeter

The centimeter is a unit of length that is equal to one hundredth of a meter. It is defined as 1/100 meters. The meter is defined as the length of the path travelled by light in vacuum during a time interval of 1/299792458 seconds.

The definition of the centimeter has not changed since its introduction by the French Academy of Sciences in 1795, as part of the decimal metric system that was adopted after the French Revolution. However, the definition of the meter has changed several times, as different standards and methods of measurement were developed by various countries and organizations. The current definition of the meter, based on the speed of light, was adopted by the General Conference on Weights and Measures (CGPM) in 1983.

History of the Centimeter

The origin of the centimeter as a unit of length can be traced back to 1795, when the French Academy of Sciences proposed a new system of measurement that was based on decimal fractions and natural constants. The system was called the metric system, and it was intended to replace the old and diverse systems of measurement that were used in France and other countries at that time. The metric system was designed to be simple, universal and rational.

The base unit of length in the metric system was the meter, which was defined as one ten-millionth of the distance from the equator to the North Pole along a meridian through Paris. The meter was divided into ten decimeters, each decimeter into ten centimeters, and each centimeter into ten millimeters. The prefixes deci, centi and milli indicated that they were one tenth, one hundredth and one thousandth of a meter respectively.

The metric system was officially adopted by France in 1799, and gradually spread to other countries over the next century. In 1875, an international treaty called the Metre Convention was signed by 17 countries to establish a common standard for measuring length and mass. The treaty also established an international organization called the International Bureau of Weights and Measures (BIPM) to maintain and improve the metric system.

In 1889, a new standard for the meter was created by using a platinum-iridium bar that was kept at BIPM. This bar was called the International Prototype Metre, and it was divided into ten equal parts to make standard centimeters. The bar was also compared with other national standards to ensure accuracy and consistency.

In 1960, the General Conference on Weights and Measures (CGPM) adopted a new system of measurement called the International System of Units (SI), based on base units that could be derived from physical constants. The meter was redefined as 1,650,763.73 wavelengths of light emitted by a krypton-86 atom in a vacuum. The centimeter remained in use as a decimal submultiple of the meter, although SI practice came to favor the meter and millimeter in many scientific and technical fields.

In 1983, another CGPM conference redefined the meter again as the length of the path travelled by light in vacuum during a time interval of 1/299792458 seconds. This definition was based on the speed of light, which is a universal constant that can be measured with high precision. The centimeter also changed accordingly to reflect this new definition.

Usage of the Centimeter

The centimeter is a unit of length that is used for measuring small distances and dimensions, such as the width of a fingernail or the diameter of a coin. The centimeter is also used for measuring areas and volumes, such as the area of a sheet of paper or the volume of a water bottle.

The centimeter is widely used in everyday life, especially in countries that follow the metric system. Some examples are:

  • Measuring clothing sizes and body measurements.
  • Measuring furniture dimensions and room sizes.
  • Measuring paper sizes and formats.
  • Measuring screen sizes and resolutions.
  • Measuring rainfall amounts and snow depths.
  • Measuring map scales and distances.

The centimeter is also used in some scientific and technical fields, such as:

  • Measuring wavelengths and frequencies of electromagnetic radiation.
  • Measuring lengths and diameters of microscopic objects.
  • Measuring thicknesses and cross-sections of materials.
  • Measuring focal lengths and apertures of lenses.
  • Measuring physiological pressures, such as central venous pressure, in centimeters of water (cmH2O).

How to Convert Centimeters

The centimeter can be converted to other units of length by using conversion factors or formulas. Here are some examples of how to convert centimeters to other units of length in the SI system, the US customary system and other systems:

  • To convert centimeters to millimeters, multiply by 10. For example, 10 cm = 10 × 10 = 100 mm.
  • To convert centimeters to meters, divide by 100. For example, 10 cm = 10 / 100 = 0.1 m.
  • To convert centimeters to kilometers, divide by 100000. For example, 10 cm = 10 / 100000 = 0.0001 km.
  • To convert centimeters to inches, multiply by 0.3937. For example, 10 cm = 10 × 0.3937 = 3.937 in.
  • To convert centimeters to feet, multiply by 0.0328. For example, 10 cm = 10 × 0.0328 = 0.328 ft.
  • To convert centimeters to yards, multiply by 0.0109. For example, 10 cm = 10 × 0.0109 = 0.109 yd.
  • To convert centimeters to miles, multiply by 0.0000062137. For example, 10 cm = 10 × 0.0000062137 = 0.000062137 mi.
  • To convert centimeters to nanometers, multiply by 10000000. For example, 1 cm = 1 × 10000000 = 10000000 nm.
  • To convert centimeters to micrometers, multiply by 10000. For example, 1 cm = 1 × 10000 = 10000 µm.
Centimeter may also be spelled centimetre (the British spelling).
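The centimeter conversions listed above fit the same lookup-table pattern. A minimal Python sketch (names are illustrative; imperial factors use the exact relation 1 in = 2.54 cm):

```python
# Multiply a value in centimeters by these factors to get each target unit.
CM_TO = {
    "mm": 10.0,
    "m":  0.01,
    "km": 1e-5,
    "in": 1 / 2.54,        # ~0.3937
    "ft": 1 / 30.48,       # ~0.0328
    "yd": 1 / 91.44,       # ~0.0109
    "mi": 1 / 160_934.4,   # ~0.0000062137
    "nm": 1e7,
    "um": 1e4,             # micrometers
}

def convert_centimeters(value: float, unit: str) -> float:
    """Convert a length given in centimeters to the requested unit."""
    return value * CM_TO[unit]
```

Storing exact ratios such as 1 / 2.54 rather than rounded decimals like 0.3937 keeps the results as accurate as floating point allows.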



Related converters:

Centimeters to Decimeters
Centimeters to Feet
Centimeters to Inches
Centimeters to Meters
Centimeters to Millimeters
Centimeters to Yards
Feet to Inches
Feet to Kilometers
Feet to Meters
Feet to Yards
Inches to Centimeters
Inches to Feet
Inches to Meters
Inches to Millimeters
Kilometers to Miles
Meters to Feet
Meters to Inches
Meters to Yards
Miles to Kilometers
Millimeters to Inches
Yards to Feet
Yards to Inches
Yards to Meters



Copyright © 2013-2023 Metric-Calculator.com