Uncertainty in measurements made with the Chronometric Micrometer

Introduction

When measuring double stars, the common amateur observer turns to a visual and completely economical instrument: the classic and useful "Chronometric Micrometer" (see bibliography for further information). Every observer has a different reaction time when pressing the button of the chronometer and can obtain false information if he does not bear in mind the possible errors produced by the micrometer. We should therefore ask ourselves: how precise is this method? This article seeks to offer the observer the basic tools to calculate the error of the angular separation obtained with the chronometric micrometer.

While observing a physical phenomenon or an object, we use the measurement process to determine the magnitude of this phenomenon or object. For example, if we want to measure the angular separation AS of a binary star, we can use a filar micrometer to count how many times a unit of measurement fits in the distance between the two stars; in this case the unit of measurement is the arc second (").

All observational magnitudes should be expressed in the following way:

Magnitude = (measured magnitude ± uncertainty) measured unit

Measured magnitude: the numeric value obtained with the measurement instrument.

Uncertainty: commonly called the "measurement error", it expresses the symmetrical interval, centered on the measured magnitude, that is the most probable place where the true magnitude lies.

Example: AS = (3.00 ± 0.13)", also written AS = 3.00" ± 0.13", means the most probable value of AS is in the interval (3.00 − 0.13, 3.00 + 0.13)", that is (2.87, 3.13)".

Measured unit: the unit of measurement used to express the magnitude of the phenomenon obtained with the measurement instrument (°, sec, ", cm3, kg, etc.).

The resolution of a graduated scale is the smallest discernible interval on that scale, that is to say the smallest distance between two successive divisions.

In a direct reading scale instrument (continuous scale), like a ruler, a protractor, or an analog chronometer (with needles), the instrumental uncertainty or "instrumental error", denoted "ie", can be defined as:

ie = resolution / 2

Ex: if our ruler has intervals of 1 mm, its resolution is 1 mm, so the instrumental uncertainty "ie" is 0.5 mm; if our protractor has intervals of 1°, its resolution is 1°, so the instrumental uncertainty is 0.5°.

In instruments with indirect reading scales (discrete scales), such as a digital chronometer, the instrumental uncertainty should be defined as:

ie = resolution

Ex: if our chronometer shows on its screen a time up to hundredths of a second, its resolution is 0.01 sec, so the instrumental uncertainty is 0.01 sec.

But this general rule should be applied carefully, because the observer often introduces imprecision beyond the sensitivity of the apparatus.
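The two rules above can be sketched in a few lines of Python. This is a minimal illustration under the article's definitions; the function names are assumptions for the example, not part of any micrometer software:

```python
def instrumental_uncertainty(resolution, continuous=True):
    """Return the instrumental error "ie" for a scale of the given resolution.

    Continuous (direct reading) scale, e.g. a ruler:  ie = resolution / 2.
    Discrete (indirect reading) scale, e.g. a digital
    chronometer:                                      ie = resolution.
    """
    return resolution / 2 if continuous else resolution


def format_magnitude(value, uncertainty, unit):
    """Express a measurement as (measured magnitude ± uncertainty) unit."""
    return f"({value:.2f} \u00b1 {uncertainty:.2f}){unit}"


# Ruler with 1 mm divisions (continuous scale):
print(instrumental_uncertainty(1.0))                     # 0.5 (mm)
# Digital chronometer reading to 0.01 s (discrete scale):
print(instrumental_uncertainty(0.01, continuous=False))  # 0.01 (s)
# The angular separation example from the text:
print(format_magnitude(3.00, 0.13, '"'))                 # (3.00 ± 0.13)"
```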