When numbers of any sort are presented, whether in mathematics, science, business, government or finance, the default assumption is that the data presented are reasonably reliable to the last digit presented. Thus, if a light bulb is listed as using 3.14 watts, then its actual usage is presumably between 3.13 and 3.15 watts, and certainly not 2.8 or 4.2 watts. Or if the average interest rate paid on a set of securities is listed as 2.718 percent, then a reasonable reader presumes that the actual figure is between 2.717 and 2.719 percent.
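The convention described above can be sketched in code: given a figure as it was printed, the implied interval is the value plus or minus one unit in the last reported digit. The helper name `implied_interval` is ours, not anything from the article; this is a minimal sketch of the convention, using Python's `decimal` module so the printed precision is preserved exactly.

```python
from decimal import Decimal

def implied_interval(reported: str) -> tuple[Decimal, Decimal]:
    """Return the range implied by a figure as printed: the value
    plus or minus one unit in its last reported digit."""
    value = Decimal(reported)
    # Exponent of the last printed digit, e.g. -2 for "3.14" -> step 0.01
    step = Decimal(1).scaleb(value.as_tuple().exponent)
    return (value - step, value + step)

# A bulb listed at 3.14 watts presumably draws between 3.13 and 3.15 watts
print(implied_interval("3.14"))   # (Decimal('3.13'), Decimal('3.15'))
# An interest rate listed as 2.718 percent: between 2.717 and 2.719
print(implied_interval("2.718"))  # (Decimal('2.717'), Decimal('2.719'))
```

Note that the input is a string rather than a float: the trailing zeros and decimal places of the printed figure are exactly what carry the precision claim.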
The total number of significant digits can vary widely, depending on context. Some studies require enormous precision -- the present authors have published research studies requiring numbers to be computed to tens of thousands of digits. In other contexts, only one- or two-digit accuracy is appropriate.
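Trimming a figure to the number of significant digits a context warrants can be sketched as follows. The function name `round_sig` is our own illustration, not a standard-library routine; it uses the base-10 logarithm to locate the leading digit and then rounds relative to it.

```python
from math import floor, log10

def round_sig(x: float, sig: int) -> float:
    """Round x to the given number of significant digits."""
    if x == 0:
        return 0.0
    # Position of the leading digit: 0 for 2.718, 3 for 1234.5
    lead = floor(log10(abs(x)))
    return round(x, -lead + (sig - 1))

print(round_sig(2.71828, 2))  # 2.7
print(round_sig(1234.5, 2))   # 1200.0
```

For financial or decimal-exact work, the `decimal` module's context precision would be the more careful tool, but the arithmetic above conveys the idea.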