Author: theGeeko61
A BrainTeaser

The attached GeoGebra data file contains a puzzle. Can you solve it?
This applet contains data from the U.S. Bureau of Labor Statistics: almost a century's worth (98 years, to be precise) of Consumer Price Index (CPI) figures. The spreadsheet holds the data in columns A and C: column A contains the year, and column C contains the CPI for that year. Column B contains the numbers 0 through 97, which serve as the x-coordinates corresponding to the CPI values. Thus the dataset consists of the points (B_i, C_i) for i in {0, 1, ..., 97}.

Next, we use GeoGebra to fit a polynomial to the dataset. The generated polynomial, f(x), is then used to estimate the CPI for the current year, 2011.

The puzzle is this: according to the generated polynomial, the expected data point for 2011 is (98, 245.48). However, when we ask GeoGebra to calculate f(98), it returns a different value. If we instead type in the polynomial's formula explicitly and evaluate that, we get the correct answer. Can you identify what causes this discrepancy?
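For readers who want to experiment with the setup outside GeoGebra, the fit-and-extrapolate procedure can be sketched in Python with numpy. Note the hedges: the CPI values below are fabricated placeholders, not the actual BLS series, and the polynomial degree is an assumption, since the post does not state which degree GeoGebra was asked to fit.

```python
import numpy as np

# Placeholder stand-in for column C -- the real applet uses 98 years of
# BLS CPI data. Here we fabricate a smooth increasing series with noise,
# purely to demonstrate the mechanics of the fit.
rng = np.random.default_rng(0)
x = np.arange(98)                            # column B: 0 .. 97
cpi = 10 + 2.1 * x + rng.normal(0, 1, 98)    # hypothetical CPI values

# Fit a polynomial to the (x, cpi) points. Degree 5 is an arbitrary
# choice for illustration; the post does not specify the degree used.
coeffs = np.polyfit(x, cpi, deg=5)
f = np.poly1d(coeffs)

# Extrapolate one step past the data, as the applet does for 2011.
print(f(98))
```

Extrapolating a fitted polynomial even one step beyond its data range is numerically delicate, which is part of what makes the puzzle's discrepancy between two ways of evaluating the same polynomial interesting.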