Auto Parts is a large manufacturer of spare parts for automobiles. The director of marketing research needs to determine which forecasting method most accurately predicts sales for 2008, based on quarterly sales data collected for the previous four years. Four forecasting methods were run: regression with time series, regression with economic factors, the Holt-Winters additive model, and the Holt-Winters multiplicative model. Based on the error statistics, the most appropriate forecasting method is regression with economic factors. Under this model, sales for 2008 decrease significantly, which may indicate a possible recession. It is therefore highly recommended that Auto Parts plan its available resources efficiently to prevent large losses.
Background
A forecast is “a planning tool that helps management in its attempts to cope with the uncertainty of the future, relying mainly on data from the past and present and analysis of trends” (BusinessDictionary.com, 2013).
A good forecast helps companies avoid losing large amounts of money by planning more efficiently. In the Auto Parts forecasting case study, the director of marketing research of a large manufacturer of spare parts for automobiles understands the consequences of forecasting errors and wishes to forecast sales as accurately as possible. After collecting sales data for each quarter of the past four years, he ran a number of forecasts using the time-series method. However, he is concerned that factors such as economic activity and oil prices may have a significant impact on auto parts sales. Therefore, the director of marketing research decided to use econometric variables to check whether sales are better predicted by such a model.
Problem
The large manufacturer of spare parts for automobiles must decide which forecasting method is the most accurate for forecasting sales for 2008, based on the quarterly sales data collected for the previous four years.
Analysis
The information provided for the Auto Parts case study in Excel included quarterly sales, the non-farm activity index, and oil prices for the years 2004, 2005, 2006, and 2007. Four different models were used to forecast sales for 2008: regression with time series, regression with economic factors, the Holt-Winters additive model, and the Holt-Winters multiplicative model.
Regression with time series
A time series is a sequence of observations ordered in time or space (Young, 1997).
There are two types of time-series data: continuous, such as electrocardiograms, and discrete, where observations are recorded at equally spaced intervals. The main features of a time series are trend and seasonality. Trend is a long-term movement in a time series; it describes the direction and rate of change of the series. Trends may be identified by taking averages over a period of time in seasonal data: if the averages change over time, a trend is present. For example, in economics GDP has a positive long-term trend, while resources and fixed costs have a negative long-term trend. Seasonality is the component of variation in a time series that depends on the time of year; there are four seasons (spring, summer, fall, and winter), and dummy variables are used to represent them. For the Auto Parts case study, the regression with time series method was run, where the dependent variable Y is sales and the independent variables are the trend and the seasonal dummies.
Dummy variables were used for spring, summer, fall, and winter. If a season is non-significant (P > 0.05), then it does not have an impact on sales. After running the first regression, winter (Q4) was non-significant because it had a P value greater than 0.05 and an absolute t value less than 2; therefore, winter (Q4) does not have an impact on sales. Based on the F statistic, the first regression model is good; however, one of the independent variables (Q4) was non-significant. Q4 was therefore eliminated and a second regression was run. Based on the F statistic, the second regression without Q4 is also a good model. The R-squared value measures how much of the behavior of the dependent variable is explained by the independent variables.
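As an illustration, the following is a minimal sketch of this kind of regression in Python using statsmodels; the sales numbers are placeholders rather than the case data, and Q4 (winter) is treated as the baseline season, which corresponds to dropping its dummy as described above.

    # A minimal sketch of a trend-plus-seasonal-dummies regression, assuming
    # quarterly sales for 2004-2007 are held in `sales` (placeholder numbers,
    # not the case data).
    import numpy as np
    import statsmodels.api as sm

    sales = np.array([220, 310, 280, 250, 230, 330, 300, 260,
                      240, 350, 310, 270, 255, 365, 325, 285], dtype=float)
    trend = np.arange(1, 17)                    # t = 1..16
    quarter = np.tile([1, 2, 3, 4], 4)          # quarter of each observation

    # Dummy variables for Q1-Q3; Q4 is captured by the intercept
    dummies = np.column_stack([(quarter == q).astype(float) for q in (1, 2, 3)])
    X = sm.add_constant(np.column_stack([trend, dummies]))
    model = sm.OLS(sales, X).fit()
    print(model.summary())                      # R-squared, F statistic, t and P values

    # Forecast the four quarters of 2008 (t = 17..20)
    t_new = np.arange(17, 21)
    d_new = np.column_stack([(np.array([1, 2, 3, 4]) == q).astype(float) for q in (1, 2, 3)])
    X_new = sm.add_constant(np.column_stack([t_new, d_new]), has_constant='add')
    print(model.predict(X_new))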
In this model, the R-squared value represents how much trend and seasonality explain the behavior of sales. R-squared equals 95.47%, which means the model's explanatory power is high. The historical data (from Q1 onward) were recreated with the model so that the fitted values could be compared with the original data and error statistics could be computed. The error statistics were calculated using the notes on measures of forecasting error on Blackboard. The smaller the errors, the better the model. The mean error should be close to zero (ME = 0). The MSE is the average of the squared errors, the RMSE is the square root of the MSE, the MAD is the average of the absolute errors, and the MAPE is the average percentage error. The notes on measures of forecasting error indicate that “a value of U > 1 indicates a poor forecasting model relative to a naïve forecast. A good forecasting model has a value of U < 1.”
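A minimal sketch of these error statistics is shown below, assuming actual and forecast are equal-length in-sample series; the Theil's U shown is one common form, which compares the model's relative errors with those of a naïve no-change forecast.

    import numpy as np

    def error_statistics(actual, forecast):
        actual = np.asarray(actual, dtype=float)
        forecast = np.asarray(forecast, dtype=float)
        err = actual - forecast

        me = err.mean()                              # mean error (should be near 0)
        mad = np.abs(err).mean()                     # mean absolute deviation
        mse = (err ** 2).mean()                      # mean squared error
        rmse = np.sqrt(mse)                          # root mean squared error
        mape = np.abs(err / actual).mean() * 100     # mean absolute percentage error

        # Theil's U (one common form): the model's relative errors compared
        # with those of a naive "no change" forecast; U < 1 beats the naive model.
        rel_model = (forecast[1:] - actual[1:]) / actual[:-1]
        rel_naive = (actual[1:] - actual[:-1]) / actual[:-1]
        theil_u = np.sqrt((rel_model ** 2).sum() / (rel_naive ** 2).sum())

        return {"ME": me, "MAD": mad, "MSE": mse,
                "RMSE": rmse, "MAPE": mape, "Theil U": theil_u}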
Regression with economic factors
Regression with economic factors uses historical data as input. For the Auto Parts case study, the dependent variable is sales from 2004 through 2007, and the independent variables M2, the non-farm activity index, and oil prices represent the economic factors that could potentially impact sales during 2008. After the first regression with factors was run, M2 was non-significant because its P value was greater than 0.05. Like the time-series regression, this approach works with a level (intercept) L, a trend b, and a seasonality S, and regression is the methodology used for forecasting. In regression with factors, the intercept, trend, and seasonality are constant.
The L, b, and S values were calculated using the notes on exponential smoothing. After reviewing the model with the alpha, beta, and gamma constants, the model was optimized with Microsoft Solver, using the root mean squared error as the target cell to be minimized. Alpha, beta, and gamma affect the L, b, and S values, which in turn affect the forecast model and therefore the error: alpha affects L, beta affects b, and gamma affects S. The model's explanatory power is high, since R-squared equals 93.36%. Based on the F statistic, the regression with factors model is good.
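The following is a minimal sketch of the regression with economic factors, again with placeholder numbers rather than the case data; M2 is omitted because the first pass found it non-significant, as described above.

    import numpy as np
    import statsmodels.api as sm

    sales = np.array([220, 310, 280, 250, 230, 330, 300, 260,
                      240, 350, 310, 270, 255, 365, 325, 285], dtype=float)
    nonfarm_index = np.linspace(100, 112, 16)   # illustrative activity index values
    oil_price = np.linspace(35, 90, 16)         # illustrative oil prices

    X = sm.add_constant(np.column_stack([nonfarm_index, oil_price]))
    model = sm.OLS(sales, X).fit()
    print(model.summary())                      # F statistic, R-squared, P values

    # Forecasting 2008 requires assumed 2008 values of the economic factors
    factors_2008 = np.column_stack([np.linspace(112, 110, 4),   # activity slowing
                                    np.linspace(90, 95, 4)])    # oil still rising
    print(model.predict(sm.add_constant(factors_2008, has_constant='add')))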
Holt-Winters models
“Holt (1957) and Winters (1960) extended Holt’s method to capture seasonality. The Holt-Winters seasonal method comprises the forecast equation and three smoothing equations — one for the level ℓ_t, one for the trend b_t, and one for the seasonal component denoted by s_t, with smoothing parameters α, β* and γ. We use m to denote the period of the seasonality, i.e., the number of seasons in a year. For example, for quarterly data m = 4, and for monthly data m = 12. There are two variations to this method that differ in the nature of the seasonal component. The additive method is preferred when the seasonal variations are roughly constant through the series, while the multiplicative method is preferred when the seasonal variations are changing proportional to the level of the series. With the additive method, the seasonal component is expressed in absolute terms in the scale of the observed series, and in the level equation the series is seasonally adjusted by subtracting the seasonal component. Within each year the seasonal component will add up to approximately zero.
With the multiplicative method, the seasonal component is expressed in relative terms (percentages) and the series is seasonally adjusted by dividing through by the seasonal component. Within each year, the seasonal component will sum up to approximately m” (OTexts, 2013). The difference between regression and Holt-Winters is that regression treats the trend, intercept, and seasonality as constants, whereas in Holt-Winters they change over time: the forecast for a period uses the level and trend from the previous period and the seasonal component from one season earlier. The formulas in the notes on exponential smoothing were used to calculate L, b, and S. After building the model with the L, b, and S values from the previous period, the model can be optimized. For the optimization, Microsoft Solver was used, with the root mean squared error (RMSE) minimized as the target cell. Alpha, beta, and gamma affect L, b, and S, which in turn affect the forecast model and therefore the errors.
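A minimal sketch of the two Holt-Winters models is shown below, using the same placeholder sales series as before; here statsmodels chooses alpha, beta, and gamma by minimizing the in-sample error, playing the role that Solver plays in the Excel workbook.

    import numpy as np
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    sales = np.array([220, 310, 280, 250, 230, 330, 300, 260,
                      240, 350, 310, 270, 255, 365, 325, 285], dtype=float)

    additive = ExponentialSmoothing(sales, trend='add', seasonal='add',
                                    seasonal_periods=4).fit()
    multiplicative = ExponentialSmoothing(sales, trend='add', seasonal='mul',
                                          seasonal_periods=4).fit()

    # Compare in-sample RMSE and print the four 2008 quarterly forecasts
    for name, fit in [('additive', additive), ('multiplicative', multiplicative)]:
        rmse = np.sqrt(np.mean((sales - fit.fittedvalues) ** 2))
        print(name, 'RMSE:', round(rmse, 2), '2008 forecast:', fit.forecast(4))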
References
BusinessDictionary.com. (2013). Forecasting. Retrieved from http://www.businessdictionary.com/definition/forecasting.html#ixzz2nKL11Ly4
Doane, D. P., & Seward, L. E. (2013). Applied Statistics in Business and Economics (4th ed.). U.S.: McGraw-Hill Education.
Makridakis, S., Wheelwright, S. C., & Hyndman, R. J. (1998). Forecasting: Methods and Applications (3rd ed.). New York: John Wiley & Sons, Inc.
Microsoft Office Excel. (2013). Redmond, WA: Microsoft Corporation.
OTexts.com. (2013). Forecasting: Principles and Practice.