Final published version
Research output: Contribution to Journal/Magazine › Journal article › peer-review
Journal publication date | 17/01/2011
---|---
Journal | Journal of Intelligent and Fuzzy Systems
Issue number | 1
Volume | 22
Number of pages | 5
Pages (from-to) | 15-19
Publication status | Published
Original language | English
In fuzzy time series analysis, determining the length of intervals is an important issue. In many recent studies, the interval length has been determined intuitively. To determine the interval length efficiently, Huarng [4] proposed two approaches, one based on averages and one based on distributions. In this paper, we propose a new method that determines the interval length through a single-variable constrained optimization. To find the interval length that gives the best forecasting accuracy, we use a MATLAB function that employs an algorithm based on golden section search and parabolic interpolation. Mean square error (MSE) over the forecasted observations is used as the measure of forecasting accuracy, so the objective function value is the MSE of the forecasts. The proposed method is applied to forecasting the enrollments of the University of Alabama, where it considerably outperforms existing methods.
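The optimization step described in the abstract can be illustrated with a minimal sketch. The code below is not the authors' implementation: it shows plain golden-section search (one of the two techniques the cited MATLAB routine combines) minimizing a hypothetical stand-in for the MSE-versus-interval-length objective; the quadratic `mse` function and its minimum at 250 are invented purely for illustration, since the real objective would require running the full fuzzy time series forecasting procedure for each candidate length.

```python
def golden_section_search(f, a, b, tol=1e-6):
    """Minimize a unimodal function f on [a, b] by golden-section search."""
    invphi = (5 ** 0.5 - 1) / 2  # 1/phi, approximately 0.618
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            # Minimum lies in [a, d]: shrink from the right.
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            # Minimum lies in [c, b]: shrink from the left.
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

# Hypothetical stand-in for "MSE of the forecasts as a function of
# interval length"; in the paper this would be the full forecasting run.
def mse(length):
    return (length - 250.0) ** 2 + 10.0

best_length = golden_section_search(mse, 1.0, 1000.0)
```

In the actual method, each evaluation of the objective would fuzzify the data with the candidate interval length, produce forecasts, and return their MSE; the search then converges on the length with the lowest error.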