Comparing prediction algorithms in disorganized data
Abstract
The real estate market is very active today, but finding the best price for a house is a difficult problem, and that problem motivates this work. In this study, we compare prediction algorithms to determine which performs best on disorganized house data. The dataset was collected from real estate websites, and three different regions were selected for the experiment. The KNN, KStar, Simple Linear Regression, Linear Regression, RBFNetwork, and Decision Stump algorithms were used. The results show that the KStar and KNN algorithms outperform the other prediction algorithms on disorganized data.
Keywords: KNN, simple linear regression, RBFNetwork, disorganized data.
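The algorithm names used in the study match Weka's implementations (Witten & Frank, 2005), so a minimal comparison along these lines could look like the Java sketch below. This is an illustrative assumption, not the authors' actual setup: the file name houses.arff, the choice of the last attribute as the price, the value k = 3 for KNN, and the use of 10-fold cross-validation are all placeholders.

```java
import java.util.Random;

import weka.classifiers.Classifier;
import weka.classifiers.Evaluation;
import weka.classifiers.functions.LinearRegression;
import weka.classifiers.functions.SimpleLinearRegression;
import weka.classifiers.lazy.IBk;
import weka.classifiers.lazy.KStar;
import weka.classifiers.trees.DecisionStump;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class PricePredictionComparison {

    public static void main(String[] args) throws Exception {
        // Load the scraped house listings; "houses.arff" is a placeholder file name.
        Instances data = new DataSource("houses.arff").getDataSet();
        // Assume the sale price is the last attribute.
        data.setClassIndex(data.numAttributes() - 1);

        // Regression schemes compared in the study. RBFNetwork
        // (weka.classifiers.functions.RBFNetwork) is omitted here because it ships
        // as an optional package in recent Weka releases; add it if installed.
        Classifier[] schemes = {
            new IBk(3),                   // KNN with k = 3 (assumed setting)
            new KStar(),                  // KStar, entropy-based instance learner
            new SimpleLinearRegression(), // regression on the single best attribute
            new LinearRegression(),       // multiple linear regression
            new DecisionStump()           // one-level tree
        };

        for (Classifier scheme : schemes) {
            // Same data, folds, and random seed for every scheme.
            Evaluation eval = new Evaluation(data);
            eval.crossValidateModel(scheme, data, 10, new Random(1));
            System.out.printf("%-25s CC=%.3f  MAE=%.2f  RMSE=%.2f%n",
                scheme.getClass().getSimpleName(),
                eval.correlationCoefficient(),
                eval.meanAbsoluteError(),
                eval.rootMeanSquaredError());
        }
    }
}
```

Running each scheme under the same cross-validation split and comparing the cross-validated correlation coefficient, MAE, and RMSE is one way the ranking reported in the abstract (KStar and KNN ahead of the linear and stump models) could be reproduced.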
Article Details
This work is licensed under a Creative Commons Attribution 4.0 International License.
References
Altman, N. S. An introduction to kernel and nearest-neighbor nonparametric regression. The American Statistician, 1992, 46(3), pp. 175-185.
Cleary, J. G. and Trigg, L. E. K*: An instance-based learner using an entropic distance measure. Proceedings of the 12th International Conference on Machine Learning, 1995, pp. 108-114.
Broomhead, D. S. and Lowe, D. Radial basis functions, multi-variable functional interpolation and adaptive networks. Royal Signals and Radar Establishment, 1988.
Iba, W. and Langley, P. Induction of one-level decision trees. Proceedings of the Ninth International Conference on Machine Learning, 1992, pp. 233-240.
Haara, A. and Kangas, A. S. Comparing K nearest neighbours methods and linear regression – is there reason to select one over the other? MCFNS, 2012, 4(1), pp. 50-65.
Sahibinden.com. [Online]. Available from: www.sahibinden.com.
Hurriyet Emlak. [Online]. Available from: www.hurriyetemlak.com.
Visual Web Ripper. [Online]. Available from: http://www.visualwebripper.com/.
Witten, I. H. and Frank, E. Data Mining: Practical Machine Learning Tools and Techniques. Morgan Kaufmann, 2005.