Proceedings of the 2016 4th International Conference on Mechanical Materials and Manufacturing Engineering

An automatic method to extract Populus Euphratica forest in a large area using remote sensing

Authors
Y.Y. Wang, B. Zhong, F.J. Shang
Corresponding Author
Y.Y. Wang
Available Online October 2016.
DOI
10.2991/mmme-16.2016.61
Keywords
Google Earth; Populus Euphratica Forest; Object-oriented Classification; Automation; High spatial resolution
Abstract

Populus euphratica is a salt-tolerant desert tree growing in arid regions. It is drought-enduring and also acts as a windbreak that stabilizes sand in desert areas; these characteristics give it an important role in maintaining the hydrological and ecological balance of deserts. Mapping the distribution of Populus euphratica forest is therefore required for managing water resources in arid regions. However, the trees are distributed sparsely and individual crowns are very small, so they are difficult or even impossible to extract over a large area using moderate spatial resolution remote sensing data; high spatial resolution data are required. Yet using high and very high spatial resolution imagery over a large area is usually impractical because of high cost and slow processing, and the manual procedures usually involved further reduce its feasibility. Consequently, there is hardly any published research on extracting Populus euphratica over a large area. In this context, this paper proposes an automatic method to extract Populus euphratica forest in the lower reaches of the Heihe River (a total area of approximately 21,646.6 km2) using an object-oriented classification method. Firstly, high spatial resolution data (better than 2 m) are extracted from Google Earth (the imagery is free to use, and higher-level products derived from Google Earth images do not raise copyright issues). Secondly, the extracted images are mosaicked automatically using the application programming interface provided by Google Earth. Thirdly, based on an analysis of the characteristics of Populus euphratica as image objects, a set of rules for extracting it with the object-oriented method is constructed. Finally, manual inspection is employed to verify the accuracy of the extraction results, which show an accuracy better than 87%.
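The rule-based, object-oriented extraction step can be illustrated with a minimal sketch. This is not the authors' actual rule set: the greenness index, the connected-component grouping, and all thresholds (`green_thresh`, `min_pixels`, `max_pixels`) are illustrative assumptions standing in for the paper's rules on object spectra and size.

```python
# Hedged sketch of object-based extraction: (1) compute a simple greenness
# index from an RGB tile, (2) threshold it, (3) group pixels into objects by
# 4-connected component labeling, (4) keep objects whose size matches small,
# sparse tree crowns. All thresholds are illustrative, not from the paper.
import numpy as np

def greenness(rgb):
    """Excess-green index 2G - R - B, a common proxy when only RGB is available."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    return 2 * g - r - b

def label_objects(mask):
    """4-connected component labeling via an explicit flood-fill stack."""
    labels = np.zeros(mask.shape, dtype=int)
    h, w = mask.shape
    current = 0
    for i in range(h):
        for j in range(w):
            if mask[i, j] and labels[i, j] == 0:
                current += 1
                stack = [(i, j)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and mask[y, x] and labels[y, x] == 0:
                        labels[y, x] = current
                        stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
    return labels, current

def extract_trees(rgb, green_thresh=20.0, min_pixels=4, max_pixels=400):
    """Boolean mask of objects accepted as tree-crown candidates."""
    mask = greenness(rgb) > green_thresh
    labels, n = label_objects(mask)
    keep = np.zeros_like(mask)
    for k in range(1, n + 1):
        obj = labels == k
        size = obj.sum()
        if min_pixels <= size <= max_pixels:  # reject pixel noise and large vegetated fields
            keep |= obj
    return keep

# Tiny synthetic tile: one green blob on a sandy background
tile = np.full((10, 10, 3), (180, 160, 120), dtype=np.uint8)
tile[3:6, 3:6] = (60, 140, 50)  # a 9-pixel "tree crown"
print(extract_trees(tile).sum())  # -> 9
```

The size rule is what makes the approach object-oriented rather than purely per-pixel: isolated noisy pixels and large contiguous vegetation (e.g. cropland) are both rejected even when they pass the spectral threshold.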
The proposed method extracts Populus euphratica forest from Google Earth imagery automatically, at low cost and with high precision, and it offers a feasible technical solution for extracting thematic information automatically in general. Moreover, it can enable a wide range of applications that provide high-precision, high-resolution thematic products at very low cost for socio-economic development; it therefore has great value for remote sensing applications.

Copyright
© 2016, the Authors. Published by Atlantis Press.
Open Access
This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
Volume Title
Proceedings of the 2016 4th International Conference on Mechanical Materials and Manufacturing Engineering
Series
Advances in Engineering Research
Publication Date
October 2016
ISBN
978-94-6252-221-3
ISSN
2352-5401
DOI
10.2991/mmme-16.2016.61

Cite this article

TY  - CONF
AU  - Y.Y. Wang
AU  - B. Zhong
AU  - F.J. Shang
PY  - 2016/10
DA  - 2016/10
TI  - An automatic method to extract Populus Euphratica forest in a large area using remote sensing
BT  - Proceedings of the 2016 4th International Conference on Mechanical Materials and Manufacturing Engineering
PB  - Atlantis Press
SP  - 266
EP  - 273
SN  - 2352-5401
UR  - https://doi.org/10.2991/mmme-16.2016.61
DO  - 10.2991/mmme-16.2016.61
ID  - Wang2016/10
ER  -