03/14/17
Gathering Data
Introduction:
The objective of this lab was to review the process of collecting data from internet sources, import that data into ArcGIS, join attribute tables, and project data from these different sources into one coordinate system. A geodatabase was also created to store the data.
The scenario represents the first of several steps in an ongoing project: building a suitability and risk model for sand mining in Wisconsin. As discussed in a previous post, sand mining in Wisconsin is very controversial, which makes such a model especially important.
Overall, the objectives can be laid out as follows.
1. Download data from different websites.
2. Import the data and join certain tables.
3. Create a python script to project, clip, and load all of the data into the geodatabase that was created.
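Objective 2, joining tables, refers to attaching a dataset's attribute tables to its spatial features. As a minimal illustration of what an attribute join does (inside ArcGIS this is performed with `arcpy.management.AddJoin`), the sketch below joins records on a shared key. The `MUKEY` field name follows the SSURGO soil-data convention, but the specific fields and values here are assumptions, not the lab's actual tables.

```python
# Illustration of an attribute join on a shared key field.
# SSURGO soil polygons, for example, join to their attribute
# tables on MUKEY; the values below are made up for the sketch.

spatial_records = [
    {"OBJECTID": 1, "MUKEY": "100"},
    {"OBJECTID": 2, "MUKEY": "200"},
]
attribute_table = {
    "100": {"drainagecl": "Well drained"},
    "200": {"drainagecl": "Poorly drained"},
}

def join_on_key(records, table, key):
    """Attach matching attribute-table rows to each spatial record."""
    return [{**rec, **table.get(rec[key], {})} for rec in records]

joined = join_on_key(spatial_records, attribute_table, "MUKEY")
print(joined[0]["drainagecl"])  # the attribute now travels with the feature
```

In ArcGIS the same result is reached interactively or with `AddJoin`; the point is that each spatial feature gains the attributes of its matching table row.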
Methods:
Data for the project first had to be retrieved and downloaded from online sources; the sources used are numbered below. After the data were downloaded, they were unzipped and then loaded into the proper geodatabase.
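The download-and-unzip step can be sketched in Python as well. This is a generic sketch, not the lab's actual script, and any URL passed to it would be a placeholder for the source pages listed below.

```python
# Sketch of downloading a zipped dataset and extracting it.
# Any URL supplied is assumed to point at a .zip archive.
import io
import urllib.request
import zipfile

def download(url):
    """Fetch the raw bytes of a zipped dataset."""
    with urllib.request.urlopen(url) as resp:
        return resp.read()

def extract(zip_bytes, dest_dir):
    """Unzip the downloaded archive into dest_dir; return member names."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        zf.extractall(dest_dir)
        return zf.namelist()
```

Once extracted, the shapefiles and rasters can be loaded into the geodatabase in ArcGIS.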
1. (U.S. Department of Transportation). The first data retrieved was United States rail lines data, located on the U.S. DOT web page (USA DOT Website). After reaching the site, the polyline feature class was selected to download the railway data.
2. (USGS National Map Viewer). Next, the 2011 National Land Cover Database was used to download land cover data for Trempealeau County, Wisconsin (LandCover Data). Elevation data for Trempealeau County was also retrieved from this site.
3. (USDA NASS Geospatial Data Gateway). The crop cover data was found on the U.S. Department of Agriculture website (USDA CropCover), where the Trempealeau County data was navigated to and downloaded.
4. (USDA NRCS Web Soil Survey). After going to the webpage (USDA Web Soil Survey), Trempealeau County was selected as the AOI (area of interest), and the soils data was downloaded.
5. (Trempealeau County Land Records). Trempealeau County data was found on the Trempealeau County website (Tremp County Data), and the entire Trempealeau County geodatabase was downloaded.
After the data were downloaded, Python code written in PyScripter was used to create three separate output rasters: one each from the DEM, the crop cover data, and the land cover data. The Python code can be found here (Python Script).
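The script's batch pattern can be sketched as follows: for each input raster, project it, clip it to the county boundary, and load the result into the geodatabase. The arcpy tool names in the comments (`ProjectRaster`, `Clip`) are real Data Management tools, but the paths, loop body details, and spatial reference are assumptions, and arcpy itself requires an ArcGIS installation, so the geoprocessing calls are shown only as comments.

```python
# Sketch of the batch project/clip/load loop, assuming three input
# rasters and a hypothetical geodatabase path.
import os

rasters = ["dem.tif", "cropcover.tif", "landcover.tif"]
gdb = r"C:\gis\TrempealeauCounty.gdb"  # assumed geodatabase path

def output_name(raster, gdb):
    """Build the in-geodatabase output name for a clipped raster."""
    base = os.path.splitext(os.path.basename(raster))[0]
    return os.path.join(gdb, base + "_clip")

for raster in rasters:
    out = output_name(raster, gdb)
    # With arcpy available, the per-raster steps would be roughly:
    #   arcpy.management.ProjectRaster(raster, projected, spatial_ref)
    #   arcpy.management.Clip(projected, "#", out, county_boundary,
    #                         clipping_geometry="ClippingGeometry")
    print(out)
```

Looping over the rasters this way is what makes the script worthwhile: the same three steps run on every dataset without repeating code.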
Results:
A map was created from the three output rasters which shows crop cover, general land cover, and elevation (Figure 1). Trempealeau County is located in a very hilly region; the many hills, or "bluffs," in the county can be clearly seen on the DEM. The landscape is dominated by deciduous forest mixed with agricultural fields. Forming the southern border of the county is the Mississippi River, which provides a large amount of wetland habitat.
Figure 1.
Data Accuracy:
The data accuracy was assessed based on the metadata provided. Certain metadata, such as planimetric coordinate accuracy, was difficult to locate. The metadata are shown in the table below (Figure 2).
Figure 2. Metadata displayed. Certain metadata proved difficult to locate and is marked as NA.
Conclusion:
A great amount of data is available online, and understanding how to properly download it is an important skill for completing a project. Furthermore, many online datasets are extremely large, and working with them can become time consuming; using Python scripts to automate processing can save time, and the larger the dataset, the more important that automation becomes. Finally, metadata should always be collected for online datasets, as it gives an indication of the data's integrity. If a dataset has no metadata associated with it, there is cause for concern.