Wednesday, December 21, 2016

Field Activity 13: Pix4D Demo

Introduction

The objective of this lab was to gain experience with the processing software Pix4Dmapper and several of its tools. Pix4Dmapper transforms aerial imagery captured by an unmanned aerial vehicle (UAV) or a manned aircraft into one large mosaic that can be processed into 2D or 3D surface models. The software, paired with a platform such as a UAV, has proven to be a cost-effective way for companies to analyze their land and produce high-resolution imagery to support their work. For example, sand mining companies can use the volume tool within the program to calculate the amount of sand extracted from the ground in a given day. This lab focused on a mining site located just southwest of the University of Wisconsin-Eau Claire (Figure 1). The aerial imagery was captured using a Phantom drone provided by the university.

Figure 1: The location of the mining site in Eau Claire

Background

Prior to beginning the lab, it is important to research the software and understand its capabilities and data parameters. The following questions were answered using the online Pix4D Software Manual.
  • What is the overlap needed for Pix4D to process imagery?
    • Overlap refers to how much neighboring images share coverage so they can be tied together into one image (termed a mosaic). In most cases, roughly 75% frontal overlap and 60% side overlap are needed (a quick spacing calculation based on these figures follows this list).
    • It is also recommended that the area of interest be captured in a grid patterned flight (Figure 2) in order to ensure proper overlap. 
      Figure 2: Recommended flight path to create the best overlap for mosaic
       (Pix4D Software Manual 2016)
  • What if the user is flying over sand/snow, or uniform fields?
    • 85% frontal overlap and at least 70% side overlap.
    • It is also recommended to fly at a higher altitude for better results.
  • What is Rapid Check?
    • Processing template that creates output imagery faster but with lower accuracy. 
      • Reduces the resolution of the original images
      • Fewer keypoints are used on each image 
  • Can Pix4D process multiple flights? What does the pilot need to maintain if so?
    • Pix4Dmapper can process images taken from multiple flights as long as....
      • Adequate overlap is achieved
      • Same conditions (sun direction, weather conditions, no new buildings, etc.) are present during data acquisition.
  • Can Pix4D process oblique images? What type of data do you need if so?
    • Pix4D can process oblique images if.....
      • Enough overlap in each dataset and between datasets. 
  • Are GCPs necessary for Pix4D? When are they highly recommended?
    • No, GCPs are not necessary, but they are useful to have and are highly recommended during tunnel reconstruction.
  • What is the quality report?
    • Processing report that identifies any errors with the image data.
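To make the overlap figures above concrete, here is a minimal spacing sketch (not from the Pix4D manual): it uses a simple pinhole-camera footprint model with hypothetical camera and altitude values to estimate how far apart exposures and flight lines would need to be.

```python
# Minimal sketch (not from the Pix4D manual): translating overlap percentages
# into capture spacing. Camera values and altitude below are hypothetical.
def ground_footprint(sensor_mm, focal_mm, altitude_m):
    """Ground distance covered by one image dimension (simple pinhole model)."""
    return sensor_mm / focal_mm * altitude_m

def spacing(footprint_m, overlap):
    """Distance between exposures (or flight lines) for a given overlap fraction."""
    return footprint_m * (1.0 - overlap)

altitude = 80.0                                           # flight height in meters (assumed)
footprint_along = ground_footprint(4.7, 3.6, altitude)    # sensor height / focal length in mm
footprint_across = ground_footprint(6.3, 3.6, altitude)   # sensor width / focal length in mm

print(spacing(footprint_along, 0.75))   # shutter spacing for 75% frontal overlap
print(spacing(footprint_across, 0.60))  # flight-line spacing for 60% side overlap
```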

Methods

To begin, the data were copied into a personal folder. The image data were then loaded into a new Pix4D project. From this point, a processing option had to be chosen; the 3D Maps template was used, which allows a digital elevation model (DEM) and orthophotos to be created. After the initial mosaic was generated, an area of interest (AOI) had to be defined. To save time, a small section of the mine was selected because processing within the software takes a long time (Figure 3).
Figure 3: The area of interest, outlined in red, was kept small to speed up processing

The next step was to begin processing the AOI imagery. There are a total of three processing stages, beginning with initial processing, which allows the user to change processing options and select the information to be included in the quality report. The quality report was reviewed following this step (Figure 4). Based on this report, there was a 30% difference in the camera parameters and a small issue with the ground control points.
Figure 4: The quality report from the initial processing step. 
The next processing step was the point cloud, mesh, and DSM generation. This creates a 3D representation of the data that can be seen in the results. The next steps were to use the various tools within the software. To begin, the volume tool allows one to calculate the volume of material in a mound; in this case, the mound was made of frac sand. The volume was calculated by manually enclosing the pile of frac sand (Figure 5). The results are then calculated using the pixel and elevation information within the specified area (Figure 6). Next, the DEM results were explored with different display variations (Figures 7-8). The colored DEM was selected to show the differences in elevation more clearly.
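As an aside, the idea behind the volume tool can be sketched outside of Pix4D: sum the height of each DSM cell above an assumed base surface and multiply by the cell area. The elevations, base level, and cell size below are made up for illustration and this is not presented as Pix4D's internal algorithm.

```python
import numpy as np

# Hypothetical DSM cells (elevations in meters) covering the outlined pile
dsm = np.array([[245.2, 245.6, 245.4],
                [245.9, 246.3, 245.8],
                [245.3, 245.7, 245.2]])
base_elevation = 244.8   # assumed elevation of the surrounding ground surface
cell_size = 0.05         # assumed ground sample distance in meters

heights = np.clip(dsm - base_elevation, 0, None)    # material height above the base in each cell
volume_m3 = heights.sum() * cell_size ** 2           # cell heights times cell area
print(round(volume_m3, 4), "cubic meters")
```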


Results

Volume:
                           
                       Figure 5: The study area, with the pile of frac sand whose volume was calculated located in the lower right of the image.

 
Figure 6: The results of the volume calculation.
DEM: 
Figure 7: The DEM of the study area. High points are shown in red and low points in blue.


Figure 8: The full mosaic of the data before DEM creation compared to after.

Discussion/Conclusion:

The results from the Pix4D software show the large number of tools and capabilities that the software has to offer. Though this lab was meant to be just a preview of those capabilities, it is clear that the software can be applied to many different fields such as business, environmental sustainability, and agriculture.

Tuesday, November 29, 2016

Arc Collector 2: Creating your own database, features, and domains for deployment and use in ArcCollector

Introduction

This lab was designed to test the ArcCollector skills learned in the previous lab and provide experience in creating a geodatabase with features and domains. The project chosen involved the bike racks located on the lower campus of the University of Wisconsin-Eau Claire (Figure 1). The project should provide the university with information on whether or not there is enough bike parking for students and faculty alike. Whether for environmental, monetary, or health reasons, a large number of bikers commute to and from the university, and the university needs to be able to accommodate the large number of bikes that must be stored throughout the day. During the week, except for Fridays, the bike racks located between Davies Center and Phillips Hall, at the Old Library, and between Schneider Hall and Centennial Hall are very full, leaving little, if any, space for more. The data collection took place on a Thursday to provide the best overall snapshot within the time allotted for the project.
Figure 1: Map of the study area

Methods

Prior to heading out into the field, the geodatabase had to be set up. To begin, a file geodatabase was created in a folder dedicated to this lab within a personal folder. The geodatabase was named "bike_racks" following the naming rules learned in GIS I. Next, ArcMap was opened and domains for the geodatabase had to be set (Figure 2); a scripted version of these steps is sketched after Figure 2. Three domains were created within the geodatabase. The first domain, Capacity, refers to the number of bikes each rack can hold; this domain was a short integer type with a range of 0-100. Next, a domain, NumberBikes, was made for recording the number of bikes on the rack at the time of data collection. Its range was also 0-100 because the count could not reasonably exceed the capacity. The third domain was Racksize, made to classify the size of each bike rack based on its full capacity. This domain was set as a text field with coded values of small, medium, or large; small was a capacity of 5-10, medium was 11-20, and large was anything over 20.
Figure 2: The domains set for the geodatabase. The highlighted domain, Racksize, shows the coded values and the ranges set in place for each one.
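The same domain setup could, in principle, be scripted with arcpy. The sketch below mirrors the three domains described above; the geodatabase path is an assumption, and the lab itself worked through the ArcCatalog interface.

```python
import arcpy

# Hypothetical path; the lab created the geodatabase and domains through ArcCatalog.
gdb = r"C:\labs\arccollector2\bike_racks.gdb"

# Range domain for rack capacity (0-100 bikes)
arcpy.management.CreateDomain(gdb, "Capacity", "Bikes a rack can hold", "SHORT", "RANGE")
arcpy.management.SetValueForRangeDomain(gdb, "Capacity", 0, 100)

# Range domain for the number of bikes present at collection time (0-100)
arcpy.management.CreateDomain(gdb, "NumberBikes", "Bikes present on the rack", "SHORT", "RANGE")
arcpy.management.SetValueForRangeDomain(gdb, "NumberBikes", 0, 100)

# Coded text domain for the rack size class
arcpy.management.CreateDomain(gdb, "Racksize", "Rack size class", "TEXT", "CODED")
for size in ("small", "medium", "large"):
    arcpy.management.AddCodedValueToDomain(gdb, "Racksize", size, size)
```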
The next step was to create and define a feature class. The feature class created was named "bike_rack". The projected coordinate system applied was WGS 1984 Web Mercator (auxiliary sphere); this coordinate system was chosen because it is universal, so the data could be opened and used worldwide. Next, four fields were created. The first three corresponded to the domains listed above: capacity, number of bikes, and size. The final field was notes, set as a text field to allow any additional notes to be taken. Finally, the feature class was enabled to have picture attachments. Another feature class was created for the study area; this feature class was a polygon type and was digitized by "freemouse" (aka freehand, but on the computer) (refer back to Figure 1).
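A hedged arcpy sketch of the feature class setup described above follows; the path, field names, and the use of EPSG 3857 for WGS 1984 Web Mercator (auxiliary sphere) are assumptions, since the lab did this through the ArcMap/ArcCatalog interface.

```python
import arcpy

gdb = r"C:\labs\arccollector2\bike_racks.gdb"   # hypothetical path

# Point feature class in WGS 1984 Web Mercator (auxiliary sphere), EPSG 3857
sr = arcpy.SpatialReference(3857)
fc = arcpy.management.CreateFeatureclass(gdb, "bike_rack", "POINT", spatial_reference=sr)

# Fields tied to the domains created earlier, plus a free-text notes field
arcpy.management.AddField(fc, "capacity", "SHORT", field_domain="Capacity")
arcpy.management.AddField(fc, "number_bikes", "SHORT", field_domain="NumberBikes")
arcpy.management.AddField(fc, "size", "TEXT", field_length=10, field_domain="Racksize")
arcpy.management.AddField(fc, "notes", "TEXT", field_length=255)

# Allow photo attachments in Collector
arcpy.management.AddGlobalIDs(fc)
arcpy.management.EnableAttachments(fc)
```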

Results

The results of the lab are shown in Figures 3-5. 
Figure 3: Map showing the locations of the bike racks

Figure 4: Map showing the sizes of bike racks

Figure 5: Map showing the number of bikes present at the time of data collection


Discussion/Conclusion

Based on the results, the conclusion is that the university does not need to add more bike racks, but it should relocate several of them to areas with more foot traffic, such as near Davies Center. In particular, the bike racks located behind Hibbard Hall are hard to access and have not had much use. However, the Davies bike racks were almost at capacity, and quite frankly it is doubtful that any more bikes could have fit with how they had been placed. This area should get more racks.
There were several issues encountered during the lab. The capacity domain that was set was too small, so a couple of bike racks could not have accurate values entered. Another issue was deciding whether to count every individual rack or record just one point for a set of racks in one area. The final issue was that the rack size ranges were underestimated; most of the racks, though spanning a wide range of capacities, were labeled as large.

Tuesday, November 15, 2016

Field Activity #9: ArcCollector Part 1- Microclimate

Introduction

The overall purpose of the lab was to gain experience using ArcCollector by ESRI. ArcCollector is a mobile app that is available for download through any app store (iPhone and Android). Collector allows you to update, observe, and create maps and features on the map. ArcCollector can be used both online and offline and allows for efficient data collection in the field. ArcCollector has many of the same main features as ArcGIS for Desktop, such as the ability to create features of any type (point, line, polygon) and add several predefined attributes.

For this lab, the class was broken down into nine groups of two. The campus was divided into five large zones (Figure 1). The main zone worked by group 9 (Jackie Seamans and Amanda Senger) was zone 5, which covered most of upper campus. The goal was to collect various points throughout both upper and lower campus.
Figure 1: Map of the UW- Eau Claire campus outlining the various zones for data collection

Methods

The points being collected throughout the lab were micro-climate data points. There were several attributes collected along with the location. These attributes included...
  • Group Number
  • Temperature (F)
  • Dew Point
  • Wind Speed
  • Wind Direction
  • Notes
Figure 2: Jackie holding the Kestrel device in the
air to get a weather reading 
The notes section allowed for any additional information about the point to be recorded such as proximity to water or vents which could alter the rest of the attributes.

Prior to data collection, downloading the app and setting up the online GIS account was necessary. By signing in through the ENTERPRISE option (important that it be enterprise) and specifying "uwec", you are then able to sign in with your school account. Because Professor Hupy had already created the online map, downloading and saving the map to our own accounts was fairly simple: after being granted access to the geospatial field methods group on ArcGIS Online, a copy of the map was made and saved. From here, the next step was to download the app, again sign in through the ENTERPRISE access, and open the saved map. Data could then be collected by the whole class at the same time, and the nine groups separated to collect it.

The device used to collect the temperature, dew point, and wind speed was a Kestrel weather meter (Figure 2). To collect, you simply hold the meter up in the air, making sure not to breathe on it as that will affect the dew point reading, and find the desired measurement by pressing the right and left arrows. (Hint: it is important to understand the symbols on the device, as they are all very similar and can cause plenty of confusion.)
Figure 3: Data point collected almost in the river.

As the professor stated, it was important to collect data points across a diverse set of locations: areas near water and areas that are completely dry, points in the sun and in the shade, and both low and high elevations. As seen in Figure 3, some data were collected along the river, which one would expect to result in a higher dew point.




Figure 4: Jackie using the compass while facing true north
to determine the azimuth angle of the wind direction


In order to determine the wind direction, the wind direction was first estimated by the users. Next, using a compass and facing north, the azimuth was determined based on where the wind was COMING from (it is important to note that wind direction is classified by the direction the wind is coming from, not the direction it is going) (Figure 4). This measurement was expressed in degrees clockwise from true north.
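For later summarizing of the wind-direction attribute, a tiny helper like the one below (not part of the lab workflow) can turn an azimuth in degrees into a compass label.

```python
# Tiny helper (not part of the lab workflow) that converts an azimuth in degrees,
# measured clockwise from true north, into a compass label for the direction the
# wind is coming from.
def azimuth_to_cardinal(degrees):
    labels = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]
    return labels[int(((degrees % 360) + 22.5) // 45) % 8]

print(azimuth_to_cardinal(270))  # "W" -> wind coming from the west
```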









Results

The micro-climate data show a few interesting patterns worth acknowledging. Once all of the data were submitted, they were exported into a separate geodatabase where various attributes were mapped. These variables are discussed further in the discussion portion of the lab report.

Temperature Data:
Figure 5 shows the results of the temperature data within the different microclimate zones
Figure 5: Temperature data within the various zones throughout UW-Eau Claire
Dew Point: 
Figure 6 shows the dew point variability throughout the microclimate zones.
Figure 6: The resulting map showing the different dew point readings throughout UW-Eau Claire
Wind Speed:
Figure 7 shows the changes in wind speed throughout the campus.
Figure 7: Wind speed variability mapped throughout UW- Eau Claire

Discussion/ Conclusion

Temperature
The various changes in temperature are most likely due to location and elevation. Many of the cooler temperatures can be found near the river and creek that run through the campus. However, when looking at the notes within the data, warmer temperatures can be found in areas in direct sun, such as parking lots. Interestingly, not a single point was taken in zone 4; this is probably because the zone assignment was misheard in class.

Dew Point
Dew point (the temperature at which air becomes saturated and dew forms) is at first confusing to look at. Because there is more moisture in the air near the water, those areas should actually have higher dew points, which matches the larger symbols along the river.

Wind Speed
The areas with high wind speed appeared to be on the foot bridge, along the point bar of the river, and at higher elevations on upper campus. This is understandable due to the lack of any large buildings or hills blocking the wind.


Overall, the lab did a good job at introducing the class to ArcCollector which will ultimately benefit the class for the next lab. 

Tuesday, November 8, 2016

Field Activity #8: Priory Navigation

Introduction

Figure1: Location of the Priory 
      The purpose of the lab was to use the navigation map previously created to locate five designated points within the Priory (Figure 1). The class was split into six groups. The locations of the points were given to the groups in UTM coordinates, so the UTM navigation map was used for the lab. The group decided to use the map created by Anneli Williams (Figure 2). The map she created was very well put together: it had the proper projection, contour lines, and both a scale bar and a representative fraction. The lab is important because basic navigation skills are essential. In today's world, navigation and other basic skills are dominated by technology. Need to get somewhere you've never been before? Type the address into your phone's GPS and you're good to go. But what happens if your phone dies, you don't have service, or the company messed up their data and you end up on a pedestrian bridge in the middle of a college campus? Being able to navigate, whether on a street or through a forest, is a skill all people, especially geographers, should have. The following post will discuss the methods and tools used during the navigation process, the results of the track logs of all six groups, and a discussion of the results and the issues encountered.

Figure 2: Map used by Group 5 to navigate the Priory. The map has contour lines and a 50 meter grid interval

Methods

Figure 3: Manually placed points on the map. 
      After arriving at the Priory, each group was given a set of five points, their map, a compass, and a GPS unit. To begin, the groups found the set of points on their map, marked them, and connected them with arrows showing the direction of travel (Figure 3). Being group 5, we were assigned the following points:

1. 618011, 4957883
2. 618093, 4957823
3. 618107, 4957942
4. 618195, 4957878
5. 618220, 4957840




Figure 4: Professor Hupy showing the class how
 to use the compass and map to find direction
      Following the manual placement of the points on the map, Professor Hupy provided the class with a quick tutorial on how to use the compass and map for navigation (Figure 4). We were then tasked with determining a pace count for each individual person. This was done by walking alongside a 50-meter measuring tape and counting each step; this number differs per person due to different stride lengths. Next, the GPS unit was set to UTM and the track log was activated, allowing the path of each group to be recorded.









Figure 5: Alignment of the compass edge
 with the line of travel
      Beginning in the parking lot of Priory Hall, the edge of the compass was aligned with the line from the parking lot to point 1 (Figure 5). The compass face was then twisted until the lines within the face lined up with the UTM grid lines. This reading, the "heading," is the direction of travel in degrees from north. Once the direction of travel is determined, the next step is to find the distance. This is done once again by placing the edge of the compass, which has a ruler in centimeters, along the line from point A to point B. This measurement, along with the RF scale on the map (1 cm = 37 meters), was used to calculate the real-world distance from the starting point to the next point.
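A quick back-of-the-envelope sketch of the distance math used on each leg is shown below; the measured map distance and the pace length are hypothetical example values, since each group member calibrated their own pace count.

```python
# Back-of-the-envelope distance math for one navigation leg.
# The measured map distance and pace length are hypothetical example values.
MAP_SCALE_M_PER_CM = 37.0      # from the map's representative fraction (1 cm = 37 m)

def ground_distance_m(map_cm):
    return map_cm * MAP_SCALE_M_PER_CM

def paces_needed(distance_m, paces_per_50m=33):
    """Convert a ground distance to a pace count using a 50 m calibration walk."""
    return distance_m * paces_per_50m / 50.0

leg_cm = 4.2                                   # distance measured along the compass edge
dist = ground_distance_m(leg_cm)
print(round(dist, 1), "m,", round(paces_needed(dist)), "paces")
```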



      With the direction and distance known, the next step is to send a runner toward the end point. The runner walks in as straight a line as possible in the direction of the point until they can no longer be seen or the point is reached (Figure 6). Next, the two pacers walk behind the runner, also in a straight line, counting their paces to estimate the distance traveled and therefore how much farther there is to go (Figure 7). Once the point was located, the coordinates were checked on the GPS to be sure it was from this year's lab (Figure 8). This process was repeated from point to point, though the group ran into several issues that will be discussed later.
Figure 6: The runner making his way to the ending point
   

Figure 7: Pacers counting the steps to the runner.



Figure 8: Coordinates on the GPS at a point
double checked with those given to ensure accuracy
















Results

The track logs from each group's GPS unit were uploaded onto the computer and added to a shared folder. The logs were then brought into ArcMap for comparison (Figure 9). This allowed all groups to see the navigation route taken by each group throughout the study area. Group 5 is depicted in yellow (Figure 10). As seen, there was confusion with the second point: the point was no longer marked, and the group walked around in circles looking for it (coming across mice and a coyote in the process). In order to find the correct point, the group used the GPS, only once in the correct area, to find the exact location of the point, where it was then marked (Figure 11).
Figure 9: Map showing the route by all groups

Figure 10: Zoomed in map of the route taken by group 5; notice the confusion near the bottom. 
Figure 11: Jeff marking point 2 at the Priory
Because there were six groups, the fifth set of points was located by both group 5 and group 6; however, group 6 did them in reverse. The results of the two groups' track logs were interesting (Figure 12). They show that group 6 did not find the second point (just as we could not at first) because their track stops short of group 5's second point.
Figure 12: Map of the group 5 and group 6 track logs at a larger scale.

Discussion/Conclusion
The navigation lab through the Priory proved to be very helpful when it comes to both creating a navigation map and conducting the actual navigation itself. It is certain that the skills learned in this lab will help in future endeavors by myself and my peers. However, there were several issues encountered by my group. With the rough terrain, it was difficult to walk in a straight line and therefore get an accurate pace count to determine distance. This is something that, with more time and tools to cut dead branches, could be fixed in the future.

Tuesday, November 1, 2016

Field Activity #7: Development of a Field Navigation Map

Introduction

Figure 1: Map showing the location of the
   Priory in relation to the University.
The purpose of the following lab was to create two navigation maps using ArcMap. Basic navigation skills are very important to have in today's world. Like the last lab, this lab emphasizes the unpredictability of the technology many of us rely on. Using basic tools and the maps created here, our task for the next lab is to navigate through an area and collect points; therefore, this lab's results are crucial and must be accurate. The two maps use different coordinate systems than the latitude and longitude seen on most maps today. Rather, one map uses the Universal Transverse Mercator (UTM) coordinate system and the other a geographic coordinate system in decimal degrees. UTM coordinates are well suited to maps of small areas (large cartographic scale), where more detail can be viewed.

Study Area

The site in question is the University of Wisconsin-Eau Claire Children's Nature Academy. For this project, the location will be referred to as the Priory because Priory Hall is located adjacent to it and it sits on Priory Road. The Priory is located roughly 3 miles south of the University (roughly a 10-minute drive) (Figure 1).

Methodology

The creation of the two maps was completed using ArcMap, a computer mapping software produced by the almighty ESRI. All of the data could be found in the class share folder within the Priory geodatabase (Priory.gdb). The first step was to copy and paste this geodatabase into a folder designated for this lab within a personal folder. This ensures that everyone in the class can make changes to the data without compromising anyone else's.

To create these various maps, it is important to have a refresher on what makes a map, a map. In order for a map to be usable in the field and be considered a map it must include: 
  • Title
  • North Arrow
  • Scale Bar
  • Representative fraction 
  • Projection
  • Coordinate System
  • Labeled grid
  • Data sources
  • Cartographer's names
When creating the first map, the UAV imagery of the site was loaded first. This ensured that every layer would be in the same projection, Transverse Mercator. The first map used the UTM coordinate system, and a grid of that coordinate system was added to the map with 50-meter spacing. This spacing was chosen to provide a grid detailed enough for accurate interpretation of the map. The imagery was then set to 50% transparency to make the grid easier to see. The final map included the various components listed above (Figure 2).

When creating the second map, the UAV imagery of the site was loaded first, just as before, so that the projection for every layer added would be the same, Transverse Mercator. This map was to have a grid showing the geographic coordinate system in decimal degrees. A grid of the decimal degree coordinate system was added to the map with a 1-second spacing. This spacing was chosen to provide a grid detailed enough for accurate interpretation of the map. Because the geographic area is so small, the labels were edited to show only to the hundredths of a degree, as seen in Figure 3. The imagery was also set to 50% transparency to make the grid easier to see. The final map included the various components listed above.
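For context, here is a rough check (not part of the lab) of how much ground one second of latitude or longitude covers, assuming a latitude of roughly 44.8 degrees N for the Priory area.

```python
import math

# Rough check (not from the lab) of the ground distance covered by one second
# of latitude and longitude, assuming roughly 44.8 degrees N for the Priory area.
lat = 44.8
m_per_deg_lat = 111_320.0                                  # approximate meters per degree of latitude
m_per_deg_lon = m_per_deg_lat * math.cos(math.radians(lat))

print(m_per_deg_lat / 3600)   # ~30.9 m per second of latitude
print(m_per_deg_lon / 3600)   # ~21.9 m per second of longitude at this latitude
```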

Results

UTM Map

The final product of the first map created, the UTM map, is seen below. As one can see, the 50-meter line spacing allows one to navigate more accurately while not over-crowding the map (Figure 2).
Figure 2: Resulting map using the UTM grid 

Decimal Degrees Map

The second map (Figure 3) shows a detailed image of the Priory area along with a grid in decimal degrees for navigation.
Figure 3: Results of the second navigation map using a decimal degree grid.

Conclusion

The maps created in this lab will be of good use in the lab to come. The navigation lab is crucial because technology is not always available or reliable. The maps were created with the sole purpose of aiding in the navigation process; therefore, certain elements were added, such as the grid or contour lines, to help visualize the landscape. However, the contours tried in this lab made the maps very cluttered. Future work would be to refine this tool to add contour lines with a wider interval.

Tuesday, October 25, 2016

Field Activity #6: Conducting a Distance Azimuth Survey

Introduction

The purpose of the lab was to learn a new surveying method known as distance azimuth. This method of data collection is a very low-tech method. Much of surveying today is highly reliant on technology; the technological advances have had a positive impact, but they come with the disadvantage of possibly malfunctioning while on site. If this were to happen, methods such as the distance azimuth survey would be beneficial to have experience with. The method is built on the concept of azimuth: the direction of an object in relation to a point, expressed in degrees. To gain experience with the method, the class was tasked with plotting the location of various tree types throughout a small area of Putnam Park located near campus (Figure 1).
Figure 1: Location of Putnam park in relation to the University of Wisconsin- Eau Claire


Figure 2: The tools used to conduct the distance azimuth survey.
 A) Garmin handheld GPS unit, B) compass, C) laser distance finder

Methods

Prior to beginning the survey, research was done to avoid as much error as possible. The blogs of previous classes were studied, which allowed the class to learn from their mistakes. The tools used to conduct the survey included a GPS unit, a compass, and a laser distance finder (Figure 2). Manual notes had to be taken as well, so a notebook and writing utensil were also necessary. The use of each tool is described in the data collection section.

      Data Collection

The first step during data collection is to determine a point of origin. When dealing with azimuth, there needs to be one point from which every other feature is measured. In this case, the point was determined and marked using the handheld GPS; this helps when the data are imported into ArcMap and georeferenced. The next step was to identify the various trees in the vicinity and get the distance and the direction in degrees from the point of origin for each one. The information recorded manually on paper included distance, azimuth, tree type, and diameter. A total of 30 data points were collected over the course of the lab.

      

      Data Normalization

The 30 points were compiled in an Excel file and posted to the class discussion forum for everyone to use. Here, the data were organized in a manner that would allow the information to be uploaded into ArcMap; this process is called "normalizing" the data. The final result was an Excel file with columns for X, Y, Distance, Azimuth, Tree type, and Point number (Figure 3).
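The trigonometry behind plotting each tree from the origin can be sketched as below. In the actual workflow this conversion would be handled inside ArcMap (for example with a tool such as Bearing Distance To Line); the origin coordinates and the sample record are made up for illustration.

```python
import math

# Origin coordinates (UTM easting/northing) and the sample record are hypothetical.
origin_x, origin_y = 620000.0, 4960000.0

def offset_point(distance_m, azimuth_deg):
    """Return the (x, y) of a feature measured from the origin by distance and azimuth."""
    rad = math.radians(azimuth_deg)
    return (origin_x + distance_m * math.sin(rad),
            origin_y + distance_m * math.cos(rad))

print(offset_point(18.5, 245))  # e.g., a tree 18.5 m away at an azimuth of 245 degrees
```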



Results

The result of the lab was the following Excel table. This table provides the information that will be used in ArcMap to show the location of various trees throughout Putnam Park.
Figure 3: The final normalized excel file with the various data point information. 

Conclusion

The surveying method used throughout the lab, distance azimuth, is a very low-tech method. However, it is an important method to learn because technology is not always reliable or accurate. The results from this lab will be used in a future lab where a map of the various tree types is created.


Tuesday, October 18, 2016

Field Activity 5: Visualizing and Refining Terrain Data

Introduction


Table 1: Snip from the Excel file used to input elevation, showing the normalized data
The previous lab involved the creation of a landscape and the collection of that landscape's elevation data; its purpose was to gain an understanding of sampling and the various sampling methods. The data collected were put into an Excel file, which was normalized to be compatible with ArcMap, meaning there were only X, Y, and Z columns (Table 1).


The data points of the landscape were collected every 5 cm in the areas with a lot of elevation change (hill, depression, ridge, valley) and every 10 cm in the areas of little change (plain) (Figure 1). These points were plotted in ArcMap to be interpolated and transferred into a 3D visualization software, ArcScene. Interpolation is the calculation method used to estimate new values for points within a known range; this allows for a more detailed dataset and therefore higher accuracy. The purpose of the lab is to create a DEM of the landscape and compare various interpolation methods, including inverse distance weighted (IDW), Natural Neighbors, Kriging, Spline, and TIN.

Figure 1: The landscape created in the sandbox during the previous lab.
The needed features have been identified.

Methods

Figure 2: The data points after being added into ArcMap. Axes and the starting point have been added to provide correct orientation.

Before beginning the processing on the data points, a folder was created along with a geodatabase ("sandbox"), both dedicated to this individual lab to keep the files organized. The Excel file created in the previous lab (refer back to Table 1) was imported into the sandbox geodatabase. To add the data points into ArcMap, the XY data were added via the imported Excel file, and the layer was then exported as a point feature class (Figure 2).
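A rough arcpy equivalent of this import and export step is sketched below; the file paths, sheet contents, and output names are assumptions, since the lab performed these steps through the ArcMap interface.

```python
import arcpy

# Hypothetical paths and names; the lab used "Add XY Data" and "Export Data" in ArcMap.
gdb = r"C:\labs\sandbox\sandbox.gdb"

arcpy.conversion.ExcelToTable(r"C:\labs\sandbox\sandbox_points.xlsx", gdb + r"\points_table")
arcpy.management.MakeXYEventLayer(gdb + r"\points_table", "X", "Y", "points_lyr", in_z_field="Z")
arcpy.management.CopyFeatures("points_lyr", gdb + r"\elevation_points")
```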



Interpolation must be done in order to create an accurate 3D visualization. To do so, the 3D Analyst extension had to be enabled. The feature class was then used as the input for the first interpolation tool, IDW, and this process was repeated for the remaining interpolation methods (Table 2).
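A hedged arcpy sketch of running the five interpolation methods is shown below; the workspace path, output names, and default tool parameters are assumptions, and the lab ran the tools through the ArcToolbox interface.

```python
import arcpy
from arcpy.sa import Idw, NaturalNeighbor, Kriging, KrigingModelOrdinary, Spline

# Hypothetical workspace and output names; default cell sizes and parameters are used.
arcpy.CheckOutExtension("Spatial")
arcpy.CheckOutExtension("3D")
arcpy.env.workspace = r"C:\labs\sandbox\sandbox.gdb"

points, z = "elevation_points", "Z"

Idw(points, z).save("dem_idw")                                  # inverse distance weighted
NaturalNeighbor(points, z).save("dem_natneigh")                 # natural neighbors
Kriging(points, z, KrigingModelOrdinary()).save("dem_kriging")  # ordinary kriging
Spline(points, z).save("dem_spline")                            # spline

# TINs live in a folder rather than a geodatabase
arcpy.ddd.CreateTin(r"C:\labs\sandbox\dem_tin",
                    in_features=[[points, z, "Mass_Points", "<None>"]])
```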


Once the interpolation was finished, each resulting surface was exported to ArcScene. The vertical scale had to be adjusted to the calculated range in order to portray the small area as a more accurate representation of the landscape. The imagery is oriented with the (0,0) point in the lower left corner of the landscape (Figure 4). The various results are discussed further in the "Results" section.
Table 2: The table provides a brief description of the various interpolation methods used to create DEMs of the landscape.



Results/Discussion

The interpolation methods used (IDW, Natural Neighbors, Kriging, Spline, and TIN) show a range of advantages and disadvantages. Some of the methods resulted in similar representations of the landscape created in the previous lab. Each method is analyzed further below.


Inverse Distance Weighted (IDW)


Figure 3: IDW interpolation result

The IDW interpolation result shows the landscape and the required features (Figure 3). However, when analyzing the image, the interpolation clearly creates clusters around the data points, with certain points weighted more heavily. Still, if you compare Figure 3 to Figure 1, it is easy to point out the landscape features. A disadvantage of the IDW method is that it does not produce a smooth topographic representation of the landscape created in the previous lab.

Natural Neighbors


Figure 4: Natural Neighbors interpolation result

The Natural Neighbors interpolation method provided an accurate representation of the landscape created (refer to Figure 1) (Figure 4). An advantage of this method is that it creates a smooth surface compared to the previous method, IDW. The extreme elevations are very clear within the image, and therefore the landscape features are easy to identify; the valley leading into the low area surrounding the hill stands out. The area near the starting point (0,0) is higher than the plain and could be considered an unintentional plateau. However, in the areas with little elevation change, it is difficult to see the detail of the landscape.

Kriging
Figure 5: Kriging interpolation result

The Kriging method of interpolation resulted in an image very similar to the previous method, Natural Neighbors. The landscape shows the surface features of the grid (Figure 5). The ridge near the upper left portion of the landscape appears broken up and not as high as expected; the feature is not as defined as it appeared in the actual landscape (refer back to Figure 1). An advantage of the method is its smoothed surface and transitions. A disadvantage is that there is not much detail in the plain areas; the plain, though it had little change, still had some elevation change that does not appear in the Kriging results.

Spline
Figure 6: Spline interpolation result

The Spline method of interpolation had a large effect on the intensity of the elevation change (Figure 6), which is a major advantage of this method. One aspect that is better with this method is the ability to see even the small elevation changes in the plain and plateau areas. It also represents the highest elevations well. The transitions from one elevation range to another are very smooth, which makes for a more appealing DEM.


TIN
Figure 7: TIN interpolation result
TIN interpolation was the one method that visually stood apart from the others (Figure 7). An advantage of the TIN method is that it shows the elevation extremes. The highest points, at the peak of the hill and another hill in the lower right area of the grid, are a light grey (almost white), which is easy to identify. The lowest elevations, within the depression and valleys, are also a lighter color than the higher surrounding landscape.

Summary/Conclusion

After analyzing the results of the various interpolation methods, it was concluded that Spline does the best job of representing all aspects of the landscape's topography. The Spline method showed smooth transitions between elevation ranges while also intensifying the small changes in the flatter areas. This method's results looked most like the original landscape and best represented the sandbox terrain.

The overall results also showed that more than enough data points had been input into ArcMap to accurately represent the landscape that had been created previously, meaning the lab did not need to be repeated to collect more points. The only errors that may have influenced the data are human errors made while relaying measurements verbally.

This survey provided experience with the advantages of sampling while also making sure to collect enough data to create a full picture of the survey area; this was the most important aspect of this lab compared to others so far. However, it is not always necessary to collect many points in certain regions. As seen in this lab, in areas of little change there was not much need for many points because they were all similar and could be represented with just a few. Conversely, areas with a high amount of change require more points to capture the detail.




Tuesday, October 11, 2016

Field Activity #4: Creation of a Digital Elevation Surface using critical thinking skills and improvised survey techniques

Introduction

Sampling is a technique used to investigate a population by gathering data from a small portion of the whole. Sampling is used to save both time and money and is used in many studies; it provides an overall look at the spatial variation of a phenomenon within a study site. There are three sampling methods: 1) random, 2) systematic, and 3) stratified. The objective of the lab was to create a landscape containing a wide range of elevation and to collect and record that elevation through systematic point sampling, meaning samples were collected at even intervals throughout the study area. In our case, the samples were measurements in centimeters representing elevation.
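A minimal sketch of what systematic point sampling means in this context is shown below: sample locations placed at a fixed 5 cm interval across the sandbox (the 114 by 114 cm dimensions come from the Methods section below).

```python
# Systematic sample locations every 5 cm across the sandbox; the 114 x 114 cm
# dimensions come from the Methods section below.
grid_spacing_cm = 5
extent_cm = 114

sample_points = [(x, y)
                 for x in range(0, extent_cm + 1, grid_spacing_cm)
                 for y in range(0, extent_cm + 1, grid_spacing_cm)]
print(len(sample_points), "systematic sample locations")
```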


Methods 

               Figure 1: The landscape created with all of the required features.
The elevation of the landscape was collected by taking measurements using a systematic point sampling technique. In order to accurately portray the topography of the landscape, measurements had to be taken fairly often at a regular interval. The landscape was created in a 45 by 45 in (114 by 114 cm) sandbox located east of Phillips Hall. Designing the landscape involved several parameters: it was required to have a hill, a ridge, a valley, a depression, and a plain. Creation of the landscape was done by hand (Figure 1). The materials given to us were string, wall tacks, tape, a measuring tape, and a meter stick. Using the wall tacks and meter stick, every 5 cm was marked on all four sides of the sandbox. After placement of the tacks, the string was used to create a grid over the top of the landscape (Figure 2). Our sea level, or zero elevation, was the actual ground; this was chosen to ensure that negative values would not be measured.

         Figure 2: The string was wrapped around the tacks to create a grid over the landscape

To collect the elevation points, a metal hanger was straightened and pushed down through the sand within one of the grid squares (Figure 3). Once the hanger reached the ground, it was taken out and held against a meter stick to read the measurement in centimeters (Figure 4).
Figure 3: The grid helped keep the measurements
 organized and at set increments of 5 cm

Figure 4: Elevation collection

The elevation points were recorded in an Excel file with three columns (Figure 5): X, Y, and Z, so that each point could be plotted on a grid with its elevation. This will help in later visualization of the landscape within ArcMap.

Figure 5: Excel table showing set
 up of elevation measurements 

Results

The overall table consisted of over 400 sample points taken to show the elevation change (relief) of the landscape we created. The Excel table is the most important result of the project. Using formatting that can be read by ArcMap, the goal for future work is to input the table and create a topographic model of the landscape through various processing procedures.
After running statistical measurements on the Z column (which holds the elevation in cm), various statistics were found...
- Maximum elevation = 23.3 cm above sea level
- Minimum elevation = 6.6 cm above sea level
- Mode= 14 cm above sea level

These statistics show the wide range of elevation. The areas designated as the plain resulted in fewer sample values because of the extremely low elevation change. The mode shows that many of the areas were relatively high above sea level, which could indicate thick lithosphere in a real-world analogue.
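The summary statistics listed above could be reproduced with a short pandas sketch; the file name and the X, Y, Z column layout follow Figure 5 but are assumptions.

```python
import pandas as pd

# File name and column layout are assumptions based on Figure 5.
df = pd.read_excel("sandbox_elevations.xlsx")

print(df["Z"].max())          # maximum elevation (cm above the chosen sea level)
print(df["Z"].min())          # minimum elevation
print(df["Z"].mode().iat[0])  # most common elevation value
```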

Conclusion

By gaining an understanding of sampling (in particular the systematic point sampling method), both time and money can be saved. This was witnessed first hand during the collection of the data, which lasted four hours. To show even more detail where it might be needed, the density of data points in areas of great elevation change can be increased; the more data collected, the more likely the surface will be portrayed accurately.