Measure and PrecisionHawk Announce Partnership for Commercial Drone Services

WASHINGTON, Jan. 8, 2015 — (PRNewswire) —  Measure, a 32 Advisors Company, and PrecisionHawk are pleased to announce a partnership to provide unmanned aircraft system (UAS) services. This partnership will allow clients to have quicker, simpler, and more affordable access to advanced aerial data gathering, processing, and analysis through Measure’s turnkey Drone as a Service® offering.

“PrecisionHawk’s analytical software, among the most sophisticated in the world, is a critical tool for clients who need powerful near-real-time data processing capabilities,” commented Brandon Torres Declet, CEO and Co-founder of Measure. “As a Measure partner, we plan on integrating the DataMapper software solution with select clients through our Drone as a Service® offering.”

With Federal Aviation Administration drone regulations beginning to take shape, many companies without a plan for drone operations will likely seek guidance and support to take advantage of this cutting-edge technology. With their partnership, Measure and PrecisionHawk will be uniquely positioned to meet the growing demand.

“As the industry expands, companies are looking for a comprehensive solution to fit into their existing workflow. This extends far beyond a hardware platform,” said Christopher Dean, PrecisionHawk CEO. “In partnering with Measure, we can collectively engage with global clientele in an advisory and operational capacity to ensure users see a return on their investment.”

Both companies provide innovative solutions tailored for commercial clients and bring sector-specific expertise, essential in a field with limitless applications for diverse end-users, including oil & gas companies, utilities, farmers, insurers, mining companies, wildlife conservationists, public safety professionals, filmmakers, and many others. Read the full article here.

How Google’s geo-crowdsourcing is transforming the map

By A. Stefanidis, A. Crooks and A. Croitoru, 12 Jan 2015 (The lead author is Dr. Tony Stefanidis of GMU. –mk)

Google has managed to map most of the world. Recently, the company offered a behind-the-scenes glimpse into how it built the Google Maps application using a combination of technology (the Google Street View car); expansion (the acquisition of satellite-imagery startup Skybox); and algorithms (computer vision, photogrammetry, mapping).

The company’s initial focus had been on the world’s population centres. In 2006, Google used high-resolution satellite imagery to map 37 per cent of the world’s population; by 2012 that number had risen to 75 per cent.

But the company’s reach has now extended beyond human settlements. In Google Maps’ Street View feature, users can now observe penguins in Antarctica, tourists in Machu Picchu, and Himalayan base camps.

While the early focus of Google’s mapping efforts had been on mapping for the world, the company is now jumping on the crowdsourcing bandwagon: to collect mapping data from the world.

With mapping tools like “Google Map Maker” and “Report a Problem,” it tries to harness the geographical contributions of “on the ground” users as a way to complement existing content in Google Maps. People from all over the world can now edit information in the Google Maps application to ensure higher accuracy.

In addition to being editors, users can also become data collectors. They can carry the Street View Trekker (a backpack outfitted with Google’s cameras) to snap images – later to be uploaded on Street View – as they hike through US National Parks and the Galapagos islands, or even take camel rides to map Abu Dhabi’s sand dunes.  Click here to read more.

Mapping the World’s Problems

Kate Galbraith, 8 Jan 2015

SAN FRANCISCO — Nearly a decade ago, an environmental group in Brazil grew concerned that government data and maps about Amazon deforestation were out of date and hard to view. The group, Imazon, decided to create its own monitoring tools, using information from satellites.

Imazon’s efforts caught the attention of Google, the search engine giant. Now, monthly reports on the Brazilian Amazon are produced through Google Earth Engine, a technology platform within the company. The partnership has made data processing faster and the information more accessible, according to Carlos Souza Jr., a senior researcher at Imazon.

“What Google could do is to make things easier,” Dr. Souza said.

The invasion of America

Claudio Saunt, Jan 2015

USA. Mount Rushmore, South Dakota. August 1, 2012. US Presidents plus nine Sioux leaders (l to r): Sitting Bull, One Bull, Rain-in-the-Face, Crow King, Gall, Red Horse, Fool Bull, Low Dog, Spotted Eagle and Red Cloud. Photo by Larry Towell/Magnum Photos

Between 1776 and the present, the United States seized some 1.5 billion acres from North America’s native peoples, an area 25 times the size of the United Kingdom. Many Americans are only vaguely familiar with the story of how this happened. They perhaps recognise Wounded Knee and the Trail of Tears, but few can recall the details and even fewer think that those events are central to US history.

Their tenuous grasp of the subject is regrettable if unsurprising, given that the conquest of the continent is both essential to understanding the rise of the United States and deplorable. Acre by acre, the dispossession of native peoples made the United States a transcontinental power. To visualise this story, I created ‘The Invasion of America’, an interactive time-lapse map of the nearly 500 cessions that the United States carved out of native lands on its westward march to the shores of the Pacific.  Click to continue reading. 


Elsevier to Add More than 100,000 Maps to Geofacets Platform through Agreement with Wiley, 6 Jan 2015

New York, Jan. 5, 2015—Elsevier, a world-leading provider of scientific, technical and medical information products and services, today announced an agreement with Wiley to integrate more than 100,000 maps from Wiley’s leading journals into the Geofacets platform. The content integration will increase the amount of trusted data available to geoscientists, helping them to make better-informed decisions in exploration, as well as improve recommendations that influence company investments in new exploration opportunities.

Content to be added includes fundamental geologic maps, petroleum geoscience maps, geophysical maps and specialized maps, such as those focused on sedimentology and geochemistry. The content is from 26 Wiley journals, published on behalf of several society partners (including the American Geophysical Union). This will increase Geofacets’ content portfolio to more than 500,000 maps extracted from almost 200,000 associated articles by the end of the year.

Geofacets is Elsevier’s premier research tool for geoscientists working in natural resource exploration. Geofacets extracts geologic maps from trusted, peer-reviewed publications, making content easier to find and integrate with other data. Users will have seamless access to multi-disciplinary content extracted from Wiley journals, including high-impact titles such as Basin Research, the Journal of Petroleum Geology and Sedimentology, as well as journals published on behalf of the American Geophysical Union, including Geophysical Research Letters, Reviews of Geophysics, Journal of Geophysical Research and Paleoceanography.


W3C and OGC to Collaborate to Integrate Spatial Data on the Web

6 Jan 2015

Jan. 6, 2015—The W3C and the Open Geospatial Consortium (OGC) announced today a new collaboration to improve interoperability and integration of spatial data on the Web. Spatial data —describing geographic locations on the earth and natural and constructed features— enriches location-based consumer services, online maps, journalism, scientific research, government administration, the Internet of Things, and many other applications. In the United States alone, geospatial data and services are estimated to generate $1.6 trillion annually.

“Location, as well as providing context to much of today’s online information, is vital to the emerging field of connected devices,” said Ed Parsons, Geospatial Technologist at Google. “Through this collaboration we hope to make the understanding of geospatial knowledge a fundamental component of the Web.”

Spatial data is integral to many human endeavors, so there is high value in making it easier to integrate that data into Web-based datasets and services. For example, one can use a GIS to find “the nearest restaurant,” but today it is difficult to associate that restaurant with reviewer comments available on the Web in a scalable way. Likewise, concepts used widely on the Web, such as “the United Kingdom,” do not match the geographic concepts defined in a GIS, meaning Web developers are missing out on valuable information available in GIS systems. Bridging GIS systems and the Web will create a network effect that enriches both worlds.
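As a toy illustration of the “nearest restaurant” query mentioned above, the sketch below does a great-circle (haversine) nearest-neighbour search in Python. The place names and coordinates are invented for the example, and a real GIS would use a spatial index rather than a linear scan:

```python
import math

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))  # 6371 km: mean Earth radius

def nearest(query, places):
    """places maps name -> (lat, lon); return the name closest to query."""
    return min(places, key=lambda name: haversine_km(query, places[name]))
```

Linking that nearest place to reviews on the Web is exactly the step that has no standard bridge today, which is the gap the W3C/OGC collaboration aims to close.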

“Location plays a vital role for BBC Online, not least in our remit to provide timely information for Weather, Travel and Local News,” said Chris Henden, Service Owner for Location Services at BBC Future Media. “It matters across the service, from maps showing places of note in World War One, to detailed context for breaking news. We source data from various third parties, then transform, curate, and make it available to our front-end services. There is a perceptible gap between the specialised world of geographic data, and that of the Web. Bridging that gap can take significant, repeated effort, and is not always successful or possible. Therefore this collaboration between the OGC and W3C is more than welcome.”

More than 100 participants discussed these challenges at the March 2014 Workshop on Linking Geospatial Data, co-organized by OGC and W3C in partnership with the UK Government Linked Data Working Group, Google, and Ordnance Survey (the UK mapping agency). Stories ranging from the management of data in response to the Fukushima nuclear plant accident to the creation of new services from spatial and historical data in the Netherlands illustrated a diverse set of integration benefits and challenges.

Informed by the conclusions from that Workshop, the collaboration announced today will enable publishers of spatial data, providers of services that consume the data, and application developers to establish common practices and reduce the cost of integrating spatial data on the Web. Through the collaboration the Geospatial and Web communities will document use cases and requirements, develop best practices for publishing spatial data on the Web, and advance some existing technologies to W3C Recommendations and OGC standards.


ASPRS Tool: Keep Track of Professional Development Hours


To help you keep track of your professional development hours (PDHs), please note the availability of the new PDH/CEU Tracking Spreadsheet.

This spreadsheet can be downloaded from the recertification page and is located in the “Recertification Guidelines” section.


Relationship of and Transformations Between the Most Common Reference Frames Used in the U.S. (Free Webinar)

Free ASPRS GeoByte Webinar. Date: 1/30/2015, 1 p.m. Eastern

To register, click here.

Abstract: This presentation will cover the relationship of and transformations between the several geometric and vertical geodetic reference frames most commonly used in the United States. These include: the North American Datum of 1927 (NAD 27), the North American Datum of 1983 (NAD 83), the International Terrestrial Reference Frame (ITRF), the World Geodetic System of 1984 (WGS 84), the National Geodetic Vertical Datum of 1929 (NGVD 29) and the North American Vertical Datum of 1988 (NAVD 88). The program will specifically examine tools such as the gridded utilities NADCON and VERTCON developed by the National Geodetic Survey (NGS), as well as the commonly used 3-parameter abridged Molodensky transformations originally computed by the U.S. Defense Mapping Agency (DMA) and the more rigorous 14-parameter transformations supported by the NGS utility HTDP (Horizontal Time Dependent Positioning).
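At its core, a 3-parameter transformation like those the abstract mentions is a translation in Earth-centered coordinates. The Python sketch below applies such a geocentric shift by converting geodetic coordinates to Earth-centered, Earth-fixed (ECEF) coordinates, translating, and converting back. For simplicity it keeps the WGS 84 ellipsoid on both sides and the shift values would be placeholders; the actual abridged Molodensky, NADCON and HTDP transformations also account for the change of ellipsoid, regional distortions and plate motion:

```python
import math

# WGS 84 defining constants
A = 6378137.0            # semi-major axis (m)
F = 1 / 298.257223563    # flattening
E2 = F * (2 - F)         # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Geodetic (degrees, metres) -> ECEF XYZ in metres."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)  # prime-vertical radius
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + h) * math.sin(lat)
    return x, y, z

def ecef_to_geodetic(x, y, z):
    """ECEF XYZ -> geodetic, by fixed-point iteration on latitude."""
    lon = math.atan2(y, x)
    p = math.hypot(x, y)
    lat = math.atan2(z, p * (1 - E2))  # initial guess
    for _ in range(10):
        n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)
        h = p / math.cos(lat) - n
        lat = math.atan2(z, p * (1 - E2 * n / (n + h)))
    return math.degrees(lat), math.degrees(lon), h

def three_param_shift(lat, lon, h, dx, dy, dz):
    """Apply a 3-parameter (geocentric translation) datum shift."""
    x, y, z = geodetic_to_ecef(lat, lon, h)
    return ecef_to_geodetic(x + dx, y + dy, z + dz)
```

The 14-parameter transformations discussed in the webinar extend this idea with rotations, a scale factor and their time derivatives.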

Uber signs deal with Boston to share ride data for public planning

Mic Wright, 7 Jan 2015

Uber has signed a deal with the city of Boston to share ride data, which it says will provide planners with new information on growth, traffic congestion, public transport provision and greenhouse gas emissions. It’s likely to be the first in a series of collaborations with municipal authorities.

The company will share anonymized trip-level data with the city by ZIP Code Tabulation Area, which is how the US Census geographically represents ZIP codes. The information will include the date and time the trip began, where the trip began and ended, the distance travelled and the duration.
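In outline, reducing a trip record to the ZCTA-level granularity the article describes could look like the sketch below. The field names and the zcta_lookup function are hypothetical stand-ins for illustration, not Uber's actual schema; a real zcta_lookup would be a point-in-polygon query against Census ZCTA boundaries:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Trip:
    """Raw internal record (hypothetical fields, for illustration only)."""
    rider_id: str
    start: datetime
    pickup: tuple     # precise (lat, lon)
    dropoff: tuple
    distance_km: float
    duration_min: float

def anonymize(trip, zcta_lookup):
    """Reduce a raw trip to a ZCTA-level record: drop the rider
    identifier and replace exact coordinates with area codes."""
    return {
        "started": trip.start.isoformat(timespec="minutes"),
        "pickup_zcta": zcta_lookup(trip.pickup),
        "dropoff_zcta": zcta_lookup(trip.dropoff),
        "distance_km": round(trip.distance_km, 1),
        "duration_min": round(trip.duration_min),
    }
```

The design choice is the one the Future of Privacy Forum quote below highlights: area-level records are coarse enough to protect individuals but still useful for traffic and transit planning.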

Uber says it will also offer the city technical support to interpret the data and support Vision Zero, the multi-national road safety scheme, along with other transport initiatives.

Boston’s Mayor, Martin J. Walsh, says: “In Boston, data is driving our conversations, our policy-making and how we envision the future of our city. We are using data to change the way we deliver services and we welcome the opportunity to add to our resources.”

The data provided by Uber will be used to feed into decisions in areas including traffic planning, congestion reduction, flow of people across the city, disaster and event planning, zoning changes and parking provision.

Uber makes a point of including a quote from Jules Polonetsky, the Director and Co-Chair of the Future of Privacy Forum, in its release saying:

“Local regulators often demand and receive granular trip records from taxi and car service providers. It’s great to see Uber taking a more privacy friendly approach by providing ZIP code level records, which can be just as useful for city planning purposes but which help maintain individual privacy.”

It’s worth noting that Uber is currently clashing with the city authorities in New York over its refusal to share trip data – specifically where passengers are picked up and dropped off – with the Taxi and Limousine Commission.

Even more pertinently, Uber is being sued in California by the state’s Public Utilities Commission over its refusal to share ride data. The CPUC wants access to that data to monitor the effect of new services on traffic flow, one of the issues this new deal with Boston will cover.  Click here for more.

This autonomous robot draws sand art and looks like a turtle

Josh Lowensohn, 10 Jan 2015

(credit: Beachbot)

We’ve built robots to wash our dishes, vacuum our floors, and even to replace members of our family, so it should come as no surprise there’s now a robot that draws a perfect recreation of Disney’s Nemo — or anything else — on sand. This is the Beachbot, the result of a collaboration between Disney Research and Swiss engineering school ETH Zurich. It’s been specifically designed to drag a tiny rake through the sand while treating beaches like a canvas, drawing computer-aided designs of Disney characters, or whatever else is loaded into its telemetry. You can also take it over by remote control and draw with reckless abandon.

The robot keeps itself from veering into a watery death by tracking its distance by laser to four poles that are stuck into the sand. And in order to keep from leaving footprints like a clumsy human, the Beachbot uses what Disney refers to as balloon wheels, which look a lot like giant polyurethane skateboard wheels. Tucked just behind those is the rake system, which can be adjusted to create lines as thin as 5 cm.
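Ranging to four fixed poles is a classic trilateration problem. A minimal 2-D least-squares solver, assuming noiseless ranges and known pole positions (a sketch of the general technique, not Disney's actual algorithm), might look like:

```python
def trilaterate(anchors, dists):
    """Estimate a 2-D position from distances to known anchor points
    (e.g. the four poles). Linearize the range equations against the
    first anchor, then solve the 2x2 least-squares normal equations."""
    (x0, y0), r0 = anchors[0], dists[0]
    rows, rhs = [], []
    for (xi, yi), ri in zip(anchors[1:], dists[1:]):
        # r0^2 - ri^2 expands to a linear equation in (x, y)
        rows.append((2 * (xi - x0), 2 * (yi - y0)))
        rhs.append(r0**2 - ri**2 + xi**2 - x0**2 + yi**2 - y0**2)
    # normal equations: (A^T A) p = A^T b
    a11 = sum(r[0] * r[0] for r in rows)
    a12 = sum(r[0] * r[1] for r in rows)
    a22 = sum(r[1] * r[1] for r in rows)
    b1 = sum(r[0] * v for r, v in zip(rows, rhs))
    b2 = sum(r[1] * v for r, v in zip(rows, rhs))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)
```

With four poles there are three linearized equations for two unknowns, so the extra measurement averages out ranging noise rather than being redundant.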

Before you get too excited about purchasing a Beachbot to impress dates or entertain your children, note that there’s no mention of a commercial release anytime soon.