In our previous post, we talked about our experience this round at JIFX, the experimental event series organized several times a year by the Naval Postgraduate School to bring industry, academia, and government together in a field testing environment out at Camp Roberts, CA. The structure of the JIFX series is unique in its ability to put different technologies (and the engineers who build them) in the field together under simulated field conditions and let them collaborate with one another. Demonstrating success (or failure!) in collaboration and integration is critical for stakeholders to see things in action, and it allows the engineers to test the limits of their software, hardware, or systems to see what works and what doesn’t. As an engineer and designer on Fulcrum, it’s incredible to see the tool out in the field in this collaborative space, and the experience helps us relentlessly improve Fulcrum’s capability as a powerful field data collection platform.
The simulated EOC at Camp Roberts
Each participant at JIFX submits a short whitepaper for an experiment to conduct during the week, describing what the technology, product, or capability does and the objective to shoot for during the event: a sort of proposal or “hypothesis” that you intend to test by setting up experiments, either solo or in conjunction with others. Over the course of several days, the participants get out in the field, set up their tech, and start trying things out. Some things work incredibly well and some fall short, but the whole point is for engineers to get out there, push things to the limit to see what fails, and come back with hard feedback on how to make improvements.
Aside from planned proposals, the JIFX model actively promotes “ad hoc” experimentation. As you explore what others are testing out, you can spot points of integration or collaboration and put your tech together to create a workflow on the fly. Last week we ran several ad hoc experiments with Fulcrum, each very different, but each valuable in proving out interesting things to do with the Fulcrum platform.
Crowdsourced Imagery Extraction & Groundtruthing
On the first day at JIFX, we met up with Luke Barrington from DigitalGlobe, who had brought along some recent satellite imagery for the area around Camp Roberts. Luke is one of the founders of Tomnod (now part of DigitalGlobe), a crowdsourcing platform for building tasks and games around identifying features in imagery. It presents contributors with a portion of imagery and asks them to tag the things they see: structures, wildfires, vehicles and trucks, deforestation, all sorts of features that may be visible in imagery. The crowd’s contributions are then piped through an algorithm to generate a consensus on occurrences of those features. It’s an incredible platform for rapid response, especially with current imagery for fast feature extraction, and it can be ridiculously fast: Luke even hooked our task up to Amazon’s Mechanical Turk to get things geotagged even faster.
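To make the consensus idea concrete, here’s a minimal sketch of how crowd tags could be boiled down to agreed-upon feature locations. This is not Tomnod’s actual algorithm (which isn’t described here), just a naive illustration: tags falling in the same small grid cell count as votes for one feature, and a feature is kept once enough independent contributors agree.

```python
# Naive crowd-consensus sketch (illustrative only, not Tomnod's algorithm).
# Tags within the same small grid cell are treated as votes for one feature;
# a feature is kept once enough independent contributors agree on it.
from collections import defaultdict

def consensus(tags, cell_size=0.0005, min_votes=3):
    """tags: iterable of (contributor_id, lat, lon) tuples."""
    votes = defaultdict(set)
    for contributor, lat, lon in tags:
        cell = (round(lat / cell_size), round(lon / cell_size))
        votes[cell].add(contributor)
    # Return an approximate location for each cell with enough agreement
    return [
        (cell[0] * cell_size, cell[1] * cell_size)
        for cell, voters in votes.items()
        if len(voters) >= min_votes
    ]
```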
Our experimental use case simulated groundtruthing of damaged structures from near real-time satellite imagery; think of an earthquake or wildfire creating widespread damage. We wanted to quickly get a current dataset of structures in the area, seed those locations into Fulcrum, then get some boots on the ground with the Fulcrum mobile app to field survey each location with a damage assessment app. Beginning with the imagery of Camp Roberts, Luke set up a Tomnod task to identify structures in the image. Once this went out to the Tomnod crowd, over 2,000 locations were tagged in less than an hour, giving us a good dataset of structure locations. Using Fulcrum’s data API, we piped these locations into a damage assessment app so volunteers with smartphones could visit each site and survey potential damage to the buildings. As a result, we went from fresh space-based imagery, to extracted data, to groundtruthed information from the field, all within a couple of hours. The capability we demonstrated could be incredibly powerful for first responders, emergency managers, or anyone else looking to gain insight into the current on-the-ground situation rapidly and with limited resources.
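For a sense of what that seeding step looks like, here’s a rough sketch of pushing tagged structure locations into a Fulcrum app as new records. The endpoint shape follows Fulcrum’s v2 records API, but the API token, form ID, and field values below are hypothetical placeholders, not what we actually used during the experiment.

```python
# Sketch: seed crowd-tagged structure locations into a Fulcrum app as records.
# The token, form ID, and form_values are hypothetical placeholders.
import requests

FULCRUM_RECORDS_URL = "https://api.fulcrumapp.com/api/v2/records.json"
API_TOKEN = "your-api-token"                   # hypothetical token
DAMAGE_FORM_ID = "damage-assessment-form-id"   # hypothetical form ID

# Structure locations extracted by the crowd (lat, lon pairs)
tagged_structures = [
    (35.7608, -120.7462),
    (35.7633, -120.7510),
]

for lat, lon in tagged_structures:
    record = {
        "record": {
            "form_id": DAMAGE_FORM_ID,
            "latitude": lat,
            "longitude": lon,
            # per-form field keys go here; left empty in this sketch
            "form_values": {},
        }
    }
    resp = requests.post(
        FULCRUM_RECORDS_URL,
        json=record,
        headers={"X-ApiToken": API_TOKEN},
    )
    resp.raise_for_status()
```

Once the records exist, each one shows up as a point in the mobile app for a field volunteer to visit and fill out the damage assessment form.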
Remote Deployment for Field Operations
Another experiment was getting a Fulcrum server up and running on a system dubbed the “EOC in a box” (emergency operations center), a portable server system in a rugged box that can be deployed into a field ops center. Buddy Barreto from the Naval Postgraduate School had put the EOC in a box together and was on hand to set up a virtual machine for us to host the local Fulcrum instance (alongside dozens of other virtual servers hosting things like ArcGIS and remote sensing software), to which we deployed the web application within the experimental ops center. This setup allowed response teams working out of the field operations tent to sync and combine field-collected data in the Fulcrum web platform without needing access to the public cloud, a critical component of flexibility when responding in disconnected conditions. A local umbrella wifi network allowed systems in the EOC to talk to one another and mobile devices to sync data. The entire EOC server system ran for three full days on a Honda generator with 5 gallons of gas, and it allowed the EOC to operate fully connected, partially connected, or completely disconnected from the Internet.
XLSForm Compatibility
Since OpenDataKit (ODK) is used in a lot of institutions for building forms and surveys, there was interest in adding interoperability between ODK and Fulcrum. Many folks have already built forms with the XLSForm spec, an Excel-based form design format read by ODK tools, so Zac put together a simple conversion tool where you can drop in an XLSForm Excel file and have it output a compatible form structure to import into your Fulcrum account. The code is open source on GitHub, and you can give it a try to convert some forms yourself.
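The gist of the conversion is straightforward: read the “survey” sheet of the XLSForm workbook and map each question row onto a Fulcrum form element. The toy sketch below shows that idea under some assumptions; it is not Zac’s actual tool, and the output field names and type mapping here are simplified stand-ins rather than the exact Fulcrum form schema.

```python
# Toy sketch of the XLSForm-to-Fulcrum idea (not the actual open source tool).
# Reads the "survey" sheet of an XLSForm workbook and emits a simplified,
# Fulcrum-like form structure. Output field names are illustrative.
import json
import openpyxl

TYPE_MAP = {
    "text": "TextField",
    "integer": "TextField",
    "date": "DateTimeField",
}

def convert(xlsform_path):
    sheet = openpyxl.load_workbook(xlsform_path)["survey"]
    rows = sheet.iter_rows(values_only=True)
    header = [str(c).strip() if c else "" for c in next(rows)]
    elements = []
    for row in rows:
        fields = dict(zip(header, row))
        if not fields.get("type"):
            continue
        elements.append({
            "type": TYPE_MAP.get(fields["type"], "TextField"),
            "data_name": fields.get("name"),
            "label": fields.get("label"),
        })
    return {"form": {"name": "Converted XLSForm", "elements": elements}}

if __name__ == "__main__":
    print(json.dumps(convert("survey.xlsx"), indent=2))
```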
These are just a few of the cool experiments that went on during the week, and that doesn’t count the numerous other conversations and ideas that happened along the way. The ad hoc field exploration environment provided by JIFX gives us an opportunity to link Fulcrum in with the interesting tools others are building, and to become a multiplier for those looking for better field tools and better data. Thanks again to the NPS crew that puts the JIFX series together; it’s always a worthwhile time to get out there to learn and experiment. If you’re interested in knowing more about these experiments, feel free to ping me on Twitter. I’d love to see where we can go from here.