Pictured above: Charlie enjoying a day in her pool with pool masters (and beta testers) Randy and Tabitha Savage
We realized early on that it would be important to gather as much data as possible about how people test and maintain their pool or spa water. Since we’re in Silicon Valley and data geeks at heart, we were inspired by an app that most of you probably use daily — Waze.
Waze is on almost everyone’s phone today, but the most inspiring aspect of Waze is how they started. They wanted to build an application where drivers could exchange information about traffic and other road conditions. The hurdle they encountered was that the GPS data they needed required expensive subscriptions.
To circumvent this, they asked their first users to run an app while driving. To complement the raw GPS data used to generate the maps, they asked other volunteers to edit the resulting maps and add street names. With numerous drivers generating the same maps and other volunteers editing them for accuracy, a free GPS and road-conditions program evolved.
As more and more volunteers contributed, the maps became more accurate. Of course, new roads spring up, detours appear during construction projects, and traffic patterns change with each hour of the day. Using Waze’s crowdsourced method, a treasure trove of data is continually processed, benefiting local drivers worldwide. Although Waze was later acquired by Google, that crowdsourced data continues to be processed today.
The Wisdom of the Crowd
In the spirit of Waze, Sutro has been crowdsourcing data over the past month to eventually incorporate into our new Smart Monitor system. Specifically, we have been focusing on the app, since we don’t yet have testable devices (they are coming soon). The more data we can collect and process, the better we will be able to help people all over the world keep their pools and spas safe for their friends and family.
From the user’s point of view, app testing consists of taking manual pool readings, emailing them to us, and then checking the app to see the recommendations. Soon, we’ll take this a step further and ask you to confirm or deny the suitability and accuracy of those recommendations.
Along with testing, we are also collecting the chemicals people use to treat their pools, so that we can add them to our database and recommend any of them from our expanding library.
Crowdsourced Results
Participation in the Phase 1 beta program has exceeded expectations, and the data collected has already generated over a dozen improvements to our system. Participants come from all over the US and Canada and have collectively completed over 200 pool readings and uploaded close to 300 pool chemicals over the last four weeks. The following graphs show the distribution of readings for each of the parameters we measure:
First, let’s look at the pH graph above. As you can see, the majority of readings fall between 7.4 and 7.6, which is perfect, but many come in significantly above or below that range. Also, since our lower limit is 6.8 and our upper limit is 8.4, the graph lines are a bit misleading: many of the readings actually fall outside those limits.
Second, the Free Chlorine graph. Here you can see the majority of readings cluster around the 3.5 ppm level and at zero. As with the pH graph, we have an upper limit here too, set at 10 ppm, yet we’ve received some readings as high as 20 ppm. Last, but not least, is the Total Alkalinity graph. This one seems to be the easiest to keep in the desired range of 100-120. The readings at the lowest levels come from beta testers who had just opened their pools and had not yet begun any chemical treatment.
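To make the ranges above concrete, here is a minimal sketch of how a reading could be bucketed against the targets and app limits mentioned in this post. The ranges (pH target 7.4-7.6 with limits 6.8-8.4, a 10 ppm free-chlorine ceiling, and a 100-120 total alkalinity target) come straight from the graphs discussed here; the function and dictionary names are purely illustrative and are not Sutro's actual code or API.

```python
# Illustrative only: bucket a manual pool reading against the target
# ranges and app limits described in this post. Names are hypothetical.

TARGETS = {
    "ph": (7.4, 7.6),
    "total_alkalinity": (100, 120),  # ppm
}

LIMITS = {
    "ph": (6.8, 8.4),
    "free_chlorine": (0, 10),  # ppm; some beta readings came in at 20
}

def classify(parameter, value):
    """Return 'in target', 'low', 'high', or 'out of app limits'."""
    lo_lim, hi_lim = LIMITS.get(parameter, (float("-inf"), float("inf")))
    if value < lo_lim or value > hi_lim:
        return "out of app limits"
    target = TARGETS.get(parameter)
    if target is None:
        return "within limits"
    lo, hi = target
    if value < lo:
        return "low"
    if value > hi:
        return "high"
    return "in target"

print(classify("ph", 7.5))            # in target
print(classify("ph", 8.6))            # out of app limits
print(classify("free_chlorine", 20))  # out of app limits
```

A sketch like this also shows why out-of-limit readings distort the graphs: values beyond the app's limits get treated separately rather than plotted on the same scale as in-range readings.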