Sensors are the future of distributed data. General-purpose computing is diffusing out of the desktop and becoming increasingly embedded in our lives (four words: the Internet of Things). We will soon move through a sea of data, our movements monitored and our environments measured and adjusted to our preferences, without the need for direct intervention.

What will this look like? How can we create and shape it? How can we introduce the relevant hardware to people who already possess data analytics skills?

One answer is: the O'Reilly Data Sensing Lab.

The Lab started at the O'Reilly Strata Conference in New York in October 2012, where we gave attendees a taste of the super-connected world that's ahead of all of us by instrumenting the conference environment with basic sensors and wireless mesh networking. 

The Data Sensing Lab is run by a dedicated group of O'Reilly Media and Make employees, along with a few key collaborators from outside those organizations. The active members are Alasdair Allan, Kipp Bradford, Rob Faludi, Kim Rees, Veton Saliu, and Julie Steele.


Strata New York 2012

We built more than 40 sensor motes using Arduino Leonardo boards, XBee radios, and a handful of off-the-shelf parts, including a PIR motion detector, a combined temperature and humidity sensor, and an electret microphone amplifier. The motes were distributed around the conference venue and reported readings back throughout the conference. The data was made publicly available online.
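The post doesn't describe the wire format, but assuming the XBee radios ran in API mode, each mote's readings would travel inside a standard XBee API frame: a 0x7E start delimiter, a two-byte big-endian length, the frame data, and a checksum (0xFF minus the low byte of the frame-data sum). A minimal sketch of the framing step (the function name is illustrative):

```cpp
#include <cstdint>
#include <vector>

// Wrap frame data (frame type + addressing + sensor bytes) in an XBee API
// frame: 0x7E start delimiter, 16-bit big-endian length, payload, checksum.
// The checksum is 0xFF minus the low byte of the sum of the payload bytes.
std::vector<uint8_t> buildXBeeFrame(const std::vector<uint8_t>& payload) {
    std::vector<uint8_t> frame;
    frame.push_back(0x7E);                          // start delimiter
    frame.push_back((payload.size() >> 8) & 0xFF);  // length MSB
    frame.push_back(payload.size() & 0xFF);         // length LSB
    uint32_t sum = 0;
    for (uint8_t b : payload) {
        frame.push_back(b);
        sum += b;
    }
    frame.push_back(0xFF - (sum & 0xFF));           // checksum
    return frame;
}
```

For example, a three-byte payload `{0x01, 0x02, 0x03}` frames as `7E 00 03 01 02 03 F9`.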


Strata Santa Clara 2013

At the O'Reilly Strata Conference in Santa Clara in February 2013, we added more sensors (rebuilding some that had been damaged in transit along the way, for a total of around 50 sensor motes), real-time visualization, and a new interactive feature for attendees: the “Awesome Button,” a giant red button outside each session room that attendees were encouraged to push as they exited if they thought the talk they had just seen was awesome.
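A physical button like this bounces electrically, so the firmware has to debounce it before counting presses. The Lab's own code isn't shown in the post, but a lockout-style debounce (ignore any state change that arrives within a threshold of the last accepted one) might look like this; the function name and sample format are illustrative:

```cpp
#include <cstdint>
#include <utility>
#include <vector>

// Lockout-style debounce: walk (timestamp_ms, pressed) samples and accept a
// state change only if at least debounceMs has elapsed since the last
// accepted change. Returns the number of debounced presses (rising edges).
int countDebouncedPresses(const std::vector<std::pair<uint32_t, bool>>& samples,
                          uint32_t debounceMs) {
    int presses = 0;
    bool lastStable = false;   // assume the button starts released
    uint32_t lastEdge = 0;
    bool seenEdge = false;
    for (const auto& s : samples) {
        bool changed = (s.second != lastStable);
        if (changed && (!seenEdge || s.first - lastEdge >= debounceMs)) {
            lastStable = s.second;
            lastEdge = s.first;
            seenEdge = true;
            if (lastStable) ++presses;  // count only press (rising) edges
        }
    }
    return presses;
}
```

With a 50 ms lockout, two real presses separated by a 5 ms contact bounce count as two presses, not three.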


Google I/O 2013

We deployed over four hundred sensor motes, with over four thousand data streams running over Device Cloud by Etherios, to continuously monitor temperature, humidity, pressure, light, air quality, motion, and both RF and audio noise levels in San Francisco's Moscone Center during Google I/O. We teamed up with the Google Cloud Platform team to gather, transform, and analyze the data, and showed off heat maps and other data visualizations in collaboration with the Google Maps team. Want the data from Google I/O? Get it here.
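The post doesn't say how the heat maps were built, but the core transform is simple: bin each mote's readings into a fixed grid by location and average per cell, then color the cells. A sketch under the assumption of coordinates normalized to [0, 1); all names here are illustrative:

```cpp
#include <array>
#include <cstddef>
#include <vector>

// One sensor reading at a normalized (x, y) position in the venue.
struct Reading { double x, y, value; };

// Average scattered readings into a W x H grid suitable for rendering as a
// heat map. Cells with no readings stay at 0; out-of-range points are dropped.
template <std::size_t W, std::size_t H>
std::array<std::array<double, W>, H> binToGrid(const std::vector<Reading>& readings) {
    std::array<std::array<double, W>, H> sum{};
    std::array<std::array<int, W>, H> count{};
    for (const auto& r : readings) {
        auto col = static_cast<std::size_t>(r.x * W);
        auto row = static_cast<std::size_t>(r.y * H);
        if (col >= W || row >= H) continue;
        sum[row][col] += r.value;
        ++count[row][col];
    }
    for (std::size_t row = 0; row < H; ++row)
        for (std::size_t col = 0; col < W; ++col)
            if (count[row][col]) sum[row][col] /= count[row][col];
    return sum;
}
```

Two temperature readings near one corner and one near the opposite corner, binned into a 2x2 grid, yield one cell holding their average and one holding the lone reading.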

ONA 2013

The O'Reilly Data Sensing Lab explored sensor journalism with the fine folks at the Online News Association at their annual Conference & Awards Banquet. We scaled back again (from the hundreds of sensor motes we deployed at Google I/O), but we also evolved: we built new battery-powered boards, courtesy of kippkitts. This event also marked an exciting collaboration with John Keefe and his colleagues at WNYC, who designed some custom data visualizations.


Mozilla Festival 2013

The O'Reilly Data Sensing Lab will be at the Mozilla Festival in London. We'll have 20 sensor motes scattered around the venue, measuring temperature, humidity, pressure, light, and both ambient and radio noise. The data will flow live throughout the weekend into Digi's Device Cloud and will be available to hack on both during and after the conference.


Strata Santa Clara 2014

This was our third Strata conference, and we brought all the hardware you know and love, along with the new hardware we developed for the ONA conference. We've deployed motes across the third floor, and our new hardware is lurking under seats in the keynote room. We've also come up with an interesting new use for the Awesome Buttons … a pop quiz!