Ideas Matter




The Social Life Of The Internet Of Things: Machine Learning

O'Reilly Foo Camp 2015 panel submission. It's an evolving concept I call The Social Life Of The Internet Of Things.

Just as with your human friends on Twitter, interactions between machines in a social networking context will make things happen that never would otherwise.

The Social Life Of The Internet Of Things, a FLUX (Forward Looking User Experience) session: What if the IoT had a social networking layer between machines, replete with followers, trends and memes? This is a multi-player game of ideas. Bring your seeds on how human social network signaling and machine learning can amplify and disperse new experiences in a user-programmable browser/interpreter UI. What are the best use cases that would arise from a Social Networking-inspired IoT? The worst?

Questions answered in this session:

-How can the things that make up the Internet Of Things become friends?

-How can Machine Learning, along with Social Networking signals from human users, create, amplify and disperse new experiences in the IoT and beyond?

-What client experiences are enabled by a curated, dynamic, and extensible machine-to-machine social graph?

#IsTheClubBumpin? Social IoT Use Case: Interview with Olin Hyde, CEO of Englue, an AI company employing Machine Learning technology

One of the IoT social flow examples I came up with involves something like a camera that counts people, maybe for a nightclub, telling the staff how much space they've got at any given time for code-compliance and safety reasons.

Alongside the People Counter camera sits a "Shazammer", a little box that listens to songs playing in the space and creates a timeline of artist, genre, beat structure, etc. It also creates a stream of income for the nightclub: nearby patrons hear a song, hit it up for the artist link, and are directed to an affiliate purchase.

So both of these machines have day jobs, their own reasons to be plugged in and active. The social-networking-like friending of the two could result in a "meme", an application that uses the timeline output of each device to produce new and useful data. In this case, something like an #IsTheClubBumpin meme would correlate the music being played with how many folks are on the dance floor. The result is piped into the Meme's timeline, available as a service to other apps, Google Maps for example. You might browse clubs on a map and notice small icons dancing about the satellite overhead view, indicating a good time is being had by all within.
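To make the idea concrete, here is a minimal sketch in Python of how such a meme might join the two device timelines. Everything here is hypothetical: the sample readings, the room capacity, and the "bumpin" thresholds are made-up illustrations, not a real device API.

```python
from datetime import datetime

# Hypothetical sample timelines -- in practice these would stream from the
# People Counter camera and the "Shazammer" box over the social layer.
people_counts = {            # timestamp -> people on the floor
    datetime(2015, 2, 7, 23, 0): 40,
    datetime(2015, 2, 7, 23, 15): 75,
    datetime(2015, 2, 7, 23, 30): 110,
}
now_playing = {              # timestamp -> (artist, beats per minute)
    datetime(2015, 2, 7, 23, 0): ("Artist A", 95),
    datetime(2015, 2, 7, 23, 15): ("Artist B", 124),
    datetime(2015, 2, 7, 23, 30): ("Artist C", 128),
}

CAPACITY = 150               # posted code-compliance limit for the room

def is_club_bumpin(counts, songs, min_fill=0.6, min_bpm=110):
    """Emit a meme-timeline entry wherever both devices have a reading,
    flagging moments when the room is full enough and the music fast
    enough to count as 'bumpin'."""
    timeline = []
    for ts, count in sorted(counts.items()):
        if ts not in songs:
            continue                      # no matching song reading
        artist, bpm = songs[ts]
        bumpin = count / CAPACITY >= min_fill and bpm >= min_bpm
        timeline.append({"time": ts, "artist": artist,
                         "fill": round(count / CAPACITY, 2),
                         "bumpin": bumpin})
    return timeline

for entry in is_club_bumpin(people_counts, now_playing):
    print(entry["time"].strftime("%H:%M"), entry["artist"],
          "BUMPIN" if entry["bumpin"] else "quiet")
```

The point of the sketch is that neither device knows about the other; the meme is just a third timeline derived from friending the two, which downstream apps (the Google Maps dancing icons, say) can subscribe to like any other feed.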

Coming soon: more on how Machine Learning might make authoring Social IoT Memes into a cross between the humble #Hashtag and IFTTT.



#RoboStache SXSW 2015 Part 1: "Helper Disks"


Getting ready for SXSW 2015 with #RoboStache, a 3D-printed fake mustache with hi-brite LEDs, sensors, and an Arduino microcontroller. The device will facilitate attendee brainstorming walkabouts around Austin. Sign up through the mailing list at

The 'Stache is a social experiment I hope will replicate some of the epic sessions we have had at Qualcomm.

I was a co-founder of something called FLUX (Forward Looking User Experience), an employee-driven collaborative IP-generation methodology. Through our moderated approach, we were able to generate more than 60 patents in 60 months.

Here is an example from one of my personal contributions to the Qualcomm patent portfolio in the UX space, for what happens to Tablet/Smartphone content hidden beneath the user's fingers:


Helopter Shapeways Unbox

Helopter: single-seat ducted-air craft mini model from Shapeways

#HOODSAFEROCK: Fenech-Soler Interview

Caught up for a brief chat with UK electronic music band Fenech-Soler. I found their collaborative musical experimentation process interesting. Individual band members sketch up music ideas in tools like Logic and Reason, and their output is fed to a band member who uses Pro Tools to integrate the ideas and polish them into a finished song. This process reminds me a lot of software integration, where code is taken from several developers and merged together to form a final build.



#NOTGLASS v2-Beta Modular Headset Frame

Not Stylish, Not Subtle, #NOTGLASS


Director Casey Neistat, SXSW 2014

The #NOTGLASS project began as an exploration of the Digital Divide, focused on Google Glass, during the 2013 SXSWi (South By Southwest Interactive) conference in Austin, Texas. The original rig consisted of a cardboard box, an Android phone, a 3G Wifi hotspot, and powered speakers. The conversations and experiences gathered while interacting with conference attendees inspired the idea that users should be able to easily expand wearable headset platforms.

The 3D-printable kit is available for download


Where the incredible Google Glass project drives tightly integrated, minimalist hardware design for delivering contextually filtered information to the user, #NOTGLASS aims to ultimately provide a modular platform for integrating hardware from a variety of sources, much like the early IBM compatible PC motherboard. A platform for head mounted, wearable computing experimentation will enable users to discover new ways to bring the world around them and the internet together.

The current incarnation of #NOTGLASS is 3D printed, made up of modular interlocking parts that align to rail mounts. Hollow spaces run through each modular part to allow for cable routing. Applications are limited only by the imagination: anything from BCI (Brain Computer Interface) headsets, to 3D laser environment scanning, to 360-degree video capture.