Signals from the Crowd

“Uncovering Social Relationships through Smartphone Probes”

By Marco V. Barbera, Alessandro Epasto, Alessandro Mei, Vasile C. Perta, and Julinda Stefa, Department of Computer Science, Sapienza University of Rome, Italy.

The paper uses low-level WiFi probe-request frames to sense devices and infer characteristics of the people carrying them. It also uses the PNL (preferred network list) to match people based on SSIDs their devices have previously accessed.

http://conferences.sigcomm.org/imc/2013/papers/imc148-barberaSP106.pdf
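To make the sensing step concrete, here is a minimal sketch of probe-request capture in Python with scapy. It is an illustration, not the authors’ tooling: wlan0 is a placeholder for a wireless interface already in monitor mode, and only directed probes (those that name an SSID from the device’s PNL) are printed.

    from scapy.all import sniff, Dot11, Dot11ProbeReq, Dot11Elt

    def handle(pkt):
        if pkt.haslayer(Dot11ProbeReq):
            mac = pkt[Dot11].addr2           # transmitter MAC: identifies the device
            elt = pkt.getlayer(Dot11Elt)     # first tagged parameter is the SSID
            if elt is not None and elt.ID == 0 and elt.info:
                ssid = elt.info.decode(errors="replace")
                print(mac, ssid)             # one observed (device, PNL entry) pair

    # assumes wlan0 is already in monitor mode; capture requires root privileges
    sniff(iface="wlan0", prn=handle, store=False)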

Abstract

The ever-increasing ubiquity of WiFi access points, coupled with the diffusion of smartphones, suggests that Internet every time and everywhere will soon become a reality, if it has not already. Even in the presence of 3G connectivity, our devices are built to switch automatically to WiFi networks so as to improve user experience. Most of the time, this is achieved by recurrently broadcasting automatic connectivity requests (known as Probe Requests) to known access points (APs), e.g., “Home WiFi”, “Campus WiFi”, and so on. In a large gathering of people, the number of these probes can be very high. This scenario raises a natural question: “Can significant information on the social structure of a large crowd and on its socioeconomic status be inferred by looking at smartphone probes?”

In this work we give a positive answer to this question. We organized a 3-month-long campaign through which we collected around 11M probes sent by more than 160K different devices. During the campaign we targeted national and international events that attracted large crowds, as well as other gatherings of people. We then present a simple and automatic methodology to build the underlying social graph of the smartphone users, starting from their probes. We do so for each of our target events, and find that they all feature social-network properties. In addition, we show that, by looking at the probes in an event, we can learn important sociological aspects of its participants: language, vendor adoption, and so on.
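As a toy illustration of the PNL-matching idea (hypothetical data, and a simplification of the paper’s methodology), two devices can be linked whenever the SSID sets revealed by their probes overlap:

    from itertools import combinations

    # hypothetical data: device MAC -> set of SSIDs observed in its probes (its PNL)
    pnl = {
        "aa:bb:cc:00:00:01": {"Home WiFi", "Campus WiFi"},
        "aa:bb:cc:00:00:02": {"Campus WiFi", "CoffeeShop"},
        "aa:bb:cc:00:00:03": {"Hotel Guest"},
    }

    # link two devices if their PNLs share an SSID; a real pipeline would first
    # filter out ubiquitous SSIDs that carry little social signal
    edges = [(a, b) for a, b in combinations(pnl, 2) if pnl[a] & pnl[b]]
    print(edges)   # [('aa:bb:cc:00:00:01', 'aa:bb:cc:00:00:02')]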

Leap Motion musical UI for closed eyes

How many musicians do you know who play with their eyes closed? Not many computer music apps allow this. Bloom is an exception: http://www.generativemusic.com/bloom.html

As an exercise, I tried to make something like Bloom using Leap Motion. With your eyes closed, you can accurately position your hand at the level of your eyes, shoulders, hips, etc., and you can quickly move to a point directly outward from your nose or shoulders. This is the basis of sobriety tests.

The interface works with a hand motion like sprinkling seeds. Every time you open your hand, it triggers a note based on the height of your hand, and it triggers one of the “Bloom” circles at the X/Y position.
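The height-to-pitch mapping can be sketched as a simple linear scaling; the note range here is an assumption, not what the patch actually uses:

    def y_to_midi(y, lo=48, hi=84):
        # Map normalized hand height y in [0, 1] to a MIDI note: higher hand, higher note.
        y = max(0.0, min(1.0, y))
        return int(round(lo + y * (hi - lo)))

    print(y_to_midi(0.0), y_to_midi(0.5), y_to_midi(1.0))   # 48 66 84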

The prototype was done in Max/MSP/Jitter. It was derived from a “Bloom clone” project by John Mayrose: http://www.johnmayrose.com/maxmsp.html


download

https://github.com/tkzic/max-projects

folder: bloom

patches

  • (main patch) circlething.maxpat
  • (poly~ sub-patch) FMPoly2~.maxpat
  • (Leap Motion main-patch) leap-finger-switch.maxpat
  • (Leap Motion sub-patch) leap-switch-test2.maxpat
externals and dependencies

Note: If you don’t have a Leap Motion sensor, you can use a mouse.

If you are using Leap Motion, download the aka.leapmotion external and add it to your Max file path in Options | File Preferences: http://akamatsu.org/aka/max/objects/

instructions

(If you are not using a Leap Motion sensor, skip to step 4.)

  1. Plug in the Leap Motion sensor.
  2. Open leap-finger-switch.maxpat and click the “start” toggle.
  3. Wave your hand around; it should be detected and displayed.
  4. Open circlething.maxpat.
  5. If using a mouse, just click in the black “circlething” window to play.
  6. If using Leap Motion, click the message box to activate Leap Motion.
  7. Then open and close your hand over the sensor to play (see the sketch after this list).
  8. High notes are higher in the window.
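The open-hand gesture in step 7 is essentially an edge detector: the note should fire on the closed-to-open transition, not on every frame the hand stays open. A sketch of that logic in Python (the per-frame finger counts would come from the Leap Motion data; the threshold is an assumption):

    class OpenHandTrigger:
        def __init__(self, threshold=4):
            self.threshold = threshold     # fingers needed to count as "open"
            self.was_open = False

        def update(self, finger_count):
            is_open = finger_count >= self.threshold
            fired = is_open and not self.was_open   # fire on the rising edge only
            self.was_open = is_open
            return fired

    trig = OpenHandTrigger()
    for count in [0, 1, 5, 5, 2, 5]:       # simulated per-frame finger counts
        if trig.update(count):
            print("trigger note")          # fires twice, at the two closed->open transitions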