Twitter streaming PHP decoder breaks out individual tweets

This code was adapted (i.e., stolen verbatim) from a Stack Overflow post by drew010:

http://stackoverflow.com/questions/10337984/using-the-curl-output

Here’s the code. It solves a huge problem for the class of projects that need to grab a large number of tweets in real time, either to save them in a database or to trigger some action.

My version of the code is in tkzic/api/twitterStream1.php

<?php

$USERNAME = 'youruser';
$PASSWORD = 'yourpass';
$QUERY    = 'nike';

/**
 * Called every time a chunk of data is read, this will be a json encoded message
 * 
 * @param resource $handle The curl handle
 * @param string   $data   The data chunk (json message)
 */
function writeCallback($handle, $data)
{
    /*
    echo "-----------------------------------------------------------\n";
    echo $data;
    echo "-----------------------------------------------------------\n";
    */

    $json = json_decode($data);
    if (isset($json->user) && isset($json->text)) {
        echo "@{$json->user->screen_name}: {$json->text}\n\n";
    }

    return strlen($data);
}

$ch = curl_init();

curl_setopt($ch, CURLOPT_URL, 'https://stream.twitter.com/1/statuses/filter.json?track=' . urlencode($QUERY));
curl_setopt($ch, CURLOPT_USERPWD, "$USERNAME:$PASSWORD");
curl_setopt($ch, CURLOPT_WRITEFUNCTION, 'writeCallback');
curl_setopt($ch, CURLOPT_TIMEOUT, 20); // disconnect after 20 seconds for testing
curl_setopt($ch, CURLOPT_VERBOSE, 1);  // debugging
curl_setopt($ch, CURLOPT_ENCODING,  'gzip, deflate'); // req'd to get gzip
curl_setopt($ch, CURLOPT_USERAGENT, 'tstreamer/1.0'); // req'd to get gzip

curl_exec($ch); // commence streaming

$info = curl_getinfo($ch);

var_dump($info);
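
A caveat (my note, not part of the original post): curl hands this callback raw chunks, so a single call can contain a partial message or several messages at once. Since the stream is newline-delimited, a buffered variation of writeCallback() – a sketch, not tested against the live stream – might look like this:

$buffer = '';

/**
 * Buffered variation of writeCallback(): accumulate chunks and only
 * decode complete newline-terminated messages.
 */
function bufferedWriteCallback($handle, $data)
{
    global $buffer;
    $buffer .= $data;

    // decode every complete line sitting in the buffer
    while (($pos = strpos($buffer, "\n")) !== false) {
        $line   = trim(substr($buffer, 0, $pos));
        $buffer = substr($buffer, $pos + 1);

        if ($line === '') {
            continue; // blank keep-alive line
        }

        $json = json_decode($line);
        if (isset($json->user) && isset($json->text)) {
            echo "@{$json->user->screen_name}: {$json->text}\n\n";
        }
    }

    return strlen($data); // tell curl the whole chunk was handled
}

Register it with curl_setopt($ch, CURLOPT_WRITEFUNCTION, 'bufferedWriteCallback'); in place of the original callback.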

Running http: requests from Max

notes
  1. How do you separate the HTTP status return code from the actual response data?

For [jit.uldl], status reports get sent out the right outlet and errors are reported in the Max window. However, there doesn’t appear to be a way to get the http: status codes or other header data.

For curl, you can write the response (JSON, for example) to a file, then read the file using the [js] object and parse the JSON. If you are using [aka.shell] to run the curl command, stdout and stderr can be routed from the object – for instance, into the Max window. The -v flag (verbose mode) causes curl to output a bunch of header data.
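
For comparison, outside of Max this separation is straightforward with PHP’s curl extension – a minimal sketch (the feed URL here is only a placeholder):

<?php

// Sketch: make a request and report the HTTP status code separately
// from the response body. The URL is only a placeholder.

$ch = curl_init('http://api.cosm.com/v2/feeds/95586.json');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it

$body   = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE); // status code, separate from the body

curl_close($ch);

echo "status: $status\n";
echo "body: $body\n";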

Sending Tweets with curl in Max

Using xively.com and zapier.com

Note: To get this project to work you’ll need a Twitter account, and you’ll need to set up a device (feed) at xively.com and a ‘zap’ at zapier.com as directed in this post, which explains how to send tweets using triggers:

 https://reactivemusic.net/?p=6903

Also, you may notice delays due to the number of steps involved.

Looking for an easier way? Send Tweets using ruby: https://reactivemusic.net/?p=7013

download

https://github.com/tkzic/internet-sensors

folder: twitter-curl

files

Max
  • tweetCurl5a.maxpat
externals

[aka.shell] download from here: http://www.iamas.ac.jp/~aka/max/ – and add the path to the folder to Options | File Preferences in Max

authorization

  • the xively.com feed id and API-key are embedded in the Max patch
  • you need a Twitter account
  • you need to set up a xively.com feed with a Twitter trigger (as described here: https://reactivemusic.net/?p=6903) to get your own feed id and API-key, and to authorize access to your Twitter account

instructions

  • Open the Max patch: tweetCurl5a.maxpat
  • Enter your xively feed number and API-key into the fields (then press enter)
  • Type your Tweet text.
  • Press the big green button.

notes on curl

You can use curl for http: requests in Max by formatting the command line with [sprintf] and running it in [aka.shell]. There are a few idiosyncrasies – for example with escape sequences.

In tweetCurl5a.maxpat, the curl command is built in two sections:

  1. The request data is written to a data file: /tmp/abc.json
  2. The actual curl command is formatted and run from the command line.
Here is the part of the patch which formats request data:

Using ‘quotes’ with [sprintf]

You’ll notice a lot of backslashes used in [sprintf]. This is done to preserve quotes. Normally a quote in [sprintf] indicates a string. Use 3 backslashes to escape a quote:

\\\"

Passing arguments into [sprintf]

The [sprintf] code is obtuse because we are formatting JSON data. The resulting data looks like this:

{ "id":95586, "datastreams":[{ "current_value":"this is a tweet", "id": "tweet"}]}


Note that you can pass arguments into [sprintf] using %s – but if you are using a [textedit] to collect data from the user, you’ll need to use [tosymbol] to consolidate the text into a single symbol before passing it into [sprintf].
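
As an aside – an illustration on my part, not what the patch actually does – the same JSON could be built and written to /tmp/abc.json from PHP, where json_encode() handles the quote escaping that [sprintf] needs the backslashes for:

<?php

// Illustration only: the patch builds this string with [sprintf],
// but json_encode() produces the same structure without manual escaping.

$request = array(
    'id' => 95586,
    'datastreams' => array(
        array(
            'current_value' => 'this is a tweet',
            'id'            => 'tweet',
        ),
    ),
);

file_put_contents('/tmp/abc.json', json_encode($request));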

Here’s the code which writes the formatted JSON data to a file:

The next step is to format the curl command, which will read the JSON data file and send an http: request to cosm.com. Here you can see the [sprintf] for this command.

Redirecting aka.shell output to the Max window

At the very end of the [sprintf] you will see

2>&1

This is the Unix shell syntax for redirecting error messages and standard output from [aka.shell] to the same place, which in this case will be the Max window.

command line curl

By the way, here is what the curl command will look like on the command line:

curl -v --request PUT --data-binary @/tmp/abc.json --header "X-ApiKey: abcdefg1234567" http://api.cosm.com/v2/feeds/95586.json 2>&1


Note: The actual ApiKey above has been replaced with abcdefg1234567 – so that you don’t accidentally send embarrassing Tweets from my account.
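
For comparison – a sketch, not something the patch itself uses – the same PUT request could be sent directly from PHP’s curl extension, with the same placeholder ApiKey:

<?php

// Sketch: the same PUT request made from PHP instead of the command line.
// 'abcdefg1234567' is the placeholder ApiKey from the command above.

$ch = curl_init('http://api.cosm.com/v2/feeds/95586.json');
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'PUT');
curl_setopt($ch, CURLOPT_POSTFIELDS, file_get_contents('/tmp/abc.json'));
curl_setopt($ch, CURLOPT_HTTPHEADER, array('X-ApiKey: abcdefg1234567'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

$response = curl_exec($ch);
$status   = curl_getinfo($ch, CURLINFO_HTTP_CODE);

curl_close($ch);

echo "status: $status\n$response\n";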

Cosm with Max

update 6/2014: Cosm is now Xively. I have not re-tested the examples below. There is a working Twitter example in the internet sensors projects: https://reactivemusic.net/?p=5859

original post

notes

Today I was finally able to get this working: reading a Cosm (Pachube) feed from curl and from Max. Here is an example that works in curl (replace API-KEY with an actual key):

curl http://api.cosm.com/v2/feeds/76490/datastreams/Power.xml?key=API-KEY

You can get JSON responses by leaving off the .xml extension or replacing it with .json.

It’s critical to use “key=…”, not “X-ApiKey=…” as shown in the Cosm documentation, or you will get permission errors from curl and Max.
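
The same request works from PHP as well – a sketch using the .json variant, and assuming (as in the JSON example above) that the datastream includes a current_value field:

<?php

// Sketch: read the same datastream as JSON and decode it.
// Replace API-KEY with an actual key.

$url = 'http://api.cosm.com/v2/feeds/76490/datastreams/Power.json?key=API-KEY';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

$body = curl_exec($ch);
curl_close($ch);

$datastream = json_decode($body);
if ($datastream && isset($datastream->current_value)) {
    echo "current value: {$datastream->current_value}\n";
}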

I was also able to get the Max project called “pachube report” from Nicholas Marechal to work (it requires the jasch and CNMAT externals).

http://cycling74.com/toolbox/pachube-tools/

This patch uses the typical [jit.uldl] and [jit.textfile] objects and some regexp parsing tricks.

Next trick will be creating a feed and sending it to Cosm.