Building an Application with the Twitter API

Twitter API libraries are available for most popular programming languages, and they're opening up new possibilities for publishing information. Explore a short Python script that tweets when the Dow Jones Industrial Average (or any stock or index you like) makes a big jump.


After storing and testing the script on my host provider's system, I set up a cron job to run it at 52 minutes after the hour from 9 AM to 4 PM Eastern time. The market is open from 9:30 to 4:00, so with the 20-minute delay in the quote data, the script checks every hour from the posting of the opening price through the posting of the closing price. Another cron job, executed at 9:30, deletes any existing files that store the results of earlier tweetstockjump.cgi runs, so that the day's first call to the script compares the latest price to the day's opening price.
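The article doesn't reproduce the crontab entries themselves, but assuming the script and its data directory live under a hypothetical /home/twitguy/tweetstockjump, they might look something like this:

    # Check for big moves at 52 minutes past the hour, 9 AM through the 4 PM hour
    52 9-16 * * * /usr/bin/python /home/twitguy/tweetstockjump/tweetstockjump.cgi
    # Clear out the previous day's stored quotes as the market opens
    30 9 * * * rm -f /home/twitguy/tweetstockjump/*.csv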

The complete script is shown in Listing 1.

The script starts with the import statements that identify any necessary specialized libraries:

  • urllib to read the remote Yahoo! Finance data
  • os.path to check for the existence of the data file with older data
  • string to convert the retrieved comma-separated number strings to floating point numbers
  • twitter (installed as the python-twitter package) to assemble and send a message to Twitter's server
When I logged in to my host provider's server and ran the script interactively, it all worked fine, but the first time a cron job tried to execute it, Python couldn't find the python-twitter library or the simplejson library that it depends on. Why? Because my interactive shell had those directories in its PYTHONPATH, but the process started by the cron job knew nothing about my PYTHONPATH. That's why my own copy of tweetstockjump.cgi has sys.path.append statements pointing to the python, simplejson, and python-twitter libraries before the import twitter line: they ensure that the cron job's process can find those libraries. I omitted them above because whether you need them depends on your own environment.
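As a sketch only (not the article's Listing 1), the top of my copy might look like this, with the library locations under /home/twitguy/lib being hypothetical paths that you'd adjust for your own account:

    import sys
    # Hypothetical library locations; point these wherever you unpacked the libraries.
    sys.path.append('/home/twitguy/lib/python')
    sys.path.append('/home/twitguy/lib/simplejson')
    sys.path.append('/home/twitguy/lib/python-twitter')

    import urllib      # read the remote Yahoo! Finance data
    import os.path     # check whether a data file with older figures exists
    import string      # convert the comma-separated number strings to floats
    import twitter     # python-twitter: assemble and send the message to Twitter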



The beginning of the script assigns the variables that let any Twitter ID check for jumps of any increment in any stock or index. It also creates a dataFileStorage variable identifying the directory that holds the files storing the results of recent checks, so don't forget to create this directory before running the script. The files themselves are named after the ticker symbol with a .csv extension, so in the script as shown, the Dow Jones information is stored in a file named INDU.csv in a tweetstockjump subdirectory.
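A sketch of that setup follows. The quote service URL and its format codes (o for the opening price, l1 for the last trade, t1 for its time) are my assumptions about the Yahoo! Finance CSV service rather than lines quoted from Listing 1, and the variable names symbol and dataFileName are hypothetical:

    symbol = 'INDU'        # ticker or index symbol to watch
    increment = 100        # size of the move, in points, that triggers a tweet
    quoteService = 'http://download.finance.yahoo.com/d/quotes.csv?f=l1t1o&s='
    dataFileStorage = 'tweetstockjump/'                # create this directory first
    dataFileName = dataFileStorage + symbol + '.csv'   # e.g. tweetstockjump/INDU.csv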

After setting all these variables, standard Python code reads the CSV data identified by the quoteService variable and the stock symbol into a newData string. That string is then split into the newDataFields list, and its pieces are copied to variables that identify the information: newQuote, newTime, openingPrice, and dayDiff for the difference in the stock price since the day's opening.
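Under the same assumptions about the quote URL, that step might look like this:

    newData = urllib.urlopen(quoteService + symbol).read()
    newDataFields = string.split(string.strip(newData), ',')
    newQuote = string.atof(newDataFields[0])       # latest price
    newTime = newDataFields[1].strip('"')          # time of that price
    openingPrice = string.atof(newDataFields[2])   # the day's opening price
    dayDiff = newQuote - openingPrice              # change since the open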

If no file with old quote figures exists for this stock, the openingPrice value is used for oldQuote and a time of 9:30 AM is assigned as oldTime. If the file does exist, its contents are read into an oldData variable, split into an oldDataFields list, and assigned to variables for comparison with the new data. If the stock price has moved more than increment since oldTime, the next few lines assemble phrases into a complete sentence to post to Twitter's server.
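Again as a sketch of the logic rather than the article's exact code, with the wording of the tweet being my own guess:

    if os.path.exists(dataFileName):
        oldData = open(dataFileName).read()
        oldDataFields = string.split(string.strip(oldData), ',')
        oldQuote = string.atof(oldDataFields[0])
        oldTime = oldDataFields[1]
    else:
        oldQuote = openingPrice    # first check of the day
        oldTime = '9:30 AM'

    # Save the latest figures for the next run to compare against.
    open(dataFileName, 'w').write(newData)

    if abs(newQuote - oldQuote) > increment:
        direction = 'up'
        if newQuote < oldQuote:
            direction = 'down'
        tweetMsg = ('The Dow is ' + direction + ' ' +
                    str(round(abs(newQuote - oldQuote), 2)) + ' points since ' +
                    oldTime + ', now at ' + str(newQuote) + ' as of ' + newTime + '.')
        # print tweetMsg   # uncomment while debugging, before tweeting for real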

The commented-out print tweetMsg line is handy when debugging to make sure that all the preceding code does its job properly before you start sending messages off to the Twitter server. The real point of the script, though, is to tweet that message, which is what the next two lines do. Passing a Twitter username and password (the script above uses the fake username and password twitguy and tgpw; replace them with real values before trying the script) to the twitter.Api call creates an instance of the Api class. Among the variety of methods you can call for instances of that class, PostUpdate posts the string passed as a parameter just as if you had entered that string into the "What are you doing?" box on that Twitter account's homepage.
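In code, that's just two lines; they belong after tweetMsg has been assembled:

    api = twitter.Api(username='twitguy', password='tgpw')   # replace with real credentials
    status = api.PostUpdate(tweetMsg)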

This script essentially treats Twitter as a device to write to, but there's plenty of data that you can read in with the API as well. The PostUpdate method returns a status message showing how well the method call went, and other Twitter API methods can return much more information, such as the results of a query or recent tweets by accounts that you follow. Put all this together and you have a new output medium, with more data to read every minute and millions of connected users. It all adds up to a powerful platform, just as its inventors intended.
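For example, the python-twitter library of that era offered a GetFriendsTimeline() method; a quick sketch of the reading side, with the attribute names reflecting that library's Status and User objects:

    # Print recent tweets from the accounts this user follows.
    for status in api.GetFriendsTimeline():
        print status.user.screen_name + ': ' + status.text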

As a bonus that its inventors didn't intend, Twitter's open-source competitor, identi.ca, offers a Twitter-compatible API, so a global replace of twitter.com with identi.ca/api in the URLs used for method calls (plus appropriate changes to the username and password used for authentication) invokes those methods on identi.ca's server. Identi.ca is based on the open-source Laconica project, and you can install and run your own Twitter-like Laconica server, perhaps behind a firewall for use within your company. If so, slight changes to the same API URLs let you call these methods on your Laconica server, so the people and processes of your company can use it to communicate about the work they're doing.
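As a rough illustration of that URL swap, using only the standard library; the statuses/update path follows Twitter's API conventions and is my assumption, not a line from the article:

    import urllib, urllib2
    # Send HTTP basic auth credentials for the identi.ca (or your Laconica) account.
    passwords = urllib2.HTTPPasswordMgrWithDefaultRealm()
    passwords.add_password(None, 'http://identi.ca/api/', 'twitguy', 'tgpw')
    opener = urllib2.build_opener(urllib2.HTTPBasicAuthHandler(passwords))
    # POST the status update, just as the Twitter API would accept it.
    opener.open('http://identi.ca/api/statuses/update.xml',
                urllib.urlencode({'status': tweetMsg}))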

This opens up even more application development possibilities. While some people use Twitter to announce to the world whether they're having a second cup of coffee that morning, its API lets you create new kinds of communication between the people of a large or small community and the applications that they may wish to use—all with a few new calls in your favorite programming language or with a few well-chosen URLs.



Bob DuCharme, a solutions architect at Innodata Isogen, was an XML "expert" when XML was a four-letter word. He's written four books and nearly 100 on-line and print articles about information technology without using the word "functionality" in any of them. See his blog at snee.com/bobdc.blog for more.