Sunday, June 21, 2009

Automating Repetitive Twitter Tasks Using Linux

First, I realize this is going to generate some issues for some, since Twitter is used primarily for micro-blogging and keeping connected with others, but some people have been using it for other reasons.


I think producing backlinks (which really doesn't work as far as Google is concerned...but I digress) and driving website traffic are two good examples, and these are what I've been investigating with a couple of accounts on Twitter.

One of those accounts is @zippydpinhead. I picked on him because I used to read Zippy the Pinhead comic strips in High Times magazine in my early 20s and thought it was...*dude*...*funny*...*cough**cough*...

I primarily use Ubuntu for my workstation these days, but have used Linux for years on the desktop and in small enterprise.

My first experiment was to see whether automation worked at all. Fortunately we have curl and a multitude of posts on the subject floating about the internet.

My zippy character posts several times per day, using the crontab entry below, the following simple shell script, and the fortune program. It redirects stderr to a log and stdout to /dev/null. The order of the two redirects doesn't matter here, since each stream gets its own target, but I'm still experimenting with the redirect bit, so please correct me if it's wrong ;) :

# This runs our script 6 times starting at 1pm
*/10 13 * * * /home/me/bin/zippy.sh
#!/bin/sh

# Compose our message using fortune and the zippyisms
MSG=`fortune zippy`
# --data-urlencode encodes the spaces and punctuation in $MSG;
# a plain -d would send them through raw and mangle the status
curl -s -u zippydpinhead:zippypassword --data-urlencode "status=$MSG http://is.gd/16zaD #zippythepinhead" http://twitter.com/statuses/update.xml 2>/where/youwant/logs/zippytwits.log 1>/dev/null
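
Since fortunes vary in length, it's worth trimming the message so the status plus the appended link and hashtag stay within Twitter's 140-character limit. Here's a minimal sketch of that idea; the fallback quote is just a stand-in for systems where fortune isn't installed:

#!/bin/bash
# Sketch: keep the composed status under Twitter's 140-character limit.
# The fallback quote is only a stand-in for systems without fortune.
SUFFIX=" http://is.gd/16zaD #zippythepinhead"
MSG="$(fortune zippy 2>/dev/null || echo 'Yow! Am I having fun yet?')"
MSG="$(echo "$MSG" | tr '\n' ' ')"   # fortunes can span several lines
MAX=$(( 140 - ${#SUFFIX} ))          # room left for the link and tag
MSG="${MSG:0:$MAX}"                  # truncate if necessary
echo "${MSG}${SUFFIX}"

You could drop this straight into zippy.sh in place of the plain MSG= line above.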

That got me wondering whether I could automate other things...so I came up with this script, which pulls down my followers, parses the contents, and adds a few tweets on Fridays for #followfriday:

#!/bin/bash

# curl's --netrc option reads the twitter.com credentials straight
# from ~/.netrc, so there's no need to parse USER/PASS out ourselves
TEMP="tempff.txt"
NAMES="followers.txt"
# bash's built-in $RANDOM gives us a 15-bit random integer
RAND=$RANDOM

# This gets a max of 100 users from our followers list
# since we know we have > 2000 followers, we'll randomly select
# from one of 20 pages
PAGE=`expr $RAND % 20 + 1`

curl --netrc -s "http://twitter.com/statuses/followers.xml?page=$PAGE" > $TEMP

# This rips everything out except the properly formatted screen name
sed -e "/<screen_name>/!d" \
    -e 's/^\ *//' \
    -e 's/<\/\{0,1\}screen_name>//g' \
    -e 's/^/@/' $TEMP > $NAMES
    
# remove our temp file
rm $TEMP

# Extract followers
NFOLL=10 # This is the number of followers we will cull from the list
LINES=`cat $NAMES | wc -l`
# pick a line at least $NFOLL deep so head|tail always returns
# a full window of $NFOLL names
LINE=$(( RAND % (LINES - NFOLL + 1) + NFOLL ))
# join the names with spaces so the tweet reads as one line
MSG=`head -$LINE $NAMES | tail -$NFOLL | tr '\n' ' '`

# Announce our ff (--data-urlencode handles the spaces and @ signs)
curl --netrc -s --data-urlencode "status=Follow Us!: $MSG #FF http://bit.ly/nFu0I" http://twitter.com/statuses/update.xml

# remove our ff file
rm $NAMES
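
The head|tail windowing above is easy to get wrong: if the random line number lands too close to the top of the file, tail returns fewer than NFOLL names. A quick way to convince yourself the arithmetic always yields a full window is to run it against a synthetic follower list (the @userNN names here are made up for the test):

#!/bin/bash
# Sketch: verify the head|tail windowing on a synthetic follower list.
# Picking LINE between NFOLL and LINES guarantees a full window of names.
NAMES=$(mktemp)
printf '@user%02d\n' $(seq 1 25) > "$NAMES"   # 25 fake followers
NFOLL=10
LINES=$(wc -l < "$NAMES")
LINE=$(( RANDOM % (LINES - NFOLL + 1) + NFOLL ))  # NFOLL..LINES inclusive
MSG=$(head -$LINE "$NAMES" | tail -$NFOLL)
echo "$MSG" | wc -l   # prints 10, the full window
rm "$NAMES"
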

If you change NFOLL in the script above (it's 10 right now), you can modify the number of users you want to suggest for #ff.

I recognize why this bot-like method of updating your Twitter account will be unpopular with some, but how and why you use these methods is up to you.


2 comments:

Dennis said...

Hi John,

Cute Kids... I would like to know if your MSM-f- Access Project program is free ware?

I am about to start cataloging a large house and private museum in Scotland and need a Collection Management System. We are a poor charitable Trust and need to minimize costs...

Please let me know if I purchase MS Access and download your program will I be able to use it. I am just average at computer skills and need something that is fairly simple to use...

Thanks for your help. I leave for Scotland on the 13th.

Look forward to hearing from you soon.

Dennis I. McAllister
President and Board Chairman
Clan MacAlister Society, Ltd.
An Educational 501(c)(3) Non-Profit

John Croson said...

Yes, it is freeware.

If you check the project site, you'll find instructions on how to download, and use WITHOUT Access.