AppDailySales, and running on Synology NAS Devices

Continuing from my prior post, where I talked about my server replacement: a Synology NAS (DS211j). In essence, it's a cheap, tiny, dedicated Linux box featuring an ARM CPU and a custom web UI specially designed for SUPER EASY RAID array creation and maintenance. There's a 4-drive version, and some many-more-drive versions, but they cost more.

This time, I’m here to talk about a little tool for getting your daily sales data from Apple’s iTunes Connect website. AppDailySales!

http://code.google.com/p/appdailysales/

This is a Python script that was recommended to me some months back, but IMO it needed a few things. So, in a shockingly rare case of OSS justice, I added the features I wanted myself and submitted a patch:

http://code.google.com/p/appdailysales/issues/detail?id=31&can=1

Just a few days ago (nine months later), version 2.9 was released (sparked by Apple's changes to the reporting system), now including my patches! 😀

To run the script, all you need is Python. Previously, I ran this on Windows XP (with Python installed in C:\Python2.7\, and the PATH environment variable set to include it).

Now, with the NAS, I SSH in to the device (go download PuTTY if you're on Windows), log in as root (same password as the admin account), and install the Python package.

ipkg install python

Easy. Done.

Place the appdailysales.py script somewhere on your computer, likely in the same folder you want your stats placed. Then simply invoke it with --help to get a list of arguments.

python appdailysales.py --help

I run it from a batch file (on Windows) or a shell script (on the NAS). The shell script version is shockingly similar to the old batch file version; the main difference is that it only needs single %'s.
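A minimal sketch of what such a getstats.sh wrapper can look like. The Apple ID, password, and flag names below are illustrative assumptions, not the script's real option names; run appdailysales.py --help for the actual list.

```shell
#!/bin/sh
# getstats.sh -- wrapper around appdailysales.py (sketch).
# Run it from the folder where the stats should live.

YEAR=$(date +%Y)          # e.g. 2010
MONTH=$(date +%Y-%m)      # e.g. 2010-12

# Make sure the dated folder hierarchy exists (2010/2010-12/...).
mkdir -p "$YEAR/$MONTH"

# Hypothetical invocation -- these flag names are assumptions; check --help:
# python appdailysales.py -a you@example.com -p secret -o "$YEAR/$MONTH"

echo "Reports will land in $YEAR/$MONTH/"
```

The cron entry later in this post cds into the stats folder before running the script, which is why relative paths are enough here.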

This creates files like so:

2010/2010-12/Daily-2010-12-13.txt
2010/2010-12/Daily-2010-12-12.txt
2010/2010-12/Daily-2010-12-11.txt
2010/2010-12/Daily-2010-12-10.txt

Nice, easy-to-read names, unlike the default file names generated by iTunes Connect.

In the next section, I’ll talk more about setting it up as a service. In other words, totally automated!

* * *

Regrettably, I did not document what I did on the Windows XP side.

But, looking at the state of my server, it looks like I created a separate user for this script. That user has a Task Scheduler task set up that runs the batch file every day at noon EST (a good four hours after the sales data usually goes online). Task Scheduler can be found in the Control Panel under Administrative Tools.

I can’t remember if there was a specific reason I created this extra user though. Sorry.

* * *

I am not a vi expert, but the SVN instructions from last time taught me a few things… a few things I'll probably forget, so here's a point-form rundown.

vi /etc/crontab

Press SHIFT+A to jump to the end of the current line and enter append mode.

Press Enter to start a new line. There should be no entries in the crontab file yet (just comments). Use tabs to separate the fields; Synology's crond is picky about tab-separated columns.

For my 12 hour daily task, I did the following:

0 12 * * * appdailysales cd /volume1/Mike/SalesStats/;./getstats.sh

appdailysales is a dummy user I created to run the task (I don't even know its password). The invocation changes to the correct directory, then runs the script.

Press ESC to exit append mode.

Now you can type “:wq” to write the file and exit.
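If you'd rather skip vi entirely, the same entry can be appended non-interactively. The field order is minute, hour, day-of-month, month, day-of-week, user, command. This is only a sketch; it writes to a scratch file for safety, and on the NAS itself you'd target /etc/crontab instead.

```shell
# Append the cron entry without opening vi. Fields are tab-separated
# (minute, hour, day, month, weekday, user, command).
# Writing to a scratch copy here for demonstration; on the NAS,
# point CRONTAB at /etc/crontab.
CRONTAB=./crontab.demo
printf '0\t12\t*\t*\t*\tappdailysales\tcd /volume1/Mike/SalesStats/;./getstats.sh\n' >> "$CRONTAB"
cat "$CRONTAB"
```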

The job is now added. Finally, restart the service.

synoservice --restart crond

And that’s it! We’re done! Admittedly, I haven’t checked if this works yet, but IT SHOULD! Hehe. I’ll know in the morning if I have a new log file.

* * *

And there we have it! Done! I have now moved the entirety of my server's duties over to the Synology NAS.

Now my dying server is free to be formatted, have its dead drives removed, and be converted into a workstation running RAID 5 on the remaining disks.