A few months ago I bought a weather station, but only recently registered it with Wunderground, and was slightly dismayed to find that the software I use (Cumulus, which is heaps better than what the station supplier included) does not post historical data; it only uploads to Wunderground what it downloads from the station when Cumulus starts up.
It seems there is no “off the shelf” way of doing this, so here is how to do it. With a little fiddling, the steps should be applicable to any weather station software that stores its data as a “CSV” or other formatted text file, and I guess a similar process would work for weather upload sites other than Wunderground too. The steps are:
- get the stored data into a spreadsheet. This is easy for Cumulus since the data is stored in a text file with commas separating the values, and the Cumulus help file tells you what each column is. (If you would rather script these first three steps instead, there is a rough sketch after this list.)
- convert the units. I prefer SI units but Wunderground uses American conventions (Fahrenheit, mph, inches).
- create a URL (web address), one for each data record to be uploaded. The structure of these is documented on the Wunderground wiki. You can simply copy them into a web browser (Firefox, Internet Explorer etc.) one by one, but once you’ve had the satisfaction of seeing it work, the remaining thousands of records need a better method.
- use wget to process a batch of URLs. Wget is free, open-source software; Windows users can find an installer here.
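If you prefer scripting to spreadsheets, here is a rough Python sketch of the first three steps (reading the log file, converting units and building the URLs). The file names, column positions and the exact set of fields are placeholders — check the Cumulus help file for your log’s real layout — and the base URL and parameter names are as I recall them from Wunderground’s upload protocol, so take the authoritative versions from the wiki mentioned above. The unit conversions themselves are standard.

import csv
from urllib.parse import urlencode
from datetime import datetime

# Placeholders - substitute your own station details and file names.
BASE = "http://weatherstation.wunderground.com/weatherstation/updateweatherstation.php"
STATION_ID = "YOURSTATIONID"
PASSWORD = "yourpassword"
LOGFILE = "Apr12log.txt"        # Cumulus monthly log (comma-separated)
URLFILE = "urls Apr12.txt"      # one URL per line, for feeding to wget

def c_to_f(c):                  # Celsius to Fahrenheit
    return c * 9.0 / 5.0 + 32.0

def kmh_to_mph(v):              # km/h to mph
    return v * 0.621371

def hpa_to_inhg(p):             # hPa/mb to inches of mercury
    return p * 0.02953

with open(LOGFILE, newline="") as src, open(URLFILE, "w") as dst:
    for row in csv.reader(src):
        # Column positions here are guesses - check the Cumulus help file.
        day, time_, temp_c, hum, wind_kmh, press_hpa = (
            row[0], row[1], float(row[2]), float(row[3]),
            float(row[5]), float(row[10]))
        # dateutc wants "YYYY-MM-DD HH:MM:SS" in UTC; adjust here if your
        # log times are local rather than UTC.
        dt = datetime.strptime(day + " " + time_, "%d/%m/%y %H:%M")
        params = {
            "ID": STATION_ID,
            "PASSWORD": PASSWORD,
            "action": "updateraw",
            "dateutc": dt.strftime("%Y-%m-%d %H:%M:%S"),
            "tempf": round(c_to_f(temp_c), 1),
            "humidity": round(hum),
            "windspeedmph": round(kmh_to_mph(wind_kmh), 1),
            "baromin": round(hpa_to_inhg(press_hpa), 2),
        }
        dst.write(BASE + "?" + urlencode(params) + "\n")

The script writes one ready-made URL per line, which is exactly the format wget needs in the batch step below.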
I’ve uploaded an example spreadsheet that does the conversion and creates the URLs. Notice it has three sheets and that the “Conversion Factors” sheet also contains the base of the URL, which must be edited to have your station name and password. NB there is also a Celsius-to-Fahrenheit conversion, and the peculiar column L on the “Converted” sheet is needed to get a correctly formatted URL from the date-time combination.
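As a concrete example of what those two pieces do: 12.5 °C converts to 12.5 × 9/5 + 32 = 54.5 °F, and a reading taken at 09:05 on 1 April 2012 has to end up in the URL as something like dateutc=2012-04-01%2009%3A05%3A00 (the space and colons must be percent-encoded), which is presumably the string that column L assembles from the separate date and time cells.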
To use wget I copied the URLs into a “.txt” file, one URL per line (just drag down the entries in the spreadsheet column and copy/paste them into an empty text file), and created a small “.bat” file containing:
PATH "C:\Program Files\GnuWin32\bin"
wget -v -o log.txt -O responses.txt -i "urls Apr12.txt"
You may need to alter the first line depending on where you installed wget. The second line assumes the list of URLs was saved to “urls Apr12.txt”. The “log.txt” file contains verbose logging of each wget request, and the “responses.txt” file should just contain a long string composed of one “success” for each successful posting of data. If it doesn’t, something went wrong…
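One small refinement: by default the URLs are fired off as fast as wget can manage, which is how thousands of per-minute records can land on Wunderground in a few minutes. If you would rather space the requests out, wget’s --wait option adds a pause (in seconds) between them, for example:

wget -v -o log.txt -O responses.txt --wait=1 -i "urls Apr12.txt"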
Thanks for doing this. Once I had figured out how to install wget and, more importantly, the associated DLLs, I used this approach with 8 hours’ worth of data that had been lost from WU because my internet connection was down. The only problem is that, I guess, the times need to be changed as well: I see that my data now starts at almost 2 am, whereas it is actually from two hours earlier. I guess I can subtract two hours from all my times. The other thing is that it is rapid-fire, so I should probably thin my data out a bit. Can I correct the info that is now two hours out?
Just an update. I was able to correct the info and resubmit, but Wunderground, at this time, only caters for single-record deletes (and doesn’t overwrite existing records, so they all have to be deleted), so it was a lengthy process deleting 8 hours of per-minute data.
Shouldn’t column D in the “Converted” sheet be column O from the raw data? It looks like columns D and E are the same. I think also, based on my Cumulus output, that the conversion factor for wind should be 1.
Actually, the wind conversion factor (for me) is 0.62137, as the raw data is in km/h and has to be converted to mph in order to be converted back correctly.