Hi,
I've been using the GreenEye for about 6 months to monitor the electricity usage in my home here in upstate NY. I'm collecting data at 10-second intervals and storing it in a MySQL database using btmon running on a Windows XP box. I wrote a few Python scripts to do some post-processing and plotting of the data. I thought I'd share the highlights (PDF attached) in case others are interested, or in case it sparks new thoughts on useful ways others have found to make their data useful. The mnemonics I've used for individual circuits are a little cryptic but hopefully still meaningful (e.g., SE = SouthEast, Out = Outlets, etc.).
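For anyone wondering how 10-second watt samples turn into the kWh figures on a bill: energy is just power integrated over time. Here's a minimal sketch of the arithmetic (not one of my actual scripts, and the function names are just for illustration):

```python
def kwh_from_samples(watts, interval_s=10.0):
    """Integrate fixed-interval power samples (in watts) into energy (kWh).

    Each sample is assumed to cover interval_s seconds, so the energy is
    sum(watts) * interval_s joules, and 1 kWh = 3.6e6 joules.
    """
    return sum(watts) * interval_s / 3.6e6

def cost_for_period(watts, rate_per_kwh, interval_s=10.0):
    """Apply a billing-period cost-per-kWh rate to the integrated energy."""
    return kwh_from_samples(watts, interval_s) * rate_per_kwh
```

A sanity check: 360 samples of 1000 W at 10-second spacing is exactly one hour at 1 kW, i.e. 1 kWh.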
I'm planning to move to Nova Scotia, Canada soon, and I'll be taking the GreenEye with me. I like that aspect of portability and am looking forward to collecting some new data in a different location.
Enjoy!
Bob
GreenEye Home Monitoring Summary
- Posts: 21
- Joined: Fri Dec 07, 2012 3:18 pm
- Attachments
- GreenEyeSummary.pdf
- My summary of my setup and post-processing of home energy monitoring with the GreenEye monitor.
- (2.29 MiB) Downloaded 588 times
- Posts: 32
- Joined: Wed Apr 24, 2013 8:18 pm
Re: GreenEye Home Monitoring Summary
Would you be willing to share the Python code for generating those heatmaps? Those are great! My data is in an RRD, so I'd probably have to muck with it, but it would be worth it.
- Posts: 21
- Joined: Fri Dec 07, 2012 3:18 pm
Re: GreenEye Home Monitoring Summary
garbled wrote: Would you be willing to share the python code for generating those heatmaps? Those are great! My data is in an rrd, so I'd probably have to muck with it, but it would be worth it.

Sure. All the scripts were created for my own use, so they are not well documented, elegantly designed, or well tested, but that said, I'm happy to share what I have. A couple of things to note about the process I use that may help in deciphering them:
As I mentioned, I collect the data from the GEM using btmon and store it in MySQL. I use getallrecords.py to extract a chunk of the MySQL records to a CSV file. getallrecords.py takes an optional set of arguments (yr month day hr min sec) to define a start point in the DB and then extracts from there to the end. I then manually partition the CSV data according to my utility billing period. I also use my utility bill to determine a cost per kWh for that period.
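To give a flavor of that extraction step, here's a minimal sketch in the spirit of getallrecords.py. This is not the actual script: the table name, column names, and the use of sqlite3 (to keep the sketch self-contained; the real setup talks to MySQL) are all illustrative, and it's written for current Python rather than 2.7.

```python
import csv
import datetime
import sqlite3  # stand-in for the MySQL connector, just for illustration

def start_from_args(args):
    """Turn an optional (yr month day hr min sec) argument list into a
    datetime; omitted trailing fields default to the start of the period."""
    if not args:
        return None
    fields = [int(a) for a in args]
    fields += [1, 1, 0, 0, 0][len(fields) - 1:]  # pad month/day/hr/min/sec
    return datetime.datetime(*fields[:6])

def dump_to_csv(conn, out_path, start=None):
    """Extract readings from `start` (or from the beginning) through the end
    of the table and write them to a CSV file."""
    cur = conn.cursor()
    if start is None:
        cur.execute("SELECT ts, watts FROM readings ORDER BY ts")
    else:
        # MySQLdb would use %s placeholders instead of sqlite3's ?
        cur.execute("SELECT ts, watts FROM readings WHERE ts >= ? ORDER BY ts",
                    (start.isoformat(" "),))
    # On Python 2.7 you'd open with mode "wb" and drop the newline argument.
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "watts"])
        writer.writerows(cur.fetchall())
```

Called as `dump_to_csv(conn, "out.csv", start_from_args(sys.argv[1:]))`, no arguments means "dump everything".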
I have a flash drive with the following file structure. Top-level folders: Data, Analysis, and Figures. Data and Figures have sub-folders for each billing period (e.g., Mar_Apr). All of the scripts sit in the Analysis folder, and I put the CSV file for a given billing period in the proper Data sub-folder. Then I just run RunAll.Bat from the Analysis folder with the proper arguments.
I use the MySQL, NumPy, and matplotlib packages with Python 2.7.
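Since the heatmaps came up: the basic recipe is to bin the readings into a day-of-month by hour-of-day matrix of average watts and hand that to matplotlib. Here's a minimal sketch, not the actual script; the function name and the grid layout are just for illustration:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

def usage_heatmap(timestamps, watts, out_png):
    """Average power into a (day-of-month x hour-of-day) grid and plot it.

    timestamps are datetime objects, watts the matching readings; cells
    with no samples come out as NaN (blank in the plot).
    """
    days = np.array([t.day for t in timestamps])
    hours = np.array([t.hour for t in timestamps])
    watts = np.asarray(watts, dtype=float)

    totals = np.zeros((31, 24))
    counts = np.zeros((31, 24))
    np.add.at(totals, (days - 1, hours), watts)  # unbuffered accumulation
    np.add.at(counts, (days - 1, hours), 1)
    with np.errstate(invalid="ignore"):
        grid = totals / counts  # NaN where counts == 0

    fig, ax = plt.subplots()
    im = ax.imshow(grid, aspect="auto", origin="lower",
                   interpolation="nearest")
    ax.set_xlabel("Hour of day")
    ax.set_ylabel("Day of month")
    fig.colorbar(im, label="Average watts")
    fig.savefig(out_png)
    plt.close(fig)
    return grid
```

From there it's one call per circuit's column of the CSV, with the output dropped into the Figures sub-folder for that billing period.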
I'm attaching ZIP and TAR archives of the scripts.
Have a blast!
Bob
- Attachments
- Scripts.zip
- (14.7 KiB) Downloaded 361 times
- Scripts.tar
- (48.5 KiB) Downloaded 373 times