
Interpretation of Hart85 training

gjwo opened this issue 8 years ago · 6 comments

The position of the centroids, particularly with respect to reactive power, doesn't look right to me, unless I am plotting the wrong things (related to #444).

Training
We'll now train on the aggregate data. The algorithm segments the time series into steady and transient states, so we'll first identify the transients and the steady states. Next, we'll pair the on and off transitions based on their proximity in time and value.
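For anyone following along, the two steps described above can be sketched roughly like this. This is a toy stand-in for Hart-style edge detection and transition pairing on a synthetic mains series, not nilmtk's actual implementation; all function names here are made up for illustration:

```python
import numpy as np

def find_transitions(power, state_threshold=15):
    """Return (index, delta) pairs where consecutive samples differ by
    more than state_threshold watts (a crude edge detector)."""
    deltas = np.diff(power)
    return [(i + 1, int(d)) for i, d in enumerate(deltas)
            if abs(d) > state_threshold]

def pair_transitions(transitions, value_tolerance=100):
    """Greedily pair each 'on' (+) edge with the next 'off' (-) edge of
    similar magnitude: proximity in time and value."""
    pairs, pending = [], []
    for idx, delta in transitions:
        if delta > 0:
            pending.append((idx, delta))
        else:
            for j, (on_idx, on_delta) in enumerate(pending):
                if abs(on_delta + delta) < value_tolerance:
                    pairs.append((on_idx, idx, on_delta))
                    pending.pop(j)
                    break
    return pairs

# Synthetic mains: a ~2000 W appliance turns on at t=3 and off at t=8.
power = np.array([100, 100, 100, 2100, 2100, 2100, 2100, 2100, 100, 100])
transitions = find_transitions(power)
pairs = pair_transitions(transitions)
print(transitions)  # [(3, 2000), (8, -2000)]
print(pairs)        # [(3, 8, 2000)]
```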
In [45]:

h = Hart85()
h.train(mains,cols=[('power','active'),('power','reactive')],min_tolerance=100,noise_level=70,buffer_size=20,state_threshold=15)
Finding Edges, please wait ...
Edge detection complete.
Creating transition frame ...
Transition frame created.
Creating states frame ...
States frame created.
Finished.
In [46]:

h.centroids
Out[46]:
(power, active) (power, reactive)
0   89.987508   105.169375
1   76.942336   540.287504
2   2086.828654 23.059838
3   719.493513  25.488081
4   2700.307958 44.042016
5   1329.800000 13.891310
6   789.500000  603.818910
In [47]:

plt.scatter(h.steady_states['active average'], h.steady_states['reactive average'])
plt.scatter(h.centroids[('power','active')], h.centroids[('power','reactive')],
            marker='x', c=(1.0, 0.0, 0.0))
plt.legend(['Steady states', 'Centroids'], loc=4)
plt.title("Training steady states Signature space")
plt.ylabel("Reactive average (VAR)")
plt.xlabel("Active average (W)");
labels = ['Centroid {0}'.format(i) for i in range(len(h.centroids))]
for label, x, y in zip(labels, h.centroids[('power','active')], h.centroids[('power','reactive')]):
    plt.annotate(label,
                 xy=(x, y), xytext=(-5, 5),
                 textcoords='offset points', ha='right', va='bottom',
                 bbox=dict(boxstyle='round,pad=0.5', fc='yellow', alpha=0.5))

[Image: scatter plot of training steady states and centroids in active/reactive power space]

gjwo · Oct 15 '15

I think this shows it better. I set the noise level to 1000 to pick up only large devices, and it looks like the reactive value of the centroid is the absolute value rather than the signed value: in the plot below, centroid 0 should be (21, -1093).
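To illustrate the suspicion with toy numbers (made up for this sketch, not taken from my dataset): a device whose steady state sits at about -1093 VAR produces a negative "on" edge and a positive "off" edge, and clustering on |edge| instead of the signed value would fold both onto +1093.

```python
import numpy as np

# Hypothetical signed reactive-power edges for such a device:
# "on" edges of -1093 VAR, "off" edges of +1093 VAR.
signed_edges = np.array([-1093.0, 1093.0, -1093.0, 1093.0])

# Clustering on the absolute value puts the centroid at +1093,
# which would match the plot above.
centroid_if_abs = float(np.abs(signed_edges).mean())

# The actual on-state reactive level is negative.
actual_on_level = float(signed_edges[signed_edges < 0].mean())

print(centroid_if_abs)   # 1093.0
print(actual_on_level)   # -1093.0
```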

In [7]:

h = Hart85()
h.train(mains,cols=[('power','active'),('power','reactive')],min_tolerance=100,noise_level=1000,buffer_size=20,state_threshold=15)
Finding Edges, please wait ...
Edge detection complete.
Creating transition frame ...
Transition frame created.
Creating states frame ...
States frame created.
Finished.
In [8]:

h.centroids
Out[8]:
(power, active) (power, reactive)
0   21.814394   1093.201672
1   2085.859678 23.049851
2   2702.026630 56.171196
3   1325.468750 20.782213
In [9]:

plt.scatter(h.steady_states['active average'], h.steady_states['reactive average'])
plt.scatter(h.centroids[('power','active')], h.centroids[('power','reactive')],
            marker='x', c=(1.0, 0.0, 0.0))
plt.legend(['Steady states', 'Centroids'], loc=4)
plt.title("Training steady states Signature space")
plt.ylabel("Reactive average (VAR)")
plt.xlabel("Active average (W)");
labels = ['Centroid {0}'.format(i) for i in range(len(h.centroids))]
for label, x, y in zip(labels, h.centroids[('power','active')], h.centroids[('power','reactive')]):
    plt.annotate(label,
                 xy=(x, y), xytext=(-5, 5),
                 textcoords='offset points', ha='right', va='bottom',
                 bbox=dict(boxstyle='round,pad=0.5', fc='yellow', alpha=0.5))

[Image: scatter plot of steady states and centroids with noise_level=1000; centroid 0 appears at (21, +1093) rather than (21, -1093)]
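A quick way to confirm the sign is being dropped would be to look for negative reactive averages in `h.steady_states` directly. A minimal sketch with synthetic numbers (the column names are assumed from the plotting code above, not verified against nilmtk's API):

```python
import pandas as pd

# Synthetic stand-in for h.steady_states: two resistive loads plus one
# load with leading (negative) reactive power.
steady_states = pd.DataFrame({
    'active average':   [2086.8, 719.5, 21.8],
    'reactive average': [23.1, 25.5, -1093.2],
})

# If rows like this are negative but every centroid is positive, the
# clustering step must be discarding the sign somewhere.
negative = steady_states[steady_states['reactive average'] < 0]
print(negative)
```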

gjwo · Oct 21 '15

@nipunbatra @JackKelly If you have finished your theses now, please could you review the various Hart questions (including this one) that I raised in October? If not, let me know when I should next prompt you.

gjwo · May 10 '16

How time flies! Graham, I've still not started writing my thesis :) BTW, are you by any chance coming to the NILM workshop? George Hart is delivering one of the keynote lectures.

nipunbatra · May 10 '16

I'm afraid I'm also some distance from finishing my thesis, sorry. It would be great to see you at the NILM Workshop in Vancouver this weekend if you're going!

JackKelly · May 10 '16

I saw the post from @oliparson about the workshop. I won't be going, but I would be interested in what the speakers have to say, particularly George Hart. (Also your guest lecture at Southampton, Oli.)

gjwo · May 11 '16

We're hoping to livestream some of the presentations, and hopefully they'll also end up on YouTube afterwards if you miss it! Keep an eye on my blog and the conference twitter feed: https://twitter.com/NILM2016

oliparson · May 12 '16