Stream Performance and Power Data Analysis

An example analysis of Stream benchmark performance and power data collected with WattProf (developed by RNet Technologies), focusing on the Stream triad kernel.

In [170]:
%matplotlib inline
import pandas as pd
import numpy as np                 # needed for np.linspace in the cells below
from numpy import nan as NA
import matplotlib
import matplotlib.pyplot as plt

# Some general font settings for plots; 'sans-serif' avoids the
# "font family not found" warning that the invalid family 'normal' triggers
font = {'family': 'sans-serif',
        'weight': 'normal',   # or 'bold'
        'size': 10}
matplotlib.rc('font', **font)

First, load the summary data for experiments on 11 different problem sizes.

In [171]:
summaryData = pd.read_csv('data/all.csv')
summaryData
Out[171]:
Problem_size C0_Power C0_Energy C1_Power C1_Energy C2_Power C2_Energy C3_Power C3_Energy C4_Power ... TIME Ex_TIME PAPI_L2_DCM Ex_PAPI_L2_DCM PAPI_L1_DCM Ex_PAPI_L1_DCM PAPI_TOT_INS Ex_PAPI_TOT_INS PAPI_FP_INS Ex_PAPI_FP_INS
0 1000000 6.948410 0.006948 26.532341 0.026532 8.761912 0.008762 48.633041 0.048633 45.455711 ... 11233 11233 11732 11732 376236 376236 56000000 56000000 9002458 9002458
1 10000000 6.968211 0.843154 29.227571 3.536536 8.777411 1.062067 53.185787 6.435480 49.600278 ... 111019 111019 97584 97584 3758318 3758318 560000000 560000000 90000000 90000000
2 20000000 7.022909 1.766262 29.547589 7.431219 8.787722 2.210112 53.816286 13.534796 50.141385 ... 199991 199991 187740 187740 7518529 7518529 1120000000 1120000000 180000000 180000000
3 30000000 7.036968 2.665251 30.257066 11.459864 8.814280 3.338409 55.041788 20.847077 51.290878 ... 305363 305363 230077 230077 11300000 11300000 1680000000 1680000000 270000000 270000000
4 40000000 7.068476 2.686021 30.832878 11.716494 8.836191 3.357753 56.139250 21.332915 52.279197 ... 395559 395559 372237 372237 15100000 15100000 2240000000 2240000000 360000000 360000000
5 50000000 7.093654 3.596482 30.966424 15.699977 8.841375 4.482577 56.311527 28.549944 52.459825 ... 512638 512638 361384 361384 18800000 18800000 2800000000 2800000000 450000000 450000000
6 60000000 7.069502 4.337140 31.027978 19.035665 8.838399 5.422358 56.476869 34.648559 52.616443 ... 607694 607694 483754 483754 22600000 22600000 3360000000 3360000000 540000000 540000000
7 70000000 7.053963 5.359248 30.682378 23.310937 8.832366 6.710390 55.839108 42.423763 52.004965 ... 728319 728319 492709 492709 26300000 26300000 3920000000 3920000000 630000000 630000000
8 80000000 7.057219 5.367015 30.877954 23.482684 8.833557 6.717920 56.201020 42.740876 52.326137 ... 798116 798116 658571 658571 30100000 30100000 4480000000 4480000000 720000000 720000000
9 90000000 7.078779 6.331968 30.919193 27.657219 8.838325 7.905882 56.277262 50.340011 52.404582 ... 928373 928373 644682 644682 33800000 33800000 5040000000 5040000000 810000000 810000000
10 100000000 7.063027 7.161910 31.153908 31.590063 8.847654 8.971521 56.702476 57.496311 52.790535 ... 1019342 1019342 716559 716559 37600000 37600000 5600000000 5600000000 900000000 900000000

11 rows × 29 columns

A scatter matrix gives a quick visual check of the pairwise correlations between variables.

In [172]:
# Select the columns of interest for the pairwise comparison
summary = summaryData[["Problem_size", "C3_Power", "C3_Energy", "TIME", "PAPI_L2_DCM", "PAPI_L1_DCM"]]
_ = pd.scatter_matrix(summary, alpha=0.5, figsize=(12, 12), diagonal='hist')
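Note: pd.scatter_matrix is where this helper lived in the pandas release used here; in newer pandas it has moved to pandas.plotting. A minimal equivalent call, assuming a recent pandas version:

# Equivalent call for newer pandas, where the helper moved to pandas.plotting
from pandas.plotting import scatter_matrix

_ = scatter_matrix(summary, alpha=0.5, figsize=(12, 12), diagonal='hist')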

Next, compute the correlations between the variables of interest. We don't include the instruction counts because the hardware counter values on the test architecture are unreliable.

In [173]:
summary.corr()
Out[173]:
Problem_size C3_Power C3_Energy TIME PAPI_L2_DCM PAPI_L1_DCM
Problem_size 1.000000 0.775320 0.995102 0.999598 0.989434 0.999998
C3_Power 0.775320 1.000000 0.787064 0.771395 0.803773 0.775984
C3_Energy 0.995102 0.787064 1.000000 0.996133 0.976891 0.995043
TIME 0.999598 0.771395 0.996133 1.000000 0.985485 0.999548
PAPI_L2_DCM 0.989434 0.803773 0.976891 0.985485 1.000000 0.989672
PAPI_L1_DCM 0.999998 0.775984 0.995043 0.999548 0.989672 1.000000
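
For a denser overview than the scatter matrix, the same correlation matrix can be rendered as a heatmap; a minimal sketch using only matplotlib (the tick-label handling is the only addition beyond what is already loaded):

# Sketch: render summary.corr() as a heatmap with labeled axes
corr = summary.corr()
fig, ax = plt.subplots(figsize=(6, 6))
cax = ax.matshow(corr.values, vmin=-1, vmax=1)
fig.colorbar(cax)
ax.set_xticks(range(len(corr.columns))); ax.set_xticklabels(corr.columns, rotation=90)
ax.set_yticks(range(len(corr.columns))); ax.set_yticklabels(corr.columns)
plt.show()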

Linear regression for energy

Selecting the energy on one of the cores (C3_Energy) as our modeling target, we perform a linear fit using a single independent variable at a time. Combining multiple independent variables in one model is not a good idea here because many of the candidate predictors are highly correlated with each other.
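One way to quantify that collinearity is the variance inflation factor (VIF); a small sketch using statsmodels (the column selection and the rule-of-thumb threshold in the comment are assumptions, not part of the original analysis):

# Sketch: variance inflation factors for the candidate predictors.
# VIFs well above ~10 indicate predictors too collinear to combine in one model.
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

X = sm.add_constant(summary[["Problem_size", "TIME", "PAPI_L2_DCM", "PAPI_L1_DCM"]])
for i, name in enumerate(X.columns):
    if name == 'const':
        continue
    print name, variance_inflation_factor(X.values, i)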

In [174]:
import statsmodels.formula.api as smf

indepcols = ["Problem_size", "TIME", "PAPI_L2_DCM", "PAPI_L1_DCM"]
depvar = 'C3_Energy'

count = 1
for indepvar in indepcols:
    print '+'*40 + '\n' + "Independent variable(s): " + str(indepvar)

    # input points at which to evaluate the fitted model
    xx = pd.DataFrame({indepvar: np.linspace(summary[indepvar].min(), summary[indepvar].max(), 10)})
    # 1st order polynomial fit via the formula interface
    poly_1 = smf.ols(formula='%s ~ 1 + %s' % (depvar, indepvar), data=summary).fit()
    print "\n1st order polynomial r-squared: %f\n" % poly_1.rsquared

    fig = plt.figure(count)
    plt.scatter(summary[indepvar], summary[depvar], alpha=0.3)  # the raw data
    plt.plot(xx, poly_1.predict(xx),
             'g-', label='Poly n=1 $R^2$=%.2f' % poly_1.rsquared, alpha=0.9)
    plt.xlabel(indepvar); plt.ylabel(depvar)
    plt.legend(loc='best')
    plt.show()
    count += 1
++++++++++++++++++++++++++++++++++++++++
Independent variable(s): Problem_size

1st order polynomial r-squared: 0.990228

++++++++++++++++++++++++++++++++++++++++
Independent variable(s): TIME

1st order polynomial r-squared: 0.992281

++++++++++++++++++++++++++++++++++++++++
Independent variable(s): PAPI_L2_DCM

1st order polynomial r-squared: 0.954316

++++++++++++++++++++++++++++++++++++++++
Independent variable(s): PAPI_L1_DCM

1st order polynomial r-squared: 0.990111

Second-order polynomial fit:

In [175]:
indepcols = ["Problem_size", "TIME", "PAPI_L2_DCM", "PAPI_L1_DCM"]
depvar = 'C3_Energy'

count = 1
for indepvar in indepcols:
    print '+'*40 + '\n' + "Independent variable(s): " + str(indepvar)

    # input points at which to evaluate the fitted model
    xx = pd.DataFrame({indepvar: np.linspace(summary[indepvar].min(), summary[indepvar].max(), 10)})
    # 2nd order polynomial -- note that patsy reduces "x ** 2" on a single
    # numeric term to plain "x", so this fit matches the 1st-order results
    poly_2 = smf.ols(formula='%s ~ 1 + %s + (%s) ** 2' % (depvar, indepvar, indepvar), data=summary).fit()
    print "\n2nd order polynomial r-squared: %f\n" % poly_2.rsquared

    fig = plt.figure(count)
    plt.scatter(summary[indepvar], summary[depvar], alpha=0.3)  # the raw data
    plt.plot(xx, poly_2.predict(xx),
             'g-', label='Poly n=2 $R^2$=%.2f' % poly_2.rsquared, alpha=0.9)
    plt.xlabel(indepvar); plt.ylabel(depvar)
    plt.legend(loc='best')
    plt.show()
    count += 1
++++++++++++++++++++++++++++++++++++++++
Independent variable(s): Problem_size

2nd order polynomial r-squared: 0.990228

++++++++++++++++++++++++++++++++++++++++
Independent variable(s): TIME

2nd order polynomial r-squared: 0.992281

++++++++++++++++++++++++++++++++++++++++
Independent variable(s): PAPI_L2_DCM

2nd order polynomial r-squared: 0.954316

++++++++++++++++++++++++++++++++++++++++
Independent variable(s): PAPI_L1_DCM

2nd order polynomial r-squared: 0.990111
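
The 2nd-order R² values above are identical to the 1st-order ones because patsy's ** operator applied to a single numeric term reduces to the term itself rather than squaring it; to fit a genuine quadratic, the power has to be protected with I(). A minimal corrected sketch (not re-run here):

# Sketch: a true 2nd-order fit; I(...) makes patsy evaluate the expression
# arithmetically instead of expanding it as a formula operator.
poly_2 = smf.ols(formula='%s ~ 1 + %s + I(%s ** 2)' % (depvar, indepvar, indepvar),
                 data=summary).fit()
print "\n2nd order polynomial r-squared: %f\n" % poly_2.rsquared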

In [176]:
# Using statsmodels' OLS directly with a hand-built design matrix
import statsmodels.api as sm
import numpy as np
import itertools
from statsmodels.sandbox.regression.predstd import wls_prediction_std

# the dependent variable and the candidate independent variables
depvar = 'C3_Energy'
indepcols = ["Problem_size", "TIME", "PAPI_L2_DCM", "PAPI_L1_DCM"]
indepvars = summary[indepcols]

models = []
#for numvars in range(1, len(indepcols)):
#    for indeps in itertools.combinations(indepvars, numvars):
for indepvar in indepcols:
        print '+'*40 + '\n' + "Independent variable(s): ", indepvar, "; Dependent Variable: ", depvar

        nsample = summary[indepvar].shape[0]
        x = summary[indepvar]
        y = summary[depvar]
        # design matrix: x, sin(x), x^3, and an intercept column
        X = np.column_stack((x, np.sin(x), x**3, np.ones(nsample)))
        #X = np.column_stack((x, x**3, np.ones(nsample)))

        # Categorical predictors are not appropriate here, but kept as a future example
        #dummy = sm.categorical(np.array(summary[indepvar]), drop=True)
        #X = np.column_stack((x, dummy[:,1:]))
        #X = sm.add_constant(X, prepend=False)

        model = sm.OLS(y, X)
        f = model.fit()

        print f.summary()
        print('Parameters: ', f.params)
        print('Standard errors: ', f.bse)
        print('Predicted values: ', f.predict())

        # Compare the data to the OLS predictions; the interval around the
        # predictions is built with wls_prediction_std.
        prstd, iv_l, iv_u = wls_prediction_std(f)

        fig, ax = plt.subplots(figsize=(8,6))

        ax.plot(summary[indepvar], y, 'o', label="data")
        ax.plot(x, f.fittedvalues, 'g--.', label="OLS")
        ax.plot(x, iv_u, 'r--')
        ax.plot(x, iv_l, 'r--')

        ax.legend(loc='best')
        plt.xlabel(indepvar)
        plt.ylabel(depvar)

        plt.show()
        #models.append((model, x, y, f))
++++++++++++++++++++++++++++++++++++++++
Independent variable(s):  Problem_size ; Dependent Variable:  C3_Energy
                            OLS Regression Results                            
==============================================================================
Dep. Variable:              C3_Energy   R-squared:                       0.991
Model:                            OLS   Adj. R-squared:                  0.990
Method:                 Least Squares   F-statistic:                     1042.
Date:                Fri, 24 Apr 2015   Prob (F-statistic):           1.29e-10
Time:                        18:42:23   Log-Likelihood:                -20.928
No. Observations:                  11   AIC:                             45.86
Df Residuals:                       9   BIC:                             46.65
Df Model:                           1                                         
Covariance Type:            nonrobust                                         
==============================================================================
                 coef    std err          t      P>|t|      [95.0% Conf. Int.]
------------------------------------------------------------------------------
x1          5.428e-07   1.97e-08     27.576      0.000      4.98e-07  5.87e-07
x2                  0          0        nan        nan             0         0
x3          -2.15e-19   1.32e-19     -1.625      0.139     -5.14e-19  8.43e-20
const       5.043e-16   1.83e-17     27.576      0.000      4.63e-16  5.46e-16
==============================================================================
Omnibus:                        0.098   Durbin-Watson:                   2.565
Prob(Omnibus):                  0.952   Jarque-Bera (JB):                0.319
Skew:                           0.063   Prob(JB):                        0.853
Kurtosis:                       2.176   Cond. No.                     1.88e+36
==============================================================================

Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
[2] The smallest eigenvalue is 2.41e-34. This might indicate that there are
strong multicollinearity problems or that the design matrix is singular.
('Parameters: ', x1       5.428093e-07
x2       0.000000e+00
x3      -2.150499e-19
const    5.042537e-16
dtype: float64)
('Standard errors: ', x1       1.968442e-08
x2       0.000000e+00
x3       1.323469e-19
const    1.828624e-17
dtype: float64)
('Predicted values: ', array([  0.32775942,   7.41157894,  12.8396724 ,  18.26776585,
        23.69585931,  29.12395277,  34.55204622,  39.98013968,
        45.40823314,  50.83632659,  56.26442005]))
++++++++++++++++++++++++++++++++++++++++
Independent variable(s):  TIME ; Dependent Variable:  C3_Energy
                            OLS Regression Results                            
==============================================================================
Dep. Variable:              C3_Energy   R-squared:                       0.992
Model:                            OLS   Adj. R-squared:                  0.991
Method:                 Least Squares   F-statistic:                     1131.
Date:                Fri, 24 Apr 2015   Prob (F-statistic):           8.97e-11
Time:                        18:42:23   Log-Likelihood:                -20.485
No. Observations:                  11   AIC:                             44.97
Df Residuals:                       9   BIC:                             45.77
Df Model:                           1                                         
Covariance Type:            nonrobust                                         
==============================================================================
                 coef    std err          t      P>|t|      [95.0% Conf. Int.]
------------------------------------------------------------------------------
x1          5.824e-05   2.17e-06     26.861      0.000      5.33e-05  6.31e-05
x2         -3.164e-28   1.18e-29    -26.861      0.000     -3.43e-28  -2.9e-28
x3         -3.209e-18    2.9e-18     -1.108      0.297     -9.76e-18  3.34e-18
const       1.549e-10   5.77e-12     26.861      0.000      1.42e-10  1.68e-10
==============================================================================
Omnibus:                        0.763   Durbin-Watson:                   1.854
Prob(Omnibus):                  0.683   Jarque-Bera (JB):                0.693
Skew:                           0.426   Prob(JB):                        0.707
Kurtosis:                       2.114   Cond. No.                          inf
==============================================================================

Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
[2] The smallest eigenvalue is      0. This might indicate that there are
strong multicollinearity problems or that the design matrix is singular.
('Parameters: ', x1       5.823562e-05
x2      -3.164304e-28
x3      -3.208982e-18
const    1.548981e-10
dtype: float64)
('Standard errors: ', x1       2.168044e-06
x2       1.178034e-29
x3       2.896198e-18
const    5.766676e-12
dtype: float64)
('Predicted values: ', array([  0.65415613,   6.46086895,  11.62093079,  17.69162985,
        22.83701226,  29.42147565,  34.66928621,  41.1743616 ,
        44.84735919,  51.49673047,  55.96319905]))
++++++++++++++++++++++++++++++++++++++++
Independent variable(s):  PAPI_L2_DCM ; Dependent Variable:  C3_Energy
                            OLS Regression Results                            
==============================================================================
Dep. Variable:              C3_Energy   R-squared:                       0.954
Model:                            OLS   Adj. R-squared:                  0.949
Method:                 Least Squares   F-statistic:                     188.2
Date:                Fri, 24 Apr 2015   Prob (F-statistic):           2.44e-07
Time:                        18:42:24   Log-Likelihood:                -30.133
No. Observations:                  11   AIC:                             64.27
Df Residuals:                       9   BIC:                             65.06
Df Model:                           1                                         
Covariance Type:            nonrobust                                         
==============================================================================
                 coef    std err          t      P>|t|      [95.0% Conf. Int.]
------------------------------------------------------------------------------
x1           7.37e-05   7.34e-06     10.036      0.000      5.71e-05  9.03e-05
x2          1.538e-11   1.53e-12     10.036      0.000      1.19e-11  1.89e-11
x3          3.943e-18   1.93e-17      0.205      0.842     -3.96e-17  4.75e-17
const       2.711e-10    2.7e-11     10.036      0.000       2.1e-10  3.32e-10
==============================================================================
Omnibus:                        0.840   Durbin-Watson:                   3.117
Prob(Omnibus):                  0.657   Jarque-Bera (JB):                0.626
Skew:                          -0.502   Prob(JB):                        0.731
Kurtosis:                       2.403   Cond. No.                     4.42e+17
==============================================================================

Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
[2] The condition number is large, 4.42e+17. This might indicate that there are
strong multicollinearity or other numerical problems.
('Parameters: ', x1       7.369783e-05
x2       1.538366e-11
x3       3.942814e-18
const    2.711171e-10
dtype: float64)
('Standard errors: ', x1       7.343471e-06
x2       1.532874e-12
x3       1.925498e-17
const    2.701491e-11
dtype: float64)
('Predicted values: ', array([  0.86462936,   7.19539338,  13.86212164,  17.00419712,
        27.63642057,  26.81930399,  36.09797685,  36.7831907 ,
        49.66145287,  48.56810269,  54.25949495]))
++++++++++++++++++++++++++++++++++++++++
Independent variable(s):  PAPI_L1_DCM ; Dependent Variable:  C3_Energy
                            OLS Regression Results                            
==============================================================================
Dep. Variable:              C3_Energy   R-squared:                       0.364
Model:                            OLS   Adj. R-squared:                  0.364
Method:                 Least Squares   F-statistic:                       inf
Date:                Fri, 24 Apr 2015   Prob (F-statistic):                nan
Time:                        18:42:24   Log-Likelihood:                -44.623
No. Observations:                  11   AIC:                             91.25
Df Residuals:                      10   BIC:                             91.64
Df Model:                           0                                         
Covariance Type:            nonrobust                                         
==============================================================================
                 coef    std err          t      P>|t|      [95.0% Conf. Int.]
------------------------------------------------------------------------------
x1           1.24e-36   1.78e-37      6.970      0.000      8.44e-37  1.64e-36
x2                  0          0        nan        nan             0         0
x3          1.368e-21   1.96e-22      6.970      0.000      9.31e-22  1.81e-21
const       3.939e-44   5.65e-45      6.970      0.000      2.68e-44   5.2e-44
==============================================================================
Omnibus:                        2.810   Durbin-Watson:                   0.235
Prob(Omnibus):                  0.245   Jarque-Bera (JB):                1.474
Skew:                          -0.889   Prob(JB):                        0.479
Kurtosis:                       2.770   Cond. No.                     9.11e+39
==============================================================================

Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
[2] The smallest eigenvalue is 6.73e-35. This might indicate that there are
strong multicollinearity problems or that the design matrix is singular.
('Parameters: ', x1       1.239924e-36
x2       0.000000e+00
x3       1.367887e-21
const    3.939457e-44
dtype: float64)
('Standard errors: ', x1       1.778872e-37
x2       0.000000e+00
x3       1.962456e-22
const    5.651790e-45
dtype: float64)
('Predicted values: ', array([  7.28502991e-05,   7.26157564e-02,   5.81365065e-01,
         1.97372037e+00,   4.70956869e+00,   9.08916194e+00,
         1.57897630e+01,   2.48838480e+01,   3.73035171e+01,
         5.28202429e+01,   7.27132955e+01]))
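The fitted results object can also be evaluated at new inputs by rebuilding the same design-matrix columns for those points; a minimal sketch (x_hat and the 50-point grid are assumptions introduced here, and f refers to whichever fit the loop finished with):

# Sketch: evaluate the last OLS fit on evenly spaced points by reproducing the
# training design-matrix layout: x, sin(x), x^3, constant.
x_hat = np.linspace(summary[indepvar].min(), summary[indepvar].max(), 50)
X_hat = np.column_stack((x_hat, np.sin(x_hat), x_hat**3, np.ones(len(x_hat))))
y_hat = f.predict(X_hat)
print y_hat[:5]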
In [177]:
# Using numpy's polynomial fits (Chebyshev and Legendre bases)

depvar = 'C3_Energy'
indepcols = ["Problem_size", "TIME", "PAPI_L2_DCM", "PAPI_L1_DCM"]
indepvars = summary[indepcols]

for indepvar in indepcols:
        print '+'*40 + '\n' + "Independent variable(s): ", indepvar, "; Dependent Variable: ", depvar

        nsample = summary[indepvar].shape[0]
        x = summary[indepvar]
        y = summary[depvar]

        # Evenly spaced inputs at which to evaluate the fitted models
        x_hat = np.linspace(summary[indepvar].min(), summary[indepvar].max(), nsample)
        #model = np.poly1d(np.polyfit(x, y, 3))
        #f = model(x_hat)   # for poly1d only

        # Polynomial degree chosen per variable by visual inspection
        if indepvar in ['Problem_size', 'TIME']: degree = 4
        elif indepvar == 'PAPI_L1_DCM': degree = 8
        else: degree = 6   # PAPI_L2_DCM

        model_cheb = np.polynomial.chebyshev.chebfit(x, y, degree)
        print 'Chebyshev:', model_cheb
        f_cheb = np.polynomial.chebyshev.chebval(x_hat, model_cheb)
        print f_cheb

        model_leg = np.polynomial.legendre.legfit(x, y, degree)
        print 'Legendre:', model_leg
        f_legendre = np.polynomial.legendre.legval(x_hat, model_leg)
        print f_legendre

        fig, ax = plt.subplots(figsize=(8,6))

        ax.plot(summary[indepvar], y, 'o', label="Data")
        ax.plot(x_hat, f_cheb, 'r:', label="Chebyshev(%d)" % degree)
        ax.plot(x_hat, f_legendre, 'g--.', label="Legendre(%d)" % degree)

        ax.legend(loc='best')
        plt.xlabel(indepvar)
        plt.ylabel(depvar)

        # Save before plt.show(): showing clears the current figure, so saving
        # afterwards would write an empty PDF.
        plt.savefig('figures/%s_%s.pdf' % (depvar, indepvar), bbox_inches='tight')
        plt.show()
        #models.append((model, x, y, f))
++++++++++++++++++++++++++++++++++++++++
Independent variable(s):  Problem_size ; Dependent Variable:  C3_Energy
Chebyshev: [ -7.17353896e-01   8.13405510e-07  -3.57199442e-15   1.62203039e-23
  -2.12727500e-32]
[  0.08897234   7.38160965  13.66271152  19.24718854  24.41071703
  29.38973905  34.38146237  39.54386048  44.99567261  50.81640369
  57.04632441]
Legendre: [ -7.17353896e-01   8.13405510e-07  -4.76265923e-15   2.59524863e-23
  -3.88987428e-32]
[  0.08897234   7.38160965  13.66271152  19.24718854  24.41071703
  29.38973905  34.38146237  39.54386048  44.99567261  50.81640369
  57.04632441]
++++++++++++++++++++++++++++++++++++++++
Independent variable(s):  TIME ; Dependent Variable:  C3_Energy
Chebyshev: [ -9.78828530e-01   8.09211533e-05  -3.40495335e-11   1.43078381e-17
  -1.66931313e-24]
[ -0.07835307   7.31138445  13.68476877  19.34026628  24.54324024
  29.52595079  34.48755494  39.59410658  44.97855645  50.7407522
  56.94743833]
Legendre: [ -9.78828530e-01   8.09211533e-05  -4.53993780e-11   2.28925410e-17
  -3.05245830e-24]
[ -0.07835307   7.31138445  13.68476877  19.34026628  24.54324024
  29.52595079  34.48755494  39.59410658  44.97855645  50.7407522
  56.94743833]
++++++++++++++++++++++++++++++++++++++++
Independent variable(s):  PAPI_L2_DCM ; Dependent Variable:  C3_Energy
Chebyshev: [  2.39745076e+00  -2.42313297e-04   2.50416832e-09  -7.29891904e-15
   9.70820650e-21  -5.98204826e-27   1.38608459e-33]
[  0.19828303   3.30656824  13.06659516  18.39845288  20.78186388
  24.71765849  32.10457073  40.53135528  45.48422593  46.4696151
  57.05225488]
Legendre: [  2.39745076e+00  -2.42313297e-04   3.33889110e-09  -1.16782705e-14
   1.77521490e-20  -1.21540028e-26   3.07218750e-33]
[  0.19828303   3.30656824  13.06659516  18.39845288  20.78186388
  24.71765849  32.10457073  40.53135528  45.48422593  46.4696151
  57.05225488]
++++++++++++++++++++++++++++++++++++++++
Independent variable(s):  PAPI_L1_DCM ; Dependent Variable:  C3_Energy
Chebyshev: [ -1.90236902e+00   6.02807376e-06  -1.18521945e-12   1.36622747e-19
  -7.79583874e-27   2.38384433e-34  -3.98331288e-42   3.42700824e-50
  -1.18707495e-58]
[  0.05794884   6.85985205  14.4924729   20.3355074   22.69569203
  27.49147568  36.01592148  41.91986611  42.97492168  50.59346186
  57.50029116]
Legendre: [ -1.90236902e+00   6.02807376e-06  -1.58029260e-12   2.18596396e-19
  -1.42552480e-26   4.84336625e-34  -8.82881469e-42   8.18008493e-50
  -3.02238322e-58]
[  0.05794884   6.85985205  14.4924729   20.3355074   22.69569203
  27.49147568  36.01592148  41.91986611  42.97492168  50.59346186
  57.50029116]
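chebfit and legfit return only coefficients, so there is no built-in R² to compare with the OLS fits above; it can be computed directly from the residuals. A minimal sketch for the last Chebyshev fit (it reuses whatever x, y, and model_cheb the final loop iteration left behind):

# Sketch: coefficient of determination for the (last) Chebyshev fit,
# computed as 1 - SS_res / SS_tot over the original data points.
y_fit = np.polynomial.chebyshev.chebval(x, model_cheb)
ss_res = np.sum((y - y_fit)**2)
ss_tot = np.sum((y - np.mean(y))**2)
print 'Chebyshev R^2:', 1.0 - ss_res / ss_tot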