Plotting more than 2 features for a Decision Tree Classifier using matplotlib (Python)
The Dataset
I've been experimenting with the Pima Indians Diabetes dataset, classifying it with a Decision Tree Classifier. Now that I have my results, the obvious next step is to visualize them.
Here's the head of the dataset:
   TimesPregnant  GlucoseConcentration  BloodPrs  SkinThickness  Serum   BMI  DiabetesFunct  Age  Class
0              6                   148        72             35      0  33.6          0.627   50      1
1              1                    85        66             29      0  26.6          0.351   31      0
2              8                   183        64              0      0  23.3          0.672   32      1
3              1                    89        66             23     94  28.1          0.167   21      0
4              0                   137        40             35    168  43.1          2.288   33      1
Plotting more than 2 features?
Here's the code I've assembled from references and tutorials around the web. Apparently it doesn't work for more than 2 features. Note that every column except the last one is a feature.
The code
# Visualising the Training set results
from matplotlib.colors import ListedColormap

X_set, y_set = X_train, y_train
X1, X2 = np.meshgrid(np.arange(start=X_set[:, 0].min() - 1, stop=X_set[:, 0].max() + 1, step=0.01),
                     np.arange(start=X_set[:, 1].min() - 1, stop=X_set[:, 1].max() + 1, step=0.01))
plt.contourf(X1, X2, classifier.predict(np.array([X1.ravel(), X2.ravel()]).T).reshape(X1.shape),
             alpha=0.75, cmap=ListedColormap(('red', 'green')))
plt.xlim(X1.min(), X1.max())
plt.ylim(X2.min(), X2.max())
for i, j in enumerate(np.unique(y_set)):
    plt.scatter(X_set[y_set == j, 0], X_set[y_set == j, 1],
                c=ListedColormap(('red', 'green'))(i), label=j)
plt.title('Decision Tree (Train set)')
plt.xlabel('Age')
plt.ylabel('Estimated Salary')
plt.legend()
plt.show()
You may notice that X1 and X2 come from meshgrid, so as to fill in the space I'm using for coloring; however, you're free to ignore that if the solution you propose covers plotting more than 2 features, as long as it uses matplotlib.
Now, I can't make 8 X's for 8 features here, so I'm looking for an efficient way to do the same.
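One common workaround for the 2D-mesh limitation is to plot a boundary over two chosen features while holding the remaining features fixed at their training-set means, so the classifier still receives all 8 columns. The sketch below demonstrates this on synthetic stand-in data (the classifier, column indices, and data here are hypothetical, not the question's actual Pima setup):

```python
# Sketch: decision boundary over two features, remaining features held at
# their training means so predict() still sees the full feature vector.
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for scripted use
import matplotlib.pyplot as plt
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 8))                # 8 features, like the Pima data
y_train = (X_train[:, 1] + X_train[:, 6] > 0).astype(int)
classifier = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)

f1, f2 = 1, 6                                      # the two features to plot
X1, X2 = np.meshgrid(
    np.linspace(X_train[:, f1].min() - 1, X_train[:, f1].max() + 1, 200),
    np.linspace(X_train[:, f2].min() - 1, X_train[:, f2].max() + 1, 200))

# Full 8-column grid: training means everywhere, mesh values in f1/f2.
grid = np.tile(X_train.mean(axis=0), (X1.size, 1))
grid[:, f1] = X1.ravel()
grid[:, f2] = X2.ravel()
Z = classifier.predict(grid).reshape(X1.shape)

plt.contourf(X1, X2, Z, alpha=0.75)
plt.scatter(X_train[:, f1], X_train[:, f2], c=y_train, edgecolor='k', s=15)
plt.xlabel('feature %d' % f1)
plt.ylabel('feature %d' % f2)
plt.savefig('boundary.png')
```

The boundary drawn this way is a 2D slice of the 8D decision surface, not the whole surface; with a different choice of fixed values you would get a different slice.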
Tags: python, numpy, matplotlib
asked Nov 22 '18 at 19:27 by T3J45
1 Answer
Here's how you can do it:
from itertools import product
from matplotlib import pyplot as plt
import numpy as np
import scipy.stats as sts

features = [np.linspace(0, 5),
            np.linspace(9, 14),
            np.linspace(6, 11),
            np.linspace(3, 8)]
labels = ['height', 'weight', 'bmi', 'age']

n = len(features)
fig, axarr = plt.subplots(n, n, figsize=(4*n, 4*n))
fig.subplots_adjust(0, 0, 1, 1, 0, 0)
for (x, y), ax in zip(product(features, features), axarr.T.flat):
    X, Y = np.meshgrid(x, y)
    # get some fake data for demo purposes
    mnorm = sts.multivariate_normal([x.mean()**(7/10), y.mean()**(11/10)])
    Z = mnorm.pdf(np.stack([X, Y], 2))
    ax.contourf(X, Y, Z)
    # label and style the plot
    # ...in progress
Output: (image: a grid of contour plots, one panel per feature pair)
That's cool, however where can I put my line of ML model, classifier.predict? – T3J45, Nov 23 '18 at 3:06
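One plausible way to plug a fitted classifier into the pairwise grid above (this is a hedged sketch, not the answerer's code: the data, model, and the mean-filling choice are assumptions) is to fill the non-plotted columns with their training means, exactly as in the single-pair case, once per subplot:

```python
# Sketch: pairwise grid of decision-boundary slices for a fitted classifier.
# For each feature pair (i, j), the other columns are fixed at their means.
from itertools import combinations
import numpy as np
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X_train = rng.normal(size=(150, 4))                # 4 features for a small demo
y_train = (X_train[:, 0] - X_train[:, 2] > 0).astype(int)
classifier = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)

n = X_train.shape[1]
means = X_train.mean(axis=0)
fig, axarr = plt.subplots(n, n, figsize=(3 * n, 3 * n))

for i, j in combinations(range(n), 2):  # lower triangle; use product() for all cells
    xs = np.linspace(X_train[:, i].min(), X_train[:, i].max(), 100)
    ys = np.linspace(X_train[:, j].min(), X_train[:, j].max(), 100)
    Xi, Xj = np.meshgrid(xs, ys)
    grid = np.tile(means, (Xi.size, 1))            # full feature vectors
    grid[:, i] = Xi.ravel()
    grid[:, j] = Xj.ravel()
    Z = classifier.predict(grid).reshape(Xi.shape)
    axarr[j, i].contourf(Xi, Xj, Z, alpha=0.6)
    axarr[j, i].scatter(X_train[:, i], X_train[:, j], c=y_train, s=8)

fig.savefig('pairwise_boundaries.png')
```

Each panel is a slice of the full decision surface at the mean of the hidden features, so panels can look inconsistent with each other; that is inherent to projecting an 8D boundary into 2D views.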
Also, I've seen similar plots with seaborn. Is it possible to do this with seaborn as well? – T3J45, Nov 23 '18 at 3:07
If you could respond regarding the classifier, that would be great. I'm unable to figure out where to put my classifier here. – T3J45, Nov 23 '18 at 15:33
answered Nov 22 '18 at 21:30 by tel