# svm_visualization_3d.py
## ---- Visualizing the SVM training process in Python: requires three-dimensional data and a 3D graphics display
# To visualize the process of training an SVM on **three-dimensional data**, you can extend the 2D case to 3D using `matplotlib`'s 3D plotting capabilities. Here's how:
# ### Steps:
# 1. **Generate or load a 3D dataset**: You can use `make_classification` from `sklearn.datasets` or any other dataset with three features.
# 2. **Train an SVM on this 3D data**.
# 3. **Visualize**: Plot the data and decision boundary in 3D.
# ### Example with 3D data:
# ```python
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # registers the '3d' projection (needed on older matplotlib versions)
from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
# Generate a 3D dataset
X, y = make_classification(n_samples=100, n_features=3, n_informative=3, n_redundant=0, n_classes=2, random_state=42)
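# With these arguments, X has shape (100, 3) and y has shape (100,) with 0/1 labels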
# Split the dataset
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)
# Standardize the data
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)
# Train the SVM with a linear kernel
model = SVC(kernel='linear')
model.fit(X_train, y_train)
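# Quick sanity check (an addition to the original flow): the held-out split is
# otherwise unused, so report accuracy on it
print(f"Test accuracy: {model.score(X_test, y_test):.2f}")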
# Visualization in 3D
def plot_svm_decision_boundary_3d(model, X, y):
    fig = plt.figure(figsize=(10, 8))
    ax = fig.add_subplot(111, projection='3d')
    # Plot the training points
    scatter = ax.scatter(X[:, 0], X[:, 1], X[:, 2], c=y, s=30, cmap=plt.cm.coolwarm)
    # Create a grid over the (y, z) extent of the data to evaluate the model
    ylim = ax.get_ylim()
    zlim = ax.get_zlim()
    yy = np.linspace(ylim[0], ylim[1], 20)
    zz = np.linspace(zlim[0], zlim[1], 20)
    YY, ZZ = np.meshgrid(yy, zz)
    # For a linear kernel the decision boundary is the plane
    # w[0]*x + w[1]*y + w[2]*z + b = 0; solve it for x on the (y, z) grid
    w = model.coef_[0]
    b = model.intercept_[0]
    XX = -(w[1] * YY + w[2] * ZZ + b) / w[0]
    # Plot the decision surface
    ax.plot_surface(XX, YY, ZZ, color='gray', alpha=0.3)
    # Highlight support vectors
    ax.scatter(model.support_vectors_[:, 0], model.support_vectors_[:, 1], model.support_vectors_[:, 2],
               s=100, facecolors='none', edgecolors='k', linewidth=1.5, label='Support Vectors')
    ax.set_title('SVM Decision Boundary in 3D')
    ax.set_xlabel('Feature 1')
    ax.set_ylabel('Feature 2')
    ax.set_zlabel('Feature 3')
    # Add a color legend for the two classes
    legend1 = ax.legend(*scatter.legend_elements(), loc="best", title="Classes")
    ax.add_artist(legend1)
    plt.show()
# Plot the decision boundary in 3D
plot_svm_decision_boundary_3d(model, X_train, y_train)
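# Optional extension (not in the original walkthrough): the margin planes satisfy
# w.x + b = +1 and w.x + b = -1, so inside the function they could be drawn the
# same way by solving for x on the (y, z) grid, e.g.
#   XX_up  = -(w[1] * YY + w[2] * ZZ + (b - 1)) / w[0]   # w.x + b = +1
#   XX_low = -(w[1] * YY + w[2] * ZZ + (b + 1)) / w[0]   # w.x + b = -1
# and passing each to ax.plot_surface with a lighter alpha.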
# ```
# ### Key Concepts:
# 1. **3D Data**: We use `make_classification` to generate a dataset with 3 features.
# 2. **3D Visualization**: `Axes3D` from `matplotlib` is used to plot the data points and the decision surface.
# 3. **Decision Surface**: With a linear kernel the decision boundary is a plane, plotted here as a surface separating the two classes.
# This visualization can help you understand the decision boundaries in a higher-dimensional space. Let me know if you'd like further modifications or explanations!
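# ### Going further: non-linear kernels
# The plane-solving trick above only works for `kernel='linear'`. For a non-linear
# kernel such as RBF, one hedged approach (a sketch, not part of the original
# example; the 30-point grid and the 0.05 threshold are arbitrary choices) is to
# sample `decision_function` on a 3D grid and scatter the points near its zero
# level set, which approximates the curved decision surface:
# ```python
clf = SVC(kernel='rbf', gamma='scale')
clf.fit(X_train, y_train)
# Evaluate the decision function on a coarse grid spanning the (standardized) data
grid = np.linspace(X_train.min(), X_train.max(), 30)
GX, GY, GZ = np.meshgrid(grid, grid, grid)
pts = np.c_[GX.ravel(), GY.ravel(), GZ.ravel()]
vals = clf.decision_function(pts)
# Keep grid points close to the zero level set (widen 0.05 if too few points survive)
near_boundary = pts[np.abs(vals) < 0.05]
fig = plt.figure(figsize=(10, 8))
ax = fig.add_subplot(111, projection='3d')
ax.scatter(X_train[:, 0], X_train[:, 1], X_train[:, 2], c=y_train, s=30, cmap=plt.cm.coolwarm)
ax.scatter(near_boundary[:, 0], near_boundary[:, 1], near_boundary[:, 2], s=5, c='gray', alpha=0.3)
ax.set_title('Approximate RBF decision surface (|decision_function| < 0.05)')
plt.show()
# ```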