1- """
2- Multilayer Perceptron (MLP) Classifier
3-
4- A Multilayer Perceptron (MLP) is a type of feedforward artificial neural network
5- that consists of at least three layers of nodes: an input layer, one or more hidden
6- layers, and an output layer. Each node (except for the input nodes) is a neuron
7- that uses a nonlinear activation function.
8-
9- Mathematical Concept:
10- ---------------------
11- MLPs learn a function f(·): R^m → R^o by training on a dataset, where m is the
12- number of input features and o is the number of output classes. The network
13- adjusts its weights using backpropagation to minimize the difference between
14- predicted and actual outputs.
15-
16- Practical Use Cases:
17- --------------------
18- - Handwritten digit recognition (e.g., MNIST dataset)
19- - Binary and multiclass classification tasks
20- - Predicting outcomes based on multiple features
21- (e.g., medical diagnosis, spam detection)
22-
23- Advantages:
24- -----------
25- - Can learn non-linear decision boundaries
26- - Works well with complex pattern recognition
27- - Flexible architecture for various problem types

Limitations:
------------
- Requires careful tuning of hyperparameters
- Sensitive to feature scaling
- Can overfit on small datasets

Time Complexity: O(n_samples * n_features * n_layers * n_epochs)
Space Complexity: O(n_features * n_hidden_units + n_hidden_units * n_classes)

References:
-----------
- https://en.wikipedia.org/wiki/Multilayer_perceptron
- https://scikit-learn.org/stable/modules/neural_networks_supervised.html
- https://medium.com/@aryanrusia8/multi-layer-perceptrons-explained-7cb9a6e318c3

Example:
--------
>>> X = [[0, 0], [1, 1], [0, 1], [1, 0]]
>>> y = [0, 0, 1, 1]
>>> result = multilayer_perceptron_classifier(X, y, [[0, 0], [1, 1]])
>>> result in [[0, 0], [0, 1], [1, 0], [1, 1]]
True
"""

from collections.abc import Sequence

from sklearn.neural_network import MLPClassifier


def multilayer_perceptron_classifier(
    train_features: Sequence[Sequence[float]],
    train_labels: Sequence[int],
    test_features: Sequence[Sequence[float]],
) -> list[int]:
    """
    Train a Multilayer Perceptron classifier and predict labels for test data.

    Args:
        train_features: Training data features, shape (n_samples, n_features).
        train_labels: Training data labels, shape (n_samples,).
        test_features: Test data features to predict, shape (m_samples, n_features).

    Returns:
        List of predicted labels for the test data.

    Raises:
        ValueError: If the number of training samples and labels do not match.

    Example:
        >>> X = [[0, 0], [1, 1], [0, 1], [1, 0]]
        >>> y = [0, 0, 1, 1]
        >>> result = multilayer_perceptron_classifier(X, y, [[0, 0], [1, 1]])
        >>> result in [[0, 0], [0, 1], [1, 0], [1, 1]]
        True
    """
    if len(train_features) != len(train_labels):
        raise ValueError("Number of training samples and labels must match.")

    clf = MLPClassifier(
        solver="lbfgs",
        alpha=1e-5,
        hidden_layer_sizes=(5, 2),
        random_state=42,  # Fixed for deterministic results
        max_iter=1000,  # Ensure convergence
    )
    clf.fit(train_features, train_labels)
    predictions = clf.predict(test_features)
    # Convert numpy scalars to plain ints so the return value matches list[int].
    return [int(p) for p in predictions]
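# Illustration of the forward pass described in the module docstring: each
# hidden neuron computes a weighted sum of its inputs passed through a
# nonlinear activation, and the output layer maps the hidden activations to a
# class score. This is a minimal sketch with hypothetical fixed weights, not
# part of the classifier above; tanh and the logistic sigmoid stand in for
# whatever activations a trained MLPClassifier would actually use.
import math


def forward_pass_sketch(
    features: list[float],
    hidden_weights: list[list[float]],
    output_weights: list[float],
) -> float:
    """Return a (0, 1) class score for one sample under a one-hidden-layer MLP.

    >>> round(forward_pass_sketch([1.0, 0.0], [[1.0, -1.0], [-1.0, 1.0]], [1.0, 1.0]), 3)
    0.5
    """
    # Hidden layer: one tanh unit per row of hidden_weights.
    hidden = [
        math.tanh(sum(w * x for w, x in zip(row, features))) for row in hidden_weights
    ]
    # Output layer: weighted sum squashed into (0, 1) by the logistic sigmoid.
    score = sum(w * h for w, h in zip(output_weights, hidden))
    return 1.0 / (1.0 + math.exp(-score))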


if __name__ == "__main__":
    import doctest

    doctest.testmod()