Commit 23a0273

Update documentation

1 parent 5fcc08e · commit 23a0273

2 files changed: 4 additions & 4 deletions

docs/user/features.rst (2 additions, 2 deletions)

@@ -54,8 +54,8 @@ Ability to directly process missing data
 Learning algorithms for the general fuzzy min-max neural network supported by the library may classify inputs with missing
 data directly without the need for replacing or imputing missing values as in other classifiers.
 
-Continual learning of new classes in an incremental manner
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+Continual learning ability of new classes
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 Incremental learning algorithms of hyperbox-based models in the **hyperbox-brain** library can grow and accommodate new
 classes of data without retraining the whole classifier. Incremental learning algorithms themselves can generate new hyperboxes
 to represent clusters of new data with potentially new class labels both in the middle of normal training procedure and in the
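The incremental-learning behaviour this hunk describes (hyperboxes expand to absorb compatible samples, and a sample from an unseen class simply spawns a new hyperbox, with no retraining of existing ones) can be sketched as a toy fuzzy min-max classifier. This is a simplified illustration only, not hyperbox-brain's actual API; the class name and the `theta`/`gamma` parameters are assumptions for this sketch:

```python
import numpy as np

class SimpleFuzzyMinMax:
    """Toy online fuzzy min-max classifier: each hyperbox is a (min, max)
    pair with a class label; learning is a single pass over samples."""

    def __init__(self, theta=0.3, gamma=4.0):
        self.theta = theta  # maximum allowed hyperbox size per dimension
        self.gamma = gamma  # membership sensitivity outside the box
        self.V, self.W, self.labels = [], [], []  # min points, max points, classes

    def _membership(self, x, v, w):
        # degree to which x falls inside or near the box [v, w], in [0, 1]
        d = np.maximum(0.0, np.maximum(v - x, x - w))
        return float(np.mean(np.maximum(0.0, 1.0 - self.gamma * d)))

    def partial_fit_one(self, x, y):
        x = np.asarray(x, dtype=float)
        # try to expand an existing hyperbox of the same class
        for i, (v, w, c) in enumerate(zip(self.V, self.W, self.labels)):
            if c != y:
                continue
            nv, nw = np.minimum(v, x), np.maximum(w, x)
            if np.all(nw - nv <= self.theta):  # expansion test
                self.V[i], self.W[i] = nv, nw
                return
        # otherwise create a new point-sized hyperbox -- this is how a
        # brand-new class is absorbed without touching existing boxes
        self.V.append(x.copy()); self.W.append(x.copy()); self.labels.append(y)

    def predict_one(self, x):
        x = np.asarray(x, dtype=float)
        scores = [self._membership(x, v, w) for v, w in zip(self.V, self.W)]
        return self.labels[int(np.argmax(scores))]
```

For instance, after fitting on classes 0 and 1, feeding `partial_fit_one([0.5, 0.9], 2)` adds a hyperbox for the previously unseen class 2, and subsequent predictions can return it, while the class-0 and class-1 boxes stay untouched.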

index.html (2 additions, 2 deletions)

@@ -139,7 +139,7 @@
 <div class="keyfeatures-box-content keyfeatures-underline">
 <p></p>
 <div class="keyfeatures-box-title">Learning from both labelled and unlabelled data</div>
-<div class="keyfeatures-box-text">One of the exciting features of learning algorithms for the general fuzzy min-max neural network(GFMMNN) is the capability of creating classification boundaries among known classes and clustering data and representing them as hyperboxes in the case that labels are not available. Unlabelled hyperboxes is then possible to be labelled on the basis of the evidence of next incoming input samples. As a result, the GFMMNN models have the ability to learn from the mixed labelled and unlabelled datasets in a native way.
+<div class="keyfeatures-box-text">One of the exciting features of learning algorithms for the general fuzzy min-max neural network (GFMMNN) is the capability of creating classification boundaries among known classes and clustering data and representing them as hyperboxes in the case that labels are not available. Unlabelled hyperboxes is then possible to be labelled on the basis of the evidence of next incoming input samples. As a result, the GFMMNN models have the ability to learn from the mixed labelled and unlabelled datasets in a native way.
 </div>
 <p></p>
 </div>

@@ -153,7 +153,7 @@
 <div class="keyfeatures-box-content keyfeatures-underline">
 <p></p>
 <div class="keyfeatures-box-title">Continual learning of new classes in an incremental manner</div>
-<div class="keyfeatures-box-text">Incremental learning algorithms of hyperbox-based models in the **hyperbox-brain** library can grow and accommodate new classes of data without retraining the whole classifier. Incremental learning algorithms themselves can generate new hyperboxes to represent clusters of new data with potentially new class labels both in the middle of normal training procedure and in the operating time where training has been finished. This property is a key feature for smart life-long learning systems.
+<div class="keyfeatures-box-text">Incremental learning algorithms of hyperbox-based models in the hyperbox-brain library can grow and accommodate new classes of data without retraining the whole classifier. Incremental learning algorithms themselves can generate new hyperboxes to represent clusters of new data with potentially new class labels both in the middle of normal training procedure and in the operating time where training has been finished. This property is a key feature for smart life-long learning systems.
 </div>
 <p></p>
 </div>
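The first hunk's claim that "unlabelled hyperboxes is then possible to be labelled on the basis of the evidence of next incoming input samples" can be illustrated with a minimal sketch: a hyperbox built from unlabelled data carries label `None` until a labelled sample lands inside it. Again a toy illustration under assumed names, not hyperbox-brain's actual implementation:

```python
import numpy as np

def inside(x, v, w):
    """True if point x lies within the hyperbox [v, w]."""
    return bool(np.all(x >= v) and np.all(x <= w))

# (min point, max point, label or None); the first box was formed by
# clustering unlabelled data, so it has no class label yet
boxes = [
    [np.array([0.0, 0.0]), np.array([0.2, 0.2]), None],
    [np.array([0.7, 0.7]), np.array([0.9, 0.9]), 1],
]

def absorb_labelled(x, y):
    """Label the first still-unlabelled box that contains the new sample."""
    x = np.asarray(x, dtype=float)
    for box in boxes:
        v, w, label = box
        if label is None and inside(x, v, w):
            box[2] = y  # evidence from the incoming sample labels the box
            return
```

So a later labelled sample such as `absorb_labelled([0.1, 0.1], 0)` turns the unlabelled cluster into a class-0 hyperbox, which is the sense in which the GFMMNN models handle mixed labelled and unlabelled data natively.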
