
Commit 9d64fbc

Merge pull request #1092 from NA-V10/add-unsupervised-readme
Add README and comments for MNIST autoencoder (unsupervised example)
2 parents f39b9ac + 274189e commit 9d64fbc

File tree

5 files changed: +156 −1 lines changed


dl4j-examples/src/main/java/org/deeplearning4j/examples/quickstart/modeling/feedforward/classification/MNISTSingleLayer.java

Lines changed: 9 additions & 0 deletions
@@ -17,6 +17,11 @@
  * SPDX-License-Identifier: Apache-2.0
  ******************************************************************************/
 
+
+// MNISTSingleLayer.java
+// Simple single-hidden-layer MLP for MNIST digit classification.
+// Demonstrates basic feedforward networks in DL4J.
+
 package org.deeplearning4j.examples.quickstart.modeling.feedforward.classification;
 
 import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator;
@@ -72,6 +77,9 @@ public static void main(String[] args) throws Exception {
 
 
 log.info("Build model....");
+
+// Build a single-hidden-layer MLP for MNIST (28x28 images flattened to 784 inputs)
+
 MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
 .seed(rngSeed) //include a random seed for reproducibility
 // use stochastic gradient descent as an optimization algorithm
@@ -100,6 +108,7 @@ public static void main(String[] args) throws Exception {
 log.info("Train model....");
 model.fit(mnistTrain, numEpochs);
 
+// Evaluate the model on the MNIST test dataset
 
 log.info("Evaluate model....");
 Evaluation eval = model.evaluate(mnistTest);
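
For reference, here is a minimal, self-contained sketch of the kind of single-hidden-layer setup the new comments describe, assuming a recent DL4J version. The class name, layer sizes, updater, and epoch count are illustrative assumptions, not the exact values in MNISTSingleLayer.java.

```java
import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator;
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.weights.WeightInit;
import org.nd4j.evaluation.classification.Evaluation;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.dataset.api.iterator.DataSetIterator;
import org.nd4j.linalg.learning.config.Nesterovs;
import org.nd4j.linalg.lossfunctions.LossFunctions;

// Hypothetical class name; hyperparameters are illustrative, not the example's exact values.
public class SingleLayerMnistSketch {
    public static void main(String[] args) throws Exception {
        int rngSeed = 123;
        int batchSize = 64;
        int numEpochs = 2;

        // MNIST iterators: 28x28 images flattened to 784 features, 10 digit classes
        DataSetIterator mnistTrain = new MnistDataSetIterator(batchSize, true, rngSeed);
        DataSetIterator mnistTest = new MnistDataSetIterator(batchSize, false, rngSeed);

        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .seed(rngSeed)                                // reproducibility
            .updater(new Nesterovs(0.006, 0.9))           // SGD with Nesterov momentum
            .l2(1e-4)
            .list()
            .layer(new DenseLayer.Builder()               // single hidden layer: 784 -> 1000
                .nIn(28 * 28).nOut(1000)
                .activation(Activation.RELU)
                .weightInit(WeightInit.XAVIER)
                .build())
            .layer(new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                .nIn(1000).nOut(10)                       // 10 output classes
                .activation(Activation.SOFTMAX)
                .weightInit(WeightInit.XAVIER)
                .build())
            .build();

        MultiLayerNetwork model = new MultiLayerNetwork(conf);
        model.init();

        model.fit(mnistTrain, numEpochs);                 // backpropagation training

        Evaluation eval = model.evaluate(mnistTest);      // accuracy, precision/recall, confusion matrix
        System.out.println(eval.stats());
    }
}
```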

dl4j-examples/src/main/java/org/deeplearning4j/examples/quickstart/modeling/feedforward/classification/ModelXOR.java

Lines changed: 8 additions & 0 deletions
@@ -17,6 +17,11 @@
  * SPDX-License-Identifier: Apache-2.0
  ******************************************************************************/
 
+
+// ModelXOR.java
+// Demonstrates solving the XOR problem using a small MLP.
+// XOR is not linearly separable -> requires hidden layers.
+
 package org.deeplearning4j.examples.quickstart.modeling.feedforward.classification;
 
 import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
@@ -110,6 +115,9 @@ public static void main(String[] args) {
 
 log.info("Network configuration and training...");
 
+// Build a small 2-layer MLP for XOR classification
+
+
 MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
 .updater(new Sgd(0.1))
 .seed(seed)
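
Since the diff only shows the first lines of the builder, here is a hedged, self-contained sketch of the idea the new comments describe: a tiny two-layer network fit on the four XOR cases. The class name, hidden-layer width, and iteration count are illustrative, not the values used in ModelXOR.java.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.weights.WeightInit;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.dataset.DataSet;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.learning.config.Sgd;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class XorSketch {
    public static void main(String[] args) {
        // The four XOR cases and their one-hot labels (class 0 = "false", class 1 = "true")
        INDArray input = Nd4j.create(new double[][]{{0, 0}, {0, 1}, {1, 0}, {1, 1}});
        INDArray labels = Nd4j.create(new double[][]{{1, 0}, {0, 1}, {0, 1}, {1, 0}});
        DataSet ds = new DataSet(input, labels);

        // The hidden layer is what lets the network carve out the non-linearly-separable XOR regions
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .seed(42)
            .updater(new Sgd(0.1))
            .list()
            .layer(new DenseLayer.Builder().nIn(2).nOut(4)
                .activation(Activation.SIGMOID).weightInit(WeightInit.XAVIER).build())
            .layer(new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                .nIn(4).nOut(2)
                .activation(Activation.SOFTMAX).weightInit(WeightInit.XAVIER).build())
            .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
        for (int i = 0; i < 2000; i++) {
            net.fit(ds);   // full-batch training on the four examples
        }
        System.out.println(net.output(input));   // class probabilities for each of the four cases
    }
}
```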
Lines changed: 84 additions & 0 deletions
@@ -0,0 +1,84 @@
# Feedforward Neural Network Classification Examples – DeepLearning4J

This folder contains several feedforward neural network (MLP) classification examples using DeepLearning4J.
They demonstrate how to train neural networks on classic datasets such as MNIST, Iris, XOR, and synthetic datasets.

---

## 🧠 MNISTSingleLayer.java
A simple single-hidden-layer MLP for MNIST digit classification.

### What this example shows
- Loading MNIST data
- Building a minimal feedforward model
- Backpropagation training
- Evaluating test accuracy

---

## 🧠 MNISTDoubleLayer.java
A deeper MLP with two hidden layers for MNIST.

### Why it's useful
- Shows the impact of depth on accuracy
- Good introduction to multi-layer feedforward networks

---

## 🌸 IrisClassifier.java
A classifier for the Iris flower dataset.

### What you learn
- Basic classification with a very small dataset
- How to use evaluation metrics (see the sketch below)
- Simple preprocessing
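
A short sketch of the evaluation step mentioned above, assuming a network that has already been trained on the Iris data; the helper name and batch size are illustrative, not taken from IrisClassifier.java.

```java
import org.deeplearning4j.datasets.iterator.impl.IrisDataSetIterator;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.evaluation.classification.Evaluation;
import org.nd4j.linalg.dataset.api.iterator.DataSetIterator;

// Hypothetical helper: evaluates a trained network on the 150-example, 3-class Iris dataset
static void printIrisMetrics(MultiLayerNetwork model) {
    DataSetIterator irisIter = new IrisDataSetIterator(150, 150);  // whole dataset in one batch
    Evaluation eval = model.evaluate(irisIter);
    System.out.println(eval.stats());      // accuracy, precision, recall, F1, confusion matrix
}
```

The actual example additionally splits the data into train and test portions before evaluating; this fragment only illustrates the Evaluation API.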

---

## 🧪 ModelXOR.java
A classic MLP solving the XOR problem.

### Why XOR?
- Not linearly separable
- Demonstrates why deep networks are needed

---

## 🌙 MoonClassifier.java
Binary classification on a synthetic two-moon dataset.

### Learnings
- Handling noisy 2D datasets (see the data-loading sketch below)
- Visualizing classification boundaries
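
The moon and Saturn examples feed 2D points from CSV files into the network. A hedged sketch of that loading step with DataVec follows; the file name, column layout, and batch size here are assumptions, not the example's exact values.

```java
import org.datavec.api.records.reader.RecordReader;
import org.datavec.api.records.reader.impl.csv.CSVRecordReader;
import org.datavec.api.split.FileSplit;
import org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator;
import org.nd4j.linalg.dataset.api.iterator.DataSetIterator;

import java.io.File;

public class MoonDataLoadingSketch {
    public static void main(String[] args) throws Exception {
        int batchSize = 50;
        int labelIndex = 0;   // assumed: class label in column 0, x/y coordinates after it
        int numClasses = 2;   // two half-moons

        RecordReader rr = new CSVRecordReader();
        rr.initialize(new FileSplit(new File("moon_data_train.csv")));  // hypothetical path

        DataSetIterator trainIter = new RecordReaderDataSetIterator(rr, batchSize, labelIndex, numClasses);
        System.out.println("First batch has " + trainIter.next().numExamples() + " examples");
    }
}
```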

---

## 🪐 SaturnClassifier.java
Classification of the Saturn (concentric circles) synthetic dataset.

### Shows
- Decision boundaries
- How MLPs learn non-linear patterns

---

## ✔ How to Run Any Example

Use the following command template:

mvn -q exec:java -Dexec.mainClass="org.deeplearning4j.examples.quickstart.modeling.feedforward.classification.<ClassName>"

Example:

mvn -q exec:java -Dexec.mainClass="org.deeplearning4j.examples.quickstart.modeling.feedforward.classification.MNISTSingleLayer"

---

## 🙌 Why This README Helps
These classification examples previously had no documentation.
This README improves clarity, explains datasets, and helps beginners understand each example.

dl4j-examples/src/main/java/org/deeplearning4j/examples/quickstart/modeling/feedforward/unsupervised/MNISTAutoencoder.java

Lines changed: 12 additions & 1 deletion
@@ -17,6 +17,11 @@
  * SPDX-License-Identifier: Apache-2.0
  ******************************************************************************/
 
+
+// MNISTAutoencoder.java
+// Demonstrates training an autoencoder on MNIST digit images.
+// Autoencoders learn compressed representations (unsupervised learning).
+
 package org.deeplearning4j.examples.quickstart.modeling.feedforward.unsupervised;
 
 import org.apache.commons.lang3.tuple.ImmutablePair;
@@ -81,6 +86,8 @@ public static void main(String[] args) throws Exception {
 .build())
 .build();
 
+// Build a simple autoencoder: Encoder → Bottleneck → Decoder
+
 MultiLayerNetwork net = new MultiLayerNetwork(conf);
 net.setListeners(Collections.singletonList(new ScoreIterationListener(10)));
 
@@ -101,6 +108,8 @@ public static void main(String[] args) throws Exception {
 INDArray indexes = Nd4j.argMax(dsTest.getLabels(),1); //Convert from one-hot representation -> index
 labelsTest.add(indexes);
 }
+
+// Train the autoencoder to minimize reconstruction loss
 
 //Train model:
 int nEpochs = 3;
@@ -154,8 +163,10 @@ public int compare(Pair<Double, INDArray> o1, Pair<Double, INDArray> o2) {
 worst.add(list.get(list.size()-j-1).getRight());
 }
 }
+
+//Visualize by default
+// Evaluate reconstruction quality or print sample reconstructions
 
-//Visualize by default
 if (visualize) {
 //Visualize the best and worst digits
 MNISTVisualizer bestVisualizer = new MNISTVisualizer(2.0, best, "Best (Low Rec. Error)");
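
Regarding the new "Evaluate reconstruction quality" comment: for an autoencoder the features double as the labels, so a single example can be scored against its own reconstruction. A small hedged sketch of that idea follows; the helper names are hypothetical, though the real example ranks best and worst digits in a similar way.

```java
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.dataset.DataSet;

// Hypothetical helpers; `example` is one flattened 784-element MNIST row.
static double reconstructionError(MultiLayerNetwork net, INDArray example) {
    // Score the example against itself: lower score = better reconstruction
    return net.score(new DataSet(example, example));
}

static INDArray reconstruct(MultiLayerNetwork net, INDArray example) {
    // Forward pass through encoder and decoder to get the reconstructed image
    return net.output(example);
}
```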
Lines changed: 43 additions & 0 deletions
@@ -0,0 +1,43 @@
# Unsupervised Learning Examples – Autoencoder (DeepLearning4J)

This folder contains unsupervised learning examples implemented using DeepLearning4J.
The primary example in this directory demonstrates how to train an autoencoder on MNIST digits to perform dimensionality reduction and reconstruction.

---

## 🧠 MNISTAutoencoder.java

A simple autoencoder trained on the MNIST dataset (28×28 grayscale digit images).
Autoencoders learn to compress input data into a lower-dimensional representation and then reconstruct it.

### What this example shows
- How autoencoders work
- How to compress images into a bottleneck latent space
- How to reconstruct input images
- How unsupervised neural networks are trained

### Key Concepts
- **Encoder:** Compresses image → latent representation (see the configuration sketch below)
- **Decoder:** Reconstructs latent representation → image
- **Loss Function:** Measures reconstruction quality
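
A minimal configuration sketch of that encoder → bottleneck → decoder shape in DL4J; the layer widths, updater, and activations are illustrative and may differ from the values in MNISTAutoencoder.java.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.weights.WeightInit;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.AdaGrad;
import org.nd4j.linalg.lossfunctions.LossFunctions;

// Fragment: 784 pixels -> 250 -> 10 (bottleneck) -> 250 -> 784 reconstruction
MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
    .seed(12345)
    .updater(new AdaGrad(0.05))
    .weightInit(WeightInit.XAVIER)
    .activation(Activation.RELU)
    .list()
    .layer(new DenseLayer.Builder().nIn(784).nOut(250).build())       // encoder
    .layer(new DenseLayer.Builder().nIn(250).nOut(10).build())        // bottleneck (latent code)
    .layer(new DenseLayer.Builder().nIn(10).nOut(250).build())        // decoder
    .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MSE)    // reconstruction loss vs. the input
        .nIn(250).nOut(784)
        .activation(Activation.SIGMOID)                               // pixel values lie in [0, 1]
        .build())
    .build();
```

During training the same image is used as both input and target, which is what makes the setup unsupervised.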

### Expected Behavior
The autoencoder gradually learns to:
- Rebuild digit outlines
- Capture key features
- Reduce noise

This is not a classifier — it learns **patterns** without labels.

---

## ✔ How to Run

mvn -q exec:java -Dexec.mainClass="org.deeplearning4j.examples.quickstart.modeling.feedforward.unsupervised.MNISTAutoencoder"

---

## 🙌 Why This README Helps
The unsupervised folder previously had no explanation, run instructions, or conceptual overview.
This documentation improves clarity and helps beginners understand autoencoders and unsupervised learning techniques in DL4J.
