# Node Editor Tutorial
Build a simple neural network using the visual Node Editor. No coding required!
## Goal
Create a 2-layer Dense network for MNIST classification:
Input (784) → Dense (128, ReLU) → Dense (10, Softmax) → Output
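Before building this graph visually, the same architecture can be sketched in a few lines of PyTorch to confirm the shape flow (a minimal sketch; the editor generates its own equivalent code in Step 8):

```python
import torch
import torch.nn as nn

# Equivalent of the target graph: 784 -> Dense(128, ReLU) -> Dense(10, Softmax)
model = nn.Sequential(
    nn.Linear(784, 128),  # Dense (128)
    nn.ReLU(),
    nn.Linear(128, 10),   # Dense (10)
    nn.Softmax(dim=-1),
)

batch = torch.randn(32, 784)  # one batch of flattened 28x28 images
out = model(batch)
print(out.shape)              # torch.Size([32, 10])
```

Each row of `out` is a probability distribution over the 10 digit classes.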
## Step 1: Open the Node Editor
Go to View → Node Editor (or press Ctrl+1)
| Action | Control |
|---|---|
| Pan | Middle-mouse drag or Space+drag |
| Zoom | Mouse wheel |
| Select | Left-click |
## Step 2: Add an Input Node
- Right-click on the canvas
- Navigate to Data → DatasetInput
- Configure in Properties: Input Shape: 784, Batch Size: 32
```
+------------------+
|   DatasetInput   |
+------------------+
| Dataset: [None]  |
| Batch: 32        |
+------------------+
|        [output]→ |
+------------------+
```
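A DatasetInput node configured with shape 784 and batch size 32 corresponds roughly to a PyTorch `DataLoader`. A minimal sketch using random stand-in data (the real node would bind an actual dataset such as MNIST):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Stand-in for MNIST: 256 samples of flattened 28x28 images (784 features)
features = torch.randn(256, 784)
labels = torch.randint(0, 10, (256,))
dataset = TensorDataset(features, labels)

# Batch size 32, matching the node's Properties panel
loader = DataLoader(dataset, batch_size=32, shuffle=True)

x, y = next(iter(loader))
print(x.shape)  # torch.Size([32, 784])
```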
## Steps 3-5: Add Dense Layers
- Right-click → Layers → Dense
- Configure first Dense: Units: 128, Activation: ReLU
- Connect DatasetInput output to Dense input (drag between pins)
- Add second Dense: Units: 10, Activation: Softmax
- Connect the layers together
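Under the hood, the links you drag define a directed graph, and code generation must visit nodes in dependency order. A hedged sketch of that idea using Python's standard-library topological sort (node names are illustrative, not the editor's internal representation):

```python
from graphlib import TopologicalSorter

# Each node maps to the set of nodes whose outputs it consumes
graph = {
    "Dense(128)": {"DatasetInput"},
    "Dense(10)": {"Dense(128)"},
    "ModelOutput": {"Dense(10)"},
}

order = list(TopologicalSorter(graph).static_order())
print(order)  # ['DatasetInput', 'Dense(128)', 'Dense(10)', 'ModelOutput']
```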
## Step 6: Add Output Node
- Right-click → Output → ModelOutput
- Connect the second Dense to the Output
```
+-------------+    +-------------+    +-------------+    +-------------+
| DatasetInput|-→--| Dense(128)  |-→--| Dense(10)   |-→--| ModelOutput |
+-------------+    |    ReLU     |    |   Softmax   |    +-------------+
                   +-------------+    +-------------+
```

## Step 7: Validate the Graph
Go to Nodes → Validate Graph (or Ctrl+Shift+V)
A green checkmark appears if valid. Common errors:
- "Missing input connection" → Ensure all inputs are connected
- "Shape mismatch" → Check layer dimensions
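The "Shape mismatch" check boils down to comparing each link's output size against the next layer's expected input size. A minimal sketch of such a validator (the function and node representation are illustrative, not the editor's API):

```python
def validate_chain(layers):
    """Check that each layer's output size feeds the next layer's input size."""
    errors = []
    for (name_a, _, out_a), (name_b, in_b, _) in zip(layers, layers[1:]):
        if out_a != in_b:
            errors.append(f"Shape mismatch: {name_a} outputs {out_a}, "
                          f"but {name_b} expects {in_b}")
    return errors

# (name, in_features, out_features) for the tutorial graph
good = [("DatasetInput", None, 784), ("Dense(128)", 784, 128), ("Dense(10)", 128, 10)]
bad  = [("DatasetInput", None, 784), ("Dense(128)", 512, 128)]  # wrong input size

print(validate_chain(good))  # []
print(validate_chain(bad))   # one shape-mismatch error
```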
## Step 8: Generate Code

### PyTorch Output

```python
import torch.nn as nn

class GeneratedModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.dense_1 = nn.Linear(784, 128)
        self.relu_1 = nn.ReLU()
        self.dense_2 = nn.Linear(128, 10)
        self.softmax = nn.Softmax(dim=-1)

    def forward(self, x):
        x = self.dense_1(x)
        x = self.relu_1(x)
        x = self.dense_2(x)
        x = self.softmax(x)
        return x
```

### PyCyxWiz Output
```python
import pycyxwiz as cx

dense_1 = cx.Dense(784, 128)
relu_1 = cx.ReLU()
dense_2 = cx.Dense(128, 10)
softmax = cx.Softmax()

def forward(x):
    x = dense_1.forward(x)
    x = relu_1.forward(x)
    x = dense_2.forward(x)
    x = softmax.forward(x)
    return x
```

## Common Node Types
| Category | Nodes |
|---|---|
| Data | DatasetInput, DataLoader |
| Layers | Dense, Conv2D, LSTM, Flatten |
| Activations | ReLU, Sigmoid, Tanh, Softmax, GELU |
| Normalization | BatchNorm, LayerNorm |
| Output | ModelOutput, LossOutput |
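The ModelOutput/LossOutput split in the table maps onto the usual PyTorch pattern of a model plus a separate loss criterion. A hedged training-step sketch (random stand-in data; note that `nn.CrossEntropyLoss` applies softmax internally, so the final Softmax node is used only at inference time):

```python
import torch
import torch.nn as nn

# Model equivalent to the tutorial graph, minus the final Softmax
# (CrossEntropyLoss expects raw logits)
model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))

criterion = nn.CrossEntropyLoss()  # LossOutput side
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(32, 784)
y = torch.randint(0, 10, (32,))

optimizer.zero_grad()
loss = criterion(model(x), y)  # forward pass + loss
loss.backward()                # backpropagate
optimizer.step()               # update weights
print(loss.item())
```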
## What You Learned
- Adding nodes from the context menu
- Connecting nodes with links
- Configuring node properties
- Validating the graph
- Generating Python code