
Node Editor Tutorial

Build a simple neural network using the visual Node Editor. No coding required!

Goal

Create a 2-layer Dense network for MNIST classification:

Input (784) → Dense (128, ReLU) → Dense (10, Softmax) → Output
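For reference, the size of this network can be worked out from the layer shapes alone: each Dense layer has (inputs × units) weights plus one bias per unit. Plain arithmetic, no framework needed:

```python
# Parameter count for the target network (weights + biases per layer).
params_dense1 = 784 * 128 + 128   # first Dense layer
params_dense2 = 128 * 10 + 10     # second Dense layer
total_params = params_dense1 + params_dense2
print(total_params)               # 101770
```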

Step 1: Open the Node Editor

Go to View → Node Editor (or press Ctrl+1)

Action    Control
Pan       Middle-mouse drag or Space+drag
Zoom      Mouse wheel
Select    Left-click

Step 2: Add an Input Node

  1. Right-click on the canvas
  2. Navigate to Data → DatasetInput
  3. Configure in Properties: Input Shape: 784, Batch Size: 32
+------------------+
|  DatasetInput    |
+------------------+
| Dataset: [None]  |
|   Batch: 32      |
+------------------+
|        [output]→ |
+------------------+
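In plain PyTorch terms, this node configuration corresponds roughly to a DataLoader serving flattened 28×28 images in batches of 32. This is a hedged sketch only: the random tensors below are placeholders standing in for the actual dataset (the node still shows Dataset: [None] until one is assigned):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Placeholder data standing in for MNIST: 28x28 images flatten to
# 784 features, matching the Input Shape set on the node.
images = torch.randn(256, 784)
labels = torch.randint(0, 10, (256,))

# Batch size 32, as configured in the Properties panel.
loader = DataLoader(TensorDataset(images, labels), batch_size=32)
batch_x, batch_y = next(iter(loader))
print(batch_x.shape)  # torch.Size([32, 784])
```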

Steps 3-5: Add Dense Layers

  1. Right-click → Layers → Dense
  2. Configure first Dense: Units: 128, Activation: ReLU
  3. Connect DatasetInput output to Dense input (drag between pins)
  4. Add second Dense: Units: 10, Activation: Softmax
  5. Connect the layers together
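Conceptually, the links you drag between pins amount to chaining the layers in order. A minimal PyTorch sketch of the same wiring, using nn.Sequential:

```python
import torch
import torch.nn as nn

# The two Dense nodes and their activations, connected in order.
model = nn.Sequential(
    nn.Linear(784, 128),   # first Dense node: Units 128
    nn.ReLU(),             # Activation: ReLU
    nn.Linear(128, 10),    # second Dense node: Units 10
    nn.Softmax(dim=-1),    # Activation: Softmax
)
out = model(torch.randn(32, 784))
print(out.shape)  # torch.Size([32, 10])
```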

Step 6: Add Output Node

  1. Right-click → Output → ModelOutput
  2. Connect the second Dense to the Output
+-------------+    +-------------+    +-------------+    +-------------+
| DatasetInput|-→--| Dense(128)  |-→--| Dense(10)   |-→--| ModelOutput |
+-------------+    | ReLU        |    | Softmax     |    +-------------+
                   +-------------+    +-------------+

Step 7: Validate the Graph

Go to Nodes → Validate Graph (or Ctrl+Shift+V)

A green checkmark appears if valid. Common errors:

  • "Missing input connection" → Ensure all inputs are connected
  • "Shape mismatch" → Check layer dimensions
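The shape check is straightforward to reason about: each node's output size must match the next node's expected input size. A hypothetical sketch of that kind of validation (the node names and dict layout here are illustrative, not the editor's internal representation):

```python
# Each entry records a node's expected input and output sizes.
graph = [
    {"name": "DatasetInput", "out": 784},
    {"name": "Dense(128)", "in": 784, "out": 128},
    {"name": "Dense(10)", "in": 128, "out": 10},
    {"name": "ModelOutput", "in": 10},
]

def validate(nodes):
    """Collect shape mismatches between consecutive nodes."""
    errors = []
    for prev, node in zip(nodes, nodes[1:]):
        if prev["out"] != node["in"]:
            errors.append(f"Shape mismatch: {prev['name']} -> {node['name']}")
    return errors

print(validate(graph))  # [] -- the tutorial graph is valid
```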

Step 8: Generate Code

PyTorch Output
import torch.nn as nn

class GeneratedModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.dense_1 = nn.Linear(784, 128)
        self.relu_1 = nn.ReLU()
        self.dense_2 = nn.Linear(128, 10)
        self.softmax = nn.Softmax(dim=-1)

    def forward(self, x):
        x = self.dense_1(x)
        x = self.relu_1(x)
        x = self.dense_2(x)
        x = self.softmax(x)
        return x
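You can sanity-check the generated code by running a random batch through it. The class is reproduced here so the snippet runs standalone; the input tensor is placeholder data, not real MNIST:

```python
import torch
import torch.nn as nn

# The generated class from above, repeated for a self-contained check.
class GeneratedModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.dense_1 = nn.Linear(784, 128)
        self.relu_1 = nn.ReLU()
        self.dense_2 = nn.Linear(128, 10)
        self.softmax = nn.Softmax(dim=-1)

    def forward(self, x):
        return self.softmax(self.dense_2(self.relu_1(self.dense_1(x))))

model = GeneratedModel()
x = torch.randn(32, 784)   # one batch, matching the node settings
y = model(x)
print(y.shape)             # torch.Size([32, 10])
```

Each output row is a softmax distribution over the 10 digit classes, so every row sums to 1.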

PyCyxWiz Output
import pycyxwiz as cx

dense_1 = cx.Dense(784, 128)
relu_1 = cx.ReLU()
dense_2 = cx.Dense(128, 10)
softmax = cx.Softmax()

def forward(x):
    x = dense_1.forward(x)
    x = relu_1.forward(x)
    x = dense_2.forward(x)
    x = softmax.forward(x)
    return x

Common Node Types

Category        Nodes
Data            DatasetInput, DataLoader
Layers          Dense, Conv2D, LSTM, Flatten
Activations     ReLU, Sigmoid, Tanh, Softmax, GELU
Normalization   BatchNorm, LayerNorm
Output          ModelOutput, LossOutput

What You Learned

  • Adding nodes from the context menu
  • Connecting nodes with links
  • Configuring node properties
  • Validating the graph
  • Generating Python code