Now that you understand the fundamental building blocks like layers and activation functions, let's put theory into practice. This section guides you through constructing your first neural networks using both the Sequential and Functional APIs in Keras. We'll build simple models suitable for a basic classification task, reinforcing the concepts covered in this chapter.
We'll start by creating some synthetic data to work with. This allows us to focus purely on the network construction aspect without needing to worry about complex data loading procedures yet. We will use Scikit-learn's make_moons function to generate a two-dimensional dataset where two classes are intertwined in a non-linear way, making it a good, simple challenge for a neural network.
import numpy as np
from sklearn.datasets import make_moons
import keras
from keras import layers
from keras import models
from keras import utils
# Generate synthetic data
X, y = make_moons(n_samples=100, noise=0.15, random_state=42)
# Print shapes to verify
print(f"Features shape: {X.shape}")
print(f"Labels shape: {y.shape}")
# Optionally, visualize the data (requires matplotlib)
# import matplotlib.pyplot as plt
# plt.scatter(X[:, 0], X[:, 1], c=y, cmap='viridis')
# plt.title("Synthetic Moons Dataset")
# plt.xlabel("Feature 1")
# plt.ylabel("Feature 2")
# plt.show()
This code generates 100 data points, each with two features (X.shape will be (100, 2)), and corresponding binary labels (y.shape will be (100,)).
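Although we won't train anything in this section, it can be convenient to set aside a held-out portion of the data now so it is ready when we train in Chapter 3. A minimal sketch using Scikit-learn's train_test_split; the 80/20 split is an arbitrary but common choice:
# Hold out 20% of the samples for later evaluation
from sklearn.model_selection import train_test_split

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)
print(f"Training samples: {X_train.shape[0]}, Test samples: {X_test.shape[0]}")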
The Sequential API is ideal for models where layers are stacked linearly, one after the other. It's the simplest way to get started. Let's build a small network with one hidden layer:
- Input: the shape is declared via the input_shape argument in the first layer (or explicitly using keras.Input, as we do below). Our data has 2 features.
- Hidden layer: a Dense layer with 8 neurons and a ReLU activation function.
- Output layer: a Dense layer with 1 neuron and a sigmoid activation function, suitable for binary classification.
# Define the model using the Sequential API
model_sequential = models.Sequential(
[
keras.Input(shape=(2,), name="input_layer"), # Define input shape
layers.Dense(8, activation='relu', name="hidden_layer"),
layers.Dense(1, activation='sigmoid', name="output_layer")
],
name="sequential_model"
)
# Display the model's architecture
model_sequential.summary()
Running model_sequential.summary() provides a concise overview of the layers, their output shapes, and the number of trainable parameters:
Model: "sequential_model"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type) ┃ Output Shape ┃ Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ hidden_layer (Dense) │ (None, 8) │ 24 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ output_layer (Dense) │ (None, 1) │ 9 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
Total params: 33 (132.00 B)
Trainable params: 33 (132.00 B)
Non-trainable params: 0 (0.00 B)
The None in the output shape represents the batch size, which can vary. Notice how the parameters are calculated:
- hidden_layer: (2 inputs × 8 neurons) + 8 biases = 24 parameters
- output_layer: (8 inputs × 1 neuron) + 1 bias = 9 parameters
Together these account for the 33 trainable parameters reported in the summary.
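You can verify these counts directly from the model object. The sketch below uses the standard count_params method and then runs a forward pass on a few samples as a shape sanity check:
# Confirm the parameter counts layer by layer
for layer in model_sequential.layers:
    print(f"{layer.name}: {layer.count_params()} parameters")

# Forward pass on a few samples; the weights are still random,
# so these are not meaningful predictions yet
preds = model_sequential.predict(X[:5])
print(f"Predictions shape: {preds.shape}")  # expected: (5, 1)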
The Functional API offers more flexibility for building complex models with multiple inputs/outputs or shared layers. While our current network is simple, let's rebuild it using this API to understand the workflow.
The core idea is to define layers as functions that operate on tensors.
1. Define an input tensor with keras.Input, specifying the shape of the input data.
2. Call layers on tensors; each call returns a new tensor, chaining the layers together.
3. Create a Model instance by specifying the input and output tensors.
# Define the input tensor
inputs = keras.Input(shape=(2,), name="input_tensor")
# Define the hidden layer, connected to the input
hidden = layers.Dense(8, activation='relu', name="hidden_layer")(inputs)
# Define the output layer, connected to the hidden layer
outputs = layers.Dense(1, activation='sigmoid', name="output_layer")(hidden)
# Create the model
model_functional = models.Model(inputs=inputs, outputs=outputs, name="functional_model")
# Display the model's architecture
model_functional.summary()
Apart from explicitly listing the InputLayer, the summary output matches the Sequential model's, confirming we've built the same architecture:
Model: "functional_model"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type) ┃ Output Shape ┃ Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ input_tensor (InputLayer) │ [(None, 2)] │ 0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ hidden_layer (Dense) │ (None, 8) │ 24 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ output_layer (Dense) │ (None, 1) │ 9 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
Total params: 33 (132.00 B)
Trainable params: 33 (132.00 B)
Non-trainable params: 0 (0.00 B)
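To illustrate the flexibility mentioned above, here is a small sketch, separate from our moons model, in which one Dense layer is shared between two inputs. The names and sizes are arbitrary choices for demonstration; this is a graph the Sequential API could not express:
# Illustrative only: a single Dense layer shared across two inputs
input_a = keras.Input(shape=(2,), name="input_a")
input_b = keras.Input(shape=(2,), name="input_b")

shared_dense = layers.Dense(8, activation='relu', name="shared_dense")

# Calling the same layer object twice reuses its weights
features_a = shared_dense(input_a)
features_b = shared_dense(input_b)

# Merge the two branches and produce one output
combined = layers.concatenate([features_a, features_b])
output = layers.Dense(1, activation='sigmoid')(combined)

model_shared = models.Model(inputs=[input_a, input_b], outputs=output)
model_shared.summary()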
A significant advantage of the Functional API is the ability to easily visualize the model graph. You can use Keras's plot_model utility (which requires the pydot and graphviz libraries to be installed).
# Generate a visual representation of the functional model
# Note: Requires pydot and graphviz installation
# utils.plot_model(model_functional, show_shapes=True, show_layer_activations=True, to_file="functional_model.png")
This would generate a diagram like the one below, clearly showing the flow of data through the layers:
[Diagram: the input layer feeds the hidden Dense layer (ReLU activation), which feeds the output Dense layer (sigmoid activation); output shapes are indicated for each layer.]
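If pydot or graphviz is not installed, a rough text-based alternative is to walk the model's layers attribute yourself; a quick sketch:
# Fallback inspection without plot_model's dependencies
for layer in model_functional.layers:
    print(f"{layer.name} ({layer.__class__.__name__}): {layer.count_params()} parameters")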
You have now successfully built your first neural network architectures using Keras! You defined the layers, specified activation functions, and connected them using both the Sequential and Functional APIs. The next step, covered in Chapter 3, is to compile these models by defining a loss function and optimizer, and then train them on data using the fit method.
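As a brief preview of Chapter 3, compiling and fitting typically look like the sketch below; the optimizer, loss, and epoch count are common choices for binary classification, not values prescribed by this chapter:
# Preview of Chapter 3: compile, then train
model_sequential.compile(
    optimizer='adam',
    loss='binary_crossentropy',
    metrics=['accuracy']
)
history = model_sequential.fit(X, y, epochs=50, batch_size=16, verbose=0)
print(f"Final training accuracy: {history.history['accuracy'][-1]:.3f}")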