[ML 1] Neural Networks 🧠


What is the structure of a neural network?

A simple neural 🧠 network includes three layers:

  1. input

  2. hidden

  3. output

Each layer contains a collection of nodes (units), loosely modeled on neurons in the brain. The input layer holds the feature values (X), and the output layer produces the target variable. Nodes in adjacent layers are connected by edges (weights), which carry values from one layer to the next through the hidden layer. For example, a binary classification problem typically uses a single node in the output layer.
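To make the role of the weights concrete, here is a minimal sketch of how one fully connected layer passes values from the input nodes to the hidden nodes; the layer sizes and random values are made up for illustration and are not part of the demo that follows.

import numpy as np

rng = np.random.default_rng(0)

x = rng.normal(size=4)             # 4 input features (the input layer)
W = rng.normal(size=(3, 4))        # edges (weights) connecting 4 inputs to 3 hidden nodes
b = np.zeros(3)                    # one bias per hidden node

hidden = np.maximum(0, W @ x + b)  # ReLU activation on the weighted sums
print(hidden)                      # activations of the 3 hidden nodes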

from sklearn.datasets import fetch_california_housing
from sklearn.model_selection import train_test_split
import tensorflow as tf
from tensorflow import keras

📍Importing the libraries for the demonstration of a TensorFlow neural network. 🎓

housing = fetch_california_housing()

X_train, X_test, y_train, y_test = train_test_split(
    housing.data, housing.target)

📍Loading the California Housing dataset 🏘️ and splitting it into training and test sets before the model is built.
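One optional step that is not part of the original snippets: the raw housing features sit on very different scales, and a StandardScaler usually makes the network train more smoothly. A small sketch, in case you want to add it:

from sklearn.preprocessing import StandardScaler

scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)   # fit the scaler on the training split only
X_test = scaler.transform(X_test)         # reuse the same scaling for the test split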

model = keras.models.Sequential([
    keras.layers.Dense(30, activation="relu", input_shape=X_train.shape[1:]),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(1)
])

📍The code snippet creates a Keras Sequential model from a list of layer instances: two fully connected hidden layers with 30 and 8 nodes, plus a single-node output layer for the regression target.
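As a quick sanity check (an optional step, not in the original walkthrough), Keras can print the resulting architecture, with the output shape and parameter count of each layer:

model.summary()   # lists the two hidden layers and the single output node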

model.compile(loss="mean_squared_error",
              optimizer=tf.keras.optimizers.Adam(0.02))

📍We compile the model with the Adam optimizer (learning rate 0.02) and the mean squared error (MSE) loss function.
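If you also want a metric in the original units reported during training, compile() accepts a metrics list; a possible variation on the call above, not part of the original demo:

model.compile(loss="mean_squared_error",
              optimizer=tf.keras.optimizers.Adam(0.02),
              metrics=["mean_absolute_error"])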

model.fit(X_train, y_train, epochs=300)

📍Training the model for 300 epochs on the training split of the California Housing dataset 🏘️.
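To watch for overfitting across the 300 epochs, fit() can hold out a slice of the training data as a validation set; a sketch of that variation on the call above:

history = model.fit(X_train, y_train, epochs=300, validation_split=0.1)
print(history.history["val_loss"][-1])   # validation MSE after the final epoch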

📍After training, we evaluate the model on the held-out test set; the mean squared error (MSE) measures the performance of the neural network model 🤖.

mse_test = model.evaluate(X_test, y_test)

print("The mean squared error is ", mse_test)

💡
What are your thoughts about the introductory sessions? 💫
💡
Connect with me on LinkedIn🔗 👉Here
💡
Follow my GitHub 💻for fascinating projects and opportunities for collabs 💫
💡
💫 Follow my Quora ✍️for daily writing updates on tech, French, and world history 🏛️