dmlfw (Daniyal Machine Learning Framework)

Demonstrates batch gradient descent for linear regression using ml-framework.

#include <stdio.h>
#include <dmlfw.h>
#include <stdlib.h>
#include <unistd.h>
Include dependency graph for batch_gd.c (not shown).
#define FREQUENCY_OF_PRINTING_COST   50000
 
dmlfw_gradient_descent_options * get_gradient_descent_options ()
 Creates and configures gradient descent options object.
 
FILE * gnuplot
 
#define LEARNING_RATE   0.0001
 
void load_dataset (dmlfw_mat_double **x, dmlfw_column_vec_double **y)
 Loads the dataset into X (features matrix) and Y (target vector). Adds bias column filled with 1.0.
 
int main ()
 Main function to execute batch gradient descent linear regression example.
 
#define MODEL_FILE_NAME   "example-1-model.csv"
 
#define NUMBER_OF_ITERATIONS   3000000
 
int on_iteration_complete (uint64_t iteration_number, void *y, void *predicted_y, void *model, double regularization_parameter)
 Progress callback called on each gradient descent iteration. Logs cost, updates plot files, shows graph via gnuplot at intervals.
 
void print_error_and_exit ()
 Prints error string from ml-framework and exits program.
 
#define REGULARIZATION_PARAMETER   0.5
 
#define SHOW_GRAPH   1
 
#define TRAINING_DATASET   "IceCreamSales_training_examples.csv"
 

Detailed Description

Demonstrates batch gradient descent for linear regression using ml-framework.

Author
Mohammed Daniyal
Version
1.0
Date
2025-09-26

This example loads the IceCreamSales training dataset, trains a linear regression model using batch gradient descent with regularization, tracks cost progress, optionally plots cost and fitted line using gnuplot, and saves the final model to CSV.

Defines custom progress callback for iteration logging and visualization.

Usage: ./batch_gd

Macro Definition Documentation

◆ FREQUENCY_OF_PRINTING_COST

#define FREQUENCY_OF_PRINTING_COST   50000

Frequency of printing/logging cost during training

◆ LEARNING_RATE

#define LEARNING_RATE   0.0001

Learning rate for gradient descent

◆ MODEL_FILE_NAME

#define MODEL_FILE_NAME   "example-1-model.csv"

Output model CSV file path

◆ NUMBER_OF_ITERATIONS

#define NUMBER_OF_ITERATIONS   3000000

Maximum number of iterations for gradient descent

◆ REGULARIZATION_PARAMETER

#define REGULARIZATION_PARAMETER   0.5

Regularization parameter (lambda)

◆ SHOW_GRAPH

#define SHOW_GRAPH   1

Flag to enable (1) or disable (0) graph plotting

◆ TRAINING_DATASET

#define TRAINING_DATASET   "IceCreamSales_training_examples.csv"

Training dataset CSV file path

Function Documentation

◆ get_gradient_descent_options()

dmlfw_gradient_descent_options * get_gradient_descent_options ( )

Creates and configures gradient descent options object.

Sets learning rate, number of iterations, gradient descent type, and associates the progress callback.

Returns
Pointer to configured dmlfw_gradient_descent_options or NULL on error.
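The general shape of such an options builder is allocate, populate, return, with NULL signalling failure. The struct and field names below are hypothetical stand-ins, not the real dmlfw_gradient_descent_options layout:

```c
#include <stdlib.h>

/* Hypothetical options object; the real dmlfw struct is opaque. */
struct gd_options {
    double learning_rate;
    long iterations;
    int (*on_iteration)(unsigned long iteration, double cost);
};

struct gd_options *make_gd_options(void)
{
    struct gd_options *opts = malloc(sizeof *opts);
    if (opts == NULL)
        return NULL; /* caller reports the error and exits */
    opts->learning_rate = 0.0001;
    opts->iterations = 3000000;
    opts->on_iteration = NULL; /* set by caller to enable progress logging */
    return opts;
}
```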

◆ load_dataset()

void load_dataset ( dmlfw_mat_double **  x,
dmlfw_column_vec_double **  y 
)

Loads the dataset into X (features matrix) and Y (target vector). Adds bias column filled with 1.0.

Parameters
[out]  x  Pointer to the features matrix pointer.
[out]  y  Pointer to the target column vector pointer.

◆ main()

int main ( )

Main function to execute the batch gradient descent linear regression example.

◆ on_iteration_complete()

int on_iteration_complete ( uint64_t  iteration_number,
void *  y,
void *  predicted_y,
void *  model,
double  regularization_parameter 
)

Progress callback called on each gradient descent iteration. Logs cost, updates plot files, shows graph via gnuplot at intervals.

Parameters
iteration_number  Current iteration number.
y  Actual target vector.
predicted_y  Predicted target vector.
model  Current model parameters.
regularization_parameter  Regularization coefficient lambda.
Returns
0 on success, negative to abort.
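The callback's shape can be illustrated with a self-contained sketch. The real dmlfw callback receives opaque framework types behind the void * parameters; here they are assumed to be plain double arrays of a known length, and the gnuplot side is omitted:

```c
#include <stdio.h>
#include <stdint.h>

#define FREQUENCY_OF_PRINTING_COST 50000

/* Assumed number of training examples; the real callback would query
 * the framework's vector handle for its length. */
static int num_examples = 5;

int on_iteration_complete(uint64_t iteration_number, void *y, void *predicted_y,
                          void *model, double regularization_parameter)
{
    (void)model;
    (void)regularization_parameter;
    if (iteration_number % FREQUENCY_OF_PRINTING_COST != 0)
        return 0; /* keep training, log only at intervals */

    const double *yv = y;
    const double *pv = predicted_y;
    double cost = 0.0;
    for (int i = 0; i < num_examples; i++) {
        double err = pv[i] - yv[i];
        cost += err * err;
    }
    cost /= (2.0 * num_examples); /* regularization term omitted here */
    printf("iteration %llu: cost = %f\n",
           (unsigned long long)iteration_number, cost);
    return 0; /* 0 = success; a negative value would abort training */
}
```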

◆ print_error_and_exit()

void print_error_and_exit ( )

Prints error string from ml-framework and exits program.

Variable Documentation

◆ gnuplot

FILE* gnuplot

Gnuplot file pointer used for plotting progress