
econometron.utils.optimizers

Module: econometron.utils

Overview

The econometron.utils.optimizers module provides a suite of optimization algorithms for minimizing scalar objective functions, particularly suited for parameter estimation in econometric models, state-space models, and other statistical applications. The module includes three main optimization methods: Simulated Annealing (SA), Genetic Algorithm (GA), and Quasi-Newton (QN), along with a utility function for function evaluation. These methods are designed to handle non-linear, non-convex objective functions, such as negative log-likelihoods in maximum likelihood estimation (MLE) or negative log-posteriors in Bayesian inference. The module integrates with other econometron components, such as kalman_objective for state-space models or ols_estimator for regression, to perform robust parameter estimation.

Objective

The goal of these optimizers is to minimize a scalar objective function f(θ), typically the negative log-likelihood or negative log-posterior, subject to parameter bounds or constraints:

    minimize f(θ)   subject to   lb ≤ θ ≤ ub

Where:

  • θ: Parameter vector (shape: n_params).
  • lb, ub: Lower and upper bounds for parameters.
  • f(θ): Objective function, often the negative log-likelihood or a similar criterion.
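As a concrete illustration of this setup, the snippet below builds such an objective, a Gaussian negative log-likelihood over a small data sample, together with bound vectors. The data, function name, and bounds are hypothetical, chosen only to show the pattern the optimizers expect:

```python
import numpy as np

# Hypothetical sample; in practice this would be model residuals or observations.
data = np.array([1.2, 0.8, 1.5, 1.1, 0.9])

def neg_log_likelihood(theta):
    """theta = [mu, sigma]; returns the Gaussian negative log-likelihood."""
    mu, sigma = theta
    if sigma <= 0:
        return float("inf")  # infeasible: signal a very bad objective value
    resid = data - mu
    return (0.5 * len(data) * np.log(2 * np.pi * sigma**2)
            + np.sum(resid**2) / (2 * sigma**2))

# Bounds: mu in [-10, 10], sigma in (0, 5] (small positive floor on sigma).
lower_bounds = np.array([-10.0, 1e-6])
upper_bounds = np.array([10.0, 5.0])

print(neg_log_likelihood(np.array([1.1, 0.245])))
```

Any of the three optimizers below can then be pointed at `neg_log_likelihood` with these bounds.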

Functions

1. evaluate_func(function, params)

Purpose: Evaluates a given objective function at a parameter vector, providing a consistent interface for function evaluation across optimization algorithms. Includes debugging output to track parameter evaluations.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| function | Callable | Objective function to evaluate, taking a parameter vector as input. | None |
| params | np.ndarray | Parameter vector to evaluate (shape: n_params). | None |

Returns: float – Function value at params. Returns float('inf') if function is not callable.

Explanation:

  • Checks if function is callable; if not, returns infinity to indicate an invalid evaluation.
  • Prints the evaluated parameter vector for debugging purposes.
  • Calls function(params) and returns the result.

Example:

```python
from econometron.utils.optimizers import evaluate_func
import numpy as np

def obj_func(x):
    return np.sum(x**2)

params = np.array([1.0, 2.0])
result = evaluate_func(obj_func, params)
print(result)  # Output: 5.0
```

2. simulated_annealing(...)

Purpose: Implements Simulated Annealing (SA), a probabilistic optimization algorithm inspired by the annealing process in metallurgy. Designed to escape local minima by accepting worse solutions with a probability that decreases as the "temperature" cools, making it suitable for complex, non-convex objective functions.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| function | Callable | Objective function to minimize. | None |
| x | list or np.ndarray | Initial parameter vector (shape: n_params). | None |
| lower_bounds | list or np.ndarray | Lower bounds for parameters (shape: n_params). | None |
| upper_bounds | list or np.ndarray | Upper bounds for parameters (shape: n_params). | None |
| T | float | Initial temperature, controlling initial exploration. | None |
| cooling_rate | float | Temperature reduction factor (0 < cooling_rate < 1). | None |
| num_temperatures | int | Number of temperature levels. | None |
| num_steps | int | Number of trial points per temperature level. | None |
| seed_value | int | Random seed for reproducibility. | None |
| max_evals | int | Maximum number of function evaluations. | None |
| eps | float | Convergence threshold for objective value change. | 1e-2 |

Returns:

dict – Contains:

  • x (np.ndarray): Optimal parameter vector.
  • fun (float): Negated objective function value at the optimum.
  • N_FUNC_EVALS (int): Number of function evaluations.
  • message (str): Termination message.

Explanation:

  • Initialization: Random seed set; current parameters x, best xopt, and fopt initialized; step sizes VM set to bound ranges; evaluation counters initialized.

  • Main Loop: For each temperature, performs trials per parameter:

    • Proposes new parameter vector xp by perturbation.
    • Resamples out-of-bounds proposals.
    • Accepts proposals that improve the objective, or worse proposals with probability exp(-(f(xp) - f(x))/T) (Metropolis criterion).
    • Updates best solution if improved.
  • Step Adaptation: Step sizes VM adjusted based on acceptance rates.

  • Cooling: Temperature multiplied by cooling_rate.

  • Convergence: Stops if evaluations exceed max_evals or the objective change falls below eps for sa_neps consecutive iterations.
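The accept/reject rule above is the standard Metropolis criterion. A minimal sketch of that rule follows; the helper name `metropolis_accept` is hypothetical, and the module's internal implementation (with step adaptation and bound resampling) may differ in detail:

```python
import numpy as np

def metropolis_accept(f_current, f_proposal, T, rng):
    """Accept an improving proposal always; accept a worse one with
    probability exp(-(f_proposal - f_current) / T)."""
    if f_proposal <= f_current:
        return True
    return rng.random() < np.exp(-(f_proposal - f_current) / T)

rng = np.random.default_rng(0)
print(metropolis_accept(5.0, 4.0, T=1.0, rng=rng))  # True: improving move
```

As T cools toward zero, worse moves are accepted less and less often, which is what shifts the search from broad exploration to local refinement.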

Example:

```python
from econometron.utils.optimizers import simulated_annealing
import numpy as np

def obj_func(x):
    return np.sum(x**2)

x0 = np.array([1.0, 2.0])
lb = np.array([-5.0, -5.0])
ub = np.array([5.0, 5.0])

result = simulated_annealing(
    obj_func, x0, lb, ub,
    T=5.0, cooling_rate=0.9,
    num_temperatures=5, num_steps=10,
    seed_value=42, max_evals=1000
)
print(result)
```

3. genetic_algorithm(...)

Purpose: Implements a Genetic Algorithm (GA), a population-based evolutionary optimization method inspired by natural selection. Suitable for global optimization of complex, non-convex functions.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| func | Callable | Objective function to minimize. | None |
| x0 | np.ndarray | Initial parameter vector. | None |
| lb | np.ndarray | Lower bounds. | None |
| ub | np.ndarray | Upper bounds. | None |
| pop_size | int | Population size. | 50 |
| n_gen | int | Number of generations. | 100 |
| crossover_rate | float | Crossover probability. | 0.8 |
| mutation_rate | float | Mutation probability. | 0.1 |
| elite_frac | float | Fraction of elite individuals preserved. | 0.1 |
| seed | int | Random seed. | 1 |
| verbose | bool | Print progress every 10 generations. | True |

Returns:

dict – Contains:

  • x (np.ndarray): Optimal parameter vector.
  • fun (float): Objective function value.
  • nfev (int): Number of evaluations.
  • message (str): Termination message.

Explanation:

  • Input Validation: Converts inputs to arrays; checks bounds; ensures lb >= 1e-6.

  • Initialization: Population created; first individual set to x0; others uniformly sampled. Objective evaluated.

  • Main Loop: For each generation:

    • Tournament selection for parents.
    • Crossover with probability crossover_rate.
    • Mutation with probability mutation_rate.
    • Preserve elite individuals (elite_frac).
    • Evaluate offspring and update population.
  • Tracks best solution.

  • Output: Progress printed if verbose=True. Returns best solution.
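The selection and crossover steps above can be sketched as follows. `tournament_select` and `blend_crossover` are hypothetical names for two common operators; the module's actual operators may differ in form:

```python
import numpy as np

def tournament_select(pop, fitness, k, rng):
    """Return the best of k randomly drawn individuals (minimization)."""
    idx = rng.choice(len(pop), size=k, replace=False)
    return pop[idx[np.argmin(fitness[idx])]]

def blend_crossover(p1, p2, rng):
    """Component-wise random blend of two parents (real-coded crossover)."""
    alpha = rng.random(p1.shape)
    return alpha * p1 + (1 - alpha) * p2

rng = np.random.default_rng(42)
pop = rng.uniform(-5, 5, size=(6, 2))
fitness = np.array([np.sum(ind**2) for ind in pop])
parent_a = tournament_select(pop, fitness, k=3, rng=rng)
parent_b = tournament_select(pop, fitness, k=3, rng=rng)
print(blend_crossover(parent_a, parent_b, rng))
```

Tournament selection biases reproduction toward fitter individuals without requiring global fitness ranking, which keeps each generation cheap.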

Example:

```python
from econometron.utils.optimizers import genetic_algorithm
import numpy as np

def obj_func(x):
    return np.sum(x**2)

x0 = np.array([1.0, 2.0])
lb = np.array([-5.0, -5.0])
ub = np.array([5.0, 5.0])

result = genetic_algorithm(obj_func, x0, lb, ub, pop_size=50, n_gen=100)
print(result)
```

4. minimize_qn(x0, func, maxit=500, gtol=None, ptol=1e-7, verbose=False)

Purpose: Implements a Quasi-Newton (QN) optimization method using BFGS. Gradient-based; suitable for smooth, differentiable functions.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| x0 | np.ndarray | Initial parameter vector. | None |
| func | Callable | Scalar objective function to minimize. | None |
| maxit | int | Maximum iterations. | 500 |
| gtol | float | Gradient tolerance; defaults to machine epsilon^(1/3). | None |
| ptol | float | Parameter change tolerance. | 1e-7 |
| verbose | bool | Print iteration details. | False |

Returns:

  • x (np.ndarray): Optimal parameter vector.
  • crit (np.ndarray): Convergence criteria array [status, grad_norm, param_change, func_value, iterations].

Explanation:

  • Hessian initialized as identity. Gradient computed.
  • Main loop: computes search direction, line search, updates parameters, updates Hessian with BFGS.
  • Stops if gradient norm < gtol, parameter change < ptol, or maxit reached.
  • Verbose prints iteration info.
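The BFGS step mentioned above can be sketched in its textbook inverse-Hessian form. This is a generic illustration, not minimize_qn's exact internal variant (which may update the Hessian itself rather than its inverse):

```python
import numpy as np

def bfgs_update(H, s, y):
    """BFGS update of an approximate inverse Hessian H.
    s = x_{k+1} - x_k (parameter step), y = grad_{k+1} - grad_k."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# One update on f(x) = ||x||^2, whose gradient is 2x, so y = 2s.
H = np.eye(2)
s = np.array([0.5, -0.25])
y = 2 * s
H_new = bfgs_update(H, s, y)
print(H_new)
```

The update preserves symmetry and enforces the secant condition H_new @ y == s, which is what lets the approximation track local curvature without forming second derivatives.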

Example:

```python
from econometron.utils.optimizers import minimize_qn
import numpy as np

def obj_func(x):
    return np.sum(x**2)

x0 = np.array([1.0, 2.0])
x, crit = minimize_qn(x0, obj_func, verbose=True)
print("Solution:", x, "Criteria:", crit)
```

Notes

  • Algorithm Characteristics: SA is well suited to escaping local minima in search of a global optimum; GA explores the parameter space broadly via a population; QN converges quickly on smooth, differentiable objectives.
  • Integration: Compatible with kalman_objective and ols_estimator.
  • Verbose Output: SA and GA print parameter/function values; QN prints iteration details.
  • Numerical Stability: Bounds handled explicitly; QN uses line search and BFGS stabilization.

Dependencies

  • numpy, scipy.stats.norm, econometron.utils.solver, colorama.

Use Case Examples

  • DSGE models with kalman_objective.
  • Regression via ols_estimator.
  • Bayesian inference via MAP estimates.