Metadata-Version: 2.4
Name: MDAF_benchmarks
Version: 0.1.0
Summary: A library for mathematical benchmark functions.
Author-email: Iannick Gagnon <iannick.gagnon@etsmtl.ca>
License: MIT
Requires-Python: >=3.8
Description-Content-Type: text/markdown
Requires-Dist: autograd==1.6.2
Requires-Dist: cloudpickle==3.0.0
Requires-Dist: colorama==0.4.6
Requires-Dist: contourpy==1.2.1
Requires-Dist: cycler==0.12.1
Requires-Dist: dill==0.3.8
Requires-Dist: fonttools==4.51.0
Requires-Dist: future==1.0.0
Requires-Dist: iniconfig==2.0.0
Requires-Dist: kiwisolver==1.4.5
Requires-Dist: line_profiler==4.1.3
Requires-Dist: matplotlib==3.8.4
Requires-Dist: numpy==1.26.4
Requires-Dist: packaging==25.0
Requires-Dist: pandas==2.2.2
Requires-Dist: pillow==10.3.0
Requires-Dist: pluggy==1.5.0
Requires-Dist: pyparsing==3.1.2
Requires-Dist: pytest==8.2.0
Requires-Dist: python-dateutil==2.9.0.post0
Requires-Dist: pytz==2024.1
Requires-Dist: scipy==1.13.0
Requires-Dist: six==1.16.0
Requires-Dist: tzdata==2024.1
Provides-Extra: dev

![](./doc/img/logo_top.png)
![](./doc/img/logo_bottom.png)

The Metaheuristics Design And Analysis Framework (MDAF) is an open-source collection of Python modules designed specifically for the metaheuristics research community. It provides off-the-shelf tools that promote and facilitate rigorous scientific research based on firmly established best practices.

# The Benchmarks Module

The `MDAF_benchmarks` module is a specialized tool designed to streamline the process of defining and evaluating objective functions. It offers a low-code template that simplifies the setup and execution of complex objective functions, allowing researchers to concentrate more on refining their algorithms and less on the intricacies of coding.

## Version

This library uses calendar versioning (CalVer) in the form `MAJOR.YEAR.NUMBER`, where:

- `MAJOR`: The major release number.
- `YEAR`: The year of the release.
- `NUMBER`: The release number within that year.

Calendar versioning was preferred over semantic versioning (SemVer) because it is easier to interpret in journal articles.
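As a concrete illustration of the scheme above, the following sketch splits a hypothetical CalVer string into its three components (the version string shown is an example, not an actual release):

```python
from typing import Tuple


def parse_calver(version: str) -> Tuple[int, int, int]:
    """Split a `MAJOR.YEAR.NUMBER` CalVer string into integer parts."""
    major, year, number = (int(part) for part in version.split("."))
    return major, year, number


print(parse_calver("1.2025.3"))  # (1, 2025, 3)
```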

## How to cite

**APA Style**:
```
Gagnon, I. (YEAR). Metaheuristics Design and Analysis Framework (MDAF), Version MAJOR.YEAR.NUMBER [Software]. Retrieved from https://github.com/iannickgagnon/MDAF_benchmarks.
```

**MLA Style**: 
```
Gagnon, Iannick. Metaheuristics Design And Analysis Framework. Version MAJOR.YEAR.NUMBER. YEAR. Software. Accessed DAY-MONTH-YEAR. https://github.com/iannickgagnon/MDAF_benchmarks.
```

**Chicago Style**:
```
Gagnon, Iannick. YEAR. Metaheuristics Design And Analysis Framework (MDAF), Version MAJOR.YEAR.NUMBER. Software. https://github.com/iannickgagnon/MDAF_benchmarks.
```

Replace `YEAR` with the year of the version used, `MAJOR.YEAR.NUMBER` with the full version, and `DAY-MONTH-YEAR` with the date you accessed the software online (i.e., the date on which you cloned the repository).

## Features

- **Automated accounting**: The number of times a function is evaluated is automatically incremented and stored (`ObjectiveFunction.nb_calls`).
- **Automatic differentiation**: Built-in numerical first- and second-order differentiation (`ObjectiveFunction.compute_first_derivative()` and `ObjectiveFunction.compute_second_derivative()`).
- **Parallel evaluation**: Built-in support for parallel evaluation (`ObjectiveFunction.parallel_evaluate()`).
- **Profiling**: Built-in profiling of the objective function calls (`ObjectiveFunction.profile()`).
- **Shifting**: Apply shifts to evaluate robustness to positional bias (`ObjectiveFunction.apply_shift()`).
- **Rotating**: Apply rotations to evaluate robustness to linkage (`ObjectiveFunction.apply_rotation()`).
- **Noisy evaluation**: Apply noise to evaluate robustness to noisy environments (`ObjectiveFunction.apply_noise()`).
- **Timing**: Built-in timing for performance optimization and comparison (`ObjectiveFunction.time()`).
- **Visualization**: Dynamic visualization in 2D or 3D (`ObjectiveFunction.visualize()`).
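To make the accounting and shifting features above concrete, here is a minimal self-contained sketch of the pattern. This is not the `MDAF_benchmarks` implementation; the class and member names merely mirror those listed above (`nb_calls`, `apply_shift`) for illustration:

```python
import numpy as np


class SketchObjectiveFunction:
    """Illustrative stand-in for the ObjectiveFunction pattern described above."""

    def __init__(self, func):
        self.func = func
        self.nb_calls = 0   # automated accounting: incremented on every evaluation
        self.shift = None   # optional positional shift

    def apply_shift(self, shift):
        # Store a shift vector to test robustness to positional bias
        self.shift = np.asarray(shift, dtype=float)

    def evaluate(self, x):
        self.nb_calls += 1
        x = np.asarray(x, dtype=float)
        if self.shift is not None:
            x = x - self.shift  # evaluate the shifted landscape
        return self.func(x)


# Shifted sphere function: the optimum moves from the origin to (1, 1)
sphere = SketchObjectiveFunction(lambda x: float(np.sum(x ** 2)))
sphere.apply_shift([1.0, 1.0])
print(sphere.evaluate([1.0, 1.0]))  # 0.0 at the shifted optimum
print(sphere.nb_calls)              # 1
```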

## How to install

Install pip, Python's package manager, if you don't already have it:

* https://pip.pypa.io/en/stable/installation/

Then use the following command to install this library:

```bash
pip install MDAF_benchmarks
```

## How to clone

Install the open-source Git version control software if you don't already have it:

* https://git-scm.com/downloads

Clone this repository to your local machine using:

```bash
git clone https://github.com/iannickgagnon/MDAF_benchmarks.git
```

If you aren't familiar with Git, you may wish to consult the following free online e-book:

* https://git-scm.com/book/en/v2 

I have also greatly benefited from the following official cheat sheet from GitHub:
* https://education.github.com/git-cheat-sheet-education.pdf

## Documentation

Visit the `MDAF_benchmarks` wiki ([click here](https://github.com/iannickgagnon/MDAF_benchmarks/wiki)) for the following:

* User guide
* Contribution guide
* API reference

## Future work

- **Multiple objectives**: Derive a `MultiobjectiveFunction` class from `ObjectiveFunction`. See Issue https://github.com/iannickgagnon/MDAF_benchmarks/issues/9.
- **Enhance timing**: Enable timing for the `ObjectiveFunction.parallel_evaluate()` method. See Issue https://github.com/iannickgagnon/MDAF_benchmarks/issues/11.

## Contact

**Please note that I am not looking for prospective students. I sincerely appreciate your messages, but I cannot respond to them because of other priorities.**

While I cannot guarantee a timely response, I invite you to contact me if you have any comments: iannick.gagnon@etsmtl.ca. Also, feel free to add me on LinkedIn: https://www.linkedin.com/in/iannick-gagnon-3304311a6/.
