From ef2d1923382d8b67a0432075206de323a2fa79f6 Mon Sep 17 00:00:00 2001
From: Zizhe Wang <zizhe.wang@tu-dresden.de>
Date: Sat, 8 Jun 2024 09:27:19 +0200
Subject: [PATCH] docs: update for new features

---
 .gitignore |  4 +--
 README.md  | 94 ++++++++++--------------------------------------
 2 files changed, 18 insertions(+), 80 deletions(-)

diff --git a/.gitignore b/.gitignore
index 662a506..a0adb3d 100644
--- a/.gitignore
+++ b/.gitignore
@@ -9,7 +9,6 @@
 *.mat
 *.tmp
 *.exe
-*.json
 *.makefile
 archive/
 __pycache__/
@@ -17,4 +16,5 @@ __pycache__/
 *.tokens
 /src/modelicaListener.py
 /src/modelicaParser.py
-/src/modelicaLexer.py
\ No newline at end of file
+/src/modelicaLexer.py
+feature_model.json
\ No newline at end of file

diff --git a/README.md b/README.md
index 7a8648b..d26c193 100644
--- a/README.md
+++ b/README.md
@@ -15,17 +15,18 @@ I want to design a general framework to solve these two challenges by
 
 1. coupling Python's MOO frameworks to Modelica using OMPython,
-2. Speed up MOO by enabling parallel computing.
+2. speeding up MOO by enabling parallel computing and adaptive instance selection.
 
 ## Framework
 
 #### Highlights:
 
-1. **Easy to configure:** All settings and configurations can be defined in `config.py`.
-2. **SoTA algorithms for MOO:** Support different libraries and algorithms.
+1. **Easy to configure:** All settings and configurations can be defined in `config.json`.
+2. **SoTA algorithms for MOO:** Dynamic import of algorithms from *pymoo*.
 3. **Enable use of** **parallel computing**: For accelerated process.
-4. **Support transformation into feature models**: To better analyze and understand large-scale models.
-5. **Comprehensive debugging system**: Debugging functions for all critical steps.
+4. **Enable use of adaptive instance selection:** Automated search space reduction.
+5. **Support transformation into feature models**: To better analyze and understand large-scale models.
+6. **Comprehensive debugging system**: Debugging functions for all critical steps.
 
 #### Structure:
 
 ```
@@ -36,10 +37,13 @@
   |-- parse_modelica.py
   |-- feature_model.py
 (Optimization Operation)
-  |-- config.py
+  |-- config.json
+  |-- config.py
   |-- optimize_main.py
   |-- parallel_computing.py
+  |-- adaptive_instance_selection.py
   |-- optimization_libraries.py
+  |-- evaluate.py
 ```
 
 * (Feature Model Transformation)
@@ -47,13 +51,12 @@
   * `parse_modelica.py`: parse a Modelica model to extract it components and their parameters
   * `feature_model.py`: create a feature model and add the extracted components
 * (Optimization Operation)
-  * `config.py`: global settings and configurations
+  * `config.json` & `config.py`: global settings and configurations
   * `optimize_main.py`: main optimization script
-  * `parallel_computing.py`: parallel_computing
-  * `optimization_libraries`: initialization libraries and algorithms
-
-
-(*) There is another one-script Python file provided, which corresponds to the 4 optimization scripts.
+  * `parallel_computing.py`: parallel computing
+  * `adaptive_instance_selection.py`: automated search space reduction
+  * `optimization_libraries.py`: dynamic import of algorithms from *pymoo*
+  * `evaluate.py`: performance evaluation (time efficiency, optimization accuracy, additional statistical analysis)
 
 #### Usage
 
@@ -63,9 +66,7 @@
 
 [https://wangzizhe.github.io/MOO4Modelica/docs/Example.html)](https://wangzizhe.github.io/MOO4Modelica/docs/Example.html)
 
-## Related Work
-
-### I. Publications
+## Background
 
 ### 1. A Multi-objective Optimization Algorithm and Process for Modelica Model
 
@@ -97,66 +98,3 @@ Bender, Daniel. "DESA: Optimization of variable structure modelica models using
 **Zizhe's thoughts:**
 
 * This only works in Dymola...
-
-### II. Frameworks
-
-#### 0. pymoo: Multi-objective Optimization in Python
-
-https://pymoo.org
-
-The framework offers state-of-the-art single- and multi-objective optimization algorithms and many more features related to multi-objective optimization such as visualization and decision-making.
-
-#### 1. **DEAP (Distributed Evolutionary Algorithms in Python)**
-
-DEAP is a flexible framework for evolutionary algorithms. It provides tools for single and multi-objective optimization, making it suitable for a wide range of optimization problems.
-
-- **Documentation**: [https://deap.readthedocs.io](https://deap.readthedocs.io/)
-- **GitHub**: https://github.com/DEAP/deap
-
-#### 2. **Platypus**
-
-Platypus is a framework for evolutionary computing with a focus on multi-objective optimization. It supports a variety of algorithms and is designed to be user-friendly.
-
-- **Documentation**: [https://platypus.readthedocs.io](https://platypus.readthedocs.io/)
-- **GitHub**: https://github.com/Project-Platypus/Platypus
-
-#### 3. **PyGMO (Python Global Multiobjective Optimizer)**
-
-PyGMO is a scientific library for massively parallel optimization. It provides a wide range of optimization algorithms, including those for multi-objective optimization.
-
-- **Documentation**: https://esa.github.io/pygmo2/
-- **GitHub**: https://github.com/esa/pygmo2
-
-#### 4. **SciPy**
-
-SciPy is a fundamental library for scientific computing in Python, which includes several optimization routines. While it focuses more on classical optimization algorithms, it is still quite powerful for certain types of optimization problems.
-
-- **Documentation**: https://docs.scipy.org/doc/scipy/reference/optimize.html
-
-#### 5. **Nevergrad**
-
-Nevergrad is a gradient-free optimization platform by Facebook AI Research. It provides a variety of algorithms suitable for optimization tasks where gradients are not available.
-
-- **Documentation**: https://facebookresearch.github.io/nevergrad/
-- **GitHub**: https://github.com/facebookresearch/nevergrad
-
-#### 6. **Optuna**
-
-Optuna is an automatic hyperparameter optimization software framework, particularly for machine learning, but it can be used for general optimization tasks. It supports both single-objective and multi-objective optimization.
-
-- **Documentation**: https://optuna.readthedocs.io/en/stable/
-- **GitHub**: https://github.com/optuna/optuna
-
-#### 7. **NSGA-II (Non-dominated Sorting Genetic Algorithm II) Implementations**
-
-While NSGA-II is available in `pymoo`, other libraries also provide implementations of this popular algorithm for multi-objective optimization:
-
-- **jMetalPy**: https://jmetalpy.readthedocs.io/en/latest/ (focused on multi-objective optimization)
-- **Inspyred**: http://aarongarrett.inspyred.github.io/ (another evolutionary computing framework)
-
-#### 8. **CMA-ES (Covariance Matrix Adaptation Evolution Strategy)**
-
-CMA-ES is a robust optimization algorithm suitable for difficult optimization problems. Libraries such as `pycma` provide implementations of this algorithm.
-
-- **Documentation**: https://pypi.org/project/cma/
-- **GitHub**: https://github.com/CMA-ES/pycma
-- 
GitLab
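The patch moves all settings from `config.py` into `config.json`, but the schema itself is not part of the diff. Purely as a hypothetical illustration of what such a file could look like, every key and value below invented for the example:

```json
{
  "model_file": "ExampleModel.mo",
  "model_name": "ExampleModel",
  "algorithm": "NSGA2",
  "population_size": 50,
  "n_generations": 100,
  "parallel_workers": 4,
  "adaptive_instance_selection": true
}
```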
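The README now describes `optimization_libraries.py` as "dynamic import of algorithms from *pymoo*". The repository's code is not shown in this patch, so the following is only a sketch of the general mechanism: resolve an algorithm name from the configuration to a class at import time via `importlib` instead of hard-coding imports. The registry entries mirror pymoo's documented module layout (e.g. `pymoo.algorithms.moo.nsga2.NSGA2`), but the registry itself is an assumption.

```python
import importlib

# Hypothetical name -> (module, class) registry; the pymoo paths follow
# pymoo's documented package layout but are only illustrative here.
ALGORITHM_REGISTRY = {
    "NSGA2": ("pymoo.algorithms.moo.nsga2", "NSGA2"),
    "NSGA3": ("pymoo.algorithms.moo.nsga3", "NSGA3"),
}

def load_class(module_path, class_name):
    """Import `module_path` at runtime and return its attribute `class_name`."""
    module = importlib.import_module(module_path)
    return getattr(module, class_name)

def load_algorithm(name):
    """Resolve a configured algorithm name to a pymoo algorithm class."""
    module_path, class_name = ALGORITHM_REGISTRY[name]
    return load_class(module_path, class_name)

if __name__ == "__main__":
    # The mechanism is generic, so it can be smoke-tested without pymoo
    # installed by resolving a standard-library class instead:
    Counter = load_class("collections", "Counter")
    print(Counter("aab"))  # Counter({'a': 2, 'b': 1})
```

In practice the returned class would be instantiated with configured parameters and handed to `pymoo.optimize.minimize`.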
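Highlight 3 and `parallel_computing.py` concern evaluating candidates in parallel. The actual implementation is not in this patch; the sketch below only shows the pattern, with a placeholder objective standing in for an OMPython simulation call. A thread pool keeps the example self-contained; since OpenModelica simulations run as external processes, threads are enough to overlap them, and a process pool is the other common choice for CPU-bound work.

```python
from concurrent.futures import ThreadPoolExecutor

def run_simulation(param):
    # Placeholder standing in for a call that simulates one candidate
    # (e.g. via OMPython) and returns its objective values.
    return (param ** 2, (param - 2) ** 2)

def evaluate_population(params, workers=4):
    """Evaluate all candidates concurrently, preserving input order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Executor.map yields results in input order regardless of
        # which worker finishes first.
        return list(pool.map(run_simulation, params))

if __name__ == "__main__":
    print(evaluate_population([0, 1, 2, 3]))  # [(0, 4), (1, 1), (4, 0), (9, 1)]
```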
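The new `adaptive_instance_selection.py` is described only as "automated search space reduction"; the algorithm itself is not shown. As a generic illustration of one such strategy (not the repository's method): evaluate a batch of candidates, keep the best fraction, and shrink the parameter bounds to the region the survivors span.

```python
def shrink_bounds(samples, scores, bounds, keep_fraction=0.5):
    """Return new (low, high) bounds spanned by the best-scoring samples.

    samples: candidate parameter values; scores: lower is better;
    bounds: fallback (low, high) if the surviving region is degenerate.
    """
    ranked = sorted(zip(scores, samples))           # best (lowest score) first
    n_keep = max(1, int(len(ranked) * keep_fraction))
    survivors = [x for _, x in ranked[:n_keep]]
    low, high = min(survivors), max(survivors)
    if low == high:                                 # all survivors identical
        return bounds                               # keep the old bounds
    return (low, high)

if __name__ == "__main__":
    xs = [0.0, 1.0, 2.0, 3.0, 4.0]
    scores = [(x - 2.1) ** 2 for x in xs]           # optimum near x = 2.1
    print(shrink_bounds(xs, scores, (0.0, 4.0)))    # (2.0, 3.0)
```

Iterating this step narrows the search space around promising regions between optimization rounds.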