Improve mlflow integration and add more models (#1331)

* Add more Spark models and improve mlflow integration

* Update test_extra_models, setup and gitignore

* Remove autofe

* Remove autofe

* Remove autofe

* Sync changes in internal

* Fix test for env without pyspark

* Fix import errors

* Fix tests

* Fix typos

* Fix pytorch-forecasting version

* Remove internal funcs, rename _mlflow.py

* Fix import error

* Fix dependency

* Fix experiment name setting

* Fix dependency

* Update pandas version

* Update pytorch-forecasting version

* Add a warning message when has_automl is False

* Fix test errors with nltk 3.8.2

* Don't enable mlflow logging without an active run (see the usage sketch after this list)

* Fix issue where pytorch-forecasting models can't be pickled

* Update pyspark tests condition

* Update synapseml

* Update synapseml

* No parent run, no logging for OSS

* Log when autolog is enabled

* Upgrade code

* Enable autolog for tune

* Increase time budget for test

* End the current run before starting a new one

* Update parent run

* Fix import error

* clean up

* Skip macOS and Windows

* Update notes

* Update default value of model_history
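For context, here is a minimal usage sketch (not part of this commit) of the MLflow behavior described in the list above: FLAML only logs when there is an active run, and it reports when autologging is enabled. The run name, task, and time budget below are illustrative assumptions, not values taken from the change.

# Hypothetical sketch, assuming a recent flaml, scikit-learn, and mlflow are installed.
import mlflow
from sklearn.datasets import load_iris
from flaml import AutoML

X, y = load_iris(return_X_y=True)

mlflow.autolog()  # per the notes above, FLAML logs a message when autolog is enabled
automl = AutoML()

# Per the notes above, no active run means no MLflow logging, so start one explicitly.
with mlflow.start_run(run_name="flaml_automl"):
    automl.fit(X_train=X, y_train=y, task="classification", time_budget=30)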
Author: Li Jiang
Date: 2024-08-13 15:53:47 +08:00
Committed by: GitHub
Parent: bd34b4e75a
Commit: 5bfa0b1cd3
22 changed files with 3145 additions and 317 deletions

@@ -54,10 +54,15 @@ jobs:
           pip install -e .
           python -c "import flaml"
           pip install -e .[test]
-      - name: On Ubuntu python 3.8, install pyspark 3.2.3
-        if: matrix.python-version == '3.8' && matrix.os == 'ubuntu-latest'
+      - name: On Ubuntu python 3.10, install pyspark 3.4.1
+        if: matrix.python-version == '3.10' && matrix.os == 'ubuntu-latest'
         run: |
-          pip install pyspark==3.2.3
+          pip install pyspark==3.4.1
           pip list | grep "pyspark"
+      - name: On Ubuntu python 3.11, install pyspark 3.5.1
+        if: matrix.python-version == '3.11' && matrix.os == 'ubuntu-latest'
+        run: |
+          pip install pyspark==3.5.1
+          pip list | grep "pyspark"
       - name: If linux and python<3.11, install ray 2
         if: matrix.os == 'ubuntu-latest' && matrix.python-version != '3.11'
@@ -77,11 +82,6 @@ jobs:
         if: matrix.python-version == '3.8' || matrix.python-version == '3.9'
         run: |
           pip install -e .[vw]
-      - name: Uninstall pyspark on (python 3.9) or windows
-        if: matrix.python-version == '3.9' || matrix.os == 'windows-2019'
-        run: |
-          # Uninstall pyspark to test env without pyspark
-          pip uninstall -y pyspark
       - name: Test with pytest
         if: matrix.python-version != '3.10'
         run: |
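The step removed in the hunk above used to uninstall pyspark so the suite also covered environments without Spark. A hedged sketch of how Spark-dependent tests are commonly guarded instead is shown below; the skip_spark name and the test body are illustrative assumptions, not taken from this commit.

# Illustrative pytest guard; assumes pytest is installed and pyspark may or may not be.
import importlib.util

import pytest

skip_spark = importlib.util.find_spec("pyspark") is None


@pytest.mark.skipif(skip_spark, reason="pyspark is not installed")
def test_spark_session_starts():
    from pyspark.sql import SparkSession

    # Start a tiny local Spark session and run a trivial query.
    spark = SparkSession.builder.master("local[1]").getOrCreate()
    assert spark.range(3).count() == 3
    spark.stop()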