
Devel #2897

Merged · merged 76 commits from devel into master on Feb 2, 2024
Conversation

@kmantel (Collaborator) commented Jan 31, 2024

No description provided.

dependabot bot and others added 30 commits December 20, 2023 16:23
…o 4 (#2865)" (#2867)

This reverts commit a73ab28.

Fails to download documentation artifacts.
Not used in compiled code. It's been absent from compiled parameter
structures for some time.

Signed-off-by: Jan Vesely <[email protected]>
Not used, but present in ports and projections.

Signed-off-by: Jan Vesely <[email protected]>
Not used in compiled execution.

Signed-off-by: Jan Vesely <[email protected]>
Signed-off-by: Jan Vesely <[email protected]>
Costs are combined in 'net_outcome' function of OCM.

Signed-off-by: Jan Vesely <[email protected]>
Drop more unused parameters from compiled structures.
Convert Projection to the new API.
Convert Port to the new API.
Remove unused cost calculation code from ControlSignal.
Convert Autodiff to the new API.
…ruct optuna study

They currently refer to the same object but 'opt_func' was passed explicitly
to the '_fit_optuna' method and can theoretically be different.

Signed-off-by: Jan Vesely <[email protected]>
…er instances

The test is using instantiated samplers so it needs to provide fixed
seeds to guarantee deterministic behaviour.
Passes 100 iterations of
test_parameter_optimization_ddm[LLVM-optuna_cmaes_sampler] without
failures.

Bug: #2874
Signed-off-by: Jan Vesely <[email protected]>
Replacing `sampler._rng`, as done in `PECOptimizationFunction`, only works for `optuna.RandomSampler`.
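The seeding requirement above can be sketched with a minimal stand-in (the `RandomSampler` class below is hypothetical, standing in for an instantiated sampler such as `optuna.samplers.RandomSampler(seed=...)`; `_rng` mirrors the private attribute mentioned above):

```python
import random

class RandomSampler:
    """Hypothetical stand-in for an instantiated sampler; '_rng'
    mirrors the private attribute mentioned in the commit message."""
    def __init__(self, seed=None):
        self._rng = random.Random(seed)

    def sample(self, low, high):
        return self._rng.uniform(low, high)

# Without a seed, two instantiated samplers can diverge between runs;
# with a fixed seed, repeated runs are deterministic.
a = RandomSampler(seed=42).sample(0.0, 1.0)
b = RandomSampler(seed=42).sample(0.0, 1.0)
assert a == b
```

Constructing each sampler with an explicit seed, rather than patching a private `_rng` attribute afterwards, works uniformly across sampler types.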
…es a warning

Fixes 2 instances of: UserWarning: No inputs provided in call to ...

Signed-off-by: Jan Vesely <[email protected]>
Simplify parametrization ids.

Fixes: SyntaxWarning: "is" with a literal. Did you mean "=="?
Signed-off-by: Jan Vesely <[email protected]>
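A minimal illustration of the warning being fixed (the variable name is hypothetical):

```python
mode = "LLVM"

# SyntaxWarning: "is" with a literal. Did you mean "=="?
# Identity comparison against a string literal may only succeed due to
# CPython string interning:
#     if mode is "LLVM": ...
# Value equality is what parametrization ids actually need:
assert mode == "LLVM"
```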
Return both result and num_executions_before_finished.

Fixes: PytestBenchmarkWarning: Benchmark fixture was not used at all in this test!
Signed-off-by: Jan Vesely <[email protected]>
Makes the test benchmark agnostic.
Remove explicit benchmark re-run from the test.

Fixes: PytestBenchmarkWarning: Benchmark fixture was not used at all in this test!
Signed-off-by: Jan Vesely <[email protected]>
Having multiple parameter ports for parameters of the same name is now an
error.
There is no difference in emitted warnings with or without this filter.

Signed-off-by: Jan Vesely <[email protected]>
Tests should not return anything.
Fixes: PytestReturnNotNoneWarning: Expected None, but tests/composition/test_composition.py::TestNestedCompositions::test_invalid_projection_deletion_when_nesting_comps returned (Composition ocomp), which will be an error in a future version of pytest.  Did you mean to use `assert` instead of `return`?

Signed-off-by: Jan Vesely <[email protected]>
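The return-vs-assert fix can be sketched like this (`make_composition` is a hypothetical stand-in for the test's model constructor):

```python
def make_composition():
    # hypothetical stand-in for building a Composition
    return object()

# Before: returning a value from a test triggers
# PytestReturnNotNoneWarning and will be an error in a future pytest.
def test_nesting_returns():
    return make_composition()

# After: tests should assert, not return.
def test_nesting_asserts():
    comp = make_composition()
    assert comp is not None

test_nesting_asserts()
```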
The construct has been deprecated and pytest warns about it:
PytestRemovedIn8Warning: Passing None has been deprecated.
  See https://docs.pytest.org/en/latest/how-to/capture-warnings.html#additional-use-cases-of-warnings-in-tests for alternatives in common use cases.

Address three different situations in tests in a way that does not hide
unrelated warnings.
 * test_composition.py::*_subset_duplicate_warnings:
   capture all warnings and search them for the expected message if
   verbosity == True
   assert that there are no warnings if verbosity == False

 * test_projection_specifications.py::test_no_warning_when_matrix_specified:
   Use filterwarnings mark to change the undesired warning into an error

 * test_control.py::test_model_based_num_estimates
   Use null context or pytest.warns context based on whether warning is
   expected

Signed-off-by: Jan Vesely <[email protected]>
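The three replacement patterns can be sketched as follows (the warning text and the `verbose` flag are hypothetical, not the actual test code):

```python
import contextlib
import warnings

import pytest

def build_model(verbose):
    if verbose:
        warnings.warn("duplicate projection", UserWarning)

# 1) Capture all warnings and search them for the expected message;
#    assert that there are none when verbosity is off.
def check_duplicate_warning(verbose):
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        build_model(verbose)
    messages = [str(w.message) for w in caught]
    if verbose:
        assert any("duplicate projection" in m for m in messages)
    else:
        assert messages == []

# 2) Turn an undesired warning into an error via a mark:
#    @pytest.mark.filterwarnings("error:.*duplicate projection.*")

# 3) Pick a null context or pytest.warns based on whether the warning
#    is expected, instead of the deprecated pytest.warns(None).
def check_with_context(expect_warning):
    ctx = (pytest.warns(UserWarning, match="duplicate projection")
           if expect_warning else contextlib.nullcontext())
    with ctx:
        build_model(expect_warning)

for flag in (True, False):
    check_duplicate_warning(flag)
    check_with_context(flag)
```

Each pattern keeps unrelated warnings visible, unlike a blanket `pytest.warns(None)`.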
jvesely and others added 14 commits January 21, 2024 16:14
requirements: Add scipy to requirements file
A collection of minor fixes and updates that improves the handling of warnings, mostly by updating tests to use new interfaces.
The original distribution of warnings when running 4 test jobs was 7299 emitted warnings in 211 locations:
5 DeprecationWarning
9 FutureWarning
1 MatplotlibDeprecationWarning
4 PendingDeprecationWarning
12 PNLCompilerWarning
2 PytestBenchmarkWarning
1 PytestRemovedIn8Warning
1 PytestReturnNotNoneWarning
4 RuntimeWarning
1 SyntaxWarning
171 UserWarning

Changes in this PR reduce this to 5696 emitted warnings in 191 locations:
2 FutureWarning
1 MatplotlibDeprecationWarning
4 PendingDeprecationWarning
12 PNLCompilerWarning
2 RuntimeWarning
170 UserWarning

Individual commits target different warning classes.
Instead of each job on PR uploading and overwriting the artifact,
upload it only in the 'base' job.

Fixes: bb9dc17
	("ci: fix missing PR number (#2109)")

Signed-off-by: Jan Vesely <[email protected]>
The package is identical to the vanilla run but for the fixed minimum
versions of requirements.

Fixes: 76eeef5
	("ci/ga: Add a CI run with version restricted dependencies")

Signed-off-by: Jan Vesely <[email protected]>
Fixes: 76eeef5
	("ci/ga: Add a CI run with version restricted dependencies")

Signed-off-by: Jan Vesely <[email protected]>
Restrict upload of the "pr_number" artifact to the "base" job; it only runs on pull requests and there is only one such job.
Add the version restriction to the name of the test result artifact.
Do not upload built dist packages for the "min" version test run; they are no different from the standard run other than using stricter dependencies.
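A workflow fragment along these lines could implement the restriction (step names, the `version-restrict` matrix key, and file paths here are hypothetical, not the repository's actual workflow):

```yaml
# Hypothetical fragment: upload the PR number only once, from the
# unrestricted ("base") job, and version-tag the test results artifact.
- name: Upload PR number
  if: github.event_name == 'pull_request' && matrix.version-restrict == 'base'
  uses: actions/upload-artifact@v4
  with:
    name: pr_number
    path: ./pr_number.txt

- name: Upload test results
  uses: actions/upload-artifact@v4
  with:
    name: test-results-${{ matrix.python-version }}-${{ matrix.version-restrict }}
    path: tests_out.xml
```

Unique artifact names matter here because actions/upload-artifact v4 rejects uploads to an existing artifact name rather than overwriting it.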
* github-actions(deps): bump actions/download-artifact from 3 to 4

Bumps [actions/download-artifact](https://github.com/actions/download-artifact) from 3 to 4.
- [Release notes](https://github.com/actions/download-artifact/releases)
- [Commits](actions/download-artifact@v3...v4)

---
updated-dependencies:
- dependency-name: actions/download-artifact
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <[email protected]>

* github-actions(deps): bump actions/upload-artifact from 3 to 4

Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from 3 to 4.
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](actions/upload-artifact@v3...v4)

---
updated-dependencies:
- dependency-name: actions/upload-artifact
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <[email protected]>

---------

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Code scanning / CodeQL — Check notice: Empty except (Note, test)

    try:
        next_res_elem = np.reshape(next_res_elem, node_output_shapes[node.id])
    except KeyError:
        pass

'except' clause does nothing but pass and there is no explanatory comment.
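A stdlib-only sketch of the kind of fix CodeQL asks for here, i.e. an explanatory comment on the otherwise-empty handler (names are simplified from the flagged snippet; the reshape call is elided):

```python
node_output_shapes = {}          # node id -> expected output shape
next_res_elem = [1.0, 2.0]       # flat result for one node
node_id = "A"

try:
    # reshape to the recorded output shape, when one exists
    shape = node_output_shapes[node_id]
    # ... np.reshape(next_res_elem, shape) in the real code ...
except KeyError:
    # no recorded shape for this node; keep the flat result unchanged
    # (an explanatory comment like this silences the "empty except" note)
    pass

assert next_res_elem == [1.0, 2.0]
```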
@coveralls commented Jan 31, 2024

Coverage Status: 84.947% (+0.08%) from 84.871% when pulling 47a59ec on devel into f062be4 on master.

@jvesely (Collaborator) commented Jan 31, 2024

Base build of docs will fail (so no docs diff) because of issues with sphinx transitive dependencies that are addressed on devel.

Split numpy info from OS machine info.

Signed-off-by: Jan Vesely <[email protected]>
Runners are running in a VM and test workers running on all
vCPUs might get better performance than bare metal threads
depending on the hypervisor vCPU scheduling.

Signed-off-by: Jan Vesely <[email protected]>
A mark applied to a fixture doesn't transfer to tests using the fixture (and does nothing).
Apply marks to parameters instead.

Simplify model construction. Instead of a fixture that returns a
constructor function, just use the constructor function directly.

Signed-off-by: Jan Vesely <[email protected]>
@kmantel kmantel merged commit 9b68ab8 into master Feb 2, 2024
62 of 64 checks passed
4 participants