
[Bug]: Change Warn on excessive memory consumption in background jobs to info #45682

Closed
AndyXheli opened this issue Jun 5, 2024 · 6 comments · Fixed by #45530

AndyXheli commented Jun 5, 2024


Bug description

The log file is getting spammed with these warnings. I would recommend logging this at info level rather than as a warning:
Used memory grew by more than 10 MB when executing job OC\FilesMetadata\Job\UpdateSingleMetadata (id: 175902, arguments: ["usernae",1042806]): 26.1 MB (before: 10.7 MB)
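
For context, the pattern in question is a memory check around each background job run. The sketch below shows the shape of that check with the log call switched to info level as requested; the function name, the threshold constant, and the logger wiring are assumptions for illustration, not the actual Nextcloud cron code.

```php
<?php
// Minimal sketch (not the actual Nextcloud server code) of a memory-growth
// check around a background job, logging at info level as requested here.

use Psr\Log\LoggerInterface;

function runJobWithMemoryCheck(callable $job, string $jobClass, LoggerInterface $logger): void {
    $before = memory_get_usage();

    $job();

    $after = memory_get_usage();
    $thresholdBytes = 10 * 1024 * 1024; // 10 MB threshold, as in the quoted message

    if ($after - $before > $thresholdBytes) {
        // The change requested in this issue: info instead of warning.
        $logger->info(sprintf(
            'Used memory grew by more than 10 MB when executing job %s: %.1f MB (before: %.1f MB)',
            $jobClass,
            $after / 1024 / 1024,
            $before / 1024 / 1024
        ), ['app' => 'cron']);
    }
}
```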

Steps to reproduce

N/A

Expected behavior

This should be logged as info rather than as a warning.

Installation method

Community Manual installation with Archive

Nextcloud Server version

29

Operating system

Debian/Ubuntu

PHP engine version

PHP 8.3

Web server

Apache (supported)

Database engine version

MariaDB

Is this bug present after an update or on a fresh install?

Upgraded to a MAJOR version (ex. 22 to 23)

Are you using the Nextcloud Server Encryption module?

Encryption is Disabled

What user-backends are you using?

  • Default user-backend (database)
  • LDAP/ Active Directory
  • SSO - SAML
  • Other

Configuration report

No response

List of activated Apps

No response

Nextcloud Signing status

No response

Nextcloud Logs

No response

Additional info

(screenshot of the warning in the log)

AndyXheli added the 0. Needs triage and bug labels on Jun 5, 2024
AndyXheli added a commit to AndyXheli/server that referenced this issue on Jun 5, 2024:
Fix [Bug]: Change Warn on excessive memory consumption in background jobs to info nextcloud#45682
Signed-off-by: Andy Xheli <[email protected]>
solracsf linked a pull request on Jun 6, 2024 that will close this issue
solracsf added the 2. developing label and removed the 0. Needs triage label on Jun 6, 2024
solracsf (Member) commented Jun 6, 2024

Already on its way at #45530

AndyXheli (Author) commented
Hey @joshtrichards, sorry to bother you.

Thanks for the update. Under the NC 29 backport it still shows 10 MB, but under the original pull request the 10 MB was changed to 50 MB?

(screenshot)

joshtrichards (Member) commented
@AndyXheli Indeed. I finally got around to addressing the failed automated backport draft just now: #45843. It will be merged into the v29 branch after tests pass.

AndyXheli (Author) commented
Hi, I'm still seeing this on NC 29.0.4:

{"reqId":"kImcJ5WNoR8EST80crIE","level":2,"time":"2024-07-28T21:17:03-05:00","remoteAddr":"","user":"--","app":"cron","method":"","url":"--","message":"Cron job used more than 300 MB of ram after executing job OCA\\Recognize\\BackgroundJobs\\ClusterFacesJob (id: 226426, arguments: {\"userId\":\"admin\"}): 652.1 MB (before: 99.5 MB)","userAgent":"--","version":"29.0.4.1","data":{"app":"cron"},"id":"66a7bdec119fb"}

{"reqId":"IXdi9NgDcaj3o5rTidfQ","level":2,"time":"2024-07-28T21:01:47-05:00","remoteAddr":"","user":"--","app":"cron","method":"","url":"--","message":"Cron job used more than 300 MB of ram after executing job OCA\\Recognize\\BackgroundJobs\\ClusterFacesJob (id: 226389, arguments: {\"userId\":\"admin\"}): 625.5 MB (before: 10.6 MB)","userAgent":"--","version":"29.0.4.1","data":{"app":"cron"},"id":"66a7bdec11a7f"}

joshtrichards (Member) commented
OCA\\Recognize\\BackgroundJobs\\ClusterFacesJob

@AndyXheli That's a job in the Recognize app. The issue here was just to clean things up to avoid too many false positives; the logging itself is legitimate. Issues with individual jobs need to be diagnosed in relation to the individual apps providing those jobs, to assess whether such high memory usage is expected or not.
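
Building on that advice, a quick way to see which apps' jobs keep tripping the message is to tally the job classes from the JSON-lines log. The script below is a hypothetical standalone helper, not part of Nextcloud; the log path and the regular expression are assumptions based on the entries quoted in this thread.

```php
<?php
// Hypothetical helper (not shipped with Nextcloud): count which background-job
// classes trigger the memory message in a JSON-lines nextcloud.log, so each
// offender can be reported against the app that provides it.

$logPath = $argv[1] ?? 'nextcloud.log'; // adjust to your data directory

$lines = file($logPath, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
if ($lines === false) {
    fwrite(STDERR, "Cannot read $logPath\n");
    exit(1);
}

$counts = [];
foreach ($lines as $line) {
    $entry = json_decode($line, true);
    if (!is_array($entry) || !isset($entry['message'])) {
        continue;
    }
    // Matches both wordings seen in this thread ("Used memory grew ..." and
    // "Cron job used more than ... of ram after executing job ...").
    if (preg_match('/(?:Used memory grew|used more than .+ of ram).*? job ([\w\\\\]+)/', $entry['message'], $m)) {
        $jobClass = $m[1];
        $counts[$jobClass] = ($counts[$jobClass] ?? 0) + 1;
    }
}

arsort($counts);
foreach ($counts as $jobClass => $count) {
    echo "$count\t$jobClass\n";
}
```

Run against the two entries above, it would attribute both hits to OCA\Recognize\BackgroundJobs\ClusterFacesJob, pointing at the Recognize app rather than the server.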

AndyXheli (Author) commented
Hi @joshtrichards, thank you for the detailed explanation. Much appreciated!
