
Add verbose logging option to `ProviderDataIngester`

nicoepp opened this pull request 10 months ago · 7 comments

To enable verbose logging for a certain DAG, set SHOULD_VERBOSE_LOG to a list containing that DAG ID.

For example, set the SHOULD_VERBOSE_LOG Airflow variable to: `["smk_workflow"]`

Fixes

Fixes #1420 by @AetherUnbound

Description

Testing Instructions

Checklist

  • [x] My pull request has a descriptive title (not a vague title like `Update index.md`).
  • [x] My pull request targets the default branch of the repository (main) or a parent feature branch.
  • [x] My commit messages follow best practices.
  • [x] My code follows the established code style of the repository.
  • [x] I added or updated tests for the changes I made (if applicable).
  • [x] I added or updated documentation (if applicable).
  • [x] I tried running the project locally and verified that there are no visible errors.
  • [x] I ran the DAG documentation generator (if applicable).

Developer Certificate of Origin

Developer Certificate of Origin
Version 1.1

Copyright (C) 2004, 2006 The Linux Foundation and its contributors.
1 Letterman Drive
Suite D4700
San Francisco, CA, 94129

Everyone is permitted to copy and distribute verbatim copies of this
license document, but changing it is not allowed.


Developer's Certificate of Origin 1.1

By making a contribution to this project, I certify that:

(a) The contribution was created in whole or in part by me and I
    have the right to submit it under the open source license
    indicated in the file; or

(b) The contribution is based upon previous work that, to the best
    of my knowledge, is covered under an appropriate open source
    license and I have the right under that license to submit that
    work with modifications, whether created in whole or in part
    by me, under the same open source license (unless I am
    permitted to submit under a different license), as indicated
    in the file; or

(c) The contribution was provided directly to me by some other
    person who certified (a), (b) or (c) and I have not modified
    it.

(d) I understand and agree that this project and the contribution
    are public and that a record of the contribution (including all
    personal information I submit with it, including my sign-off) is
    maintained indefinitely and may be redistributed consistent with
    this project or the open source license(s) involved.

nicoepp avatar Apr 08 '24 21:04 nicoepp

Thanks for your feedback @sarayourfriend! I pushed some commits and I'm running the linter locally now, but I gotta go. I'll hopefully push the linting changes tonight.

nicoepp avatar Apr 09 '24 15:04 nicoepp

I'm playing around with the idea of logging the full result of `get_batch_data`, as suggested by @stacimc. I'm running into problems logging the full list of dictionaries in the returned batch: the Chrome tab where I trigger the DAG in Airflow and try to view the log becomes unresponsive and then crashes, since the log gets very big!

I modified my code now to just log the first 5 and the last 5 dictionaries in each list returned by get_batch_data. See attached screenshot. Would that work for you guys?

[screenshot of the truncated verbose log output]
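The first-and-last-five truncation described above could look something like this. The helper name and signature are hypothetical, not the PR's actual implementation:

```python
def truncate_batch_for_logging(batch: list, head: int = 5, tail: int = 5) -> list:
    """Keep only the first `head` and last `tail` records of a batch,
    so very large batches don't produce logs that crash the log viewer."""
    if len(batch) <= head + tail:
        # Small batches can be logged in full.
        return batch
    return batch[:head] + batch[-tail:]


batch = [{"id": i} for i in range(100)]
print(len(truncate_batch_for_logging(batch)))  # 10
```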

nicoepp avatar Apr 10 '24 20:04 nicoepp

I modified my code now to just log the first 5 and the last 5 dictionaries in each list returned by get_batch_data. See attached screenshot. Would that work for you guys?

Absolutely perfect, even just the first five is probably fine for our needs!

stacimc avatar Apr 10 '24 20:04 stacimc

Here is the full log file (from the screenshot) for better reference: dag_id=smk_workflow_run_id=manual__2024-04-10T20_03_28.log

nicoepp avatar Apr 10 '24 20:04 nicoepp

Ok. Thanks for the feedback. I'm only logging the first 5 dictionaries of the batch for now.

I also pushed a commit making the verbose logging always controlled by the Airflow variable, with no more checking of the "ENVIRONMENT" variable when deciding whether to log.

The variable value has to look like this: [screenshot]

nicoepp avatar Apr 10 '24 20:04 nicoepp

I've also just noticed the catalog test failures -- that's because we are mocking all the results of Variable.get in the provider data ingester tests and are now getting an unexpected new call. @nicoepp you'll need to update test_provider_data_ingester.py, everywhere you see a line like this:

```diff
  MockVariable.get.side_effect = [
      20,  # ingestion_limit
      {},  # skipped_ingestion_errors
+     [],  # should_verbose_log
  ]
```
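For reference, `unittest.mock` consumes a list-valued `side_effect` one item per call, which is why each new `Variable.get` call in the code under test needs a matching entry, in call order. The variable names below mirror the snippet above; the argument strings are illustrative:

```python
from unittest.mock import MagicMock

# A list-valued side_effect makes successive calls return successive items.
MockVariable = MagicMock()
MockVariable.get.side_effect = [
    20,  # ingestion_limit
    {},  # skipped_ingestion_errors
    [],  # should_verbose_log
]

results = [
    MockVariable.get("INGESTION_LIMIT"),
    MockVariable.get("SKIPPED_INGESTION_ERRORS"),
    MockVariable.get("SHOULD_VERBOSE_LOG"),
]
print(results)  # [20, {}, []]
```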

stacimc avatar Apr 12 '24 17:04 stacimc

Based on the low urgency of this PR, the following reviewers are being gently reminded to review this PR:

@krysal This reminder is being automatically generated due to the urgency configuration.

Excluding weekend[^1] days, this PR was ready for review 6 day(s) ago. PRs labelled with low urgency are expected to be reviewed within 5 weekday(s)[^2].

@nicoepp, if this PR is not ready for a review, please draft it to prevent reviewers from getting further unnecessary pings.

[^1]: Specifically, Saturday and Sunday.

[^2]: For the purpose of these reminders we treat Monday - Friday as weekdays. Please note that the operation that generates these reminders runs at midnight UTC on Monday - Friday. This means that depending on your timezone, you may be pinged outside of the expected range.

openverse-bot avatar Apr 17 '24 00:04 openverse-bot

I've made a few changes to this PR but I think it's ready to undraft:

  • I changed the setup behavior so the ingester always logs verbosely locally
  • I changed the places where the verbose logging was outputting dictionaries to use `json.dumps`, so we can copy those lines and parse them as JSON easily (for future runs, analysis, etc.)
  • I added the variable and related documentation to the new variables.tsv definition
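The `json.dumps` change described above can be sketched as follows. The logger name and record fields are hypothetical, not the ingester's actual schema:

```python
import json
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("provider_ingester")

record = {"foreign_identifier": "abc123", "title": "Example"}

# Emitting the record via json.dumps (rather than Python's dict repr, which
# uses single quotes) keeps each log line parseable as JSON, so lines can be
# copied out of the Airflow UI and loaded back with json.loads.
logger.info(json.dumps(record))
```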

AetherUnbound avatar Jun 04 '24 21:06 AetherUnbound

@AetherUnbound just a heads up some of the tests are failing. It looks like maybe in some cases we're expecting the logging to always be a single parsable JSON document? I haven't looked deeply at it, but just a guess based on the changes you described and the test logs. I can look more into this if you'd like help sorting this PR out, just let me know.

sarayourfriend avatar Jun 05 '24 01:06 sarayourfriend

Ahh, thanks for the heads up @sarayourfriend - I'll fix the tests tomorrow!

AetherUnbound avatar Jun 05 '24 01:06 AetherUnbound