
Results: 280 comments by Rossi

A draft list answering the questions, organized by how difficult each is to scrape, in the sense of whether we have the scraper already implemented > Which have it? How...

Happy to report that the citation backscraper is working: I just ran it in prod on `md` and will soon run it on `scotus_slip`. ![image](https://github.com/user-attachments/assets/6067b9a9-97d8-4a9c-93e4-a96088453b40) Added 305 citations by running `manage.py...

We ran this for `scotus_slip`, only for term [22](https://www.supremecourt.gov/opinions/slipopinion/22), and it duplicated all records from that term. If the duplications are not too big a problem, we could run it for...
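As a hedged sketch of how the duplicates from a re-run could be detected afterward (a hypothetical fingerprint-based pass, not CourtListener's actual dedup logic; `dedupe` and the record shape here are assumptions):

```python
from hashlib import sha256

def dedupe(records):
    """records: list of dicts with 'id' and 'text'; returns ids of later
    copies to delete, keeping the earliest record for each fingerprint."""
    seen = {}
    to_delete = []
    for rec in sorted(records, key=lambda r: r["id"]):
        # Fingerprint the opinion text; identical text means a duplicate.
        fingerprint = sha256(rec["text"].encode()).hexdigest()
        if fingerprint in seen:
            to_delete.append(rec["id"])
        else:
            seen[fingerprint] = rec["id"]
    return to_delete

records = [
    {"id": 1, "text": "Opinion A"},
    {"id": 2, "text": "Opinion B"},
    {"id": 3, "text": "Opinion A"},  # duplicate of id 1
]
print(dedupe(records))  # → [3]
```

In a real cleanup the fingerprint would more likely combine docket number, filing date, and citation rather than raw text, but the grouping idea is the same.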

For sources where the citations are inside the document's text, but we only recently implemented `extract_from_text` to get them, we can run a script like the following (currently, we can...
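As a hedged illustration of that kind of script (the `CITATION_RE` pattern and the `extract_citations_from_text` helper are simplified assumptions for this sketch, not the real per-court `extract_from_text` implementations, and the real command would iterate Django models rather than an in-memory list):

```python
import re

# Hypothetical, simplified citation pattern: "123 Md. 456" style
# (volume, reporter abbreviation, page).
CITATION_RE = re.compile(r"\b(\d+)\s+((?:[A-Z][a-z]*\.\s?)+)\s?(\d+)\b")

def extract_citations_from_text(text):
    """Return (volume, reporter, page) tuples found in an opinion's text."""
    return [(int(v), r.strip(), int(p)) for v, r, p in CITATION_RE.findall(text)]

opinions = [
    "Reported at 489 Md. 123. See also Smith v. Jones.",
    "No citation has been assigned yet.",
]

for text in opinions:
    cites = extract_citations_from_text(text)
    # In the real script we would attach these citations to the opinion's cluster.
    print(cites)
```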

Just ran the command to get `md` lagged citations. Got:
- 56 citations added to an existing cluster
- 14 citations added to a new cluster, meaning we got 2...

After `ga` versioning was mostly solved, I ran the backscraper and got 788 new `Ga.` citations for years 2022 to 2025, with at most 55 versioning failures (meaning the opinion...

Ran the command for `haw` and `hawapp`. Got 1683 "Haw." citations:

```
./manage.py cl_back_scrape_citations --courts juriscraper.opinions.united_states.state.haw --backscrape-start=2018/01/01 --backscrape-end=2025/01/01 --verbosity 3 --backscrape-wait=10
./manage.py cl_back_scrape_citations --courts juriscraper.opinions.united_states.state.hawapp --backscrape-start=2018/01/01 --backscrape-end=2025/01/01 --verbosity 3 --backscrape-wait=10...
```

`connctapp` has started failing again. I guess `conn` may fail too once the fixed scraper is merged. Sentry Issue: [COURTLISTENER-7HW](https://freelawproject.sentry.io/issues/5508185979/?referrer=github_integration)

Currently we are using a custom adapter, which fixed the loading of the HTML results page. However, downloading the actual opinion is still failing on the server. It does not fail when...
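For context, a custom transport adapter of the kind mentioned usually looks something like this (a sketch assuming `requests`; the `LegacyTLSAdapter` name and the exact SSL-context tweaks are hypothetical, chosen to illustrate the common pattern of relaxing TLS settings for courts with outdated server configurations):

```python
import ssl

import requests
from requests.adapters import HTTPAdapter

class LegacyTLSAdapter(HTTPAdapter):
    """Transport adapter that relaxes the SSL context for servers with
    outdated TLS configurations. Hypothetical sketch; the adapter actually
    in use may tweak different context options."""

    def init_poolmanager(self, *args, **kwargs):
        ctx = ssl.create_default_context()
        # Allow older ciphers; must disable hostname checks before
        # turning off certificate verification.
        ctx.set_ciphers("DEFAULT@SECLEVEL=1")
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
        kwargs["ssl_context"] = ctx
        return super().init_poolmanager(*args, **kwargs)

session = requests.Session()
session.mount("https://", LegacyTLSAdapter())
# session.get("https://example-court.example/opinions")  # would use the adapter
```

Mounting the adapter on the `https://` prefix makes every HTTPS request on that session go through the custom SSL context, which is why it can fix the results page while a differently configured download host still fails.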

I think this is no longer an issue; we simplified the temporary workaround for SSL errors by setting `self.request["verify"] = False` when scrapers fail: https://github.com/freelawproject/juriscraper/pull/1315, https://github.com/freelawproject/juriscraper/pull/1347, https://github.com/freelawproject/juriscraper/pull/876, https://github.com/freelawproject/juriscraper/pull/1119, etc. Some...
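The `verify` workaround mentioned above follows roughly this pattern (the `Site` base class here is a minimal stand-in for juriscraper's, and `CourtWithBadCert` is a made-up example court):

```python
class Site:
    """Minimal stand-in for juriscraper's Site base class."""

    def __init__(self):
        # juriscraper keeps per-request options in a dict like this.
        self.request = {"verify": True}

class CourtWithBadCert(Site):
    def __init__(self):
        super().__init__()
        # Temporary workaround for the court's broken SSL setup:
        # skip certificate verification for this scraper only.
        self.request["verify"] = False

site = CourtWithBadCert()
print(site.request["verify"])  # → False
```

Scoping the override to the one failing scraper keeps certificate verification on everywhere else, which is why this was preferable to a global setting.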