
Ease development with Docker

Open aaronjwood opened this issue 2 years ago • 8 comments

One command and you're good to go :)

aaronjwood avatar Jan 20 '23 22:01 aaronjwood

@aaronjwood Very exciting! I am testing this out now!

FYI we'll have to maintain the original process guide in the README (move to bottom instead of replacing) since it's informative for how production is running (Linux systemd).

Hopefully I work up the courage to switch production to the docker container.

JPHutchins avatar Jan 21 '23 22:01 JPHutchins

Sounds good, I'll adjust the readme when I get some time in a few days.

When I got everything up locally and fixed some crashes around the test data parsing, I found that the UI didn't show the loaded test data anywhere and was stuck on December 1969 (which looks like a zero Unix timestamp rendered in a negative-offset timezone). Are you aware of this being an existing issue? I'm guessing it's specific to the local dev env, since things work for me on your live deployment with my PGE data, but I didn't dig in very much to see exactly why. The test data seems to be from 2019, but the front end doesn't allow navigating anywhere besides 1969.

aaronjwood avatar Jan 22 '23 01:01 aaronjwood

@aaronjwood Unfortunately I am in a "how is this even working" sorta situation with the MQ and Celery tasks on the live server...

The docker container works for me up to the point of queuing the async jobs - LMK if this flow is working for you in the docker container: https://github.com/JPHutchins/open-energy-view#example-account-setup

Here's a description of what is supposed to be happening.

  • User registers for the first time and we'd like to get their historical data. PGE SMD team advised that I request 1 month of data at a time with some delay in between 🙄.
  • This starts a job "fetch historical data" that queues up to ~48 jobs that will make the requests for 1 month of data at a time to the PGE SMD API
    • real task: https://github.com/JPHutchins/open-energy-view/blob/e4410b403dcea97081b86dd9bfed1ad977afbfce/open_energy_view/celery_tasks.py#L106-L137
    • mock task (the one that should work in docker container): https://github.com/JPHutchins/open-energy-view/blob/e4410b403dcea97081b86dd9bfed1ad977afbfce/open_energy_view/celery_tasks.py#L140-L172
  • AFAICT the fetch task is running correctly in the highly-threaded celery-io pool.
  • Before the fetch task completes it queues up the insert_espi_xml_into_db task in the single-threaded celery-cpu pool (FIFO to write to the DB): https://github.com/JPHutchins/open-energy-view/blob/e4410b403dcea97081b86dd9bfed1ad977afbfce/open_energy_view/celery_tasks.py#L21-L82
  • I'm sure that this is going into the MQ, but what's not happening for me is the celery-cpu queue getting processed. In fact, you can see the "CALLED" print at the top - I think I left that in from when I was trying to get set up on AWS and ran into the same "how was this ever working" situation - anyway, if you see "CALLED" in stdout, that would be a good sign! 😭
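The flow in the list above — a pool of concurrent fetchers feeding a single-threaded FIFO writer — can be sketched with the stdlib alone. All names here are illustrative stand-ins, not the project's actual Celery tasks:

```python
# Stdlib-only sketch of the two-pool pattern described above: many concurrent
# "fetch" workers (like celery-io) feed one single-threaded "insert" worker
# (like celery-cpu) through a FIFO queue. Names are illustrative stand-ins.
import queue
import threading

db_queue = queue.Queue()  # stands in for the celery-cpu queue
inserted = []             # stands in for the database

def fetch_month(month):
    """Stand-in for the fetch task: get one month of ESPI XML."""
    xml = f"<espi month='{month}'/>"
    # Before the fetch task completes, it queues the insert job (FIFO).
    db_queue.put(xml)

def insert_worker():
    """Stand-in for insert_espi_xml_into_db: a single writer, FIFO order."""
    while True:
        xml = db_queue.get()
        if xml is None:  # sentinel: no more work
            break
        print("CALLED")  # mirrors the diagnostic print mentioned above
        inserted.append(xml)

writer = threading.Thread(target=insert_worker)
writer.start()
fetchers = [threading.Thread(target=fetch_month, args=(m,)) for m in (1, 2, 3)]
for t in fetchers:
    t.start()
for t in fetchers:
    t.join()
db_queue.put(None)  # shut down the writer
writer.join()
```

Because only one thread drains the queue, DB writes can never race each other — the same property the single-threaded celery-cpu pool is meant to provide.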

As I mentioned, in production these are all running from systemd. I've inspected my config and it does not seem to differ from what you have set up in the docker container.

LMK what you might find when you run that flow.

It's critical for development to be able to mock the PGE request/response in the development environment so that we have an efficient way to test data parsing, fetching, etc. Thank you for your help!
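That kind of mocking can be done with stdlib `unittest.mock` alone. The names `fetch_espi_xml` and `sync_history` below are hypothetical illustrations, not the actual open-energy-view functions:

```python
# Stdlib sketch of mocking the utility API call in tests so no network
# request ever happens. fetch_espi_xml / sync_history are hypothetical
# names, not the project's real functions.
from unittest import mock

def fetch_espi_xml(month):
    raise RuntimeError("would hit the real PGE SMD API")

def sync_history(months):
    # In a test, fetch_espi_xml is patched, so this stays offline.
    return [fetch_espi_xml(m) for m in months]

with mock.patch(f"{__name__}.fetch_espi_xml",
                side_effect=lambda m: f"<espi month='{m}'/>"):
    payloads = sync_history(["2019-01", "2019-02"])

print(payloads)
```

The patch is scoped to the `with` block, so the real (network-hitting) function is restored automatically afterward.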

EDIT: just confirmed that the "fake fetch" is working in production.

  • create a new account with fake email "[email protected]", pw admin
  • select fake utility, name whatever
  • you'll see it load in the first month and add a spinner in the upper right corner. Network will show 202s coming in as it checks on the celery tasks and eventually it will prompt you to reload.
  • this is exactly what should be working in the development environment.

EDIT2: if it's not clear, the architecturally f*cky thing here is that the insert_to_db task needs the "flask application context" in order to set up the SQL ORM (SQLAlchemy).

JPHutchins avatar Jan 28 '23 21:01 JPHutchins

JFC there is some embarrassing code in here

finally: 
    pass

JPHutchins avatar Jan 28 '23 21:01 JPHutchins