Fix SQLAgent
Although I'm not sure of the exact intent, I suspect the following:

```python
source = memory["current_source"]
memory["current_table"] = self.name
source.add_table(self.name, self.spec)
```

was added for a reason, but it was implemented incorrectly: `self.name` is `SQLOutput02697`, so subsequent calls to SQL broke (no table is named `SQLOutput02697`).
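A minimal sketch of why this breaks (all names here are hypothetical stand-ins, not Lumen's actual classes): registering the table under the component's auto-generated name means later lookups by the real table name fail.

```python
# Sketch (names hypothetical): registering the component's auto-generated
# name instead of the real table name breaks subsequent SQL lookups.
class DummySource:
    """Stand-in for a source's table registry (e.g. DuckDBSource)."""
    def __init__(self):
        self._tables = {}

    def add_table(self, name, spec):
        self._tables[name] = spec

    def get(self, table):
        if table not in self._tables:
            raise KeyError(f"No table named {table!r}")
        return self._tables[table]

source = DummySource()
# Buggy behavior: the table is registered under self.name ("SQLOutput02697")...
source.add_table("SQLOutput02697", "SELECT * FROM windturbines.parquet")

# ...so looking it up by the name the SQL actually references fails:
try:
    source.get("windturbines.parquet")
    found = True
except KeyError:
    found = False
```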
Edit: Perhaps related:

```
  File "/Users/ahuang/repos/lumen/lumen/util.py", line 320, in wrapper
    raise e
  File "/Users/ahuang/repos/lumen/lumen/util.py", line 307, in wrapper
    return func(*args, **kwargs)
  File "/Users/ahuang/repos/lumen/lumen/pipeline.py", line 318, in _update_data
    new_data = self._compute_data()
  File "/Users/ahuang/repos/lumen/lumen/pipeline.py", line 264, in _compute_data
    data = self.source.get(self.table, **query)
  File "/Users/ahuang/repos/lumen/lumen/sources/base.py", line 84, in wrapped
    df = method(self, table, **cache_query)
  File "/Users/ahuang/repos/lumen/lumen/sources/duckdb.py", line 92, in get
    df = self._connection.execute(sql_expr).fetch_df(date_as_object=True)
duckdb.duckdb.InvalidInputException: Invalid Input Error: No open result set
```
This reverts those changes by retrieving the pipeline stored in memory from earlier.
Also, even though the table is still valid, in cases where the Assistant chooses PipelineAgent we need to check whether the SQL statement is still valid and, if not, refresh it; otherwise it uses a stale SQL statement.
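One way to sketch that staleness check (hedged: this uses the stdlib `sqlite3` as a stand-in for the DuckDB connection, and `sql_is_valid` is a hypothetical helper, not Lumen API): run `EXPLAIN` on the statement, which parses and plans the query without executing it, so a reference to a table that no longer exists fails fast and the SQL can be refreshed. DuckDB supports `EXPLAIN` the same way.

```python
import sqlite3

def sql_is_valid(conn, sql):
    # EXPLAIN parses/plans the statement without materializing results,
    # so it catches references to tables that no longer exist.
    try:
        conn.execute(f"EXPLAIN {sql}")
        return True
    except sqlite3.Error:
        return False

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE windturbines (id INTEGER, capacity REAL)")

stale = "SELECT * FROM old_table"     # table was never created -> stale
fresh = "SELECT * FROM windturbines"  # still resolves
```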
Have to review these changes more closely, but the original intent here was to persist the tables to the source so they can all be exported to the notebook.
Okay, this seems to work well when https://github.com/holoviz/panel/pull/6971 is merged.
However, I couldn't pin down the warning:
```
WARNING:param.DuckDBSource00877: No such watcher Watcher(inst=Pipeline(_stale=False, auto_update=True, data=
```
```python
from lumen.pipeline import Pipeline
from lumen.sources.duckdb import DuckDBSource
from lumen.ai.views import SQLOutput
import panel as pn

pn.extension("tabulator", "codeeditor", inline=False, template="fast", notifications=True)

pipeline = Pipeline(
    source=DuckDBSource(uri=":memory:"),
    table="windturbines.parquet",
)

sql = SQLOutput(component=pipeline, spec="SELECT * from windturbines.parquet")

chat = pn.chat.ChatInterface()
chat.send(sql)
```
Can you explain what type was causing the error in serialization?
`LumenOutput` and `SQLOutput` result in:
```
  File "/opt/continuum/.conda/envs/default/lib/python3.12/site-packages/panel/chat/feed.py", line 864, in _serialize_for_transformers
    content = str(message)
              ^^^^^^^^^^^^
TypeError: __str__ returned non-string (type LumenOutput)
```
Seems like the simpler fix there is to implement a `__str__` method on the output objects.
Hmm, maybe that's not quite right, since I don't know what type `message` is above, but something is confusing there; casting `message` to `str` shouldn't raise.
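Regardless of where the real bug is, a minimal sketch of the `__str__` approach (this class body is a stand-in for illustration, not Lumen's actual implementation): have the output object return its spec so `str(message)` yields a plain string.

```python
class LumenOutput:
    """Stand-in sketch of an output wrapper; Lumen's real class differs."""
    def __init__(self, spec):
        self.spec = spec

    def __str__(self):
        # Returning the spec gives panel's chat serialization a real str,
        # avoiding "TypeError: __str__ returned non-string".
        return self.spec

out = LumenOutput("SELECT * FROM windturbines.parquet")
serialized = str(out)
```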
Codecov Report
Attention: Patch coverage is 21.73913% with 54 lines in your changes missing coverage. Please review.
Project coverage is 60.88%. Comparing base (a2e74f5) to head (c90c33f). Report is 2 commits behind head on main.
| Files | Patch % | Lines |
|---|---|---|
| lumen/ai/assistant.py | 0.00% | 19 Missing :warning: |
| lumen/ai/agents.py | 0.00% | 13 Missing :warning: |
| lumen/ai/views.py | 0.00% | 7 Missing :warning: |
| lumen/ai/models.py | 0.00% | 4 Missing :warning: |
| lumen/pipeline.py | 50.00% | 3 Missing :warning: |
| lumen/sources/duckdb.py | 62.50% | 3 Missing :warning: |
| lumen/sources/intake_dremio.py | 25.00% | 3 Missing :warning: |
| lumen/sources/base.py | 66.66% | 2 Missing :warning: |
Additional details and impacted files
```
@@            Coverage Diff             @@
##             main     #595      +/-   ##
==========================================
- Coverage   60.89%   60.88%   -0.01%
==========================================
  Files          92       92
  Lines       10724    10735      +11
==========================================
+ Hits         6530     6536       +6
- Misses       4194     4199       +5
```
:umbrella: View full report in Codecov by Sentry.
Will merge. Can follow up later.
This pull request has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.