
Python: Getting empty Plan and Step when working with a PromptFlow + Semantic Kernel Setup

Open pgarz opened this issue 1 year ago • 4 comments

My main file has something like this:

import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion
from semantic_kernel.planning import SequentialPlanner

from Plugins.RAGPlugins.LlamaIndexRAG import LlamaIndexRAGPlugin


async def main():
    kernel = sk.Kernel(log=sk.NullLogger())

    kernel.add_chat_service(
        "chat_completion",
        OpenAIChatCompletion(
            api_key="xxxxx",
            ai_model_id="gpt-4",
        ),
    )

    # Import the native functions
    llama_rag_plugin = kernel.import_skill(LlamaIndexRAGPlugin(), "LlamaIndexRAGPlugin")

    question = "what do i do if i want a refund?"
    ask = "Here is a question: " + question
    print(f"ask: {ask}")
    print(kernel.__dict__)

    planner = SequentialPlanner(kernel=kernel)
    plan = await planner.create_plan_async(ask)

    print("created plan")
    print(plan)
    print(plan.__dict__)
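
The plan would then normally be executed with something like the following (a sketch of the standard v0.x pattern; I am assuming plan.invoke_async() from the SequentialPlanner samples, and I have not gotten this far because the plan comes back empty):

result = await plan.invoke_async()
print(result.result)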

And my plugin code looks like this:

import chromadb
from llama_index import StorageContext, VectorStoreIndex
from llama_index.vector_stores import ChromaVectorStore
from semantic_kernel.skill_definition import sk_function


class LlamaIndexRAGPlugin:

    @sk_function(
        description="Asks the LlamaIndex RAG to give back a generative answer in a chat like interface. Use for answering questions",
        name="llama_rag_func",
        input_description="User question query string",
    )
    def llama_rag_func(self, question: str) -> str:
        # initialize the persistent Chroma client
        db = chromadb.PersistentClient(path="/Users/pedrogarzon/Documents/UpworkCode/DeepEvalCustomerSupport/chroma_db")

        # get (or create) the collection
        chroma_collection = db.get_or_create_collection("rag_colection")

        # assign chroma as the vector_store to the context
        vector_store = ChromaVectorStore(chroma_collection=chroma_collection)
        storage_context = StorageContext.from_defaults(vector_store=vector_store)

        # load the index from stored vectors
        index = VectorStoreIndex.from_vector_store(
            vector_store, storage_context=storage_context
        )

        # create a query engine and answer the question
        query_engine = index.as_query_engine()
        response = query_engine.query(question)

        print("found response in LlamaIndexRAG")
        print(response.__dict__)

        return response.response
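
Since llama_rag_func is just a plain method, the RAG side can also be sanity-checked outside Semantic Kernel entirely (a minimal sketch, with an example question):

plugin = LlamaIndexRAGPlugin()
print(plugin.llama_rag_func("what do i do if i want a refund?"))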

My problem is that when I create the plan it always comes out empty, like {}, and my actual plugin code never gets executed. Am I missing a step?

I have been able to successfully call my plugin manually, but not by using a plan as desired:

result = await kernel.run_async(
    llama_rag_plugin["llama_rag_func"],
    input_str="What is the best way to get a refund?",
)
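
When called manually like this, the answer can be read off the returned context (assuming the v0.x behavior where run_async returns an SKContext whose result property holds the function output):

print(result.result)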

pgarz avatar Jan 23 '24 01:01 pgarz

As an update, I tried going into the Semantic Kernel source code, and it looks like in the sequential_planner_parser.py file, around line 80, the planner is unable to extract the function into a populated plan step.

The code in that file (with my debug prints added) looks like this:

if plugin_function is not None:
    plan_step = Plan.from_function(plugin_function)

    print(f"plugin func: {plugin_function.__dict__}")
    print(f"plan steps: {plan_step}")
    print("plan step details")
    print(plan_step.__dict__)

And that code prints:

plugin func: {'_description': 'Asks the LlamaIndex RAG to give back a generative answer in a chat like interface. Use for answering questions', '_plugin_name': 'LlamaIndexRAGPlugin', '_name': 'llama_rag_func', '_is_semantic': False, '_stream_function': <bound method LlamaIndexRAGPlugin.llama_rag_func of <Plugins.RAGPlugins.LlamaIndexRAG.LlamaIndexRAGPlugin object at 0x28609bf10>>}
plan steps: 
plan step details
{}
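
So the function itself is found and fully populated, but the Plan built from it prints as empty. One thing worth checking is whether __dict__ is simply not showing the fields, by reading the step's public attributes directly (a hypothetical check; the attribute names are my assumption from the v0.x Plan class):

print(plan_step.name)         # expect "llama_rag_func"
print(plan_step.plugin_name)  # expect "LlamaIndexRAGPlugin"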

pgarz avatar Jan 23 '24 21:01 pgarz

@juliomenendez, @moonbox3 said that you may have already fixed this; is that true?

matthewbolanos avatar Feb 06 '24 15:02 matthewbolanos

@matthewbolanos I worked on a SK + PromptFlow issue 2 weeks ago, let me take a look at this one later today.

juliomenendez avatar Feb 06 '24 16:02 juliomenendez

This issue is stale because it has been open for 90 days with no activity.

github-actions[bot] avatar May 07 '24 01:05 github-actions[bot]

This issue was closed because it has been inactive for 14 days since being marked as stale.

github-actions[bot] avatar May 24 '24 01:05 github-actions[bot]