
Missing required configuration to use LangChain integration: '_llm_type' parameter not found in PANDASAI LangChain LLM settings #1612

Open
jasongonin opened this issue Feb 14, 2025 · 0 comments


System Info

Similar to #1322 ("Unfortunately, I was not able to get your answers, because of the following error: 'HuggingFaceLLM' object has no attribute '_llm_type'").

```
Python 3.11
pandasai==3.0.0b10
pandasai-langchain==0.1.4
langchain==0.1.16
langchain-community==0.0.38
langchain-core==0.1.53
langchain-openai==0.1.3
```

Reproducible with the simple code below (run in a Jupyter notebook), using https://github.com/datasciencedojo/datasets/blob/master/titanic.csv as the dataframe source.

🐛 Describe the bug

Cell 1: set up the venv in one shot

```python
!pip install pandas pandasai==3.0.0b10 pandasai-langchain==0.1.4
!pip install langchain==0.1.16 langchain-community==0.0.38 langchain-core==0.1.53 langchain-openai==0.1.3
```

Cell 2: actual test

```python
import pandas as pd
from langchain_openai import ChatOpenAI
from langchain import LLMChain
from langchain_core.prompts import PromptTemplate, MessagesPlaceholder
from pandasai import SmartDataframe
from pandasai_langchain import LangchainLLM

llm = ChatOpenAI(api_key="sk_xxxx")
systemPrompt = "my system prompt"
myQueryWrapperPrompt = " my query wrapper prompt"
template = PromptTemplate(
    input_variables=["query"],
    template=systemPrompt + "{query}" + myQueryWrapperPrompt,
)
chain = LLMChain(llm=llm, prompt=template)

langchainllm = LangchainLLM(chain)
df = pd.read_csv("/home/[email protected]/Downloads/titanic.csv")
smartdf = SmartDataframe(df, config={"llm": langchainllm})
smartdf.chat(query="How does the age (Age) of passengers affect their likelihood of survival (Survived)?")
```

Error:

```
AttributeError: 'LangchainLLM' object has no attribute '_llm_type'
```

See below for the complete error.

```
{
    "name": "AttributeError",
    "message": "'LangchainLLM' object has no attribute '_llm_type'",
    "stack": "---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
Cell In[22], line 21
     19 df = pd.read_csv(\"/home/[email protected]/Downloads/titanic.csv\")
     20 smartdf=SmartDataframe(df,config={\"llm\":langchainllm})
---> 21 smartdf.chat(query=\"question?\")

File ~/project/gpt-be-fastapi/jupyter.venv/lib/python3.11/site-packages/pandasai/smart_dataframe/__init__.py:80, in SmartDataframe.chat(self, query, output_type)
     61 def chat(self, query: str, output_type: Optional[str] = None):
     62     \"\"\"
     63     Run a query on the dataframe.
     64     Args:
(...)
     78         ValueError: If the query is empty
     79     \"\"\"
---> 80 return self._agent.chat(query, output_type)

File ~/project/gpt-be-fastapi/jupyter.venv/lib/python3.11/site-packages/pandasai/agent/base.py:92, in Agent.chat(self, query, output_type)
     88 \"\"\"
     89 Start a new chat interaction with the assistant on Dataframe.
     90 \"\"\"
     91 self.start_new_conversation()
---> 92 return self._process_query(query, output_type)

File ~/project/gpt-be-fastapi/jupyter.venv/lib/python3.11/site-packages/pandasai/agent/base.py:247, in Agent._process_query(self, query, output_type)
    244 query = UserQuery(query)
    245 self._state.logger.log(f\"Question: {query}\")
    246 self._state.logger.log(
--> 247     f\"Running PandaAI with {self._state.config.llm.type} LLM...\"
    248 )
    250 self._state.output_type = output_type
    251 try:

File ~/project/gpt-be-fastapi/jupyter.venv/lib/python3.11/site-packages/pandasai_langchain/langchain.py:60, in LangchainLLM.type(self)
     58 @property
     59 def type(self) -> str:
---> 60     return f\"langchain_{self.langchain_llm._llm_type}\"

AttributeError: 'LangchainLLM' object has no attribute '_llm_type'"
}
```
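Judging from the last frame of the traceback (this is a reading of the error, not a confirmed diagnosis), `LangchainLLM.type` reads `self.langchain_llm._llm_type`, which LangChain model classes define but `LLMChain` does not, so wrapping the chain instead of the model breaks the lookup. A minimal, dependency-free sketch of that failure mode (all `Fake*`/`*Sketch` classes below are hypothetical stand-ins, not the real library classes):

```python
class FakeChatModel:
    """Stand-in for a LangChain chat model, which defines _llm_type."""
    _llm_type = "openai-chat"


class FakeLLMChain:
    """Stand-in for LLMChain, which has no _llm_type attribute of its own."""
    def __init__(self, llm):
        self.llm = llm


class LangchainLLMSketch:
    """Mimics the `type` property shown in the traceback above."""
    def __init__(self, langchain_llm):
        self.langchain_llm = langchain_llm

    @property
    def type(self) -> str:
        # Fails when langchain_llm is a chain rather than a model.
        return f"langchain_{self.langchain_llm._llm_type}"


model = FakeChatModel()
chain = FakeLLMChain(model)

print(LangchainLLMSketch(model).type)  # prints: langchain_openai-chat
try:
    LangchainLLMSketch(chain).type     # wrapping the chain, as in the repro
except AttributeError as e:
    print(e)                           # no attribute '_llm_type'
```

If this reading is correct, wrapping the model directly, e.g. `LangchainLLM(llm)` instead of `LangchainLLM(chain)`, may avoid the error; that remains an assumption until confirmed by a maintainer.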

@jasongonin jasongonin changed the title barrage to use the langchain integration : _llm_type not present in langchainllm Missing required configuration to use LangChain integration: '_llm_type' parameter not found in PANDASAI LangChain LLM settings Feb 14, 2025