
I am getting this error while building a RAG model. I am using the qwen2 model instead of the default llama2 that is used with Chroma. My code:

from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.vectorstores import Chroma

# Initialize OllamaEmbeddings with 'qwen2'
embedding_model = OllamaEmbeddings(model_name='qwen2:0.5b')

# Use 'qwen2' model in Chroma.from_documents()
db = Chroma.from_documents(documents[:20], embedding_model)

Error:

---------------------------------------------------------------------------
ValidationError                           Traceback (most recent call last)
Cell In[30], line 5
      2 from langchain_community.vectorstores import Chroma
      4 # Initialize OllamaEmbeddings with 'qwen2'
----> 5 embedding_model = OllamaEmbeddings(model_name='qwen2:0.5b')
      7 # Use 'qwen2' model in Chroma.from_documents()
      8 db = Chroma.from_documents(documents[:20], embedding_model)

File c:\ChatBot 2.0 KN\myenv\Lib\site-packages\pydantic\v1\main.py:341, in BaseModel.__init__(__pydantic_self__, **data)
    339 values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data)
    340 if validation_error:
--> 341     raise validation_error
    342 try:
    343     object_setattr(__pydantic_self__, '__dict__', values)

ValidationError: 1 validation error for OllamaEmbeddings
model_name
  extra fields not permitted (type=value_error.extra)

My PC won't be able to accommodate another LLM, which is why I don't want to pull llama2 and would rather use the existing qwen2:0.5b model for embeddings. Is there any way I can resolve this and proceed with the existing qwen2 model? Thanks in advance!
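
Reading the traceback again, the only complaint is that model_name is an extra field, so I suspect the keyword might simply be model instead of model_name. This is a minimal sketch of what I am planning to try next, assuming the parameter is called model and that qwen2:0.5b is already pulled in Ollama:

from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.vectorstores import Chroma

# Guess: pass the Ollama model via 'model' instead of 'model_name'
embedding_model = OllamaEmbeddings(model='qwen2:0.5b')

# Build the Chroma store from the same first 20 documents
db = Chroma.from_documents(documents[:20], embedding_model)

Would this be the right way to point the embeddings at qwen2:0.5b, or does embedding with a chat model like qwen2 cause other problems down the line?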
