I am getting this error while creating embeddings #9518
-
Git path: semantic-kernel/python/samples/getting_started/06-memory-and-embeddings.ipynb
-
Hello @rakshahulle, could you please provide us some more context about how you're arriving at this error? What embedding model do you have configured? Have you changed any portion of the code, or are you receiving this error from running the code as-is? In which exact cell are you seeing this error? await populate_memory(memory) is called three times in that notebook. Thanks.
-
I have changed the code from the OpenAI API to the Azure OpenAI API service and provided its configuration. Below are the error logs:
Populating memory...
Traceback (most recent call last):
File "/home/raksha/anaconda3/envs/sk-3.12/lib/python3.12/site-packages/semantic_kernel/connectors/ai/open_ai/services/open_ai_handler.py", line 72, in _send_embedding_request
response = await self.client.embeddings.create(**settings.prepare_settings_dict())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/raksha/anaconda3/envs/sk-3.12/lib/python3.12/site-packages/openai/resources/embeddings.py", line 236, in create
return await self._post(
^^^^^^^^^^^^^^^^^
File "/home/raksha/anaconda3/envs/sk-3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1838, in post
return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/raksha/anaconda3/envs/sk-3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1532, in request
return await self._request(
^^^^^^^^^^^^^^^^^^^^
File "/home/raksha/anaconda3/envs/sk-3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1633, in _request
raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Unsupported data type
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/raksha/project_work/toyota/semantic_kernel/semantic-kernel/python/samples/concepts/memory/memory.py", line 115, in <module>
asyncio.run(main())
File "/home/raksha/anaconda3/envs/sk-3.12/lib/python3.12/asyncio/runners.py", line 194, in run
return runner.run(main)
^^^^^^^^^^^^^^^^
File "/home/raksha/anaconda3/envs/sk-3.12/lib/python3.12/asyncio/runners.py", line 118, in run
return self._loop.run_until_complete(task)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/raksha/anaconda3/envs/sk-3.12/lib/python3.12/asyncio/base_events.py", line 687, in run_until_complete
return future.result()
^^^^^^^^^^^^^^^
File "/home/raksha/project_work/toyota/semantic_kernel/semantic-kernel/python/samples/concepts/memory/memory.py", line 95, in main
await populate_memory(memory)
File "/home/raksha/project_work/toyota/semantic_kernel/semantic-kernel/python/samples/concepts/memory/memory.py", line 17, in populate_memory
await memory.save_information(collection=collection_id, id="info1", text="Your budget for 2024 is $100,000")
File "/home/raksha/anaconda3/envs/sk-3.12/lib/python3.12/site-packages/semantic_kernel/memory/semantic_text_memory.py", line 56, in save_information
embedding = (await self._embeddings_generator.generate_embeddings([text], **embeddings_kwargs))[0]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/raksha/anaconda3/envs/sk-3.12/lib/python3.12/site-packages/semantic_kernel/connectors/ai/open_ai/services/open_ai_text_embedding_base.py", line 36, in generate_embeddings
raw_embeddings = await self.generate_raw_embeddings(texts, settings, batch_size, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/raksha/anaconda3/envs/sk-3.12/lib/python3.12/site-packages/semantic_kernel/connectors/ai/open_ai/services/open_ai_text_embedding_base.py", line 70, in generate_raw_embeddings
raw_embedding = await self._send_embedding_request(settings=settings)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/raksha/anaconda3/envs/sk-3.12/lib/python3.12/site-packages/semantic_kernel/connectors/ai/open_ai/services/open_ai_handler.py", line 76, in _send_embedding_request
raise ServiceResponseException(
semantic_kernel.exceptions.service_exceptions.ServiceResponseException: ("<class 'semantic_kernel.connectors.ai.open_ai.services.azure_text_embedding.AzureTextEmbedding'> service failed to generate embeddings", BadRequestError('Unsupported data type'))
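For context, the failing path in the sample boils down to roughly the following shape. This is a minimal sketch, assuming the AzureTextEmbedding connector and the SemanticTextMemory/VolatileMemoryStore classes from the semantic-kernel Python package; the service id, deployment name, and collection name are placeholders, not the exact code from the sample.

import asyncio

from semantic_kernel.connectors.ai.open_ai import AzureTextEmbedding
from semantic_kernel.memory import SemanticTextMemory, VolatileMemoryStore

async def main() -> None:
    # The connector typically reads AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_KEY,
    # and the embedding deployment name from the environment when they are not
    # passed explicitly.
    embedding_gen = AzureTextEmbedding(
        service_id="embedding",
        deployment_name="text-embedding-ada-002",  # placeholder deployment name
    )
    memory = SemanticTextMemory(
        storage=VolatileMemoryStore(),
        embeddings_generator=embedding_gen,
    )
    # This is the call that raises in the traceback above: it requests an
    # embedding for the text and stores text plus vector in the collection.
    await memory.save_information(
        collection="generic", id="info1", text="Your budget for 2024 is $100,000"
    )

asyncio.run(main())

If the endpoint or deployment the connector resolves is wrong, nothing fails until the underlying client.embeddings.create call, which is why the error surfaces as a BadRequestError wrapped in a ServiceResponseException.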
-
Also, can you please have a look at the following closed issue that seems to be related? I think it may be related to an incorrect endpoint configuration:
#5697
-
text-embedding-ada-002
On Mon, Nov 4, 2024 at 8:23 PM Evan Mattson wrote:
Thanks for your response. Which Azure OpenAI embedding model are you using?
-
These are the configurations I am using:
AZURE_OPENAI_CHAT_DEPLOYMENT_NAME="gpt-4o-mini"
AZURE_OPENAI_ENDPOINT="https://openaisk123.openai.azure.com/openai/deployments/gpt-4o-mini/chat/completions?api-version=2024-08-01-preview"
AZURE_OPENAI_API_KEY=""
AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME="text-embedding-ada-002"
On Mon, Nov 4, 2024 at 8:28 PM Evan Mattson wrote:
Note, some of the code in that screenshot on the issue is old, but the
main part is the deployment_name as part of AzureTextEmbedding.
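Worth noting: the AZURE_OPENAI_ENDPOINT above is the full URL of a specific chat-completions deployment. The Azure OpenAI connectors generally expect only the resource base URL and build the deployment-specific path themselves from the deployment name, so a configuration along these lines is the shape the linked issue points toward (a sketch; the resource name is taken from the config above, and the key value is a placeholder):

AZURE_OPENAI_ENDPOINT="https://openaisk123.openai.azure.com/"
AZURE_OPENAI_API_KEY="<your key>"
AZURE_OPENAI_CHAT_DEPLOYMENT_NAME="gpt-4o-mini"
AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME="text-embedding-ada-002"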
-
Thank you so much for the reference. Now the code is working.
-
The response was very quick and helpful, which sped up my development process.
-
Glad to hear!