
SparkLLM

SparkLLM is a large-scale cognitive model independently developed by iFLYTEK. Trained on a large corpus of text, code, and images, it has cross-domain knowledge and language-understanding ability, and it can understand and perform tasks through natural dialogue.

Prerequisite

  • Get SparkLLM's app_id, api_key and api_secret from the iFlyTek SparkLLM API Console (for more info, see iFlyTek SparkLLM Intro), then set the environment variables IFLYTEK_SPARK_APP_ID, IFLYTEK_SPARK_API_KEY and IFLYTEK_SPARK_API_SECRET, or pass the credentials as parameters when creating SparkLLM, as in the demo below (a constructor-argument sketch follows this list).
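If you prefer not to use environment variables, the credentials can be passed directly to the constructor. A minimal sketch, assuming the keyword names spark_app_id, spark_api_key and spark_api_secret used by the langchain_community implementation; check the API reference if they differ:

```python
from langchain_community.llms import SparkLLM

# Credentials passed explicitly instead of via environment variables.
# Keyword names are an assumption; verify against the SparkLLM API reference.
llm = SparkLLM(
    spark_app_id="app_id",
    spark_api_key="api_key",
    spark_api_secret="api_secret",
)
```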

Use SparkLLM

```python
import os

from langchain_community.llms import SparkLLM

# Set the iFlyTek credentials via environment variables
os.environ["IFLYTEK_SPARK_APP_ID"] = "app_id"
os.environ["IFLYTEK_SPARK_API_KEY"] = "api_key"
os.environ["IFLYTEK_SPARK_API_SECRET"] = "api_secret"

# Load the model
llm = SparkLLM()

res = llm.invoke("What's your name?")
print(res)
```
API Reference: SparkLLM
```output
My name is iFLYTEK Spark. How can I assist you today?
```
```python
# generate() accepts a list of prompts and returns an LLMResult
res = llm.generate(prompts=["hello!"])
res
```

```output
LLMResult(generations=[[Generation(text='Hello! How can I assist you today?')]], llm_output=None, run=[RunInfo(run_id=UUID('d8cdcd41-a698-4cbf-a28d-e74f9cd2037b'))])
```
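The generated text can be pulled out of the LLMResult; a minimal sketch, assuming the standard generations list-of-lists layout from langchain_core:

```python
# Each prompt yields a list of Generation objects; take the first one's text.
print(res.generations[0][0].text)
# -> Hello! How can I assist you today?
```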
```python
# Stream chunks as they are produced
for res in llm.stream("foo:"):
    print(res)
```

```output
Hello! How can I assist you today?
```
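SparkLLM can also be composed with other LangChain runnables. A minimal sketch, assuming the standard PromptTemplate and LCEL pipe operator from langchain_core:

```python
from langchain_core.prompts import PromptTemplate

# Chain a prompt template into the model with the LCEL pipe operator
prompt = PromptTemplate.from_template("Write a one-line greeting in {language}.")
chain = prompt | llm

print(chain.invoke({"language": "French"}))
```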
