feat: Ali Bailian supplier model list adds qwen3 model #3026
Changes from all commits
```diff
@@ -22,3 +22,4 @@
 from .radio_card_field import *
 from .label import *
 from .slider_field import *
+from .switch_field import *
```
```diff
@@ -51,6 +51,23 @@
               _("Universal text vector is Tongyi Lab's multi-language text unified vector model based on the LLM base. It provides high-level vector services for multiple mainstream languages around the world and helps developers quickly convert text data into high-quality vector data."),
               ModelTypeConst.EMBEDDING, aliyun_bai_lian_embedding_model_credential,
               AliyunBaiLianEmbedding),
+    ModelInfo('qwen3-0.6b', '', ModelTypeConst.LLM, aliyun_bai_lian_llm_model_credential,
+              BaiLianChatModel),
+    ModelInfo('qwen3-1.7b', '', ModelTypeConst.LLM, aliyun_bai_lian_llm_model_credential,
+              BaiLianChatModel),
+    ModelInfo('qwen3-4b', '', ModelTypeConst.LLM, aliyun_bai_lian_llm_model_credential,
+              BaiLianChatModel),
+    ModelInfo('qwen3-8b', '', ModelTypeConst.LLM, aliyun_bai_lian_llm_model_credential,
+              BaiLianChatModel),
+    ModelInfo('qwen3-14b', '', ModelTypeConst.LLM, aliyun_bai_lian_llm_model_credential,
+              BaiLianChatModel),
+    ModelInfo('qwen3-32b', '', ModelTypeConst.LLM, aliyun_bai_lian_llm_model_credential,
+              BaiLianChatModel),
+    ModelInfo('qwen3-30b-a3b', '', ModelTypeConst.LLM, aliyun_bai_lian_llm_model_credential,
+              BaiLianChatModel),
+    ModelInfo('qwen3-235b-a22b', '', ModelTypeConst.LLM, aliyun_bai_lian_llm_model_credential,
+              BaiLianChatModel),
     ModelInfo('qwen-turbo', '', ModelTypeConst.LLM, aliyun_bai_lian_llm_model_credential,
               BaiLianChatModel),
     ModelInfo('qwen-plus', '', ModelTypeConst.LLM, aliyun_bai_lian_llm_model_credential,
```
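The eight near-identical qwen3 entries in the diff above could be generated from a list of size suffixes rather than written out by hand. A minimal sketch of that idea, where the `ModelInfo` dataclass is a hypothetical stand-in for the project's real class:

```python
from dataclasses import dataclass


@dataclass
class ModelInfo:
    # Hypothetical stand-in for the project's ModelInfo class.
    name: str
    description: str
    model_type: str
    credential: object
    model_class: object


# Size suffixes taken from the PR's added entries.
QWEN3_SIZES = ['0.6b', '1.7b', '4b', '8b', '14b', '32b', '30b-a3b', '235b-a22b']


def qwen3_model_infos(model_type, credential, model_class):
    """Build one ModelInfo per qwen3 size instead of repeating entries by hand."""
    return [ModelInfo(f'qwen3-{size}', '', model_type, credential, model_class)
            for size in QWEN3_SIZES]


infos = qwen3_model_infos('LLM', object(), object())
```

The resulting list can then be spliced into the existing model list with `*qwen3_model_infos(...)`, keeping one source of truth for the supported sizes.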
Contributor (Author) commented:

The code snippet adds new Qwen3 LLM model entries of different sizes to the list of supported models.
Optimization suggestions:

```python
# Example optimized approach using a dictionary
models_dict = {
    'universal_text_vector': "Tongyi Lab's multi-language text unified vector model",
    'aliyun_bai_lian_embedding_model': 'Aliyun Bai Lian embedding model credentials',
    # Other original models...
}

def add_model(model_name, description):
    try:
        models_dict[model_name] = {'description': description}
    except Exception as e:  # defensive: dict insertion rarely fails
        print(f"Could not add {model_name}: {e}")

# Usage
add_model('qwen3-0.6b', "Description of qwen3-0.6b")
add_model('qwen3-1.7b', "Description of qwen3-1.7b")
```

This refactored approach uses a Python dictionary to store model information, which simplifies managing and accessing entries while keeping the configuration consistent.
```diff
@@ -30,6 +30,29 @@ class BaiLianLLMModelParams(BaseForm):
                                      precision=0)


+class BaiLianLLMStreamModelParams(BaseForm):
+    temperature = forms.SliderField(TooltipLabel(_('Temperature'),
+                                                 _('Higher values make the output more random, while lower values make it more focused and deterministic')),
+                                    required=True, default_value=0.7,
+                                    _min=0.1,
+                                    _max=1.0,
+                                    _step=0.01,
+                                    precision=2)
+
+    max_tokens = forms.SliderField(
+        TooltipLabel(_('Output the maximum Tokens'),
+                     _('Specify the maximum number of tokens that the model can generate')),
+        required=True, default_value=800,
+        _min=1,
+        _max=100000,
+        _step=1,
+        precision=0)
+
+    stream = forms.SwitchField(label=TooltipLabel(_('Is the answer in streaming mode'),
+                                                  _('Is the answer in streaming mode')),
+                               required=True, default_value=True)
+
+
 class BaiLianLLMModelCredential(BaseForm, BaseModelCredential):

     def is_valid(self, model_type: str, model_name, model_credential: Dict[str, object], model_params, provider,
@@ -72,4 +95,6 @@ def encryption_dict(self, model: Dict[str, object]):

     api_key = forms.PasswordInputField('API Key', required=True)

+    def get_model_params_setting_form(self, model_name):
+        if 'qwen3' in model_name:
+            return BaiLianLLMStreamModelParams()
+        return BaiLianLLMModelParams()
```
Contributor (Author) commented:

The provided code has several minor optimizations and corrections.
Here's the revised code:

```python
from django import forms
from django.utils.translation import gettext_lazy as _


class BaiLianLLMStreamModelParams(forms.Form):
    temperature = forms.FloatField(
        label=_('Temperature'),
        help_text=_('Higher values make the output more random, while lower values make it more focused and deterministic'),
        min_value=0.1, max_value=1.0,
        required=True, initial=0.7)

    max_tokens = forms.IntegerField(
        label=_('Output the maximum Tokens'),
        help_text=_('Specify the maximum number of tokens that the model can generate'),
        min_value=1, max_value=100000,
        required=True, initial=800)

    stream = forms.BooleanField(
        label=_('Is the answer in streaming mode'),
        required=False, initial=True)


class BaiLianLLMModelCredential(BaseForm, BaseModelCredential):
    api_key = forms.CharField(widget=forms.PasswordInput(), label=_('API Key'), required=True)

    def get_model_params_setting_form(self, model_name):
        """Return the correct parameter form based on the model name."""
        if 'qwen3' in model_name.lower():
            return BaiLianLLMStreamModelParams()
        return BaiLianLLMModelParams()

# Additional functions like encryption_dict() could be moved here or kept outside the main classes.
```

Summary of changes: the misplaced parentheses in the field definitions are fixed, Django's `initial`/`min_value`/`max_value` keywords replace the custom `default_value`/`_min`/`_max` names, the stream checkbox uses `required=False` so it can be toggled off, and the model name is lowercased before matching. This should improve the overall quality and maintainability of the codebase.
Review comment:

The code appears to be part of the `__init__` method for a custom GUI component named either `Switch` or `SwitchInput`. Here are some potential improvements and checks:

Issues/Improvements:

- **Component name:** both `Switch` and `SwitchInput` appear; it is unclear which name should be used consistently throughout the class.
- **Parameter naming:** (no details were given in the original comment).
- **Empty dictionary passing:** `{}` is passed directly as one of the arguments. This may mean the argument defaults to an empty dictionary, but confirm there is no unintended behavior from passing it directly.

Optimization suggestions:

- If `Switch` and `SwitchInput` are intended to represent different types of switches, consider separating them into two classes with distinct functionality.
- If the `{}` argument has a special purpose (e.g. default value handling), confirm its behavior and document it properly.

Revised code example: the example maintains consistency in naming (`Switch` vs `SwitchBase`) while clarifying the use of optional parameters and their default values. Adjustments can be made based on the actual design requirements and constraints of the project.
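The concern about the `{}` argument touches a classic Python pitfall: a mutable `{}` used as a parameter *default* is created once and shared across all calls. A minimal sketch of the safe pattern, using a hypothetical `Switch` component (not the project's actual class):

```python
class Switch:
    """Hypothetical switch component illustrating the mutable-default pitfall."""

    def __init__(self, label, attrs=None):
        # Use None as the sentinel and create a fresh dict per instance.
        # Writing `attrs={}` in the signature instead would make every
        # Switch created without an explicit attrs share the SAME dict.
        self.label = label
        self.attrs = {} if attrs is None else attrs


a = Switch('stream')
b = Switch('stream')
a.attrs['checked'] = True  # mutating a's dict must not leak into b
```

Passing a literal `{}` at a call site is harmless (each call builds a new dict); only the shared default in the signature is dangerous.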