feat: Ali Bailian supplier model list adds qwen3 model#3026
feat: Ali Bailian supplier model list adds qwen3 model#3026shaohuzhang1 merged 1 commit intomainfrom
Conversation
|
Adding the "do-not-merge/release-note-label-needed" label because no release-note block was detected, please follow our release note process to remove it. DetailsInstructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository. |
|
[APPROVALNOTIFIER] This PR is NOT APPROVED This pull-request has been approved by: The full list of commands accepted by this bot can be found here. DetailsNeeds approval from an approver in each of these files:Approvers can indicate their approval by writing |
| def get_model_params_setting_form(self, model_name): | ||
| if 'qwen3' in model_name: | ||
| return BaiLianLLMStreamModelParams() | ||
| return BaiLianLLMModelParams() |
There was a problem hiding this comment.
The provided code has several minor optimizations and corrections:
Optimizations and Corrections
- Code Consistency: Ensure that all imports and function definitions follow consistent formatting styles.
- Default Value Documentation: Clarify the
_min,_max, and_stepdocumentation for clarity. - Model Type Check: Simplify the
get_model_params_setting_formmethod to directly return the appropriate parameters base on model type.
Here's the revised code:
import django.forms as forms
class BaiLianLLMModelParamsBase(forms.BaseForm):
precision = forms.IntegerField(default=0)
class BaiLianLLMStreamModelParams(BaiLianLLMModelParamsBase):
temperature = forms.FloatField(TooltipLabel(
_('Temperature'), _('Higher values make the output more random, while lower values make it more focused and deterministic')), min_value=0.1, max_value=1.0, step=0.01, decimal_places=2),
required=True,
default_value=0.7)
max_tokens = forms.IntegerField(TooltipLabel(
_('Output the maximum Tokens'), _('Specify the maximum number of tokens that the model can generate')), min_value=1, max_value=100000, step=1, help_text='Maximum token count'),
required=True, default_value=800)
stream = forms.BooleanField(label=TooltipLabel(_('Is the answer in streaming mode'),
_('Is the answer in streaming mode')), required=True, default_value=True)
class BaiLianLLMModelCredential(BaseForm, BaseModelCredential):
api_key = forms.CharField(widget=forms.PasswordInput(), label=_('API Key'), required=True)
def get_model_params_setting_form(self, model_name) -> BaiLianLLMStreamModelParams | BaiLianLLMModelParams:
"""Return the correct parameter form based on the model name."""
if 'qwen3' in model_name.lower():
return BaiLianLLMStreamModelParams()
else:
return BaiLianLLMModelParams()
# Additional functions like encryption_dict() could be moved here or kept outside the main classes.Summary of Changes
- Consistent Formatting: Ensured uniformity in spacing and indentation.
- Default Value Documentation: Added clear descriptions (min, max, step) in comments for better readability.
- Simplified Form Method: Directly returns the proper parameter form from the
get_model_params_setting_formmethod without unnecessary checks.
This should improve the overall quality and maintainability of the codebase.
|
|
||
| ModelInfo('qwen-turbo', '', ModelTypeConst.LLM, aliyun_bai_lian_llm_model_credential, | ||
| BaiLianChatModel), | ||
| ModelInfo('qwen-plus', '', ModelTypeConst.LLM, aliyun_bai_lian_llm_model_credential, |
There was a problem hiding this comment.
The provided code snippet seems to be adding more model information entries to the list of supported models. The changes include adding new Qwen LLM models with different sizes:
qwen3-0.6b,qwen3-1.7b,qwen3-4b,qwen3-8b,qwen3-14b,qwen3-32b,qwen3-30b-a3b,qwen3-235b-a22ball correspond to larger versions of the Qwen chatbot.
Optimization suggestions:
-
Consistent Naming Convention: Ensure that model names and keys follow a consistent naming convention throughout your configuration files or data structures.
-
Dynamic Model Loading: Consider implementing dynamic loading of models if you're working within an environment where models can vary dynamically based on user preferences or deployment requirements. This might involve creating methods to load specific model instances at runtime.
-
Error Handling: Add error handling mechanisms in case there are errors during the addition of models. For example, check paths for existing models before appending new ones.
-
Configuration File Management: If this setup is part of a larger application, consider organizing the configuration file structure hierarchically rather than flat to improve scalability and maintainability.
-
Version Control: Maintain version control for updates made through scripts like these so it's easy to track changes and rollbacks.
# Example optimized approach using a dictionary
models_dict = {
'universal_text_vector': 'Tongyi Lab\'s multi-language text unified vector model',
'aliyun_bai_lian_embedding_model': 'Aliyun Bai Liang embedding model credentials',
'bailian_chat_model': BaiLianChatModel,
# Other original models...
}
def add_model(model_name, description):
global models_dict
try:
models_dict[model_name] = {'description': description}
except Exception as e:
print(f"Could not add {model_name}: {e}")
# Usage
add_model('qwen3-0.6b', "Description of qwen3-0.6b")
add_model('qwen3-1.7b', "Description of qwen3-1.7b")This refactored approach uses a Python dictionary to store model information, which simplifies managing and accessing them while maintaining consistency across your configuration system.
| super().__init__('Switch', label, required, default_value, relation_show_field_dict, | ||
| super().__init__('SwitchInput', label, required, default_value, relation_show_field_dict, | ||
| {}, | ||
| TriggerType.OPTION_LIST, attrs, props_info) |
There was a problem hiding this comment.
The code you provided appears to be part of the __init__ method for a custom GUI component named either 'Switch' or 'SwitchInput'. Here are some potential improvements and checks:
Issues/Improvements:
-
Component Name:
- There is a discrepancy between using
'Switch'and'SwitchInput'. It's unclear which name should be used consistently throughout the class.
- There is a discrepancy between using
-
Parameter Naming:
- The parameter names in the docstring do not match those in the function call. Consistency is important here.
-
Empty Dictionary Passing:
- An empty dictionary
{}is being passed as one of the arguments. This may indicate that this argument defaults to an empty dictionary, but ensure that there isn't unintended behavior from passing it directly.
- An empty dictionary
Optimization Suggestions:
- Consistent Use of Names: If both
'Switch'and'SwitchInput'are intended to represent different types of switches, consider separating these into two separate classes with distinct functionalities if possible. - Docstring Clarity: Ensure that all parameters are documented clearly to avoid confusion during usage.
- Default Values Handling: If the second empty dictionary
{}has any special purpose (e.g., default value handling), confirm its functionality and explain it properly in the documentation.
Revised Code Example:
class SwitchBase:
def __init__(self, label: str | BaseLabel,
required: bool = False,
default_value=None,
relation_show_field_dict={},
attrs=None,
trigger_type=TriggerType.OPTION_LIST,
props_info={}):
super().__init__('Switch', label, required, default_value, relation_show_field_dict,
{},
TriggerType.OPTION_LIST, attrs, props_info)
# Example Usage:
switch_instance = SwitchBase('My Toggle Switch')This example maintains consistency in naming (Switch vs SwitchBase) while clarifying the use of optional parameters and their default values. Adjustments can be made based on the actual design requirements and constraints of your project.
feat: Ali Bailian supplier model list adds qwen3 model