
[Feature Request] Manage LLM model configurations via external files (config) #1587

Open · Barca0412 opened this issue Feb 11, 2025 · 5 comments
Labels: call for contribution, New Feature, P1 (Task with middle level priority)
Milestone: Sprint 24

@Barca0412

Required prerequisites

Motivation

Currently, large language model (LLM) configurations (e.g., model = ModelFactory.create(...)) are hardcoded in the codebase. This means users must modify the source code directly to add new model types or adjust configurations, which creates a maintenance burden: local modifications may conflict with upstream changes when the code is updated.

Solution

  1. Abstract LLM-related parameters into a configuration class.
  2. Manage these configurations via external files (e.g., config.json or config.yaml).

This would allow users to:

  • Customize models without altering source code.
  • Avoid conflicts during upstream updates.
  • Easily share/reuse configurations across environments.

Example Implementation

# Load model settings from an external config file (using PyYAML)
import yaml

with open("llm_config.yaml") as f:
    config = yaml.safe_load(f)

model = ModelFactory.create(**config)

This approach aligns with common practices for modular and maintainable code. Thanks for considering this suggestion!

Alternatives

No response

Additional context

No response

Barca0412 added the enhancement (New feature or request) label on Feb 11, 2025
Barca0412 changed the title from "[Feature Request]" to "[Feature Request] Manage LLM model configurations via external files (config)" on Feb 11, 2025
@lightaime (Member)

Thanks @Barca0412. This is a very good suggestion. We should support serialization and persistence in the core modules. Using pydantic could be a solution to this. cc @Wendong-Fan
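A minimal sketch of how a pydantic-backed config could look (the class name, fields, and from_yaml helper below are illustrative assumptions, not CAMEL's existing API):

# Illustrative sketch only: class, field names, and helper are assumptions,
# not CAMEL's existing API. Assumes pydantic v2 and PyYAML are installed.
from typing import Optional

import yaml
from pydantic import BaseModel, ConfigDict


class LLMConfig(BaseModel):
    # Pydantic v2 reserves the "model_" field prefix; allow it here.
    model_config = ConfigDict(protected_namespaces=())

    model_platform: str
    model_type: str
    temperature: float = 1.0
    max_tokens: Optional[int] = None

    @classmethod
    def from_yaml(cls, path: str) -> "LLMConfig":
        # Pydantic validates the external file against the declared fields.
        with open(path) as f:
            return cls(**yaml.safe_load(f))


# Expected llm_config.yaml keys (illustrative): model_platform, model_type,
# temperature, max_tokens.
config = LLMConfig.from_yaml("llm_config.yaml")
# model = ModelFactory.create(**config.model_dump())  # model_dump() is pydantic v2

A JSON config would work the same way, swapping yaml.safe_load for json.load; the validation and serialization come from the pydantic model itself.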

Wendong-Fan added the New Feature label and removed the enhancement (New feature or request) label on Feb 14, 2025
Wendong-Fan added the call for contribution and P1 (Task with middle level priority) labels on Feb 21, 2025
Wendong-Fan added this to the Sprint 24 milestone on Feb 21, 2025
@SaranshPandya

I want to contribute to this issue!!

@yiyiyi0817 (Member)

> I want to contribute to this issue!!

Thanks for your interest! Assigned to you. @SaranshPandya

@X-TRON404 (Collaborator)

Hey @SaranshPandya, let us know if you need any help from our side. Happy coding!

@SaranshPandya

> Hey @SaranshPandya, let us know if you need any help from our side. Happy coding!

Yes.
Thanks.

SaranshPandya added a commit to SaranshPandya/camel that referenced this issue Mar 14, 2025
- Added functionality to load model configurations from both YAML and JSON files.
- Implemented "create_from_yaml()" and "create_from_json()" functions.
- Improved error handling for invalid or missing configurations.
- Ensured compatibility with existing model factory logic.

Closes camel-ai#1587
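For reference, a rough sketch of what such loaders could look like (illustrative only; the actual implementation in the referenced commit may differ, e.g., it may attach these as classmethods on ModelFactory):

# Illustrative sketch; the real commit may structure this differently.
import json

import yaml

from camel.models import ModelFactory  # assumed import path


def create_from_yaml(path: str):
    # Parse the YAML config and delegate to the existing factory logic.
    with open(path) as f:
        config = yaml.safe_load(f)
    if not config:
        raise ValueError(f"Empty or invalid YAML config: {path}")
    return ModelFactory.create(**config)


def create_from_json(path: str):
    # Parse the JSON config and delegate to the existing factory logic.
    with open(path) as f:
        config = json.load(f)
    if not config:
        raise ValueError(f"Empty or invalid JSON config: {path}")
    return ModelFactory.create(**config)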