Notes to Developer
- If the current 0.X.Y version of llm factory works without issues and no change is needed, do not bump the version unless there is a real requirement (e.g. security), so that we avoid breaking changes.
- e.g.: (tcell_llm_factory==0.2.10)
- When implementing a Query Engine, return both "response" and "consumption" regardless of cloud or on-prem. On-prem engines return "consumption" as None (rather than omitting it entirely); see the sketch after this list.
- tests/test_integration.py tests that all query engines implement the same uniform input/output flow.
- For async runtimes, the LLMFactory and AsyncLLMFactory classes are used separately, as needed. A QEngine that is missing on the async side can be implemented by the relevant developer.
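A minimal sketch of the intended contract (class names and the exact dict shape are illustrative assumptions, not the package's actual API):

from abc import ABC, abstractmethod
from typing import Any


class BaseQueryEngine(ABC):
    """Every engine, cloud or on-prem, returns the same two fields."""

    @abstractmethod
    def query(self, prompt: str) -> dict[str, Any]:
        """Return {"response": str, "consumption": dict | None}."""


class CloudQueryEngine(BaseQueryEngine):
    def query(self, prompt: str) -> dict[str, Any]:
        # Cloud providers report token usage, so consumption is populated.
        return {
            "response": "...",
            "consumption": {"prompt_tokens": 0, "completion_tokens": 0},
        }


class OnPremQueryEngine(BaseQueryEngine):
    def query(self, prompt: str) -> dict[str, Any]:
        # On-prem engines return consumption=None rather than dropping the key.
        return {"response": "...", "consumption": None}


# tests/test_integration.py can then assert the shared shape for every engine:
def test_uniform_output():
    for engine in (CloudQueryEngine(), OnPremQueryEngine()):
        out = engine.query("ping")
        assert set(out) == {"response", "consumption"}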
Build
- python -m pip install -U wheel setuptools build
- increment the project version in pyproject.toml
- archrose (for mac users)
- python -m build
Upload
- artifactory urls:
- local-pypi-dist-dev/com/turkcell/sensai/llm_factory
- https://artifactory.turkcell.com.tr/ui/repos/tree/General/local-pypi-dist-dev/com/turkcell/sensai/llm_factory/
- Click Deploy
- Select Multiple Deploy and upload the new build files
- update the target path (a scripted alternative is sketched after these steps):
- com/turkcell/sensai/llm_factory/0.2.0
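As an alternative to the UI steps above, a minimal scripted sketch. It assumes the standard Artifactory REST deploy endpoint under /artifactory (an authenticated HTTP PUT per file) and a token with deploy permission; USERNAME/API_TOKEN are placeholders:

import pathlib

import requests

REPO_URL = "https://artifactory.turkcell.com.tr/artifactory/local-pypi-dist-dev"
TARGET_PATH = "com/turkcell/sensai/llm_factory/0.2.0"  # keep in sync with the pyproject version

for artifact in sorted(pathlib.Path("dist").glob("*")):
    # Artifactory deploys a file via a plain authenticated PUT to the target path.
    url = f"{REPO_URL}/{TARGET_PATH}/{artifact.name}"
    with artifact.open("rb") as fh:
        response = requests.put(url, data=fh, auth=("USERNAME", "API_TOKEN"))
    response.raise_for_status()
    print(f"deployed {artifact.name}")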
Improvement Notes (TODO):
- under the azure provider, the Azure class is imported from langchain; should this be aligned with the different framework used for openai above?
- each provider should manage its own LLM dev framework (langchain, etc.) internally; we should not get involved. These frameworks should be removed as dependencies from LLMFactory's pyproject.toml.
- for partial pip installation (e.g. pip install llm_factory[ollama]), dependency imports must also be dynamic (an example implementation is below, but in parallel let's survey best practices in the literature)
import os
from typing import List, Optional
from importlib.util import find_spec

# Import core dependencies
from . import query_engine as qe

try:
    from ..errors import LLMFactoryError
except ImportError:  # for debug runtime
    from errors import LLMFactoryError

# Initialize variables to hold imported classes
ChatOpenAI = None
AzureChatOpenAI = None
OllamaClient = None
OllamaAsyncClient = None
AzureOpenAI = None
AsyncAzureOpenAI = None


# Helper function to check if a module is installed
def is_package_installed(package_name: str) -> bool:
    return find_spec(package_name) is not None


# Dynamic imports for Azure/OpenAI
if is_package_installed("langchain_openai"):
    from langchain_openai import ChatOpenAI, AzureChatOpenAI
if is_package_installed("openai"):
    from openai import AzureOpenAI, AsyncAzureOpenAI

# Dynamic imports for Ollama
if is_package_installed("ollama"):
    from ollama import Client as OllamaClient, AsyncClient as OllamaAsyncClient


# You might want to add error classes for missing dependencies
class DependencyNotInstalledError(LLMFactoryError):
    """Raised when a required dependency is not installed."""
    pass


# Then in your code, you can check before using these classes
def get_azure_client(*args, **kwargs):
    if ChatOpenAI is None or AzureOpenAI is None:
        raise DependencyNotInstalledError(
            "Azure/OpenAI support requires additional dependencies. "
            "Please install with: pip install 'llm-factory-tcell[azure]'"
        )
    # Your Azure client initialization code here
    return AzureOpenAI(*args, **kwargs)


def get_ollama_client(*args, **kwargs):
    if OllamaClient is None:
        raise DependencyNotInstalledError(
            "Ollama support requires additional dependencies. "
            "Please install with: pip install 'llm-factory-tcell[ollama]'"
        )
    # Your Ollama client initialization code here
    return OllamaClient(*args, **kwargs)


# Example usage in a factory method
def create_llm_client(provider: str, *args, **kwargs):
    if provider == "azure":
        return get_azure_client(*args, **kwargs)
    elif provider == "ollama":
        return get_ollama_client(*args, **kwargs)
    # Add other providers as needed
    else:
        raise ValueError(f"Unsupported provider: {provider}")
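For reference, a usage sketch of the factory above; the keyword arguments are placeholders (ollama's Client accepts a host, openai's AzureOpenAI accepts api_key/api_version/azure_endpoint):

# Requires the matching extra, e.g. pip install 'llm-factory-tcell[ollama]'
ollama_client = create_llm_client("ollama", host="http://localhost:11434")

# Raises DependencyNotInstalledError if the [azure] extra is not installed.
azure_client = create_llm_client(
    "azure",
    api_key="...",  # placeholder credentials
    api_version="2024-02-01",
    azure_endpoint="https://example.openai.azure.com",
)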
[project]
name = "llm-factory-tcell"
version = "0.2.12"
authors = [
{ name="Mithat Sinan Ergen", email="mithat.ergen@turkcell.com.tr" },
{ name="Emre Çalışır", email="emre.calisir@atmoswere.turkcell.com.tr" },
{ name="Umut Alihan Dikel", email="alihan.dikel@turkcell.com.tr" },
{ name="Mahmut Yılmaz", email="yilmaz.mahmut@turkcell.com.tr" }
]
description = "LLM Factory package"
readme = "README.md"
requires-python = ">=3.11"
classifiers = [
"Programming Language :: Python :: 3",
"Operating System :: OS Independent",
]
# Core dependencies that are always needed
dependencies = [
"fastapi>=0.110.0",
"langchain>=0.2.0",
"langchain-community>=0.2.0",
"langchain-core>=0.2.0",
"pydantic>=2.0.0",
"SQLAlchemy>=2.0.0",
"loguru>=0.7.0",
"httpx>=0.24.0",
"rich>=13.0.0",
]
[project.optional-dependencies]
dev = [
"pytest>=8.0.0",
"pytest-asyncio>=0.23.0",
"pytest-cov>=5.0.0",
]
ollama = [
"ollama>=0.4.6",
]
vllm = [
"vllm>=0.2.0",
]
azure = [
"langchain-openai>=0.1.0",
"openai>=1.0.0",
"tiktoken>=0.7.0",
]
[build-system]
requires = ["setuptools>=61.0", "wheel"]
build-backend = "setuptools.build_meta"
[tool.setuptools.packages.find]
where = ["src"]
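With the extras above, consumers install only what they need, e.g. pip install 'llm-factory-tcell[ollama]' or pip install 'llm-factory-tcell[azure,dev]'.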
Generic reference template for core + optional dependencies (setuptools reads these from the [project] table):

[build-system]
requires = ["setuptools>=61.0"]
build-backend = "setuptools.build_meta"

[tool.setuptools]
packages = ["your_library"]

[project]
name = "your_library"
version = "0.1.0"
# Required dependencies for your library
dependencies = [
    "requests>=2.25.0",
]

# Optional dependencies (extras), e.g. pip install 'your_library[ml]'
[project.optional-dependencies]
data-analysis = ["pandas", "numpy"]
web-scraping = ["beautifulsoup4", "requests"]
ml = ["scikit-learn", "tensorflow"]
dev = ["black", "flake8", "pre-commit"]
testing = ["pytest", "coverage"]