mcphost-api/mcphost_openai_api.log
2025-05-10 17:10:23.482 | INFO | __main__:<module>:378 - Starting OpenAI-compatible MCPhost API server...
2025-05-10 17:10:23.482 | INFO | __main__:<module>:379 - Configuration:
2025-05-10 17:10:23.483 | INFO | __main__:<module>:380 - MCPhost Path: ./mcphost
2025-05-10 17:10:23.483 | INFO | __main__:<module>:381 - Model: openai:qwen3:8b-q8_0
2025-05-10 17:10:23.483 | INFO | __main__:<module>:382 - OpenAI URL: http://0.0.0.0:33740/v1
2025-05-10 17:10:23.483 | INFO | __main__:<module>:383 - Debug: True
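The startup banner above records the server's configuration: the MCPhost binary path, the model string passed through to the backend, the OpenAI-compatible backend URL, and the debug flag. A minimal sketch of how that configuration could be assembled into the MCPhost command line and environment that the later lifespan entries show; the Config/build_command names are illustrative, not the actual implementation.

    import os
    from dataclasses import dataclass

    @dataclass
    class Config:
        mcphost_path: str = "./mcphost"
        model: str = "openai:qwen3:8b-q8_0"
        openai_url: str = "http://0.0.0.0:33740/v1"
        openai_api_key: str = "anything"
        config_file: str = "./config.json"
        debug: bool = True

    def build_command(cfg: Config) -> str:
        # Mirrors the command logged by lifespan below
        parts = [cfg.mcphost_path]
        if cfg.debug:
            parts.append("--debug")
        parts += ["--config", cfg.config_file,
                  "--model", cfg.model,
                  "--openai-url", cfg.openai_url,
                  "--openai-api-key", cfg.openai_api_key]
        return " ".join(parts)

    def build_env() -> dict:
        # The log also shows OLLAMA_NUM_CTX=40960 being passed to the child process
        env = os.environ.copy()
        env["OLLAMA_NUM_CTX"] = "40960"
        return env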
2025-05-10 17:10:25.677 | ERROR | __main__:chat_completions:216 - MCPhost process object is None
2025-05-10 17:11:54.622 | INFO | __main__:<module>:367 - Starting OpenAI-compatible MCPhost API server...
2025-05-10 17:11:54.622 | INFO | __main__:<module>:368 - Configuration:
2025-05-10 17:11:54.622 | INFO | __main__:<module>:369 - MCPhost Path: ./mcphost
2025-05-10 17:11:54.622 | INFO | __main__:<module>:370 - Model: openai:qwen3:8b-q8_0
2025-05-10 17:11:54.622 | INFO | __main__:<module>:371 - OpenAI URL: http://0.0.0.0:33740/v1
2025-05-10 17:11:54.622 | INFO | __main__:<module>:372 - Debug: True
2025-05-10 17:11:56.959 | ERROR | __main__:chat_completions:205 - MCPhost process object is None
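Both runs above fail inside the request handler with "MCPhost process object is None": a chat completion arrived while no MCPhost child process had been created, so there was nothing to forward the prompt to. A hedged sketch of the kind of guard that produces this message; the global handle name and the 503 response are assumptions, not the actual code.

    from fastapi import FastAPI, HTTPException
    from loguru import logger

    app = FastAPI()
    mcphost_process = None  # set by the lifespan handler once MCPhost has been spawned

    @app.post("/v1/chat/completions")
    async def chat_completions(request: dict):
        # Refuse to serve requests until the MCPhost child process exists
        if mcphost_process is None:
            logger.error("MCPhost process object is None")
            raise HTTPException(status_code=503, detail="MCPhost backend is not running")
        ...  # forward request["messages"] to MCPhost and build the OpenAI-style response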
2025-05-10 17:14:38.976 | INFO | __main__:<module>:367 - Starting OpenAI-compatible MCPhost API server...
2025-05-10 17:14:38.976 | INFO | __main__:<module>:368 - Configuration:
2025-05-10 17:14:38.976 | INFO | __main__:<module>:369 - MCPhost Path: ./mcphost
2025-05-10 17:14:38.976 | INFO | __main__:<module>:370 - Model: openai:qwen3:8b-q8_0
2025-05-10 17:14:38.976 | INFO | __main__:<module>:371 - OpenAI URL: http://0.0.0.0:33740/v1
2025-05-10 17:14:38.976 | INFO | __main__:<module>:372 - Debug: True
2025-05-10 17:14:38.984 | INFO | __main__:lifespan:74 - Starting MCPhost with command: ./mcphost --debug --config ./config.json --model openai:qwen3:8b-q8_0 --openai-url http://0.0.0.0:33740/v1 --openai-api-key anything
2025-05-10 17:14:38.984 | INFO | __main__:lifespan:75 - Environment: OLLAMA_NUM_CTX=40960
2025-05-10 17:14:39.002 | INFO | __main__:lifespan:80 - MCPhost process spawned with PID: 565922
2025-05-10 17:14:39.003 | INFO | __main__:lifespan:89 - Waiting for model to load...
2025-05-10 17:14:39.019 | ERROR | __main__:lifespan:142 - Failed to start MCPhost: 'function' object has no attribute 'write'
2025-05-10 17:14:39.019 | ERROR | __main__:lifespan:144 - MCPhost final output:
2025-05-10 17:14:39.019 | ERROR | __main__:lifespan:145 - Process alive: True
2025-05-10 17:14:39.019 | INFO | __main__:lifespan:151 - Shutting down MCPhost...
2025-05-10 17:14:39.019 | INFO | __main__:lifespan:159 - Terminating MCPhost process...
2025-05-10 17:14:39.120 | INFO | __main__:lifespan:162 - MCPhost process stopped
2025-05-10 17:16:06.172 | INFO | __main__:<module>:367 - Starting OpenAI-compatible MCPhost API server...
2025-05-10 17:16:06.172 | INFO | __main__:<module>:368 - Configuration:
2025-05-10 17:16:06.172 | INFO | __main__:<module>:369 - MCPhost Path: ./mcphost
2025-05-10 17:16:06.172 | INFO | __main__:<module>:370 - Model: openai:qwen3:8b-q8_0
2025-05-10 17:16:06.172 | INFO | __main__:<module>:371 - OpenAI URL: http://0.0.0.0:33740/v1
2025-05-10 17:16:06.173 | INFO | __main__:<module>:372 - Debug: True
2025-05-10 17:16:06.180 | INFO | __main__:lifespan:74 - Starting MCPhost with command: ./mcphost --debug --config ./config.json --model openai:qwen3:8b-q8_0 --openai-url http://0.0.0.0:33740/v1 --openai-api-key anything
2025-05-10 17:16:06.180 | INFO | __main__:lifespan:75 - Environment: OLLAMA_NUM_CTX=40960
2025-05-10 17:16:06.199 | INFO | __main__:lifespan:80 - MCPhost process spawned with PID: 565949
2025-05-10 17:16:06.199 | ERROR | __main__:lifespan:142 - Failed to start MCPhost: name 'LoggerWriter' is not defined
2025-05-10 17:16:06.199 | ERROR | __main__:lifespan:144 - MCPhost final output: None
2025-05-10 17:16:06.199 | ERROR | __main__:lifespan:145 - Process alive: True
2025-05-10 17:16:06.199 | INFO | __main__:lifespan:151 - Shutting down MCPhost...
2025-05-10 17:16:06.200 | INFO | __main__:lifespan:159 - Terminating MCPhost process...
2025-05-10 17:16:06.300 | INFO | __main__:lifespan:162 - MCPhost process stopped
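The two startup failures above point at the same spot: pexpect's logfile hook needs a file-like object with write() and flush(), so handing it a bare logging function fails with "'function' object has no attribute 'write'", and the follow-up attempt fails because the LoggerWriter adapter was not imported where it is used. The later runs log child output through local_logger:write:11 (later helpers.local_logger:write:11), which is consistent with an adapter along these lines; the exact class body is an assumption.

    # helpers/local_logger.py (assumed location, based on the helpers.local_logger:write origin seen later)
    from loguru import logger

    class LoggerWriter:
        """File-like adapter so pexpect's logfile can forward captured output to loguru."""

        def __init__(self, level: str = "DEBUG"):
            self.level = level

        def write(self, message: str) -> None:
            # pexpect calls write() with whatever the child printed, escape sequences included
            if message and message.strip("\r\n"):
                logger.log(self.level, message)

        def flush(self) -> None:
            # pexpect also calls flush(); nothing is buffered here
            pass

Attached as child.logfile_read = LoggerWriter(), only the child's output (not what the server types into it) is mirrored into the log, which matches the DEBUG lines in the later runs.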
2025-05-10 17:16:52.374 | INFO | __main__:<module>:368 - Starting OpenAI-compatible MCPhost API server...
2025-05-10 17:16:52.374 | INFO | __main__:<module>:369 - Configuration:
2025-05-10 17:16:52.374 | INFO | __main__:<module>:370 - MCPhost Path: ./mcphost
2025-05-10 17:16:52.374 | INFO | __main__:<module>:371 - Model: openai:qwen3:8b-q8_0
2025-05-10 17:16:52.374 | INFO | __main__:<module>:372 - OpenAI URL: http://0.0.0.0:33740/v1
2025-05-10 17:16:52.374 | INFO | __main__:<module>:373 - Debug: True
2025-05-10 17:16:52.382 | INFO | __main__:lifespan:75 - Starting MCPhost with command: ./mcphost --debug --config ./config.json --model openai:qwen3:8b-q8_0 --openai-url http://0.0.0.0:33740/v1 --openai-api-key anything
2025-05-10 17:16:52.382 | INFO | __main__:lifespan:76 - Environment: OLLAMA_NUM_CTX=40960
2025-05-10 17:16:52.403 | INFO | __main__:lifespan:81 - MCPhost process spawned with PID: 565987
2025-05-10 17:16:52.403 | INFO | __main__:lifespan:90 - Waiting for model to load...
2025-05-10 17:16:52.420 | DEBUG | local_logger:write:11 - ]11;?\
2025-05-10 17:16:57.428 | DEBUG | local_logger:write:11 - ]10;?\
2025-05-10 17:17:02.434 | DEBUG | local_logger:write:11 - ]11;?\
2025-05-10 17:17:07.440 | DEBUG | local_logger:write:11 - 2025/05/10 17:17:07 INFO <cmd/root.go:495> Model loaded provider=openai model=qwen3:8b-q8_0
2025-05-10 17:17:07.440 | INFO | __main__:lifespan:98 - MCPhost model loaded successfully
2025-05-10 17:17:07.440 | INFO | __main__:lifespan:109 - Waiting for prompt...
2025-05-10 17:17:07.441 | DEBUG | local_logger:write:11 - 2025/05/10 17:17:07 INFO <cmd/mcp.go:180> Created default config file path=./config.json
[?25l[?2004h
2025-05-10 17:17:07.458 | DEBUG | local_logger:write:11 -
┃ Enter your prompt (Type /help for commands, Ctrl+C to quit)
alt+enter / ctrl+j new line • ctrl+e open editor • enter submit
[17:17:07.974 - 17:17:17.008: 18 further DEBUG local_logger:write:11 entries, each capturing a cursor-blink redraw of the empty prompt box]
2025-05-10 17:17:17.441 | ERROR | __main__:lifespan:119 - Timeout waiting for prompt
2025-05-10 17:17:17.441 | ERROR | __main__:lifespan:120 - Output so far: provider=openai model=qwen3:8b-q8_0
2025/05/10 17:17:07 INFO <cmd/mcp.go:180> Created default config file path=./config.json
[followed by the same prompt-box UI and cursor-blink redraws shown above]
2025-05-10 17:17:17.442 | ERROR | __main__:lifespan:143 - Failed to start MCPhost: Timeout waiting for prompt
2025-05-10 17:17:17.442 | ERROR | __main__:lifespan:145 - MCPhost final output: provider=openai model=qwen3:8b-q8_0
2025/05/10 17:17:07 INFO <cmd/mcp.go:180> Created default config file path=./config.json
[followed by the same prompt-box UI and cursor-blink redraws shown above]
2025-05-10 17:17:17.442 | ERROR | __main__:lifespan:146 - Process alive: True
2025-05-10 17:17:17.442 | INFO | __main__:lifespan:152 - Shutting down MCPhost...
2025-05-10 17:17:17.442 | INFO | __main__:lifespan:160 - Terminating MCPhost process...
2025-05-10 17:17:17.543 | INFO | __main__:lifespan:163 - MCPhost process stopped
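This run gets past model loading but times out at "Waiting for prompt...": the interactive prompt box is clearly rendered (and keeps redrawing its blinking cursor), yet the expected pattern never matches within the ten-second window and lifespan gives up. A hedged reconstruction of that startup sequence with pexpect; the patterns and timeouts are inferred from the log timings, not taken from the source.

    import pexpect
    from loguru import logger

    def start_mcphost(command: str, env: dict) -> pexpect.spawn:
        child = pexpect.spawn(command, env=env, encoding="utf-8", timeout=60)
        child.logfile_read = LoggerWriter()  # adapter from the sketch above
        logger.info(f"MCPhost process spawned with PID: {child.pid}")

        logger.info("Waiting for model to load...")
        child.expect("Model loaded", timeout=60)
        logger.info("MCPhost model loaded successfully")

        logger.info("Waiting for prompt...")
        try:
            # If the expected text is interleaved with ANSI escape sequences or
            # wrapped across lines by the TUI, this raises TIMEOUT even though
            # the prompt is visibly on screen, which is what the log above shows.
            child.expect("Enter your prompt", timeout=10)
        except pexpect.TIMEOUT:
            logger.error("Timeout waiting for prompt")
            logger.error(f"Output so far: {child.before}")
            raise
        return child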
2025-05-10 17:22:30.034 | INFO | __main__:<module>:368 - Starting OpenAI-compatible MCPhost API server...
2025-05-10 17:22:30.034 | INFO | __main__:<module>:369 - Configuration:
2025-05-10 17:22:30.034 | INFO | __main__:<module>:370 - MCPhost Path: ./mcphost
2025-05-10 17:22:30.035 | INFO | __main__:<module>:371 - Model: openai:qwen3:8b-q8_0
2025-05-10 17:22:30.035 | INFO | __main__:<module>:372 - OpenAI URL: http://0.0.0.0:33740/v1
2025-05-10 17:22:30.035 | INFO | __main__:<module>:373 - Debug: True
2025-05-10 17:22:30.207 | INFO | __main__:lifespan:75 - Starting MCPhost with command: ./mcphost --debug --config ./config.json --model openai:qwen3:8b-q8_0 --openai-url http://0.0.0.0:33740/v1 --openai-api-key anything
2025-05-10 17:22:30.207 | INFO | __main__:lifespan:76 - Environment: OLLAMA_NUM_CTX=40960
2025-05-10 17:22:30.227 | ERROR | __main__:lifespan:83 - Failed to spawn MCPhost process: [Errno 8] Exec format error: b'./mcphost'
2025-05-10 17:22:30.228 | ERROR | __main__:lifespan:143 - Failed to start MCPhost: [Errno 8] Exec format error: b'./mcphost'
2025-05-10 17:22:30.228 | INFO | __main__:lifespan:152 - Shutting down MCPhost...
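"[Errno 8] Exec format error" is raised by the operating system before MCPhost runs at all: the file at ./mcphost is not something the kernel can execute on this machine, typically a binary built for a different architecture, a partial download, or a non-executable placeholder. A small, generic pre-flight check (not part of the server) that distinguishes the usual cases by inspecting the file's magic bytes:

    import platform

    def describe_executable(path: str) -> str:
        """Rough diagnosis for 'Exec format error': look at the file's first bytes."""
        with open(path, "rb") as f:
            magic = f.read(4)
        if magic == b"\x7fELF":
            return f"ELF binary (host is {platform.machine()}; check it was built for this architecture)"
        if magic[:2] == b"#!":
            return "script with a shebang line"
        if magic[:2] == b"MZ":
            return "Windows PE executable, not runnable directly on Linux"
        return f"unrecognised format, first bytes: {magic!r}"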
2025-05-10 17:23:22.399 | INFO | __main__:<module>:368 - Starting OpenAI-compatible MCPhost API server...
2025-05-10 17:23:22.400 | INFO | __main__:<module>:369 - Configuration:
2025-05-10 17:23:22.400 | INFO | __main__:<module>:370 - MCPhost Path: ./mcphost
2025-05-10 17:23:22.400 | INFO | __main__:<module>:371 - Model: openai:qwen3:8b-q8_0
2025-05-10 17:23:22.400 | INFO | __main__:<module>:372 - OpenAI URL: http://0.0.0.0:33740/v1
2025-05-10 17:23:22.400 | INFO | __main__:<module>:373 - Debug: True
2025-05-10 17:23:22.545 | INFO | __main__:lifespan:75 - Starting MCPhost with command: ./mcphost --debug --config ./config.json --model openai:qwen3:8b-q8_0 --openai-url http://0.0.0.0:33740/v1 --openai-api-key anything
2025-05-10 17:23:22.546 | INFO | __main__:lifespan:76 - Environment: OLLAMA_NUM_CTX=40960
2025-05-10 17:23:22.571 | INFO | __main__:lifespan:81 - MCPhost process spawned with PID: 4385
2025-05-10 17:23:22.572 | INFO | __main__:lifespan:90 - Waiting for model to load...
2025-05-10 17:23:22.574 | DEBUG | local_logger:write:11 - pydev debugger: bytes arguments were passed to a new process creation function. Breakpoints may not work correctly.
2025-05-10 17:23:23.188 | DEBUG | local_logger:write:11 - ]11;?\
2025-05-10 17:23:28.196 | DEBUG | local_logger:write:11 - ]10;?\
2025-05-10 17:23:33.199 | DEBUG | local_logger:write:11 - ]11;?\
2025-05-10 17:23:38.203 | DEBUG | local_logger:write:11 - 2025/05/10 17:23:38 INFO <cmd/root.go:495> Model loaded provider=openai model=qwen3:8b-q8_0
2025-05-10 17:23:38.204 | INFO | __main__:lifespan:98 - MCPhost model loaded successfully
2025-05-10 17:23:38.205 | INFO | __main__:lifespan:109 - Waiting for prompt...
2025-05-10 17:23:38.206 | DEBUG | local_logger:write:11 - [?25l[?2004h
2025-05-10 17:23:38.222 | DEBUG | local_logger:write:11 -
┃ Enter your prompt (Type /help for commands, Ctrl+C to 
quit) 






alt+enter / ctrl+j new line • ctrl+e open editor • enter submit
2025-05-10 17:23:48.211 | ERROR | __main__:lifespan:119 - Timeout waiting for prompt
2025-05-10 17:23:48.211 | ERROR | __main__:lifespan:120 - Output so far: provider=openai model=qwen3:8b-q8_0
[?25l[?2004h
┃ Enter your prompt (Type /help for commands, Ctrl+C to 
quit) 






alt+enter / ctrl+j new line • ctrl+e open editor • enter submit
2025-05-10 17:23:48.212 | ERROR | __main__:lifespan:143 - Failed to start MCPhost: Timeout waiting for prompt
2025-05-10 17:23:48.212 | ERROR | __main__:lifespan:145 - MCPhost final output: provider=openai model=qwen3:8b-q8_0
[?25l[?2004h
┃ Enter your prompt (Type /help for commands, Ctrl+C to 
quit) 






alt+enter / ctrl+j new line • ctrl+e open editor • enter submit
2025-05-10 17:23:48.212 | ERROR | __main__:lifespan:146 - Process alive: True
2025-05-10 17:23:48.213 | INFO | __main__:lifespan:152 - Shutting down MCPhost...
2025-05-10 17:23:48.213 | INFO | __main__:lifespan:160 - Terminating MCPhost process...
2025-05-10 17:23:48.315 | INFO | __main__:lifespan:163 - MCPhost process stopped
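The same timeout reproduces here even though the prompt box is present in the captured output, which suggests the expected text is being broken up by the terminal control sequences that litter this log (the [?25l[?2004h fragments and cursor-movement codes). A common workaround, offered here only as a sketch, is to strip ANSI/VT sequences from the captured output before pattern matching:

    import re

    # CSI sequences (ESC [ ... final byte) and OSC sequences (ESC ] ... BEL or ESC \)
    ANSI_RE = re.compile(r"\x1b\[[0-9;?]*[ -/]*[@-~]|\x1b\][^\x07\x1b]*(?:\x07|\x1b\\)")

    def strip_ansi(text: str) -> str:
        """Remove terminal control sequences so plain-text patterns match reliably."""
        return ANSI_RE.sub("", text)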
2025-05-10 17:27:44.919 | INFO | __main__:<module>:375 - Starting OpenAI-compatible MCPhost API server...
2025-05-10 17:27:44.920 | INFO | __main__:<module>:376 - Configuration:
2025-05-10 17:27:44.921 | INFO | __main__:<module>:377 - MCPhost Path: ./mcphost
2025-05-10 17:27:44.921 | INFO | __main__:<module>:378 - Model: openai:qwen3:8b-q8_0
2025-05-10 17:27:44.921 | INFO | __main__:<module>:379 - OpenAI URL: http://0.0.0.0:33740/v1
2025-05-10 17:27:44.921 | INFO | __main__:<module>:380 - Debug: True
2025-05-10 17:27:45.074 | INFO | __main__:lifespan:75 - Starting MCPhost with command: ./mcphost --debug --config ./config.json --model openai:qwen3:8b-q8_0 --openai-url http://0.0.0.0:33740/v1 --openai-api-key anything
2025-05-10 17:27:45.074 | INFO | __main__:lifespan:76 - Environment: OLLAMA_NUM_CTX=40960
2025-05-10 17:27:45.094 | INFO | __main__:lifespan:81 - MCPhost process spawned with PID: 4500
2025-05-10 17:27:45.095 | INFO | __main__:lifespan:90 - Waiting for model to load...
2025-05-10 17:27:45.096 | DEBUG | local_logger:write:11 - pydev debugger: bytes arguments were passed to a new process creation function. Breakpoints may not work correctly.
2025-05-10 17:27:45.185 | DEBUG | local_logger:write:11 - ]11;?\
2025-05-10 17:27:50.203 | DEBUG | local_logger:write:11 - ]10;?\
2025-05-10 17:27:55.206 | DEBUG | local_logger:write:11 - ]11;?\
2025-05-10 17:28:00.209 | DEBUG | local_logger:write:11 - 2025/05/10 17:28:00 INFO <cmd/root.go:495> Model loaded provider=openai model=qwen3:8b-q8_0
2025-05-10 17:28:00.210 | INFO | __main__:lifespan:98 - MCPhost model loaded successfully
2025-05-10 17:28:00.211 | INFO | __main__:lifespan:109 - Waiting for prompt...
2025-05-10 17:28:00.213 | DEBUG | local_logger:write:11 - [?25l[?2004h
2025-05-10 17:28:00.230 | DEBUG | local_logger:write:11 -
┃ Enter your prompt (Type /help for commands, Ctrl+C to 
quit) 






alt+enter / ctrl+j new line • ctrl+e open editor • enter submit
2025-05-10 17:28:00.231 | INFO | __main__:lifespan:128 - MCPhost process started and ready
2025-05-10 17:28:00.232 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:28:00.232 | INFO | __main__:lifespan:137 - Started process monitoring thread
2025-05-10 17:28:00.234 | INFO | __main__:lifespan:141 - MCPhost process is alive and ready to accept requests
2025-05-10 17:28:05.248 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:28:10.266 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:28:15.281 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:28:20.298 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
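From 17:28:00 onward a background thread reports "MCPhost process status - Alive: ..., Exit status: ..." roughly every five seconds. A minimal sketch of such a watchdog; the monitor_process name matches the origin shown in the log, the body and start-up call are assumptions.

    import threading
    import time
    from loguru import logger

    def monitor_process(child, interval: float = 5.0) -> None:
        # Periodically report whether the pexpect child is still running
        while child.isalive():
            logger.info(f"MCPhost process status - Alive: {child.isalive()}, Exit status: {child.exitstatus}")
            time.sleep(interval)
        logger.error(f"MCPhost process exited with status {child.exitstatus}")

    # started from the lifespan handler once the prompt is ready, e.g.:
    # threading.Thread(target=monitor_process, args=(child,), daemon=True).start()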
2025-05-10 17:28:23.421 | DEBUG | __main__:chat_completions:231 - Sending to MCPhost: Hello, how are you?
[17:28:23.580 - 17:28:23.626: ~90 DEBUG local_logger:write:11 entries, each capturing a single character of PTY output - cursor-movement escape fragments ([8A, [B, [0K, [80D) interleaved with the prompt box redrawing and echoing the typed request "Hello, how are you?"]
2025-05-10 17:28:25.314 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:28:30.327 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:28:35.340 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:28:40.356 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:28:45.373 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:28:50.390 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:28:53.604 | DEBUG | __main__:chat_completions:299 - MCPhost response: ┃ Hello,  ┃ Hello, how are you?  
2025-05-10 17:28:55.406 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:29:00.422 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:29:05.443 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:29:10.464 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:29:15.481 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:29:20.499 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:29:25.519 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:29:30.539 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:29:35.558 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:29:40.577 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:29:45.597 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:29:50.617 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:29:55.634 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:30:00.653 | INFO | __main__:monitor_process:39 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:30:03.910 | INFO | __main__:lifespan:158 - Shutting down MCPhost...
2025-05-10 17:30:04.668 | INFO | __main__:lifespan:166 - Terminating MCPhost process...
2025-05-10 17:30:05.273 | INFO | __main__:lifespan:169 - MCPhost process stopped
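Worth noting about this first successful exchange: the captured "response" at 17:28:53 is "┃ Hello,  ┃ Hello, how are you?", i.e. the TUI's own echo of the typed request, box-drawing characters included, rather than the model's answer. The runs that follow capture a cleaner string, but it is still the echoed prompt text. A hedged sketch of the kind of post-processing that removes the echo and terminal artifacts from whatever pexpect captures; the server's actual extraction logic may differ.

    def clean_mcphost_output(raw: str, prompt: str) -> str:
        """Strip terminal artifacts and the echoed prompt from captured MCPhost output."""
        text = strip_ansi(raw)  # helper sketched earlier in this log
        lines = []
        for line in text.splitlines():
            line = line.replace("\u2503", " ").strip()  # drop the box-drawing bar (U+2503)
            if not line or line == prompt:
                continue  # skip blank lines and the echoed request
            lines.append(line)
        return "\n".join(lines)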
2025-05-10 17:31:20.410 | INFO | __main__:<module>:404 - Starting OpenAI-compatible MCPhost API server...
2025-05-10 17:31:20.411 | INFO | __main__:<module>:405 - Configuration:
2025-05-10 17:31:20.411 | INFO | __main__:<module>:406 - MCPhost Path: ./mcphost
2025-05-10 17:31:20.411 | INFO | __main__:<module>:407 - Model: openai:qwen3:8b-q8_0
2025-05-10 17:31:20.412 | INFO | __main__:<module>:408 - OpenAI URL: http://0.0.0.0:33740/v1
2025-05-10 17:31:20.412 | INFO | __main__:<module>:409 - Debug: True
2025-05-10 17:31:20.550 | INFO | __main__:lifespan:75 - Starting MCPhost with command: ./mcphost --debug --config ./config.json --model openai:qwen3:8b-q8_0 --openai-url http://0.0.0.0:33740/v1 --openai-api-key anything
2025-05-10 17:31:20.551 | INFO | __main__:lifespan:76 - Environment: OLLAMA_NUM_CTX=40960
2025-05-10 17:31:20.574 | INFO | __main__:lifespan:81 - MCPhost process spawned with PID: 4603
2025-05-10 17:31:20.574 | INFO | __main__:lifespan:90 - Waiting for model to load...
2025-05-10 17:31:20.575 | DEBUG | helpers.local_logger:write:11 - pydev debugger: bytes arguments were passed to a new process creation function. Breakpoints may not work correctly.
2025-05-10 17:31:20.678 | DEBUG | helpers.local_logger:write:11 - ]11;?\
2025-05-10 17:31:25.705 | DEBUG | helpers.local_logger:write:11 - ]10;?\
2025-05-10 17:31:30.709 | DEBUG | helpers.local_logger:write:11 - ]11;?\
2025-05-10 17:31:35.713 | DEBUG | helpers.local_logger:write:11 - 2025/05/10 17:31:35 INFO <cmd/root.go:495> Model loaded provider=openai model=qwen3:8b-q8_0
2025-05-10 17:31:35.714 | INFO | __main__:lifespan:98 - MCPhost model loaded successfully
2025-05-10 17:31:35.715 | INFO | __main__:lifespan:109 - Waiting for prompt...
2025-05-10 17:31:35.718 | DEBUG | helpers.local_logger:write:11 - [?25l[?2004h
2025-05-10 17:31:35.735 | DEBUG | helpers.local_logger:write:11 -
┃ Enter your prompt (Type /help for commands, Ctrl+C to 
quit) 






alt+enter / ctrl+j new line • ctrl+e open editor • enter submit
2025-05-10 17:31:35.735 | INFO | __main__:lifespan:128 - MCPhost process started and ready
2025-05-10 17:31:35.736 | INFO | __main__:lifespan:137 - Started process monitoring thread
2025-05-10 17:31:35.736 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:31:35.737 | INFO | __main__:lifespan:141 - MCPhost process is alive and ready to accept requests
2025-05-10 17:31:40.752 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:31:45.770 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:31:50.789 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:31:55.321 | DEBUG | __main__:chat_completions:233 - Sending to MCPhost: Hello, how are you?
2025-05-10 17:31:55.485 | DEBUG | helpers.local_logger:write:11 - ┃ Hello, how are you? 

2025-05-10 17:31:55.808 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:31:56.641 | DEBUG | __main__:chat_completions:325 - MCPhost response: Hello, how are you?
2025-05-10 17:32:00.824 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:32:05.837 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:32:10.849 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:32:15.857 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:32:20.875 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:32:25.893 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:32:27.241 | INFO | __main__:lifespan:158 - Shutting down MCPhost...
2025-05-10 17:32:27.900 | INFO | __main__:lifespan:166 - Terminating MCPhost process...
2025-05-10 17:32:28.505 | INFO | __main__:lifespan:169 - MCPhost process stopped
2025-05-10 17:32:32.580 | INFO | __main__:<module>:404 - Starting OpenAI-compatible MCPhost API server...
2025-05-10 17:32:32.580 | INFO | __main__:<module>:405 - Configuration:
2025-05-10 17:32:32.580 | INFO | __main__:<module>:406 - MCPhost Path: ../mcphost
2025-05-10 17:32:32.580 | INFO | __main__:<module>:407 - Model: openai:qwen3:8b-q8_0
2025-05-10 17:32:32.581 | INFO | __main__:<module>:408 - OpenAI URL: http://10.8.0.10:33740/v1
2025-05-10 17:32:32.581 | INFO | __main__:<module>:409 - Debug: True
2025-05-10 17:32:32.745 | INFO | __main__:lifespan:75 - Starting MCPhost with command: ../mcphost --debug --config ./config.json --model openai:qwen3:8b-q8_0 --openai-url http://10.8.0.10:33740/v1 --openai-api-key anything
2025-05-10 17:32:32.745 | INFO | __main__:lifespan:76 - Environment: OLLAMA_NUM_CTX=40960
2025-05-10 17:32:32.751 | ERROR | __main__:lifespan:83 - Failed to spawn MCPhost process: The command was not found or was not executable: ../mcphost.
2025-05-10 17:32:32.751 | ERROR | __main__:lifespan:149 - Failed to start MCPhost: The command was not found or was not executable: ../mcphost.
2025-05-10 17:32:32.752 | INFO | __main__:lifespan:158 - Shutting down MCPhost...
2025-05-10 17:32:43.168 | INFO | __main__:<module>:404 - Starting OpenAI-compatible MCPhost API server...
2025-05-10 17:32:43.168 | INFO | __main__:<module>:405 - Configuration:
2025-05-10 17:32:43.169 | INFO | __main__:<module>:406 - MCPhost Path: ./mcphost
2025-05-10 17:32:43.169 | INFO | __main__:<module>:407 - Model: openai:qwen3:8b-q8_0
2025-05-10 17:32:43.169 | INFO | __main__:<module>:408 - OpenAI URL: http://10.8.0.10:33740/v1
2025-05-10 17:32:43.169 | INFO | __main__:<module>:409 - Debug: True
2025-05-10 17:32:43.293 | INFO | __main__:lifespan:75 - Starting MCPhost with command: ./mcphost --debug --config ./config.json --model openai:qwen3:8b-q8_0 --openai-url http://10.8.0.10:33740/v1 --openai-api-key anything
2025-05-10 17:32:43.293 | INFO | __main__:lifespan:76 - Environment: OLLAMA_NUM_CTX=40960
2025-05-10 17:32:43.313 | INFO | __main__:lifespan:81 - MCPhost process spawned with PID: 4681
2025-05-10 17:32:43.314 | INFO | __main__:lifespan:90 - Waiting for model to load...
2025-05-10 17:32:43.315 | DEBUG | helpers.local_logger:write:11 - pydev debugger: bytes arguments were passed to a new process creation function. Breakpoints may not work correctly.
2025-05-10 17:32:43.349 | DEBUG | helpers.local_logger:write:11 - ]11;?\
2025-05-10 17:32:48.357 | DEBUG | helpers.local_logger:write:11 - ]10;?\
2025-05-10 17:32:53.358 | DEBUG | helpers.local_logger:write:11 - ]11;?\
2025-05-10 17:32:58.364 | DEBUG | helpers.local_logger:write:11 - 2025/05/10 17:32:58 INFO <cmd/root.go:495> Model loaded provider=openai model=qwen3:8b-q8_0
2025-05-10 17:32:58.364 | INFO | __main__:lifespan:98 - MCPhost model loaded successfully
2025-05-10 17:32:58.365 | INFO | __main__:lifespan:109 - Waiting for prompt...
2025-05-10 17:32:58.366 | DEBUG | helpers.local_logger:write:11 - [?25l[?2004h
2025-05-10 17:32:58.383 | DEBUG | helpers.local_logger:write:11 -
┃ Enter your prompt (Type /help for commands, Ctrl+C to 
quit) 






alt+enter / ctrl+j new line • ctrl+e open editor • enter submit
2025-05-10 17:32:58.384 | INFO | __main__:lifespan:128 - MCPhost process started and ready
2025-05-10 17:32:58.385 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:32:58.386 | INFO | __main__:lifespan:137 - Started process monitoring thread
2025-05-10 17:32:58.386 | INFO | __main__:lifespan:141 - MCPhost process is alive and ready to accept requests
2025-05-10 17:33:03.402 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:33:08.050 | DEBUG | __main__:chat_completions:233 - Sending to MCPhost: Hello, how are you?
2025-05-10 17:33:08.216 | DEBUG | helpers.local_logger:write:11 - ┃ Hello, how are you? 

2025-05-10 17:33:08.420 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:33:09.367 | DEBUG | __main__:chat_completions:325 - MCPhost response: Hello, how are you?
2025-05-10 17:33:13.435 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:33:18.455 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:33:23.476 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:33:28.487 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:33:33.502 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:33:36.958 | DEBUG | __main__:chat_completions:233 - Sending to MCPhost: Tell me a joke.
2025-05-10 17:33:37.116 | DEBUG | helpers.local_logger:write:11 - ┃ Tell me a joke. 

2025-05-10 17:33:38.258 | DEBUG | __main__:chat_completions:325 - MCPhost response: Tell me a joke.
2025-05-10 17:33:38.518 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:33:43.534 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:33:48.550 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:33:50.324 | DEBUG | __main__:chat_completions:233 - Sending to MCPhost: Tell me a joke.
2025-05-10 17:33:50.483 | DEBUG | helpers.local_logger:write:11 - ┃ Tell me a joke. 

2025-05-10 17:33:51.627 | DEBUG | __main__:chat_completions:325 - MCPhost response: Tell me a joke.
2025-05-10 17:33:53.570 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:33:58.586 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:34:03.604 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:34:08.610 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:34:13.621 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:34:18.644 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:34:23.665 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:34:28.681 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:34:33.700 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:34:38.717 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:34:42.098 | INFO | __main__:lifespan:158 - Shutting down MCPhost...
2025-05-10 17:34:42.734 | INFO | __main__:lifespan:166 - Terminating MCPhost process...
2025-05-10 17:34:43.337 | INFO | __main__:lifespan:169 - MCPhost process stopped
2025-05-10 17:34:50.679 | INFO | __main__:<module>:404 - Starting OpenAI-compatible MCPhost API server...
2025-05-10 17:34:50.679 | INFO | __main__:<module>:405 - Configuration:
2025-05-10 17:34:50.679 | INFO | __main__:<module>:406 - MCPhost Path: ./mcphost
2025-05-10 17:34:50.679 | INFO | __main__:<module>:407 - Model: openai:qwen3:8b-q8_0
2025-05-10 17:34:50.679 | INFO | __main__:<module>:408 - OpenAI URL: http://10.8.0.10:33740/v1
2025-05-10 17:34:50.680 | INFO | __main__:<module>:409 - Debug: True
2025-05-10 17:34:50.814 | INFO | __main__:lifespan:75 - Starting MCPhost with command: ./mcphost --debug --config ./config.json --model openai:qwen3:8b-q8_0 --openai-url http://10.8.0.10:33740/v1 --openai-api-key anything
2025-05-10 17:34:50.814 | INFO | __main__:lifespan:76 - Environment: OLLAMA_NUM_CTX=40960
2025-05-10 17:34:50.834 | INFO | __main__:lifespan:81 - MCPhost process spawned with PID: 4756
2025-05-10 17:34:50.834 | INFO | __main__:lifespan:90 - Waiting for model to load...
2025-05-10 17:34:50.835 | DEBUG | helpers.local_logger:write:11 - pydev debugger: bytes arguments were passed to a new process creation function. Breakpoints may not work correctly.
2025-05-10 17:34:50.870 | DEBUG | helpers.local_logger:write:11 - ]11;?\
2025-05-10 17:34:55.879 | DEBUG | helpers.local_logger:write:11 - ]10;?\
2025-05-10 17:35:00.884 | DEBUG | helpers.local_logger:write:11 - ]11;?\
2025-05-10 17:35:05.887 | DEBUG | helpers.local_logger:write:11 - 2025/05/10 17:35:05 INFO <cmd/root.go:495> Model loaded provider=openai model=qwen3:8b-q8_0
2025-05-10 17:35:05.887 | INFO | __main__:lifespan:98 - MCPhost model loaded successfully
2025-05-10 17:35:05.888 | INFO | __main__:lifespan:109 - Waiting for prompt...
2025-05-10 17:35:05.888 | DEBUG | helpers.local_logger:write:11 - [?25l[?2004h
2025-05-10 17:35:05.905 | DEBUG | helpers.local_logger:write:11 -
┃ Enter your prompt (Type /help for commands, Ctrl+C to 
quit) 






alt+enter / ctrl+j new line • ctrl+e open editor • enter submit
2025-05-10 17:35:05.906 | INFO | __main__:lifespan:128 - MCPhost process started and ready
2025-05-10 17:35:05.907 | INFO | __main__:lifespan:137 - Started process monitoring thread
2025-05-10 17:35:05.907 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:35:05.907 | INFO | __main__:lifespan:141 - MCPhost process is alive and ready to accept requests
2025-05-10 17:35:10.911 | INFO | __main__:monitor_process:38 - MCPhost process status - Alive: True, Exit status: None
2025-05-10 17:35:11.348 | INFO | __main__:lifespan:158 - Shutting down MCPhost...
2025-05-10 17:35:11.917 | INFO | __main__:lifespan:166 - Terminating MCPhost process...
2025-05-10 17:35:12.521 | INFO | __main__:lifespan:169 - MCPhost process stopped
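For completeness, the requests seen in this log ("Hello, how are you?", "Tell me a joke.") would have been produced by an ordinary OpenAI-style client call against the wrapper. The wrapper's own listen host and port never appear in this log, so they are assumptions in the sketch below; only the backend URL on port 33740 is shown above.

    import requests

    # Host and port of the wrapper API are assumptions; adjust to wherever the FastAPI app is served.
    resp = requests.post(
        "http://localhost:8000/v1/chat/completions",
        json={
            "model": "openai:qwen3:8b-q8_0",
            "messages": [{"role": "user", "content": "Hello, how are you?"}],
        },
        timeout=120,
    )
    print(resp.json())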