- Type: Bug
- Resolution: Done
- Priority: Undefined
- Fix version: rhelai-1.3
Steps to reproduce:
- Serve a trained model.
- Attempt to chat with the model without specifying the path to the model
Result:
Running `ilab model chat -qq "Hello world"` fails with:
Executing chat failed with: Model /var/home/cloud-user/.cache/instructlab/models/granite-8b-lab-v1 is not served by the server. These are the served models: ['/var/home/cloud-user/.local/share/instructlab/phased/phase2/checkpoints/hf_format/samples_29117']
Expected result:
The chat should connect to the served model, or the documentation should clearly note that the path to the served model must be specified.
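As a workaround sketch (not a confirmed fix), the chat client can be pointed at the checkpoint the server is actually serving via the `-m`/`--model` option; the path below is the served checkpoint reported in the error message above.

```shell
# Served checkpoint path taken from the error message in this report.
MODEL_PATH=/var/home/cloud-user/.local/share/instructlab/phased/phase2/checkpoints/hf_format/samples_29117

# Guarded so the snippet is a no-op on machines without the ilab CLI.
if command -v ilab >/dev/null 2>&1; then
    ilab model chat -m "$MODEL_PATH" -qq "Hello world"
fi
```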
Device Info:
- Hardware Specs: AWS p5.48xlarge
- Python Version: Python 3.9.18
- InstructLab Version: 0.21.0
- OS Version:
  NAME="Red Hat Enterprise Linux"
  VERSION="9.20241104.0.4 (Plow)"
  ID="rhel"
  ID_LIKE="fedora"
  VERSION_ID="9.4"
  PLATFORM_ID="platform:el9"
  PRETTY_NAME="Red Hat Enterprise Linux 9.20241104.0.4 (Plow)"
  ANSI_COLOR="0;31"
  LOGO="fedora-logo-icon"
  CPE_NAME="cpe:/o:redhat:enterprise_linux:9::baseos"
  HOME_URL="https://www.redhat.com/"
  DOCUMENTATION_URL="https://access.redhat.com/documentation/en-us/red_hat_enterprise_linux/9"
  BUG_REPORT_URL="https://issues.redhat.com/"
  REDHAT_BUGZILLA_PRODUCT="Red Hat Enterprise Linux 9"
  REDHAT_BUGZILLA_PRODUCT_VERSION=9.4
  REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux"
  REDHAT_SUPPORT_PRODUCT_VERSION="9.4"
  OSTREE_VERSION='9.20241104.0'
  VARIANT="RHEL AI"
  VARIANT_ID=rhel_ai
  RHEL_AI_VERSION_ID='1.3.0'