Epic
Resolution: Unresolved
Research and Investigation
To Do
83% To Do, 17% In Progress, 0% Done
This work will be split into two phases:
PHASE 1: RHEL AI 1.4 NVIDIA ONLY
- Use the tested 7b recipe data for the 128K-context granite-8b
- Research what the optimal context size is after training the 128K-context Granite with the 7b dataset. Expect it could shrink to 64K.
PHASE 2: RHEL AI 1.5+
- New dataset for enabling a larger context window (needed ASAP). Approach: generate data, then test
- Legal clearance for the new context window: needs to be full support for 128K in an 8b model (not optimal for use cases)
- We will provide recommendations for what size context window customers should target for optimal performance
- We will NOT need to provide knobs for tuning the context window size; just recommend to customers what size to expect to be able to use.
Action Items:
- Teams to investigate the design details needed to support this feature