Product Technical Learning / PTL-13748

RHT2184539: AI267 user feedback on "Retrieving a token"

    • Type: Bug
    • Resolution: Done
    • Priority: Minor
    • AI267 - RHOAI2.8-en-2-20240722
    • AI265, AI267
    • Language: en-US (English)
    • Sprint: AppDev Sprint 11-25 Jul

      Please fill in the following information:


      URL: https://rol.redhat.com/rol/app/courses/ai267-2.8/pages/ch11s04
      Reporter RHNID: wasim-rhls
      Section Title: Guided Exercise: Consuming the Model Serving API

      Issue description

      The line "Retrieve a token by running oc whoami -t from a new terminal window in the workstation." in the notebook[1] should be replaced by "Copy the inference token from "Tokens" --> "Token secret" available under the model server "infer-model-server" in the "Models and model servers" section of the project.

      [1] - https://github.com/RedHatTraining/AI26X-apps/blob/main/deploying/rhoaiserving-consuming/inference-request.ipynb  
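      For reference, the notebook's request would then authenticate with the copied token instead of the output of oc whoami -t. The following is a minimal sketch, assuming the model server exposes the KServe v2 REST inference protocol; the endpoint URL, input name, shape, and data values are placeholders, not values taken from the course notebook:

      import requests

      # Placeholder endpoint; in the exercise this would be the model server's inference URL.
      INFERENCE_URL = "https://example-model-route.apps.example.com/v2/models/model/infer"

      # Token copied from Tokens > Token secret of the "infer-model-server" model
      # server in the RHOAI dashboard (replaces the `oc whoami -t` step).
      TOKEN = "PASTE_TOKEN_SECRET_HERE"

      # Placeholder payload following the KServe v2 inference protocol.
      payload = {
          "inputs": [
              {
                  "name": "input-0",
                  "shape": [1, 4],
                  "datatype": "FP32",
                  "data": [5.1, 3.5, 1.4, 0.2],
              }
          ]
      }

      response = requests.post(
          INFERENCE_URL,
          json=payload,
          headers={"Authorization": f"Bearer {TOKEN}"},
          verify=False,  # lab environments often use self-signed certificates
      )
      response.raise_for_status()
      print(response.json())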

      Steps to reproduce:

       

      Workaround:

       

      Expected result:

              Guy Bianco (rhn-gps-gbianco)
              Wasim Raja (wraja@redhat.com)
