Satellite / SAT-18534

Unable to create/upload new SCAP content in Satellite 6.10/6.11 if the datastream files are from https://access.redhat.com/security/data/metrics/ds/


      Description of problem:

      Unable to create/upload new SCAP content in Satellite 6.10/6.11 if the datastream files are from https://access.redhat.com/security/data/metrics/ds/

      The same XML datastream file is recognized by SCAP Workbench and can be used to scan RHEL systems with `oscap xccdf eval`.

      Surprisingly, the same issue does not occur with datastream files downloaded from https://public.cyber.mil/stigs/scap/; those upload to the same Satellite just fine.
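In Satellite, the upload is validated by asking the OpenSCAP Smart Proxy to extract the XCCDF profiles from the datastream (see `fetch_policies_for_scap_content` in the backtrace below). As a rough, hypothetical illustration of that step — not Satellite's actual Ruby code — the profile extraction can be sketched in Python against the standard SCAP 1.2 namespaces; the sample XML is a minimal stand-in, not a real `com.redhat.rhsa-*.ds.xml` file:

```python
# Illustrative sketch only: list XCCDF profile IDs in a SCAP source
# datastream, roughly what Satellite's validator asks the Smart Proxy
# to do on upload. Namespace URIs are the standard SCAP/XCCDF 1.2 ones.
import xml.etree.ElementTree as ET

NS = {
    "ds": "http://scap.nist.gov/schema/scap/source/1.2",
    "xccdf": "http://checklists.nist.gov/xccdf/1.2",
}

def list_profiles(xml_text):
    """Return {profile_id: title} for every XCCDF Profile in the datastream."""
    root = ET.fromstring(xml_text)
    profiles = {}
    for prof in root.iter("{%s}Profile" % NS["xccdf"]):
        title = prof.find("xccdf:title", NS)
        profiles[prof.get("id")] = title.text if title is not None else ""
    return profiles

# Minimal stand-in for a real datastream file (hypothetical IDs).
SAMPLE = """<ds:data-stream-collection
    xmlns:ds="http://scap.nist.gov/schema/scap/source/1.2"
    xmlns:xccdf="http://checklists.nist.gov/xccdf/1.2">
  <ds:component id="comp1">
    <xccdf:Benchmark id="xccdf_org.example_benchmark_test">
      <xccdf:Profile id="xccdf_org.example_profile_stig">
        <xccdf:title>Example STIG profile</xccdf:title>
      </xccdf:Profile>
    </xccdf:Benchmark>
  </ds:component>
</ds:data-stream-collection>"""

print(list_profiles(SAMPLE))
# → {'xccdf_org.example_profile_stig': 'Example STIG profile'}
```

If a file is parseable this way but still fails to upload, the failure is on the Smart Proxy side rather than in the XML itself.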

      Version-Release number of selected component (if applicable):

      Satellite 6.11.3
      Satellite 6.10

      How reproducible:

      Always

      Steps to Reproduce:

      1. Install Satellite 6.11.

      2. Download https://access.redhat.com/security/data/metrics/ds/com.redhat.rhsa-RHEL7.ds.xml to your local machine.

      3. In the Satellite UI, go to Hosts --> SCAP Contents --> Upload New SCAP Content. Give it a name, select the com.redhat.rhsa-RHEL7.ds.xml file, select the right organization/location, and submit.

      Actual results:

      In the GUI: 500 Internal Server Error

      In production.log:

      2022-10-08T12:24:25 [I|app|74d01322] Started POST "/compliance/scap_contents" for X.X.X.X at 2022-10-08 12:24:25 +0530
      2022-10-08T12:24:25 [I|app|74d01322] Processing by ScapContentsController#create as HTML
      2022-10-08T12:24:25 [I|app|74d01322] Parameters: {"utf8"=>"✓", "authenticity_token"=>"qt0cul0/x/DVzbjKh4Iz6EeUcra2Ik0Exbbud+SuF0kjrMANI0tInydKne16HEQWFEgIK62SQHx/Xd1r/02EHg==", "scap_content"=>

      {"title"=>"Test2", "scap_file"=>"[FILTERED]", "location_ids"=>["", "2"], "organization_ids"=>["", "1"]}

      , "commit"=>"Submit"}
      2022-10-08T12:24:31 [W|app|74d01322] 500 Internal Server Error
      2022-10-08T12:24:31 [I|app|74d01322] Backtrace for '500 Internal Server Error' error (RestClient::InternalServerError): 500 Internal Server Error
      74d01322 | /usr/share/gems/gems/rest-client-2.0.2/lib/restclient/abstract_response.rb:223:in `exception_with_response'
      74d01322 | /usr/share/gems/gems/rest-client-2.0.2/lib/restclient/abstract_response.rb:103:in `return!'
      74d01322 | /usr/share/gems/gems/rest-client-2.0.2/lib/restclient/request.rb:809:in `process_result'
      74d01322 | /usr/share/gems/gems/rest-client-2.0.2/lib/restclient/request.rb:725:in `block in transmit'
      74d01322 | /usr/share/ruby/net/http.rb:933:in `start'
      74d01322 | /usr/share/gems/gems/rest-client-2.0.2/lib/restclient/request.rb:715:in `transmit'
      74d01322 | /usr/share/gems/gems/rest-client-2.0.2/lib/restclient/request.rb:145:in `execute'
      74d01322 | /usr/share/gems/gems/rest-client-2.0.2/lib/restclient/request.rb:52:in `execute'
      74d01322 | /usr/share/gems/gems/rest-client-2.0.2/lib/restclient/resource.rb:67:in `post'
      74d01322 | /usr/share/foreman/lib/proxy_api/resource.rb:85:in `block (2 levels) in post'
      74d01322 | /usr/share/foreman/app/services/foreman/telemetry_helper.rb:28:in `telemetry_duration_histogram'
      74d01322 | /usr/share/foreman/lib/proxy_api/resource.rb:84:in `block in post'
      74d01322 | /usr/share/foreman/lib/proxy_api/resource.rb:112:in `with_logger'
      74d01322 | /usr/share/foreman/lib/proxy_api/resource.rb:83:in `post'
      74d01322 | /usr/share/gems/gems/foreman_openscap-5.1.1/app/lib/proxy_api/openscap.rb:11:in `fetch_policies_for_scap_content'
      74d01322 | /usr/share/gems/gems/foreman_openscap-5.1.1/app/models/foreman_openscap/scap_content.rb:50:in `fetch_profiles'
      74d01322 | /usr/share/gems/gems/foreman_openscap-5.1.1/app/validators/foreman_openscap/data_stream_validator.rb:32:in `validate'
      74d01322 | /usr/share/gems/gems/activesupport-6.0.4.7/lib/active_support/callbacks.rb:428:in `block in make_lambda'
      74d01322 | /usr/share/gems/gems/activesupport-6.0.4.7/lib/active_support/callbacks.rb:200:in `block (2 levels) in halting'
      74d01322 | /usr/share/gems/gems/activesupport-6.0.4.7/lib/active_support/callbacks.rb:605:in `block (2 levels) in default_terminator'
      74d01322 | /usr/share/gems/gems/activesupport-6.0.4.7/lib/active_support/callbacks.rb:604:in `catch'
      74d01322 | /usr/share/gems/gems/activesupport-6.0.4.7/lib/active_support/callbacks.rb:604:in `block in default_terminator'
      74d01322 | /usr/share/gems/gems/activesupport-6.0.4.7/lib/active_support/callbacks.rb:201:in `block in halting'
      74d01322 | /usr/share/gems/gems/activesupport-6.0.4.7/lib/active_support/callbacks.rb:513:in `block in invoke_before'
      74d01322 | /usr/share/gems/gems/activesupport-6.0.4.7/lib/active_support/callbacks.rb:513:in `each'
      74d01322 | /usr/share/gems/gems/activesupport-6.0.4.7/lib/active_support/callbacks.rb:513:in `invoke_before'
      74d01322 | /usr/share/gems/gems/activesupport-6.0.4.7/lib/active_support/callbacks.rb:134:in `run_callbacks'
      74d01322 | /usr/share/gems/gems/activesupport-6.0.4.7/lib/active_support/callbacks.rb:825:in `_run_validate_callbacks'
      74d01322 | /usr/share/gems/gems/activemodel-6.0.4.7/lib/active_model/validations.rb:406:in `run_validations!'
      74d01322 | /usr/share/gems/gems/activemodel-6.0.4.7/lib/active_model/validations/callbacks.rb:117:in `block in run_validations!'

      Expected results:

      Any datastream file from https://access.redhat.com/security/data/metrics/ds/ or https://www.redhat.com/security/data/oval/v2/ should upload without issues.
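One visible difference between the Red Hat datastreams and the DISA STIG files that do upload is that the Red Hat ones reference remote OVAL content via `xlink:href`. Whether remote component references are the actual trigger is not established in this report; as a purely hypothetical pre-upload check, the external references can be listed locally (the sample XML below is a stand-in, not a real file):

```python
# Hypothetical pre-upload check: flag datastream references that point
# at remote (http/https) resources instead of components inside the
# file. Whether remote refs cause the 500 here is an assumption.
import xml.etree.ElementTree as ET

XLINK = "http://www.w3.org/1999/xlink"  # standard XLink namespace

def remote_refs(xml_text):
    """Return xlink:href values that point outside the datastream file."""
    root = ET.fromstring(xml_text)
    hrefs = []
    for elem in root.iter():
        href = elem.get("{%s}href" % XLINK)
        if href and href.startswith(("http://", "https://")):
            hrefs.append(href)
    return hrefs

# Minimal stand-in with one local and one remote component reference.
SAMPLE = """<ds:data-stream-collection
    xmlns:ds="http://scap.nist.gov/schema/scap/source/1.2"
    xmlns:xlink="http://www.w3.org/1999/xlink">
  <ds:component-ref id="local" xlink:href="#comp1"/>
  <ds:component-ref id="remote"
      xlink:href="https://www.redhat.com/security/data/oval/v2/RHEL7/rhel-7.oval.xml.bz2"/>
</ds:data-stream-collection>"""

print(remote_refs(SAMPLE))
```

A file that reports remote references would need the Smart Proxy (or `oscap --fetch-remote-resources`) to be able to reach those URLs during evaluation.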

      Additional info:

              Reporter: Sayan Das (rhn-support-saydas)
              Assignee: Gaurav Talreja