<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>Adam Miller on Test Science Research Document Library</title>
    <link>https://research.testscience.org/researchers/adam-miller/</link>
    <description>Recent content in Adam Miller on Test Science Research Document Library</description>
    <generator>Hugo -- 0.129.0</generator>
    <language>en-us</language>
    <copyright>Institute for Defense Analyses</copyright>
    <lastBuildDate>Mon, 01 Jan 2024 00:00:00 +0000</lastBuildDate>
    <atom:link href="https://research.testscience.org/researchers/adam-miller/index.xml" rel="self" type="application/rss+xml" />
    <item>
      <title>Introduction to Human-Systems Interaction in Operational Test and Evaluation Course</title>
      <link>https://research.testscience.org/post/2024-introduction-to-human-systems-interaction-in-operational-test-and-evaluation-course/</link>
      <pubDate>Mon, 01 Jan 2024 00:00:00 +0000</pubDate>
      <guid>https://research.testscience.org/post/2024-introduction-to-human-systems-interaction-in-operational-test-and-evaluation-course/</guid>
      <description>Human-System Interaction (HSI) is the study of interfaces between humans and technical systems. The Department of Defense incorporates HSI evaluations into defense acquisition to improve system performance and reduce lifecycle costs. During operational test and evaluation, HSI evaluations characterize how a system’s operational performance is affected by its users. The goal of this course is to provide the theoretical background and practical tools necessary to plan and evaluate HSI test plans, collect and analyze HSI data, and report on HSI results.</description>
      <content:encoded><![CDATA[<p>Human-System Interaction (HSI) is the study of interfaces between humans and technical systems. The Department of Defense incorporates HSI evaluations into defense acquisition to improve system performance and reduce lifecycle costs. During operational test and evaluation, HSI evaluations characterize how a system’s operational performance is affected by its users. The goal of this course is to provide the theoretical background and practical tools necessary to plan and evaluate HSI test plans, collect and analyze HSI data, and report on HSI results. We will discuss HSI concepts, measurement methods, design of experiments, data analysis, and evaluation and reporting, all from an operational testing perspective.</p>
<h4 id="suggested-citation">Suggested Citation</h4>
<blockquote>
<p>Miller, Adam M, and Keyla Pagan-Rivera. Introduction to Human-Systems Interaction in Operational Test and Evaluation Course. IDA Product ID 3002009. Alexandria, VA: Institute for Defense Analyses, 2024.</p>
</blockquote>
<h4 id="slides">Slides:</h4>
<embed src="slides_3002009.pdf" width="100%" height="700px" type="application/pdf">

]]></content:encoded>
    </item>
    <item>
      <title>Operational T&amp;E of AI-Supported Data Integration, Fusion, and Analysis Systems</title>
      <link>https://research.testscience.org/post/2024-operational-t-e-of-ai-supported-data-integration-fusion-and-analysis-systems/</link>
      <pubDate>Mon, 01 Jan 2024 00:00:00 +0000</pubDate>
      <guid>https://research.testscience.org/post/2024-operational-t-e-of-ai-supported-data-integration-fusion-and-analysis-systems/</guid>
      <description>AI will play an important role in future military systems. However, large questions remain about how to test AI systems, especially in operational settings. Here, we discuss an approach for the operational test and evaluation (OT&amp;amp;E) of AI-supported data integration, fusion, and analysis systems. We highlight new challenges posed by AI-supported systems and we discuss new and existing OT&amp;amp;E methods for overcoming them. We demonstrate how to apply these OT&amp;amp;E methods via a notional test concept that focuses on evaluating an AI-supported data integration system in terms of its technical performance (how accurate is the AI output?) and human systems interaction (how does the AI affect users?).</description>
      <content:encoded><![CDATA[

    
    <div style="position: relative; padding-bottom: 56.25%; height: 0; overflow: hidden;">
      <iframe allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen="allowfullscreen" loading="eager" referrerpolicy="strict-origin-when-cross-origin" src="https://www.youtube.com/embed/JqlIzJh-RQI?autoplay=0&amp;controls=1&amp;end=0&amp;loop=0&amp;mute=0&amp;start=0" style="position: absolute; top: 0; left: 0; width: 100%; height: 100%; border:0;" title="YouTube video"
      ></iframe>
    </div>

<p>AI will play an important role in future military systems. However, large questions remain about how to test AI systems, especially in operational settings. Here, we discuss an approach for the operational test and evaluation (OT&amp;E) of AI-supported data integration, fusion, and analysis systems. We highlight new challenges posed by AI-supported systems and we discuss new and existing OT&amp;E methods for overcoming them. We demonstrate how to apply these OT&amp;E methods via a notional test concept that focuses on evaluating an AI-supported data integration system in terms of its technical performance (how accurate is the AI output?) and human systems interaction (how does the AI affect users?).</p>
<h4 id="suggested-citation">Suggested Citation</h4>
<blockquote>
<p>Anderson, Breeana G, Adam M Miller, Logan K Ausman, John T Haman, Keyla Pagan-Rivera, Sarah A Shaffer, and Brian D Vickers. Operational T&amp;E of AI-Supported Data Integration, Fusion, and Analysis Systems. IDA Product ID 3001848. Alexandria, VA: Institute for Defense Analyses, 2024.</p>
</blockquote>
<h4 id="slides">Slides:</h4>
<embed src="slides.pdf" width="100%" height="700px" type="application/pdf">

]]></content:encoded>
    </item>
    <item>
      <title>Statistical Advantages of Validated Surveys over Custom Surveys</title>
      <link>https://research.testscience.org/post/2024-statistical-advantages-of-validated-surveys-over-custom-surveys/</link>
      <pubDate>Mon, 01 Jan 2024 00:00:00 +0000</pubDate>
      <guid>https://research.testscience.org/post/2024-statistical-advantages-of-validated-surveys-over-custom-surveys/</guid>
      <description>Surveys play an important role in quantifying user opinion during test and evaluation (T&amp;amp;E). Current best practice is to use surveys that have been tested, or “validated,” to ensure that they produce reliable and accurate results. However, unvalidated (“custom”) surveys are still widely used in T&amp;amp;E, raising questions about how to determine sample sizes for—and interpret data from—T&amp;amp;E events that rely on custom surveys. In this presentation, I characterize the statistical properties of validated and custom survey responses using data from recent T&amp;amp;E events, and then I demonstrate how these properties affect test design, analysis, and interpretation.</description>
      <content:encoded><![CDATA[<p>Surveys play an important role in quantifying user opinion during test and evaluation (T&amp;E). Current best practice is to use surveys that have been tested, or “validated,” to ensure that they produce reliable and accurate results. However, unvalidated (“custom”) surveys are still widely used in T&amp;E, raising questions about how to determine sample sizes for—and interpret data from—T&amp;E events that rely on custom surveys. In this presentation, I characterize the statistical properties of validated and custom survey responses using data from recent T&amp;E events, and then I demonstrate how these properties affect test design, analysis, and interpretation. I show that validated surveys reduce the number of subjects required to estimate statistical parameters or to detect a mean difference between two populations. Additionally, I simulate the survey process to demonstrate how poorly designed custom surveys introduce unintended changes to the data, increasing the risk of drawing false conclusions.</p>
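<p>The sample-size claim above can be illustrated with a standard power-analysis sketch: measurement error attenuates the observed standardized effect by the square root of the instrument's reliability, so a less reliable custom survey requires more respondents to detect the same true difference. The reliability values and effect size below are hypothetical, not taken from the presentation.</p>

```python
from math import ceil, sqrt
from statistics import NormalDist

def required_n_per_group(d_true, reliability, alpha=0.05, power=0.8):
    """Per-group sample size for a two-sample comparison of means
    (normal approximation). Measurement error attenuates the observed
    standardized effect: d_obs = d_true * sqrt(reliability)."""
    z = NormalDist().inv_cdf
    d_obs = d_true * sqrt(reliability)
    n = 2 * (z(1 - alpha / 2) + z(power)) ** 2 / d_obs ** 2
    return ceil(n)

# Hypothetical reliabilities: 0.90 for a validated survey, 0.60 for a
# custom one. The noisier instrument needs ~50% more subjects here.
n_validated = required_n_per_group(0.5, reliability=0.90)  # 70
n_custom = required_n_per_group(0.5, reliability=0.60)     # 105
```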
<h4 id="suggested-citation">Suggested Citation</h4>
<blockquote>
<p>Bell, Jonathan L, and Adam M Miller. Statistical Advantages of Validated Surveys over Custom Surveys. IDA Product ID 3001858. Alexandria, VA: Institute for Defense Analyses, 2024.</p>
</blockquote>
<h4 id="poster">Poster:</h4>
<embed src="poster.pdf" width="100%" height="700px" type="application/pdf">

]]></content:encoded>
    </item>
  </channel>
</rss>
