Selected article for: "ability measure and machine statistical"

Author: Zhong, Weishun; Gold, Jacob M.; Marzen, Sarah; England, Jeremy L.; Yunger Halpern, Nicole
Title: Machine learning outperforms thermodynamics in measuring how well a many-body system learns a drive
  • Cord-id: xenebsn9
  • Document date: 2021_4_29
  • ID: xenebsn9
    Snippet: Diverse many-body systems, from soap bubbles to suspensions to polymers, learn and remember patterns in the drives that push them far from equilibrium. This learning may be leveraged for computation, memory, and engineering. Until now, many-body learning has been detected with thermodynamic properties, such as work absorption and strain. We progress beyond these macroscopic properties first defined for equilibrium contexts: We quantify statistical mechanical learning using representation learning …
    Document: Diverse many-body systems, from soap bubbles to suspensions to polymers, learn and remember patterns in the drives that push them far from equilibrium. This learning may be leveraged for computation, memory, and engineering. Until now, many-body learning has been detected with thermodynamic properties, such as work absorption and strain. We progress beyond these macroscopic properties first defined for equilibrium contexts: We quantify statistical mechanical learning using representation learning, a machine-learning model in which information squeezes through a bottleneck. By calculating properties of the bottleneck, we measure four facets of many-body systems’ learning: classification ability, memory capacity, discrimination ability, and novelty detection. Numerical simulations of a classical spin glass illustrate our technique. This toolkit exposes self-organization that eludes detection by thermodynamic measures: Our toolkit more reliably and more precisely detects and quantifies learning by matter while providing a unifying framework for many-body learning.
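
    Illustrative sketch: the abstract describes representation learning in which information squeezes through a bottleneck, with bottleneck properties used to score facets such as novelty detection. The PyTorch toy below is a hedged stand-in, not the authors' implementation: the synthetic biased ±1 "spin configurations", the plain autoencoder with a 2-dimensional bottleneck, and the reconstruction-error novelty proxy are all illustrative assumptions.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

N_SPINS = 64   # spins per snapshot (illustrative choice)
LATENT = 2     # bottleneck width (illustrative choice)

def spin_configs(n, bias):
    """Toy stand-in for driven spin-glass snapshots: biased +/-1 spins."""
    probs = torch.full((n, N_SPINS), bias)
    return torch.bernoulli(probs) * 2.0 - 1.0

class BottleneckAE(nn.Module):
    """Autoencoder whose narrow middle layer plays the bottleneck role."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(N_SPINS, 32), nn.ReLU(),
                                     nn.Linear(32, LATENT))
        self.decoder = nn.Sequential(nn.Linear(LATENT, 32), nn.ReLU(),
                                     nn.Linear(32, N_SPINS), nn.Tanh())

    def forward(self, x):
        z = self.encoder(x)          # compressed representation
        return self.decoder(z), z    # reconstruction and bottleneck code

model = BottleneckAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Train only on configurations produced by one ("familiar") drive.
train = spin_configs(2048, bias=0.7)
for epoch in range(200):
    recon, _ = model(train)
    loss = loss_fn(recon, train)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Novelty-detection proxy: compare reconstruction error on the familiar
# drive with error on configurations from a drive the network never saw.
with torch.no_grad():
    seen = spin_configs(512, bias=0.7)
    novel = spin_configs(512, bias=0.3)
    err_seen = loss_fn(model(seen)[0], seen).item()
    err_novel = loss_fn(model(novel)[0], novel).item()

print(f"reconstruction error, familiar drive: {err_seen:.4f}")
print(f"reconstruction error, novel drive:    {err_novel:.4f}")
```

    In this toy setting, snapshots from the drive used in training should reconstruct with lower error than snapshots from an unseen drive, mirroring (under the stated assumptions) the novelty-detection facet quantified via bottleneck properties in the abstract.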

    Search related documents:
    Co-phrase search for related documents
    • log likelihood and loss function: 1
    • long term memory and loss function: 1, 2, 3, 4, 5
    • loss function and low loss function: 1, 2, 3, 4, 5