Publications by authors named "Dimitrios Bachtis"

2 Publications

Mapping distinct phase transitions to a neural network.

Phys Rev E 2020 Nov;102(5-1):053306

Department of Mathematics, Swansea University, Bay Campus, SA1 8EN, Swansea, Wales, United Kingdom.

We demonstrate, by means of a convolutional neural network, that the features learned in the two-dimensional Ising model are sufficiently universal to predict the structure of symmetry-breaking phase transitions in the systems considered, irrespective of universality class, order, and the presence of discrete or continuous degrees of freedom. No prior knowledge about the existence of a phase transition in the target system is required, and its entire parameter space can be scanned with multiple histogram reweighting to discover one. We establish our approach in q-state Potts models and calculate the critical coupling and the critical exponents of the ϕ^{4} scalar field theory using quantities derived from the neural network. We view the machine learning algorithm as a mapping that associates each configuration, across different systems, with its corresponding phase, and discuss the implications for the discovery of unknown phase transitions.
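The configuration-to-phase mapping the abstract describes can be illustrated with a toy sketch: generate 2D Ising configurations with a Metropolis sampler well below and well above the critical temperature, then assign each one a phase label. The threshold-on-magnetization "classifier" below is only an illustrative stand-in for the paper's convolutional network, and all function names and parameters are assumptions, not taken from the paper.

```python
import numpy as np

def metropolis_ising(L, T, n_sweeps, rng):
    """Sample one 2D Ising configuration at temperature T (J = k_B = 1)."""
    spins = rng.choice([-1, 1], size=(L, L))
    for _ in range(n_sweeps):
        for _ in range(L * L):
            i, j = rng.integers(0, L, size=2)
            # Energy change for flipping spin (i, j), periodic boundaries.
            nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                  + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2 * spins[i, j] * nb
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                spins[i, j] *= -1
    return spins

def predict_phase(spins, threshold=0.5):
    """Trivial stand-in for the CNN: threshold the absolute magnetization."""
    return "ordered" if abs(spins.mean()) > threshold else "disordered"

rng = np.random.default_rng(0)
L = 16
# Temperatures well below and well above T_c ~ 2.269, so labels are unambiguous.
ordered = [metropolis_ising(L, 1.5, 100, rng) for _ in range(10)]
disordered = [metropolis_ising(L, 3.5, 100, rng) for _ in range(10)]
m_ord = np.mean([abs(s.mean()) for s in ordered])
m_dis = np.mean([abs(s.mean()) for s in disordered])
```

In the paper the labels come from a trained network whose learned features transfer across systems; here the two phases are simply separated by the magnetization gap between the low- and high-temperature samples.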
DOI: http://dx.doi.org/10.1103/PhysRevE.102.053306

Extending machine learning classification capabilities with histogram reweighting.

Phys Rev E 2020 Sep;102(3-1):033303

Department of Mathematics, Swansea University, Bay Campus, SA1 8EN, Swansea, Wales, United Kingdom.

We propose the use of Monte Carlo histogram reweighting to extrapolate predictions of machine learning methods. In our approach, we treat the output of a convolutional neural network as an observable in a statistical system, enabling its extrapolation over continuous ranges of parameter space. We demonstrate the proposal on the phase transition of the two-dimensional Ising model. Interpreting the output of the neural network as an order parameter, we explore its connections with known observables in the system and investigate its scaling behavior. A finite-size scaling analysis based on quantities derived from the neural network yields accurate estimates of the critical exponents and the critical temperature. The method improves the prospects of obtaining precision measurements from machine learning in physical systems that lack an order parameter, and in systems where direct sampling in regions of parameter space may not be possible.
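The extrapolation step the abstract relies on can be sketched with the single-histogram variant of reweighting (the paper employs multiple histogram reweighting; the function name, the synthetic data, and the linear "network output" below are illustrative assumptions). Samples drawn at inverse temperature β₀ are reused to estimate an observable at a nearby β via ⟨O⟩_β = Σᵢ Oᵢ wᵢ / Σᵢ wᵢ with wᵢ = exp(−(β − β₀)Eᵢ):

```python
import numpy as np

def reweight(observable, energy, beta0, beta):
    """Single-histogram reweighting of samples taken at beta0 to beta.

    <O>_beta = sum_i O_i w_i / sum_i w_i, with w_i = exp(-(beta - beta0) E_i).
    Shifting by the maximum log-weight keeps the exponentials stable.
    """
    log_w = -(beta - beta0) * energy
    log_w -= log_w.max()
    w = np.exp(log_w)
    return np.sum(observable * w) / np.sum(w)

# Synthetic stand-in data: energies from a run at beta0, and a "network
# output" constructed to correlate with the energy, as an order parameter would.
rng = np.random.default_rng(1)
energy = rng.normal(-400.0, 20.0, size=5000)
observable = 0.5 + 0.001 * (energy - energy.mean())
beta0 = 0.44
o_same = reweight(observable, energy, beta0, beta0)   # recovers the sample mean
o_up = reweight(observable, energy, beta0, 0.45)      # estimate at a nearby beta
```

Reweighting at β = β₀ reproduces the plain sample mean, and moving β upward shifts weight toward lower-energy samples; the paper applies the multi-histogram generalization to the CNN output to scan parameter space continuously.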
DOI: http://dx.doi.org/10.1103/PhysRevE.102.033303