The study of the real-time evolution of quantum field theories is known to be an extremely challenging problem for classical computers. By exploiting a fundamentally different computational strategy, quantum computers hold the promise of allowing for detailed studies of these dynamics from first principles. However, much as with classical computations, it is important that quantum algorithms not have a cost that scales exponentially with the volume. In this paper, we present an instructive test case: a formulation of a compact U(1) gauge theory in 2+1 dimensions. A naive implementation on a quantum circuit has a gate count that scales exponentially with the volume. We discuss how to break this exponential scaling by performing an operator redefinition that reduces the non-locality of the Hamiltonian, and we provide explicit circuit implementations using the Walsh function formalism. While we study only one theory as a test case, we expect the exponential gate scaling to persist in formulations of other gauge theories, including non-Abelian theories in higher dimensions.
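As a point of orientation for the Walsh function formalism mentioned above, the following sketch (not taken from the paper; all function names are illustrative) shows the standard decomposition of a diagonal phase operator into Walsh coefficients via a fast Walsh-Hadamard transform. In this formalism, each nonzero Walsh coefficient corresponds to one R_z rotation (plus CNOTs) in the circuit, so counting nonzero coefficients is one way gate counts, and hence their scaling with volume, can be tallied.

```python
import numpy as np

def walsh_hadamard(v):
    """Fast Walsh-Hadamard transform (unnormalized), O(N log N).

    Applies the matrix H[k][x] = (-1)^{popcount(k & x)} to v.
    """
    v = np.array(v, dtype=float)
    n = len(v)
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                x, y = v[j], v[j + h]
                v[j], v[j + h] = x + y, x - y
        h *= 2
    return v

def walsh_coefficients(phases):
    """Walsh coefficients a_k of a diagonal phase function f(x),
    so that f(x) = sum_k a_k * w_k(x) with w_k(x) = (-1)^{popcount(k & x)}.

    In a Walsh-series circuit for exp(i f(x)), each nonzero a_k
    is implemented by a single R_z rotation conjugated by CNOTs.
    """
    return walsh_hadamard(phases) / len(phases)

# Illustrative 3-qubit diagonal phase function (arbitrary values).
phases = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7])
a = walsh_coefficients(phases)

# Sanity check: the transform is its own inverse up to a factor of N,
# so applying it to the coefficients reconstructs the phases.
reconstructed = walsh_hadamard(a)
assert np.allclose(reconstructed, phases)

# Circuit cost proxy: number of nonzero Walsh coefficients.
num_rotations = int(np.count_nonzero(~np.isclose(a, 0.0)))
```

A Hamiltonian whose diagonal part involves products of operators across many sites generically populates high-weight Walsh coefficients, which is one concrete way a non-local formulation can translate into a gate count growing exponentially with volume.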