Rising computing power, improved graphics quality, higher-resolution displays, and streaming delivery have made computer gaming an increasingly energy-intensive activity. Yet gaming-related energy use, and how it varies across platforms, has not been substantively examined by the energy or gaming research communities. We measured the energy consumption of 26 gaming systems spanning the spectrum of technology, price, and performance. Energy use varied widely by hardware, but equally widely depending on which of 37 game titles or 11 benchmarks was run. Cloud gaming incurs markedly higher energy use in datacenters and networks than local gaming. Virtual-reality gaming can use significantly more or less energy than gaming on conventional displays, depending on hardware and software choices. In aggregate, we find that gaming in the United States consumes 34 TWh/year of electricity (2.4% of national residential electricity), representing $5 billion per year in energy expenditures and 24 MT/year of associated carbon-dioxide emissions, equivalent to the emissions of 85 million refrigerators or over 5 million cars. Targeted hardware and software strategies can cut gaming energy use roughly in half while maintaining or improving metrics of user experience. Beyond system designers, gamers and game developers can play a significant role in managing the energy required for gaming.
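The aggregate figures above imply some simple per-unit equivalences. A back-of-envelope sketch follows; the inputs are taken from the abstract, while the derived per-unit values (implied electricity price, per-refrigerator and per-car emissions) are our own illustrative arithmetic, not figures stated in the paper:

```python
# Back-of-envelope check of the abstract's aggregate figures.
# Inputs are the abstract's stated totals; derived values are
# illustrative equivalences, not results from the study itself.

TWH_PER_YEAR = 34            # US gaming electricity use (TWh/yr)
DOLLARS_PER_YEAR = 5e9       # US gaming energy expenditure ($/yr)
CO2_TONNES_PER_YEAR = 24e6   # 24 MT/yr of CO2 emissions
REFRIGERATORS = 85e6         # stated refrigerator equivalence
CARS = 5e6                   # stated car equivalence

# Implied average residential electricity price (USD/kWh)
price_per_kwh = DOLLARS_PER_YEAR / (TWH_PER_YEAR * 1e9)

# Implied annual CO2 per equivalent refrigerator (kg) and car (tonnes)
kg_co2_per_fridge = CO2_TONNES_PER_YEAR * 1000 / REFRIGERATORS
tonnes_co2_per_car = CO2_TONNES_PER_YEAR / CARS

print(f"Implied price: ${price_per_kwh:.3f}/kWh")          # ≈ $0.147/kWh
print(f"Per refrigerator: {kg_co2_per_fridge:.0f} kg/yr")  # ≈ 282 kg/yr
print(f"Per car: {tonnes_co2_per_car:.1f} t/yr")           # ≈ 4.8 t/yr
```

The derived values are internally consistent with typical US conditions: roughly $0.15/kWh matches average residential electricity rates, and about 280 kg CO2/yr corresponds to a refrigerator drawing on the order of 600 kWh/yr at typical grid carbon intensity.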