Approximate Dynamic Programming Methods for Residential Water Heating
Date: Thu, December 03, 2015
Time: 10am - 12pm
Location: Holmes Hall 388
Speaker: Matthew Motoki
In this thesis we address the problem of minimizing the operating cost of a residential water heater while maintaining a desired level of comfort for the customer. We state the problem as a discrete-time, finite-state, average-cost Markov decision problem (MDP). We view hot water usage as a random process and develop a model of the water heater system. We then develop approximate dynamic programming algorithms to solve the MDP: we use aggregation to obtain a simplified but related problem, we use density estimation to calculate transition probabilities, and we consider the Q-learning algorithm, which applies when a model of the water heater and/or the transition probabilities are not available. We prove that our algorithms can be at least as good as existing methods in terms of minimizing the objective cost. Using numerical simulations, we evaluate our algorithms' performance; the results suggest that our algorithms can decrease operating costs by about 15% while maintaining a specified level of comfort. Finally, we discuss modifications to the basic water heater optimization problem that apply to solar water heating and automated demand response.
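To give a concrete flavor of the model-free approach mentioned in the abstract, the sketch below shows tabular Q-learning on a toy water-heater control problem. This is not the thesis's model or algorithm: the temperature dynamics, tariff, comfort threshold, and penalty weights are invented for illustration, the state space is a coarse temperature grid standing in for the aggregated states, and a discounted objective is used as a simple proxy for the average-cost formulation described in the abstract.

```python
import numpy as np

# --- Hypothetical water-heater model (illustrative only; not the thesis model) ---
# State: discretized tank temperature bin; action: heating element ON/OFF.
N_TEMPS = 20                                  # number of temperature bins (stand-in for aggregated states)
TEMP_GRID = np.linspace(40.0, 70.0, N_TEMPS)  # tank temperature in deg C
ACTIONS = [0, 1]                              # 0 = heater off, 1 = heater on
PRICE = 0.30                                  # $/kWh, assumed flat tariff
HEATER_KW = 4.5                               # element rating (assumed)
DT_HOURS = 0.25                               # 15-minute control interval
COMFORT_MIN = 49.0                            # comfort threshold in deg C (assumed)
COMFORT_PENALTY = 1.0                         # $-equivalent penalty per interval below threshold

rng = np.random.default_rng(0)

def step(temp_idx, action):
    """Simulate one interval: heating raises the temperature; random hot-water
    draws and standing losses lower it.  Purely illustrative dynamics."""
    temp = TEMP_GRID[temp_idx]
    temp += 3.0 * action                      # heating gain when element is on
    temp -= 0.3                               # standing loss
    temp -= rng.exponential(1.0)              # random hot-water usage draw
    next_idx = int(np.clip(np.searchsorted(TEMP_GRID, temp), 0, N_TEMPS - 1))
    cost = PRICE * HEATER_KW * DT_HOURS * action
    if temp < COMFORT_MIN:
        cost += COMFORT_PENALTY               # discomfort penalty
    return next_idx, cost

# --- Tabular Q-learning (discounted proxy for the average-cost objective) ---
GAMMA, ALPHA, EPSILON = 0.99, 0.1, 0.1
Q = np.zeros((N_TEMPS, len(ACTIONS)))

state = N_TEMPS // 2
for t in range(200_000):
    if rng.random() < EPSILON:                # epsilon-greedy exploration
        action = int(rng.integers(len(ACTIONS)))
    else:
        action = int(np.argmin(Q[state]))     # minimize cost, hence argmin
    next_state, cost = step(state, action)
    td_target = cost + GAMMA * Q[next_state].min()
    Q[state, action] += ALPHA * (td_target - Q[state, action])
    state = next_state

# Learned on/off rule per temperature bin (threshold-like policy expected).
policy = Q.argmin(axis=1)
print(dict(zip(np.round(TEMP_GRID, 1), policy)))
```

The learned policy is simply the cost-minimizing action per temperature bin; with these assumed parameters it tends toward a threshold rule (heat when the tank is cool, idle otherwise), which is the kind of structure an exact dynamic programming solution of the full MDP would be compared against.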